[phenixbb] R free set in twinned dataset
PAfonine at lbl.gov
Mon Oct 8 14:23:19 PDT 2007
Shaking the coordinates and B-factors itself takes a fraction of a second.
The question is how many refinement macro-cycles you will need to
perform after that to make sure the model's memory of the old free set is
removed and proper R/Rfree values are recovered (5, 10, 50, ...?).
As for the second part of your question -- I don't know and I don't
think there is a definitive answer to this. I never did any comparisons.
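A minimal sketch of the whole recipe, on top of the command quoted below
(note: main.number_of_macro_cycles is my guess at the parameter that sets
the cycle count -- it is not spelled out in this thread, so check the
phenix.refine documentation for your version):

```shell
# Shake coordinates by ~1.0 A and randomize B-factors to erase the
# model's memory of the old free set, then refine long enough for
# R/Rfree to settle (main.number_of_macro_cycles is an assumed
# parameter name; verify against your phenix.refine defaults).
phenix.refine model.pdb data.mtz \
  modify_start_model.sites.shake=1.0 \
  modify_start_model.adp.randomize=true \
  main.number_of_macro_cycles=10
# Watch the Rfree-Rwork gap in the log; rerun with more macro-cycles
# if the gap is still growing between the last two cycles.
```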
Jianghai Zhu wrote:
>> I saw cases where even simulated annealing didn't help, but this is
>> relatively rare. You can shake the coordinates and B-factors and run
>> as many refinement macro-cycles as necessary (e.g., until the gap
>> Rfree-Rwork stops growing):
>> phenix.refine model.pdb data.mtz modify_start_model.sites.shake=1.0
>> Instead of (or in addition to) the coordinate shake you can use
>> modify_start_model.adp.randomize=true.
>> The value in modify_start_model.sites.shake is something to play
>> with, but something like 1.0 ... 1.5 is OK in most cases.
> Is this coordinates and B-factors shaking as effective as SA? I
> believe it should be much faster.