[phenixbb] Geometry Restraints - Anisotropic truncation

Pavel Afonine pafonine at lbl.gov
Tue May 1 17:46:27 PDT 2012


>> 1) maps calculated using all (unmodified) data by phenix.refine, 
>> phenix.maps and similar tools are better than maps calculated using 
>> anisotropy-truncated data. So, yes, for the purpose of map 
>> calculation there is no need to do anything: Phenix map calculation 
>> tools deal with anisotropy very well.
>
> That is not our experience in cases of really severe anisotropy.

Can you send me an example off-list, please? I need the model and the 
data files (before and after truncation).

>> 2) phenix.refine refinement may fail if one uses the original 
>> anisotropic data set. This is probably because the ML target does 
>> not use experimental sigmas (and the anisotropy correction by the 
>> UCLA server is nothing but a Miller-index-dependent removal of data 
>> by a sigma criterion - yeah, that old, well-criticized practice of 
>> throwing away data that you worked hard to measure!). Maybe using 
>> sigmas in the ML calculation could solve the problem, but that has 
>> to be proven.
>
> The UCLA server removes all the data beyond the set ellipsoid; it 
> does not deal with individual reflections by sigma.

Yes, it's not done per reflection, but indirectly, as part of 
determining the parameters of the ellipsoid that is then used to cut 
the data (as far as I understand it).
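
To make the geometry of that cut concrete, here is a minimal sketch - 
my own illustration in plain numpy, not the UCLA server's code, 
assuming an orthogonal cell and made-up numbers - of what a 
Miller-index-dependent, ellipsoidal truncation looks like:

import numpy as np

def ellipsoidal_truncation(miller_indices, unit_cell, d_limits):
    # miller_indices: (N, 3) array of (h, k, l)
    # unit_cell:      (a, b, c) in Angstrom (orthogonal cell assumed)
    # d_limits:       resolution limits (Angstrom) along a*, b*, c*
    # Returns a boolean mask: True = keep the reflection.
    a, b, c = unit_cell
    # Reciprocal-space coordinates for an orthogonal cell: s = (h/a, k/b, l/c).
    s = np.asarray(miller_indices, dtype=float) / np.array([a, b, c])
    # Ellipsoid semi-axes in reciprocal space are 1/d along each direction;
    # a reflection is kept only if it falls inside the ellipsoid.
    semi_axes = 1.0 / np.array(d_limits, dtype=float)
    return np.sum((s / semi_axes) ** 2, axis=1) <= 1.0

# Toy example: ~2.7 A data along a* survives a (2.0, 2.0, 3.0) A cut,
# while the same nominal resolution along c* is thrown away.
hkl = [[15, 0, 0], [0, 0, 15], [10, 10, 5]]
print(ellipsoidal_truncation(hkl, unit_cell=(40.0, 40.0, 40.0),
                             d_limits=(2.0, 2.0, 3.0)))
# -> [ True False  True]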

> Also, one can set one's own resolution limits on the UCLA server, 
> depending on personally preferred criteria.

Maybe it's just fine (given the current state of the art of methods 
used in refinement) if it's done carefully and thoughtfully. The 
trend, though, seems to be to use it blindly, "just in case it gives 
me a lower R", which I find dangerous (and yes, it is wrong to compare 
R-factors calculated using different amounts of data!).
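
As a toy illustration (fabricated numbers, nothing to do with any real 
structure): the conventional R = sum|Fobs - Fcalc| / sum|Fobs| goes 
down when the weakest reflections are simply dropped, even though the 
"model" stays exactly the same, which is why R-factors computed on 
different amounts of data cannot be compared directly:

import numpy as np

def r_factor(f_obs, f_calc):
    # Conventional crystallographic R-factor.
    return np.sum(np.abs(f_obs - f_calc)) / np.sum(np.abs(f_obs))

rng = np.random.default_rng(0)
n = 10000
f_obs = rng.gamma(shape=2.0, scale=50.0, size=n)     # fake amplitudes
noise = rng.normal(scale=0.15 * f_obs + 5.0, size=n) # weak data fit relatively worse
f_calc = f_obs + noise                               # same "model" throughout

keep = f_obs > np.percentile(f_obs, 30)              # discard weakest 30%
print("R, all data      : %.3f" % r_factor(f_obs, f_calc))
print("R, truncated data: %.3f" % r_factor(f_obs[keep], f_calc[keep]))
# The second R is lower, but nothing about the model has improved.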

Pavel


