[phenixbb] Are sigma cutoffs for R-free reflections cheating?

Joe Krahn krahn at niehs.nih.gov
Wed Dec 2 15:26:27 PST 2009


Using a sigma cutoff in refinement is almost always a bad idea, yet it 
appears that some people still use them. The problem is that a sigma 
cutoff almost always improves R-factors, because discarding weak 
reflections raises the average |Fobs| in the denominator of the 
R-factor more than it reduces the residuals in the numerator. Some 
people incorrectly conclude that the reduced R-factor means the cutoff 
improved the quality of the structure.
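
Here is a toy numeric sketch (plain Python, not PHENIX code; the
Fobs/Fcalc/sigma values are made up for illustration) showing that a
sigma cutoff lowers R even when the model is completely unchanged:

def r_factor(fobs, fcalc):
    """R = sum(|Fobs - Fcalc|) / sum(Fobs)."""
    return sum(abs(fo - fc) for fo, fc in zip(fobs, fcalc)) / sum(fobs)

fobs  = [100.0, 80.0, 60.0, 5.0, 3.0, 2.0]   # last three are weak
fcalc = [ 95.0, 84.0, 57.0, 9.0, 6.0, 5.0]   # weak terms: large relative error
sigma = [  2.0,  2.0,  2.0, 4.0, 4.0, 4.0]

print(r_factor(fobs, fcalc))   # all data: R = 22/250 = 0.088

# Apply a 2-sigma cutoff: the weak reflections are discarded, the
# average |Fobs| in the denominator rises, and R drops -- with no
# change at all to the model.
kept = [(fo, fc) for fo, fc, s in zip(fobs, fcalc, sigma) if fo > 2.0 * s]
print(r_factor([fo for fo, _ in kept],
               [fc for _, fc in kept]))      # culled: R = 12/240 = 0.050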

The result of a sigma cutoff is that the R-factor can be made to look
better than it really is. I think this should be prevented by never
excluding R-free reflections by any sort of cutoff criterion; that
keeps the R-free value unbiased. If culling the R-free set is required
for proper error analysis, or simply because the culled data are almost
certainly bogus values, then the un-culled R-free could still be
reported as a separate value.
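
A sketch of that reporting scheme (a hypothetical helper, not an
existing PHENIX option): always compute R-free on the full test set,
and report any culled value alongside it rather than instead of it.

def r_factor(fobs, fcalc):
    # Same R-factor as in the sketch above.
    return sum(abs(fo - fc) for fo, fc in zip(fobs, fcalc)) / sum(fobs)

def report_r_free(fobs, fcalc, sigma, n_sigma_cut=None):
    """Return the unbiased (un-culled) R-free, plus the culled value
    as a separate number if a sigma cutoff was requested."""
    result = {"r_free_all": r_factor(fobs, fcalc)}
    if n_sigma_cut is not None:
        kept = [(fo, fc) for fo, fc, s in zip(fobs, fcalc, sigma)
                if fo > n_sigma_cut * s]
        result["r_free_culled"] = r_factor([fo for fo, _ in kept],
                                           [fc for _, fc in kept])
    return result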

Nowadays, most people use a sigma cutoff of zero, so this is normally
not a big problem. However, it appears that PHENIX still throws out
many reflections where Fobs==0, which can be a significant fraction in
the last shell with anisotropic data. Unfortunately, the effect of
excluding weak reflections depends on how the amplitudes were derived:
if CCP4 Truncate was used, those weak reflections will have been
inflated slightly to non-zero values, so a zero-sigma cutoff has a
significantly different effect. Therefore, I think the default should
be to keep reflections with Fobs==0, using SigFobs > 0 as the criterion
for non-absent reflections in reflection files without a missing-number
flag (i.e. CNS format).
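
A minimal sketch of that proposed default (my assumption about how it
might look, not actual PHENIX behavior): in a format with no
missing-number flag (e.g. CNS), keep Fobs==0 reflections and use
SigFobs > 0 to distinguish "observed but weak" from "absent".

def is_observed(fobs, sigfobs):
    """Treat a reflection as present if its sigma is positive,
    even when the measured amplitude is exactly zero."""
    return sigfobs > 0.0

# (Fobs, SigFobs) pairs; the second is weak but observed, the third
# is truly absent (no measurement, sigma written as 0).
reflections = [(12.3, 0.9), (0.0, 1.1), (0.0, 0.0)]
used = [(fo, sig) for fo, sig in reflections if is_observed(fo, sig)]
print(used)  # keeps the Fobs==0 reflection that has SigFobs > 0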

Joe Krahn
