To put this in context:
I originally wrote that page when phenix.refine used a |F|>0 cutoff in refinement. Given that I<0 often results in |F|=0 after the conversion, the program threw out data about which we knew something (namely, that it was weak), albeit imprecisely. That was a self-evidently
bad idea. However, the defaults in phenix.refine have changed, and as Ed Berry observes it now emulates CCP4's TRUNCATE (in "truncate yes" mode, probably). One way to avoid any guessing about which columns get used is sketched below.
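If you do keep intensities in the file, you can at least pin the data labels explicitly on the phenix.refine command line rather than trusting the default selection. A minimal sketch, assuming your amplitude columns are called F and SIGF (the parameter name is from my reading of the phenix.refine documentation; check phenix.refine --show-defaults if in doubt):

    # Force phenix.refine to use the amplitude columns explicitly.
    # "F,SIGF" are assumed label names - check yours with mtzdump first.
    phenix.refine model.pdb data.mtz xray_data.labels="F,SIGF"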
I convert my data with CCP4's TRUNCATE or CTRUNCATE programs. For strong data I don't apply the French and Wilson intensity data remapping; for weak data I do, but only for refinement purposes, not for SAD/MAD phasing. My standard ploy is to use CAD
to strip Imean and SigImean out of the MTZ file so that phenix.refine cannot attempt to "outsmart" me again (and by "outsmart" I really mean "usurp my intentions"); a sketch of this is below. I would suggest that everyone do the same, or at least test the difference. My known bias
is to trust CCP4 programs more.
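To make the CAD ploy concrete, here is a minimal sketch. CAD copies only the columns named on LABIN, so everything else, Imean and SigImean included, is dropped from the output. The labels F, SIGF and FreeR_flag are assumptions here; substitute whatever mtzdump reports for your file:

    # Keep only amplitudes and free-R flags; Imean/SigImean are not copied.
    cad HKLIN1 data.mtz HKLOUT data_no_imean.mtz <<EOF
    LABIN FILE 1 E1=F E2=SIGF E3=FreeR_flag
    END
    EOF

With no intensity columns left in the MTZ file, phenix.refine has nothing but the amplitudes to select.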
Cheers
Phil Jeffrey
Princeton
Dear All,
I noticed that Phenix refine's default settings use the data labels "Imean" and "SigImean" instead of F and SigF. I searched the internet and found a post titled "Phenix.refine Defaults Considered Harmful" (http://xray0.princeton.edu/~phil/Facility/phenix_fubar.html). One point in this post is that "the first thing the program does is ignore that structure factor data and use the IMEAN data instead." According to the post, this may force the rejection of weak data.
The latest Phenix version may differ from the one the post's author used; I wonder whether this risk still exists in the newest version of Phenix?