AFAIU, Joe is referring to the situation where negative intensities are converted to FOBS=0. Ignoring these reflections would allow refinement to assign any FCALC to them, when in fact we do have some information about them. I am not aware of any systematic study of this question, but it *seems* rather obvious that ignoring zero reflections is the wrong thing to do *in these circumstances*.

Of course, none of this is a problem when negative-intensity reflections are processed by truncate into positive FOBS according to their sigmas, since it is then virtually impossible for FOBS to be exactly zero. So I would say that the problem is not phenix.refine, but the I->F conversion protocol that assigns zero values to reflections which are actually not zero.
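To make the distinction concrete, here is a minimal Python sketch (function names are mine, purely illustrative) contrasting a naive I->F conversion, which clips every negative intensity to FOBS=0, with a sigma-aware positive estimate. The formula used below, J = (I + sqrt(I^2 + 2*sigma^2))/2, is only a simple positivity-enforcing approximation, not the full French & Wilson Bayesian treatment that truncate implements:

```python
import math

def naive_i_to_f(i_obs):
    """Naive conversion: clip negative intensities to zero, then take the
    square root. Every negative measurement becomes FOBS = 0 exactly."""
    return math.sqrt(max(i_obs, 0.0))

def sigma_aware_i_to_f(i_obs, sigma):
    """Sigma-aware positive estimate of the true intensity:
        J = (I + sqrt(I^2 + 2*sigma^2)) / 2
    (a crude approximation in the spirit of, but not identical to, the
    French & Wilson treatment). J is strictly positive whenever
    sigma > 0, so the resulting FOBS is never exactly zero."""
    j = 0.5 * (i_obs + math.sqrt(i_obs * i_obs + 2.0 * sigma * sigma))
    return math.sqrt(j)

# A weak reflection measured as I = -5.0 with sigma = 2.0:
print(naive_i_to_f(-5.0))             # 0.0 -- the information is discarded
print(sigma_aware_i_to_f(-5.0, 2.0))  # small but positive amplitude
```

The point of the sketch is only that the sigma-aware route keeps weak reflections in the refinement with a small, nonzero amplitude instead of a hard zero.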
Imagine I have a dataset with resolution 26.0-2.3A. Do you really think it would be great to do refinement at a resolution of, say, 100.0-0.25A, where all the missing Fobs are zeros?
No, it won't be great, but I don't think Joe or anyone else is suggesting that. If I understand correctly, he is saying that it might be a good idea not to ignore the 25% of reflections in the highest resolution shell that had negative intensities (that is, if I choose not to use French & Wilson, which would be my fault). Cheers, Ed. --