Hi all,

Many have argued that we should include weak data in refinement --- e.g., reflections much weaker than I/sigI=2 --- in order to take advantage of the useful information contained in large numbers of uncertain data points (as argued in the recent Karplus and Diederichs Science paper on CC1/2). This makes sense to me as long as the uncertainty attached to each HKL is properly accounted for.

However, I was surprised to hear rumors that with phenix "the data are not properly weighted in refinement by incorporating observed sigmas" and the like. I was wondering if the phenix developers could comment on the sanity of including weak data in phenix refinement, and on how phenix handles it.

Douglas

~~~~~~~~~~~~~~~~~~~~
Douglas L. Theobald
Assistant Professor
Department of Biochemistry
Brandeis University
Waltham, MA 02454-9110
[email protected]
http://theobald.brandeis.edu/
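
P.S. To be concrete about what I mean by "properly accounted for": here is a minimal sketch in plain Python (not phenix internals --- the function and variable names are just for illustration) of sigma weighting in a simple least-squares scale refinement, where each reflection enters with weight 1/sigma^2, so weak measurements still contribute information but only in proportion to their reliability.

```python
# Hypothetical sketch (NOT phenix code): how observed sigmas enter a
# least-squares target.  Each reflection contributes (Fobs - k*Fcalc)^2 / sigma^2,
# so weak, uncertain measurements are down-weighted rather than discarded.
import random

random.seed(0)

def weighted_lsq_scale(f_obs, f_calc, sigmas):
    """Scale factor k minimizing sum w*(Fobs - k*Fcalc)^2 with w = 1/sigma^2."""
    w = [1.0 / s ** 2 for s in sigmas]
    num = sum(wi * fo * fc for wi, fo, fc in zip(w, f_obs, f_calc))
    den = sum(wi * fc * fc for wi, fc in zip(w, f_calc))
    return num / den

# Simulated data: the true scale is 2.0; large-|F| reflections are measured
# precisely, while weak reflections (I/sigI well below 2) are very noisy.
f_calc = [random.uniform(1, 100) for _ in range(2000)]
sigmas = [0.1 * fc + 5.0 for fc in f_calc]  # weak data -> low signal-to-noise
f_obs = [2.0 * fc + random.gauss(0, s) for fc, s in zip(f_calc, sigmas)]

k = weighted_lsq_scale(f_obs, f_calc, sigmas)
print(round(k, 2))
```

Including the weak reflections with their correct sigmas recovers the scale with essentially no bias; including them with equal weights (or cutting them at I/sigI=2) would either inflate the noise or throw away the aggregate signal that CC1/2 is sensitive to.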