On Mon, May 27, 2013 at 15:08:26 -0700, Nat Echols wrote:
It would be nice if you could confirm whether my settings are OK, or if there are still some settings in the background which I overlooked, because we see small changes in R and Rfree.

Changes relative to what? It is not expected that feeding the output PDB file back into phenix.refine (or another program) will exactly reproduce the R-factors from the previous round of refinement, because the precision of the atomic parameters is limited by the PDB format. (Using mmCIF instead would be more reliable, but we don't think a change in R-factor of 0.0004 is worth worrying about anyway.)
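(For illustration, with made-up numbers: the PDB ATOM format gives only three decimal places to the coordinates and two to the occupancy and B-factor, so anything beyond that is rounded on output.)

ATOM      1  CA  ALA A   1      11.104  13.207   2.042  1.00 15.23           C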
I added the model refined at higher resolution (e.g. 1.5 Å) and the mtz file cut at the CC-based limit (e.g. 1.5 Å) in the "Input data" tab, and set the High resolution cutoff in the "X-ray data and experimental phases" box to e.g. 1.8 Å. In the refinement settings Strategy tab I uncheck everything and set Number of cycles = 1. In "Other options" I uncheck everything too.

This sounds correct, but it's much easier to simply run the validation GUI, because the underlying code (specifically phenix.model_vs_data) is designed to calculate R-factors identical to what phenix.refine shows you.
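For the command line, a minimal sketch of the same check would be something like the following (file names are placeholders, and the parameter names are written from memory, so please verify them against your Phenix version's documentation):

  phenix.model_vs_data model_refined_1.5A.pdb data.mtz
  phenix.refine model_refined_1.5A.pdb data.mtz strategy=none \
      main.number_of_macro_cycles=1 xray_data.high_resolution=1.8

The second command should just report the R-factors at the 1.8 Å cutoff without moving any atoms.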
From what I've seen, recent phenix.model_vs_data works correctly and reproduces the statistics from phenix.refine within the numerical accuracy that one can expect.
Alternatively, adding a file to the input data with ...

I don't recommend this: using parameter files as input is fine for large, repetitive parameter blocks such as custom geometry restraints, selections from the TLSMD server, or nucleic acid base pairs, but anything that attempts to override the controls in the main GUI may not work.
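As an example of the kind of parameter file that does work well, here is a sketch of a custom geometry restraint block (a disulfide-like bond between two selections); the scope and keyword names are written from memory, so check them against the Phenix documentation before use:

  refinement.geometry_restraints.edits {
    bond {
      action = add
      atom_selection_1 = chain A and resseq 45 and name SG
      atom_selection_2 = chain A and resseq 52 and name SG
      distance_ideal = 2.03
      sigma = 0.02
    }
  }

A block like this lives in its own .eff/.params file that is added to the input files, rather than being typed into the GUI controls.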
2. For whatever reason I would like to do that (MolProbity) - can I add the command-line commands in the GUI as well, and if so, how?

Could you please clarify?
3. In December 2012 there was a discussion about weak data and refinement. I would like to know if there are new data available regarding "whether the benefit they describe is considered cosmetic or non-trivial, ...". Maybe Kay could comment on this...
There is not a single answer that fits all situations. When done correctly (e.g. keeping the Rfree flags and the refinement strategy the same), in the cases that I looked at, using the weak reflections (cutoff at I/sigma ~0.5 to 1, CC1/2 ~0.15-0.25) improved the model's Rfree at lower resolution (paired refinement!) by a few percent, compared to cutting the data at I/sigma > 2. Just try it and see whether the improvement is worth it, according to your own judgement!
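As a concrete sketch of such a paired-refinement comparison (parameter names from memory, file names are placeholders, output file names may differ slightly): refine twice with identical free-R flags and strategy, then evaluate both models against the same lower-resolution cutoff:

  phenix.refine model.pdb data_to_1.5A.mtz output.prefix=incl_weak
  phenix.refine model.pdb data_to_1.5A.mtz xray_data.high_resolution=1.8 output.prefix=cut_1.8
  # compare at the common 1.8 A limit, without further refinement
  phenix.refine incl_weak_001.pdb data_to_1.5A.mtz strategy=none \
      main.number_of_macro_cycles=1 xray_data.high_resolution=1.8
  phenix.refine cut_1.8_001.pdb data_to_1.5A.mtz strategy=none \
      main.number_of_macro_cycles=1 xray_data.high_resolution=1.8

Only the R/Rfree values computed at the common 1.8 Å limit are directly comparable.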
4. Additionally, could someone briefly comment on what Phenix does with strong (or weak) reflections marked as aliens in XDS?

Assuming that XDS doesn't actually remove these, phenix.refine may or may not discard them at the outlier filtering stage (which can be disabled). I don't know how the different methods for flagging suspect reflections compare; Kay and/or Pavel would know more about this.
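If you want to experiment with this, both knobs are sketched below (the exact phenix.refine parameter name is from memory, so check it in your version; the REJECT_ALIEN value comes from Kay's reply that follows):

  ! in XDS.INP, keep the "Aliens" by raising the rejection threshold
  REJECT_ALIEN= 100

  # in phenix.refine, switch off the reflection outlier filter
  phenix.refine model.pdb data.mtz xray_data.outliers_rejection=False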
The "Alien" list in CORRECT.LP was (originally) only meant to alert the user to ice reflections, zingers, and cosmic rays (not sure if the latter play a role, but the former certainly do!). If the "Alien" list just shows reflections at very high resolution, in low-<intensity> resolution shells, then this is because Wilson statistics do not hold in these shells, and the algorithm for deciding about "Aliens" is therefore no longer valid.

If this happens, you should (as identified in an analysis together with James Holton) prevent CORRECT from marking "Aliens" as outliers by changing REJECT_ALIEN from 20 (the default) to 100 or so. Otherwise the mean intensity in that resolution shell will drop (because the strongest reflections are rejected), and that is bad for applying the French & Wilson algorithm.

The XDSCONV in the latest XDS package (identifying itself as March-30-2013) has two fixes for dealing with weak data (again, based on an exchange with James Holton). Unfortunately, a small bug (affecting very few datasets) was identified by Graeme Winter (thanks!) in the XDS program of that package, so be prepared to download the XDS package again in a few days.

Hope that helps,
Kay
According to the supplementary material of P. Andrew Karplus & Kay Diederichs, "Linking Crystallographic Model and Data Quality", I would like to do anisotropic refinement, with the additional options ordered_solvent.new_solvent=anisotropic adp.individual.anisotropic="not element H" fix_rotamers=True.

If your resolution really is 1.5 Å, I wouldn't recommend making waters anisotropic. There is some difference of opinion about whether anisotropic refinement of protein atoms is acceptable at this resolution; it's probably good to try both ways. (It would be interesting to know whether including the weak high-resolution data allows you to parameterize the model more aggressively.)
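A hedged sketch of what that might look like on the command line, keeping the waters isotropic as suggested above (file names are placeholders; please verify the parameter spellings against your Phenix version):

  phenix.refine model.pdb data_to_1.5A.mtz \
      adp.individual.anisotropic="not (element H or resname HOH)" \
      fix_rotamers=True

ordered_solvent.new_solvent=anisotropic is deliberately left out here, since anisotropic waters are the part being advised against.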
-Nat
--
Kay Diederichs
http://strucbio.biologie.uni-konstanz.de
email: [email protected]
Tel +49 7531 88 4049, Fax 3183
Fachbereich Biologie, Universität Konstanz, Box M647, D-78457 Konstanz

This e-mail is digitally signed. If your e-mail client does not have the necessary capabilities, just ignore the attached signature "smime.p7s".