Anisotropic B-factors with Truncated Data
Dear Crystallographers,

I have a dataset which was artificially truncated at 1.7 Ang by the limitations of our setup, but the resolution is probably 1.5 Ang or so. I would like to try running anisotropic B-factor refinement in Phenix, but am not sure which settings to tweak to make this happen. I know about the parameter:measurement ratio concept, but I think anisotropic may still be better than isotropic here. NB the dataset is also 42% twinned.

All the best,

Jacob Keller

*******************************************
Jacob Pearson Keller, PhD
Looger Lab/HHMI Janelia Research Campus
19700 Helix Dr, Ashburn, VA 20147
email: [email protected]
*******************************************
Actually, I think I found how to override the iso/aniso decision: under "global parameters," set aniso cutoff to lower resolution. Seems to be working.

Any thoughts on the more general question about aniso refinement in the case of truncated data, or +/- twinning? Seems twinning changes ML refinement to LSQ, but I am not sure how that changes this.

Jacob
Hi Jacob,
Actually, I think I found how to override the iso/aniso decision: under "global parameters," set aniso cutoff to lower resolution. Seems to be working.
OK, let me know if it still does not do what you want. Also, you can always do: Refinement settings -> Modify selections for: choose "Individual B-factors", and you will get a pop-up menu where you can specify which atoms should be isotropic and which should be anisotropic. For example, you can type:

Isotropic atoms: water
Anisotropic atoms: not water
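If you prefer the command line, the same selections can be passed to phenix.refine as parameters. A minimal sketch (the parameter paths refine.adp.individual.isotropic/anisotropic and the file names model.pdb and data.mtz are assumptions here; check them against your Phenix version):

import subprocess

# Hedged sketch: phenix.refine with explicit per-atom ADP selections.
# Parameter paths and file names are placeholders to verify locally.
cmd = [
    "phenix.refine", "model.pdb", "data.mtz",
    "refine.adp.individual.isotropic=water",        # keep waters isotropic
    "refine.adp.individual.anisotropic=not water",  # anisotropic B for the rest
]
subprocess.run(cmd, check=True)

Passing the arguments as a list avoids any shell-quoting trouble with the space in "not water".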
Any thoughts on the more general question about aniso refinement in the case of truncated data, or +/- twinning?
Try both and see which one works best.
Seems twinning changes ML refinement to LSQ, but I am not sure how that changes this.
phenix.refine uses LS (least squares) in case of twinning.

Pavel
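For intuition: with a twin fraction alpha, the model intensity for each reflection is a weighted sum of the two twin-related calculated intensities, and the least-squares target compares that sum to the observed intensity. A rough sketch of such a target (illustrative arrays, unweighted, and not necessarily phenix.refine's exact functional form):

import numpy as np

alpha = 0.42                                  # twin fraction from the original post
i_obs     = np.array([120.0, 85.0, 40.0])     # observed intensities (made up)
i_calc    = np.array([110.0, 90.0, 35.0])     # calculated intensities
i_calc_tw = np.array([130.0, 80.0, 50.0])     # calculated intensities of the twin mates

i_model = (1.0 - alpha) * i_calc + alpha * i_calc_tw    # twinned model intensity
k = np.sum(i_obs * i_model) / np.sum(i_model ** 2)      # least-squares scale factor
lsq_target = np.sum((i_obs - k * i_model) ** 2)         # LS residual to be minimized
print(lsq_target)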
Hi
1.7 Angstrom resolution seems to me awfully low for thinking about anisotropic protein model refinement! I.e. in terms of the number of diffraction data relative to model parameters. The truncation due to the setup geometry, leaving still-strong data at the edge of the data set, should not IMHO lead to too much oddity in the atom B-factors.
Just my two cents worth,
Best wishes,
John
Emeritus Prof John R Helliwell DSc
Dear John,
1.7 Angstrom resolution seems to me awfully low for thinking about anisotropic protein model refinement! I.e. in terms of the number of diffraction data relative to model parameters.
In refinement we use restraints, whose contribution is added with a variable weight:

Ttotal = Tdata + w * Trestraints

One can think of Trestraints as "observations" added to the actual data in varying amounts, depending on the choice of the weight w. This is exactly why we can still refine individual coordinates or isotropic B-factors at "macromolecular" resolutions like 2-6 A or so. If we counted the data/parameter ratio simply as Nreflections vs Natoms*(3 xyz + 1 B), that is without restraints, this would be impossible, but the restraints do the trick.

Likewise, we can still refine anisotropic B-factors at resolutions lower than 1-1.2 A. I agree 1.7 A is a bit low(ish), but in favorable cases (the data are very good and come from a crystal that could diffract to higher resolution) it might be worth a try.

All the best,
Pavel
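To put rough numbers on that counting, here is a back-of-the-envelope Python sketch; the reflection and atom counts are assumed, typical values for a mid-sized protein at ~1.7 A, not Jacob's actual data:

n_reflections = 30000   # assumed unique reflections at ~1.7 A
n_atoms       = 1600    # assumed non-H protein atoms (roughly 200 residues)

# Parameters per atom, ignoring occupancies and restraints:
#   3 coordinates + 1 isotropic B            -> 4
#   3 coordinates + 6 anisotropic Uij terms  -> 9
params_iso   = n_atoms * (3 + 1)
params_aniso = n_atoms * (3 + 6)

print("isotropic   data/parameter ratio: %.1f" % (n_reflections / params_iso))    # ~4.7
print("anisotropic data/parameter ratio: %.1f" % (n_reflections / params_aniso))  # ~2.1

With restraints acting as extra "observations", the effective ratios are better than these raw numbers suggest, which is the point above.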
Dear Pavel,
I appreciate your reply, thank you.
And what would you use, firstly, to monitor success (an Rfree drop presumably, but how much of a drop is significant?) and, secondly, to guard against over-fitting (e.g. by restraining the aniso Bij)?
PDB-REDO uses a Hamilton R-factor ratio test to judge whether aniso can sensibly be applied. That seems to me the ideal check (Hamilton's test).
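For reference, Hamilton's test compares the ratio of the (weighted, least-squares) R factors from the two refinements against a significance point sqrt(b/(n-m) * F(b, n-m, alpha) + 1), where b is the number of extra parameters and n-m the degrees of freedom of the larger model. A minimal sketch of that check, with illustrative numbers rather than PDB-REDO's actual implementation:

from scipy.stats import f

def hamilton_significance_point(b, dof, alpha=0.05):
    # Hamilton (1965) R-factor ratio significance point.
    # b   : extra parameters in the larger (anisotropic) model
    # dof : observations minus parameters of that larger model
    return (b / dof * f.ppf(1.0 - alpha, b, dof) + 1.0) ** 0.5

b   = 1600 * 5           # 6 Uij per atom instead of 1 B: 5 extra per atom (assumed 1600 atoms)
dof = 30000 - 1600 * 9   # assumed 30000 reflections minus aniso-model parameters
r_iso, r_aniso = 0.180, 0.170   # assumed R factors from the two refinements

if r_iso / r_aniso > hamilton_significance_point(b, dof):
    print("the anisotropic model is statistically justified")
else:
    print("stick with isotropic B-factors")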
Greetings,
John
Emeritus Prof John R Helliwell DSc (Physics)
FInstP FRSC FRSB Fellow of the ACA
Emeritus Member of the British Biochemical Society
School of Chemistry, University of Manchester, M13 9PL, UK.
Dear John,
And what would you use, firstly, to monitor success (an Rfree drop presumably, but how much of a drop is significant?) and, secondly, to guard against over-fitting (e.g. by restraining the aniso Bij)?

PDB-REDO uses a Hamilton R-factor ratio test to judge whether aniso can sensibly be applied. That seems to me the ideal check (Hamilton's test).
Rfree and other single-number quality metrics are global figures; they will not tell you if one or a few refined B-factors are bad, or atoms are misfit, or local geometry is distorted. This is why, when it comes to validation, I like to think of it as being applied to the data, the model and the model-to-data fit, looking at local and global metrics for each:

http://phenix-online.org/presentations/latest/pavel_validation.pdf

Specifically for B-factors, I think Ethan Merritt's tools are the most comprehensive, such as http://skuld.bmsc.washington.edu/parvati/ and others.

All the best,
Pavel
Yes indeed quite so.
We agree that 1.7 Angstrom is low for this type of refinement.
But I concede there may be more to try in this case, as you summarised.
And as you remark in your recent talk slides (thank you for the web link, by the way), 'avoid fitting into noise', or as I would put it 'false precision', is also important. (I was impressed by your very recent FEM maps paper in Acta D, by the way, and put out a tweet from my Twitter account @HelliwellJohn highlighting it.)
Greetings,
John
Emeritus Prof John R Helliwell DSc
participants (3):
- Jrh
- Keller, Jacob
- Pavel Afonine