Implementation of French & Wilson data correction?
Hi All,

Any pointers on implementing the new option in the latest build for French & Wilson correction? Does scala output negative intensities so that a proper scaling with this new option can be used? Also, what column labels should be used, particularly for proper treatment of anomalous data? Can one use the labels output by scala?

Joe

___________________________________________________________
Joseph P. Noel, Ph.D.
Investigator, Howard Hughes Medical Institute
Professor, The Jack H. Skirball Center for Chemical Biology and Proteomics
The Salk Institute for Biological Studies
10010 North Torrey Pines Road
La Jolla, CA 92037 USA
Phone: (858) 453-4100 extension 1442
Cell: (858) 349-4700
Fax: (858) 597-0855
E-mail: [email protected]
Web Site (Salk): http://www.salk.edu/faculty/faculty_details.php?id=37
Web Site (HHMI): http://hhmi.org/research/investigators/noel.html
___________________________________________________________
Hi Joe,
Any pointers on implementing the new option in the latest build for French & Wilson correction?
I'm sorry for the lack of documentation; I'm working on that and it should be available soon. To run F&W correction from the command line, you can run:

phenix.french_wilson data.mtz output_file=data_f.mtz

You can also turn the method on as part of phenix.refine by adding this to your parameter file:

refinement {
  input {
    xray_data {
      french_wilson_scale = True
    }
  }
}

There is also a GUI for F&W under the 'Reflection tools' header.
Does scala output negative intensities so that a proper scaling with this new option can be used?
I am not experienced with scala, so hopefully someone else can weigh in on what it outputs. If you run your data through truncate/ctruncate, our F&W scaling is unnecessary and you should use the F/SIGF arrays generated by truncate/ctruncate.
Also, what column labels should be used particularly for proper treatment of anomalous data? Can one use the labels output by scala?
Phenix is very tolerant of column labels, so any labels that you have used with Phenix before will work fine. In general, for anomalous data I like I(+), SIGI(+), I(-), SIGI(-), but as far as I know the labels coming out of scala should work fine. If you run into any error messages, I would be interested to know.

Thanks,
Jeff
Scala outputs negative intensities where appropriate after merging, generally for input into [c]truncate.

Phil
Hi Joe,

the main motivation to implement F&W-based conversion of Iobs to Fobs was the community-driven suggestion that this is a better way of converting Iobs to Fobs, compared to the simple

if (Iobs >= 0): Fobs = sqrt(Iobs)
else: Fobs = 0

that phenix.refine uses.

I'm not aware of any systematic and conclusive study demonstrating that one way is better or worse than the other. Since we now have both options available in PHENIX, this opens a good opportunity (for whoever has enough enthusiasm and time) to study this some more within one software framework.

Some good discussion on related subjects can be found here:

"Structure refinement: some background theory and practical strategies". David Watkin. J. Appl. Cryst. (2008). 41, 491–522

Pavel.
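[Editor's note: the simple conversion Pavel describes can be sketched in a few lines of Python. This is only an illustration of the formula quoted above, not the actual phenix.refine code; the sigma propagation shown is a common first-order choice and an assumption of this sketch.]

```python
import math

def simple_i_to_f(i_obs, sig_i):
    """Naive intensity-to-amplitude conversion: Fobs = sqrt(Iobs) for
    non-negative intensities, zero otherwise.  The sigma handling is an
    illustrative assumption, not taken from phenix.refine."""
    if i_obs >= 0:
        f_obs = math.sqrt(i_obs)
        # first-order error propagation: sig(F) ~ sig(I) / (2 F)
        sig_f = sig_i / (2.0 * f_obs) if f_obs > 0 else math.sqrt(sig_i)
        return f_obs, sig_f
    # negative net intensities are simply reset to zero
    return 0.0, 0.0

print(simple_i_to_f(100.0, 10.0))  # (10.0, 0.5)
print(simple_i_to_f(-4.0, 10.0))   # (0.0, 0.0)
```

The weakness is visible immediately: every negative net intensity collapses to F = 0, discarding the information that the reflection is weak but not necessarily absent, which is exactly what the F&W treatment addresses.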
I'll give it a go! I know Otwinowski well and we have had many discussions about negative intensities. Do you know if Scala/Truncate resets negative intensities to 0?
Sent from my iPhone 4
Hi Joe,
Both truncate and ctruncate include an implementation of the F&W method, so negative intensities will be scaled to have a positive value.

Thanks,
Jeff
Jeff,
Thanks! So it sounds like if we use Mosflm to process raw image files, then Scala/Truncate prior to input into Phenix as an MTZ file, we wouldn't want to employ this option in Phenix.
Joe
Joe,

if you are into testing these different approaches, you may also consider this (which is also described by Bernhard on p. 344 of his book): Sivia & David, 1994, Acta Cryst A50:703. It's a simple calculation that is easy to implement, and I have the code that does the conversion; let me know if you'll need it.

Also beware that phenix used to ignore input Fo's as long as the intensities are still present (this may have changed though). So if you want it to use the Fo's that you have supplied, make sure that intensities are deleted from the input mtz-file.

Ed.
-- "I'd jump in myself, if I weren't so good at whistling." Julian, King of Lemurs
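[Editor's note: the Sivia & David style estimate Ed mentions can be prototyped numerically in a few lines. The sketch below is my own rough illustration under a flat prior on F truncated at zero — an assumption chosen for simplicity, not Ed's code and not the exact prior of the paper.]

```python
import numpy as np

def bayesian_f_estimate(i_obs, sig_i, n_grid=4000):
    """Crude numerical posterior mean of F given Iobs = F**2 + Gaussian
    noise, with a flat prior on F truncated at zero (an illustrative
    choice; see Sivia & David, 1994 for the proper treatment)."""
    f_max = np.sqrt(max(i_obs, 0.0) + 10.0 * sig_i)
    f = np.linspace(0.0, f_max, n_grid)
    log_p = -((i_obs - f ** 2) ** 2) / (2.0 * sig_i ** 2)
    p = np.exp(log_p - log_p.max())   # unnormalized posterior on the grid
    p /= p.sum()                      # normalize (uniform grid spacing)
    f_mean = float((f * p).sum())     # posterior mean -> Fobs estimate
    f_sig = float(np.sqrt(((f - f_mean) ** 2 * p).sum()))
    return f_mean, f_sig

# A strong reflection recovers essentially sqrt(I); a negative net
# intensity still yields a small positive amplitude instead of zero.
print(bayesian_f_estimate(100.0, 1.0))
print(bayesian_f_estimate(-4.0, 2.0))
```

The point of the exercise is the second call: unlike the sqrt-or-zero rule, a Bayesian estimate never throws away a weak reflection entirely.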
On Mon, Feb 28, 2011 at 8:34 AM, Ed Pozharski
Also beware that phenix used to ignore input Fo's as long as the intensities are still present (this may have changed though). So if you want it to use the Fo's that you have supplied, make sure that intensities are deleted from the input mtz-file.
This hasn't changed - at least not yet. (I'm not sure what we decided to do here.) The plan is for phenix.refine (and other apps) to automatically run the procedure when intensities are used, skipping an extra conversion step. Right now it's optional, pending some changes to the output MTZ file, so it needs an extra parameter:

phenix.refine i_obs.mtz model.pdb french_wilson=True

(It isn't obvious what to do in the GUI, but I'll fix that today.)

-Nat
Dear PHENIX developers,

A little off-topic, but will phenix implement MLI (a maximum likelihood intensity target)? I heard that the MLI target gives good results because MLI handles negative intensities naturally.

Keitaro
Hi Keitaro,
A little off-topic, but will phenix implement MLI (maximum likelihood intensity target)?
eventually. The formulas are there; someone just needs enough motivation to spend a couple of days coding them and making them available in phenix.refine. Once this is done, there will be another set of troubles to deal with:

- What to use to compute R-factors: Iobs or Fobs? If Fobs, then you are back to the original problem of converting I to F. If Iobs, then I bet a number of people will scream that they want to see R-factors computed the "usual way" (using Fobs, that is).
- The same questions arise about computing the maps.
I heard that MLI target gives a good result because MLI handles negative intensities naturally.
"Good" compared to what? How do you evaluate that one is better than the other in this case? Obviously, you wouldn't be able to use the R-factors as the criterion. I guess I know how, but I'm not aware that anyone has ever done it (another little project for someone enthusiastic).

Overall, I conclude by quoting one of my favorite papers:

"The choice of refinement against F**2 or F has generated more discussions than it probably warrants. (...)"

David Watkin, "Structure refinement: some background theory and practical strategies". J. Appl. Cryst. (2008). 41, 491–522

All the best!
Pavel.
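[Editor's note: to make the R-factor point concrete, here is a toy comparison of the standard linear residual computed on amplitudes versus on intensities. The reflection values are invented for illustration; the formula is the conventional R = sum|obs - calc| / sum(obs), nothing specific to phenix.refine.]

```python
def r_factor(obs, calc):
    """Standard linear residual: sum |obs - calc| / sum obs."""
    return sum(abs(o - c) for o, c in zip(obs, calc)) / sum(obs)

# Invented example values for a handful of reflections.
f_obs = [10.0, 25.0, 3.0, 40.0]
f_calc = [11.0, 24.0, 4.0, 38.0]

r_on_f = r_factor(f_obs, f_calc)
r_on_i = r_factor([f * f for f in f_obs], [f * f for f in f_calc])

# The two residuals generally differ for the same model quality, so
# refining against I while reporting an F-based R-factor mixes two
# conventions -- exactly the bookkeeping question Pavel raises.
print(round(r_on_f, 4), round(r_on_i, 4))
```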
Hi Pavel,

Actually, when we defined the MLI target (originally called MLF2: Pannu & Read, 1996), we did two tests where we compared R-factors (computed with the same set of F values) and phase errors (measured with the mean cosine of the phase error). In one case (Streptomyces griseus trypsin, where the bovine trypsin starting model was a very poor model) the difference was very small, but in the other case (Trypanosoma brucei GAPDH) the MLI target gave significantly better results.

I'm not sure why you say that you couldn't use R-factors (though I prefer phase errors or map correlations). You could argue that R-factors computed using F should be biased to give better results to a target based on F than on I, but that wasn't a problem in our tests.

Clearly there's room for more exhaustive tests, but that initial result demonstrated the theoretical advantage mentioned by Keitaro (accounting for negative net intensities) plus the advantage that it's a better approximation to an ideal likelihood target (measurement errors are closer to Gaussian in measured intensities than to Gaussian in amplitudes -- assumed in our MLF target -- or complex Gaussian in complex structure factors -- assumed in the Rice likelihood target). It would also be good to bring the Rice likelihood target into the tests.

Regards,
Randy
------
Randy J. Read
Department of Haematology, University of Cambridge
Cambridge Institute for Medical Research
Wellcome Trust/MRC Building, Hills Road, Cambridge CB2 0XY, U.K.
Tel: +44 1223 336500
Fax: +44 1223 336827
E-mail: [email protected]
www-structmed.cimr.cam.ac.uk
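[Editor's note: the phase-error metric Randy mentions is straightforward to compute once two sets of phases are in hand. A minimal sketch with invented phase values — unweighted for simplicity, whereas real comparisons are usually figure-of-merit or amplitude weighted.]

```python
import math

def mean_cosine_phase_error(phases_a, phases_b):
    """Mean cosine of the phase difference between two phase sets
    (in degrees).  1.0 means perfect agreement; values near 0 mean
    essentially random phases."""
    return sum(
        math.cos(math.radians(a - b)) for a, b in zip(phases_a, phases_b)
    ) / len(phases_a)

# Invented example: model phases vs. reference ("true") phases.
ref = [10.0, 120.0, 250.0, 305.0]
model = [25.0, 110.0, 260.0, 300.0]
print(round(mean_cosine_phase_error(ref, model), 4))
```

Because the cosine saturates near agreement and averages toward zero for random phases, it rewards a model for getting phases roughly right without being dominated by a few outliers, which is part of why it is preferred over R-factors for this kind of comparison.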
Dear Randy and Pavel,

It's amazing to me that the MLI target gave significantly better results in that test case, because I thought the contributions of weak intensities to refinement were very small. I wonder when MLI works particularly well -- e.g., in cases where there are many negative intensities (measured with much noise)? And did the test case where MLI gave significantly better results also give a better electron density map?

I understand MLI can give a more accurate model, but there are still some problems, such as the I->F conversion etc.

I will read the paper introduced by Pavel -- thanks!

Cheers,
Keitaro
The old truncate allows you to do a simple square root, and it sets the negative Is to zero. This option is available when running truncate separately from scala (i.e. in the scala interface, switch truncate running off). The default is to apply the French & Wilson correction.

The new truncate (ctruncate) seems to have lost that option, and it always 'truncates'. So the old version would allow you to compare the approaches easily; the new one would not.

Johan
Dr. Johan P. Turkenburg
X-ray facilities manager
York Structural Biology Laboratory
University of York, York YO10 5DD, UK
Phone: +44 1904 328251
Fax: +44 1904 328266
participants (9)

- Ed Pozharski
- Jeff Headd
- Johan Turkenburg
- Joseph Noel
- Keitaro Yamashita
- Nathaniel Echols
- Pavel Afonine
- Phil Evans
- Randy Read