Dear Phenix Users,

I have R-merge values equal to 0.000 in the highest-resolution shell (which corresponds to actual values larger than 1.000) in a data set processed by HKL2000, with redundancies higher than 10 (I know high redundancy can increase R-merge). I am not sure whether this will be a problem with Table 1 for the reviewers. Could you kindly send me your suggestions?

Thank you in advance,
Mengbin Chen

--
Mengbin Chen
Department of Chemistry
University of Pennsylvania
On Wed, May 29, 2013 at 2:37 PM, Mengbin Chen wrote:
I have R-merge values equal to 0.000 in the highest-resolution shell (which corresponds to actual values larger than 1.000) in a data set processed by HKL2000, with redundancies higher than 10 (I know high redundancy can increase R-merge). I am not sure whether this will be a problem with Table 1 for the reviewers. Could you kindly send me your suggestions?
Here is how to recalculate the statistics in Phenix:

1) In the "macros" tab of HKL2000, under "Scaling", type "no merge original index".
2) Re-run the scaling; you will get a new .sca file in a different format, which contains unmerged (but scaled) intensities.
3) In Phenix, run phenix.merging_statistics (under "Reflection tools" in the GUI) with the unmerged .sca file as input.

You can also input the unmerged data in the "Table 1" program in Phenix (which just runs phenix.merging_statistics internally).

-Nat

PS #1: You should probably email the HKL2000 developers in case they don't already know about this bug.
PS #2: If you run xia2 instead, you automatically get both merged and unmerged data!
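[Editor's note] For readers unfamiliar with the statistic being recomputed here: R-merge compares each unmerged observation against the mean intensity of its group of symmetry-equivalent measurements. The following is a minimal Python sketch of the textbook formula (not Phenix's or HKL2000's actual implementation) illustrating how a weak outer shell can legitimately produce values above 1.0 — which, as described above, HKL2000 apparently reports as 0.000:

```python
# Minimal sketch of the textbook R-merge formula (illustration only,
# not Phenix/HKL2000 code):
#   R-merge = sum_hkl sum_i |I_i - <I>| / sum_hkl sum_i I_i
from collections import defaultdict

def r_merge(observations):
    """observations: iterable of (hkl, intensity) for unmerged, scaled data."""
    groups = defaultdict(list)
    for hkl, i_obs in observations:
        groups[hkl].append(i_obs)
    num = den = 0.0
    for intensities in groups.values():
        mean_i = sum(intensities) / len(intensities)
        num += sum(abs(i - mean_i) for i in intensities)
        den += sum(intensities)
    return num / den

# In a weak shell, intensities scatter widely around a small mean,
# so the deviations can exceed the total intensity: R-merge > 1.
obs = [((1, 2, 3), i) for i in (10.0, -5.0, 2.0, 8.0, -3.0)]
print(round(r_merge(obs), 3))  # → 2.2
```

Nothing is wrong with the data when this happens; the statistic simply loses meaning for weak, highly redundant shells.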
On 05/29/2013 05:44 PM, Nathaniel Echols wrote:
PS #1: You should probably email the HKL2000 developers in case they don't already know about this bug.
This "bug" has been there for as long as I can remember, and it's probably more of a "feature". To be fair, the HKL manual explicitly says that Rmerge is inferior to I/sigma for choosing a resolution cutoff.

--
Oh, suddenly throwing a giraffe into a volcano to make water is crazy?
Julian, King of Lemurs
I would try to determine if there's meaningful signal in your outer shell. You can try three things:

1. Calculate a precision-indicating merging R factor (Rpim).
2. (If you have NCS) calculate a self-rotation function using outer-shell data only and see if you have meaningful information.
3. Scale less data together (although this may hurt your completeness depending on your space group).

Also try Scala instead of Scalepack.

******************************************************************************
Gino Cingolani, Ph.D.
Associate Professor
Thomas Jefferson University
Dept. of Biochemistry & Molecular Biology
233 South 10th Street - Room 826
Philadelphia PA 19107
Office (215) 503 4573
Lab (215) 503 4595
Fax (215) 923 2117
E-mail: [email protected]
Website: http://www.cingolanilab.org
******************************************************************************
"Nati non foste per viver come bruti, ma per seguir virtute e canoscenza"
("You were not born to live like brutes, but to follow virtue and knowledge")
Dante, The Divine Comedy (Inferno, XXVI, vv. 119-120)

The information contained in this transmission contains privileged and confidential information. It is intended only for the use of the person named above. If you are not the intended recipient, you are hereby notified that any review, dissemination, distribution or duplication of this communication is strictly prohibited. If you are not the intended recipient, please contact the sender by reply email and destroy all copies of the original message. CAUTION: Intended recipients should NOT use email communication for emergent or urgent health care matters.
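[Editor's note] The Rpim suggested above differs from R-merge only by a multiplicity-dependent weight of sqrt(1/(n-1)) on each reflection group, which is what makes it a precision indicator for redundant data. A minimal sketch of both textbook formulas (illustration only, not any scaling program's implementation):

```python
# Sketch of R-merge vs. Rpim (textbook definitions, illustration only):
#   R-merge = sum_hkl         sum_i |I_i - <I>| / sum I
#   Rpim    = sum_hkl sqrt(1/(n-1)) * sum_i |I_i - <I>| / sum I
import math
from collections import defaultdict

def merging_r_factors(observations):
    """observations: iterable of (hkl, intensity); returns (Rmerge, Rpim)."""
    groups = defaultdict(list)
    for hkl, i_obs in observations:
        groups[hkl].append(i_obs)
    num_merge = num_pim = den = 0.0
    for ints in groups.values():
        n = len(ints)
        if n < 2:  # singly-measured reflections contribute to neither
            continue
        mean_i = sum(ints) / n
        dev = sum(abs(i - mean_i) for i in ints)
        num_merge += dev
        num_pim += math.sqrt(1.0 / (n - 1)) * dev
        den += sum(ints)
    return num_merge / den, num_pim / den

# Ten measurements of one reflection with the same fractional scatter:
obs = [((0, 0, 4), i) for i in (90.0, 110.0) * 5]
rm, rp = merging_r_factors(obs)  # rm = 0.1, rp = rm / 3
```

With fixed per-observation scatter, R-merge stays roughly constant as redundancy grows while Rpim shrinks — which is why Rpim is the more honest statistic for a 10-fold-redundant data set like the one in this thread.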
Got it, thank you!

Mengbin

On Wed, May 29, 2013 at 5:48 PM, Gino Cingolani <[email protected]> wrote:
I would try to determine if there’s meaningful signal in your outer shell. You can try three things:
1. Calculate a precision-indicating merging R factor (Rpim)
2. (if you have ncs) calculate a self-rotation function using outer shell data only and see if you have meaningful information
3. Scale less data together (although this may hurt your completeness depending on your space group).
Also try Scala instead of Scalepack.
--
Mengbin Chen
Department of Chemistry
University of Pennsylvania
On 05/29/2013 05:48 PM, Gino Cingolani wrote:
Scale less data together

This seems the complete opposite of what an experimentalist should do (more data is better, assuming no radiation damage). Can we give Rmerge a proper burial at last?
-- Oh, suddenly throwing a giraffe into a volcano to make water is crazy? Julian, King of Lemurs
Ed,

you are taking my words out of context. If you read my entire post:

"I would try to determine if there's meaningful signal in your outer shell. You can try three things:
1. Calculate a precision-indicating merging R factor (Rpim)
2. (if you have ncs) calculate a self-rotation function using outer shell data only and see if you have meaningful information
3. Scale less data together (although this may hurt your completeness depending on space group)"

I'm not referring to cutting off data for refinement. I'm suggesting analyzing diffraction data before throwing them blindly into refinement, especially when the Rsym is that high. Scaling smaller batches of data is a simple way to detect problems such as radiation damage, changes in unit cell parameters (quite common for crystals with large unit cells), problems with the detector, build-up of ice, etc.
On 05/29/2013 11:43 PM, Gino Cingolani wrote:
Ed,
you are taking my words out of context. If you read my entire post ....:
Well, perhaps I was just worried about how your exact words might be understood. Frankly, "scale less data together (although this may hurt your completeness depending on space group)" reads to me like a suggestion that merging fewer frames may solve the "problem" of high Rmerge. Your much more detailed explanation of what you originally meant will hopefully prevent such an erroneous interpretation.

--
Oh, suddenly throwing a giraffe into a volcano to make water is crazy?
Julian, King of Lemurs
On 05/29/2013 05:37 PM, Mengbin Chen wrote:
Could you kindly drop me your suggestions?
Read this: http://www.sciencemag.org/content/336/6084/1030.short

If for some reason you would like to stick with denzo/scalepack instead of re-processing your data with aimless, phenix.cc_star is what you can use (as others have already pointed out, you will presumably need unmerged data for this).

--
Oh, suddenly throwing a giraffe into a volcano to make water is crazy?
Julian, King of Lemurs
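[Editor's note] The linked paper is Karplus & Diederichs (Science 2012), which proposes CC1/2 — the Pearson correlation between mean intensities from two random half-datasets — and CC* = sqrt(2·CC1/2 / (1 + CC1/2)) as replacements for Rmerge-based cutoffs. A minimal sketch of that arithmetic (the half-dataset numbers below are invented for illustration; phenix.cc_star computes this from real unmerged data):

```python
# Sketch of CC1/2 and CC* from Karplus & Diederichs (2012).
# Illustration only -- not phenix.cc_star's implementation.
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def cc_star(cc_half):
    """CC* = sqrt(2*CC1/2 / (1 + CC1/2)): estimated correlation of the
    merged data with the (unmeasurable) true signal."""
    return math.sqrt(2.0 * cc_half / (1.0 + cc_half))

# Invented mean intensities per reflection from two random half-datasets:
half1 = [12.0, 3.5, 8.1, 0.9, 5.2]
half2 = [10.5, 4.2, 7.4, 1.8, 4.9]
cc_half = pearson(half1, half2)
print(cc_half, cc_star(cc_half))
```

The practical point of the paper: a shell can still carry useful signal (CC1/2 well above zero) even when Rmerge is far above 1, so Rmerge alone should not dictate the resolution cutoff.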
I am not sure if this is going to be a problem of Table 1 for the reviewers.
Yes, it will.
Could you kindly drop me your suggestions?
Don't use HKL2000; use xia2/xds/imosflm/scala instead.

Peter

--
-----------------------------------------------------------------
P.H. Zwart
Research Scientist
Berkeley Center for Structural Biology
Lawrence Berkeley National Laboratories
1 Cyclotron Road, Berkeley, CA-94703, USA
Cell: 510 289 9246
BCSB: http://bcsb.als.lbl.gov
PHENIX: http://www.phenix-online.org
SASTBX: http://sastbx.als.lbl.gov
-----------------------------------------------------------------
participants (5)

- Ed Pozharski
- Gino Cingolani
- Mengbin Chen
- Nathaniel Echols
- Peter Zwart