Hi Joseph,
It may be worth trying as well:
- wxray_coupled_tbath_offset values of 0.5 and 10,
- and a primary_map_cutoff value of 2.0.
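For reference, a single run with those settings might look like the sketch below. The file names model.pdb and data.mtz are placeholders for your own inputs; the command is built as a string first so it can be inspected before launching.

```shell
# Hypothetical file names; substitute your own model and data.
# Built as a string so the full command can be checked before running.
cmd="phenix.ensemble_refinement model.pdb data.mtz"
cmd="$cmd wxray_coupled_tbath_offset=10 primary_map_cutoff=2.0"
echo "$cmd"   # replace 'echo' with 'eval' to actually launch the job
```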
I'll see if I can find the ensemble probability scripts and send them to you.
Cheers,
Tom
On 4 December 2014 at 10:13, Joseph Brock wrote:
Hi all,
I hope it is OK if I revive this old thread to ask a few additional questions about phenix.ensemble_refinement, specifically its use with data at atomic resolution.
While I had good and immediate results using the software with data sets at around 1.7 Å, I have been having a very hard time with a 1.1 Å resolution data set (phenix.refine R/Rfree values of ~12/13%, alternate conformations removed and ligand occupancies refined).
I have now tried a lot of different combinations (~180!), screening:
- ptls values of 0.6, 0.7, 0.8, 0.9 and 0.95,
- wxray_coupled_tbath_offset values of 1, 2.5 and 5,
- and primary_map_cutoff values of 2.5, 3.0 and 3.5
combined with various different TLS definitions and restraints on ligands. I also tried manually setting tx values to 1.5, 2.0 and 2.5 to reflect those of a dataset of similar resolution (1KZK) reported in Burnley et al., 2012. However, ensemble refinement consistently makes the model considerably worse, with the best Rfree values I have achieved being ~15%.
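A screen like the one above can be scripted as a simple grid over the three parameters. The sketch below is a dry run: it only prints one phenix.ensemble_refinement command per combination (45 in total), with placeholder file names, so the list can be checked before anything is launched.

```shell
# Dry-run sketch of the parameter screen described above.
# model.pdb and data.mtz are placeholder file names.
screen_commands() {
  for ptls in 0.6 0.7 0.8 0.9 0.95; do
    for tbath in 1 2.5 5; do
      for cutoff in 2.5 3.0 3.5; do
        echo "phenix.ensemble_refinement model.pdb data.mtz" \
             "ptls=$ptls wxray_coupled_tbath_offset=$tbath" \
             "primary_map_cutoff=$cutoff"
      done
    done
  done
}
screen_commands   # pipe into 'sh' (or a queue wrapper) to actually launch
```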
It just occurred to me that increasing the X-ray weight with wxray_coupled_tbath_offset values of ~10 would probably make sense for data of this quality, but I would greatly appreciate any further tips if anybody has specific experience with using phenix.ensemble_refinement on high-resolution datasets.
On a different note, Burnley et al., 2012, shows some very cool figures in which atoms are coloured by atom probability. If there is an easy way to output these values to, say, the occupancy column of the resulting .pdb files, I would love to hear it!
Thanks!
-Joseph.
Joseph Brock | PhD Division of Physiological Chemistry II Department of Medical Biochemistry and Biophysics Karolinska Institutet Scheeles väg 2 SE-171 77 Stockholm, Sweden
________________________________________ From: Joseph Brock Sent: Monday, 17 November 2014 9:25 AM To: Mark Wilson; Nathaniel Echols Cc: [email protected] Subject: RE: [phenixbb] Phenix.ensemble_refinment questions
I greatly appreciate all the helpful info everyone.
Thanks!
-Joseph.
________________________________________ From: Mark Wilson [[email protected]] Sent: Thursday, 13 November 2014 5:03 PM To: Nathaniel Echols; Joseph Brock Cc: [email protected] Subject: Re: [phenixbb] Phenix.ensemble_refinment questions
Hi All, As I am (I think) the other user that Nat is referring to, I'll comment. I requested support for experimental phase information in the PHENIX ensemble refinement target and can verify that it is accepted (in my case as HL coeffs) and will run. The result in my test case was not dramatically different than an amplitude-based target, but obviously a great many factors could affect this. What differences I saw were minor improvements in R/Rfree (~1%) with Se-Met SAD phases in a 1.05 Å resolution structure of a flexible protein. I've not dug too much more into this, but I can verify that phases are accepted and do influence the final ensemble. Best regards, Mark
Mark A. Wilson Associate Professor Department of Biochemistry/Redox Biology Center University of Nebraska N118 Beadle Center 1901 Vine Street Lincoln, NE 68588 (402) 472-3626 [email protected]
On 11/13/14 9:53 AM, "Nathaniel Echols" wrote:
On Thu, Nov 13, 2014 at 6:46 AM, Joseph Brock wrote:
1. In the associated publication (Burnley et al., eLife 2012), the ensemble refinement is validated by comparing the correlation of the ensemble-generated map with the map generated from the experimental phases for PDB entry 1YTT... I am confused how one computes an experimentally phased map from structure factors deposited in the PDB that contain only anomalous intensities/amplitudes and not Hendrickson-Lattman coefficients. Is there a program within the phenix package that can do this?
AutoSol can be used to re-solve such datasets, although in the case of 1YTT it requires additional information that wasn't deposited.
2. Is it possible to include experimental phases during the rolling average refinement process and could this be beneficial (if the phases were of a sufficient quality)?
It is possible, but completely untested aside from verifying that it doesn't crash. I added this a year ago at the request of another user but haven't looked into it since.
3. What is the function of the "nproc" keyword? If this is the number of CPU cores that can be used in parallel, what is the most efficient way of using phenix.ensemble_refinement on a cluster?
The only parallelization is in the optimization of the ptls parameter - i.e. if you try N values for ptls, you can run N jobs at once. For a single ptls it will run in serial. So on a cluster, you are better off running N different jobs separately.
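Concretely, that advice amounts to submitting one independent job per ptls value rather than one multi-core job. The sketch below uses a placeholder submit function standing in for your queue command (qsub, sbatch, etc.) and placeholder file names, so it only prints what would be submitted.

```shell
# One job per ptls value, since phenix.ensemble_refinement only
# parallelizes across the ptls grid. 'submit' is a placeholder for
# your cluster's queue command (qsub, sbatch, ...).
submit() { echo "WOULD SUBMIT: $*"; }
for ptls in 0.6 0.7 0.8 0.9 0.95; do
  submit phenix.ensemble_refinement model.pdb data.mtz ptls=$ptls
done
```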
Finally, I noticed that I cannot run phenix.ensemble_refinement using a "my_parameters.eff" file; it is necessary to type the parameters on the command line.
That sounds like a bug...
-Nat
_______________________________________________ phenixbb mailing list [email protected] http://phenix-online.org/mailman/listinfo/phenixbb