Individual sites refinement for low-resolution data (~3.8A)
Dear phenixer,
Phenix suggested that "individual site" refinement is only suitable for high- or medium-resolution data (<3.5A). What is the alternative strategy for refining atom coordinates against low-resolution data?
Yu
On Fri, Mar 18, 2011 at 2:39 PM, Pavel Afonine
I have no idea where you read this ("Phenix suggested that "individual site" refinement is only suitable for high- or medium-resolution data (<3.5A)") - I would never say this.
With a proper set of restraints (Ramachandran plot restraints, secondary structure restraints, and reference model restraints if a reference model is available) and weights (tight enough geometry), you can refine individual coordinates (as well as B-factors) at this resolution.
The documentation for the GUI says that individual_sites "is almost always appropriate except at low-resolution (typically 3.5A or worse), unless additional restraints (NCS, reference model, etc.) are used." In retrospect, that was not the clearest way to state it, so I've modified the sentence to sound less negative. The intended message was the same as Pavel's: you usually need to add more restraints to make coordinate refinement behave well at low resolution.

However, I would recommend trying them in the opposite order: a high-resolution reference model is ideal, if one is available. (You should also use NCS restraints if NCS is present.) Ramachandran restraints should only be used as a last resort, at the end of refinement, after you've fixed as many problems as possible manually in Coot (with real-space refinement).

-Nat
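The restraint choices Nat describes are normally switched on through a phenix.refine parameter (.eff) file. The fragment below is only a sketch: the parameter names and scopes are written from memory and differ between Phenix versions, so treat them as assumptions and verify against `phenix.refine --show-defaults`.

```text
# Hypothetical .eff fragment: extra restraints for low-resolution
# coordinate refinement. Names are assumptions; check --show-defaults.
refinement {
  main {
    # secondary-structure (H-bond) restraints
    secondary_structure_restraints = True
  }
  reference_model {
    # high-resolution reference model, if one is available
    enabled = True
    file = reference_model.pdb   # placeholder file name
  }
}
```

NCS restraints (if NCS is present) and Ramachandran restraints (last resort, at the end of refinement) would be enabled the same way, via their own parameter scopes.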
Pavel,
I set 'wxc_scale = 0.1' and also included secondary structure restraints and a reference model. 'Individual sites' refinement still gives me a higher Rfree and a bigger gap between Rfree and Rwork for my dataset (~3.8A).
Is the 'wxc_scale = 0.1' not tight enough?
Thanks,
Hi Yu,
I set 'wxc_scale = 0.1' and also included secondary structure restraints and a reference model.
Using Ramachandran restraints may be a good idea at this resolution too.
'Individual sites' refinement still gives me a higher Rfree and a bigger gap between Rfree and Rwork for my dataset (~3.8A).
"Higher" and "bigger" than what? Your statement above does not contain any information for me to comment on.
Is the 'wxc_scale = 0.1' not tight enough?
If the default setting does not work for you, then you have two options:
1) let phenix.refine find optimal values for you automatically, using optimize_wxc=true and optimize_wxu=true; or, if that doesn't work either,
2) keep trying different values for wxc_scale and wxu_scale until you hit the ones that produce the results you like.

Pavel.
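Pavel's two options map onto the target-weight parameters of phenix.refine. Below is a sketch of the corresponding .eff fragment; the flag names optimize_wxc, optimize_wxu, wxc_scale, and wxu_scale are the ones he quotes, but the surrounding scope is an assumption, so verify with `phenix.refine --show-defaults`.

```text
refinement {
  target_weights {
    # Option 1: let phenix.refine search for optimal weights
    optimize_wxc = True
    optimize_wxu = True
    # Option 2: fix the weight scales and scan values by hand
    # (these are ignored when the optimize_* flags are on)
    wxc_scale = 0.1
    wxu_scale = 1.0
  }
}
```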
Hi,
I ran phenix.model_vs_data on several systems and found from the output file that the optimized ksol and bsol are 0. One example is the system of 3FUS. Any suggestions are appreciated!
Best Regards, Hailiang
Thanks Pavel! My phenix (1.6.4-486) generated the same R/Rfree as yours, but ended up with ksol and bsol both equal to 0. Very weird. Thanks anyway.

Hailiang
Hi Hailiang,
this is what one of my slides on validation is about (see page 29): http://www.phenix-online.org/presentations/latest/pavel_validation.pdf
I just ran phenix.model_vs_data on this PDB entry and I wonder what it took to deposit it at all, given:
- this poor geometry counts:
MolProbity statistics:
  Ramachandran plot, number of:
    outliers : 30 (10.14 %)
    allowed  : 58 (19.59 %)
    favored  : 208 (70.27 %)
  Rotamer outliers : 61 (22.43 %)
- such a suspicious Rfree-Rwork gap, most likely indicating under-refinement:
Model_vs_Data:
  r_work(re-computed) : 0.3479
  r_free(re-computed) : 0.3590
  bulk_solvent_(k_sol,b_sol) : 0.09 306.62
although, on the positive side, it is reproducible:
Information extracted from PDB file header:
  program_name : REFMAC
  year : 9
  r_work : 0.346
  r_free : 0.354
Yes, one may naively like such a small Rfree-Rwork gap, but it's good to realize that if you refine just one parameter for your whole structure, say a scale factor between Fobs and Fcalc, then your Rfree will not be much different from Rwork. It's all about proper parametrization and achieving refinement convergence...
Good luck! Pavel.
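Pavel's one-parameter example can be made concrete. The conventional R-factor is R = sum(| |Fobs| - k*|Fcalc| |) / sum(|Fobs|), where k is a single overall scale factor. A minimal sketch with made-up amplitudes (not data from 3FUS):

```python
# R-factor with one refinable parameter: an overall scale k between
# Fobs and Fcalc, the scenario Pavel describes. Refining only k
# cannot meaningfully separate Rwork from Rfree.

def r_factor(f_obs, f_calc, k=1.0):
    num = sum(abs(fo - k * fc) for fo, fc in zip(f_obs, f_calc))
    return num / sum(f_obs)

def overall_scale(f_obs, f_calc):
    # Least-squares estimate: k = sum(Fo*Fc) / sum(Fc^2)
    return (sum(fo * fc for fo, fc in zip(f_obs, f_calc))
            / sum(fc * fc for fc in f_calc))

# Synthetic amplitudes, for illustration only
f_obs = [100.0, 80.0, 60.0, 40.0]
f_calc = [90.0, 85.0, 55.0, 42.0]
k = overall_scale(f_obs, f_calc)
print(round(r_factor(f_obs, f_calc, k), 4))
```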
_______________________________________________ phenixbb mailing list [email protected] http://phenix-online.org/mailman/listinfo/phenixbb
Hi Hailiang,

I guess the only way to find out is to talk to the authors of that structure, hoping they can explain what happened to the data: why a structure with a reported ~70% solvent content refines to ksol=0 (= no solvent evidence in the data).

Pavel.
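For context, the flat bulk-solvent model (the source of the reported k_sol/b_sol pair) adds k_sol * exp(-B_sol * s^2 / 4) * Fmask to Fcalc, with s = 1/d. A minimal sketch with made-up numbers showing why k_sol = 0 means the solvent contributes nothing at any resolution:

```python
import math

# Flat bulk-solvent model:
#   F_model = F_calc + k_sol * exp(-B_sol * s^2 / 4) * F_mask,  s = 1/d
# If k_sol refines to 0, the solvent term vanishes for every
# reflection, i.e. the data carry no bulk-solvent signal.

def solvent_term(k_sol, b_sol, d_spacing, f_mask):
    s = 1.0 / d_spacing
    return k_sol * math.exp(-b_sol * s * s / 4.0) * f_mask

# Made-up values: typical k_sol ~ 0.35, B_sol ~ 46 A^2, |Fmask| = 100
print(round(solvent_term(0.35, 46.0, 10.0, 100.0), 2))  # strong at low resolution
print(round(solvent_term(0.35, 46.0, 2.0, 100.0), 2))   # weak at high resolution
print(solvent_term(0.0, 306.62, 10.0, 100.0))           # k_sol = 0: no contribution
```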
Hi there,
Can phenix generate the NCS masks just like CCP4-NCSMASK does?
Thanks!
Hailiang
Got it!
Hi Hailiang,
Yes, you can run phenix.autobuild with your data file, sequence file, model (if any), maps_only=True and ncs_output_mask_file=my_mask_file.map and ncs_file=my_ncs.ncs_spec and it should write out a ccp4-style my_mask_file.map file showing the ncs asymmetric unit (in addition to generating map coefficients for a density-modified map).
The my_mask_file.map will be in AutoBuild_run_1_/TEMP0/
I used these commands just now to test it out (takes about 10 minutes; you can do the same if you want because the data for the regression tests are in your phenix installation already):
phenix_regression.wizards.test_command_line_ncs test_find_ncs_from_density
cd test_find_ncs_from_density/
phenix.autobuild maps_only=true ncs_file=find_ncs.ncs_spec \
  data=cycle_best_1.mtz seq_file=sequence.dat \
  ncs_output_mask_file=my_mask_file.map
(The first two commands are just to set up some data that has NCS in it and to find that NCS; in this case 6 NCS copies. You presumably have done both of these already, so you can go right to the autobuild maps_only step.)
Looking at my_mask_file.map and resolve_work.mtz in the resulting AutoBuild_run_1_/TEMP0/ directory, you can see that one asymmetric unit of the NCS is within the region defined by my_mask_file.map.
I hope that helps! -Tom T
Hi,
I want to run phenix.refine on a glycoprotein, but the sugars are not in the standard library. I just wonder whether there is a refmac-style syntax (LIBIN) in phenix.refine to include a user-written ligand library, instead of adding and modifying the existing phenix library.
Thanks!
Best Regards, Hailiang
Hi,
Actually I have generated the sugar cif using phenix.elbow (phenix is pretty handy at this:-), but not sure how to generate the link information...
As I mentioned in a previous email, please use custom bonds for this: they allow you to define a covalent bond between any two selected atoms. You can define as many such bonds as you wish, and you can define angles too. Please note, phenix.refine does not use or output LINK records at all.

Pavel.
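For the sugar-protein link itself, the custom-bond mechanism takes a parameter-file block along the lines of the sketch below. The atom selections and target values are placeholders for a typical ASN-NAG link; see the custom-bonds section of the refinement documentation linked in this thread for the authoritative syntax.

```text
# Hypothetical geometry_restraints.edits fragment for an .eff file;
# selections and distances are placeholders, adjust to your model.
refinement.geometry_restraints.edits {
  bond {
    action = *add
    atom_selection_1 = chain A and resseq 52 and name ND2    # ASN side chain
    atom_selection_2 = chain A and resseq 201 and name C1    # NAG sugar
    distance_ideal = 1.45    # placeholder target length in A
    sigma = 0.02
  }
}
```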
Hi,

In ccp4-refmac, there is a "BREF OVER" mechanism to refine the overall B-factor (Boverall, obtained from scaling, is added to the atomic B values; see http://www.ccp4.ac.uk/html/refmac5/keywords/xray-principal.html#refi_bref). I am just wondering whether in phenix.refine we could do this after positional and TLS refinement. It seems to me that Boverall corresponds to the overall anisotropic scaling factor and will be refined prior to any positional/B-factor refinement in phenix.refine by default. I may be wrong, but if not, my question is whether we could do it afterward.

Thanks!

Best Regards, Hailiang
Hi Hailiang,
This is described in great detail here: http://www.phenix-online.org/newsletter/CCN_2010_07.pdf (see the article "On atomic displacement parameters ..."). In this article it's called Ucryst. I guess this is more or less what is called Boverall in Refmac.

Pavel.
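The bookkeeping in that article writes the total atomic displacement parameter as a sum of contributions, U_total = U_cryst + U_group (e.g. TLS) + U_local, with B = 8 * pi^2 * U converting between B-factors and U values. A small sketch with made-up numbers (the U values are illustrative only):

```python
import math

# ADP decomposition: U_total = U_cryst + U_group + U_local.
# U_cryst is the overall (crystal) contribution from anisotropic
# scaling, roughly what Refmac's Boverall adds to atomic B values.

def u_to_b(u):
    # Standard conversion between U (A^2) and B-factor: B = 8 * pi^2 * U
    return 8.0 * math.pi ** 2 * u

def total_b(u_cryst, u_group, u_local):
    return u_to_b(u_cryst + u_group + u_local)

# Illustrative (made-up) U contributions in A^2
print(round(total_b(0.05, 0.20, 0.15), 1))
```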
Hi Hailiang,

I guess to be able to run phenix.refine you need to follow phenix.refine style -;) Most likely using phenix.elbow or phenix.ready_set and defining custom bonds and angles as described here: http://phenix-online.org/documentation/refinement.htm#anch86 should be enough to do what you want. If not, then please send me the inputs and tell me what exactly you are trying to do, and I will send you back a working example.

Pavel.
participants (5)
- Nathaniel Echols
- Pavel Afonine
- Thomas C. Terwilliger
- Zhang yu
- zhangh1@umbc.edu