Search results for query "look through"
- 527 messages
Re: [phenixbb] phaser MR
by Francis E Reyes
Shya
I'm curious as to how you resolved this problem. To quote, you applied
a twin law for the P 42 22 space group? I wasn't aware that there were
twin laws for this space group. How did you determine the twin law?
Thanks
FR
On Jul 24, 2009, at 9:15 AM, sbiswas2(a)ncsu.edu wrote:
> Hi,
> Thanks for your response. So I did one cycle of refinement, and indeed
> the Rwork goes down by 5 points when I apply the twin law in the P4222
> space group, and when I look at the map the clashes that were present
> before are no longer there. I will try to scale it in P4 or P422 and see
> how it looks. I used HKL2000 to scale the data and have no idea if it
> will make a difference to use SCALA.
> This is what I got from phenix xtriage:
> These values look in between untwinned and perfect twin
> Acentric reflections
> <I^2>/<I>^2 :1.980 (untwinned: 2.000; perfect twin 1.500)
> <F>^2/<F^2> :0.797 (untwinned: 0.785; perfect twin 0.885)
> <|E^2 - 1|> :0.727 (untwinned: 0.736; perfect twin 0.541)
> Centric reflections
> <I^2>/<I>^2 :2.863 (untwinned: 3.000; perfect twin 2.000)
> <F>^2/<F^2> :0.673 (untwinned: 0.637; perfect twin 0.785)
> <|E^2 - 1|> :0.925 (untwinned: 0.968; perfect twin 0.736)
>
> -----------------------------------------------
> | Z | Nac_obs | Nac_theo | Nc_obs | Nc_theo |
> -----------------------------------------------
> | 0.0 | 0.000 | 0.000 | 0.000 | 0.000 |
> | 0.1 | 0.074 | 0.095 | 0.208 | 0.248 |
> | 0.2 | 0.164 | 0.181 | 0.319 | 0.345 |
> | 0.3 | 0.246 | 0.259 | 0.391 | 0.419 |
> | 0.4 | 0.322 | 0.330 | 0.452 | 0.474 |
> | 0.5 | 0.388 | 0.394 | 0.505 | 0.520 |
> | 0.6 | 0.445 | 0.451 | 0.552 | 0.561 |
> | 0.7 | 0.497 | 0.503 | 0.592 | 0.597 |
> | 0.8 | 0.541 | 0.551 | 0.630 | 0.629 |
> | 0.9 | 0.587 | 0.593 | 0.659 | 0.657 |
> | 1.0 | 0.631 | 0.632 | 0.679 | 0.683 |
> -----------------------------------------------
> | Maximum deviation acentric : 0.021 |
> | Maximum deviation centric : 0.040 |
> | |
> | <NZ(obs)-NZ(twinned)>_acentric : -0.009 |
> | <NZ(obs)-NZ(twinned)>_centric : -0.014 |
>
> Thanks again for the valuable input,
>
> Shya
>
>
>
>
> Shya,
>>
>> Did phaser complain that the asymmetric unit was too full? How do
>> the
>> self rotation maps look? Are the crystallographic peaks exact or off
>> by a few degrees (your resolution data may make it difficult to see
>> this)? How do the N(z) cumulative intensity distributions look (make
>> sure to calculate this with thin resolution bins, i.e. increase BINS
>> in Scala I think)? Does your data look sigmoidal on this plot?
>>
>> Perfect twinning or an NCS that's close to a crystallographic axis is
>> difficult to diagnose from merged intensity statistics and even more
>> difficult with resolution worse than 2.5 A. I recommend Dauter Acta
>> Cryst (2003) D59 2004-2016 for a good discussion of this.
>>
>> Your space group might be too high. See the subgroups of P422 at
>> http://cci.lbl.gov/~phzwart/p422_2.png
>> . Reintegrate and merge the data in each space group, MR a single
>> copy
>> of your model (let phaser complete the ASU) and compare the Rpim's
>> (from scaling/merging) and the Rwork/Rfree from a rigid body refine
>> without NCS, with NCS, with appropriate twin laws, and with twin laws
>> + NCS. No need to do a full refinement just yet. Allow phenix.refine
>> to create the Rfree flags. Choose the space group which gives the
>> best
>> statistics.
>>
>> I recently had a case (Hardin, Reyes, Batey J. Biol. Chem., Vol. 284,
>> Issue 22, 15317-15324, May 29, 2009) of a protein that merged into
>> P422 but was difficult to refine in that space group. I brought it
>> back to P4 and refined with NCS+twin to give more reasonable Rwork/
>> Rfree (5-7% difference from the P422 to P4).
>>
>> HTH,
>>
>> FR
>>
>>
>>
>> On Jul 23, 2009, at 3:54 PM, sbiswas2(a)ncsu.edu wrote:
>>
>>> Hi Francis,
>>> Thanks for your response. The Matthews coefficient suggests two
>>> molecules
>>> in the AU. Phaser also finds two molecules. I ran the dataset
>>> through
>>> phenix xtriage it did not indicate twinning though. The molecule
>>> also
>>> exists in nature as a monomer.
>>> Shya
>>>
>>>
>>>> Twinning? What does your Matthews coefficient say? Do you know if
>>>> your structure is a multimer (biochemistry, etc.)? Does it agree
>>>> with the Matthews coefficient?
>>>>
>>>> If the unit cell is not big enough to hold all of the contents,
>>>> then this is an indicator for twinning.
>>>>
>>>> FR
>>>>
>>>> On Jul 23, 2009, at 3:09 PM, sbiswas2(a)ncsu.edu wrote:
>>>>
>>>>> Hi all,
>>>>>
>>>>> I was trying to solve a structure by molecular replacement. I
>>>>> scaled the data in the P4222 space group (resolution 2.7 A) with
>>>>> two molecules in the asymmetric unit (molecules A and B). I ran
>>>>> phaser with my model and got a Z-score of 5.1. When I look at the
>>>>> map that I got from phaser I can easily see good electron density
>>>>> for both molecules. However, upon inspection of the electron
>>>>> density map there were considerable interactions or clashes
>>>>> between molecule B and a symmetry mate; molecule A, however, had
>>>>> no clashes with the symmetry atoms. I was wondering if anyone
>>>>> knows how to resolve this. Could it be a problem with the space
>>>>> group? The statistics are good for space group P4222 and the
>>>>> I/sigI was good to 2.7 A.
>>>>> Any advice is appreciated,
>>>>> Shya
>>>>>
>>>>> _______________________________________________
>>>>> phenixbb mailing list
>>>>> phenixbb(a)phenix-online.org
>>>>> http://www.phenix-online.org/mailman/listinfo/phenixbb
>>>>
>>>> ---------------------------------------------
>>>> Francis Reyes M.Sc.
>>>> 215 UCB
>>>> University of Colorado at Boulder
>>>>
>>>> gpg --keyserver pgp.mit.edu --recv-keys 67BA8D5D
>>>>
>>>> 8AE2 F2F4 90F7 9640 28BC 686F 78FD 6669 67BA 8D5D
>>>>
>>>>
>>>
>>
>>
>
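The second-moment statistics in the xtriage output quoted above can be reproduced from an array of merged intensities. Below is a minimal NumPy sketch, not the actual xtriage code; real xtriage normalizes within thin resolution bins, which this global version skips:

```python
import numpy as np

def twin_moments(intensities):
    """Second-moment twinning statistics for acentric data.
    Untwinned expectations: <I^2>/<I>^2 = 2.000, <F>^2/<F^2> = 0.785,
    <|E^2-1|> = 0.736; a perfect twin gives 1.500, 0.885, 0.541."""
    I = np.asarray(intensities, dtype=float)
    I = I[I > 0]                      # drop non-positive intensities
    F = np.sqrt(I)                    # amplitudes
    z = I / I.mean()                  # normalized intensities E^2
    return {
        "<I^2>/<I>^2": (I ** 2).mean() / I.mean() ** 2,
        "<F>^2/<F^2>": F.mean() ** 2 / (F ** 2).mean(),
        "<|E^2-1|>": np.abs(z - 1.0).mean(),
    }

# Acentric untwinned intensities follow an exponential (Wilson)
# distribution, so simulated data reproduces the untwinned column.
sim = np.random.default_rng(0).exponential(scale=1.0, size=500000)
print(twin_moments(sim))
```

Values falling between the untwinned and perfect-twin columns, as in the post above, are what suggest partial twinning.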
[phenixbb] Re: inconsistency of Molprobity results between RealSpaceRefine and ValidationCryoEM
by Pavel Afonine
Hi Kevin,
additionally, check to see whether this behavior persists in the latest
Phenix, say 2.0-5824. You are using an 'ancient' version, and I had the
impression that I fixed something relevant to this some time ago.
Also, Nigel's CDL vs no-CDL reasoning can explain the differences in
bond/angle deviations, but it still does not explain the discrepancies in
the other metrics (Ramachandran, clashes, rotamers).
Pavel
On 9/25/25 11:37, Nigel Moriarty wrote:
> Kevin
>
> All of this arises from a simple miscommunication. The default restraints
> for refinement in Phenix are from the Conformation-Dependent Library
> (CDL). It basically changes the ideal values for the protein backbone
> based on the phi/psi angles. While MolProbity in Phenix uses the same
> libraries, it appears the communication has broken, so it's not using the
> CDL but rather the single-value restraints. As a result, the .geo file
> (which is the gold standard) is correct and the MolProbity report is
> slightly wrong. Since you are a power user (you look in the .geo file),
> I suggest you check the .geo file until I correct this.
>
> Cheers
>
> Nigel
>
> ---
> Nigel W. Moriarty
> Building 91, Molecular Biophysics and Integrated Bioimaging
> Lawrence Berkeley National Laboratory
> Berkeley, CA 94720-8235
> Email : NWMoriarty(a)LBL.gov
> Web : CCI.LBL.gov
> ORCID : orcid.org/0000-0001-8857-9464
>
>
> On Tue, Sep 23, 2025 at 12:20 PM Kevin M Jude <kjude(a)stanford.edu> wrote:
>
>> I notice inconsistencies in the Molprobity results reported for the same
>> structure from RealSpaceRefine and from ValidationCryoEM. As an example,
>> for bond angle restraints, RSR reports that there are four >4sigma outliers
>> (but does not list the outliers), while ValidationCryoEM reports 0.
>>
>> Metric                         RSR     CryoEMValidation
>> Number of restraints           13228   13228
>> RMS(deviation)                 0.509   0.51
>> Max. deviation                 9.997   9.998
>> Min. deviation                 0       0
>> Number of outliers > 4sigma    4       0
>>
>> Browsing through the .geo file, I don’t see any >4 sigma deviations among
>> the highest residual angles, so I think this is an error.
>>
>> I also see small discrepancies in Ramachandran favored, rotamer outliers,
>> and clash score which seem to come down to minuscule differences in
>> measurements at the margins.
>>
>> This is for Phenix 1.21.2-5419. Happy to share files offline.
>>
>>
>> --
>>
>> Kevin Jude, PhD
>>
>> Structural Biology Research Specialist, Garcia Lab
>>
>> Howard Hughes Medical Institute
>>
>> Stanford University School of Medicine
>>
>> Beckman B177, 279 Campus Drive, Stanford CA 94305
Re: [phenixbb] measuring the angle between two DNA duplexes
by Tim Gruene
Hi Pavel,
that's the method described in
http://journals.iucr.org/a/issues/2011/01/00/sc5036/index.html ;-) based
on the moments of inertia (a computer scientist might name it
differently). I am not sure, though, that you would get the desired
result for short helices. E.g. for a helix defined by three atoms, the
principal eigenvector would point roughly in the direction of the
external phosphates, which is far from parallel to the helix axis.
Best,
Tim
On 01/21/2014 04:20 AM, Pavel Afonine wrote:
> Hi Ed,
>
> interesting idea! Although I was thinking of a tool that is a
> little more general and a little less context dependent. Say you have
> two clouds of points that are (thinking in terms of macromolecules) two
> alpha helices (for instance), and you want to know the angle between the
> axes of the two helices. How would I approach this?..
>
> First, for each helix I would compute a symmetric 3x3 matrix like this:
>
> sum(xn-xc)**2        sum(xn-xc)*(yn-yc)   sum(xn-xc)*(zn-zc)
> sum(xn-xc)*(yn-yc)   sum(yn-yc)**2        sum(yn-yc)*(zn-zc)
> sum(xn-xc)*(zn-zc)   sum(yn-yc)*(zn-zc)   sum(zn-zc)**2
>
> where (xn,yn,zn) is the coordinate of nth atom, the sum is taken over
> all atoms, and (xc,yc,zc) is the coordinate of the center of mass.
>
> Second, for each of the two matrices I would find its eigen-values and
> eigen-vectors, and select eigen-vectors corresponding to largest
> eigenvalues.
>
> Finally, the desired angle is the angle between the two eigen-vectors
> found above, which is computed trivially.
> I think this is a little simpler than finding the best fit for a 3D line.
>
> What you think?
>
> Pavel
>
>
> On 1/20/14, 2:14 PM, Edward A. Berry wrote:
>>
>>
>> Pavel Afonine wrote:
>> . .
>>
>>> The underlying procedure would do the following:
>>> - extract two sets of coordinates of atoms corresponding to two
>>> provided atom selections;
>>> - draw two optimal lines (LS fit) passing through the above sets
>>> of coordinates;
>>> - compute and report angle between those two lines?
>>>
>>
>> This could be inaccurate for very short helices (admittedly not the
>> case where one usually would be looking for angles), or for determining
>> the axis of a short portion of a curved helix. A more accurate way to
>> determine the axis: have a long canonical duplex constructed with its
>> axis along Z (0,0,1). Superimpose as many residues of that as required
>> on the duplex being tested, using only backbone atoms or even only
>> phosphates. Operate on (0,0,1) with the resulting operator (i.e. take
>> the third column of the rotation matrix) and use that as a vector
>> parallel to the axis of the duplex being tested.
>
>
--
Dr Tim Gruene
Institut fuer anorganische Chemie
Tammannstr. 4
D-37077 Goettingen
GPG Key ID = A46BEE1A
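Pavel's recipe above — build the 3x3 second-moment matrix for each atom selection, take the eigenvector with the largest eigenvalue, and measure the angle between the two eigenvectors — fits in a few lines of NumPy. The helper names are hypothetical, not a Phenix tool:

```python
import numpy as np

def principal_axis(xyz):
    """Eigenvector of the 3x3 second-moment matrix (the matrix Pavel
    writes out) corresponding to the largest eigenvalue."""
    xyz = np.asarray(xyz, dtype=float)
    c = xyz - xyz.mean(axis=0)        # shift center of mass to the origin
    m = c.T @ c                       # symmetric 3x3 matrix of sums
    w, v = np.linalg.eigh(m)          # eigenvalues in ascending order
    return v[:, -1]                   # axis of greatest extent

def axis_angle_deg(sel_a, sel_b):
    """Angle between the principal axes of two atom selections.
    The axes have no sign, so take |cos| before arccos."""
    cosang = abs(np.dot(principal_axis(sel_a), principal_axis(sel_b)))
    return np.degrees(np.arccos(np.clip(cosang, 0.0, 1.0)))

# Two straight point clouds 45 degrees apart:
a = np.outer(np.linspace(-2.0, 2.0, 9), [0.0, 0.0, 1.0])
b = np.outer(np.linspace(-2.0, 2.0, 9), [0.0, 1.0, 1.0]) / np.sqrt(2.0)
print(axis_angle_deg(a, b))  # about 45 degrees
```

As Tim and Ed point out, this breaks down for very short or strongly curved helices, where the axis of greatest extent need not coincide with the helix axis.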
Re: [phenixbb] Molecular replacement at low resolution
by mohamed noor
Dear all
After some fiddling with different software, I managed to get a solution
from BALBES in CCP4.
As some suggested, the self-rotation function indicated only 4 NCS copies
in the asu, not the 8 suggested by the Matthews coefficient calculation.
In hindsight, the resulting 75 % solvent content could be the cause of the
rather poor diffraction.
By using MR Rosetta with the BALBES model, I have something better (R of
32/37 % without adding all the heme ligands yet and some missing atoms not
built by AutoBuild).
A request to Phenix developers: When I clicked the Run phenix.refine and
Validation buttons in the GUI, it loaded the file overall_best_one.pdb into
the GUI which has only one chain. I only noticed this when looking at the
50 % R factor compared to the MR Rosetta log file.
Thanks again.
On Sat, May 2, 2015 at 7:00 PM, Gino Cingolani <Gino.Cingolani(a)jefferson.edu
> wrote:
> always check k=180 (2-fold ncs) in addition to whatever high symmetry
> ncs-axis you think is in your data (e.g. k=45, etc).
> There's no guarantee it will work. It's an experimental approach to
> attempt for difficult MR cases.
>
>
> ******************************************************************************
> Gino Cingolani, Ph.D.
> Professor
> Thomas Jefferson University
> Dept. of Biochemistry & Molecular Biology
> 233 South 10th Street - Room 826
> Philadelphia PA 19107
> Tel: (215) 503 4573
> Website: http://www.cingolanilab.org
>
> ******************************************************************************
> "Nati non foste per viver come bruti, ma per seguir virtute e canoscenza"
> ("You were not born to live like brutes, but to follow virtue and
> knowledge")
> Dante, The Divine Comedy (Inferno, XXVI, vv. 119-120)
>
>
> ________________________________________
> From: phenixbb-bounces(a)phenix-online.org <
> phenixbb-bounces(a)phenix-online.org> on behalf of Engin Özkan <
> eozkan(a)uchicago.edu>
> Sent: Saturday, May 02, 2015 1:09 PM
> To: phenixbb(a)phenix-online.org
> Subject: Re: [phenixbb] Molecular replacement at low resolution
>
> On 5/2/15 11:18 AM, Gino Cingolani wrote:
> > If you don't see anything at k=45, then you don't have 8 copies in the
> > asu.
> While checking self-rotation functions is a great idea, I would shy away
> from such categorical assertions, as self-rotation functions are not
> always cleanly interpretable (despite GLRF, which is my favorite as
> well), and an eight-fold NCS might not be due to an eight-fold rotation:
> you can have eight molecules in different arrangements (such as two C4
> "tetramers"), with improper NCS, or through translational NCS.
>
> Engin
>
Re: [phenixbb] Unmasking cavities
by Morten Grøftehauge
Hi guys,
I'm a bit confused by this answer.
I get the "add dummy atoms and calculate map" to check whether it is Fourier
truncation ripples (which I don't think it will turn out to be).
But I wouldn't feel comfortable depositing a structure with dummy atoms even
if they do have zero occupancy. Are you really suggesting that people do
that?
Secondly, when I look in the .def for my refinements I find two entries for
mask calculation:
Under the fake_f_obs heading
mask {
solvent_radius = 1.11
shrink_truncation_radius = 0.9
grid_step_factor = 4
verbose = 1
mean_shift_for_mask_update = 0.1
ignore_zero_occupancy_atoms = True
ignore_hydrogens = True
}
And again under its own heading towards the end
mask {
solvent_radius = 1.11
shrink_truncation_radius = 0.9
grid_step_factor = 4
verbose = 1
mean_shift_for_mask_update = 0.1
ignore_zero_occupancy_atoms = True
ignore_hydrogens = True
}
Which one is relevant? Also why didn't any of you suggest the
optimize_mask=true parameter? Shouldn't that automatically find the best
solvent_radius and shrink_truncation_radius values?
Sorry if these are dumb questions (and sorry that there are so many) but I
was just really confused by these answers.
Sincerely,
Morten Grøftehauge
2008/10/4 Pavel Afonine <PAfonine(a)lbl.gov>
> Hi Frank,
>
> I just want to add to Ralf's very comprehensive reply... The parameters
> solvent_radius, shrink_truncation_radius and grid_step_factor are
> explained in the original paper:
>
> Jiang, J.-S. & Brünger, A. T. (1994). J. Mol. Biol. 243, 100-115.
> "Protein hydration observed by X-ray diffraction. Solvation properties
> of penicillopepsin and neuraminidase crystal structures."
>
> The details of PHENIX implementation of this are described here:
>
> P.V. Afonine, R.W. Grosse-Kunstleve & P.D. Adams. Acta Cryst. (2005).
> D61, 850-855. "A robust bulk-solvent correction and anisotropic scaling
> procedure"
>
> Also, the negative peaks you observe can easily be Fourier series
> truncation ripples. I think Ralf's suggestion to place some dummy atoms
> there with zero occupancy is a good idea. I wouldn't even do any
> refinement (since moving atoms may cancel these artifacts), but just
> compute two maps - with and w/o the dummy atoms and see what happens to
> these negative peaks.
>
> Cheers,
> Pavel.
>
>
> On 9/28/2008 3:25 PM, Frank von Delft wrote:
> > Hi
> >
> > After being through phenix.refine, I see in my hydrophobic core a big
> > space (a few atoms wide) that is filled with strong negative difference
> > density. I suspect the culprit is the bulk solvent mask, which is
> > defined too tightly.
> >
> > The online manual mentions three parameters, but not what they do.
> > solvent_radius,
> > shrink_truncation_radius,
> > grid_step_factor
> >
> > What *exactly* do they do?
> >
> > (I thought I'd elicit a contribution for the online docs this way :)
> > Cheers
> > phx
> >
>
--
Morten K Grøftehauge
PhD student
Department of Molecular Biology
Gustav Wieds Vej 10 C
8000 Aarhus C - Denmark
Phone: +45 89 42 52 61
Fax: +45 86 12 31 78
www.bioxray.dk
Re: [phenixbb] matplotlib
by Francois Berenger
On 08/19/2011 03:29 AM, Francis E Reyes wrote:
> Since we're talking about phenix distributions....
>
>
> <flamebait>
> So when are we going to see phenix on the App Store?
> I hear the next version of OS X will only run binaries signed by Apple.
Well, let's not accept this business model.
> </flamebait>
>
> F
>
> On Aug 18, 2011, at 12:06 PM, Nathaniel Echols wrote:
>
>> On Thu, Aug 18, 2011 at 10:39 AM, Ed Pozharski<epozh001(a)umaryland.edu> wrote:
>> In a nutshell, phenix gui may in some circumstances screw up other
>> programs that use matplotlib.
>>
>> It's not the fault of Phenix; matplotlib is unusually inflexible in how it deals with these cache files, and it is nearly unique among Python modules in its dependence on writing to the user's home directory. This isn't the only problem; another issue is that matplotlib creates these caches, and the maintainers appear to never have considered what would happen if the cached directory were removed. So when you run the GUI in version X of Phenix, then remove version X and install version Y, it will still look for the fonts installed with version X. I complained about this last December, and it remains unsolved in any of the official releases (one of which was this year).
>>
>> The likely cause is that the phenix gui calls on the bundled matplotlib, which
>> is different from the one I have installed (not to mention that I am using
>> Lucid (because it's LTS) which has python 2.6 and not the 2.7 that is
>> bundled with phenix). However, it still writes into the same
>> ~/.matplotlib folder, thus I end up with incompatible data. Certainly,
>> the problem will be gone when matplotlib gets bumped up to 1.0.1 in next
>> Ubuntu release.
>>
>> The issue with removing installations will remain, however. You could avoid the incompatibility problem by running "phenix.wxpython" if you need to use matplotlib. (We're using Python 2.7.2 right now, and generally update to the latest release in the 2.x series shortly after it comes out.)
>>
>> This is yet another example of why the standalone installation approach
>> is ideologically objectionable on modern Linux. But of course, the
>> practical advantage gained by not having to package the software for any
>> possible OS flavor/version users may choose outweighs the lower risks of
>> package incompatibility and the reduced size of the packaged product.
>>
>> We don't have the resources to support a more ideologically pure distribution mechanism - the installers are maintained by me and Ralf in between other projects. Also, we often depend on new features in the various dependencies that would not be immediately available through the package managers (for instance, we switched to Python 2.6 almost immediately because I needed the multiprocessing module). There are many things in the current installers that I'm unhappy with, but they don't take very much time to maintain, which is essential.
>>
>> -Nat
>
>
>
>
>
>
>
>
Re: [phenixbb] Using LigandFit to identify unknown density
by Pavel Afonine
Hi Maia,
first, I agree with Peter - the B-factor restraints should help, indeed.
Second, I think we discussed this subject already on November 25, 2009:
Subject: Re: [phenixbb] occupancy refinement
Date: 11/25/09 7:38 AM
and I believe I didn't change my mind about it since that. I'm appending
that email conversation to the bottom of this email.
Overall, if you get good 2mFo-DFc map and clear residual mFo-DFc map,
and ligand's B-factors are similar or slightly larger than those of
surrounding atoms, and refined occupancy looks reasonable, then I think
you are fine.
Pavel.
On 1/27/10 2:05 PM, Maia Cherney wrote:
> Hi Pavel,
>
> I have six ligands at partial occupancies in my structure. Simultaneous
> refinement of occupancies and B factors in phenix gives a value of 0.7
> for the ligand occupancy, which looks reasonable.
> How can phenix perform such a refinement, given that occupancies and
> B factors are highly correlated? Indeed, you can increase/decrease
> the ligand occupancies while simultaneously increasing/decreasing
> their B factors without changing the R factor. What criteria does
> phenix use in such a refinement if the R factor does not tell much?
>
> Maia
******* COPY (11/25/09)************
On 11/25/09 7:38 AM, Maia Cherney wrote:
> Hi Pavel,
>
> It looks like all different refined occupancies starting from different
> initial occupancies converged to the same number upon going through very
> many cycles of refinement.
>
> Maia
>
>
> Pavel Afonine wrote:
>
>> Hi Maia,
>>
>> the atom parameters, such as occupancy, B-factor and even position, are
>> interdependent in some sense. That is, if you have a somewhat incorrect
>> occupancy, then B-factor refinement may compensate for it; if you
>> misplaced an atom, the refinement of its occupancy and/or B-factor will
>> compensate for this. Note in all the above cases the 2mFo-DFc and
>> mFo-DFc maps will appear almost identical, as well as R-factors.
>>
>> So, I think your goal of finding a "true" occupancy is hardly achievable.
>>
>> Although, I think you can approach it by doing very many refinements
>> (say, several hundreds) (where you refine occupancies, B-factors and
>> coordinates) each refinement starting with different occupancy and
>> B-factor values, and make sure that each refinement converges. Then
>> select a subset of refined structures with similar and low R-factors
>> (discard those cases where refinement got stuck for whatever reason
>> and R-factors are higher) (and probably similar looking 2mFo-DFc and
>> mFo-DFc maps in the region of interest). Then see where the refined
>> occupancies and B-factors are clustering, and the averaged values will
>> probably give you an approximate values for occupancy and B. I did not
>> try this myself but always wanted to.
>>
>> If you have a structure consisting of 9 carbons and one gold atom,
>> then I would expect that the "second digit" in gold's occupancy would
>> matter. However, if we speak about dozen of ligand atoms (which are
>> probably a combination of C,N,O) out of a few thousands of atoms of
>> the whole structure, then I would not expect the "second digit" to be
>> visibly important.
>>
>> Pavel.
>>
>>
>> On 11/24/09 8:08 PM, chern wrote:
>>
>>> Thank you Kendall and Pavel for your responses.
>>> I really want to determine the occupancy of my ligand. I saw one
>>> suggestion to try different refinements with different occupancies
>>> and compare the B-factors: the occupancy whose B-factor is at the
>>> level of the average protein B-factors is the "true" occupancy.
>>> I also noticed the dependence of the refined ligand occupancy on the
>>> initial occupancy. I saw differences of 10 to 15%, which is why I am
>>> wondering if the second digit after the decimal point makes any sense.
>>> Maia
>>>
>>> ----- Original Message -----
>>> *From:* Kendall Nettles <mailto:[email protected]>
>>> *To:* PHENIX user mailing list <mailto:[email protected]>
>>> *Sent:* Tuesday, November 24, 2009 8:22 PM
>>> *Subject:* Re: [phenixbb] occupancy refinement
>>>
>>> Hi Maia,
>>> I think the criteria for occupancy refinement of ligands is
>>> similar to a decision to add an alt conformation for an amino
>>> acid. I don’t refine occupancy of a ligand unless the difference
>>> map indicates that we have to. Sometimes part of the igand may be
>>> conformationally mobile and show poor density, but I personally
>>> don’t think this justifies occupancy refinement without evidence
>>> from the difference map. I agree with Pavel that you shouldn’t
>>> expect much change in overall statistics, unless the ligand has
>>> very low occupancy., or you have a very small protein. We
>>> typically see 0.5-1% difference in R factors from refining with
>>> ligand versus without for nuclear receptor ligand binding domains
>>> of about 250 amino acids, and we see very small differences from
>>> occupancy refinement of the ligands.
>>>
>>> Regarding the error, I have noticed differences of 10 percent in
>>> occupancy depending on where you set the starting occupancy before
>>> refinement. That is, if the starting occupancy starts at 1, you
>>> might end up with 50%, but if you start it at 0.01, you might get
>>> 40%. I don’t have the expertise to explain why this is, but I
>>> also don’t think it is necessarily important. I think it is more
>>> important to convince yourself that the ligand binds how you
>>> think it does. With steroid receptors, the ligand is usually
>>> planar, and tethered by hydrogen bonds on two ends. That leaves
>>> us with four possible poses, so if in doubt, we will dock in
>>> the ligand in all of the four orientations and refine. So far, we
>>> have had only one of several dozen structures where the ligand
>>> orientation was not obvious after this procedure. I worry about a
>>> letter to the editor suggesting that the electron density for the
>>> ligand doesn’t support the conclusions of the paper, not whether
>>> the occupancy is 40% versus 50%.
>>>
>>> You might also want to consider looking at several maps, such as
>>> the simple or simulated annealing composite omit maps. These can
>>> be noisy, so also try the kicked maps
>>> (http://www.phenix-online.org/pipermail/phenixbb/2009-September/002573.html),
>>> which I have become a big fan of.
>>>
>>> Regards,
>>> Kendall Nettles
>>>
>>> On 11/24/09 3:07 PM, "chern(a)ualberta.ca" <chern(a)ualberta.ca> wrote:
>>>
>>> Hi,
>>> I am wondering what is the criteria for occupancy refinement of
>>> ligands. I noticed that R factors change very little, but the
>>> ligand
>>> B-factors change significantly . On the other hand, the
>>> occupancy is
>>> refined to the second digit after the decimal point. How can
>>> I find
>>> out the error for the refined occupancy of ligands?
>>>
>>> Maia
>>>
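Pavel's multi-start protocol from the appended conversation (very many refinements from different starting occupancies and B-factors, keep only the runs that converged to similar low R-factors, then average) can be sketched generically. Here `refine_one` is a hypothetical stand-in for a real phenix.refine run, not a Phenix API:

```python
import random
import statistics

def multistart_occupancy(refine_one, n_trials=200, r_window=0.005, seed=0):
    """Run many refinements from random starting occupancies/B-factors,
    discard runs whose R-factor is not within r_window of the best
    (refinements that got stuck), and average the survivors."""
    rng = random.Random(seed)
    runs = [refine_one(rng.uniform(0.1, 1.0), rng.uniform(10.0, 80.0))
            for _ in range(n_trials)]          # each run -> (occ, B, R)
    best_r = min(r for _, _, r in runs)
    kept = [(occ, b) for occ, b, r in runs if r <= best_r + r_window]
    return (statistics.mean(occ for occ, _ in kept),
            statistics.mean(b for _, b in kept))

# Toy stand-in that always converges to occ = 0.70, B = 40, R = 0.25:
def fake_refine(occ0, b0):
    return 0.70, 40.0, 0.25

print(multistart_occupancy(fake_refine, n_trials=50))  # (0.7, 40.0)
```

With a real refinement engine behind `refine_one`, the spread of the kept occupancies also gives a rough error bar of the kind Maia asks about.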
Re: [phenixbb] phenix autobuild question
by Peter Zwart
Hi,
Try this (substring matching in action):
phenix.xtriage ab_xds_pointless_scala5.mtz obs_labels=+
or
phenix.xtriage ab_xds_pointless_scala5.mtz obs=M
HTH
P
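Peter's flags work because xtriage accepts any unambiguous substring of a label string. A toy illustration of the matching logic (a hypothetical helper, not the actual phenix code):

```python
def match_label(choice, candidates):
    """Pick the unique candidate label string containing `choice`."""
    hits = [c for c in candidates if choice in c]
    if len(hits) != 1:
        raise ValueError("%r matches %d of %d label strings; "
                         "use a more specific substring"
                         % (choice, len(hits), len(candidates)))
    return hits[0]

# The two choices xtriage reported in the quoted error message:
labels = [
    "IMEAN_XDSdataset,SIGIMEAN_XDSdataset",
    "I_XDSdataset(+),SIGI_XDSdataset(+),"
    "I_XDSdataset(-),SIGI_XDSdataset(-),merged",
]
print(match_label("+", labels))      # selects the anomalous array
print(match_label("IMEAN", labels))  # selects the merged-mean array
```

So `obs=M` and `obs_labels=+` each match exactly one of the two label strings, which is all the disambiguation xtriage needs.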
2008/9/15 <aberndt(a)mrc-lmb.cam.ac.uk>:
> Hi Tom,
>
> thanks for your answer. I just checked the phenix version I used and it is
> version-1.3-final. Please find attached the .log file you asked for. I
> just realised that resolve apparently had a problem with the .mtz label
> assignment. I never had this problem before. Now that I mention it, I
> have to confess that this is not entirely true. All the phenix tutorials I
> have looked through so far state that you can use ccp4's mtz files the way
> they are, yet this is not entirely the case. Say I want an analysis from
> phenix.xtriage by typing 'phenix.xtriage latest.mtz'. Phenix will give
> me the error message:
> Multiple equally suitable arrays of observed xray data found.
>
> Possible choices:
> ab_xds_pointless_scala5.mtz:IMEAN_XDSdataset,SIGIMEAN_XDSdataset
> ab_xds_pointless_scala5.mtz:I_XDSdataset(+),SIGI_XDSdataset(+),I_XDSdataset(-),SIGI_XDSdataset(-),merged
>
> Please use scaling.input.xray_data.obs_labels
> to specify an unambiguous substring of the target label.
>
> Sorry: Multiple equally suitable arrays of observed xray data found.
>
> Is there a way to avoid going back to CCP4 and sorting the labels
> accordingly? What would be the 'scaling.input.xray_data.obs_labels'
> command on the xtriage command line?
>
> Anyway, Tom, many thanks for your answer in advance! I really like working
> with phenix. I just need to learn loads more ...
>
> Best,
> Alex
>
>
>
>> Hi Alex,
>>
>> I'm sorry for both the failure and the lack of a clear message here! I
>> will try to fix both.
>>
>> Is this running phenix-1.3-final? (If not, could you possibly run with
>> that version so we are on the same version.)
>>
>> Can you possibly send me the end of the output in
>>
>> /someplace/home/someuser/
>> work/someproject/phenix2/AutoBuild_run_1_/TEMP0/AutoBuild_run_1_/AutoBuild_run_1_1.log
>>
>> which I hope will have an actual error message to look at?
>>
>> The reason for the subdirectories is this: phenix.autobuild runs several
>> jobs in parallel (if you have set nproc ) or one after the other (if you
>> have not set nproc). These subdirectories contain those jobs...which are
>> then combined together to give the final results in your overall
>> AutoBuild_run_1_/ directory.
>>
>> I don't know the answer to the phenix.refine question...perhaps Pawel or
>> Ralf can answer that one.
>>
>> All the best,
>> Tom T
>>
>>> Dear phenix community,
>>>
>>> I started an AutoBuild run using the phenix GUI (ver 1.3 rc4) on a
>>> linux cluster. It worked okay and all the refinement looked pretty
>>> decent. However, after quite a while I obtained the following error
>>> message (see below). Would anyone please tell me what to do to prevent
>>> this error and let phenix finish its job?
>>>
>>> Also, I don't understand why phenix AutoBuild creates 5 (or whatever
>>> number) AutoBuild_run_x in the AutoBuild-run_1/TEMP0 directory and not
>>> two dirs up in the tree. In other words in my /someplace/home/someuser/
>>> work/someproject/phenix2/AutoBuild_run_1_/TEMP0 directory are more
>>> AutoBuild_run_x_ directories (with x=1 to 5). It is a bit confusing to
>>> me.
>>>
>>> A final question: I realised that phenix.refine drastically
>>> increases the number of outliers. I know that there is a weighting
>>> term someplace ... but what was it again?
>>>
>>> Many thanks in advance,
>>> Alex
>>>
>>>
>>> warnings
>>> Failed to carry out AutoBuild_multiple_models:
>>> Sorry, subprocess failed...message is:
>>> ********************************************************************************
>>> phenix.autobuild \
>>>
>>> write_run_directory_to_file=/someplace/home/someuser/work/someproject/
>>> phenix2/AutoBuild_run_1_/TEMP0/INFO_FILE_1
>>>
>>> Reading effective parameters from /someplace/home/someuser/work/
>>> someproject/phenix2/AutoBuild_run_1_/TEMP0/PARAMS_1.eff
>>>
>>> Sending output to AutoBuild_run_1_/AutoBuild_run_1_1.log
>>>
>>> ********************************************************************************
>>>
>>> Failed to carry out AutoBuild_build_cycle:
>>>
>>> failure
>>>
>>> ********************************************************************************
>>>
>>> ********************************************************************************
>>>
>>> work_pdb_file "AutoBuild_run_1_/edited_pdb.pdb"
>>> working_directory "/someplace/home/someuser/work/someproject/phenix2"
>>> worst_percent_res_rebuild 2.0
>>>
>>> ---------------------------------------------------
>>> Alex Berndt
>>> MRC-Laboratory of Molecular Biology
>>> Hills Road
>>> Cambridge, CB2 0QH
>>> U.K.
>>>
>>> phone: +44 (0)1223 402113
>>> ---------------------------------------------------
>>>
>>> _______________________________________________
>>> phenixbb mailing list
>>> phenixbb(a)phenix-online.org
>>> http://www.phenix-online.org/mailman/listinfo/phenixbb
>>>
>>
>
--
-----------------------------------------------------------------
P.H. Zwart
Beamline Scientist
Berkeley Center for Structural Biology
Lawrence Berkeley National Laboratories
1 Cyclotron Road, Berkeley, CA-94703, USA
Cell: 510 289 9246
BCSB: http://bcsb.als.lbl.gov
PHENIX: http://www.phenix-online.org
CCTBX: http://cctbx.sf.net
-----------------------------------------------------------------
17 years, 4 months
Re: [phenixbb] Coot mutation of Asp residue to isoAsp residue
by Zhijie Li
Hi Xiao,
The previous IAS.cif I sent had the OXT atom missing. Please find the attached IAS_mon_lib.cif, which has this mistake fixed.
Zhijie
From: Xiao Lei
Sent: Tuesday, August 18, 2015 1:56 PM
To: Zhijie Li
Cc: PHENIX user mailing list
Subject: Re: [phenixbb] Coot mutation of Asp residue to isoAsp residue
Hi Zhijie,
For the 1AT6 structure, I downloaded its density in coot using "fetch density from EDS", but when I found the IAS at position 101 and tried to do real space refinement, it gave an error: "Refinement setup failure. Failed to find restraints for IAS."
I do not know how to fix this, but it seems to me it is caused by an incomplete restraints dictionary or monomer library in CCP4?
Thanks.
Xiao
On Tue, Aug 18, 2015 at 10:42 AM, Xiao Lei <xiaoleiusc(a)gmail.com> wrote:
Hi Zhijie,
Thank you very much for the information. For step 1 you mentioned, I can get a monomer with L-Asp, but it seems I cannot drag it (or I do not know how to), and cannot delete or modify it to become isoAsp. I will try playing around with it more, though.
Xiao
On Mon, Aug 17, 2015 at 6:33 PM, Zhijie Li <zhijie.li(a)utoronto.ca> wrote:
Hi Xiao,
IsoAsp is essentially an L-Asp linked to the next aa through its side-chain (beta) carboxyl, so the mutation button won’t help you. You need to build in a new L-Asp, which is treated as a covalently linked ligand (HETATM records) instead of a standard residue (ATOM records) of the protein chain.
A practical method might be: 1) delete the original Asp, 2) import a free L-Asp using “get monomer”, delete its hydrogen atoms and drag it into the density, delete one oxygen atom on the beta-carboxyl and change the residue’s numbering and chain id to fit it into the sequence, 3) edit the PDB, if necessary, to turn the ASP into a ligand (a HETATM record inside the chain).
For step 3, you may need to rename the ASP to something else (IAS was used for isoAsp in older PDB entries, so I would go with IAS) so that coot won’t try to make a regular peptide bond using its main-chain carboxyl during real space refinement. Of course you will need to make a cif file for the “new” compound too. I guess you can make a copy of ASP.cif from the monomer library and change everything in it to IAS. I think if you have placed the IAS in the right location and its ends are within bonding distance of the neighbouring aa residues, you may not need to do anything for refmac. For phenix.refine you will need to add a bond description to the .edit file for each linkage the IAS makes to the neighbouring aas.
You may take a look at the structure 1AT6 and its PDB file. The residue IAS 101 is an example of isoASP. Note that the IAS atoms are HETATM in the chain and there are two LINK records in the header to indicate its linkage to neighbouring aas (LINK records are normally not generated or needed during refinement using refmac or phenix.refine).
Zhijie
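The phenix.refine bond description mentioned above is supplied through the geometry-restraints edits mechanism. A minimal sketch follows; the chain ID, residue numbers, atom names, and ideal distance are hypothetical placeholders that must be adapted to the actual model:

```
refinement.geometry_restraints.edits {
  bond {
    action = *add
    # hypothetical: side-chain (beta) carboxyl carbon of IAS 101 bonded
    # to the backbone N of the following residue
    atom_selection_1 = chain A and resseq 101 and name CG
    atom_selection_2 = chain A and resseq 102 and name N
    distance_ideal = 1.33
    sigma = 0.02
  }
}
```

One such `bond` block is needed for each linkage the IAS makes to its neighbours.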
From: Xiao Lei
Sent: Monday, August 17, 2015 6:50 PM
To: PHENIX user mailing list
Subject: [phenixbb] Coot mutation of Asp residue to isoAsp residue
Dear Phenixbb members,
I suspect one Asp residue in my model may be an isoAsp (isomerization of Asp). I am asking if there is way to mutate Asp residue to isoAsp(isoaspartic acid) residue in coot GUI (I'm using coot 0.8.1 EL in Mac OS X10.10.5)?
I know there is a mutation button on coot, but the mutated aa lists are all natural amino acids. If I have to delete the Asp residue first and then build isoAsp into the density map, is there a way in coot to build an isoAsp residue in map?
Thanks ahead.
Xiao
10 years, 5 months
Re: [phenixbb] Geometry Restraints - Anisotropic truncation
by Dale Tronrud
While philosophically I see no difference between a spherical resolution
cutoff and an elliptical one, a drop in the free R can't be the justification
for the switch. A model cannot be made more "publishable" simply by discarding
data.
We have a whole bunch of empirical guides for judging the quality of this
and that in our field. We determine the resolution limit of a data set (and
imposing a "limit" is itself another empirical choice) based on Rmerge, Rmeas,
or Rpim getting too big or I/sigI getting too small, and there is no agreement
on what counts as "too big" or "too small".
We then have other empirical guides for judging the quality of the models
we produce (e.g. Rwork, Rfree, rmsds of various sorts). Most people seem to
recognize that these criteria need to be applied differently at different
resolutions. A lower resolution model is allowed a higher Rfree, for example.
Isn't it also true that a model refined to data with a cutoff of I/sigI of
1 would be expected to have a free R higher than a model refined to data with
a cutoff of 2? Surely we cannot say that the decrease in free R that results
from changing the cutoff criteria from 1 to 2 reflects an improved model. It
is the same model after all.
Sometimes this shifting application of empirical criteria enhances the
adoption of new technology. Certainly the TLS parametrization of atomic
motion has been widely accepted because it results in lower working and free
Rs. I've seen it knock 3 to 5 percent off, and while that certainly means
that the model fits the data better, I'm not sure that the quality of the
hydrogen bond distances, van der Waals distances, or maps are any better.
The latter details are what I really look for in a model.
On the other hand, there has been good evidence through the years that
there is useful information in the data beyond an I/sigI of 2 or an
Rmeas > 100% but getting people to use this data has been a hard slog. The
reason for this reluctance is that the R values of the resulting models
are higher. Of course they are higher! That does not mean the models
are of poorer quality, only that data with lower signal/noise has been
used that was discarded in the models you used to develop your "gut feeling"
for the meaning of R.
When you change your criteria for selecting data you have to discard
your old notions about the acceptable values of empirical quality measures.
You either have to normalize your measure, as Phil Jeffrey recommends, by
ensuring that you calculate your R's with the same reflections, or by
making objective measures of map quality.
Dale Tronrud
P.S. It is entirely possible that refining a model to a very optimistic
resolution cutoff and calculating the map to a lower resolution might be
better than throwing out the data altogether.
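[Editorial sketch] The normalization Dale endorses, computing both models' R values over the same reflection subset so the comparison is not confounded by different cutoffs, can be made concrete with a toy calculation. The arrays below are synthetic stand-ins, not real structure factors:

```python
import numpy as np

def r_factor(f_obs, f_calc):
    """Conventional crystallographic R = sum|Fobs - Fcalc| / sum Fobs."""
    return float(np.sum(np.abs(f_obs - f_calc)) / np.sum(f_obs))

rng = np.random.default_rng(0)
f_obs = rng.uniform(10.0, 100.0, size=1000)               # synthetic amplitudes
f_calc_a = f_obs * (1.0 + rng.normal(0.0, 0.20, 1000))    # model A, ~20% errors
f_calc_b = f_obs * (1.0 + rng.normal(0.0, 0.25, 1000))    # model B, ~25% errors

# Compare on a COMMON subset (e.g. the reflections surviving both cutoffs);
# here, the 800 strongest, purely for illustration.
common = np.argsort(f_obs)[-800:]
r_a = r_factor(f_obs[common], f_calc_a[common])
r_b = r_factor(f_obs[common], f_calc_b[common])
```

Only when `r_a` and `r_b` are computed over the same reflections does their difference say something about the models rather than about the cutoffs.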
On 5/1/2012 10:34 AM, Kendall Nettles wrote:
> I have seen dramatic improvements in maps and behavior during refinement following use of the UCLA anisotropy server in two different cases. For one of them the Rfree went from 33% to 28%. I don't think it would have been publishable otherwise.
> Kendall
>
> On May 1, 2012, at 11:10 AM, Bryan Lepore wrote:
>
>> On Mon, Apr 30, 2012 at 4:22 AM, Phil Evans<pre(a)mrc-lmb.cam.ac.uk> wrote:
>>> Are anisotropic cutoff desirable?
>>
>> is there a peer-reviewed publication - perhaps from Acta
>> Crystallographica - which describes precisely why scaling or
>> refinement programs are inadequate to ameliorate the problem of
anisotropy, and argues why the method applied in Strong et al. 2006
>> satisfies this need?
>>
>> -Bryan
13 years, 9 months