Search results for query "look through"
- 520 messages

Re: [phenixbb] rotamers
by Pavel Afonine
Hi Kendall,
we've been discussing this off-list between developers and some users...
I'm going to copy-paste one of my latest comments:
"""
Given the poll result I guess there is no way to make everyone happy
with a unique solution, so having multiple options (keep/trim/do
something else) is clearly the way to go.
I guess another poll (or proper research) about "what's distinguishable"
(or what you call poor density) would be of help too, because I bet once
we start doing trimming there will always be someone screaming "No! you
are trimming/not building at too low/high CC/sigma!"
Personally, I'm not leaning towards one or another option. I believe all
of them have approximately equal amounts of clear advantages and
disadvantages. I was just thinking of a longer term research-like
solution that might (or might not) bring a novel idea...
By the way, another alternative way to model them is to define a
probability distribution mask around a side chain where its atoms are
expected, and use that mask as a contribution to Fcalc. I guess (not
100% sure) this is what you can do in BUSTER-TNT (at least this is what is
advertised in their paper). That would account for these missing atoms
to some degree without actually including the ATOM records for them in the
final PDB. One wrinkle, though, is that in this case you would need to
deposit that mask along with your PDB file, since you would have a
mixed model - atomic model + nonatomic model. (I wonder how many users
who ever used this option actually did deposit the masks? -:) )
"""
All the best!
Pavel.
On 3/29/11 6:43 PM, Kendall Nettles wrote:
> We have been doing a lot of parallel refinements where we are checking out the new options in PHENIX refinement, and one of the things we have observed is that sometimes the sidechains with no clear electron density end up in the main chain density, and distort the model. There is no clear pattern as to which options lead to this phenotype, as different combinations give different results. Until we can sort out what is causing this, it seems clear to me that it is better to delete the side chains. If you want to leave a side chain with no clear electron density, you have to make sure each one is not distorting the model. So leaving the side chains with no clear electron density requires much work, with a benefit that is not clear to me.
>
> Kendall Nettles
>
> On Mar 28, 2011, at 1:04 PM, Ed Pozharski wrote:
>
>> Pavel,
>>
>>> - what you mean by "no density",
>> Lack of confidence in placement of the side chain. Everyone would have
>> somewhat different take on it, but the question is more about what to
>> do, not how to decide if the side chain is disordered.
>>
>>> Therefore this raises another item for your questionnaire:
>> There is "other" option, feel free to use it
>>
>>> refine group
>>> occupancy for these atoms (one occupancy per all atoms in question - the
>>> occupancy typically will refine to something less than 0.5 or so).
>> This raises an entirely different question regarding reliability of
>> occupancy refinement in general due to its correlation with the
>> B-factors. Another can of worms.
>>
>>> This trick with smearing out an atom by B-factor may only work for
>>> isolated (single) atoms such as waters because they are not bonded to
>>> anything through restraints.
>> Certainly, presence of restraints makes the B-factor increase less
>> steep. I just looked at an instance of a disordered arginine (no
>> density above 1 sigma for any side chain atoms), and B-factors jump from
>> 30 at the backbone to 90 at the tip of the side chain. This would
>> reduce the density level ~5x, which is probably quite sufficient for
>> blending it into the solvent. There could be a bit of a problem in the
>> middle, where B-factors are inflated/deflated, but it does take care of
>> density reduction.
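As a rough check of that ~5x figure, assuming an isolated Gaussian atom whose
peak density scales as (4*pi/B)**1.5 (a back-of-envelope sketch; bonded atoms
and resolution truncation will change the exact number):

B_backbone = 30.0
B_tip = 90.0
attenuation = (B_backbone / B_tip) ** 1.5            # ratio of peak densities
print("tip density is ~%.1fx lower" % (1.0 / attenuation))   # ~5.2x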
>>
>> Things like atom-specific restraints and modified restraint target may
>> be of some help, but the effect on the final model may be too small to
>> validate the effort.
>>
>> --
>> "I'd jump in myself, if I weren't so good at whistling."
>> Julian, King of Lemurs
>>
>> _______________________________________________
>> phenixbb mailing list
>> phenixbb(a)phenix-online.org
>> http://phenix-online.org/mailman/listinfo/phenixbb
> _______________________________________________
> phenixbb mailing list
> phenixbb(a)phenix-online.org
> http://phenix-online.org/mailman/listinfo/phenixbb
14 years, 3 months

Re: [cctbxbb] bootstrap.py build on Ubuntu
by Billy Poon
Hi David,
Actually, it looks like the lib64z1-dev package provides libz.so in
/usr/lib64, so installing that package should fix your issue. It's a bit
odd that the lib64z1 package does not provide that file.
--
Billy K. Poon
Research Scientist, Molecular Biophysics and Integrated Bioimaging
Lawrence Berkeley National Laboratory
1 Cyclotron Road, M/S 33R0345
Berkeley, CA 94720
Tel: (510) 486-5709
Fax: (510) 486-5909
Web: https://phenix-online.org
On Mon, Jun 13, 2016 at 1:53 PM, Billy Poon <bkpoon(a)lbl.gov> wrote:
> Hi David,
>
> I don't have a fix yet, but here is a workaround. It seems like setup.py
> is looking for libz.so instead of libz.so.1, so you can fix the issue by
> making a symbolic link for libz.so in /usr/lib64.
>
> sudo ln -s /usr/lib64/libz.so.1 /usr/lib64/libz.so
>
> This requires root access, so that's why it's just a workaround.
>
> --
> Billy K. Poon
> Research Scientist, Molecular Biophysics and Integrated Bioimaging
> Lawrence Berkeley National Laboratory
> 1 Cyclotron Road, M/S 33R0345
> Berkeley, CA 94720
> Tel: (510) 486-5709
> Fax: (510) 486-5909
> Web: https://phenix-online.org
>
> On Sat, Jun 11, 2016 at 5:05 PM, Billy Poon <bkpoon(a)lbl.gov> wrote:
>
>> Hi David,
>>
>> Sorry it took so long! Setting up all the virtual machines was a time
>> sink and getting things to work on 32-bit CentOS 5 and Ubuntu 12.04 was a
>> little tricky.
>>
>> It looks like Ubuntu 16.04 moved its libraries around. I used apt-get to
>> install libz-dev and lib64z1 (the 64-bit library). There is a libz.so.1
>> file in /lib/x86_64-linux-gnu and in /usr/lib64.
>>
>> I have not gotten it to work yet, but I'm pretty sure this is the issue.
>> I'll have to double-check 12.04 and 14.04.
>>
>> As for Pillow, I did test it a few months ago, but I remember there being
>> API changes that will need to be fixed.
>>
>> --
>> Billy K. Poon
>> Research Scientist, Molecular Biophysics and Integrated Bioimaging
>> Lawrence Berkeley National Laboratory
>> 1 Cyclotron Road, M/S 33R0345
>> Berkeley, CA 94720
>> Tel: (510) 486-5709
>> Fax: (510) 486-5909
>> Web: https://phenix-online.org
>>
>> On Sat, Jun 11, 2016 at 2:04 AM, David Waterman <dgwaterman(a)gmail.com>
>> wrote:
>>
>>> Hi Billy,
>>>
>>> I'm replying on this old thread because I have finally got round to
>>> trying out a bootstrap build for DIALS again on Ubuntu, having waited for
>>> updates to the dependencies and updating the OS to 16.04.
>>>
>>> The good news is, the build ran through fine. This is the first time
>>> I've had a bootstrap build complete without error on Ubuntu, so thanks to
>>> you and the others who have worked on improving the build in the last few
>>> months!
>>>
>>> The bad news is I'm getting two failures in the DIALS tests:
>>>
>>> dials/test/command_line/tst_export_bitmaps.py
>>> dials_regression/test.py
>>>
>>> Both are from PIL
>>>
>>> File
>>> "/home/fcx32934/dials_test_build/base/lib/python2.7/site-packages/PIL/Image.py",
>>> line 401, in _getencoder
>>> raise IOError("encoder %s not available" % encoder_name)
>>> IOError: encoder zip not available
>>>
>>> Indeed, from base_tmp/imaging_install_log it looks like PIL is not
>>> configured properly
>>>
>>> --------------------------------------------------------------------
>>> PIL 1.1.7 SETUP SUMMARY
>>> --------------------------------------------------------------------
>>> version 1.1.7
>>> platform linux2 2.7.8 (default_cci, Jun 10 2016, 16:04:32)
>>> [GCC 5.3.1 20160413]
>>> --------------------------------------------------------------------
>>> *** TKINTER support not available
>>> *** JPEG support not available
>>> *** ZLIB (PNG/ZIP) support not available
>>> *** FREETYPE2 support not available
>>> *** LITTLECMS support not available
>>> --------------------------------------------------------------------
>>>
>>> Any ideas? I have zlib headers but perhaps PIL can't find them.
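One quick way to confirm whether the installed PIL actually has zlib (PNG)
support compiled in, independent of the setup summary, is to try a PNG encode
(just a diagnostic sketch; it reproduces the same "encoder zip not available"
error seen in the DIALS tests if zlib was missing at build time):

import io
from PIL import Image   # classic PIL 1.1.7 also allows "import Image"

buf = io.BytesIO()
try:
    Image.new("RGB", (8, 8)).save(buf, format="PNG")
    print("zlib/PNG support OK")
except IOError as e:
    print("no zlib/PNG support: %s" % e)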
>>>
>>> On a related note, the free version of PIL has not been updated for
>>> years. The replacement Pillow has started to diverge. I first noticed this
>>> when Ubuntu 16.04 gave me Pillow 3.1.2 and my cctbx build with the system
>>> python produced failures because it no longer supports certain deprecated
>>> methods from PIL. I worked around that in r24587, but these things are a
>>> losing battle. Is it time to switch cctbx over to Pillow instead of PIL?
>>>
>>> Cheers
>>>
>>> -- David
>>>
>>> On 7 January 2016 at 18:12, Billy Poon <bkpoon(a)lbl.gov> wrote:
>>>
>>>> Hi all,
>>>>
>>>> Since wxPython was updated to 3.0.2, I have been thinking about
>>>> updating the other GUI-related packages to more recent versions. I would
>>>> probably update to the latest, stable version that does not involve major
>>>> changes to the API so that backwards compatibility is preserved. Let me
>>>> know if that would be helpful and I can prioritize the migration and
>>>> testing.
>>>>
>>>> --
>>>> Billy K. Poon
>>>> Research Scientist, Molecular Biophysics and Integrated Bioimaging
>>>> Lawrence Berkeley National Laboratory
>>>> 1 Cyclotron Road, M/S 33R0345
>>>> Berkeley, CA 94720
>>>> Tel: (510) 486-5709
>>>> Fax: (510) 486-5909
>>>> Web: https://phenix-online.org
>>>>
>>>> On Thu, Jan 7, 2016 at 9:30 AM, Nicholas Sauter <nksauter(a)lbl.gov>
>>>> wrote:
>>>>
>>>>> David,
>>>>>
>>>>> I notice that the Pango version, 1.16.1, was released in 2007, so
>>>>> perhaps it is no surprise that the latest Ubuntu does not support it.
>>>>> Maybe this calls for stepping forward the Pango version until you find one
>>>>> that works. I see that the latest stable release is 1.39.
>>>>>
>>>>> This would be valuable information for us. Billy Poon in the Phenix
>>>>> group is supporting the Phenix GUI, so it might be advisable for him to
>>>>> update the Pango version in the base installer.
>>>>>
>>>>> Nick
>>>>>
>>>>> Nicholas K. Sauter, Ph. D.
>>>>> Computer Staff Scientist, Molecular Biophysics and Integrated
>>>>> Bioimaging Division
>>>>> Lawrence Berkeley National Laboratory
>>>>> 1 Cyclotron Rd., Bldg. 33R0345
>>>>> Berkeley, CA 94720
>>>>> (510) 486-5713
>>>>>
>>>>> On Thu, Jan 7, 2016 at 8:54 AM, David Waterman <dgwaterman(a)gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Hi again
>>>>>>
>>>>>> Another data point: I just tried this on a different Ubuntu machine,
>>>>>> this time running 14.04. In this case pango installed just fine. In fact
>>>>>> all other packages installed too and the machine is now compiling cctbx.
>>>>>>
>>>>>> I might have enough for comparison between the potentially working
>>>>>> 14.04 and failed 15.04 builds to figure out what is wrong in the second
>>>>>> case.
>>>>>>
>>>>>> Cheers
>>>>>>
>>>>>> -- David
>>>>>>
>>>>>> On 7 January 2016 at 09:56, David Waterman <dgwaterman(a)gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi folks
>>>>>>>
>>>>>>> I recently tried building cctbx+dials on Ubuntu 15.04 following the
>>>>>>> instructions here:
>>>>>>> http://dials.github.io/documentation/installation_developer.html
>>>>>>>
>>>>>>> This failed during installation of pango-1.16.1. Looking
>>>>>>> at pango_install_log, I see the command that failed was as follows:
>>>>>>>
>>>>>>> gcc -DHAVE_CONFIG_H -I. -I. -I../..
>>>>>>> -DSYSCONFDIR=\"/home/fcx32934/sw/dials_bootstrap_test/base/etc\"
>>>>>>> -DLIBDIR=\"/home/fcx32934/sw/dials_bootstrap_test/base/lib\"
>>>>>>> -DG_DISABLE_CAST_CHECKS -I../.. -DG_DISABLE_DEPRECATED
>>>>>>> -I/home/fcx32934/sw/dials_bootstrap_test/base/include
>>>>>>> -I/home/fcx32934/sw/dials_bootstrap_test/base/include/freetype2 -g -O2
>>>>>>> -Wall -MT fribidi.lo -MD -MP -MF .deps/fribidi.Tpo -c fribidi.c -fPIC
>>>>>>> -DPIC -o .libs/fribidi.o
>>>>>>> In file included from fribidi.h:31:0,
>>>>>>> from fribidi.c:28:
>>>>>>> fribidi_config.h:1:18: fatal error: glib.h: No such file or directory
>>>>>>>
>>>>>>> The file glib.h appears to be in base/include/glib-2.0/, however
>>>>>>> this directory was not explicitly included in the command above, only its
>>>>>>> parent. This suggests a configuration failure in pango to me. Taking a look
>>>>>>> at base_tmp/pango-1.16.1/config.log, I see what look like the relevant
>>>>>>> lines:
>>>>>>>
>>>>>>> configure:22227: checking for GLIB
>>>>>>> configure:22235: $PKG_CONFIG --exists --print-errors "$GLIB_MODULES"
>>>>>>> configure:22238: $? = 0
>>>>>>> configure:22253: $PKG_CONFIG --exists --print-errors "$GLIB_MODULES"
>>>>>>> configure:22256: $? = 0
>>>>>>> configure:22304: result: yes
>>>>>>>
>>>>>>> but this doesn't tell me very much. Does anyone have any suggestions
>>>>>>> as to how I might proceed?
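One hedged way to cross-check the configure result is to ask pkg-config
directly which include flags glib-2.0 needs and compare them with the failing
gcc command line (the directory names mentioned in the comments are just what
a typical base install would use, not taken from your log):

import subprocess

cflags = subprocess.check_output(["pkg-config", "--cflags", "glib-2.0"])
print("pkg-config cflags: %s" % cflags.decode().strip())
# If -I<prefix>/include/glib-2.0 (and the matching lib/glib-2.0/include
# directory) is missing from both this output and the gcc line, pango's
# configure probably picked up the wrong pkg-config or a stale
# PKG_CONFIG_PATH, which would explain the fribidi glib.h failure.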
>>>>>>>
>>>>>>> Many thanks
>>>>>>>
>>>>>>> -- David
>>>>>>>
>>>>>>
>>>>>>
>>>>>> _______________________________________________
>>>>>> cctbxbb mailing list
>>>>>> cctbxbb(a)phenix-online.org
>>>>>> http://phenix-online.org/mailman/listinfo/cctbxbb
>>>>>>
>>>>>>
>>>>>
>>>>> _______________________________________________
>>>>> cctbxbb mailing list
>>>>> cctbxbb(a)phenix-online.org
>>>>> http://phenix-online.org/mailman/listinfo/cctbxbb
>>>>>
>>>>>
>>>>
>>>> _______________________________________________
>>>> cctbxbb mailing list
>>>> cctbxbb(a)phenix-online.org
>>>> http://phenix-online.org/mailman/listinfo/cctbxbb
>>>>
>>>>
>>>
>>> _______________________________________________
>>> cctbxbb mailing list
>>> cctbxbb(a)phenix-online.org
>>> http://phenix-online.org/mailman/listinfo/cctbxbb
>>>
>>>
>>
>
9 years

Re: [phenixbb] alternatives to RMSD
by Pavel Afonine
Hi Ed,
yes, this makes sense, sure: having models to be superposed
appropriately to begin with certainly is a good idea. My point was that
coordinate-based rmsd does not account for B-factors and occupancies, which
is a problem if you are comparing flexible molecules. A possible
solution may be to use a more generous (= more information-rich)
representation of the atomic model, such as an electron density map, which
would 'automatically' take care of disorder.
Pavel
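If it helps, here is a minimal, purely illustrative numpy sketch of that
map-based comparison (grid, cell and atom parameters are invented; a real
calculation would of course use proper scattering factors):

import numpy as np

def model_map(atoms, n=48, cell=24.0):
    """atoms: list of (x, y, z, B, occupancy) in Angstroms."""
    axes = np.arange(n) * cell / n
    X, Y, Z = np.meshgrid(axes, axes, axes, indexing="ij")
    rho = np.zeros((n, n, n))
    for x, y, z, b, occ in atoms:
        u2 = b / (8 * np.pi ** 2)          # B = 8*pi**2*<u**2>
        r2 = (X - x) ** 2 + (Y - y) ** 2 + (Z - z) ** 2
        rho += occ * (2 * np.pi * u2) ** -1.5 * np.exp(-r2 / (2 * u2))
    return rho

def map_cc(map1, map2):
    a, b = map1.ravel(), map2.ravel()
    a, b = a - a.mean(), b - b.mean()
    return float(np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b)))

model_a = [(10.0, 10.0, 10.0, 20.0, 1.0), (12.0, 10.5, 10.0, 60.0, 0.5)]
model_b = [(10.1, 10.0, 10.0, 25.0, 1.0), (12.4, 10.6, 10.1, 80.0, 0.5)]
print("map correlation:", map_cc(model_map(model_a), model_map(model_b)))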
On 7/5/14, 9:37 AM, Edward A. Berry wrote:
> I agree that would be useful as an alternative to RMSD,
> but if I understand the original post, the problem
> with RMSD is that the two secondary structure elements
> are connected by a variable turn so that they cannot be superimposed
> simultaneously. That could still be a problem, comparing maps.
>
> What you can do is report the change in angle between them,
> and the residues making up the hinge.
> A program called dyndom (dynamic domains) is good for this,
> Or you can superimpose each "domain" separately, view the
> superimposed molecules, and see haw far into the turn
> from each side thesuperposition is good
>
> To get the change in angle between the two parts,
> first superimpose model A on model B using only residues
> in domain 1 (say, the helix).
> Save that reoriented model A, and now superimpose it on model B
> using only residues in domain 2 (the strand).
> The angle involved in this second rotation is the change in interdomain
> angle.
>
> (You could also report RMSD for superposition of the individual domain,
> but helix-on-helix or strand-on-strand are likely to be pretty good fits
> and not very informative.)
> eab
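A minimal numpy sketch of that two-superposition recipe (the coordinate
arrays would come from the matched atoms of the two PDB files; nothing here
is specific to dyndom):

import numpy as np

def kabsch(P, Q):
    """Return rotation R and translation t mapping P onto Q (least squares)."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    V, S, Wt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(V @ Wt))
    D = np.diag([1.0, 1.0, d])              # guard against an improper rotation
    R = (V @ D @ Wt).T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

def rotation_angle_deg(R):
    return np.degrees(np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0)))

def interdomain_angle(a_dom1, a_dom2, b_dom1, b_dom2):
    # a_dom1/a_dom2: domain-1 and domain-2 atoms of model A (shape (n, 3));
    # b_dom1/b_dom2: the same atoms in model B.
    R1, t1 = kabsch(a_dom1, b_dom1)          # step 1: superpose on domain 1
    a_dom2_moved = a_dom2 @ R1.T + t1        # apply to the rest of model A
    R2, _ = kabsch(a_dom2_moved, b_dom2)     # step 2: superpose on domain 2
    return rotation_angle_deg(R2)            # angle of the second rotation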
>
> On 07/05/2014 10:22 AM, PC wrote:
>> Hi Pavel,
>>
>> Thank you very much, this sounds very interesting.
>>
>> I have used ccp4, coot and phenix but I am no expert but I am
>> definitely interested in trying this method if you could give more
>> information.
>>
>> Thank you,
>> Patrick.
>>
>>
>> -----Original Message-----
>> *From:* pafonine(a)lbl.gov
>> *Sent:* Fri, 04 Jul 2014 20:34:33 -0700
>> *To:* patrick.cossins(a)inbox.com, phenixbb(a)phenix-online.org
>> *Subject:* Re: [phenixbb] alternatives to RMSD
>>
>> Hi Patrick,
>>
>> RMSD is a poor measure in this case as it does not account for
>> B-factors, occupancies, alternative conformations, and other
>> information a crystal structure model may make available.
>> Macromolecules are not a bunch of points in space.
>>
>> While I'm sure more thorough methods exist, I would vote for the
>> simplest, most direct and obvious one. You can calculate electron
>> density map using a Gaussian approximation from model A and B (yes,
>> electron density map - not a Fourier image of it!). That will
>> naturally account for all: B-factors, occupancies, other disorder.
>> Then you can calculate a map similarity measure, such as map
>> correlation, for instance. After all, why use a cannon to kill a fly?!
>>
>> If you are interested to follow this route I can explain the
>> details.
>>
>> All the best,
>> Pavel
>>
>>> Hi Phenix users,
>>>
>>> I am not a crystallographer but I thought you guys might be a
>>> good place to ask this question.
>>>
>>> I have 2 super secondary structures, A and B and they consist of
>>> Helix-turn-Strand
>>>
>>> Due to the turn the two structures have a poor RMSD because the
>>> two flanking fragments of Helix and Strand are far from each other
>>> but when I superimpose the two fragments individually (helix A with
>>> helix B and strand A with strand B in PyMOL), they align very well.
>>>
>>> Now, is there a way to express this instead of using the RMSD?
>>> When the two structures align well the RMSD is very good but a
>>> slight movement and the RMSD is awful.
>>> But looking at the two structures I can see they follow the same
>>> path through space.
>>>
>>> Thank you,
>>> Patrick
>>
>>
>>
>> _______________________________________________
>> phenixbb mailing list
>> phenixbb(a)phenix-online.org
>> http://phenix-online.org/mailman/listinfo/phenixbb
>>
> _______________________________________________
> phenixbb mailing list
> phenixbb(a)phenix-online.org
> http://phenix-online.org/mailman/listinfo/phenixbb
10 years, 11 months

Re: [phenixbb] Geometry Restraints - Anisotropic truncation
by Kendall Nettles
I didn't think the structure was publishable with an Rfree of 33% because I was expecting the reviewers to complain.
We have tested a number of data sets on the UCLA server and it usually doesn't make much difference. I wouldn't expect truncation alone to change Rfree by 5%, and it usually doesn't. The two times I have seen dramatic impacts on the maps (and Rfree), the highly anisotropic sets showed strong waves of difference density as well, which were fixed by throwing out the noise. We have moved to using loose data cutoffs for most structures, but I do think anisotropic truncation can be helpful in rare cases.
Kendall
On May 1, 2012, at 3:07 PM, "Dale Tronrud" <det102(a)uoxray.uoregon.edu> wrote:
>
> While philosophically I see no difference between a spherical resolution
> cutoff and an elliptical one, a drop in the free R can't be the justification
> for the switch. A model cannot be made more "publishable" simply by discarding
> data.
>
> We have a whole bunch of empirical guides for judging the quality of this
> and that in our field. We determine the resolution limit of a data set (and
> imposing a "limit" is another empirical choice made) based on Rmrg, or Rmes,
> or Rpim getting too big or I/sigI getting too small and there is no agreement
> on how "too big/small" is too "too big/small".
>
> We then have other empirical guides for judging the quality of the models
> we produce (e.g. Rwork, Rfree, rmsds of various sorts). Most people seem to
> recognize that these criteria need to be applied differently for different
> resolutions. A lower resolution model is allowed a higher Rfree, for example.
>
> Isn't it also true that a model refined to data with a cutoff of I/sigI of
> 1 would be expected to have a free R higher than a model refined to data with
> a cutoff of 2? Surely we cannot say that the decrease in free R that results
> from changing the cutoff criteria from 1 to 2 reflects an improved model. It
> is the same model after all.
>
> Sometimes this shifting application of empirical criteria enhances the
> adoption of new technology. Certainly the TLS parametrization of atomic
> motion has been widely accepted because it results in lower working and free
> Rs. I've seen it knock 3 to 5 percent off, and while that certainly means
> that the model fits the data better, I'm not sure that the quality of the
> hydrogen bond distances, van der Waals distances, or maps are any better.
> The latter details are what I really look for in a model.
>
> On the other hand, there has been good evidence through the years that
> there is useful information in the data beyond an I/sigI of 2 or an
> Rmeas > 100% but getting people to use this data has been a hard slog. The
> reason for this reluctance is that the R values of the resulting models
> are higher. Of course they are higher! That does not mean the models
> are of poorer quality, only that data with lower signal/noise has been
> used that was discarded in the models you used to develop your "gut feeling"
> for the meaning of R.
>
> When you change your criteria for selecting data you have to discard
> your old notions about the acceptable values of empirical quality measures.
> You either have to normalize your measure, as Phil Jeffrey recommends, by
> ensuring that you calculate your R's with the same reflections, or by
> making objective measures of map quality.
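A tiny sketch of that "same reflections" comparison (data structures invented
for illustration; real work would of course use the actual reflection files):

def r_factor(data, hkl_subset):
    # data: dict mapping (h, k, l) -> (Fobs, Fcalc)
    num = sum(abs(data[hkl][0] - data[hkl][1]) for hkl in hkl_subset)
    den = sum(abs(data[hkl][0]) for hkl in hkl_subset)
    return num / den

def compare_on_common(data_a, data_b):
    common = set(data_a) & set(data_b)       # reflections present in both sets
    return r_factor(data_a, common), r_factor(data_b, common)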
>
> Dale Tronrud
>
> P.S. It is entirely possible that refining a model to a very optimistic
> resolution cutoff and calculating the map to a lower resolution might be
> better than throwing out the data altogether.
>
> On 5/1/2012 10:34 AM, Kendall Nettles wrote:
>> I have seen dramatic improvements in maps and behavior during refinement following use of the UCLA anisotropy server in two different cases. For one of them the Rfree went from 33% to 28%. I don't think it would have been publishable otherwise.
>> Kendall
>>
>> On May 1, 2012, at 11:10 AM, Bryan Lepore wrote:
>>
>>> On Mon, Apr 30, 2012 at 4:22 AM, Phil Evans<pre(a)mrc-lmb.cam.ac.uk> wrote:
>>>> Are anisotropic cutoffs desirable?
>>>
>>> is there a peer-reviewed publication - perhaps from Acta
>>> Crystallographica - which describes precisely why scaling or
>>> refinement programs are inadequate to ameliorate the problem of
>>> anisotropy, and argues why the method applied in Strong, et. al. 2006
>>> satisfies this need?
>>>
>>> -Bryan
>>> _______________________________________________
>>> phenixbb mailing list
>>> phenixbb(a)phenix-online.org
>>> http://phenix-online.org/mailman/listinfo/phenixbb
>>
>> _______________________________________________
>> phenixbb mailing list
>> phenixbb(a)phenix-online.org
>> http://phenix-online.org/mailman/listinfo/phenixbb
> _______________________________________________
> phenixbb mailing list
> phenixbb(a)phenix-online.org
> http://phenix-online.org/mailman/listinfo/phenixbb
13 years, 2 months

Re: [cctbxbb] some thoughts on cctbx and pip
by Luc Bourhis
Hi,
Even if we managed to ship the boost dynamic libraries with pip, it would still not be pip-like, as we would still need our python wrappers to set LIBTBX_BUILD and LD_LIBRARY_PATH. Normal pip packages work with the standard python exe. As for LD_LIBRARY_PATH, we could get around it by changing the way we compile, using -Wl,-R, which is the runtime equivalent of the build-time -L. That’s a significant change that would need to be tested. But there is no way around setting LIBTBX_BUILD right now. Leaving that to the user is horrible. Perhaps there is a way to hack libtbx/env_config.py so that we can hardwire LIBTBX_BUILD in there when pip installs?
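Purely as an illustration of that last idea (nothing like this exists in
libtbx today; the file name and hook below are hypothetical):

import os

_TEMPLATE = '''\
import os
os.environ.setdefault("LIBTBX_BUILD", {build_dir!r})
'''

def write_env_hook(package_dir, build_dir):
    """Called by a hypothetical pip post-install step to hardwire the path."""
    hook = os.path.join(package_dir, "_libtbx_env.py")
    with open(hook, "w") as f:
        f.write(_TEMPLATE.format(build_dir=build_dir))
    return hook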
Best wishes,
Luc
> On 16 Aug 2019, at 22:47, Luc Bourhis <luc_j_bourhis(a)mac.com> wrote:
>
> Hi,
>
> I did look into that many years ago, and even toyed with building a pip installer. What stopped me is the exact conclusion you reached too: the user would not have the pip experience he expects. You are right that it is a lot of effort but is it worth it? Considering that remark, I don’t think so. Now, Conda was created specifically to go beyond pip pure-python-only support. Since cctbx has garnered support for Conda, the best avenue imho is to go the extra length to have a package on Anaconda.org <http://anaconda.org/>, and then to advertise it hard to every potential user out there.
>
> Best wishes,
>
> Luc
>
>
>> On 16 Aug 2019, at 21:45, Aaron Brewster <asbrewster(a)lbl.gov <mailto:[email protected]>> wrote:
>>
>> Hi, to avoid clouding Dorothee's documentation email thread, which I think is a highly useful enterprise, here's some thoughts about putting cctbx into pip. Pip doesn't install non-python dependencies well. I don't think boost is available as a package on pip (at least the package version we use). wxPython4 isn't portable through pip (https://wiki.wxpython.org/How%20to%20install%20wxPython#Installing_wxPython… <https://wiki.wxpython.org/How%20to%20install%20wxPython#Installing_wxPython…>). MPI libraries are system dependent. If cctbx were a pure python package, pip would be fine, but cctbx is not.
>>
>> All that said, we could build a manylinux1 version of cctbx and upload it to PyPi (I'm just learning about this). For a pip package to be portable (which is a requirement for cctbx), it needs to conform to PEP513, the manylinux1 standard (https://www.python.org/dev/peps/pep-0513/ <https://www.python.org/dev/peps/pep-0513/>). For example, numpy is built according to this standard (see https://pypi.org/project/numpy/#files <https://pypi.org/project/numpy/#files>, where you'll see the manylinux1 wheel). Note, the manylinux1 standard is built with Centos 5.11 which we no longer support.
>>
>> There is also a manylinux2010 standard, which is based on Centos 6 (https://www.python.org/dev/peps/pep-0571/ <https://www.python.org/dev/peps/pep-0571/>). This is likely a more attainable target (note though by default C++11 is not supported on Centos 6).
>>
>> If we built a manylinuxX version of cctbx and uploaded it to PyPi, the user would need all the non-python dependencies. There's no way to specify these in pip. For example, cctbx requires boost 1.63 or better. The user will need to have it in a place their python can find it, or we could package it ourselves and supply it, similar to how the pip h5py package now comes with an hdf5 library, or how the pip numpy package includes an openblas library. We'd have to do the same for any packages we depend on that aren't on pip using the manylinux standards, such as wxPython4.
>>
>> Further, we need to think about how dials and other cctbx-based packages interact. If pip install cctbx is set up, how does pip install dials work, such that any dials shared libraries can find the cctbx libraries? Can shared libraries from one pip package link against libraries in another pip package? Would each package need to supply its own boost? Possibly this is well understood in the pip field, but not by me :)
>>
>> Finally, there's the option of providing a source pip package. This would require the full compiler toolchain for any given platform (macOS, linux, windows). These are likely available for developers, but not for general users.
>>
>> Anyway, these are some of the obstacles. Not saying it isn't possible, it's just a lot of effort.
>>
>> Thanks,
>> -Aaron
>>
>> _______________________________________________
>> cctbxbb mailing list
>> cctbxbb(a)phenix-online.org <mailto:[email protected]>
>> http://phenix-online.org/mailman/listinfo/cctbxbb
>
> _______________________________________________
> cctbxbb mailing list
> cctbxbb(a)phenix-online.org
> http://phenix-online.org/mailman/listinfo/cctbxbb
5 years, 10 months

Re: [phenixbb] questions / TLS+NCS bug
by Pavel Afonine
Hi Jianghai,
Thanks for the .log file! I looked at it and now I think I know what
the problem is: this is a bug in phenix.refine that we will fix in the
next release. The bug arises when you try to refine using TLS and NCS at
once. Sorry for this.
The possible solution is to split your refinement into 2 parts:
1) First, refine coordinates and isotropic B-factors. Use NCS
restraints. DO NOT refine TLS.
2) At the end, as the final tune up, refine ONLY isotropic B-factors +
TLS and do not refine coordinates. For this run, please REMOVE all NCS
information (NCS selections) from the command file. This refinement run
should be the final one.
Once again, sorry for the inconvenience. This will be fixed in the next
release of CCI APPS and PHENIX. Once this is fixed, you will be able to run
everything in "one go" and in any combination.
Please let me know if what I suggested helped you. Any further questions
are welcome!
Cheers,
Pavel.
Jianghai Zhu wrote:
> Here is the log file. Thanks.
>
> Jianghai
>
> ------------------------------------------------------------------------
>
>
>
> On Dec 14, 2006, at 12:50 PM, Pavel Afonine wrote:
>
>> Hi Jianghai,
>>
>> could you please send us .log file from your refinement run, so we
>> can analyze what's going on.
>>
>> In general, "bad" B-factors can be:
>> - misplaced model;
>> - inadequate TLS model (= domains chosen for TLS do not correspond to
>> the reality).
>>
>> If you are using like 2 months old or older version of phenix.refine,
>> you may want to get the latest CCI APPS since we made lots of
>> improvements. Just goto http://www.phenix-online.org/download/cci_apps/
>>
>> Pavel.
>>
>>
>> Jianghai Zhu wrote:
>>> The resolution is 2.5 A. The Wilson B is about 50. I know the B factor
>>> of the backbone is lower than that of the sidechain. But a B factor
>>> like 4 is definitely wrong.
>>>
>>> Jianghai
>>>
>>>
>>> On Dec 14, 2006, at 12:11 PM, Peter Zwart wrote:
>>>
>>>>
>>>>>> 4) The refinement (TLS + ML + B individual) went through, I got
>>>>>> reasonable R, Rfree, rmsdBOND, rmsdANGLE. But the B factors are
>>>>>> pretty low. The B factor of the backbone is much lower than the
>>>>>> side
>>>>>> chain, some have numbers like 4. Some metal atoms also have B
>>>>>> factors around 4. What did I do wrong?
>>>>
>>>> What is the resolution of your data?
>>>>
>>>> Backbone B-values are usually lower than those of the side chains.
>>>>
>>>> What is the Wilson B value reported by phenix.refine?
>>>> You could re-refine and randomize all B-values and see what happens (I
>>>> have to get back to you to get the exact command for this).
>>>> Maybe it
>>>> is useful to obtain a copy of the latest version of phenix.refine by
>>>> downloading cci_apps from our server http://www.phenix-online.org.
>>>>
>>>>
>>>> If your B-values still come out lowish, try growing crystals that
>>>> do not
>>>> diffract very well; that usually does the trick.
>>>>
>>>> HTH
>>>>
>>>> Peter
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> _______________________________________________
>>>> phenixbb mailing list
>>>> phenixbb(a)phenix-online.org <mailto:[email protected]>
>>>> http://www.phenix-online.org/mailman/listinfo/phenixbb
>>>
>>> ------------------------------------------------------------------------
>>> _______________________________________________
>>> phenixbb mailing list
>>> phenixbb(a)phenix-online.org
>>> http://www.phenix-online.org/mailman/listinfo/phenixbb
>>>
>> _______________________________________________
>> phenixbb mailing list
>> phenixbb(a)phenix-online.org <mailto:[email protected]>
>> http://www.phenix-online.org/mailman/listinfo/phenixbb
>
18 years, 6 months

Re: [phenixbb] calculate Fc for the whole unit cell from a Fc of a single symmetric unit.
by Edward A. Berry
It seems to me there are two things that could be meant by "expand to P1"
One is when data has been reduced to the Reciprocal Space
asymmetric unit (or otherwise one asymmetric unit of a
symmetric dataset has been obtained) and you want to expand
it to P1 by using symmetry to generate all the
symmetry-equivalent reflections.
The other is where a full P1 dataset has been calculated from just
one asymmetric unit of the crystal (and hence does not exhibit the
crystallographic symmetry) and you want to generate the transform
of the entire crystal. (I think this is how all the space-group-specific
fft programs used to work before computers got so fast that it
was less bother to just do everything in P1 with the whole cell)
Presumably this involves applying the real space symmetry
operators to get n rotated (or phase-shifted for translational
symmetry) P1 datasets and adding them vectorially.
It would be important to decide which of these is required, and which
each of the suggested methods provides.
eab
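For the second operation, a minimal numpy sketch of the reciprocal-space
summation F_cell(h) = sum_s F_asu(h.R_s) * exp(2*pi*i*h.t_s) (the operators
and data keys below are invented for illustration; f_asu would be the
existing Fc values keyed by Miller index):

import numpy as np

# symmetry operators (R, t): here P2_1, i.e. (x, y, z) and (-x, y+1/2, -z)
ops = [
    (np.eye(3, dtype=int), np.array([0.0, 0.0, 0.0])),
    (np.diag([-1, 1, -1]), np.array([0.0, 0.5, 0.0])),
]

def expand_fc(f_asu, hkl_list):
    """f_asu: dict {(h, k, l): complex Fc of one asymmetric unit, in P1}."""
    f_cell = {}
    for hkl in hkl_list:
        h = np.array(hkl)
        total = 0j
        for R, t in ops:
            h_rot = tuple(int(v) for v in h @ R)       # h . R_s (row vector)
            phase = np.exp(2j * np.pi * np.dot(h, t))  # exp(2*pi*i h . t_s)
            total += f_asu.get(h_rot, 0j) * phase
        f_cell[hkl] = total
    return f_cell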
Ralf Grosse-Kunstleve wrote:
> We can expand reciprocal-space arrays, too, with the
> cctbx.miller.array.expand_to_p1() method. You can use it from the
> command line via
>
> phenix.reflection_file_converter --expand-to-p1 ...
>
> See also:
> http://www.phenix-online.org/documentation/reflection_file_tools.htm
>
> Ralf
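For the first meaning (generating the symmetry-equivalent reflections), a
hedged sketch using the cctbx call Ralf mentions (cell, space group and
resolution are placeholders):

from cctbx import crystal, miller

symm = crystal.symmetry(unit_cell=(50, 60, 70, 90, 90, 90),
                        space_group_symbol="P212121")
ms = miller.build_set(crystal_symmetry=symm, anomalous_flag=False, d_min=2.5)
ms_p1 = ms.expand_to_p1()          # same reflections, re-indexed over all of P1
print(ms.size(), "->", ms_p1.size())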
>
>
> On Mon, Jul 11, 2011 at 10:56 AM, <zhangh1(a)umbc.edu
> <mailto:[email protected]>> wrote:
>
> Sorry I haven't got a chance to check my email recently.
>
> Yes, I meant expansion to P1. The thing is cctbx relies on the atomic
> model I think, but I only have model Fc available.
>
> Hailiang
>
> > I suspect what Hailang means is expansion into P1.
> >
> > I am sure this can be accomplished through some either existing or
> > easily coded cctbx tool. However, when I looked into a different
> task
> > recently that included P1 expansion as a step, I learned that SFTOOLs
> > can do this, albeit there was a bug there which caused trouble in
> > certain space groups (may be fixed by now so check if there is an
> > update).
> >
> > Hailang - if P1 expansion is what you need, I could share my own
> code as
> > well, let me know if that is something you want to try.
> >
> > Cheers,
> >
> > Ed.
> >
> > On Fri, 2011-07-08 at 14:44 -0700, Ralf Grosse-Kunstleve wrote:
> >> Did you get responses already?
> >> If not, could you explain your situation some more?
> >> We have algorithms that do the symmetry summation in reciprocal
> space.
> >> The input is a list of Fc in P1, based on the unit cell of the
> >> crystal. Is that what you have?
> >> Ralf
> >>
> >> On Wed, Jul 6, 2011 at 1:38 PM, <zhangh1(a)umbc.edu
> <mailto:[email protected]>> wrote:
> >> Hi,
> >>
> >> I am wondering if I only have structure factors calculated
> >> from a single
> >> symmetric unit, is there any phenix utility which can
> >> calculate the
> >> structure factor for the whole unit cell given the symmetric
> >> operation or
> >> space group and crystal parameters? Note I don't have an
> >> atomic model and
> >> only have Fc.
> >>
> >> Thanks!
> >>
> >> Hailiang
> >>
> >> _______________________________________________
> >> phenixbb mailing list
> >> phenixbb(a)phenix-online.org <mailto:[email protected]>
> >> http://phenix-online.org/mailman/listinfo/phenixbb
> >>
> >>
> >> _______________________________________________
> >> phenixbb mailing list
> >> phenixbb(a)phenix-online.org <mailto:[email protected]>
> >> http://phenix-online.org/mailman/listinfo/phenixbb
> >
> >
> > _______________________________________________
> > phenixbb mailing list
> > phenixbb(a)phenix-online.org <mailto:[email protected]>
> > http://phenix-online.org/mailman/listinfo/phenixbb
> >
> >
>
>
> _______________________________________________
> phenixbb mailing list
> phenixbb(a)phenix-online.org <mailto:[email protected]>
> http://phenix-online.org/mailman/listinfo/phenixbb
>
>
>
>
> _______________________________________________
> phenixbb mailing list
> phenixbb(a)phenix-online.org
> http://phenix-online.org/mailman/listinfo/phenixbb
13 years, 11 months

[phenixbb] POSITION AVAILABLE: Visiting Research Specialist Position at the University of Illinois at Chicago
by Yury Polikanov
*Visiting Research Specialist Position Available in the Polikanov Lab
<https://sites.google.com/view/polikanovlab>Department of Biological
Sciences, University of Illinois at Chicago*
*The Polikanov Laboratory <https://sites.google.com/view/polikanovlab> in
the Department of Biological Sciences at the University of Illinois at
Chicago is looking for a talented, skillful, and highly motivated
researcher to fill the Visiting Research Scientist position to study the
structure and function of the bacterial ribosome.*
For full consideration, please submit your application along with a Cover
Letter, CV, and three references by May 31, 2025. Cover letter should
describe previous research accomplishments and scientific interests as well
as how these fit within the scope of the research in the Polikanov lab.
Questions can be addressed to Yury Polikanov at yuryp(a)uic.edu.
Visiting Research Specialist- UIC Department of Biological Sciences,
Polikanov Lab
Hiring Department: Biological Sciences
Location: Chicago, IL USA
Requisition ID: 1033622
Posting Close Date: 05/31/2025
Link to Apply:
https://uic.csod.com/ux/ats/careersite/1/home/requisition/14681?c=uic
*Position Summary:*
The Polikanov Lab is seeking a Visiting Research Specialist. The successful
candidate will conduct research using X-ray crystallography and cryo-EM
techniques to obtain a structural basis for various functions of the
bacterial ribosome and its inhibition by different small-molecule
inhibitors. The successful candidate should have a strong theoretical
background in standard biochemical, molecular biology, and biophysical
methods, as well as practical skills in the most standard methods currently
used in molecular biology and biochemistry (such as protein expression,
purification, etc.). Knowledge in X-ray crystallography or other structural
techniques is desired but not required.
The successful candidate would conduct research and assist with training
laboratory personnel in various techniques, experiments, and the
appropriate use of research equipment. He/She would also perform data
analysis and manage, document, and help with the dissemination of research
activities for publication and outreach.
*Duties & Responsibilities:*
- Record keeping: log and track inventory of chemicals, reagents, and
lab supplies
- Maintain accurate records of experiments, procedures, and results
- Keep the laboratory clean, organized, and free of contaminants
- Glassware & Equipment Maintenance
- Prepare culture media, buffers, and reagents according to protocols
- Assist other researchers in the lab in conducting experiments
- Purify bacterial ribosomes and all additional components necessary for
the experiments, such as recombinant proteins, tRNAs
- Train undergraduate and graduate students in research methods, use of
equipment, record keeping, and laboratory database entry
*Minimum Qualifications:*
- Bachelor’s degree in biology, chemistry, biochemistry, biophysics, or
related field required. Master’s degree in science preferred.
- A minimum of 3 years of related research experience.
- Strong background in biochemistry and molecular biology
- Practical skills with the most standard methods currently used in
molecular biology, and biochemistry (such as protein expression,
purification)
*Salary:* The budgeted salary range for the position is $48,000 to $60,000.
The pay offered to the selected candidate will be determined based on
factors including (but not limited to) the experience and qualifications of
the selected candidate, including equivalent years in rank, training, and
field or discipline; internal equity; and external market pay for
comparable jobs.
*About the University of Illinois Chicago*
UIC is among the nation’s preeminent urban public research universities, a
Carnegie RU/VH research institution, and the largest university in Chicago.
UIC serves over 34,000 students, comprising one of the most diverse student
bodies in the nation and is designated as a Minority Serving Institution
(MSI), an Asian American and Native American Pacific Islander Serving
Institution (AANAPSI) and a Hispanic Serving Institution (HSI). Through its
16 colleges, UIC produces nationally and internationally recognized
multidisciplinary academic programs in concert with civic, corporate and
community partners worldwide, including a full complement of health
sciences colleges. By emphasizing cutting-edge and transformational
research along with a commitment to the success of all students, UIC
embodies the dynamic, vibrant and engaged urban university. Recent “Best
Colleges” rankings published by U.S. News & World Report, found UIC climbed
up in its rankings among top public schools in the nation and among all
national universities. UIC has over 300,000 alumni, and is one of the
largest employers in the city of Chicago.
Benefits eligible positions include a comprehensive benefits package which
offers: Health, Dental, Vision, Life, Disability & AD&D insurance; a
defined benefit pension plan; paid leaves such as Vacation, Holiday and
Sick; tuition waivers for employees and dependents. Click for a complete
list of *Employee Benefits <https://www.hr.uillinois.edu/benefits/>*.
*Final authorization of the position is subject to availability of funding.*
*UIC is an EOE including Disability/Vets.*
*The University of Illinois may conduct background checks on all job
candidates upon acceptance of a contingent offer. Background checks will be
performed in compliance with the Fair Credit Reporting Act*
*The University of Illinois System requires candidates selected for hire to
disclose any documented finding of sexual misconduct or sexual harassment
and to authorize inquiries to current and former employers regarding
findings of sexual misconduct or sexual harassment. For more information,
visit **https://www.hr.uillinois.edu/cms/One.aspx?portalId=4292&pageId=1411899
<https://www.hr.uillinois.edu/cms/One.aspx?portalId=4292&pageId=1411899>*
2 months

Re: [phenixbb] Table 1 successor in 3D?
by John R Helliwell
Dear Gerard,
Thank you for your detailed and informative reply to our message on phenixbb,
which was itself a reply to yours.
We agree that inadequacies of the detector setting will not be remedied by
saving unmerged intensities. The raw diffraction images nevertheless contain
more information, even at an imperfect detector setting. The role
of the crystallographic associations and the facilities in professional
training through their courses is important.
There is much enthusiasm in the earlier literature about preserving raw
diffraction images, as Bernhard has also referred to (1913). We appreciate
the usefulness of Staraniso and the fact that it is more informative than
Table 1. Not least we greatly appreciate your work for the IUCr on these
matters as consultant to dddwg and now CommDat.
All best wishes,
Loes and John
On Tue, Jun 12, 2018 at 5:24 PM, Gerard Bricogne <gb10(a)globalphasing.com>
wrote:
> Dear John and Loes,
>
> Thank you for reiterating on this BB your point about depositing
> raw diffraction images. I will never disagree with any promotion of
> that deposition, or praise of its benefits, given that I was one of
> the earliest proponents and most persistently vociferous defenders of
> the idea, long before it gained general acceptance (see Acta D65, 2009
> 176-185). There has never been any statement on our part that the
> analysis done by STARANISO disposes of the need to store the original
> images and to revisit them regularly with improved processing and/or
> troubleshooting software. At any given stage in this evolution,
> however, (re)processing results will need to be displayed, and it is
> with the matter of what information about data quality is conveyed (or
> not) by various modes of presentation of such results that Bernhard's
> argument and (part of) our work on STARANISO are concerned.
>
> Furthermore we have made available the PDBpeep server at
>
> http://staraniso.globalphasing.org/cgi-bin/PDBpeep.cgi
>
> that takes as input a 4-character PDB entry code and generates figures
> from the deposited *merged amplitudes* associated with that entry. The
> numbers coming out of a PDBpeep run may well have questionable
> quantitative value (this is pointed out in the home page for that
> server) but the 3D WebGL picture it produces has informative value
> independently from that. Take a look, for instance, at 4zc9, 5f6m or
> 6c79: it is quite plain that these high-resolution datasets have
> significant systematic incompleteness issues, a conclusion that would
> not necessarily jump out of a Table 1 page, even after reprocessing
> the original raw images, without that 3D display.
>
> The truly pertinent point about this work in relation to keeping
> raw images is that the STARANISO display very often suggests that the
> merged data have been subjected to too drastic a resolution cut-off,
> and that it would therefore be worth going back to the raw images and
> to let autoPROC+STARANISO apply a more judicious cut-off. Sometimes,
> however, as in the example given in Bernhard's paper, significant data
> fail to be recorded because the detector was positioned too far from
> the crystal, in which case the raw images would only confirm that
> infelicity and would provide no means of remediating it.
>
>
> With best wishes,
>
> Gerard.
>
> --
> On Wed, Jun 06, 2018 at 09:35:38AM +0100, John R Helliwell wrote:
> > Dear Colleagues
> > Given that this message is now also placed on Phenixbb, we reiterate our
> > key point that deposition of raw diffraction images offers flexibility to
> > readers of our science results for their reuse and at no cost to the
> user.
> > As with all fields our underpinning data should be FAIR (Findable,
> > Accessible, Interoperable and Reusable). Possibilities for free storage
> of
> > data are Zenodo, SBGrid and proteindiffraction.org (IRRMC).
> > With respect to graphic displays of anisotropy of data Gerard's three
> > figures are very informative, we agree.
> > Best wishes
> >
> > Loes and John
> >
> > Kroon-Batenburg et al (2017) IUCrJ and Helliwell et al (2017) IUCrJ
> >
> > On Tue, Jun 5, 2018 at 4:49 PM, Gerard Bricogne <gb10(a)globalphasing.com>
> > wrote:
> >
> > > Dear phenixbb subscribers,
> > >
> > > I sent the message below to the CCP4BB and phenixbb at the same
> > > time last Friday. It went straight through to the CCP4BB subscribers
> > > but was caught by the phenixbb Mailman because its size exceeded 40K.
> > >
> > > Nigel, as moderator of this list, did his best to rescue it, but
> > > all his attempts failed. He therefore asked me to resubmit it, now
> > > that he has increased the upper size limit.
> > >
> > > Apologies to those of you who are also CCP4BB subscribers, who
> > > will already have received this message and the follow-up discussion
> > > it has given rise to.
> > >
> > >
> > > With best wishes,
> > >
> > > Gerard.
> > >
> > > ----- Forwarded message from Gerard Bricogne <gb10> -----
> > >
> > > Date: Fri, 1 Jun 2018 17:30:48 +0100
> > > From: Gerard Bricogne <gb10>
> > > Subject: Table 1 successor in 3D?
> > > To: CCP4BB(a)JISCMAIL.AC.UK, phenixbb(a)phenix-online.org
> > >
> > > Dear all,
> > >
> > > Bernhard Rupp has just published a "Perspective" article in
> > > Structure, accessible in electronic form at
> > >
> > > https://www.cell.com/structure/fulltext/S0969-2126(18)30138-2
> > >
> > > in which part of his general argument revolves around an example
> > > (given as Figure 1) that he produced by means of the STARANISO server
> > > at
> > > http://staraniso.globalphasing.org/ .
> > >
> > > The complete results of his submission to the server have been saved
> > > and may be accessed at
> > >
> > > http://staraniso.globalphasing.org/Gallery/Perspective01.html
> > >
> > > and it is to these results that I would like to add some annotations
> > > and comments. To help with this, I invite the reader to connect to
> > > this URL, type "+" a couple of times to make the dots bigger, and
> > > press/toggle "h" whenever detailed information on the display, or
> > > selection of some elements, or the thresholds used for colour coding
> > > the displays, needs to be consulted.
> > >
> > > The main comment is that the WebGL interactive 3D display does
> > > give information that makes visible characteristics that could hardly
> > > be inferred from the very condensed information given in Table 1, and
> > > the annotations will be in the form of a walk through the main
> > > elements of this display.
> > >
> > > For instance the left-most graphical object (a static view of
> > > which is attached as "Redundancy.png") shows the 3D distribution of
> > > the redundancy (or multiplicity) of measurements. The view chosen for
> > > the attached picture shows a strong non-uniformity in this redundancy,
> > > with the region dominated by cyan/magenta/white having about twice the
> > > redundancy (in the 6/7/8 range) of that which prevails in the region
> > > dominated by green/yellow (in the 3/5 range). Clear concentric gashes
> > > in both regions, with decreased redundancy, show the effects of the
> > > inter-module gaps on the Pilatus 2M detector of the MASSIF-1 beamline.
> > > The blue spherical cap along the a* axis corresponds to HKLs for which
> > > no measurement is available: it is clearly created by the detector
> > > being too far from the crystal.
> > >
> > > The second (central) graphical object, of which a view is given
> > > in Figure 1 of Bernhard's article and another in the attached picture
> > > "Local_I_over_sigI.png") shows vividly the blue cap of measurements
> > > that were missed but would probably have been significant (had they
> > > been measured) cutting into the green region, where the local average
> > > of I/sig(I) ranges between 16 and 29! If the detector had been placed
> > > closer, significant data extending to perhaps 3.0A resolution would
> > > arguably have been measured from this sample.
> > >
> > > The right-most graphical object (of which a static view is
> > > attached as "Debye-Waller.png") depicts the distribution of the
> > > anisotropic Debye-Waller factor (an anisotropic generalisation of the
> > > Wilson B) of the dataset, giving yet another visual hint that good
> > > data were truncated by the edges of a detector placed too far.
> > >
> > > Apologies for such a long "STARANISO 101" tutorial but Bernhard's
> > > invitation to lift our eyes from the terse numbers in Table 1 towards
> > > 3D illustrations of data quality criteria was irresistible ;-) . His
> > > viewpoint also agrees with one of the main purposes of our STARANISO
> > > developments (beyond the analysis and remediation of anisotropy, about
> > > which one can - and probably will - argue endlessly) namely contribute
> > > to facilitating a more direct and vivid perception by users of the
> > > quality of their data (or lack of it) and to nurturing evidence-based
> > > motivation to make whatever extra effort it takes to improve that
> > > quality. In this case, the undeniable evidence of non-uniformity of
> > > redundancy and of a detector placed too far would give immediate
> > > practical guidance towards doing a better experiment, while statistics
> > > in Table 1 for the same dataset would probably not ... .
> > >
> > > Thank you Bernhard!
> > >
> > >
> > > With best wishes,
> > >
> > > Gerard,
> > > for and on behalf of the STARANISO developers
> > >
> > >
> > > ----- End forwarded message -----
> > >
> > > _______________________________________________
> > > phenixbb mailing list
> > > phenixbb(a)phenix-online.org
> > > http://phenix-online.org/mailman/listinfo/phenixbb
> > > Unsubscribe: phenixbb-leave(a)phenix-online.org
> > >
> >
> >
> >
> > --
> > Professor John R Helliwell DSc
>
--
Professor John R Helliwell DSc
7 years

Re: [phenixbb] reflection file utility and use of modified phases in refinement
by Thomas C. Terwilliger
Hi Engin,
Thanks, yes, I see now what you are looking at:
--- Data for refinement FP SIGFP PHIM FOMM HLAM HLBM HLCM HLDM
FreeR_flag ---
hklout_ref: AutoBuild_run_1_/exptl_fobs_phases_freeR_flags.mtz
The file exptl_fobs_phases_freeR_flags.mtz has a copy of the
(experimental) HL coefficients that were input to autobuild. The labels
HLAM HLBM etc are indeed confusing...they have the ending "M" because they
were copied by resolve and it outputs HLAM etc...but in fact they are not
density modified, just copied straight from the input data file.
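For reference, here is a rough sketch of how a figure of merit follows from a
set of HL coefficients (the phase probability is P(phi) ~ exp(A cos phi +
B sin phi + C cos 2phi + D sin 2phi); the numbers are invented):

import numpy as np

def fom_from_hl(A, B, C, D, n=360):
    phi = np.linspace(0, 2 * np.pi, n, endpoint=False)
    logp = (A * np.cos(phi) + B * np.sin(phi)
            + C * np.cos(2 * phi) + D * np.sin(2 * phi))
    p = np.exp(logp - logp.max())          # normalization cancels in the ratio
    m = np.sum(p * np.exp(1j * phi)) / np.sum(p)
    return abs(m)                          # figure of merit for this reflection

print(fom_from_hl(2.0, -1.0, 0.3, 0.1))    # sharper coefficients -> FOM near 1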
Thank you for pointing that out.
All the best,
Tom T
>> Hi Tom,
>>
>> My second question was about autobuild recommending modified phases to
>> be used in further refinement.
>> This is the end of the log file printed by autobuild. See the line for
>> "Data for refinement":
>>
>> Summary of output files for Solution 3 from rebuild cycle 4
>>
>> --- Model (PDB file) ---
>> pdb_file: AutoBuild_run_1_/cycle_best_4.pdb
>>
>> --- Refinement log file ---
>> log_refine: AutoBuild_run_1_/cycle_best_4.log_refine
>>
>> --- Model-building log file ---
>> log: AutoBuild_run_1_/cycle_best_4.log
>>
>> --- Model-map correlation log file ---
>> log_eval: AutoBuild_run_1_/cycle_best_4.log_eval
>>
>> --- 2FoFc and FoFc map coefficients from refinement 2FOFCWT PH2FOFCWT
>> FOFCWT PH
>> FOFCWT ---
>> refine_map_coeffs: AutoBuild_run_1_/cycle_best_refine_map_coeffs_4.mtz
>>
>> --- Data for refinement FP SIGFP PHIM FOMM HLAM HLBM HLCM HLDM
>> FreeR_flag ---
>> hklout_ref: AutoBuild_run_1_/exptl_fobs_phases_freeR_flags.mtz
>>
>> --- Density-modification log file ---
>> log_denmod: AutoBuild_run_1_/cycle_best_4.log_denmod
>>
>> --- Density-modified map coefficients FP PHIM FOM ---
>> hklout_denmod: AutoBuild_run_1_/cycle_best_4.mtz
>>
>> If HLAM, HLBM, HLCM and HLDM are density-modified phases, it looks like
>> that's what autobuild suggests.
>>
>> Thanks again,
>>
>> Engin
>>
>> On 8/28/09 7:07 AM, Thomas C. Terwilliger wrote:
>>> Hi Engin,
>>>
>>> I'm not sure about your main question...I hope that Nat or Pavel will
>>> answer you on that.
>>>
>>> On the use of density-modified phases in refinement: AutoBuild expects
>>> experimental phases in the data file, with experimental HL coefficients,
>>> and it by default it will use those HL coefficients in refinement with
>>> an
>>> MLHL target.
>>>
>>> The phase probabilities from resolve statistical density modification
>>> are
>>> pretty accurate, and not inflated, so you could use them in refinement
>>> if
>>> you wanted to. I don't suggest it, however, because the
>>> density-modification phase information is not fully independent of the
>>> other information used in refinement (e.g., a flat solvent is implicit
>>> in
>>> your refinement already, so including that through density modification
>>> is
>>> partially redundant).
>>>
>>> ps: I hope AutoBuild doesn't recommend using density-modified phases in
>>> refinement, so if you could send me the text where it says that, I will
>>> check that out!
>>>
>>> All the best,
>>> Tom T
>>>
>>>
>>>>> Hi everybody,
>>>>>
>>>>> I had some trouble with the reflection file utility today. I've been
>>>>> trying to import Rfree-flag column from one of the mtz's to my
>>>>> combined
>>>>> mtz, and it never does. The R-free flag is always left out of the
>>>>> output
>>>>> even when I have it selected. Have you guys seen this (I'm using 147)?
>>>>>
>>>>> Another question I have is about the output of phenix.autobuild.
>>>>> Phenix.autobuild tells me to use modified phase probabilities (HLAM,
>>>>> etc.) in refinement. I am assuming this is density-modified phases.
>>>>> But
>>>>> I've always thought that this would be bad practice (possibly because
>>>>> of
>>>>> unrealistically high FOMs and possible flattening of loops, etc, but
>>>>> maybe resolve does a better job than, say DM). Any ideas on that one?
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Engin
>>>>>
>>>>> --
>>>>> Engin Özkan
>>>>> Post-doctoral Scholar
>>>>> Dept of Molecular and Cellular Physiology
>>>>> Howard Hughes Medical Institute
>>>>> Stanford University School of Medicine
>>>>> ph: (650)-498-7111
>>>>>
>>>>> _______________________________________________
>>>>> phenixbb mailing list
>>>>> phenixbb(a)phenix-online.org
>>>>> http://www.phenix-online.org/mailman/listinfo/phenixbb
>>>>>
>>>>>
>>> _______________________________________________
>>> phenixbb mailing list
>>> phenixbb(a)phenix-online.org
>>> http://www.phenix-online.org/mailman/listinfo/phenixbb
>>>
>>
>>
>> --
>> Engin Özkan
>> Post-doctoral Scholar
>> Laboratory of K. Christopher Garcia
>> Howard Hughes Medical Institute
>> Dept of Molecular and Cellular Physiology
>> 279 Campus Drive, Beckman Center B173
>> Stanford School of Medicine
>> Stanford, CA 94305
>> ph: (650)-498-7111
>>
>>
15 years, 10 months