[phenixbb] lower limits on grid spacing/binning
dale at uoxray.uoregon.edu
Wed Oct 31 00:01:00 PDT 2007
Despite some of my outdated opinions, I'm quite sure that one
has to sample a map at a rate at least twice that of the highest-resolution
Fourier component present in the map prior to sampling on the grid.
While this rule is complicated to apply when sampling an electron
density map calculated from coordinates (or derivatives of such a map),
it is very simple when calculating a map from structure factors:
if you have 2 A data you can sample at 1 A.
If you back-transform that density map you will get exactly the
structure factors you started with, which is the real test.
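A minimal one-dimensional sketch of that round trip (NumPy here, not phenix): a map built from Fourier coefficients and sampled at the Nyquist rate back-transforms to exactly the coefficients you started with.

```python
import numpy as np

# Band-limited "density": Fourier coefficients up to index k_max.
k_max = 8                      # highest-frequency component present
n = 2 * k_max + 1              # sample at (just over) twice that rate
rng = np.random.default_rng(0)

# Random complex coefficients with Hermitian symmetry so the map is real.
coeffs = np.zeros(n, dtype=complex)
coeffs[0] = rng.normal()
for k in range(1, k_max + 1):
    c = rng.normal() + 1j * rng.normal()
    coeffs[k] = c
    coeffs[-k] = np.conj(c)

# Synthesize the map on the grid, then back-transform.
density = np.fft.ifft(coeffs) * n      # real-valued map on n grid points
recovered = np.fft.fft(density) / n

assert np.allclose(recovered, coeffs)  # exact round trip
assert np.allclose(density.imag, 0)    # the map itself is real
```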
Now, just to make your life more difficult I guess, I have to
bring up the fact that the constraint on how finely to sample is not
created by the FFT: the limit is imposed simply by the fact that
you are sampling a continuous function on a grid. Even if you
never plan to FFT the map, if you sample it too coarsely you have
created problems for yourself.
These maps are really continuous functions, so you can sample
them at any rate higher than 2x with no consequence other than
the use of more memory and CPU time. In fact, if you are going to
use a map in some algorithm that interpolates between grid points,
such as NCS averaging, you will have to sample at higher than 2x,
and how high you have to go depends on the details of the interpolation.
Even contouring a map involves interpolation, so to get accurate
contours you need to sample at higher than 2x, and how high depends
on the contouring algorithm and how much accuracy you want. Most people
don't seem to care too much about the accuracy of their contours, but
the problem exists nonetheless.
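A toy illustration (one-dimensional NumPy, not crystallographic code) of why interpolation demands finer-than-2x sampling: linear interpolation at grid midpoints of a cosine sampled near the Nyquist rate is quite inaccurate, and the error falls roughly as the square of the oversampling.

```python
import numpy as np

def midpoint_error(n):
    """Max linear-interpolation error at grid midpoints for
    cos(2*pi*x) sampled on n points over one period (Nyquist = 2)."""
    x = np.arange(n) / n
    f = np.cos(2 * np.pi * x)
    approx = 0.5 * (f + np.roll(f, -1))        # linear interp at midpoints
    exact = np.cos(2 * np.pi * (x + 0.5 / n))  # true values there
    return np.abs(approx - exact).max()

# Error shrinks roughly as 1/n^2 as the sampling gets finer:
for n in (4, 8, 16, 32):
    print(n, midpoint_error(n))
```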
Terry Lang wrote:
> Dear Everyone,
> Thank you to everyone for their thorough responses to my
> question. I do have one follow-up question, though. The majority of
> the references and discussion involves the influence of grid spacing on
> the generation of structure factors during refinement. Once the
> structure is fully refined, I will have the final set of structure
> factors, presumably to be deposited in the PDB. I am interested in the
> back-transformation from final structure factors to electron density
> maps. Does the grid spacing in this case have a general rule of thumb
> as well or, because the data has already been truncated, does it matter?
> Ralf W. Grosse-Kunstleve wrote:
>> Probably, the original poster already got more than he bargained
>> for, but for the record, a few more comments:
>>> All FFT based structure factor programs require that the sampling
>>> rates along each axis be even.
>> I don't think this is true, unless I'm misunderstanding what "sampling
>> rate" means. The FFT in phenix is based on FFTPACK (written in the
>> 80s) which works for any number of grid points. FFTw also supports
>> arbitrary gridding.
>>> They may have other required factors
>>> depending on the space group, but they will be happy to inform you
>>> if you make a choice it doesn't like. They are also more efficient
>>> when the prime factors of the sampling rates are small numbers. Try
>>> to stick with multiples of 2, 3, and 5 if possible.
>> That's good advice. FFTPACK is fastest for transform sizes that are
>> products of the factors 2, 3, 4, and 5. The map calculation algorithms in phenix
>> automatically take this into account.
>>> All FFT programs will fail if you
>>> sample your map coarser than twice that frequency, as SFALL did for
>> This may be true for SFALL, but not for the FFT algorithms in the
>> phenix libraries ($PHENIX/cctbx/include/cctbx/maptbx/structure_factors.h).
>> The critical reference is:
>> David A. Langs (2002), J. Appl. Cryst. 35, 505.
>> As short as it is, this was an incredibly important paper. In
>> retrospect, it is amazing that it took so long for someone to discover
>> the trick.