Hi Jon,
you can find the answer without wasting time on guesswork... Take
a few extreme cases from the PDB (big model, many reflections,
tricky space group) and run them from the command line:
phenix.refine model.pdb data.mtz ordered_solvent=true --show-process-info
The log file will contain memory usage throughout the run. Look
for the maximum memory usage in the last such record (towards the
end of the log file). This will give you an idea of how much
memory you may need.
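If you want to pull that number out automatically, a minimal
sketch along these lines should do. The exact wording and units of
the memory records in the log are an assumption here, so adjust
the pattern to whatever --show-process-info actually prints in
your Phenix version:

import re
import sys

# Scan a phenix.refine log for lines mentioning memory and track
# the largest value seen. The record format is an assumption;
# adapt the regex to the actual log text.
pattern = re.compile(r"(\d+(?:\.\d+)?)\s*(KB|MB|GB)", re.IGNORECASE)
scale = {"kb": 1.0 / (1024 * 1024), "mb": 1.0 / 1024, "gb": 1.0}

max_gb = 0.0
with open(sys.argv[1]) as log:
    for line in log:
        if "memory" not in line.lower():
            continue
        for value, unit in pattern.findall(line):
            max_gb = max(max_gb, float(value) * scale[unit.lower()])

print("peak memory seen in log: %.2f GB" % max_gb)

Run it as "python scan_log.py your_refine.log" (the log file name
is whatever your run produced).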
Pavel
On 7/22/12 4:31 AM, Jon Agirre wrote:
Dear Nat and Pavel,
thank you so much for your explanations. Assuming that the 8 in
Nat's formula is the conversion to bytes, the RAM requirement is
not that big. I guess most virus structures should be approachable
on an 8 GB machine.
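As a back-of-the-envelope check (the cell and resolution numbers
below are illustrative assumptions for a virus-sized case, not
figures from the thread, and one 8-byte double per grid point is a
simplification of the real cost):

# Rough map-memory estimate: one double (8 bytes) per FFT grid
# point. Cell edge and resolution are assumed, illustrative values.
cell_edge = 400.0              # unit cell edge in Angstrom (assumed cubic)
d_min = 3.0                    # high-resolution limit in Angstrom
resolution_factor = 1.0 / 3.0  # grid step = d_min * resolution_factor

step = d_min * resolution_factor      # ~1.0 A grid spacing
n = int(round(cell_edge / step))      # points per axis, ~400
bytes_per_point = 8                   # one double per point
total_gb = (n ** 3 * bytes_per_point) / 1024.0 ** 3

print("grid %d^3, ~%.2f GB per map" % (n, total_gb))  # ~0.48 GB

On those assumptions a single map is around half a gigabyte, which
fits comfortably in 8 GB even with several maps in memory at once.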
About the CPU, I think I'm going to invest in a quad-core. I feel
quite comfortable on the command line, and I'm not afraid of
parallelizing existing code.
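(A minimal sketch of what that could look like: running one
independent phenix.refine job per core with Python's
multiprocessing. The file names are hypothetical and the jobs are
assumed to be independent of each other.)

import subprocess
from multiprocessing import Pool

# Hypothetical list of independent refinement jobs, one per core.
jobs = [
    ("model_a.pdb", "data_a.mtz"),
    ("model_b.pdb", "data_b.mtz"),
    ("model_c.pdb", "data_c.mtz"),
    ("model_d.pdb", "data_d.mtz"),
]

def refine(args):
    pdb, mtz = args
    # Each refinement runs as its own process; phenix.refine itself
    # is treated as a black box here.
    return subprocess.call(["phenix.refine", pdb, mtz])

if __name__ == "__main__":
    pool = Pool(processes=4)  # quad-core machine
    pool.map(refine, jobs)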
Thanks again,
Jon
2012/7/21 Pavel Afonine
<[email protected]>
But the resolution_factor is inconsistent - for the FFT structure
factors calculation (which is unavoidable), we are definitely
using 1/3 (I assume for speed reasons). For most of the other
optional tasks, like rotamer correction and filling missing F-obs,
it's 1/4.
Yes, we use 1/3 for structure factor and gradient calculations,
and 1/4 for map calculations if the map is going to be used for
things like water picking, real-space refinement, etc. This is
intentional.
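(To make the gridding concrete: with standard FFT gridding the
grid step is roughly d_min * resolution_factor. The cell and
resolution numbers below are illustrative assumptions, not values
from this thread.)

# How resolution_factor sets the map grid. Numbers are assumed,
# illustrative values.
d_min = 2.0        # high-resolution limit in Angstrom (assumed)
cell_edge = 100.0  # cell edge in Angstrom (assumed)

for rf in (1.0 / 3.0, 1.0 / 4.0):
    step = d_min * rf                 # grid spacing
    n = int(round(cell_edge / step))  # points along this edge
    print("resolution_factor %.3f -> step %.2f A, %d points/axis"
          % (rf, step, n))

# A 1/4 factor gives (4/3)**3 ~ 2.4x more grid points (and memory)
# than 1/3, which is why the faster 1/3 gridding is used for the
# unavoidable structure factor FFTs.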
Pavel
--
Jon Agirre, PhD
Biophysics Unit (CSIC-UPV/EHU)
http://www.ehu.es/jon.agirre
+34656756888