On Mon, May 17, 2010 at 3:18 PM, Ed Pozharski <span dir="ltr"><<a href="mailto:epozh001@umaryland.edu">epozh001@umaryland.edu</a>></span> wrote:<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<div class="im">I have always used denzo/scalepack, and then scalepack2mtz to</div>
convert .sca file to .mtz file. So my data is always processed<br>
according to French&Wilson.<br>
<br>
Now from what you are saying I understand that there is some possibility<br>
to get into using non-truncated data with phenix? And not only that, it<br>
seems to be the default?<br></blockquote><div><br></div><div>FYI, AutoSol, AutoMR, and Phaser all accept scalepack files as input (or d*TREK or XDS, I think), and generate MTZ files as output, so if a user jumps directly from HKL2000 to Phenix, it would be very easy to skip the French&Wilson step. The need to run an extra conversion step in a different suite is not going to be obvious to grad students (and many if not most postdocs).</div>
<div><br></div><div>We've discussed implementing the French & Wilson protocol in CCTBX, but I don't know how much work that would be (since, even after reading this entire discussion, I still don't know what it actually does).</div>
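<div><br></div><div>For what it's worth, my reading of the original paper (French & Wilson, Acta Cryst. A34, 1978) is that the core idea is Bayesian: combine the measured intensity (which can be negative after background subtraction) with a Wilson prior on the amplitude, and report the posterior mean of |F|, which is always positive. Here is a rough numerical sketch of the acentric case in plain Python — this is not cctbx code; the per-shell mean intensity <tt>Sigma</tt>, the Gaussian measurement model, and the integration grid limits are all assumptions of this toy example:</div>

```python
import numpy as np

def french_wilson_acentric(i_obs, sigma_i, Sigma):
    """Posterior mean of |F| given one measured intensity (acentric case).

    Combines a Gaussian likelihood for the intensity measurement with the
    acentric Wilson prior p(F) = (2F/Sigma) * exp(-F^2/Sigma), where Sigma
    is the mean intensity of the resolution shell, then integrates
    numerically over a grid of candidate amplitudes.
    """
    # amplitude grid; 10*sqrt(Sigma) is an ad hoc upper limit for this toy
    f = np.linspace(0.0, 10.0 * np.sqrt(Sigma), 20000)
    # log of the acentric Wilson prior (tiny offset avoids log(0) at f = 0)
    log_prior = np.log(2.0 * f / Sigma + 1e-300) - f**2 / Sigma
    # Gaussian measurement model: I_obs ~ N(F^2, sigma_I^2)
    log_like = -0.5 * ((i_obs - f**2) / sigma_i) ** 2
    log_post = log_prior + log_like
    w = np.exp(log_post - np.max(log_post))  # unnormalized posterior weights
    return np.sum(f * w) / np.sum(w)
```

<div>For a strong reflection the result is close to sqrt(I), while a weak or negative intensity still yields a small positive amplitude rather than being discarded — which, as I understand it, is the whole point of the procedure.</div>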
<div><br></div><div>-Nat</div></div>