Hi Gerwald,

First, my R/Rfree at the moment is 0.18/0.25. 

First of all, some statistics -:)

Here are histograms of Rwork, Rfree and Rfree-Rwork for structures in the PDB solved at a resolution similar to yours:

Histogram of Rwork for all models in the PDB at resolution 3.70-4.70:
     0.185 - 0.216      : 10  <<< your model
     0.216 - 0.246      : 21
     0.246 - 0.277      : 44
     0.277 - 0.308      : 25
     0.308 - 0.338      : 21
     0.338 - 0.369      : 15
     0.369 - 0.400      : 8
     0.400 - 0.430      : 3
     0.430 - 0.461      : 1
     0.461 - 0.492      : 1
Histogram of Rfree for all models in the PDB at resolution 3.70-4.70:
     0.226 - 0.253      : 10  <<< your model
     0.253 - 0.280      : 17
     0.280 - 0.308      : 22
     0.308 - 0.335      : 39
     0.335 - 0.362      : 24
     0.362 - 0.389      : 19
     0.389 - 0.417      : 10
     0.417 - 0.444      : 3
     0.444 - 0.471      : 3
     0.471 - 0.498      : 2
Histogram of Rfree-Rwork for all models in the PDB at resolution 3.70-4.70:
     0.001 - 0.012      : 15
     0.012 - 0.023      : 17
     0.023 - 0.034      : 20
     0.034 - 0.045      : 24
     0.045 - 0.055      : 40
     0.055 - 0.066      : 17
     0.066 - 0.077      : 9  <<< your model
     0.077 - 0.088      : 5
     0.088 - 0.099      : 0
     0.099 - 0.110      : 2
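
If you want to reproduce or extend these histograms yourself, a Phenix installation should be able to print them directly; the command name and argument below are from memory (an assumption on my part), so treat this as a sketch and check your installed version:

    phenix.r_factor_statistics 4.2   # 4.2 = midpoint of the 3.70-4.70 range above; use your own high-resolution limit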


Well, even though the number of structures at this resolution is not that large, you are still not alone with such R-factors. (I can print the same histograms for B-factors if you are interested.)

The refinement is not complete and weights are not optimized, so the gap may get smaller.

Yes, and that may put your R-factor values into more populated bins in the above histograms.

I would suggest:

- use TLS (try to make your best guess about the TLS groups, or use the TLSMD server to define them - but for that, do some group B-factor refinement first, with two B-factors per residue); in fact, use "tls+group_adp" - in your case I think that is the most appropriate choice (see the command sketch after this list).
- try torsion angle dynamics instead of refining individual coordinates.
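
Something along these lines on the command line (just a sketch: your_model.pdb / your_data.mtz are placeholders, and the exact parameter names can vary between phenix.refine versions, so check phenix.refine --show-defaults):

    # group B-factor refinement, two B-factors per residue (also useful as input for TLSMD):
    phenix.refine your_model.pdb your_data.mtz \
      strategy=individual_sites+group_adp \
      group_adp_refinement_mode=two_adp_groups_per_residue

    # then, with TLS groups from your best guess or from the TLSMD server:
    phenix.refine your_model.pdb your_data.mtz \
      strategy=individual_sites+tls+group_adp \
      adp.tls="chain A" adp.tls="chain B"

The adp.tls selections are only examples; one group per chain (or per domain) is a reasonable start if you have nothing better.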

I am aware that one possible answer is that phenix is just really good...

Could be -:)

The other issue is about B factors. Phenix will happily refine B factors to values of 300 if there is no density.

Yes, we don't have any ad hoc restrictions on B-factors, and it would not be good to have them. If there is no density, the atom gets smeared out by an increased B-factor - that is the honest thing for phenix.refine to do. But if there is some density and the B-factors are still high, once again, try TLS: it may be that this part of your model undergoes some large collective motion, and a TLS model will be physically more appropriate for that kind of movement. The B-factors may still end up large, and that is OK - see the many relevant discussions on ccp4bb about the meaning of huge B-factors (comments by Ian Tickle, for example).
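
(Just for a sense of scale: B = 8*pi^2*<u^2>, so a B-factor of 300 A^2 corresponds to a root-mean-square displacement of sqrt(300/(8*pi^2)), roughly 1.9 A.)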

Pavel.