mfisch
When performing sequential Pawley or Le Bail refinements on temperature-dependent data, I have the feeling that the agreement of Pawley refinements is generally better (but slower) than Le Bail with the same convergence criterion. Moreover, Le Bail seems to have a problem getting out of zero intensity for a peak if that peak had zero intensity in the (n-1)th pattern. Is this true? If yes, why? Or am I making a mistake?
mfisch
Anyone else noticed this?
alancoelho
The Le Bail method uses the Rietveld decomposition formula. It scales the intensity of peak 'k' according to:
Q(k) = Sum[ Peak(k, i) Yobs(i)/Ycalc(i), i]
If Yobs == Ycalc (a perfect fit) then the peak intensity is not scaled.
The formula divides by Ycalc(i), so if Ycalc(i) is zero or small there's a problem. Typically Ycalc is kept non-zero by starting the peak intensities at 1. The end result does depend on the starting values, and for numerical stability Ycalc(i) is given a lower limit of 1.0e-15. In TOPAS, after a lot of experimentation, the following is used:
Q(k) =
   Limit[
      Sum[
         Peak(k, i) Limit[Yobs(i) / Max[Ycalc(i), 1e-15], 1/c, c],
         i
      ],
      1/c, c
   ];
where Limit(x, x1, x2) = If(x < x1, x1, If(x > x2, x2, x));
Thus the changes in intensities are damped. It is an iterative numerical procedure, and predicting its properties is non-trivial.
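
A minimal sketch of this damped update in Python (not the TOPAS implementation; the function names, the damping constant c, and the assumption that each stored profile Peak(k, i) sums to 1 over the pattern points are illustrative):

import numpy as np

def limit(x, lo, hi):
    # Clamp x elementwise to the interval [lo, hi]
    return np.clip(x, lo, hi)

def lebail_update(peak_profiles, intensities, y_obs, c=2.0):
    # peak_profiles : (n_peaks, n_points) array, each row summing to 1
    # intensities   : (n_peaks,) current peak intensities -- start at 1, never 0
    # y_obs         : (n_points,) observed, background-subtracted pattern
    # c             : damping constant; scale factors are confined to [1/c, c]
    y_calc = intensities @ peak_profiles              # current calculated pattern
    ratio = y_obs / np.maximum(y_calc, 1e-15)         # guard against tiny Ycalc
    # Clamp point by point, weight by the profile, sum over points,
    # then clamp the resulting scale factor Q(k) once more
    q = limit(peak_profiles @ limit(ratio, 1.0 / c, c), 1.0 / c, c)
    return intensities * q

Because the update is multiplicative and each scale factor is confined to [1/c, c], an intensity that starts at (or decays to) nearly zero can only recover slowly, which seems consistent with the behaviour described in the original question.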
mfisch
Thanks, Alan! OK then... is it somehow possible to use Pawley instead but not use the Pawley intensities in the correlation matrix? I am only interested in lattice parameter errors. The file becomes huge and the overall process very slow...
alancoelho
The intensities affect the lattice parameters, but maybe not by much; thus simply do an extra run without refining the intensities.
If refining on intensities, then the most accurate errors, I think, are bootstrap errors; SVD errors or use_LU errors do not consider negative intensities, whereas bootstrapping does.
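
For illustration only, here is a generic residual-resampling bootstrap for the error on one refined quantity; this is not how the TOPAS bootstrap is implemented internally, and refine() is a hypothetical callable that re-runs the refinement on a resampled pattern and returns, say, a lattice parameter:

import numpy as np

def bootstrap_error(y_obs, y_calc, refine, n_resamples=200, seed=None):
    # Estimate the standard deviation of a refined parameter by re-drawing
    # residuals with replacement and re-refining each synthetic pattern
    rng = np.random.default_rng(seed)
    residuals = y_obs - y_calc
    values = []
    for _ in range(n_resamples):
        synthetic = y_calc + rng.choice(residuals, size=residuals.size, replace=True)
        values.append(refine(synthetic))
    return np.std(values, ddof=1)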
mfisch
Hi Alan
Thanks - I will think of a way to automate this on many files.
Cheers!