The presence of Rician noise in magnetic resonance imaging (MRI) introduces systematic errors in diffusion tensor imaging (DTI) measurements. This paper evaluates gradient direction schemes and tensor estimation routines to determine how to achieve the maximum accuracy and precision of tensor-derived measures for a fixed amount of scan time. We present Monte Carlo simulations that quantify the effect of noise on diffusion measurements and validate these simulation results against in vivo images. The predicted values of the systematic and random error caused by imaging noise are essential both for interpreting the results of statistical analyses and for selecting optimal imaging protocols given scan time limitations.
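The Monte Carlo approach described above can be sketched as follows. This is a minimal illustration, not the paper's actual simulation code: the gradient scheme (6 directions), b-value, ground-truth tensor, SNR, and trial count are all assumed values chosen for demonstration. It simulates Rician noise by taking the magnitude of a complex Gaussian-corrupted signal, fits the tensor with a log-linear least-squares estimator, and reports the resulting error in fractional anisotropy (FA).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical acquisition: 6 gradient directions, b = 1000 s/mm^2 (assumed).
bval = 1000.0
dirs = np.array([
    [1, 0, 0], [0, 1, 0], [0, 0, 1],
    [1, 1, 0], [1, 0, 1], [0, 1, 1],
], dtype=float)
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

# Ground-truth prolate tensor with typical white-matter diffusivities (mm^2/s).
D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])

def fa(evals):
    """Fractional anisotropy from tensor eigenvalues."""
    md = evals.mean()
    return np.sqrt(1.5 * np.sum((evals - md) ** 2) / np.sum(evals ** 2))

# Design matrix for the log-linear fit: ln S0 - ln S_i = b g_i^T D g_i.
g = dirs
B = bval * np.column_stack([
    g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
    2 * g[:, 0] * g[:, 1], 2 * g[:, 0] * g[:, 2], 2 * g[:, 1] * g[:, 2],
])

S0 = 1.0
signal = S0 * np.exp(-bval * np.einsum('ij,jk,ik->i', dirs, D_true, dirs))

snr = 20.0          # assumed SNR of the b=0 image
sigma = S0 / snr
n_trials = 2000
fas = np.empty(n_trials)
for t in range(n_trials):
    # Rician noise: magnitude of the signal plus complex Gaussian noise.
    noisy = np.abs(signal
                   + sigma * rng.standard_normal(len(signal))
                   + 1j * sigma * rng.standard_normal(len(signal)))
    noisy_b0 = np.abs(S0 + sigma * rng.standard_normal()
                      + 1j * sigma * rng.standard_normal())
    y = np.log(noisy_b0) - np.log(noisy)
    d = np.linalg.lstsq(B, y, rcond=None)[0]  # [Dxx, Dyy, Dzz, Dxy, Dxz, Dyz]
    D = np.array([[d[0], d[3], d[4]],
                  [d[3], d[1], d[5]],
                  [d[4], d[5], d[2]]])
    fas[t] = fa(np.linalg.eigvalsh(D))

fa_true = fa(np.linalg.eigvalsh(D_true))
print(f"true FA = {fa_true:.3f}")
print(f"mean estimated FA = {fas.mean():.3f}, std = {fas.std():.3f}")
```

The gap between the true FA and the mean estimated FA is the systematic (bias) component of the noise-induced error, while the standard deviation across trials is the random component; repeating the experiment over gradient schemes and estimators is what allows the protocols to be compared for a fixed scan time.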
Casey Goodlett, P. Thomas Fletcher, Weili Lin, Gui