(SNR or S/N) The signal-to-noise ratio is used in MRI to describe the relative contributions to a detected signal of the true signal and of random superimposed signals ('background noise'); it is a criterion of image quality.
One common method to increase the SNR is to average several measurements of the signal, on the expectation that random contributions will tend to cancel out. The SNR can also be improved by sampling larger volumes (increasing the field of view and slice thickness, with a corresponding loss of spatial resolution) or, within limits, by increasing the strength of the magnetic field used.
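The effect of signal averaging can be illustrated numerically. A minimal NumPy sketch (the signal level, noise level, and sample counts are arbitrary illustrative values, not MRI data): averaging N independent measurements leaves the signal unchanged while the noise standard deviation falls by the square root of N.

```python
import numpy as np

rng = np.random.default_rng(0)

true_signal = 100.0   # arbitrary "true" signal level
noise_sd = 10.0       # standard deviation of the random noise
n_measurements = 16   # number of repeated acquisitions (NEX)

# Each measurement is the true signal plus independent random noise.
measurements = true_signal + rng.normal(0.0, noise_sd, size=(n_measurements, 10_000))

# Averaging N measurements leaves the signal unchanged but reduces the
# noise standard deviation by sqrt(N), so the SNR improves by sqrt(N).
single = measurements[0]
averaged = measurements.mean(axis=0)

snr_single = true_signal / single.std()
snr_averaged = true_signal / averaged.std()

print(f"single-shot SNR: {snr_single:.1f}")
print(f"averaged SNR:    {snr_averaged:.1f}")
print(f"improvement:     {snr_averaged / snr_single:.2f} (sqrt(16) = 4)")
```

The improvement factor printed at the end comes out close to 4, matching the square-root-of-N behaviour described above: quadrupling the scan time via averaging only doubles the SNR, doubling it again requires four times more averages.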
Surface coils can also be used to improve local signal intensity. The SNR will depend, in part, on the electrical properties of the sample or patient being studied.
The SNR increases in proportion to voxel volume (1/resolution), to the square root of the number of acquisitions (NEX), and to the square root of the number of scans (phase-encoding steps). SNR decreases with the field of view squared (FOV²) and with wider bandwidths. See also Signal Intensity and Spin Density.
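As a rough illustration of these proportionalities, the relative SNR change between two protocols can be sketched as below (a simplified model covering only the voxel-volume, NEX, and phase-encoding terms; the function name and parameters are illustrative, not a standard API):

```python
import math

def relative_snr(voxel_ratio=1.0, nex_ratio=1.0, npe_ratio=1.0):
    """Relative SNR change between two protocols, using the stated
    proportionalities: SNR scales linearly with voxel volume and with
    the square root of the number of acquisitions (NEX) and of
    phase-encoding steps. Bandwidth and FOV terms are omitted."""
    return voxel_ratio * math.sqrt(nex_ratio) * math.sqrt(npe_ratio)

# Doubling slice thickness (doubling voxel volume) at constant NEX:
print(relative_snr(voxel_ratio=2.0))   # 2.0, i.e. twice the SNR
# Quadrupling NEX (at four times the scan time) only doubles SNR:
print(relative_snr(nex_ratio=4.0))     # 2.0
```

This makes the trade-off explicit: halving voxel volume for better spatial resolution costs a factor of two in SNR, which averaging can only recover at four times the scan time.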
Measuring SNR:
Record the mean value of a small ROI placed in the most homogeneous area of tissue with high signal intensity (e.g. white matter in the thalamus). Calculate the standard deviation for the largest possible ROI placed outside the object, in the image background (avoiding regions of ghosting/aliasing or eye-movement artifacts). The SNR is then:
Mean Signal / Standard Deviation of Background Noise
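The procedure above can be sketched in a few lines of NumPy. The image and ROI coordinates here are synthetic placeholders; placing the ROIs according to the rules above (homogeneous tissue, artifact-free background) remains the operator's responsibility.

```python
import numpy as np

def measure_snr(image, signal_roi, background_roi):
    """Estimate SNR as mean(signal ROI) / std(background ROI).

    `signal_roi` and `background_roi` are (row_slice, col_slice) pairs
    selecting rectangular regions of interest in the image."""
    signal_mean = image[signal_roi].mean()
    noise_sd = image[background_roi].std()
    return signal_mean / noise_sd

# Synthetic example: a bright square ("tissue") on a noisy background.
rng = np.random.default_rng(1)
image = rng.normal(5.0, 2.0, size=(128, 128))  # background noise
image[40:80, 40:80] += 100.0                   # high-signal region

snr = measure_snr(
    image,
    signal_roi=(slice(50, 70), slice(50, 70)),     # small ROI inside tissue
    background_roi=(slice(0, 30), slice(0, 128)),  # large ROI outside object
)
print(f"SNR = {snr:.1f}")
```

Note that this sketch assumes the background noise is approximately Gaussian; in real magnitude MR images the background noise distribution differs somewhat, so ROI-based SNR estimates should be treated as comparative figures rather than absolute ones.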