Neuroimaging Data Processing/Processing/Steps/Smoothing

Spatial Smoothing
A signal can be examined in the time domain, the space domain, or the frequency domain, and "smoothing" is best understood in the frequency domain. Imagine a signal represented as a curve of power versus frequency: the curve may rise jumpily to its maximum, and the peak may be followed by a long undershoot. Smoothing simply filters out the high-frequency components, with the aim of increasing the signal-to-noise ratio of the images. So-called low-pass filters remove high-frequency components while retaining the low ones, whereas high-pass filters do the opposite; smoothing therefore calls for a low-pass filter. The most popular way to smooth is to convolve the three-dimensional image with a three-dimensional Gaussian filter. The degree of smoothing is proportional to the full width at half-maximum (FWHM) of the Gaussian distribution, which is related to the standard deviation (σ) by the equation FWHM = 2σ$\sqrt{2ln(2)}$. The larger the standard deviation (σ), the flatter the curve, and the stronger the smoothing.
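The convolution and the FWHM–σ relationship can be sketched in a few lines; this is a toy illustration using SciPy's generic Gaussian filter (the voxel sizes and FWHM below are made-up example values, not recommendations):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fwhm_to_sigma(fwhm_mm, voxel_size_mm=1.0):
    """Convert a Gaussian FWHM (in mm) to a standard deviation in voxels,
    using FWHM = 2 * sigma * sqrt(2 * ln 2)."""
    return fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0))) / voxel_size_mm

# Toy 3D "image": a single bright voxel, i.e. a high-frequency spike.
image = np.zeros((21, 21, 21))
image[10, 10, 10] = 1.0

# Smooth with a 4 mm FWHM kernel on (hypothetical) 2 mm isotropic voxels.
sigma_vox = fwhm_to_sigma(4.0, voxel_size_mm=2.0)
smoothed = gaussian_filter(image, sigma=sigma_vox)

# The spike is spread over its neighbours: the peak drops,
# while the total intensity is preserved.
print(round(sigma_vox, 3))           # 0.849
print(smoothed.max() < image.max())  # True
```

Note that the low-pass character is visible directly: the sharp spike (all frequencies) becomes a broad bump (low frequencies only), while the overall intensity is conserved.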

What is a good smoothing?
Intuitively, the larger the smoothing kernel, the more blurred the image becomes, and the more information is lost. What, then, is a suitable amount of smoothing to balance retaining high resolution against removing artifacts? The answer depends on the purpose of smoothing. First, if the aim is to reduce noise in the image, a sensible choice is a kernel no larger than the activation signals you intend to find. This is straightforward to understand: no tadpole would be caught with a net that has dolphin-sized holes. Second, smoothing may serve to reduce the effects of anatomical variability that spatial normalization failed to correct; in this case the optimal smoothing depends on the degree of variability in the series of images being compared. Last but not least, smoothing may be a prerequisite for the statistical analysis (for instance, Gaussian random field theory requires a certain degree of spatial smoothness), in which case it is appropriate to select an FWHM of twice the voxel dimensions.
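The last rule of thumb is just arithmetic on the voxel dimensions; a minimal sketch (the helper name and voxel sizes are illustrative, not from any package):

```python
def rule_of_thumb_fwhm(voxel_dims_mm):
    """FWHM of twice the voxel dimensions, per axis -- the rule of thumb
    for analyses based on Gaussian random field theory."""
    return [2.0 * d for d in voxel_dims_mm]

print(rule_of_thumb_fwhm([3.0, 3.0, 3.0]))  # [6.0, 6.0, 6.0]
print(rule_of_thumb_fwhm([2.0, 2.0, 4.0]))  # [4.0, 4.0, 8.0]
```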

Resting State fMRI
In graph-based analysis, the application of spatial smoothing is controversial. It has been argued that the increased spatial dependency introduced by smoothing might confound local connectivity strength, especially for the small ROIs used in voxel-based parcellation.

SPM
SPM (Statistical Parametric Mapping)

FSL
via GUI:

Start the FSL GUI, click on FEAT FMRI analysis and select Pre-stats. In the field Spatial smoothing FWHM (mm) you can set your smoothing kernel. By default it is set to 5 mm. If you don't want to smooth at all, set the value to 0. Bear in mind that the extent of smoothing (that makes sense) depends on the size of the underlying activation area: if you are looking for a large area of activation, you can increase the kernel (e.g. to values of 10–15 mm); if you are looking for a small activation area, you should reduce it below the default 5 mm.
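On the command line, FSL's fslmaths can also smooth with its -s option, which takes a Gaussian sigma in mm rather than an FWHM, so the FEAT default of 5 mm FWHM has to be converted first (a sketch; the file names are placeholders, and you should check fslmaths's help on your installation):

```python
import numpy as np

def fwhm_to_sigma_mm(fwhm_mm):
    # FWHM = 2 * sigma * sqrt(2 * ln 2), so sigma = FWHM / 2.3548...
    return fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0)))

sigma = fwhm_to_sigma_mm(5.0)
print(round(sigma, 3))  # 2.123

# Hypothetical invocation with placeholder file names:
print(f"fslmaths input.nii.gz -s {sigma:.3f} smoothed.nii.gz")
```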

AFNI
The 3dmerge function contains an option for spatial smoothing, -1blur. There are different parameters you can adjust (e.g. to use rms or sigma thresholds); check the manual page. For a Gaussian smoothing kernel of 4 mm FWHM (which is the default) applied to the whole dataset, the command would be: 3dmerge -1blur_fwhm 4.0 -doall -prefix OUTPUTFILE INPUTFILE
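For scripting, the same call can be assembled and launched from Python (a minimal sketch; it assumes AFNI is installed and on the PATH, and INPUTFILE/OUTPUTFILE are the placeholders from the command above):

```python
import subprocess

def smooth_with_3dmerge(in_file, out_prefix, fwhm_mm=4.0):
    """Assemble the 3dmerge command for Gaussian smoothing of the whole
    dataset; the file names here are placeholders."""
    return ["3dmerge", "-1blur_fwhm", str(fwhm_mm),
            "-doall", "-prefix", out_prefix, in_file]

cmd = smooth_with_3dmerge("INPUTFILE", "OUTPUTFILE", 4.0)
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment when AFNI is available
```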

In afni_proc.py, blur is a default block with the following settings, which can however be changed: -blur_filter -1blur_fwhm, -blur_size 4, -blur_in_mask no.