PET Cookbook

UNDER CONSTRUCTION

Simple recipe for PET analysis

Easily analyze PET image data with PandA. This analysis is intended for data obtained by the Radboud University Medical Centre / Otolaryngology department and the Radboud University Nijmegen / Biophysics department. A more detailed description will follow once this section is finished.

SPM8

First you have to download SPM8. You will use this toolbox to align and reorient the PET images. You will also analyze the data with a MATLAB toolbox provided by Jimmy Shen (Tools for NIfTI and ANALYZE image).

Always use the default settings, unless otherwise indicated.

Data

The data are provided as Siemens DICOM-format IMA or DCM files. Ask Marc van Wanrooij for a representative data set. The images are reconstructed from the original PET data.

Analysis

Import data

Type in:

spm

to start the SPM package. You will be asked which type of experiment you want to analyze. Choose PET&VBM.

Then you will have to import the data from DICOM format (SPM uses the NIfTI format). Press the [Import DICOM] button. In the pop-up window you have to select the files. Navigating is terrible in SPM, so try some things out yourself (and press the question mark for an explanation of the browse dialog). You can select all IMA/DCM files by right-clicking and choosing "Select all". Then choose the output folder, and press "Done".
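If you prefer scripting over clicking, the same import can be done through SPM's batch system. This is only a sketch: the field names follow SPM8's spm.util.dicom batch module, and the directories are example paths you should replace with your own.

```matlab
% Batch DICOM import (sketch; assumes SPM8 is on the MATLAB path
% and that the example directories exist)
spm('defaults','PET');
files = cellstr(spm_select('FPList', ...
    'C:\DATA\PET\Subject\DICOM','.*\.(IMA|dcm)$'));            % all IMA/DCM files
matlabbatch{1}.spm.util.dicom.data             = files;        % files to convert
matlabbatch{1}.spm.util.dicom.root             = 'flat';       % no subdirectory tree
matlabbatch{1}.spm.util.dicom.outdir           = {'C:\DATA\PET\Subject\SPM'};
matlabbatch{1}.spm.util.dicom.convopts.format  = 'img';        % write hdr/img pairs
matlabbatch{1}.spm.util.dicom.convopts.icedims = 0;
spm_jobman('run', matlabbatch);
```

This produces the same s-prefixed hdr/img pairs as the [Import DICOM] button.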

Some processing occurs (this takes time). You will end up with an hdr file and an img file starting with an s (followed by a string of numbers). As you will probably have measured several conditions, you will want to rename these files to correspond to those conditions (e.g. "Control", "Vision", "Audition").

After the images have been imported, you can view them by pressing the "Display" button. You need to select the img file just imported.
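You can also inspect an imported scan directly in MATLAB with Jimmy Shen's toolbox; a minimal sketch (the path and filename are examples, substitute your own):

```matlab
cd('C:\DATA\PET\Subject\SPM');   % folder with the imported NIfTI files
nii = load_nii('Control.img');   % load the hdr/img pair into a structure
view_nii(nii);                   % interactive orthogonal-slice viewer
```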

Check Registration?

You have to check whether the coordinates are okay: e.g., is the origin at the center of the brain? Compare to a default scan. It is important that the origin is (approximately) the same for all scans. If it is not, the registration check will look bad:

If this is the case, you can simply reset the origin in MATLAB with Jimmy Shen's toolbox:

origin   = [64 71 29]; % just a guess, check yourself whether this is approx. the brain center
cd('C:\DATA\PET\Subject\Scan'); % go to the data directory, change this to your own structure
fname    = 's6633786-0684-02806-000856'; % filename of the scan you want to re-origin
nii      = load_nii(fname); % load the scan into a NIfTI structure
nii.hdr.hist.originator(1:3) = origin; % reset the origin
save_nii(nii, 'o_scan'); % save the scan, give it a unique name

or slightly simpler:

origin   = [101 104 41]; % Just a guess, check yourself whether this is approx. the brain center
file   = 'C:\DATA\PET\Subject\SPM\Control.img'; % data-file, change this to your filename
pa_pet_originset(file,origin);

This last function will save the re-originated data and append 'o_' to the filename.

Be sure to reset the origin of all scans to the same coordinates.
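Since every scan needs the same origin, it is convenient to loop over them. A sketch using the pa_pet_originset call from above (the filenames are examples based on the condition names mentioned earlier; substitute your own):

```matlab
origin = [101 104 41];                           % same origin for every scan
files  = {'C:\DATA\PET\Subject\SPM\Control.img', ...
          'C:\DATA\PET\Subject\SPM\Vision.img', ...
          'C:\DATA\PET\Subject\SPM\Audition.img'};
for ii = 1:numel(files)
    pa_pet_originset(files{ii},origin);          % saves re-originated 'o_' files
end
```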

These steps will prompt you to close SPM. You will have to restart SPM to continue.

Spatial preprocessing single subject

Realign

"Realign" is a basic method to match images. Only translations and rotations (rigid-body transformations) are allowed to achieve realignment. You will want to "Estimate & Reslice": this will yield new files, with an r prepended to the img filenames (so you will now have an 'ro_' prefix if you also reset the origin). The first folder will also contain a mean img file. This function is used to correct for motion between functional scans.

Single-subject between condition comparison

If you are only interested in seeing differences between conditions of one subject, you can do the following:

• correct for scan time
• determine difference, and set threshold

So, first load some visual data:

fname  = 'rs4433787-0004-02001-000053'; % visual scan
niiv   = load_nii(fname);

Then load the control baseline data (substitute the filename of your own control scan):

fname  = 'rs4433787-0004-02001-000053'; % control scan
niic   = load_nii(fname);

Then obtain the image volumes from these structures:

V      = niiv.img;
C      = niic.img;

A practical problem with PET is that you would like to always administer the same amount of radioactively labeled material, and always scan at the same time after administration. Since this is impossible, a large apparent difference in activity between conditions can arise. Fortunately, we can correct for this by applying:

$$A_0 = A_t \times \frac{e^{\lambda \times t_1} \times \lambda \times (t_2-t_1)}{1-e^{-\lambda \times (t_2-t_1)}}$$

with {math}A_0{/math} the decay-corrected activity at administration time {math}0{/math}, {math}A_t{/math} the activity measured in the scan frame running from time {math}t_1{/math} to {math}t_2{/math} (in days after administration), and {math}\lambda = {\log{2}}/{T_{1/2}}{/math} the decay constant of Fluorine-18, whose half-life is {math}T_{1/2} = 109.771{/math} minutes, i.e. {math}109.771/(24 \times 60){/math} days.
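As a worked example (the frame times are chosen purely for illustration): for a frame from {math}t_1 = 10{/math} to {math}t_2 = 50{/math} minutes after administration, with {math}\lambda = \log{2}/109.771 \approx 0.00631{/math} per minute,

$$A_0 = A_t \times \frac{e^{0.00631 \times 10} \times 0.00631 \times 40}{1-e^{-0.00631 \times 40}} \approx A_t \times \frac{1.065 \times 0.253}{0.223} \approx 1.21 \, A_t$$

so the measured activity must be scaled up by about 21% to recover the activity at administration time.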

So for example:

thresh    = 200;
threshd    = 3500;
threshdiff  = 10;
figure
x = 0:100:max([C(:); V(:)]);
NC = hist(C(:),x);
NV = hist(V(:),x);
subplot(211)
plot(x,NC,'k-','LineWidth',2);
hold on
plot(x,NV,'r-','LineWidth',2);
ylim([0 3200]);
xlim([150 20000]);
f = round(pa_oct2bw(200,0:1:9));
set(gca,'Xscale','log','XTick',f,'XTickLabel',f);
ylabel('Number of voxels');

% scan frame: 12:15:00 - 12:55:00
t1 = [0 0 0 12 15 00];
t2 = [0 0 0 12 55 00];
e = etime(t2,t1)/(60*60*24);
V = pa_pet_decayfun(V,120.19,e);
% scan frame: 15:03:31 - 16:40:21
t1 = [0 0 0 15 03 31];
t2 = [0 0 0 16 40 21];
e = etime(t2,t1)/(60*60*24);
C = pa_pet_decayfun(C,134,e);

% After decay correction
x = 0:25:max([C(:); V(:)]);
NC = hist(C(:),x);
NV = hist(V(:),x);
subplot(212)
plot(x,NC,'k-','LineWidth',2);
hold on
plot(x,NV,'r-','LineWidth',2);
ylim([0 1500]);
xlim([100 9000]);
f = round(pa_oct2bw(100,0:1:6));
set(gca,'Xscale','log','XTick',f,'XTickLabel',f);
pa_verline(thresh);
% pa_verline(threshd);
ylabel('Number of voxels');

will yield:

Now we have to threshold the data to correct for noise at low counts.

%% Clip everything below the noise threshold
V(V<thresh)  = thresh;
C(C<thresh)  = thresh;

%% Image back into nii
niiv.img  = V;
niic.img  = C;

%% Change origin
niiv.hdr.hist.originator(1:3) = [64 71 29];
niic.hdr.hist.originator(1:3) = [64 71 29];

%% View
view_nii(niic);
view_nii(niiv);

Now we can plot the difference as a color-scale on the scan.

%% Difference
C = niic.img;
V = niiv.img;
V(V<threshd)  = threshd;
C(C<threshd)  = threshd;

D =(V-C)./C;

D(isnan(D)) = nanmin(D(:));
D(isinf(D)) = nanmin(D(:));
D      = D*100;
D(D<threshdiff)    = 0;

figure
x = min(D(:)):max(D(:));
hist(D(:),x);
ylim([0 9000]);
xlim([-50 50]);
niidiff    = niic;
niidiff.img = D;

%% overlay?
option.setvalue.idx = find(D);
option.setvalue.val = D(option.setvalue.idx);
option.useinterp = 1;
% option.setviewpoint = [51      67      25];
view_nii(niiv,option);

Which yields:

Spatial preprocessing across modalities

Coregister

Usually, you want to match structural scans with functional scans. The structural scans can later be transformed to fit a standardized space, together with the coregistered functional scans. So open SPM again, and import a structural CT scan to NIfTI format, as described above for a functional scan.

Then choose "Coregister & Reslice", and select the reference and source image. The source image will be manipulated (translated and rotated) to fit the reference image. This procedure will add an r to the filename. NOTE 1: this also changes the original s-file! NOTE 2: this produces crap when aligning a CT and a PET scan.

Spatial preprocessing across subjects

Normalization

The next step is to normalize the scans into standardized (MNI) space. Normalization transforms the scans by rotation, translation, zooming, shearing, and some non-linear deformations, allowing for small inter-subject differences. Choose "Normalise: Estimate & Write". You will:

• choose a source image (e.g. the mean image from the realign step), which will be matched to the standard template.
• choose the "Images to Write": these scans will be transformed and written, with a w prepended to the filename.
• choose the "Template Image": the default PET template provided by SPM8.

Putting things together - 1 subject

Here follows a MATLAB analysis.

%% Initialization
close all
clear all
clc

thresh    = 200;
threshd    = 3500;
threshdiff  = 10;

cd('C:\DATA\KNO\PET\Subject\Vision');
fname  = 'wrvision';
niiv   = load_nii(fname);

cd('C:\DATA\KNO\PET\Subject\Control');
fname  = 'wrcontrol';
niic   = load_nii(fname);

cd('C:\DATA\KNO\PET\Subject\Audiovisual');
fname  = 'wraudiovisual';
niiav  = load_nii(fname);

%% Images
V      = niiv.img;
C      = niic.img;
AV      = niiav.img;

%% Correct for scan time
% Before decay correction
figure
mx  = max([C(:); V(:); AV(:)]);
x  = 0:mx/1000:mx;
NC = hist(C(:),x);
NV = hist(V(:),x);
NAV = hist(AV(:),x);

subplot(211)
plot(x,NC,'k-','LineWidth',2);
hold on
plot(x,NV,'r-','LineWidth',2);
plot(x,NAV,'g-','LineWidth',2);
ylim([0 3200]);
xlim([100 20000]);
f = round(pa_oct2bw(200,0:1:9));
set(gca,'Xscale','log','XTick',f,'XTickLabel',f);
ylabel('Number of voxels');

t1    = [0 0 0 12 08 00];
t2    = [0 0 0 12 51 16];
e    = etime(t2,t1);
C    = pa_pet_decayfun([126 e],C);

t1    = [0 0 0 11 51 25];
t2    = [0 0 0 12 35 00];
e    = etime(t2,t1);
V    = pa_pet_decayfun([119 e],V);

t1    = [0 0 0 12 44 05];
t2    = [0 0 0 13 28 20];
e    = etime(t2,t1); % seconds (etime returns seconds)
AV    = pa_pet_decayfun([124 e],AV);

beta0  = [100 1];
sel    = C>thresh;
beta  = nlinfit(V(sel),C(sel),@(beta,X)pa_pet_decayfun(beta,X),beta0);
V    = pa_pet_decayfun(beta,V);
beta  = nlinfit(AV(sel),C(sel),@(beta,X)pa_pet_decayfun(beta,X),beta0);
AV    = pa_pet_decayfun(beta,AV);
mx    = max([C(:); V(:); AV(:)]);
x    = 0:mx/1000:mx;
NC    = hist(C(:),x);
NV    = hist(V(:),x);
NAV    = hist(AV(:),x);
subplot(212)
plot(x,NC,'k-','LineWidth',2);
hold on
plot(x,NV,'r-','LineWidth',2);
plot(x,NAV,'g-','LineWidth',2);
ylim([0 3000]);
xlim([100 20000]);
f = round(pa_oct2bw(100,0:1:6));
set(gca,'Xscale','log','XTick',f,'XTickLabel',f);
pa_verline(thresh);
pa_verline(threshd);
ylabel('Number of voxels');

% AV  = AV-C;
% V  = V-C;

%% Clip everything below the noise threshold
AV(AV<thresh)  = thresh;
V(V<thresh)  = thresh;
C(C<thresh)  = thresh;

%% Image back into nii
niiv.img  = V;
niic.img  = C;
niiav.img  = AV;

%% Change origin
niiv.hdr.hist.originator(1:3) = [64 71 29];
niic.hdr.hist.originator(1:3) = [64 71 29];
niiav.hdr.hist.originator(1:3) = [64 71 29];

%% View
view_nii(niic);
view_nii(niiv);
view_nii(niiav);

%% Difference
C = niic.img;
V = niiv.img;
AV = niiav.img;

V(V<threshd)  = threshd;
AV(AV<threshd)  = threshd;
C(C<threshd)  = threshd;

D =(AV-V)./V;
% D =(AV-C)./C;
% D =(V-C)./C;

D(isnan(D)) = nanmin(D(:));
D(isinf(D)) = nanmin(D(:));
D      = D*100;
D(D<threshdiff)    = 0;

figure
x = min(D(:)):max(D(:));
hist(D(:),x);
ylim([0 9000]);
xlim([-50 50]);
niidiff    = niic;
niidiff.img = D;

%% overlay?
option.setvalue.idx = find(D);
option.setvalue.val = D(option.setvalue.idx);
option.useinterp = 1;
% option.setviewpoint = [51      67      25];
view_nii(niic,option);