
Sequence/parameter optimization

Open zrazi opened this issue 3 years ago • 12 comments

How would you optimize your sequence/parameters for T1/T2 mapping using IR for different T1/T2 series? What would you set up as your ground truth? What is the best way to evaluate your model afterward quantitatively?

zrazi avatar Apr 08 '21 02:04 zrazi

Same question!

AlexMorello avatar Jan 18 '22 19:01 AlexMorello

For T1 mapping, I suggest these blog posts.

As for T2, e.g., for the mono_t2 model:

[screenshot: the mono_t2 model in the qMRLab GUI]

You can run a single voxel curve simulation for the TEs shown on the protocol panel, fitting data simulated for the T2/M0 values you set:

[screenshot: the single voxel curve simulation panel]

It also reports the Cramér-Rao lower bound. By changing the TEs, T2/M0, and SNR, you can see how the estimation changes. It is also important that your data acquisition complies with the signal representation the model is built on. For example, mono_t2 is based on a multi-echo spin echo acquisition.

You can expand this simulation to a sensitivity analysis to see the behaviour for a range of values. The simulations are available for inversion_recovery as well.
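
If you prefer scripting over the GUI, a minimal sketch along the lines of the qMRLab batch examples would look roughly like this (the TE list and the T2/M0 values below are placeholders, and exact field names can differ slightly between qMRLab versions):

Model = mono_t2;
Model.Prot.SEdata.Mat = (10:10:320)'; % placeholder TEs in ms

x = struct;
x.T2 = 100;   % ground-truth T2 (ms)
x.M0 = 1000;  % ground-truth M0
Opt.SNR = 50; % noise level of the simulated signal

% Simulate a noisy decay for these TEs, then fit it back and compare to x
FitResult = Model.Sim_Single_Voxel_Curve(x, Opt);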

I highly recommend the blog post on vfa_t1 for a more complete understanding.

agahkarakuzu avatar Jan 18 '22 19:01 agahkarakuzu

What is the best way to evaluate your model afterward quantitatively?

For relaxometry measurements, having a ground truth is the most convenient. If you have a range of reference T1/T2 values, you can look at the percent error to see whether your protocol is acceptable.
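
Computing it is trivial; the refT1 and fitT1 vectors below are just hypothetical reference and estimated values in matching units:

refT1 = [250 500 1000 1500];   % reference T1 values (ms)
fitT1 = [242 515  980 1540];   % values estimated with your protocol
percentError = 100 * abs(fitT1 - refT1) ./ refT1;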

If not, one way is to check how the data was fitted to the model per voxel (there's a sanity check section in the GUI). Although this gives overall information about the appropriateness of the fit, it is not an actual measurement of accuracy.

If you think that your quantifications make sense for whichever substance you are imaging, then another good practice is to test the stability of your measurements. I can suggest some useful statistics for such an evaluation, if that is of interest.

That said, these are broad pointers. A more complete answer often calls for details of the application: which sequence was used, what the target was, etc. I hope that some of these answers are useful for you. Cheers!

agahkarakuzu avatar Jan 18 '22 20:01 agahkarakuzu

Thank you Agah. Would you elaborate more on the stability of the measurements? Do you mean reproducibility over time? One challenge I have with qMRLab is preparing my images to be suitable as input. I understand it should be 4D (row, col, slices, inversion times). Is that right? Thank you for your help. Warmly, Zahra


zrazi avatar Jan 19 '22 03:01 zrazi

As for stability, yes, you could call it reproducibility over time. More common terms are reliability/consistency/agreement. As it is quite common in the literature, the intraclass correlation would be a good start; this article (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6645485/) is insightful on that. Again, it depends on your experimental design and what you'd like to achieve.
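
If you want to script it, here is a minimal sketch of a two-way random, absolute-agreement, single-measure ICC (ICC(2,1), following the definitions in that article). M is a hypothetical n-by-k matrix with n ROIs (e.g., phantom spheres) in rows and k repeated sessions in columns:

function icc = icc21(M)
% ICC(2,1): two-way random effects, absolute agreement, single measurement
[n, k]  = size(M);
rowMean = mean(M, 2);   % per-ROI means
colMean = mean(M, 1);   % per-session means
grand   = mean(M(:));

SSR = k * sum((rowMean - grand).^2);    % between-ROI sum of squares
SSC = n * sum((colMean - grand).^2);    % between-session sum of squares
resid = M - rowMean - colMean + grand;  % two-way residuals (implicit expansion)
SSE = sum(resid(:).^2);

MSR = SSR / (n - 1);
MSC = SSC / (k - 1);
MSE = SSE / ((n - 1) * (k - 1));

icc = (MSR - MSE) / (MSR + (k - 1)*MSE + k*(MSC - MSE)/n);
end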

Have you seen the documentation about IR (https://qmrlab.readthedocs.io/en/master/inversion_recovery_batch.html)? It shows that for 9 TIs the input data is [128, 128, 1, 9]. In which format are your multiple inversion images? We can add a simple tool to merge and sort volumes.
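
If the volumes are already exported per TI (e.g., as NIfTI), stacking them along the 4th dimension is enough. A quick sketch with hypothetical file names (niftiread needs the Image Processing Toolbox):

files  = {'IR_TI0030.nii','IR_TI0100.nii','IR_TI0300.nii'}; % one file per TI, sorted by TI
IRData = [];
for ii = 1:numel(files)
    IRData = cat(4, IRData, double(niftiread(files{ii})));
end
% For 128x128 single-slice images and 9 TIs this gives [128, 128, 1, 9]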

agahkarakuzu avatar Jan 19 '22 15:01 agahkarakuzu

Thank you for the links you provided. I download DICOM files from Visage. A tool would make my life much easier, and I would be really thankful. Would the mask be derived from the same database too? Yours, Zahra


zrazi avatar Jan 19 '22 16:01 zrazi

Can you send me a set of example files via email? I can quickly come up with something for you.

For the mask, only the volumetric dimensions need to match (e.g., [10x1] TI, [128,128,2,10] IRData, [128,128,2] Mask).
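
In other words, the data struct passed to FitData would look something like this (variable names are hypothetical):

data = struct();
data.IRData = IRData;        % [128 x 128 x 2 x 10] merged inversion volumes
data.Mask   = double(Mask);  % [128 x 128 x 2], matches the first three dimensions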

agahkarakuzu avatar Jan 19 '22 16:01 agahkarakuzu

This scan was only one slice, 11 TIs. Could you please also make the mask and let me know how you do it?


zrazi avatar Jan 19 '22 17:01 zrazi

I think you replied to the comment via email with an attachment, but attachments don't show up here. You'll need to send it to my address; I think you have it.

agahkarakuzu avatar Jan 19 '22 17:01 agahkarakuzu

Agah, I emailed you my images :)


zrazi avatar Jan 20 '22 02:01 zrazi

@zrazi

  1. Create a new MATLAB script with the code below and save it as dicom2ir.m.
  2. Make sure that qMRLab is added to your path.
  3. Run the script, select multiple DICOM files (as many TIs as you want), and confirm.

The code will:

  • sort and merge the files according to TI
  • create two masks for your phantom using simple morphometry, one binary and one indexed
  • fit the data using the merged volume, the protocol values it sets, and the binary mask
  • save FitResults.mat and a CSV (mean and STD of T1 for each mask index, sorted by mean)
  • write the outputs to the same directory as your input images

The mask detection is far from robust, but it may still perform well enough for this particular purpose.

function dicom2ir

[list,folder]=uigetfile('*','Select input DICOM images','MultiSelect','on');
lut = cell(2,length(list));
lut(1,:) = list;
for ii=1:length(list)
    in = dicominfo([folder filesep list{ii}]);
    lut(2,ii) = num2cell(in.InversionTime);
    im = dicomread(in);
    if ii==1
       sz = size(im);
       if ndims(im) < 3 % 2D (single-slice) image; preallocate [rows, cols, slices, nTI]
           ims = zeros(sz(1),sz(2),1,length(list)); 
       else
           ims = zeros(sz(1),sz(2),sz(3),length(list)); 
       end
    end
    
    ims(:,:,:,ii) = double(im);
end


% Sort the volumes by ascending inversion time
[TI,idx] = sort([lut{2,:}]);
IRData = ims(:,:,:,idx);


Model = inversion_recovery;
Model.Prot.IRData.Mat = TI';
Model.Prot.TimingTable.Mat = in.RepetitionTime;
data = struct();
data.IRData = IRData; 

% A simple mask generator only for this phantom 
[a,b,~] = imfindcircles(mat2gray(ims(:,:,1,end)),[10 20],'Sensitivity', 0.98);
b = b - 2; % Erode radius by 2 pixels

% A simple sort logic assuming that all 12 circles are detected
% If not, will skip sorting
try
[a,idx] = sortrows(a,1);
a = round(a);
b = b(idx);
yall = repmat(sort(a(1:3,2)),[4,1]);
catch
yall = a(:,2);    
end
theta = (0:1000-1)*(2*pi/1000);
x = a(:,1) + b.*cos(theta);
y = yall + b.*sin(theta);

mask = zeros(sz(1),sz(2));
maskIdx = zeros(sz(1),sz(2));
for ii = 1:size(a,1) % one disk per detected circle center
    cur = poly2mask(x(ii,:),y(ii,:),sz(1),sz(2));
    mask = mask + cur ;
    maskIdx = maskIdx + cur*ii;
end

data.Mask= double(mask);
FitResults = FitData(data,Model,0);

FitResults.maskIdx = maskIdx;

% In case there are not 12 circles captured
idxs = unique(maskIdx(maskIdx(:)>0));
stat = zeros(length(idxs),3);
for ii=1:length(idxs)
    stat(ii,1) = ii; 
    stat(ii,2) = nanmean(FitResults.T1(maskIdx==ii));
    stat(ii,3) = nanstd(FitResults.T1(maskIdx==ii));
end

stat = sortrows(stat,2);
writematrix(stat,[folder filesep 'simplestat.csv']) 

save([folder filesep 'FitResults.mat'],'FitResults');
end
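
Running it from the MATLAB prompt would look something like this (the qMRLab path is a placeholder):

addpath(genpath('/path/to/qMRLab')); % adjust to your qMRLab location
dicom2ir                             % a file dialog opens; select all TI images
% FitResults.mat and simplestat.csv are written next to the input images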


agahkarakuzu avatar Jan 21 '22 07:01 agahkarakuzu

@zrazi the script below works the same way, but for the mono_t2 model for T2 mapping:

function dicom2monot2

[list,folder]=uigetfile('*','Select input DICOM images','MultiSelect','on');
lut = cell(2,length(list));
lut(1,:) = list;
for ii=1:length(list)
    in = dicominfo([folder filesep list{ii}]);
    lut(2,ii) = num2cell(in.EchoTime);
    im = dicomread(in);
    if ii==1
       sz = size(im);
       if ndims(im) < 3 % 2D (single-slice) image; preallocate [rows, cols, slices, nTE]
           ims = zeros(sz(1),sz(2),1,length(list)); 
       else
           ims = zeros(sz(1),sz(2),sz(3),length(list)); 
       end
    end
    
    ims(:,:,:,ii) = double(im);
end


% Sort the volumes by ascending echo time
[TE,idx] = sort([lut{2,:}]);
SEdata = ims(:,:,:,idx);


Model = mono_t2;
Model.Prot.SEdata.Mat = TE';

data = struct();
data.SEdata = SEdata; 

% A simple mask generator only for this phantom 
[a,b,~] = imfindcircles(mat2gray(ims(:,:,1,end)),[10 20],'Sensitivity', 0.98);
b = b - 2; % Erode radius by 2 pixels

% A simple sort logic assuming that all 12 circles are detected
% If not, will skip sorting
try
[a,idx] = sortrows(a,1);
a = round(a);
b = b(idx);
yall = repmat(sort(a(1:3,2)),[4,1]);
catch
yall = a(:,2);    
end
theta = (0:1000-1)*(2*pi/1000);
x = a(:,1) + b.*cos(theta);
y = yall + b.*sin(theta);

mask = zeros(sz(1),sz(2));
maskIdx = zeros(sz(1),sz(2));
for ii = 1:size(a,1) % one disk per detected circle center
    cur = poly2mask(x(ii,:),y(ii,:),sz(1),sz(2));
    mask = mask + cur ;
    maskIdx = maskIdx + cur*ii;
end

data.Mask= double(mask);
FitResults = FitData(data,Model,0);

FitResults.maskIdx = maskIdx;

% In case there are not 12 circles captured
idxs = unique(maskIdx(maskIdx(:)>0));
stat = zeros(length(idxs),3);
for ii=1:length(idxs)
    stat(ii,1) = ii; 
    stat(ii,2) = nanmean(FitResults.T2(maskIdx==ii));
    stat(ii,3) = nanstd(FitResults.T2(maskIdx==ii));
end

stat = sortrows(stat,2);
writematrix(stat,[folder filesep 'simplestat.csv']) 

save([folder filesep 'FitResults.mat'],'FitResults');
end
  • Create a new MATLAB script with the code above and save it as dicom2monot2.m.
  • Make sure that qMRLab is added to your path.
  • Run the script, select multiple DICOM files (as many TEs as you want), and confirm.

agahkarakuzu avatar Feb 15 '22 10:02 agahkarakuzu