
FreeSurfer Tutorial: Sample Data

The data for the tutorials consists of several data sets:

  • buckner_data-tutorial_subjs.tar.gz ~16GB uncompressed - the main 'recon-all' stream subject data processing (md5sum: ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt)

  • long-tutorial.tar.gz ~16GB uncompressed - the longitudinal tutorial
  • fsfast-tutorial.subjects.tar.gz ~5.6GB uncompressed and fsfast-functional.tar.gz ~9.1GB uncompressed - the FS-FAST tutorial data set
  • diffusion_recons.tar.gz and diffusion_tutorial.tar.gz - the diffusion and Tracula tutorial data sets
  • fbert-feat.tgz and bert.recon.tgz - tutorial on the integration of FreeSurfer and FSL/FEAT

The wget application is recommended, as some web browsers have difficulty downloading files greater than 4GB in size. Mac OS NOTE: use curl -O in place of wget.

Download using wget

Open a terminal and change to a directory where you have at least 100GB of free space. To download, type:

wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.tgz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.tar.gz &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.tar.gz &
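The ampersand/background mechanic used in the commands above can be sketched with stand-in jobs ('sleep' here is only a placeholder for the long-running wget downloads); the shell's wait builtin blocks until every backgrounded job has finished:

```shell
# Sketch: the '&' pattern from the wget commands above, demonstrated with
# 'sleep' as a stand-in for the long-running downloads.
sleep 1 &     # job 1 runs in the background
sleep 1 &     # job 2 runs concurrently
wait          # returns only once every background job has finished
echo "all background jobs finished"
```

Running wait (with no arguments) before unpacking anything is a simple way to be sure none of the backgrounded wget processes is still writing its archive.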

The wget application will handle poor connections, retrying if it has problems. A failed download can be restarted by adding the -c flag to wget, which makes it continue from the point where the partial download stopped. Notice that each command ends with an ampersand, so you can run all of the commands at once (although the interleaved wget output will be hard to decipher; an alternative is to run each command, without the ampersand, in a separate terminal). Go to the Installation section below once the files are downloaded (this will likely take several hours). To verify the downloads, the md5sum for each file is published in a file named *.md5sum.txt at http://surfer.nmr.mgh.harvard.edu/pub/data/, or fetch them this way:

wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/buckner_data-tutorial_subjs.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/long-tutorial.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-functional.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fsfast-tutorial.subjects.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/fbert-feat.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/bert.recon.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_recons.md5sum.txt &
wget ftp://surfer.nmr.mgh.harvard.edu/pub/data/diffusion_tutorial.md5sum.txt &

Verify that these md5sums match the output of running md5sum on your own local downloads.
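The check can be automated with md5sum -c, which reads the expected hash from the .md5sum.txt file and compares it against the local file. A minimal sketch, using a small locally created stand-in file in place of a real multi-GB download (this assumes GNU coreutils md5sum; Mac OS ships 'md5' instead):

```shell
# Sketch: verify a download against its .md5sum.txt file with 'md5sum -c'.
# A tiny stand-in file is used here instead of a real download.
cd "$(mktemp -d)"
printf 'stand-in archive contents\n' > buckner_data-tutorial_subjs.tar.gz
md5sum buckner_data-tutorial_subjs.tar.gz > buckner_data-tutorial_subjs.md5sum.txt
md5sum -c buckner_data-tutorial_subjs.md5sum.txt   # prints '<file>: OK' on a match
```

A nonzero exit status (and a 'FAILED' line) means the archive is corrupt or incomplete and should be re-downloaded, e.g. with wget -c.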

Installation

Once the files are downloaded, move them to the $FREESURFER_HOME/subjects directory, then uncompress and install each one with the following command:

tar xzvf <filename>.tar.gz

Replace <filename> with the name of each downloaded file. The downloaded .tar.gz files can then be deleted.
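If all the archives sit in one directory, a shell loop saves retyping the tar command for each file. A sketch, demonstrated on a tiny stand-in archive rather than the real downloads:

```shell
# Sketch: unpack every downloaded archive in one pass. A small stand-in
# archive is created first so the loop has something to extract.
cd "$(mktemp -d)"
mkdir demo_subject && touch demo_subject/mri.mgz   # stand-in subject data
tar czf demo-tutorial.tar.gz demo_subject
rm -r demo_subject
for f in *.tar.gz; do
    tar xzvf "$f"
done
ls demo_subject/mri.mgz   # the extracted tree is present again
```

The same loop works unchanged in $FREESURFER_HOME/subjects once the real tutorial archives are in place.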

To set up the environment variables TUTORIAL_DATA and SUBJECTS_DIR to point to the tutorial data, type the following commands or include them in your .cshrc or .tcshrc file:

setenv TUTORIAL_DATA $FREESURFER_HOME/subjects
setenv SUBJECTS_DIR $TUTORIAL_DATA/buckner_data/tutorial_subjs/

Note: If you are within the NMR Center, because the default $FREESURFER_HOME is shared, you will not be able to copy your data to $FREESURFER_HOME/subjects. Instead, copy the subject data to a location where you have space, and set the TUTORIAL_DATA and SUBJECTS_DIR environment variables to point to that. You may have to make adjustments throughout the tutorial wherever it refers to $FREESURFER_HOME/subjects (which is equivalent to your $SUBJECTS_DIR).

The tutorial will also instruct you to set SUBJECTS_DIR when appropriate. The tutorial references the TUTORIAL_DATA variable, which is the root directory containing the tutorial data (i.e., the directories 'buckner_data', 'long-tutorial', 'fsfast-functional', etc.).
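The setenv lines above are csh/tcsh syntax. For bash users the equivalent uses export; a sketch, where the FREESURFER_HOME fallback path is only a placeholder (your installation's actual location may differ):

```shell
# Sketch: bash equivalent of the csh 'setenv' lines above.
# The /usr/local/freesurfer fallback is a placeholder path, used only
# if FREESURFER_HOME was not already set by the FreeSurfer setup script.
export FREESURFER_HOME="${FREESURFER_HOME:-/usr/local/freesurfer}"
export TUTORIAL_DATA="$FREESURFER_HOME/subjects"
export SUBJECTS_DIR="$TUTORIAL_DATA/buckner_data/tutorial_subjs/"
echo "$SUBJECTS_DIR"
```

These lines can go in ~/.bashrc, mirroring the .cshrc/.tcshrc advice above.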

FSL-FEAT Tutorial Data

To install the data for the FreeSurfer/FSL-FEAT integration tutorial, unpack bert.recon.tgz into $SUBJECTS_DIR and fbert-feat.tgz into the directory you choose for the functional data:

cd $SUBJECTS_DIR
tar xvfz bert.recon.tgz
cd /place/for/functional/data
tar xvfz fbert-feat.tgz

FsTutorial/Data (last edited 2018-09-30 09:35:54 by AndrewHoopes)