There are a number of storage options available for research data. If you have any questions about any of them, please contact us.
|Filesystem|Who can use it|Size limit|Replicated¹|File retention|Snapshots²|Speed|Cost|
|---|---|---|---|---|---|---|---|
|/n/scratch3³|Everyone|10TB or 1 million files/directories|No|30 days|No|Fast|Free|
|/tmp|Everyone|Small hard drive size|No|None|No|Fast|Free|
- ¹ Replication means that data are copied nightly to a separate location.
- ² Snapshot means that each directory contains an invisible, read-only .snapshot directory that has 60 days of weekly old versions of files, and 14 days of daily versions. See this page for information on restoring data using snapshots.
- ³ Scratch3 is the current filesystem for storing temporary or intermediary files. The old /n/scratch2 filesystem was made read-only on June 15, 2020, and was retired permanently on June 26, 2020. Any data left on scratch2 is not retrievable, as the system hardware is being removed from our data center and recycled.
- ⁴ Group directories are generally in /n/data2.
- ⁵ Collaborations are generally accessible from desktops, and as /n/files from the transfer cluster and a limited set of cluster compute nodes.
- ⁶ There are no plans for chargebacks for group storage at this time.
- ⁷ Only on-Quad researchers can create collaborations, but they can invite off-Quad researchers to join collaborations (with read-only or read-write access).
A brief description of each filesystem is below. For more information, please refer to the following pages:
On June 15, 2020, the OLD /n/scratch2 filesystem was made READ-ONLY, in preparation for retirement.
On June 26, 2020, the OLD /n/scratch2 filesystem was taken OFFLINE and retired permanently.
Any data left on scratch2 is not retrievable as the system hardware is being removed from our data center and recycled.
Please see the Scratch 3 page for information on transitioning from using /n/scratch2 to using /n/scratch3.
This is a filesystem designed to handle large volumes of temporary files. If you are running large pipelines, it is recommended that you write intermediary files here. There are no backups made here, and files are automatically purged after 30 days of no access.
Note: It is against RC policy to artificially refresh the last access time of any file located under /n/scratch3. If you artificially update the access time of scratch3 files, you may lose access to this filesystem.
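Because files are purged after 30 days without access, it can be useful to check which scratch files are approaching the cutoff before the purge removes them. A minimal sketch, assuming your scratch space lives under /n/scratch3/$USER (the directory layout is an assumption; point SCRATCH_DIR at your actual scratch directory):

```shell
# List regular files not accessed in the last 25 days -- candidates for
# the 30-day purge. The /n/scratch3/$USER path is an assumption.
SCRATCH_DIR="${SCRATCH_DIR:-/n/scratch3/$USER}"
find "$SCRATCH_DIR" -type f -atime +25 -print 2>/dev/null | sort
```

Copying files you still need to group or home storage (rather than touching them to refresh the access time, which is against policy) is the compliant way to keep data past the purge window.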
This is where your home directory is located. Every user on the O2 research cluster gets a /home directory, like /home/ab987, that is limited to 100GB. This amount cannot be expanded. Your home directory is backed up inside .snapshot, an invisible directory located inside every subdirectory of your home directory, which holds 14 days of daily snapshots and 60 days of weekly snapshots. Every night, your /home directory is also copied to an off-site location (but that copy is overwritten 24 hours later).
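Restoring an accidentally deleted or overwritten file is then a matter of copying it back out of the read-only .snapshot tree. A hedged sketch; the snapshot subdirectory and file names below are purely illustrative, so list .snapshot first to see what actually exists:

```shell
# Snapshot subdirectory names vary; 'ls .snapshot' shows the real ones.
SNAP="$HOME/projects/.snapshot/daily.2020-06-20"   # hypothetical snapshot name
if [ -d "$SNAP" ]; then
  # Restore a single file from the snapshot back into the live directory.
  cp "$SNAP/results.csv" "$HOME/projects/results.csv"
else
  # Show which snapshots are available instead.
  ls "$HOME/projects/.snapshot" 2>/dev/null || echo "No .snapshot directory here"
fi
```

Since snapshots are read-only, copying out of them cannot damage the preserved versions.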
Your lab might have space for you and your colleagues to share. Labs may install their own group-specific software here, as well as share data. Lab/group directories are generally in /n/data2. These are backed up and snapshotted in the same way as /home directories.
Collaborations created by HMS IT are on the research.files filesystem. This filesystem is mostly used for sharing data between labs, or for departmental shared space, and it can be mounted as a shared drive on Windows and Mac desktops. (Contact email@example.com or an IT client service representative with questions about research.files.) When needed, this filesystem can be accessed as /n/files on a few "transfer compute nodes". (See File Transfer for details.)
This is temporary local hard drive space on a single compute node. If you require the fastest I/O for your job, you can have a program write temporary intermediate files to the /tmp directory. There is not a lot of space, and you will be sharing it with anyone else using /tmp on that node. Each node has a different /tmp, and if you need to fetch files from there, you will need to ssh directly to that compute node. Also, these files may be deleted at any time after your job finishes.
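A common pattern is to stage intermediate files on node-local /tmp and copy only the final result back to networked storage before the job exits, since /tmp contents may vanish once the job finishes. A minimal, self-contained sketch (the input data and the results destination are placeholders for your real job's paths):

```shell
# Create a private working directory on node-local /tmp.
WORKDIR=$(mktemp -d /tmp/myjob.XXXXXX)

# Stand-in for real input data; in a job this would come from scratch
# or group storage.
printf 'beta\nalpha\ngamma\n' > "$WORKDIR/input.txt"

# Do the I/O-heavy work against fast local disk.
sort "$WORKDIR/input.txt" > "$WORKDIR/sorted.txt"

# Copy only the final result off the node before the job ends.
RESULTS=$(mktemp -d)   # placeholder for your real results directory
cp "$WORKDIR/sorted.txt" "$RESULTS/"

# Clean up local scratch so other users of this node get the space back.
rm -rf "$WORKDIR"
```

Cleaning up /tmp at the end of the job is a courtesy to the other users sharing that node's local disk.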
We are evaluating other storage options that might be suited for your needs. Please contact us at firstname.lastname@example.org if you have questions.