NOTE: For the file transfer appliance, allowing you to "mail" large attachments from your desktop, go instead to http://transfer.med.harvard.edu
NOTE: For information on /n/files (aka research.files.med.harvard.edu), see the bottom of this page.
There are a number of ways to copy files to and from O2. All connections must be secure: your data, and especially your password, are encrypted during the transfer over the internet. Be aware of which filesystems you want to copy from and to. You might be copying from your laptop or desktop hard drive, or from some other site on the Internet.
Tools For Copying
- FileZilla - a Mac/Linux/Windows standalone sftp tool, with available Firefox browser plugin
- WinSCP - a Windows scp/sftp app
The /files filesystem can also be mounted onto your desktop. See the HMS IT page on file servers for instructions.
Command line tools
- rsync - automatically installed on Mac and Linux
- psftp - for Windows. Can be installed when installing the PuTTY ssh program, which many at HMS use to log in to Orchestra
How To Copy
Whichever tool you use, connect with the following settings:
- port: 22 (the SCP/SFTP port)
- username: your eCommons ID, the ID you use to login to O2, in lowercase, e.g., ab123 (not your Harvard ID or Harvard Key)
- password: your eCommons password, the password you use when logging in to O2 or http://ecommons.med.harvard.edu
For graphical tools, see the documentation that came with the program. Many tools will by default copy somewhere in your /home directory, which has a small 100GB storage quota. Make sure to explicitly specify whether you want to copy there or to /n/dataX, for example.
If you just have a single file to copy and you're on a Mac, you can also use something like scp myfile firstname.lastname@example.org:/n/groups/lab/dir1/ from the Terminal application. (By default, scp copies to/from your home directory on the remote computer. You need to give the full path, starting with a /, in order to copy to other filesystems.)
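For directories or repeated transfers, rsync (mentioned above) is often more convenient than scp, since it can copy recursively and skips files that are already up to date. A sketch, where the username, hostname, and destination path are placeholders to substitute with your own:

```shell
# Copy a local results directory to a group folder on O2.
# -a preserves permissions and timestamps, -v is verbose, -z compresses in transit.
# The trailing slash on the source means "copy the contents of results/".
rsync -avz ./results/ ab123@<o2-hostname>:/n/groups/lab/dir1/results/
```

Re-running the same command after an interrupted or partial transfer only copies what is missing or changed.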
Interactive command line copying
The O2 compute nodes do not currently mount the "transfer" cluster. Instead, files can be copied directly from interactive compute nodes: launch an interactive session, then run rsync commands interactively.
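A sketch of such a session (the partition name, time limit, and paths here are illustrative; check O2's current srun options for your site):

```shell
# Request an interactive shell on a compute node (partition/time are examples).
srun --pty -p interactive -t 0-2:00 /bin/bash

# From the compute node, pull data in with rsync.
# The remote host, username, and paths are placeholders.
rsync -avz user@remote.example.org:/path/to/data/ /n/groups/lab/dir1/data/
```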
Experienced users can set up batch copies using rsync or recursive cp. Please do not run large transfers on the O2 login nodes (login0X): they will be slow and subject to suspension, since they compete with dozens of simultaneous logins and programs.
If you want to copy a large set of files, it may be best to submit the copy as a job to O2. An example would be sbatch -p short -t 0-12:00 --wrap="copy command here". This will run in the short partition like any other job.
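Filling in the placeholder with a real copy command might look like this (the source and destination paths are illustrative):

```shell
# Submit a recursive copy as a 12-hour job in the short partition.
# Everything inside --wrap runs on the allocated compute node, not the login node.
sbatch -p short -t 0-12:00 --wrap="rsync -a /n/groups/lab/dir1/ /home/ab123/dir1/"
```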
The main advantage of batch copying is that you can make it part of a workflow. For example, you can use dependencies (sbatch --dependency=afterok:<jobid>) to run your analysis only when the job copying input files to O2 has finished.
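One way to wire this up in a script (the paths and the analysis script name are illustrative; --parsable makes sbatch print just the job ID so it can be captured):

```shell
# Submit the copy job and capture its job ID.
jobid=$(sbatch --parsable -p short -t 0-12:00 \
    --wrap="rsync -a /n/groups/lab/dir1/inputs/ ./inputs/")

# Submit the analysis; it starts only if the copy job exits successfully.
sbatch --dependency=afterok:$jobid -p short -t 0-12:00 analysis.sh
```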
Very big copies
Contact Research Computing if you want to copy multiple terabytes. We may be able to speed up the process.
Special considerations for the 'files' filesystem, aka research.files.med.harvard.edu
The O2 login nodes and compute nodes do not currently mount /n/files. To access this filesystem, sbatch jobs or srun interactive sessions must use the transfer partition. From there, cp can be executed to transfer these files to O2 filesystems.
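As a sketch (the time limit and the source/destination paths are illustrative):

```shell
# Copy a project out of /n/files via the transfer partition.
# Only jobs in this partition can see /n/files.
sbatch -p transfer -t 0-4:00 \
    --wrap="cp -r /n/files/lab/project /n/groups/lab/dir1/"
```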
Please note that we have restricted use of the `transfer` partition, to ensure that only those who need to access /n/files on O2 will run jobs in this partition. You can contact us to request access to the `transfer` partition.