
NOTE: For the Accellion file transfer appliance, which lets you "mail" large attachments from your desktop, go instead to http://transfer.med.harvard.edu

NOTE: For information on /n/files (aka research.files.med.harvard.edu), see the bottom of this page.

NOTE: For guidelines on transferring data from O2 to NCBI's FTP for GEO submission, please reference this wiki page.



There are a number of secure ways to copy files to and from O2. The tools listed below encrypt your data and login credentials during the transfer over the internet. Be aware of which file systems you want to copy from and to. You might be copying from your laptop or desktop hard drive, or from some other site on the Internet.

Tools For Copying

Graphical tools

  • FileZilla - a free standalone SFTP client for Mac/Linux/Windows
  • WinSCP - a Windows SCP/SFTP client

Command line tools

  • scp, sftp, rsync - these are automatically installed on Mac and Linux
  • pscp, psftp - Windows-only. These are installed with the PuTTY SSH program.

  • ftp - available on O2 for downloading from external sites that only accept FTP logins. Note that O2 does not accept incoming FTP connections.

How To Copy Data to O2

Connection parameters:

  • host: transfer.rc.hms.harvard.edu
  • port: 22 (the standard SSH/SFTP port)
  • username: your eCommons ID, the ID you use to login to O2, in lowercase, e.g., ab123 (not your Harvard ID or Harvard Key) 
  • password: your eCommons ID password, the password you use when logging in to O2


For graphical tools, see the documentation that came with the program. Many tools default to copying into your home directory, which has a small 100GB storage quota. Make sure to explicitly specify whether you want to copy there or to a different location like: /n/scratch2/mydir/


If you just have a single file to copy and you're on a Mac or Linux machine, you can also run a command like the following from a terminal:

me@mydesktop:~$ scp myfile my_o2_id@transfer.rc.hms.harvard.edu:/n/scratch2/mydir/

By default, scp will copy to/from your home directory on the remote computer. You need to give the full path, starting with a /, in order to copy to other filesystems.

Interactive command line copying

File transfer processes are too resource intensive to run on the O2 login servers, but you can run these interactively from a compute node as you would any other application. Launch an interactive session with the following srun command, and then run your commands once logged into a compute node:

mfk8@login02:~$ srun --pty -p interactive -t 0-12:00 /bin/bash
mfk8@compute-a-01-1:~$


Batch Copying

Experienced users can set up batch copies using rsync or recursive cp. Please do not run large transfers on the O2 login nodes (login0X). They will be slow and subject to suspension, as they are competing with dozens of simultaneous logins and programs.
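A common pitfall with these tools is rsync's trailing-slash rule: a source of `mydir1/` copies the directory's contents, while `mydir1` (no slash) copies the directory itself. The sketch below, using made-up local directory names, illustrates the difference alongside recursive cp:

```shell
# Example source tree (names are illustrative)
mkdir -p srcdir destA destB
echo "hello" > srcdir/data.txt

# Trailing slash: copy the CONTENTS of srcdir into destA
rsync -a srcdir/ destA/     # creates destA/data.txt

# No trailing slash: copy srcdir ITSELF into destB
rsync -a srcdir destB/      # creates destB/srcdir/data.txt

# Recursive cp behaves like the no-slash rsync form
cp -r srcdir destA/         # creates destA/srcdir/data.txt
```

The -a (archive) flag preserves permissions and timestamps and implies recursion, which is usually what you want for research data.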

If you want to copy a large set of files, it may be best to submit as a job to O2. For example:

mfk8@login02:~$ sbatch -p short -t 0-12:00 --wrap="rsync mydir1/ mydir2/"

This will run in the short partition like any other job.
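If you prefer a batch script over --wrap, the same copy can be submitted as a job file. This is a sketch only; the partition, time limit, log filename, and directory names are examples to adapt:

```shell
#!/bin/bash
#SBATCH -p short          # partition (example)
#SBATCH -t 0-12:00        # 12-hour time limit (example)
#SBATCH -o rsync_%j.out   # job log; %j expands to the job ID

# -a preserves permissions/timestamps; trailing slash copies contents
rsync -a mydir1/ mydir2/
```

Submit it with `sbatch copy_job.sh` (any filename works) and monitor it like any other job.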

The main advantage of batch copying is that it can become part of a workflow. For instance, you can use job dependencies to start your analysis only after the job copying input files to O2 has finished:

mfk8@login02:~$ sbatch --dependency=afterok:<jobid> analysis_job.sh

Replace <jobid> with the job ID printed when you submitted the copy job; analysis_job.sh is a placeholder for your own analysis script.

Very big copies

Contact Research Computing if you want to copy multiple terabytes. We may be able to speed up the process.

Special considerations for the '/n/files' filesystem, aka research.files.med.harvard.edu

The O2 login nodes and most compute nodes do not currently mount /n/files. There are two ways to access this filesystem from O2:

  1. Use O2's dedicated file transfer servers
    1. SSH to the hostname transfer.rc.hms.harvard.edu. You will be connected to a system that has access to /n/files.
    2. Once logged in, just run your commands (e.g. rsync, scp, cp) directly, without using sbatch.
    3. Transfer servers cannot submit jobs to the cluster, and research applications (modules) are not available on those systems.
  2. If you have a batch job workflow that must use /n/files, you can request access to the "transfer" job partition. This partition has access to a few lower-performance compute nodes which mount /n/files. They are recommended only when using the transfer servers is not an option, as these nodes are slower and generally less available.

Using the transfer job partition

Please note that use of the `transfer` partition is restricted, to ensure that only those who need to access /n/files from O2 run jobs in this partition. You can contact us to request access. Here are examples of jobs using this partition:

mfk8@login02:~$ sbatch -p transfer -t 0-12:00 --wrap="rsync /n/files/directory ."
mfk8@login02:~$ srun --pty -p transfer -t 0-12:00 /bin/bash
mfk8@compute-a-01-1:~$ ls -l /n/files



