+44 (0)1223 741857
+44 (0)1223 741852
Innovative Manufacturing Research Centre
Cambridge University Engineering Department
Unit 26a Cambridge Science Park
Cambridge, CB4 0FP, UK
The compute cluster is behind a firewall and hence requires a
special access procedure. This is done via port forwarding, as
described below.
Points to note:
- Most external users will have a restricted account.
- Restricted accounts are NOT allowed to log onto
cairngorm.eng.cam.ac.uk and should not attempt to do so.
- Restricted accounts have to use port forwarding in order to
access the login node as described below.
- The login node is a relatively low-powered machine which is only
meant to be used for job submission.
- Hardly any compilers etc. are available on the login node, and
code compiled there would not run on the cluster anyway, as the
processors are different.
- Sun Grid Engine has to be used to submit jobs to the cluster; a
minimal job script is sketched below.
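
A minimal sketch of an SGE job script follows. The file name
myjob.sh, the job name and the executable ./my_program are
hypothetical; check the cluster's own queue and parallel environment
names before relying on this:

#!/bin/sh
# myjob.sh - minimal Sun Grid Engine batch script (sketch)
#$ -cwd          # run from the submission directory
#$ -N myjob      # job name shown by qstat
#$ -o myjob.out  # write stdout to this file
#$ -j y          # merge stderr into stdout
./my_program     # the program to run on a compute node

Submit it from the login node with:

qsub myjob.sh

and monitor it with qstat.
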
Special note regarding MPI jobs (out of date since we are using mpich2 now; MSG 06/09/05):
- mpich is installed on the cluster. As it tends to leave processes
behind when a job is cancelled, cleanipcs will be run automatically
after each job.
- cleanipcs does not differentiate between jobs (however, it will
only be executed on nodes which ran the job in question)
- For the above reason, ALL jobs on the respective node will be
killed if they use shared memory!
- Therefore: avoid starting more than one job with fewer than four
CPUs on a node, as the termination of one will kill the others!
- Use the -nolocal option, because the MPI network is separate from
the node network (an example invocation is sketched after this note).
Otherwise you will suffer:
    - performance loss
    - a wrong job count
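
As an illustration, an mpirun call with -nolocal might look like
this (the machine file path assumes SGE's usual mpich integration,
which writes $TMPDIR/machines; my_mpi_program is hypothetical):

mpirun -nolocal -np 4 -machinefile $TMPDIR/machines ./my_mpi_program
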
- Create SSH port forward:
Open a new terminal on your home machine and type:
ssh -fN -L3001:login.imrc-csfm.eng.cam.ac.uk:22 <user>@<gateway>
where <user> is your CSfM username and <gateway> is the firewall
gateway host. You will be asked for your password.
- SCP files onto the login node:
Open another terminal window on your home machine and scp your files:
scp -P 3001 <files> <user>@localhost:
- SSH onto the login node:
ssh -p 3001 -l<user> localhost
NOTE: lower case p here, unlike the upper case P used by scp.
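
Putting the three steps together, a complete session might look like
this (jdoe is a hypothetical username; <gateway> is the firewall
gateway host as above):

ssh -fN -L3001:login.imrc-csfm.eng.cam.ac.uk:22 jdoe@<gateway>
scp -P 3001 myjob.sh jdoe@localhost:
ssh -p 3001 -ljdoe localhost

Once logged in, submit the copied script with qsub as described above.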
- will be written once the need arises