Cluster
Connect
get a password
- press the PASS WORD button on your crypto card
- enter your password
- press enter
- the 7-digit password is displayed; enter it without the dash
set up environment
- put all your customizations into your .bashrc
- for login shells, .bash_profile is used, which in turn loads .bashrc (see the sketch below)
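If your home directory does not already have one, a minimal .bash_profile that chains to .bashrc can look like this (a sketch, assuming a stock bash setup; adapt it if you already have a .bash_profile):

# ~/.bash_profile -- read by login shells only.
# Source ~/.bashrc so login and non-login shells share the same customizations.
if [ -f ~/.bashrc ]; then
    . ~/.bashrc
fi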
ssh to the gateway computer (hadley)
note: please don't use the gateway for computations (e.g. matlab)!
ssh -Y neuro-calhpc.berkeley.edu (or hadley.berkeley.edu)
and log in with your crypto card password
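To avoid typing -Y and the full hostname every time, you can put the X11 forwarding options into your SSH client configuration (a convenience sketch, assuming a stock OpenSSH client; the alias "hadley" is just a suggestion):

# ~/.ssh/config -- "ssh hadley" then behaves like "ssh -Y hadley.berkeley.edu"
Host hadley
    HostName hadley.berkeley.edu
    ForwardX11 yes
    ForwardX11Trusted yes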
Useful commands
Start interactive session on compute node
- start an interactive session: qsub -X -I
- start an interactive session on a particular node: qsub -X -I -l nodes=n0001.cortex
Perceus commands
The Perceus manual is here
- list available cluster nodes:
wwstats
- list cluster usage:
wwtop
- to restrict the scope of these commands to the cortex cluster, add the following line to your .bashrc:
export NODES='*cortex'
- list loaded modules:
module list
- list available modules:
module avail
- get help on the module command:
module help
- help pages are here (a usage sketch follows below)
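A typical sequence looks like this (a sketch; "openmpi" is a hypothetical module name, check module avail for what is actually installed):

# see what is installed, load a module, confirm it is active
module avail
module load openmpi
module list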
Resource Manager PBS
- job scheduler: MOAB
- list all jobs (queued and running):
qstat -a
- list a job together with the nodes allocated to it (here job 98):
qstat -n 98
- sample script
#!/bin/bash
# request the cortex queue, one node with two processors, one hour walltime
#PBS -q cortex
#PBS -l nodes=1:ppn=2:cortex
#PBS -l walltime=01:00:00
# files for standard output and standard error
#PBS -o path-to-output
#PBS -e path-to-error
cd /global/home/users/kilian/sample_executables
# show the nodes assigned to this job
cat $PBS_NODEFILE
# run hostname on 8 MPI ranks
mpirun -np 8 /bin/hostname
sleep 60
- submit script
qsub scriptname
- interactive session
qsub -I -q cortex -l nodes=1:ppn=2:cortex -l walltime=00:15:00
- list nodes that your job is running on
cat $PBS_NODEFILE
- run a program on several cores (-mca btl ^openib tells Open MPI not to use the InfiniBand transport)
mpirun -np 4 -mca btl ^openib sample_executables/mpi_hello
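The same run also works as a batch job instead of an interactive session (a sketch combining the pieces above; it assumes the sample_executables/mpi_hello binary from the sample script, and $PBS_O_WORKDIR is the standard PBS variable holding the directory qsub was called from):

#!/bin/bash
#PBS -q cortex
#PBS -l nodes=1:ppn=4:cortex
#PBS -l walltime=00:15:00
# start from the directory the job was submitted from
cd $PBS_O_WORKDIR
# run mpi_hello on 4 cores, skipping the InfiniBand transport
mpirun -np 4 -mca btl ^openib sample_executables/mpi_hello

Submit it with qsub as shown above.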
Matlab
note: remember to start an interactive session before starting matlab!
We don't currently have a proper Matlab installation. However, you can run an old version by appending the following to the .bashrc file in your home directory:
export PATH=$PATH:/global/home/users/jack/matlab74/bin
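Putting the two steps together (a sketch; matlab74 is presumably Matlab 7.4, the standard -nodisplay/-nosplash flags suppress the GUI, and the resource request is only an example):

# request an interactive session on a compute node
qsub -X -I -q cortex -l nodes=1:ppn=1:cortex -l walltime=00:15:00
# then, on the compute node, start matlab without the desktop
matlab -nodisplay -nosplash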
Sage
I've installed sage 4.02 in ~amirk/sage.
A sample PBS and MPI script is here:
~amirk/test
You can run it as:
% mkdir -p ~/jobs
% ~amirk/test
% qsub pbs
In your interactive session, if you want a scipy environment (to run ipython, etc.), do:
% ~amirk/sage/sage -sh
first, then you can run:
% ipython
This is a temporary solution for people wanting to use scipy with MPI on the cluster.
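End to end, a scipy session looks like this (a sketch; the walltime is only an example):

% qsub -I -q cortex -l nodes=1:ppn=1:cortex -l walltime=00:15:00
% ~amirk/sage/sage -sh
% ipython

The first command is run on the gateway; the other two on the compute node you are given.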
Support Requests
- If you have a problem that is not covered on this page, you can send an email to our user list:
redwood_cluster@lists.berkeley.edu
- If you need additional help from the LBL group, send an email to their email list. Please always cc our email list as well.
scs@lbl.gov
- In urgent cases, you can also email Krishna Muriki (LBL User Services) directly.