Interactive Jobs
On occasion it may be necessary for a user to obtain an interactive session to run a program interactively, for example when using a graphical user interface or when debugging. While this can be done on the login nodes when preparing and testing jobs, full calculations must be run via the queueing system.
A user can therefore request resources (nodes, cores and/or memory) from the scheduler, but launch a program manually and interactively from a terminal allocated by the scheduler. You should first try the recommended method below; if you encounter display problems, use the alternate method.
Interactive Job with Command Line Access (recommended method)
First, connect to ARCHIE using ssh, MobaXterm or ThinLinc. If connecting using ThinLinc, then start a Linux Terminal.
Below is a sample command to request 8 cores on a compute node for an interactive session with access to the command line:
(long version):
srun --account=testing --partition=standard --time=2:00:00 --x11 --ntasks=8 --pty bash
Or, using a dev node for debugging:
srun --account=testing --partition=dev --time=1:00:00 --x11 --ntasks=8 --pty bash
(short versions):
srun -A testing -p standard -t 2:00:00 --x11 -n 8 --pty bash
srun -A testing -p dev -t 1:00:00 --x11 -n 8 --pty bash
Example Options

| Long option | Short option | Description |
|---|---|---|
| --account=testing | -A testing | the project account for the job |
| --partition=standard | -p standard | the partition on which the job should run |
| --time=2:00:00 | -t 2:00:00 | a run time of 2 hours has been requested |
| --x11 | --x11 | allow graphical windows to be displayed |
| --ntasks=8 | -n 8 | 8 cores have been requested |
| --pty bash | --pty bash | a bash shell has been requested for executing commands |
This will provide the user with a command prompt, from which they can execute their programs:
[acs01234@node041 projects]$ ./example_job
In this example, node041 has been allocated.
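Once inside the session, a couple of quick checks can confirm where you are running and what has been allocated (the output shown is illustrative):
[acs01234@node041 projects]$ hostname        # confirm you are on a compute node, not a login node
node041
[acs01234@node041 projects]$ echo $SLURM_NTASKS    # the number of cores requested with -n
8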
Loading modules
Once an interactive session has been allocated, any modules that are necessary for you to run your jobs will have to be loaded, e.g.
[acs01234@node041 projects]$ module load anaconda/python-3.9.7/2021.11
For more information on using modules, see Environment Modules.
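If you are unsure which modules are available or already loaded, the usual module commands work inside the session, for example:
[acs01234@node041 projects]$ module list             # show currently loaded modules
[acs01234@node041 projects]$ module avail anaconda   # list available anaconda modules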
Interactive Job (alternate method)
This method should only be used if you encounter problems displaying your application's GUI using the recommended method above.
First, connect to ARCHIE using ssh, MobaXterm or ThinLinc. If connecting using ThinLinc, then start a Linux Terminal.
Then run a job interactively using the following procedure:
- use salloc to reserve some cores (or nodes)
- determine the node(s) you have been allocated
- use ssh to log in to the (master) node
- run your software
- when finished, exit the node
- exit from salloc
1. Request an "allocation" for your job using the salloc command
Request as many cores as you need and the runtime in the usual way:
(long version):
salloc --account=testing --partition=standard --time=2:00:00 --ntasks=8
(short version):
salloc -A testing -p standard -t 2:00:00 -n 8
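salloc will block until the resources are granted and then return a shell. If you want to confirm the state of your allocation, squeue can be used (the job should show state R once it is running):
squeue -u $USER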
2. Determine the node(s) you have been allocated
echo $SLURM_NODELIST
This will return the list of nodes that you have been allocated. The first node will be the "master node" of your allocation e.g. node024.
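Note that Slurm may print the list in a compressed form such as node[024-027]. If so, scontrol can expand it to one hostname per line:
scontrol show hostnames $SLURM_NODELIST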
3. Use ssh to login to the master node
ssh -X node024
for example. The -X flag allows the GUI of your application to be displayed remotely.
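Before launching your application, you can check that X11 forwarding is working by running a small X program such as xclock, if it is installed on the node:
[acs01234@node024] xclock    # a clock window should appear on your local display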
4. Run your software application
You should run your software as you would on the login nodes, e.g.
- change to your working directory
- load any necessary modules, for example, matlab:
[acs01234@node024] module load matlab/R2019a
- run your software, e.g.
[acs01234@node024] matlab
5. When you are finished, exit from the node:
exit
6. Exit from the allocation:
exit
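Putting the steps together, a complete session using this method might look like the following (the node name, login node prompt and module version are illustrative):
[acs01234@login01 ~]$ salloc -A testing -p standard -t 2:00:00 -n 8
[acs01234@login01 ~]$ echo $SLURM_NODELIST
node024
[acs01234@login01 ~]$ ssh -X node024
[acs01234@node024 ~]$ module load matlab/R2019a
[acs01234@node024 ~]$ matlab
[acs01234@node024 ~]$ exit    # leave the compute node
[acs01234@login01 ~]$ exit    # release the allocation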
Tip: Short run time
Specifying a short run time will help the scheduler to allocate your job more quickly.
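For example, a 30-minute request will often start sooner than a 2-hour one (account and partition as in the examples above):
srun -A testing -p dev -t 0:30:00 -n 4 --pty bash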
Tip: Node exclusivity
Use the --exclusive flag if you require a whole node for your job.
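For example, to reserve a whole node for a 2-hour interactive session:
srun -A testing -p standard -t 2:00:00 --exclusive --pty bash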