GPI - Inventory

GPI Compute nodes
Partition | Server | CPU model (arch) | CPU cores | RAM | GPU model | GPU arch/capability | GPUs | CUDA cores per GPU | RAM per GPU
gpi.develop | gpic07 | Intel Xeon 2.6GHz (x86_64) | 16 | 384GB | GTX Titan X | Pascal | 2 | 3072 | 12GB
gpi.develop/gpi.compute | gpic08 | Intel Xeon 3.0GHz (x86_64) | 16 | 256GB | GTX Titan X | Maxwell/5.2 | 7 | 3072 | 12GB
gpi.develop/gpi.compute | gpic09 | Intel Xeon 2.1GHz (x86_64) | 32 | 256GB | GeForce GTX 1080 Ti | Pascal | 8 | 3584 | 11GB
gpi.develop/gpi.compute | gpic10 | Intel Xeon 2.1GHz (x86_64) | 32 | 256GB | GeForce RTX 2080 Ti | Turing | 8 | 4352 | 11GB
gpi.develop/gpi.compute | gpic11 | Intel Xeon 2.2GHz (x86_64) | 40 | 384GB | GeForce RTX 2080 Ti | Turing | 8 | 4352 | 11GB
gpi.develop/gpi.compute | gpic12 | Intel Xeon 2.4GHz (x86_64) | 40 | 384GB | GeForce RTX 2080 Ti | Turing | 8 | 4352 | 11GB
gpi.develop/gpi.compute | gpic13 | Intel Xeon 2.4GHz (x86_64) | 40 | 256GB | Quadro RTX 5000 | Turing | 8 | 3072 | 16GB
gpi.develop/gpi.compute | gpic14 | Intel Xeon 2.4GHz (x86_64) | 40 | 256GB | GeForce RTX 3090 | Ampere | 6 | 10496 | 24GB
gpi.develop/gpi.compute | gpic16 | AMD EPYC 9224 (x86_64) | 24 | 256GB | NVIDIA A40 | Ampere | 7 | 10752 | 48GB
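
If you need a specific GPU model from this list, you can request the corresponding node explicitly with srun's -w (--nodelist) option, together with the account and partition parameters explained below. A minimal sketch, assuming the cluster schedules GPUs as a generic resource (the GPU count and time limit are placeholders to adapt to your job):

  # Request one GPU on gpic14 (an RTX 3090 node from the table); adjust --gres and --time to your needs
  srun -A gpi -p gpi.compute -w gpic14 --gres=gpu:1 --time=02:00:00 --pty bash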

Remember that you should always specify the -A parameter when using the srun command to avoid error messages, and it must be the account your partition belongs to. This research group uses the account gpi, so you should pass -A gpi and then select one of our two partitions (gpi.compute or gpi.develop) with the -p parameter.
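
For example, an interactive session on the develop partition could be requested like this (a sketch: the GPU count, time limit and shell are placeholders, and the --gres option assumes GPUs are scheduled as a generic resource on these nodes):

  # Interactive shell on the develop partition, charged to the gpi account
  srun -A gpi -p gpi.develop --gres=gpu:1 --time=01:00:00 --pty bash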

Note that in this research group most servers belong to both gpi.develop and gpi.compute. The difference between the two partitions is that gpi.compute is oriented towards processing-intensive experiments, while gpi.develop is meant for coding and other less demanding tasks. Accordingly, both the default time and the maximum time limit are higher on the compute partition. In short, choose the partition that best fits what you intend to do on our servers.
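
For longer, processing-intensive experiments on gpi.compute, a standard Slurm batch script submitted with sbatch is usually more convenient than an interactive srun session. A minimal sketch (the time limit, GPU count and train.py script are illustrative placeholders, not values tied to these partitions):

  #!/bin/bash
  #SBATCH -A gpi            # account, as explained above
  #SBATCH -p gpi.compute    # compute partition for long-running jobs
  #SBATCH --gres=gpu:1      # one GPU; assumes GPU generic-resource scheduling
  #SBATCH --time=24:00:00   # illustrative; check the partition's actual limits
  python train.py           # hypothetical training script

  # submit with: sbatch job.sh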

Keywords
INVENTORY