Using your Jena FSU account/login you can access several resources (note that some of them require further registration or identification):
Mailing lists:
- tpi_all@listserv.uni-jena.de
- numrel@listserv.uni-jena.de
- astrogravity@listserv.uni-jena.de
- core@listserv.uni-jena.de
- bamdev@listserv.uni-jena.de
Software: python, Mathematica, Maple, Matlab, etc.

Storage areas:
- login1.ara.uni-jena.de:/nfs04/data/gravtheo (large permanent data)
- nfs2.tpi.uni-jena.de:/tpidata1/coredata/ (medium-large permanent data)
- tullio.to.infn.it:/data*/numrel/ (temporary data in use)

For any help you can contact the IT team at tpi_it@listserv.uni-jena.de
Current members of the IT team for our group: alejandra.gonzalez@uni-jena.de and fabio.magistrelli@uni-jena.de
IT team webpage.
Computer resources managed by the group:
/tpidata1/coredata/
Additionally, the group has access to various HPC clusters in the EU.
- c-serv: 12 cores, 256 GB RAM, XeonPhi 3120A
- c-serv2: 8 cores, 256 GB RAM
- m-serv: 12 cores, 384 GB RAM
- m-serv2: 8 cores, 256 GB RAM, XeonPhi 7120P
- m-serv3: ? cores, ??? GB RAM
- gpuserv: 8 cores, 256 GB RAM, 4 Nvidia GPUs

From TPI PCs you can connect to each machine via ssh.
Type
$ module avail
to list the installed software, and
$ module load <name>
to load a module.
Workstations can also be used remotely to run Mathematica, python, etc. kernels.
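A typical remote session on one of the group workstations might look like the following sketch (the hostname is taken from the list above; the module name is an assumption, check `module avail` first):

```shell
# Connect from a TPI PC to a group workstation (hostname from the list above)
ssh m-serv

# List the installed software, then load a module
# (the module name "python" is an assumption; check the actual output of `module avail`)
module avail
module load python

# Run a quick command to verify the environment
python -c 'print("hello from m-serv")'
```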
Abacus is currently down (March 2023). Some nodes will be moved to Draco to form a TPI partition.
The system is composed of three partitions, each with a few Intel nodes equipped with high-performance cores. Some pairs of nodes are connected with InfiniBand. It is hosted and managed by the TPI crew and shares the same filesystem as the TPI PCs. See this description.
From TPI PCs connect via ssh to the head node abacus:
$ ssh abacus
Installed software can be listed with
$ module avail
and accessed by loading the modules.
To access the nodes you need to use the slurm batch system.
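A minimal Slurm batch script could look like the sketch below; the job name, resource requests, module, and executable are all hypothetical, and the actual partition names should be checked with `sinfo`:

```shell
#!/bin/bash
# Minimal Slurm job script sketch; all values below are assumptions,
# adapt them to your job and check `sinfo` for the available partitions.
#SBATCH --job-name=test_run
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=8
#SBATCH --time=01:00:00
#SBATCH --output=slurm-%j.out

module load gcc        # module name is an assumption; see `module avail`
srun ./my_program      # hypothetical executable
```

Submit the script with `sbatch job.slurm` and monitor it with `squeue -u $USER`.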
HPC cluster with about 200 Broadwell and Skylake nodes and an Omni-Path interconnect. GPUs are also available on some nodes; TPI members have exclusive access to an Nvidia Ampere GPU with 80 GB.
Links to Jena URZ (University computing center):
hpc-support@uni-jena.de
Access to the ARA cluster is currently only for users at Jena FSU. If you are at Jena, ask SB to open a ticket for you to get an account.
Info on the system can be found at ARAWiki (password protected link).
There are two head nodes; to log in do (with your username)
$ ssh ho23kag@ara-login01.rz.uni-jena.de
or
$ ssh ho23kag@ara-login02.rz.uni-jena.de
To use git you need to set up an http proxy:
$ export http_proxy="http://internet4nzm.rz.uni-jena.de:3128"
$ export https_proxy="http://internet4nzm.rz.uni-jena.de:3128"
$ git clone https://bernuzzi@bitbucket.org/dradice/athena_z4c.git
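Instead of exporting the proxy variables in every shell, the same proxy URL can be stored persistently in your git configuration:

```shell
# Store the proxy (same URL as in the exports above) in ~/.gitconfig,
# so it applies to all future git commands without re-exporting variables
git config --global http.proxy  "http://internet4nzm.rz.uni-jena.de:3128"
git config --global https.proxy "http://internet4nzm.rz.uni-jena.de:3128"
```

The setting can be inspected with `git config --get http.proxy` and removed again with `git config --global --unset http.proxy`.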
To list the installed software use
$ module avail
Modules can be loaded with
$ module load <name>
To access the nodes you need to use the slurm batch system.
HPC cluster with tens of general-purpose Intel nodes. The cluster serves FSU and other institutions in Thuringia with cloud computing and some HPC access.
Links to Jena URZ (University computing center):
hpc-support@uni-jena.de
Access to the DRACO cluster is currently only for users at Jena FSU. If you are at Jena, ask SB to open a ticket for you to get an account.
To login do (with your username)
$ ssh ho23kag@login1.draco.uni-jena.de
To list the installed software use
$ module avail
Modules can be loaded with
$ module load <name>
To access the nodes you need to use the slurm batch system.