7. Script language¶
On the supercomputer Fugaku, OSS script languages are provided for the compute nodes.
Note
OSS delivery has moved to Spack. The articles in this chapter describe the old OSS environment from before the move to Spack.
See Using OSS.
7.1. Overview¶
The following OSS, compiled with the Fujitsu compiler, are available on the compute nodes.
Python2
Python3
Ruby
Attention
When building these OSS environments, the modifications necessary for compiling with the Fujitsu compiler were made, but no other special settings were applied.
Basically, each OSS provides only environment information such as its installation path.
A modulefile that sets the installation path required for execution is prepared for each OSS. Please refer to the following job script examples.
7.2. Python¶
Python2 and Python3 are available.
7.2.1. Python2¶
This section describes how to use Python2.
7.2.1.1. Package¶
The following packages are added to Python2.

| Version | Added package |
|---|---|
| Python 2.7.15 | |
See also
For NumPy and SciPy, fjlapackexsve (Fujitsu's thread-parallel BLAS/LAPACK library) is used.
7.2.1.2. Environment variables and usage¶
The environment variables for running Python2 are set by the modulefile (Python2-CN/2.7.15). Job script examples are shown below.
Execution with the python2 command
#!/bin/bash
#PJM -L "node=1"
#PJM -L "elapse=10:00"
#PJM -x PJM_LLIO_GFSCACHE=/vol000N
#PJM -g groupname
#PJM -S
module load Python2-CN
export FLIB_CNTL_BARRIER_ERR=FALSE
# execute job
python2 sample.py
Execution using mpi4py (with the mpiexec command)
#!/bin/bash
#PJM -L "node=2"
#PJM -L "elapse=10:00"
#PJM --mpi "proc=8"
#PJM -x PJM_LLIO_GFSCACHE=/vol000N
#PJM -g groupname
#PJM -S
module load Python2-CN
export FLIB_CNTL_BARRIER_ERR=FALSE
# execute job
mpiexec -n 8 python2 mpi-sample.py
7.2.2. Python3¶
Note
A new version of Python is available in Spack.
The package provided by Spack (py-*) should be used in conjunction with the Python provided by Spack.
See Using OSS.
This section describes how to use Python3.
7.2.2.1. Package¶
The following packages are added to Python3.

| Version | Added package |
|---|---|
| Python 3.6.8 | |
See also
For NumPy and SciPy, fjlapackexsve (Fujitsu's thread-parallel BLAS/LAPACK library) is used.
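As a quick sanity check that the NumPy linear-algebra routines (backed here by fjlapackexsve) are working, a small script such as the following can be run. The script name and the 2x2 system are hypothetical examples, not part of the provided packages.

```python
# check_linalg.py: illustrative check of NumPy's LAPACK-backed routines.
# The file name and the linear system below are hypothetical examples.
import numpy as np

# solve the 2x2 system: 3x + y = 9, x + 2y = 8
a = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(a, b)   # calls LAPACK's dgesv internally
print(x)                    # expected: [2. 3.]
```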
7.2.2.2. Environment variables and usage¶
The environment variables for running Python3 are set by the modulefile (Python3-CN/3.6.8). Job script examples are shown below.
Execution with the python3 command
#!/bin/bash
#PJM -L "node=1"
#PJM -L "elapse=10:00"
#PJM -x PJM_LLIO_GFSCACHE=/vol000N
#PJM -g groupname
#PJM -S
module load Python3-CN
export FLIB_CNTL_BARRIER_ERR=FALSE
# execute job
python3 sample.py
Execution using mpi4py (with the mpiexec command)
#!/bin/bash
#PJM -L "node=2:noncont"
#PJM -L "elapse=10:00"
#PJM --mpi "proc=8"
#PJM -x PJM_LLIO_GFSCACHE=/vol000N
#PJM -g groupname
#PJM -S
module load Python3-CN
export FLIB_CNTL_BARRIER_ERR=FALSE
# execute job
mpiexec -n 8 python3 mpi-sample.py
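For reference, here is a minimal sketch of what an MPI-parallel script like the mpi-sample.py referenced above might contain. The contents are hypothetical (the actual file is user-provided); it assumes the bundled mpi4py package, with a serial fallback so it can also be checked without MPI.

```python
# mpi-sample.py: hypothetical sketch of an mpi4py script like the one
# launched by the job script above. Assumes the mpi4py package; falls
# back to a single serial "rank" when mpi4py is not importable.
try:
    from mpi4py import MPI
    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
except ImportError:
    comm, rank, size = None, 0, 1

# each rank sums its share of 0..99; the partial sums are then combined
local = sum(range(rank, 100, size))
total = comm.allreduce(local) if comm is not None else local
if rank == 0:
    print("total =", total)   # sum of 0..99 = 4950 regardless of size
```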
7.3. Ruby¶
The following version of Ruby is installed.

| Version |
|---|
| Ruby 2.6.5 |
7.3.1. Environment variables¶
The environment variables for running Ruby are set by the modulefile (Ruby-CN/2.6.5). A job script example is shown below.
Execution with the ruby command
#!/bin/bash
#PJM -L "node=1"
#PJM -L "elapse=10:00"
#PJM -x PJM_LLIO_GFSCACHE=/vol000N
#PJM -g groupname
#PJM -S
module load Ruby-CN
# execute job
ruby sample.rb
7.4. Java¶
This section describes how to use the Java compiler.
7.4.1. Compiler environment setting¶
The environment for using the Java compiler is set up as shown below. Execute the following commands before compiling.
Compiling on the login node
Use the openjdk provided by Spack. For more information on using Spack, please refer to the “Fugaku Spack Users Guide” on the Fugaku website.
[_LNlogin]$ . /vol0004/apps/oss/spack/share/spack/setup-env.sh    # Environment setting of Spack
[_LNlogin]$ spack load openjdk arch=linux-rhel8-skylake_avx512    # Environment setting of OpenJDK
Compiling on the compute node
[_LNlogin]$ module load OpenJDK-CN
The available environments are as follows.

| Node | Software name | Language | Version | Command |
|---|---|---|---|---|
| Login node | OpenJDK | Java | 11.0.2 | javac/mpijavac |
| Compute node | OpenJDK | Java | 11u | javac/mpijavac |
7.4.2. Compile command¶
Here is how to compile a Java program.
Normally, use the javac command to compile Java programs; to compile MPI programs, use the mpijavac command.
When using the MPI library
[_LNlogin]$ mpijavac [compile option] source file name
When not using the MPI library
[_LNlogin]$ javac [compile option] source file name
The compile command names are the same on the login node and on the compute node.
Compile commands

| Type | Command name | MPI library |
|---|---|---|
| Cross compiler | mpijavac | Use |
| Cross compiler | javac | Not use |
| Native compiler | mpijavac | Use |
| Native compiler | javac | Not use |
Note
A program compiled with mpijavac can be executed only on the compute nodes; it cannot be executed on the login node.
The main compile options are shown below.

| Compile option | Description |
|---|---|
| --showme | Displays the command line with which the MPI compile command invokes the javac command. Does not actually compile. |
| --verbose | Displays the command line with which the MPI compile command invokes the javac command, and then compiles. |
| --help, -help, -h | Displays a help message. Does not actually compile. |
| javac_arguments | Specifies options to pass to the javac command. |
See also
The mpijavac command is a wrapper around the javac command and internally calls the JDK's javac command. Thus, javac command options can be specified directly with the mpijavac command.
The mpijavac command can compile Java programs without specifying the class path of the MPI library.
To use a jar file that the user has prepared when compiling a Java program, the path to the jar file must be set with the CLASSPATH environment variable or the javac command's -classpath option. If both the CLASSPATH environment variable and the -classpath option are specified, the -classpath option takes priority.
7.4.3. Execution method¶
The following shows how to execute a Java program (javasample.class) compiled with javac or mpijavac.
[Sequential execution]
#!/bin/bash
#PJM -L "node=1"
#PJM -L "rscgrp=small"
#PJM -L "elapse=10:00"
#PJM -x PJM_LLIO_GFSCACHE=/vol000N
#PJM -g groupname
#PJM -s
module load OpenJDK-CN
numactl --cpunodebind 4 --membind 4 java javasample
[MPI Execution]
#!/bin/bash
#PJM -L "node=2"
#PJM -L "rscgrp=small"
#PJM -L "elapse=10:00"
#PJM -x PJM_LLIO_GFSCACHE=/vol000N
#PJM -g groupname
#PJM --mpi "proc=8"
#PJM -s
module load OpenJDK-CN
export PLE_MPI_STD_EMPTYFILE=off # Do not create a file if there is no output to stdout/stderr at execution time.
export OMPI_MCA_plm_ple_memory_allocation_policy=bind_local
mpiexec java javasample