Using COMSOL Cluster Computing at CCR R2
Table 1: COMSOL licenses and their corresponding module names

  COMSOL License    Module Name
  natashal group    comsol/4.3
  comsol group      comsol-ub/4.3a
  jmjornet group    comsol-jmjornet/4.3b
What are some limitations of using the Cluster Computing option at CCR?
(1) UB CCR uses a front-end firewall to prevent its compute nodes from being directly exposed to the
outside world. As a result, the COMSOL Cluster Computing option can only be used by running the
COMSOL GUI from the front-end, from one of the CCR compute nodes, or from the remote visualization
node. If the GUI is run from an external machine (e.g. a user's personal laptop or desktop PC), it will
not be able to communicate with the compute nodes and the Cluster Computing feature will not work.
(2) An interactive SLURM job for launching the COMSOL server must be up and running before the
COMSOL GUI is launched. This means users may have to wait before getting to work, since the
interactive job may be queued by the CCR resource manager, depending on the system load when the
job is submitted. One way to avoid long waits is to submit the interactive job to the debug partition.
However, this limits the number of nodes that can be requested and also caps the walltime of the
Cluster Computing job at 1 hour.
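Because of limitation (1), the GUI session must originate from a CCR machine. One common approach is to log in to the front-end with X11 forwarding enabled so that GUI windows display locally; a minimal sketch is shown below. Note that the hostname and username are illustrative placeholders, not values taken from this document:

```shell
# Hypothetical example: connect to the CCR front-end with X11 forwarding
# so GUI windows (such as the COMSOL client) display on your local screen.
# Replace the username and hostname with the values for your account/site.
ssh -X your_username@frontend.ccr.buffalo.edu
```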
In this example, we will request two compute nodes from the debug partition, with 12 processors per
node and 48GB of RAM per node. A total of 24 processors will be used in solving the model.
Information on the types of nodes available (including partitions, processor counts, amount of memory,
and SLURM constraints) is available via the snodes command. Type snodes help at the command
prompt for usage information.
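The resource request described above is made with fisbatch, CCR's wrapper for interactive SLURM jobs. A sketch matching this example, assuming standard SLURM-style options (the exact flags accepted at your site may differ), is:

```shell
# Hypothetical sketch of the interactive job request for this example:
# 2 nodes from the debug partition, 12 tasks per node, 48GB of RAM per
# node, and a 1 hour walltime (the debug partition limit noted above).
fisbatch --partition=debug --nodes=2 --ntasks-per-node=12 \
         --mem=48000 --time=01:00:00
```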
After entering the fisbatch command, you will have to wait (possibly for a while) for the scheduler to
process your request. Once the desired nodes become available, the scheduler will automatically log
you into one of the compute nodes (known as the head node) and you'll be given a command line
prompt. From this prompt, you'll launch the COMSOL server software on each of the requested nodes.
Do this by entering the following sequence of commands (replace your_comsol_module with the
appropriate module for your group, see Table 1):
$ cd $SLURM_SUBMIT_DIR
$ srun hostname | sort | uniq > nodes.comsol
$ module load your_comsol_module
$ comsol -nn 2 -np 12 -f nodes.comsol server
Be sure to match the value of the -nn (number of nodes) argument with the actual number of nodes
requested by the previous fisbatch command. Likewise, match the -np (number of processors per
node) argument with the actual number of tasks per node (--ntasks-per-node) requested by the
previous fisbatch command. In this example, these values are 2 and 12, respectively. When you first
run the server you may be prompted for a username and password. If this happens, enter your UB CCR
username and password. The COMSOL server should then launch, and you should see output similar to
the following in the terminal:
Node 0 is running on host: k16n13a.ccr.buffalo.edu
Node 0 has address: k16n13a.ccr.buffalo.edu
Node 1 is running on host: k16n12b.ccr.buffalo.edu
Node 1 has address: k16n12b.ccr.buffalo.edu
COMSOL 4.3 (Build: 151) started listening on port 2036
Use the console command 'close' to exit the application
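As an optional sanity check, nodes.comsol should contain exactly one hostname per allocated node, and the -nn argument must equal that line count. A minimal sketch, using the two example hostnames from the server output above in place of a real SLURM allocation:

```shell
# Recreate the host file as srun would on the allocated nodes
# (hostnames here are the example nodes from the server output above).
printf '%s\n' k16n13a.ccr.buffalo.edu k16n12b.ccr.buffalo.edu > nodes.comsol

# The line count must match the -nn argument passed to comsol (2 here).
wc -l < nodes.comsol
```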
You may now minimize (but do not close) the COMSOL server terminal window. In the next step, we
will run the COMSOL client GUI and connect it to these compute nodes.
Step 2: Launch the COMSOL GUI via comsol client
Open a second connection to the UB CCR front-end machine, or open a second terminal in the remote
visualization desktop. In the resulting terminal window, enter the following commands to launch the
COMSOL client GUI from the front-end. Replace your_comsol_module with the version that is
appropriate for your group (see Table 1, above):
$ module load your_comsol_module
$ comsol client
The COMSOL GUI splash screen will appear, followed by a dialog box that prompts for information about
the server node. An example is given below:
In the Server text box, enter the name of the head compute node. This corresponds to the name of
Node 0 in the comsol server output (see above). For this example, the head node is
k16n13a.ccr.buffalo.edu. In the Port number text box, enter the port number that the server is
listening on. This is provided in the output from the comsol server command (see above); in this
example the value is 2036. Enter your UB CCR username and password and click the OK button. The
COMSOL GUI will then load and connect to the compute nodes behind the scenes.
After clicking the Show icon in the Model Builder area, a drop-down list will appear. In this list, make
sure the Advanced Study Options box is checked. It is circled in red in the figure below:
In the Model Builder area, highlight the name of the study that you'd like to run in parallel. Then
right-click and select Cluster Computing from the resulting drop-down list. This will add a Cluster
Computing node to the selected study, as shown on the following page.
Click on the newly added Cluster Computing node. This will open the Cluster Computing tab to the
right of the Model Builder area, as shown below:
In the Batch Settings area of the Cluster Computing tab, do the following:
(1) Select General from the drop-down list of Cluster types
(2) Uncheck the MPD is running box
(3) In the Host file: text box type the full path to the location of the nodes.comsol file that was
created in Step 1 (see above)
(4) Leave the Bootstrap server textbox blank
(5) In the Rsh textbox, type /usr/bin/ssh
(6) In the Number of nodes textbox, enter the number of compute nodes requested in Step 1 (see
above). For this example, the value is 2.
(7) In the Filename box, enter the full path to the .mph model file that you have opened. In this
example, the value is: /projects/ccrstaff/lsmatott/comsol/buoy/buoyancy_free.mph
(8) In the Directory box, enter the full path to the directory where the .mph model file is located.
In this example, the value is: /projects/ccrstaff/lsmatott/comsol/buoy
(9) Uncheck the Specify external COMSOL batch directory path box
(10) Uncheck the Specify external COMSOL batch directory path box
(11) Uncheck the Use batch license box
When all fields are filled out correctly, click the Save button. It is circled in red in the figure below. For
this example, the completed Cluster Computing configuration tab is given on the following page.