Thomas Zeiser

Some comments by Thomas Zeiser about HPC@RRZE and other things

Content

Installation of OpenFOAM

As there was some interest in OpenFOAM (“The Open Source CFD Toolbox”), I started installing it on our Woody cluster – can’t be too difficult, I thought.

Unfortunately, the pre-compiled binaries did not work, as we have to run SuSE SLES9SP3 on this cluster (owing to the HP SFS parallel file system), and SLES9SP3 does not contain the required versions of gcc, openssl, and probably some other packages.

Well, compiling from sources should not be a problem, and then we can link against our “supported” Intel MPI library. No problem, right? Well, unpacking the OpenFOAM sources on an NFS directory takes ages (no surprise – almost 44k files/directories get extracted), they use their own build system, … To cut a long story short, I gave up on the Intel compilers and Intel MPI for the moment – gcc and the bundled Open-MPI are used for now. Compilation takes ages (again no surprise, as the installation directory grows to 1.1 GB) … and Java complains about missing com.sun.j3d.utils.* – ah, you have to install Java 3D in addition (why doesn’t the documentation mention this?) …
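For the record, the from-source build boiled down to something like the following sketch – the tarball names and install path are illustrative, written from memory, and may differ from what the OpenFOAM 1.4.1 distribution actually ships:

```shell
# Hypothetical sketch of the OpenFOAM 1.4.1 source build; names/paths
# are assumptions, not the exact ones used on our cluster.
cd /apps/OpenFOAM                          # placeholder install location
tar xzf OpenFOAM-1.4.1.General.gtgz        # ~44k files – very slow on NFS
cd OpenFOAM-1.4.1
. .OpenFOAM-1.4.1/bashrc                   # sets the WM_* build environment
./Allwmake                                 # top-level build script (takes hours)
```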

O.k., first compilation done (in 32-bit, with the integrated Open-MPI and probably neither Infiniband support nor PBS/Torque integration included). Now let’s build module files to integrate OpenFOAM into the environment-loading scheme. This requires quite some work, as more than 30 environment variables have to be set or modified. (Thanks to LRZ for the work they already did on HLRB2 – that was a good starting point, although it did not completely fit our needs.) But at least foamInstallationTest now does not report any errors!
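For illustration, a small subset of those variables, written as plain shell exports rather than a module file (the install path is made up, and the real list is much longer):

```shell
# Illustrative subset of the >30 environment variables an OpenFOAM
# module file has to provide; all values below are placeholders for a
# hypothetical installation under /apps.
export WM_PROJECT=OpenFOAM
export WM_PROJECT_VERSION=1.4.1
export WM_PROJECT_DIR=/apps/OpenFOAM/OpenFOAM-${WM_PROJECT_VERSION}
export WM_ARCH=linux
export WM_COMPILER=Gcc              # gcc build, as described above
export WM_PRECISION_OPTION=DP       # double precision
export FOAM_APPBIN=$WM_PROJECT_DIR/applications/bin/${WM_ARCH}${WM_COMPILER}${WM_PRECISION_OPTION}
export PATH=$FOAM_APPBIN:$PATH
```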

The first (solved) problem was that the nsd daemon of OpenFOAM tries to create some sort of lock file (ns.ref) in $WM_PROJECT_DIR/.OpenFOAM-1.4.1/apps/FoamX – this directory, of course, lives on the NFS server and is not writable by users. Copying the FoamX subdirectory to the user’s directory and adjusting $FOAMX_CONFIG solved the issue. Any better solution?
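As a sketch, the per-user workaround looks roughly like this (the FoamX paths are the ones mentioned above; the destination directory and everything else are assumptions):

```shell
# Hypothetical per-user workaround for the read-only FoamX directory
# (the ns.ref lock file problem); destination path is an assumption.
SRC=$WM_PROJECT_DIR/.OpenFOAM-1.4.1/apps/FoamX
DST=$HOME/.OpenFOAM-1.4.1/apps/FoamX
mkdir -p "$DST"
cp -r "$SRC"/. "$DST"/        # give the user a writable copy
export FOAMX_CONFIG=$DST      # point FoamX at it
```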

A 64-bit compilation now also finished, in around 4h (again with OpenFOAM defaults only). However, both the 32- and the 64-bit versions lack the integration of ParaView; thus, some commands like paraFoam currently fail. Obviously, the ParaView sources are required at compile time, too.

http://www.tfd.chalmers.se/~hani/kurser/OF_phD_2007/downloadCompileAndRun.pdf seems to contain good guidelines for compiling and getting paraFoam et al. working … But just copying the original binary of libPVFoamReader.so did not do the trick for me.

On the other hand, adding PBS/Torque and Infiniband support to the bundled Open-MPI seems to be easy; I just added --with-tm=$OUR_TORQUE --with-openib=$OUR_OFED to $WM_PROJECT_DIR/src/Allwmake and recompiled only Open-MPI. Torque, of course, has to be compiled with support for position-independent code or as a shared library (cf. http://www.open-mpi.de/faq/?category=building#build-rte-tm). As we only have 64-bit OFED and Torque libraries, of course only the 64-bit build of OpenFOAM will have built-in support for them.
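In shell terms, the change amounts to something like the following – the Torque and OFED prefixes are placeholders for the respective local installations, and the configure call stands in for the one Allwmake issues for Open-MPI:

```shell
# Placeholders for the site-specific installation prefixes:
OUR_TORQUE=/opt/torque    # Torque built with -fPIC or as a shared library
OUR_OFED=/usr/local/ofed  # 64-bit OFED (InfiniBand verbs) installation

# The two options appended to the Open-MPI configure call in
# $WM_PROJECT_DIR/src/Allwmake before recompiling just Open-MPI:
./configure --with-tm="$OUR_TORQUE" --with-openib="$OUR_OFED"
make all install
```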

Let’s see if some users really will use it (and what they complain about).

More problems? Probably yes …

HPC and CFD courses in spring 2008

There will be a number of courses on HPC and CFD topics during the next months:

  • Tutorial: Programming with Fortran 95/2003: Object orientation and design patterns, February 4th-6th, LRZ-Munich (video transmission to RRZE possible if there is enough interest). For more details check http://www.lrz-muenchen.de/services/compute/courses/#OOFortran
  • Workshop: Performance Analysis of parallel programs with VAMPIR, February 7th, LRZ-Munich (video transmission to RRZE possible if there is enough interest). For more details check http://www.lrz-muenchen.de/services/compute/courses/#Vampir
  • NUMET 2008, March 10th-13th, 2008, LSTM-Erlangen. For more details check http://www.lstm.uni-erlangen.de/numet2008/
  • Introductory course on High Performance Computing, March 17th-20th, RRZE. For more details check http://www.rrze.uni-erlangen.de/news/meldungen/meldung.shtml/9483

Also HLRS is organizing a couple of its annual courses and workshops during the next months: http://www.hlrs.de/news-events/events/

Microsoft campus day with a special focus on Windows-Compute-Cluster (CCS)

For some time now, the HPC group at RRZE has been operating a small Windows Compute Cluster (cf. http://www.rrze.uni-erlangen.de/dienste/arbeiten-rechnen/hpc/systeme/windows-cluster.shtml).

Up to now, we have observed only very little interest among our “normal” HPC users. Therefore, a Microsoft campus day with a special focus on Windows CCS is being organized on January 17th at the Faculty of Economics; see
http://www.rrze.uni-erlangen.de/news/meldungen/meldung.shtml/9484 for more details.