The next OpenFOAM user group meeting (Stammtisch) in Southern Germany will take place on Friday 20.09.2013 at RRZE.
For details visit http://www.extend-project.de/events-and-meetings/57.
Compared with other software, installing OpenFOAM is (still) a nightmare. It uses its very own build system, there are tons of environment variables to set, etc. But it seems that users in academia and industry accept OpenFOAM nevertheless. For release 1.7.1, I took the time to create a little recipe (in some parts very specifically tailored to RRZE’s installation of software packages) to more or less automatically build OpenFOAM and some accompanying ThirdParty packages from scratch using the Intel compilers (icc/icpc) and Intel MPI instead of Gcc and Open MPI (only Qt and ParaView are still built using gcc). The script is provided as-is, without any guarantee that it works elsewhere and of course also without any support. The script assumes that the required source code packages have already been downloaded. Where necessary, the unpacked sources are patched and the compilation commands are executed. Finally, two new tarballs are created which contain the required “output” for a clean binary installation, i.e. intermediate output files (e.g. *.dep) are not included …
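The rough shape of the procedure is sketched below; module names, tarball names and the exact way the Intel toolchain is selected are assumptions and will differ from the actual script:

```
# Sketch only: module names, tarball names and the way the Intel toolchain is
# selected are assumptions; the real script is tailored to RRZE's environment.
module load intel64 intelmpi                 # hypothetical compiler/MPI modules

tar xzf OpenFOAM-1.7.1.gtgz                  # sources assumed to be downloaded already
tar xzf ThirdParty-1.7.1.gtgz

# Patch the sources where necessary, set WM_COMPILER=Icc in
# OpenFOAM-1.7.1/etc/bashrc and point the MPI settings at Intel MPI, then
source OpenFOAM-1.7.1/etc/bashrc

(cd OpenFOAM-1.7.1 && ./Allwmake > log.Allwmake 2>&1)   # takes many hours
# Qt and ParaView are built separately (still with gcc) using the scripts in
# the ThirdParty tree.
```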
Compilation takes ages, but that’s not really surprising: extracting the source tarballs alone amounts to 1.6 GB in almost 45k files/directories. After compilation (although neither Open MPI nor Gcc are built), the size increases to 6.5 GB or 120k files. If all intermediate compilation files are removed, about 1 GB or 30k files/directories remain in my “clean installation” (with only the Qt/ParaView libraries/binaries in the ThirdParty tree).
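Just as an illustration of what removing the intermediate files means (the *.dep files are the only pattern mentioned above; anything else the actual script prunes is not shown):

```
# Remove wmake's dependency files before packing the "clean" tarballs;
# *.dep is the only pattern mentioned in the text, others are left out here.
find OpenFOAM-1.7.1 -name '*.dep' -type f -delete
# Check what is left in terms of size and number of files/directories.
du -sh OpenFOAM-1.7.1 ThirdParty-1.7.1
find OpenFOAM-1.7.1 ThirdParty-1.7.1 | wc -l
```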
RRZE users find OpenFOAM-1.7.1 as a module on Woody and TinyBlue. The binaries used on Woody and TinyBlue are slightly different, as they were natively compiled on SuSE SLES 10SP3 and Ubuntu 8.04, respectively. The main difference should only be in the Qt/ParaView part, as SLES10 and Ubuntu 8.04 come with different Python versions. ParaView should also be compiled with MPI support.
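Loading and sanity-checking the installation then boils down to something like the following; the exact module name is a placeholder, so check module avail first:

```
# Module name is a placeholder; check "module avail" on Woody/TinyBlue.
module load openfoam/1.7.1
foamInstallationTest      # verifies paths, compiler and environment settings
icoFoam -help             # any solver should now be found on the PATH
```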
Note (2012-06-08): to be able to compile src/finiteVolume/fields/fvPatchFields/constraint/wedge/wedgeFvPatchScalarField.C with recent versions of the Intel compiler, one has to patch this file to avoid an error of the form no instance of overloaded function “Foam::operator==” matches the argument list; cf. http://www.cfd-online.com/Forums/openfoam-installation/101961-compiling-2-1-0-rhel6-2-icc.html and https://github.com/OpenFOAM/OpenFOAM-2.1.x/commit/8cf1d398d16551c4931d20d9fc3e42957d0f93ca. These links are for OF-2.1.x, but the fix works for OF-1.7.1 as well.
OpenFOAM is a widely used open-source software for computational fluid dynamics (CFD). There is also a growing number of groups on our campus which use OpenFOAM or at least give it a try. I have never used OpenFOAM for CFD simulations myself – I have only spent lots of hours installing it on RRZE’s clusters. But from what I hear from actual users, the documentation seems to be rather poor, resulting in a steep learning curve. To facilitate and stimulate coordinated communication and self-help among the different OpenFOAM users and groups at the University of Erlangen-Nuremberg, a local mailing list has been set up. OpenFOAM users from outside the University of Erlangen-Nuremberg are also welcome if they make a substantial contribution – but keep in mind that this local mailing list is not an official OpenFOAM support forum.
The subscription policy for the mailing list is “open”, i.e. everyone can directly subscribe/unsubscribe. Posts to the mailing list are only allowed from registered users (i.e. from the email address used for subscription) – all other messages require approval by the moderator to prevent spam.
For further information and (un)subscription, please visit the webpage of the rrze-openfoam-user mailing list.
Does not work as SuSE SLES10SP1 is too different …; one very strange thing is that the gcc-4.3.1 included in the ThirdParty packages works in 32-bit mode but complains about an incompatible library in 64-bit mode, although the library is a correct 64-bit *.so.
Fails due to problems with C++ templates, etc.
Has too many dependencies to do this quickly (MPFR and GMP – the SuSE SLES10SP1 versions are too old).
Requires a patch for autoRefineDriver.C to avoid a fatal “invalid conversion” error message; cf. http://openfoam.cfd-online.com/cgi-bin/forum/show.cgi?tpc=126&post=24786.
As there was some interest in OpenFOAM (“The Open Source CFD Toolbox”), I started installing it on our Woody cluster – can’t be too difficult, I thought.
Unfortunately, the pre-compiled binaries did not work as we have to run SuSE SLES9SP3 on this cluster (owing to the HP SFS parallel file system) and SLES9SP3 does not contain the required versions of gcc, openssl and probably some more packages.
Well, compiling from sources should not be a problem, and then we can link against our “supported” Intel MPI library. No problem, right? Well, unpacking the OpenFOAM sources on an NFS directory takes ages (no surprise – almost 44k files/directories get extracted), they use their own build system, … To cut a long story short, I gave up on the Intel compilers and Intel MPI for the moment – gcc and the provided Open-MPI are used for now. Compilation takes ages (again no surprise, as the installation directory grows to 1.1 GB) … and Java complains about missing com.sun.j3d.utils.* – ah, you have to install Java 3D in addition (why didn’t the documentation mention this?) …
O.k., the first compilation is done (in 32-bit, with the integrated Open-MPI and probably neither Infiniband support nor PBS/Torque integration included). Now let’s build module files to integrate OpenFOAM into the environment loading scheme. This requires quite some work, as well over 30 environment variables have to be set or modified. (Thanks to LRZ for the work they already did on HLRB2 – that was a good starting point, although it did not completely fit our needs.) But at least foamInstallationTest now does not report any errors!
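To give an idea of what such a module file has to provide, here is a small, simplified subset written as plain shell exports; the installation prefix is a made-up example, and in a plain installation sourcing the bashrc shipped in the .OpenFOAM-1.4.1 directory sets all of this:

```
# Small subset of the >30 variables; prefix and values are examples only and
# follow the usual OpenFOAM directory layout - verify against the installation.
export WM_PROJECT=OpenFOAM
export WM_PROJECT_VERSION=1.4.1
export WM_PROJECT_DIR=/apps/OpenFOAM/OpenFOAM-1.4.1      # hypothetical prefix
export WM_PROJECT_USER_DIR=$HOME/OpenFOAM/$USER-1.4.1
export WM_COMPILER=Gcc
export WM_MPLIB=OPENMPI
export WM_OPTIONS=linux64GccDPOpt                        # arch/compiler/precision/build
export FOAM_APPBIN=$WM_PROJECT_DIR/applications/bin/$WM_OPTIONS
export FOAM_LIBBIN=$WM_PROJECT_DIR/lib/$WM_OPTIONS
export PATH=$FOAM_APPBIN:$PATH
export LD_LIBRARY_PATH=$FOAM_LIBBIN:$LD_LIBRARY_PATH
```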
The first (solved) problem was that the nsd daemon of OpenFOAM tries to create some sort of lock file (ns.ref) in $WM_PROJECT_DIR/.OpenFOAM-1.4.1/apps/FoamX – this directory of course is on the NFS server and not writable by users. Copying the FoamX subdirectory to the user’s directory and adjusting $FOAMX_CONFIG solved the issue. Any better solution?
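For the record, the per-user workaround sketched as shell commands; the target directory under $HOME is an arbitrary choice, and exactly what $FOAMX_CONFIG has to point to should be double-checked:

```
# Copy the FoamX configuration to a user-writable location and point
# FOAMX_CONFIG there (assumption: the variable takes the directory itself).
mkdir -p $HOME/OpenFOAM
cp -r $WM_PROJECT_DIR/.OpenFOAM-1.4.1/apps/FoamX $HOME/OpenFOAM/FoamX
export FOAMX_CONFIG=$HOME/OpenFOAM/FoamX
```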
A 64-bit compilation now also finished, in around 4 hours (again with OpenFOAM defaults only). However, the 32- and 64-bit versions lack the ParaView integration, thus some commands like paraFoam currently fail. Obviously, the ParaView sources are required while compiling, too.
http://www.tfd.chalmers.se/~hani/kurser/OF_phD_2007/downloadCompileAndRun.pdf seems to contain good guidelines for compiling and getting paraFoam et al. working … But just copying the original binary of libPVFoamReader.so did not do the trick for me.
On the other hand, adding PBS/Torque and Infiniband support to the provided Open-MPI seems to be easy; for now I only added --with-tm=$OUR_TORQUE --with-openib=$OUR_OFED to $WM_PROJECT_DIR/src/Allwmake and recompiled just Open-MPI. Torque of course has to be compiled with support for position-independent code or as a shared library (cf. http://www.open-mpi.de/faq/?category=building#build-rte-tm). As we only have 64-bit OFED and Torque libraries, of course only the 64-bit build of OpenFOAM will have built-in support for them.
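For reference, the relevant part of the Open MPI configure call then looks roughly like this; $OUR_TORQUE and $OUR_OFED stand for the local Torque and OFED installation prefixes, and the --prefix value is only indicative of how OpenFOAM names its MPI install path:

```
# Inside $WM_PROJECT_DIR/src/Allwmake, the Open MPI configure line gains the
# two options below; $OUR_TORQUE and $OUR_OFED are the local install prefixes.
./configure --prefix=$MPI_ARCH_PATH \
            --with-tm=$OUR_TORQUE \
            --with-openib=$OUR_OFED
```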
Let’s see if some users really will use it (and what they complain about).
More problems? Probably yes …