
  • Posted by Imani on March 1, 2024 at 11:23 pm

    I’m trying to run a simulation on our university cluster, but I’m getting this error:
    PIMPLE: iteration 1
    MULES: Solving for Species
    MULES: Solving for alpha.phase1
    GAMG: Solving for pc, Initial residual = 0.00664789, Final residual = 5.42177e-08, No Iterations 10
    Phase-1 volume fraction = 0.15 Min(alpha.phase1) = -1.00361e-111 Max(alpha.phase1) = 1
    GAMG: Solving for pc, Initial residual = 0.00208986, Final residual = 7.39091e-08, No Iterations 7
    GAMG: Solving for p_rgh, Initial residual = 0.00268372, Final residual = 2.13981e-05, No Iterations 4
    time step continuity errors : sum local = 1.12609e-09, global = 8.28254e-12, cumulative = 6.26613e-08
    GAMG: Solving for p_rgh, Initial residual = 0.000260556, Final residual = 1.15939e-06, No Iterations 4
    time step continuity errors : sum local = 6.10078e-11, global = 1.19586e-13, cumulative = 6.26614e-08
    GAMG: Solving for p_rgh, Initial residual = 3.44524e-05, Final residual = 4.79729e-08, No Iterations 5
    time step continuity errors : sum local = 2.52437e-12, global = -1.46494e-14, cumulative = 6.26614e-08
    DILUPBiCGStab: Solving for Species, Initial residual = 4.93969e-06, Final residual = 2.27243e-09, No Iterations 1
    Species concentration = 0.15 Min(Yi) = 5.45485e-189 Max(Yi) = 1
    ExecutionTime = 79.6 s ClockTime = 80 s

    Courant Number mean: 0.0102411 max: 0.0183727
    Interface Courant Number mean: 0.000216009 max: 0.0158203
    deltaT = 1e-07
    Time = 0.0002001

    PIMPLE: iteration 1
    MULES: Solving for Species
    MULES: Solving for alpha.phase1
    [1] #0 Foam::error::printStack(Foam::Ostream&)
    --------------------------------------------------------------------------
    A process has executed an operation involving a call to the
    “fork()” system call to create a child process. Open MPI is currently
    operating in a condition that could result in memory corruption or
    other system errors; your job may hang, crash, or produce silent
    data corruption. The use of fork() (or system() or other calls that
    create child processes) is strongly discouraged.

    The process that invoked fork was:

    Local host: [[52723,1],1] (PID 7598)

    If you are *absolutely sure* that your application will successfully
    and correctly survive a call to fork(), you may disable this warning
    by setting the mpi_warn_on_fork MCA parameter to 0.
    --------------------------------------------------------------------------
    in ~/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so
    [1] #1 Foam::sigSegv::sigHandler(int) in ~/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so
    [1] #2 ? in /lib64/libpthread.so.0
    [1] #3 ? in /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libopen-pal.so.40
    [1] #4 opal_progress in /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libopen-pal.so.40
    [1] #5 ompi_request_default_wait in /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libmpi.so.40
    [1] #6 ompi_coll_base_sendrecv_actual in /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libmpi.so.40
    [1] #7 ompi_coll_base_allreduce_intra_recursivedoubling in /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libmpi.so.40
    [1] #8 MPI_Allreduce in /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libmpi.so.40
    [1] #9 Foam::reduce(double&, Foam::sumOp const&, int, int) in ~/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/sys-openmpi/libPstream.so
    [1] #10 Foam::PCG::scalarSolve(Foam::Field&, Foam::Field const&, unsigned char) const in ~/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so
    [1] #11 Foam::GAMGSolver::solveCoarsestLevel(Foam::Field&, Foam::Field const&) const in ~/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so
    [1] #12 Foam::GAMGSolver::Vcycle(Foam::PtrList const&, Foam::Field&, Foam::Field const&, Foam::Field&, Foam::Field&, Foam::Field&, Foam::Field&, Foam::Field&, Foam::PtrList<Foam::Field >&, Foam::PtrList<Foam::Field >&, unsigned char) const in ~/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so
    [1] #13 Foam::GAMGSolver::solve(Foam::Field&, Foam::Field const&, unsigned char) const in ~/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so
    [1] #14 Foam::fvMatrix::solveSegregated(Foam::dictionary const&) in ~/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so
    [1] #15 Foam::fvMatrix::solveSegregatedOrCoupled(Foam::dictionary const&) in ~/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so
    [1] #16 Foam::fvMesh::solve(Foam::fvMatrix&, Foam::dictionary const&) const in ~/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so
    [1] #17 Foam::interfaceProperties::calculatePhic() in ~/OpenFOAM/GeoChemFoam/GeoChemFoam-5.0-main/lib/libinterfacePropertiesGCFOAM.so
    [1] #18 ? in ~/OpenFOAM/GeoChemFoam/GeoChemFoam-5.0-main/bin/interTransportFoam
    [1] #19 __libc_start_main in /lib64/libc.so.6
    [1] #20 ? in ~/OpenFOAM/GeoChemFoam/GeoChemFoam-5.0-main/bin/interTransportFoam
    [c20:07598] *** Process received signal ***
    [c20:07598] Signal: Segmentation fault (11)
    [c20:07598] Signal code: (-6)
    [c20:07598] Failing at address: 0x13b800001dae
    [c20:07598] [ 0] /lib64/libpthread.so.0(+0xf5e0)[0x7f45d4cd75e0]
    [c20:07598] [ 1] /lib64/libpthread.so.0(raise+0x2b)[0x7f45d4cd74ab]
    [c20:07598] [ 2] /lib64/libpthread.so.0(+0xf5e0)[0x7f45d4cd75e0]
    [c20:07598] [ 3] /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libopen-pal.so.40(+0x9d4ae)[0x7f45ce1394ae]
    [c20:07598] [ 4] /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libopen-pal.so.40(opal_progress+0x2c)[0x7f45ce0f105c]
    [c20:07598] [ 5] /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libmpi.so.40(ompi_request_default_wait+0x115)[0x7f45cfda61f5]
    [c20:07598] [ 6] /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libmpi.so.40(ompi_coll_base_sendrecv_actual+0xca)[0x7f45cfe02e7a]
    [c20:07598] [ 7] /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libmpi.so.40(ompi_coll_base_allreduce_intra_recursivedoubling+0x2b9)[0x7f45cfe03229]
    [c20:07598] [ 8] /gpfs/apps/openmpi/3.1.1-fhdvxk5/lib/libmpi.so.40(MPI_Allreduce+0x17f)[0x7f45cfdbb7af]
    [c20:07598] [ 9] /gpfs/zl.glorie/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/sys-openmpi/libPstream.so(_ZN4Foam6reduceERdRKNS_5sumOpIdEEii+0xf0)[0x7f45d46f7970]
    [c20:07598] [10] /gpfs/zl.glorie/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZNK4Foam3PCG11scalarSolveERNS_5FieldIdEERKS2_h+0x88c)[0x7f45d607246c]
    [c20:07598] [11] /gpfs/zl.glorie/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZNK4Foam10GAMGSolver18solveCoarsestLevelERNS_5FieldIdEERKS2_+0xaf)[0x7f45d609d3df]
    [c20:07598] [12] /gpfs/zl.glorie/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZNK4Foam10GAMGSolver6VcycleERKNS_7PtrListINS_9lduMatrix8smootherEEERNS_5FieldIdEERKS8_S9_S9_S9_S9_S9_RNS1_IS8_EESD_h+0x208)[0x7f45d609f998]
    [c20:07598] [13] /gpfs/zl.glorie/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so(_ZNK4Foam10GAMGSolver5solveERNS_5FieldIdEERKS2_h+0x60c)[0x7f45d60a240c]
    [c20:07598] [14] /gpfs/zl.glorie/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZN4Foam8fvMatrixIdE15solveSegregatedERKNS_10dictionaryE+0x5c0)[0x7f45db1564e0]
    [c20:07598] [15] /gpfs/zl.glorie/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZN4Foam8fvMatrixIdE24solveSegregatedOrCoupledERKNS_10dictionaryE+0x3c8)[0x7f45da98b2c8]
    [c20:07598] [16] /gpfs/zl.glorie/OpenFOAM/OpenFOAM-v2212/OpenFOAM-v2212/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZNK4Foam6fvMesh5solveERNS_8fvMatrixIdEERKNS_10dictionaryE+0xf)[0x7f45da932d3f]
    [c20:07598] [17] /gpfs/zl.glorie/OpenFOAM/GeoChemFoam/GeoChemFoam-5.0-main/lib/libinterfacePropertiesGCFOAM.so(_ZN4Foam19interfaceProperties13calculatePhicEv+0x67f)[0x7f45d7786b0f]
    [c20:07598] [18] interTransportFoam[0x44bdbe]
    [c20:07598] [19] /lib64/libc.so.6(__libc_start_main+0xf5)[0x7f45d4926c05]
    [c20:07598] [20] interTransportFoam[0x454a6b]
    [c20:07598] *** End of error message ***
    -------------------------------------------------------
    Primary job terminated normally, but 1 process returned
    a non-zero exit code. Per user-direction, the job has been aborted.
    -------------------------------------------------------
    --------------------------------------------------------------------------
    mpirun noticed that process rank 1 with PID 0 on node c20 exited on signal 11 (Segmentation fault).
    --------------------------------------------------------------------------
    How can I solve it?

  • 2 Replies
  • Barış Bicer

    Moderator
    March 3, 2024 at 9:32 pm

    Hi İmani,

    First, I think this is not related to the lectures, right?

    I think the main problem is the MPI parallelization on your cluster. Contact the cluster admin and ask them to try running the case in parallel in OpenFOAM, e.g. with the standard decomposePar/mpirun workflow sketched below.
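    This is only a generic sketch: the core count, the log file name, and the assumption that system/decomposeParDict is already set up are mine, not taken from your case.

    # split the mesh and fields across processors
    # (numberOfSubdomains and method are defined in system/decomposeParDict)
    decomposePar

    # run the solver from your log in parallel, here on 4 cores
    mpirun -np 4 interTransportFoam -parallel > log.interTransportFoam 2>&1

    # the fork() warning in your output can be silenced via the MCA parameter it
    # mentions, but this only hides the warning, it does not fix the segfault
    mpirun --mca mpi_warn_on_fork 0 -np 4 interTransportFoam -parallel

    If it still dies inside MPI_Allreduce as in your trace, it is also worth asking the admin whether the Open MPI module loaded at run time is the same one your OpenFOAM/GeoChemFoam build was compiled against.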

    Fyi.

    Baris

    • Imani

      Member
      March 5, 2024 at 3:30 am

      Hi,

      Yes, it is not related to the course; I’m applying what I learned in the course to solve my own problem.

      I appreciate your help.

      The simulations are run in parallel in OpenFOAM. So is there any other solution, or any other problem that you have detected? Could it be due to the fvSchemes or fvSolution files? (A generic example of the kind of settings I mean is below.)
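      By fvSolution I mean the usual pressure solver block, which in my case follows the standard GAMG pattern, something like the sketch here. The values and the smoother choice are generic placeholders, not copied from my actual dictionary.

      p_rgh
      {
          solver          GAMG;      // matches the "GAMG: Solving for p_rgh" lines in the log
          smoother        DIC;       // placeholder smoother choice
          tolerance       1e-08;     // placeholder tolerances
          relTol          0.01;
      }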
