Closed "timeVaryingMappedFixedValue" cannot read files when I use mpirun

    Hi,

    The boundary condition "timeVaryingMappedFixedValue" fails to read the inlet files in constant/boundaryData/inlet at each time step when I use mpirun.
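
    For context, a typical inlet entry in 0/U for this boundary condition looks like the following (a minimal sketch based on the standard pitzDailyExptInlet tutorial; the setAverage and offset values shown here are illustrative):

        inlet
        {
            // maps values from constant/boundaryData/<patchName>/<time>
            type            timeVaryingMappedFixedValue;
            // do not rescale the mapped field to a prescribed average
            setAverage      off;
            // constant offset added to the mapped values
            offset          (0 0 0);
        }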

    **1. This is what I see on the screen:**

    Starting time loop
    
    streamLine streamLines:
    Employing velocity field U
     automatic track length specified through number of sub cycles : 5
    
    Time = 1
    timeVaryingFixedValueFvPatchField : Read 70 sample points from "/home/ofuser/workingDir/run/pitzDailyExptInlet/processor0/../constant/boundaryData/inlet/points"
    timeVaryingFixedValueFvPatchField : In directory "/home/ofuser/workingDir/run/pitzDailyExptInlet/processor0/../constant/boundaryData/inlet" found times
    4
    (
    0
    0.7
    2
    4
    )
    
    [0] checkTable : Reading startValues from "boundaryData/inlet/0.7"
    [3] checkTable : Reading startValues from "boundaryData/inlet/0.7"
    [2] checkTable : Reading startValues from "boundaryData/inlet/0.7"
    [1] checkTable : Reading startValues from "boundaryData/inlet/0.7"
    [3] checkTable : Reading endValues from "boundaryData/inlet/2"
    [0] checkTable : Reading endValues from "boundaryData/inlet/2"
    [2] checkTable : Reading endValues from "boundaryData/inlet/2"
    [0]
    [0]
    [0] --> FOAM FATAL IO ERROR:
    [0] file "../constant/boundaryData/inlet/2/U" does not exist
    [0]
    [0] file: ../constant/boundaryData/inlet/2/U at line 1.
    [0]
    [0]     From function Foam::IFstream& Foam::IFstream::operator()() const
    [0]     in file db/IOstreams/Fstreams/IFstream.C at line 176.
    [0]
    FOAM parallel run exiting
    [0]
    --------------------------------------------------------------------------
    MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
    with errorcode 1.
    
    NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
    You may or may not see output from other processes, depending on
    exactly when Open MPI kills them.
    --------------------------------------------------------------------------
    [1] checkTable : Reading endValues from "boundaryData/inlet/2"
    [3]
    [3]
    [3] --> FOAM FATAL IO ERROR:
    [3] file "../constant/boundaryData/inlet/2/U" does not exist
    [3]
    [3] file: ../constant/boundaryData/inlet/2/U at line 1.
    [3]
    [3]     From function Foam::IFstream& Foam::IFstream::operator()() const
    [3]     in file db/IOstreams/Fstreams/IFstream.C at line 176.
    [3]
    FOAM parallel run exiting
    [3]
    [2]
    [2]
    [2] --> FOAM FATAL IO ERROR:
    [2] file "../constant/boundaryData/inlet/2/U" does not exist
    [2]
    [2] file: ../constant/boundaryData/inlet/2/U at line 1.
    [2]
    [2]     From function Foam::IFstream& Foam::IFstream::operator()() const
    [2]     in file db/IOstreams/Fstreams/IFstream.C at line 176.
    [2]
    FOAM parallel run exiting
    [2]
    [1]
    [1]
    [1] --> FOAM FATAL IO ERROR:
    [1] file "../constant/boundaryData/inlet/2/U" does not exist
    [1]
    [1] file: ../constant/boundaryData/inlet/2/U at line 1.
    [1] [default:26184] 1 more process has sent help message help-mpi-api.txt / mpi-abort
    [default:26184] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

    Attachment: [pitzDailyExptInlet.7z](/uploads/eb58a7f8a25df1966091c72985e38262/pitzDailyExptInlet.7z)

    **2. To reproduce the error, the steps are:**

    Step 1: clone the pitzDailyExptInlet tutorial case;

    Step 2: create the boundary data for the other time steps by copying constant/boundaryData/inlet/0, so that I have new directories and files like:

    constant/boundaryData/inlet/0.7/U constant/boundaryData/inlet/0.7/k constant/boundaryData/inlet/0.7/epsilon

    constant/boundaryData/inlet/2/U constant/boundaryData/inlet/2/k constant/boundaryData/inlet/2/epsilon

    constant/boundaryData/inlet/4/U constant/boundaryData/inlet/4/k constant/boundaryData/inlet/4/epsilon

    Step 3: run "blockMesh", "decomposePar" and then "mpirun -np 4 simpleFoam -parallel" (the full sequence of commands is sketched below).
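
    The whole procedure, written as a shell sketch (the tutorial path and the use of a 4-subdomain system/decomposeParDict are assumptions about the typical setup):

        # Step 1: copy the tutorial case into the run directory
        cp -r $FOAM_TUTORIALS/incompressible/simpleFoam/pitzDailyExptInlet $FOAM_RUN/
        cd $FOAM_RUN/pitzDailyExptInlet

        # Step 2: duplicate the time-0 boundary data for the later times
        for t in 0.7 2 4
        do
            cp -r constant/boundaryData/inlet/0 constant/boundaryData/inlet/$t
        done

        # Step 3: mesh, decompose into 4 subdomains and run in parallel
        blockMesh
        decomposePar
        mpirun -np 4 simpleFoam -parallel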

    **3. OpenFOAM 3.0.x is used and the case is run on Windows (Windows Server 2012).**

    • No errors are produced during "blockMesh" or "decomposePar" before the mpirun step.

    • No errors are produced when the calculation is run sequentially (without mpirun).

    • I have attached my test case.

    Best Regards,

