openfoam / openfoam · Issues · #851 · Closed
Issue created May 30, 2018 by Admin@OpenFOAM-admin (Maintainer)

Problem with mixtureKEpsilon at epsilonWallFunction BCs intersected by processor boundaries

  • Category: Bug
  • Reproducibility: sometimes
  • Severity: major
  • Priority: high
  • Profile: Ubuntu 16.04.3 LTS
  • Product Version: OpenFOAM v1712

Summary: Problem with mixtureKEpsilon at epsilonWallFunction BCs intersected by processor boundaries.

Description: When running in parallel, the mixtureKEpsilon turbulence model produces unrealistic patterns in the epsilon.air/epsilon.water fields (and consequently in the k.air/k.water fields) wherever an epsilonWallFunction boundary condition is intersected by processor boundaries. Cell values of k and epsilon change abruptly from one processor block to the next, and the residuals of epsilonm stay at around 1 while all other residuals appear to converge.
I also noticed that in the time directories of the individual processors, the boundary condition for epsilon.air/epsilon.water, originally defined as epsilonWallFunction in the 0 directory, is automatically changed to a fixedValue BC. The same case setup with the continuousGasKEpsilon turbulence model causes no problems.
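For illustration, a minimal sketch of the observed change in the decomposed fields; the patch name and value below are placeholders, not taken from the attached case:

```
// 0/epsilon.water as set up before decomposition (illustrative patch name and value)
walls
{
    type            epsilonWallFunction;
    value           uniform 0.01;
}

// processor*/<time>/epsilon.water as observed after running in parallel:
// the wall-function entry appears rewritten as a plain fixedValue BC
walls
{
    type            fixedValue;
    value           uniform 0.01;
}
```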

Steps to reproduce:

  1. Change numberOfSubdomains in decomposeParDict to the number of available cores (I used 35 subdomains).
  2. Run decomposePar.
  3. Run the case in parallel until approx. 0.1 s (see the command sketch after this list).
  4. Inspect the epsilon.air, epsilon.water or epsilonm field with ParaView: view the long stretch somewhere between -30.5 m < x < 0 m and rescale the range to “visible data range”.
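A command-line sketch of the steps above, assuming the case is launched through mpirun with an Euler-Euler solver; the solver name and log file name are assumptions, since the report does not state which solver the case uses:

```sh
# Steps 1-2: set numberOfSubdomains to 35 in system/decomposeParDict, then decompose
decomposePar

# Step 3: run in parallel until approx. t = 0.1 s
# (solver name assumed; replace with the solver actually used by the case)
mpirun -np 35 twoPhaseEulerFoam -parallel > log.solver 2>&1

# Step 4: reconstruct the latest time and inspect epsilon.air / epsilon.water / epsilonm
# in ParaView in the region -30.5 m < x < 0 m
reconstructPar -latestTime
```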

Additional Information: The initial conditions for U.water, U.air, p, p_rgh, etc. in the 0 directory are based on a stable laminar simulation of the case. Unfortunately, the case is too large to upload directly here, but it can be downloaded from: https://polybox.ethz.ch/index.php/s/CQKaYP9WvJKoIc4

Attached images: alpha_w, epsilon_air, k_air, epsilon_water, epsilon_water2, epsilon_water3

## Reattaching the author to the issue ticket: @mbuergle ##

Edited Dec 11, 2019 by Kutalmış Berçin