Discrepancy in results between running with the "laminar" turbulence model and running with RAS + "turbulence off"
Results obtained by running a simulation with the laminar turbulence model do not match the results obtained by running with the RAS option and turbulence switched off.
Steps to reproduce
Run the attached simulation case with:
This can be generalised to any incompressible case (only incompressible cases have been tested): run it first with laminar, then compare with the results obtained by running with RAS and turbulence off. A minimal sketch of the two configurations is given below.
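The difference between the two runs is confined to the constant/turbulenceProperties dictionary. A minimal sketch of the two setups follows; the kOmegaSST entry is an assumption for illustration, and the attached case may use a different RAS model.

```
// constant/turbulenceProperties -- laminar run
simulationType  laminar;
```

```
// constant/turbulenceProperties -- RAS run with the turbulence switch off
simulationType  RAS;

RAS
{
    RASModel        kOmegaSST;   // assumed model, for illustration only
    turbulence      off;
    printCoeffs     on;
}
```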
What is the current bug behaviour?
The result obtained by running the case with laminar differs from the result obtained by running the case with RAS + turbulence off.
What is the expected correct behavior?
The results obtained from both configurations should be the same.
Relevant logs and/or images
OpenFOAM version : 1812
Operating system : CentOS 7
Compiler : GCC 4.8.5
The problem comes from the initialization of the nut field.
When the case is run with laminar, the viscosity is taken from the transport properties. When the case is run with RAS and turbulence off, however, the nut field is initialised from the initial values of k and omega / epsilon and is never updated during the simulation.
This means a different effective viscosity is used for cases run with turbulence off.
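To illustrate the effect described above, here is a small standalone C++ sketch (not OpenFOAM source; the field values and the k-omega style relation nut = k/omega are hypothetical) of the two code paths: the laminar model contributes no eddy viscosity, while the RAS model with turbulence off returns from correct() before updating nut, so the value computed from the initial k and omega keeps entering the effective viscosity.

```cpp
// Standalone sketch of the two code paths; all names and numbers are
// illustrative only and do not come from the OpenFOAM sources.
#include <iostream>

struct LaminarModel
{
    double nu;                            // molecular viscosity from transportProperties
    double nuEff() const { return nu; }   // laminar: no turbulent contribution
    void correct() {}                     // nothing to update
};

struct RASModelTurbulenceOff
{
    double nu;        // molecular viscosity
    double nut;       // eddy viscosity, initialised from the initial k and omega
    bool turbulence;  // the "turbulence on/off" switch

    RASModelTurbulenceOff(double nu_, double k0, double omega0, bool turb)
    :
        nu(nu_),
        nut(k0/omega0),                   // set once from the 0/ fields
        turbulence(turb)
    {}

    void correct()
    {
        if (!turbulence) return;          // turbulence off: nut is never updated
        // ... k/omega transport and the nut update would happen here ...
    }

    double nuEff() const { return nu + nut; }  // stale nut still enters nuEff
};

int main()
{
    const double nu = 1e-5, k0 = 0.375, omega0 = 440.0;  // hypothetical values

    LaminarModel lam{nu};
    RASModelTurbulenceOff ras(nu, k0, omega0, /*turbulence=*/false);

    for (int iter = 0; iter < 10; ++iter) { lam.correct(); ras.correct(); }

    std::cout << "laminar         nuEff = " << lam.nuEff() << "\n"
              << "RAS, turb. off  nuEff = " << ras.nuEff() << "\n";
}
```

In this sketch the two effective viscosities differ by the stale nut contribution, which is the discrepancy reported above.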
Is this intentional? Was the turbulence on/off switch implemented with some purpose other than providing a simple way to switch between laminar and turbulent simulations?