View Issue Details

ID: 0003440
Project: OpenFOAM
Category: Bug
View Status: public
Last Update: 2020-11-21 20:00
Reporter: dhitomi
Assigned To: henry
Priority: normal
Severity: major
Reproducibility: always
Status: closed
Resolution: unable to reproduce
Platform: GNU/Linux
OS: RHEL/CentOS
OS Version: 7.3
Summary: 0003440: interFoam/multiphaseInterFoam
Description: There is a memory leak. The problem becomes apparent on long runs, or when using multiphaseInterFoam, regardless of platform.
Steps To Reproduce: Run these solvers for an extended period, for example 2 days or more. In our cases, the memory used can grow to several times the amount at the start.
Additional Information: It seems that the same problem occurs with interFoam-related solvers.
Tags: No tags attached.

Activities

henry

2020-02-05 08:48

manager   ~0011115

Can you reproduce the problem with one of the tutorial cases and with OpenFOAM-7 or OpenFOAM-dev?

dhitomi

2020-02-05 09:19

reporter   ~0011116

We will try that; we have not installed OpenFOAM-7 yet.

henry

2020-02-05 10:51

manager   ~0011117

I am unable to reproduce the problem, we will need much more detail on how to reproduce it before we can take this further.
See https://bugs.openfoam.org/rules.php

dhitomi

2020-02-06 00:17

reporter   ~0011124

We can reproduce this problem in some of our cases. In the tutorial cases we could not confirm the increase in memory used (perhaps because the cases are too small for it to show). We are preparing to send you a data set (mesh, input data and so on). Please wait.

dhitomi

2020-02-06 00:49

reporter   ~0011125

In tutorial/multiphase/multiphaseInterFoam/laminar/damBreak4Fine, the memory used is about 83000 KB per process (RES in top) at the start (Time = 0.0*). However, it had increased to about 130000 KB per process by Time = about 2070.
On the other hand, memory increased from RES 0.0888 GB (%MEM 0.1%) to RES 1.6 GB (%MEM 1.8%). That calculation actually brought the machine down abnormally.

RES and %MEM are as reported by the top command.

We will investigate this further.
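For reference, this kind of per-process RES sampling can be scripted. A minimal sketch (Python with psutil; the solver binary name, the log file name and the one-minute interval are assumptions, not anything described in this report):

    # A minimal sketch, not the reporter's actual procedure: sample the
    # resident set size (top's RES column) of every running solver process
    # once a minute and append it to a CSV log for later inspection.
    # Assumptions: psutil is installed; the solver binary is named
    # "multiphaseInterFoam".
    import csv
    import time

    import psutil

    SOLVER_NAME = "multiphaseInterFoam"  # assumed binary name to watch

    with open("rss_log.csv", "a", newline="") as f:
        writer = csv.writer(f)
        while True:
            stamp = time.time()
            for proc in psutil.process_iter(["name", "pid"]):
                if proc.info["name"] != SOLVER_NAME:
                    continue
                try:
                    rss_kb = proc.memory_info().rss // 1024  # bytes -> KB
                except psutil.NoSuchProcess:
                    continue  # rank exited between the scan and the read
                writer.writerow([stamp, proc.info["pid"], rss_kb])
            f.flush()
            time.sleep(60)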

dhitomi

2020-02-06 00:50

reporter   ~0011126

"The other hand" in previous note means our calculation case.

henry

2020-02-06 09:35

manager   ~0011128

I am running the damBreak4Fine case with an end time of 2000s in OpenFOAM-dev and so far there is no increase in memory usage.

Which OpenFOAM version are you currently running?
Do you see the memory increase only in VoF solvers or all solvers?
Do you see the memory increase only when running in parallel or also serial?
Which MPI version are you using and have you tested any others?
Do you have access to any other machines with more recent GNU/Linux distributions to run tests?

dhitomi

2020-02-06 09:42

reporter   ~0011129

We could reproduce this problem in tutorial/multiphase/multiphaseInterFoam/laminar/damBreak4phaseFine.

The memory used had increased to about 115000 KB per process by Time = about 1200 (about 230000 time steps), whereas it was about 83000 KB for the first 100 time steps.

The reason we believe this is a memory leak is that the memory used drops back to about 83000 KB when the calculation is restarted from Time = about 1200.

Our actual engineering case behaves the same way: the memory used returns to the starting amount when the calculation is restarted. Eventually the calculation brought the machine down abnormally, as mentioned above.

Tomorrow morning we will send an Excel file recording the memory used in the tutorial case and in our engineering cases.
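The restart behaviour is the crux of the leak argument: memory that had grown over about 230000 time steps falls back to the baseline immediately after a restart, rather than being rebuilt as a cache would be. A minimal sketch, assuming RSS samples logged in the CSV format of the earlier snippet, of the per-process before/after comparison:

    # A minimal sketch, assuming the rss_log.csv format above
    # (timestamp, pid, rss_kb): print the first and last sample per PID so
    # that steady growth, and the reset after a restart, are visible at a
    # glance.
    import csv

    first, last = {}, {}
    with open("rss_log.csv") as f:
        for stamp, pid, rss_kb in csv.reader(f):
            if pid not in first:
                first[pid] = int(rss_kb)
            last[pid] = int(rss_kb)

    for pid in first:
        print(f"PID {pid}: {first[pid]} KB -> {last[pid]} KB "
              f"(+{last[pid] - first[pid]} KB)")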

henry

2020-02-06 09:59

manager   ~0011130

I cannot reproduce the problem on tutorial/multiphase/multiphaseInterFoam/laminar/damBreak4phaseFine or any of the other tutorial cases I have tried.
Could you please provide answers to the questions I asked above?

dhitomi

2020-02-06 10:02

reporter   ~0011131

We are using OpenFOAM-7 with OpenMPI 3.1.4 on RHEL/CentOS 7.3.

We are now also checking it on Ubuntu (VirtualBox on Windows 10).

dhitomi

2020-02-06 10:02

reporter   ~0011132

We see this problem only with the VoF solvers.

henry

2020-02-06 10:35

manager   ~0011133

Do you see the memory increase only when running in parallel or also serial?

dhitomi

2020-02-06 10:39

reporter   ~0011134

We are checking that now. Perhaps it is parallel-only, on both Ubuntu and RHEL/CentOS.
Please wait 12 hours.
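For reference, the serial-versus-parallel check can be scripted with the standard OpenFOAM workflow (decomposePar, then mpirun with the -parallel flag). A minimal sketch; the case path and rank count are placeholders, and running both modes in one directory is only for illustration:

    # A minimal sketch of a serial-vs-parallel comparison on one case.
    # Assumptions: the case path is a placeholder; decomposePar, mpirun and
    # the solver are on PATH; system/decomposeParDict requests 4 subdomains.
    import subprocess

    CASE = "damBreak4phaseFine"  # placeholder: path to a copy of the case

    # Serial run: launch the solver directly in the case directory.
    subprocess.run(["multiphaseInterFoam"], cwd=CASE, check=True)

    # Parallel run: decompose the mesh, then start one MPI rank per subdomain.
    subprocess.run(["decomposePar", "-force"], cwd=CASE, check=True)
    subprocess.run(
        ["mpirun", "-np", "4", "multiphaseInterFoam", "-parallel"],
        cwd=CASE, check=True,
    )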

dhitomi

2020-02-07 06:25

reporter   ~0011139

We are running many jobs to check this now, so please wait until next Wednesday.

henry

2020-02-07 09:47

manager   ~0011144

I ran the damBreak4phaseFine case to time 2000s overnight in OpenFOAM-6, 7 and dev with two different versions of OpenMPI and do not see a memory usage increase in any of the runs.

dhitomi

2020-02-07 10:36

reporter   ~0011145

On our machine, we do see the memory increase.
The tutorial case is small, so we are going to run it for about 5 days.
In our engineering case the memory increase is much more pronounced.
The memory used reaches several times the amount at the start.
Could you check our case on your machine?
We can see the increase within 2 or 3 hours when running on 16 processes.
Do you have any ideas?

henry

2020-02-07 10:47

manager   ~0011146

> On our machine, we do see the memory increase.

Have you tried another machine? How about testing on AWS?

> The tutorial case is small, so we are going to run it for about 5 days.

At 2000s you reported a doubling in memory usage; I see no increase at all at the same point.

> Could you check our case on your machine?

Given that you see an increase in the tutorial case and I do not, it is easier to start by resolving the issue on your machine with this standard case.

dhitomi

2020-02-07 11:08

reporter   ~0011147

I see.
We will try to do that.

Issue History

Date Modified Username Field Change
2020-02-05 06:57 dhitomi New Issue
2020-02-05 08:47 henry Priority immediate => normal
2020-02-05 08:48 henry Note Added: 0011115
2020-02-05 09:19 dhitomi Note Added: 0011116
2020-02-05 10:51 henry Note Added: 0011117
2020-02-06 00:17 dhitomi Note Added: 0011124
2020-02-06 00:49 dhitomi Note Added: 0011125
2020-02-06 00:50 dhitomi Note Added: 0011126
2020-02-06 09:35 henry Note Added: 0011128
2020-02-06 09:42 dhitomi Note Added: 0011129
2020-02-06 09:59 henry Note Added: 0011130
2020-02-06 10:02 dhitomi Note Added: 0011131
2020-02-06 10:02 dhitomi Note Added: 0011132
2020-02-06 10:35 henry Note Added: 0011133
2020-02-06 10:39 dhitomi Note Added: 0011134
2020-02-07 06:25 dhitomi Note Added: 0011139
2020-02-07 09:47 henry Note Added: 0011144
2020-02-07 10:36 dhitomi Note Added: 0011145
2020-02-07 10:47 henry Note Added: 0011146
2020-02-07 11:08 dhitomi Note Added: 0011147
2020-11-21 20:00 henry Assigned To => henry
2020-11-21 20:00 henry Status new => closed
2020-11-21 20:00 henry Resolution open => unable to reproduce