Running out of memory with electron project on VNC workspace

Hi,

I have an Electron/React app in development. For debugging the Electron integration I set up a VNC workspace. My Electron app runs fine in dev mode, but when I enable the Chrome DevTools in Electron, I run out of memory very quickly. I tried to change the JavaScript heap size on Electron startup, but this does not help.
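For reference, this is roughly how I tried to raise the heap size (a sketch; the 4096 value is just an example):

```javascript
// In the Electron main process, before the app is ready.
// --max-old-space-size only raises the V8 heap limit of the JS
// engine; it did not help with the crashes described here.
const { app } = require('electron');
app.commandLine.appendSwitch('js-flags', '--max-old-space-size=4096');
```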

Is there any way to get a Gitpod environment with more memory? Or does someone know a way to reduce my memory consumption?

Hi @davemecha,

Unfortunately, changing the memory of a Gitpod workspace for gitpod.io is not possible yet.


Thanks for your reply. This is a real problem for me (and maybe for other Gitpod users), since it means it’s just not possible to develop Electron apps with Gitpod that are larger than the React demo app with one view.

I created a demo repository with a basic React project and Electron. On startup, the workspace installs all dependencies, runs the dev server, and finally runs Electron.

Typically I would load the URL of the dev server, but the default React project is very small, so it would not trigger the out-of-memory error. So I just run the dev server in the background to make the memory consumption more realistic, and load an external URL into Electron. I found the Medium article describing how to use Gitpod for native UI development to be a good real-world example.
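The relevant part of the Electron entry point looks roughly like this (a sketch; the URL and window size are placeholders):

```javascript
const { app, BrowserWindow } = require('electron');

function createWindow() {
  const win = new BrowserWindow({ width: 1200, height: 800 });
  // Load an external page instead of the tiny dev-server app,
  // to make memory consumption more realistic.
  win.loadURL('https://medium.com/');
  // Opening the DevTools is what triggers the OOM the fastest.
  win.webContents.openDevTools();
}

app.whenReady().then(createWindow);
```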

After startup just open the VNC service on port 6080 in the browser.

The Electron app starts and the Chromium window crashes immediately (white screen). If I disable the dev tools in electron.js, the Medium page gets loaded but crashes a bit later, when navigating/scrolling.

This is not a very complex example, and since Gitpod is my main development environment, it’s a real problem for me that I run into this fatal issue. I assume Chromium is simply consuming too much memory. I already tried to tweak the heap size, but was not successful.

If anybody could help me get my development going again, that would be awesome. I really don’t want to switch to another cloud development solution just because of this problem.

I played a bit more with Chromium flags in Electron. I really don’t know what I’m doing and just did trial and error. With my changes, the app does not crash immediately, but after some time working it crashes as well. Unfortunately, after each crash of Electron, it crashes faster, and I have to spawn a new pod very often.

This is really no solution for regular development, but it makes small debugging sessions possible.

Hi @davemecha,

When your Electron app crashes, is that because Node.js cannot allocate more memory (i.e. some Node-specific error message), or because the kernel kills it (memory cgroup out of memory)?

Unfortunately, after each crash of Electron, it crashes faster, and I have to spawn a new pod very often.

This might indicate that the “crashed” processes aren’t actually gone, but are still consuming memory.
The next time that situation occurs, could you run ps guaxf and post the output here?


Hi @csweichel

Thanks for your reply.

When your Electron app crashes, is that because Node.js cannot allocate more memory (i.e. some Node-specific error message), or because the kernel kills it (memory cgroup out of memory)?

I’m not sure how to find that out. My error message looks like this:

[1] [2221:1106/114242.426079:FATAL:memory.cc(40)] Out of memory. size=262144

When I search for this error, it points to Chromium crashing. So I think it’s not a Node.js memory allocation issue.

I played with the repo a bit more and created a minimal-electron branch where I removed React and every dependency except Electron. It’s now just Electron displaying the Medium article.

On startup, Electron is started, and when I open the VNC in a browser window, I have some time to play around with it. When I maximize the window and scroll a bit, it quickly crashes.

When I start Electron a second time with ‘yarn electron’, it immediately crashes. After two crashes, this is my output:

USER         PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
gitpod         1  1.0  0.0 716172 17356 ?        Ssl  13:38   0:02 /theia/supervisor run
gitpod        14  4.2  0.2 409380 128800 ?       SNl  13:38   0:09 /theia/node/bin/gitpod-node ./src-gen/backend/main.js /workspace/react-gitpod-vnc-demo --port 23000 --hostname 0.0.0.0
gitpod      1090  0.1  0.0  16264 13352 pts/0    SNs  13:38   0:00  \_ /bin/bash -i
gitpod      3036  0.5  0.0 816232 53648 pts/0    SNl+ 13:41   0:00  |   \_ node /home/gitpod/.nvm/versions/node/v12.19.0/bin/yarn electron
gitpod      3047  0.0  0.0   2624   620 pts/0    SN+  13:41   0:00  |       \_ /bin/sh -c electron . --no-sandbox
gitpod      3048  0.1  0.0 554356 30188 pts/0    SNl+ 13:41   0:00  |           \_ /home/gitpod/.nvm/versions/node/v12.19.0/bin/node /workspace/react-gitpod-vnc-demo/node_modules/.bin/electron . --no-sandbox
gitpod      3055  1.5  0.1 4622600 114996 pts/0  SNl+ 13:41   0:01  |               \_ /workspace/react-gitpod-vnc-demo/node_modules/electron/dist/electron . --no-sandbox
gitpod      3057  0.0  0.0 202712 46748 pts/0    SN+  13:41   0:00  |                   \_ /workspace/react-gitpod-vnc-demo/node_modules/electron/dist/electron --type=zygote --no-zygote-sandbox --no-sandbox
gitpod      3160  0.5  0.0 260288 55892 pts/0    SNl+ 13:41   0:00  |                   |   \_ /workspace/react-gitpod-vnc-demo/node_modules/electron/dist/electron --type=gpu-process --field-trial-handle=8196463885852503476,8013963449214223543,131072 --enable-features=WebComponentsV0Enabled --disable-features=CookiesWithoutSameSiteMustBeSecure,SameSiteByDefaultCookies,SpareRendererForSitePerProcess --no-sandbox --gpu-preferences=MAAAAAAAAAAgAAAQAAAAAAAAAAAAAAAAAABgAAAAAAAQAAAAAAAAAAAAAAAAAAAACAAAAAAAAAA= --use-gl=swiftshader-webgl --shared-files
gitpod      3058  0.0  0.0 202712 46864 pts/0    SN+  13:41   0:00  |                   \_ /workspace/react-gitpod-vnc-demo/node_modules/electron/dist/electron --type=zygote --no-sandbox
gitpod      3090  3.4  0.1 4592216 114528 pts/0  SNl+ 13:41   0:02  |                   |   \_ /workspace/react-gitpod-vnc-demo/node_modules/electron/dist/electron --type=renderer --no-sandbox --field-trial-handle=8196463885852503476,8013963449214223543,131072 --enable-features=WebComponentsV0Enabled --disable-features=CookiesWithoutSameSiteMustBeSecure,SameSiteByDefaultCookies,SpareRendererForSitePerProcess --lang=en-US --app-path=/workspace/react-gitpod-vnc-demo --disable-electron-site-instance-overrides --num-raster-threads=4 --enable-main-frame-before-activation --renderer-client-id=6 --no-v8-untrusted-code-mitigations --shared-files=v8_snapshot_data:100
gitpod      3084  0.4  0.1 276424 67788 pts/0    SNl+ 13:41   0:00  |                   \_ /workspace/react-gitpod-vnc-demo/node_modules/electron/dist/electron --type=utility --utility-sub-type=network.mojom.NetworkService --field-trial-handle=8196463885852503476,8013963449214223543,131072 --enable-features=WebComponentsV0Enabled --disable-features=CookiesWithoutSameSiteMustBeSecure,SameSiteByDefaultCookies,SpareRendererForSitePerProcess --lang=en-US --service-sandbox-type=network --no-sandbox --shared-files=v8_snapshot_data:100
gitpod      1436  6.2  0.1 380700 98588 ?        SNl  13:38   0:13  \_ /theia/node/bin/gitpod-node /theia/node_modules/@theia/core/lib/node/messaging/ipc-bootstrap.js
gitpod      1624  0.1  0.0 711268  8616 ?        SNsl 13:38   0:00  \_ gp forward-port 6080 36080
gitpod      1629  0.0  0.0 711268  8108 ?        SNsl 13:38   0:00  \_ gp forward-port -r 5900 35900
gitpod      1672  0.3  0.0 287476 53800 ?        SNl  13:38   0:00  \_ /theia/node/bin/gitpod-node /theia/node_modules/@theia/json/lib/node/json-starter --stdio
gitpod      1725  0.4  0.1 338536 62116 ?        SNl  13:38   0:00  \_ /theia/node/bin/gitpod-node /theia/node_modules/@theia/plugin-ext/lib/hosted/node/plugin-host.js
gitpod      2185  0.0  0.0  12936  9960 pts/1    SNs  13:39   0:00  \_ /bin/bash -i
gitpod      3418  0.0  0.0   9040  3384 pts/1    RN+  13:42   0:00      \_ ps guaxf
gitpod       311  0.7  0.0 2493552 48912 ?       SNl  13:38   0:01 Xvfb -screen 0 1440x800x16 -ac -pn -noreset
gitpod       312  0.0  0.0  68900 18248 ?        SN   13:38   0:00 openbox
gitpod       351  1.1  0.0  36744 14408 ?        SNs  13:38   0:02 x11vnc -localhost -shared -display :0 -forever -rfbport 5900 -bg -o /tmp/x11vnc-0.log
gitpod       352  0.0  0.0   6984  1936 ?        SN   13:38   0:00 /bin/bash /usr/bin/start-vnc-session.sh
gitpod       353  0.0  0.0   7116  3488 ?        SN   13:38   0:00  \_ bash ./launch.sh --vnc localhost:5900 --listen 6080
gitpod       367  0.1  0.0  24532 18872 ?        SN   13:38   0:00      \_ /home/gitpod/.pyenv/versions/3.8.6/bin/python -m websockify --web /opt/novnc/utils/../ 6080 localhost:5900
gitpod      1638  0.1  0.0      0     0 ?        ZNs  13:38   0:00 [node] <defunct>
gitpod      1829  0.0  0.0      0     0 pts/0    ZN   13:39   0:00 [sh] <defunct>
gitpod      1839  0.0  0.0      0     0 pts/0    ZN   13:39   0:00 [electron] <defunct>
gitpod      1840  0.0  0.0      0     0 pts/0    ZN   13:39   0:00 [electron] <defunct>
gitpod      1900  0.2  0.0      0     0 pts/0    ZN   13:39   0:00 [electron] <defunct>
gitpod      1905  1.5  0.0      0     0 pts/0    ZN   13:39   0:03 [electron] <defunct>
gitpod      1977  0.6  0.0      0     0 pts/0    ZN   13:39   0:01 [electron] <defunct>
gitpod      2811  0.0  0.0      0     0 pts/0    ZN   13:41   0:00 [electron] <defunct>
gitpod      2812  0.0  0.0      0     0 pts/0    ZN   13:41   0:00 [electron] <defunct>

Besides this output, a huge core dump file is created on each crash:

-rw-------  1 gitpod gitpod 460386304 Nov  6 13:40 core.Compositor.1904.1604670014
-rw-------  1 gitpod gitpod 340447232 Nov  6 13:41 core.ThreadPoolForeg.2842.1604670064
-rw-------  1 gitpod gitpod 340930560 Nov  6 13:41 core.ThreadPoolForeg.3089.1604670079

You should try it for yourself.

Uhhh Yeah!!! I found a solution to this mess… :partying_face: :partying_face: :partying_face: :partying_face: :partying_face:

Inspired by this Stack Overflow post on running Chromium in Docker, the solution is simply to disable Chromium’s use of /dev/shm on Electron startup.

Just add
app.commandLine.appendSwitch('disable-dev-shm-usage');
to your Electron app on startup (for real-world apps, make sure you only apply this in dev mode).
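For example, one way to guard the switch so it only applies outside packaged builds (app.isPackaged is false in development) would be something like this sketch:

```javascript
const { app } = require('electron');

// /dev/shm in the container is too small for Chromium's shared
// memory buffers, so tell Chromium to fall back to /tmp, but
// only while developing.
if (!app.isPackaged) {
  app.commandLine.appendSwitch('disable-dev-shm-usage');
}
```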

I added a solution branch to my demo repository. Check this out…

@csweichel
Maybe you could provide more shared memory with the Gitpod containers, to make this switch unnecessary.

This might be a solution for other uses of headless Chromium in Gitpod (e.g. Puppeteer) as well.
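For Puppeteer, the equivalent would presumably be passing the same Chromium flag at launch (an untested sketch; example.com is a placeholder):

```javascript
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({
    // Same workaround: don't back shared memory with /dev/shm.
    args: ['--disable-dev-shm-usage'],
  });
  const page = await browser.newPage();
  await page.goto('https://example.com');
  await browser.close();
})();
```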

Hi @davemecha, many thanks for sharing your solution here! :100:

To expand a bit on this, Gitpod workspaces currently enjoy up to 12 GB of RAM before the OOM-killer might kick in. This is a lot.

Before that, we went from 3 GB to 8 GB of RAM per workspace, and while this removed virtually all the user OOM-kills we were observing in production workspaces, 8 GB was still not quite enough for some rare use cases (e.g. compiling/linking a very large Rust project). Bumping that to 12 GB solved these as well.

Now, is it really the case that debugging Electron in a VNC workspace uses more than 12 GB of RAM?

Or maybe there is another limit in play, e.g. in Electron or Chromium itself, that makes your app crash? I suspect it’s the latter, because of your quoted error message:

[1] [2221:1106/114242.426079:FATAL:memory.cc(40)] Out of memory. size=262144

The “size” here seems to be 262 MB, which is very small compared to 12 GB and makes me think that something in your application is causing data to accumulate in some specific constrained Chromium buffer, which then gets filled up and crashes Chromium.

I’m also not exactly sure what disable-dev-shm-usage does, but it’s excellent that it solves your issue. I also don’t think that offering more than 12 GB of RAM to workspaces would have solved it anyway.

Hey @jan, thanks for your reply.

I see that you may be right and the issue is not about the memory of the pod, since pods have 12 GB of RAM. (Even so, the argument about the failed allocation at 262 MB is not very strong, because the memory might have been consumed by other processes, and the error is only displayed by the process that fails. The message could say 1 MB and it would still tell us nothing about the total memory available on the system.)

But I think there is a good reason for digging into this problem as well. When you check out my example workspace, it becomes obvious that it has nothing to do with Electron, let alone my application.

It is an issue of Chromium and the availability of shared memory, and it is reproducible for other applications of headless Chromium as well (like the mentioned Puppeteer). So if someone tries to fire up a headless Chromium in Gitpod for testing, they will probably run into the same issues I did.

When I check out this (old) Chromium issue, I get an idea where the problem might have its roots. Maybe /dev/shm is too small, or it has something to do with how it’s mounted. - I’m really not very into this low-level Linux stuff, but maybe there is some way Gitpod workspaces could be made more robust for other headless Chromium users as well, not just Electron with VNC.
