
Significant memory usage increase starting from 20.10.0 for a simple HTTP server (potentially stream related) #52228

Closed
poolik opened this issue Mar 27, 2024 · 2 comments · Fixed by #53188
Labels
http Issues or PRs related to the http subsystem. memory Issues and PRs related to the memory management or memory footprint. stream Issues and PRs related to the stream subsystem.

Comments


poolik commented Mar 27, 2024

Version

v20.10.0

Platform

Darwin arm64

Subsystem

stream

What steps will reproduce the bug?

A simple HTTP server that returns a ~1 MB JSON file and reports memory usage every 5 s. Starting with Node.js 20.10.0, memory consumption jumps significantly when servicing requests (see below).

const http = require('http');
const process = require('process');
const port = 3000;

// Use require to load the JSON data once at startup
const jsonData = require('./data.json');

setInterval(() => {
    const memoryUsage = process.memoryUsage();
    console.log(`Memory Usage: 
        RSS: ${memoryUsage.rss / 1024 / 1024}MB,
        Heap Total: ${memoryUsage.heapTotal / 1024 / 1024}MB,
        Heap Used: ${memoryUsage.heapUsed / 1024 / 1024}MB,
        External: ${memoryUsage.external / 1024 / 1024}MB,
        Array Buffers: ${memoryUsage.arrayBuffers / 1024 / 1024}MB`);
}, 5000); // Log memory usage every 5 seconds

http.createServer((req, res) => {
  // Respond to any GET request with the preloaded JSON
  if (req.method === 'GET') {
    // Set the content type to application/json
    res.setHeader('Content-Type', 'application/json');
    res.writeHead(200);

    // Use JSON.stringify to convert the JSON object to a string
    res.end(JSON.stringify(jsonData));
  } else {
    // Reject any non-GET requests
    res.writeHead(404);
    res.end('Not Found');
  }
}).listen(port, () => {
  console.log(`Server running at http://localhost:${port}/`);
});

The randomly generated JSON I used for this test:
data.json

The k6 load test script (named load_test.js) that sends requests to show the increase (run with k6 run load_test.js):

import http from 'k6/http';
import { sleep } from 'k6';

export const options = {
  // A number specifying the number of VUs to run concurrently.
  vus: 333,
  // A string specifying the total duration of the test run.
  duration: '900s',
};

export default function() {
  const res = http.get('http://localhost:3000/');
  sleep(1);
}
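
For scale: 333 VUs each sleeping 1 s between requests give roughly 333 requests/s, and at ~1 MB per response that is on the order of 333 MB/s of response data flowing through the server. If each response is held in memory even ~1 s longer before being released, that alone accounts for hundreds of megabytes of extra heap, which matches the jump shown below.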

How often does it reproduce? Is there a required condition?

Reproduces always.

What is the expected behavior? Why is that the expected behavior?

Using Node.js 20.9.0 (the last good version), I see the following memory usage reported:

Memory Usage:
        RSS: 122.34375MB,
        Heap Total: 50.34375MB,
        Heap Used: 20.336807250976562MB,
        External: 1.5704345703125MB,
        Array Buffers: 0.009982109069824219MB
Memory Usage:
        RSS: 170.53125MB,
        Heap Total: 97.90625MB,
        Heap Used: 71.03762817382812MB,
        External: 1.5704345703125MB,
        Array Buffers: 0.009982109069824219MB
Memory Usage:
        RSS: 139.125MB,
        Heap Total: 68.015625MB,
        Heap Used: 40.76661682128906MB,
        External: 1.5704345703125MB,
        Array Buffers: 0.009982109069824219MB
Memory Usage:
        RSS: 126.8125MB,
        Heap Total: 52.875MB,
        Heap Used: 23.89117431640625MB,
        External: 1.5704345703125MB,
        Array Buffers: 0.009982109069824219MB
Memory Usage:
        RSS: 129.109375MB,
        Heap Total: 56.578125MB,
        Heap Used: 33.01460266113281MB,
        External: 1.5704345703125MB,
        Array Buffers: 0.009982109069824219MB
Memory Usage:
        RSS: 156.171875MB,
        Heap Total: 83.65625MB,
        Heap Used: 55.899253845214844MB,
        External: 1.5704345703125MB,
        Array Buffers: 0.009982109069824219MB

What do you see instead?

Starting from Node.js 20.10.0, memory usage jumps considerably:

Memory Usage:
        RSS: 453MB,
        Heap Total: 401.453125MB,
        Heap Used: 374.45359802246094MB,
        External: 1.6024627685546875MB,
        Array Buffers: 0.009982109069824219MB
Memory Usage:
        RSS: 466.4375MB,
        Heap Total: 416.8125MB,
        Heap Used: 386.71961975097656MB,
        External: 1.6024627685546875MB,
        Array Buffers: 0.009982109069824219MB
Memory Usage:
        RSS: 986.40625MB,
        Heap Total: 936.75MB,
        Heap Used: 894.882453918457MB,
        External: 1.6024627685546875MB,
        Array Buffers: 0.009982109069824219MB
Memory Usage:
        RSS: 992.125MB,
        Heap Total: 941.171875MB,
        Heap Used: 900.2207260131836MB,
        External: 1.6024627685546875MB,
        Array Buffers: 0.009982109069824219MB
Memory Usage:
        RSS: 1316.484375MB,
        Heap Total: 1266MB,
        Heap Used: 1222.880615234375MB,
        External: 1.6024627685546875MB,
        Array Buffers: 0.009982109069824219MB

Additional information

We discovered this jump in memory consumption when we upgraded one of our microservices in production. The service works fine with 20.9.0, but starting with 20.10.0 it began exceeding its allotted Kubernetes memory limits.

Comparing heap snapshots between the good version (20.9.0) and the bad one (20.10.0), I saw that 20.10.0 had a lot of string objects on the heap that were unsent responses. Screenshot from the Chrome debugger:
[Screenshot from Chrome debugger]
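
Snapshots like this can also be captured programmatically with Node's built-in v8.writeHeapSnapshot(), which makes the old-vs-new comparison repeatable (a sketch, not part of the original workflow):

const v8 = require('v8');

// Write a .heapsnapshot file whenever the process receives SIGUSR2.
// Load the files from both Node versions into Chrome DevTools > Memory
// and use the "Comparison" view to diff retained objects.
process.on('SIGUSR2', () => {
  const file = v8.writeHeapSnapshot(); // returns the generated filename
  console.log(`Heap snapshot written to ${file}`);
});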

It looks like 20.10.0 started buffering responses a bit longer, which under normal request load manifested as increased memory consumption for the server.
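
A cheap way to watch for this kind of buffering (a sketch, not from the report, and it only shows bytes still queued for the socket, i.e. backpressure, not every form of retention): res implements the writable-stream interface, and res.writableLength reports how many bytes are buffered but not yet written out.

http.createServer((req, res) => {
  res.setHeader('Content-Type', 'application/json');
  res.writeHead(200);
  res.end(JSON.stringify(jsonData));
  // Bytes still queued inside the response stream right after end();
  // persistently large values across many sockets mean responses are
  // sitting in server memory instead of draining to clients.
  console.log(`buffered after end(): ${res.writableLength} bytes`);
}).listen(port);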

After briefly going over the changelog for 20.10.0, I suspect it might be related to this change: #50014 (but I haven't verified this, and I don't know the internals of Node.js that well).

@RedYetiDev added the http, stream, and memory labels Apr 20, 2024
Contributor

orgads commented May 28, 2024

@poolik #50014 is unrelated to this. I bisected to:

orgads added a commit to orgads/node that referenced this issue May 28, 2024
Amends 35ec931 (stream: writable state bitmap).

Fixes nodejs#52228.
orgads added a commit to orgads/node that referenced this issue May 29, 2024
Setting writecb and afterWriteTickInfo to null did not clear the value
in the state object.

Amends 35ec931 (stream: writable state bitmap).

Fixes nodejs#52228.
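To make the commit message above concrete, here is a minimal sketch of the bug class it describes (a reconstruction for illustration, not the actual lib/internal/streams code): after the writable-state bitmap refactor, writecb became an accessor backed by a presence bit plus a stored slot, and assigning null flipped the bit without clearing the slot, so the old callback, and everything it closed over, stayed reachable.

const kHasWritecb = 1 << 0;
const kWritecbValue = Symbol('writecb');

class WritableState {
  constructor() {
    this.state = 0; // bitmap packing many boolean flags into one integer
  }
  get writecb() {
    return (this.state & kHasWritecb) !== 0 ? this[kWritecbValue] : null;
  }
  set writecb(value) {
    if (value !== null) {
      this.state |= kHasWritecb;
      this[kWritecbValue] = value;
    } else {
      this.state &= ~kHasWritecb;
      // BUG: this[kWritecbValue] is not cleared here, so the previous
      // callback (and any response data it closes over) remains reachable
      // from the state object until the next write overwrites it.
    }
  }
}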
Contributor

orgads commented May 29, 2024

PR ready:

targos pushed a commit that referenced this issue Jun 3, 2024
Setting writecb and afterWriteTickInfo to null did not clear the value
in the state object.

Amends 35ec931 (stream: writable state bitmap).

Fixes #52228.

PR-URL: #53188
Reviewed-By: Benjamin Gruenbaum <[email protected]>
Reviewed-By: Matteo Collina <[email protected]>
Reviewed-By: Luigi Pinca <[email protected]>
Reviewed-By: Vinícius Lourenço Claro Cardoso <[email protected]>
robintown added a commit to toaq/toadua that referenced this issue Aug 3, 2024
So that we get the fix for nodejs/node#52228 added in Node 22.3.0.