node.js JS heap out of memory error - regardless of max-old-space-size setting

I am using a JS implementation of the Radix-4 FFT (this one: https://www.npmjs.com/package/fft.js). I'm analyzing sequences of 2^25 floats. The package works fine on the browser side with a signal of this size, but in Node.js it crashes:

FATAL ERROR: invalid array length Allocation failed - JavaScript heap out of memory

[19262:0x559417f70870] 20352 ms: Scavenge 634.2 (656.1) -> 633.5 (671.6) MB, 65.3 / 0.0 ms allocation failure




The process only reaches about 500-600 MB before crashing, and changing --max-old-space-size doesn't make any difference. It's possible that this FFT implementation isn't great in terms of memory management; I just wanted to try it before going back to an external script (e.g. scipy) to do the FFT. Either way, I'd like to know what wall I'm hitting here. It's probably a very simple one I'm not aware of.
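
For reference, I'm passing the flag on the command line in the usual way (the script name here is just a placeholder), with values from 16 GB up to 32 GB, e.g.:

    node --max-old-space-size=32768 my-fft-script.js

so the configured old-space limit is far above the ~600 MB the process actually reaches before it dies.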



Thanks





RESOLVED by changing from plain to typed arrays in the FFT implementation (sketched below) - the Node process then uses more memory overall, but the individual objects are smaller.
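
For anyone hitting the same wall, the fix amounts to swapping the library's plain Array work buffers for typed arrays. A minimal sketch of what that change looks like (variable names are illustrative, not fft.js's actual internals):

    // FFT size and the length of the interleaved complex buffer (re, im pairs)
    const N = 2 ** 25;
    const LEN = 2 * N;

    // what the library was doing: plain JS Arrays for its work/output buffers;
    // once filled, the element storage for 2^26 numbers is one very large object
    // on the V8 heap, which is apparently what trips the limit here
    const out = new Array(LEN);

    // the change: typed arrays instead - the data sits in a single contiguous
    // backing store (512 MB for Float64Array, 256 MB for Float32Array at this
    // length) rather than in one giant heap Array
    const outTyped = new Float64Array(LEN);
    const outTyped32 = new Float32Array(LEN);

Whether Float32Array is enough depends on how much precision the analysis needs.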










javascript node.js signal-processing






edited Nov 25 '18 at 19:37 by hotpaw2
asked Nov 25 '18 at 17:44 by Caharpuka













  • Well if it's telling you it's out of memory, it's out of memory. How exactly are you storing those 2^25 "floats"? JavaScript numbers are 8 bytes long, so that's actually 2^28 bytes, or 256 megabytes. You could be running into per-process limitations of your underlying operating system, or maybe (as you say) something is trying to make one or more copies or mappings from that array. There are lots of ways the code could consume more memory than expected.

    – Pointy
    Nov 25 '18 at 17:51

  • What value are you using for --max-old-space-size?

    – Pointy
    Nov 25 '18 at 18:00

  • I'm using Float32Array, but the implementation just uses Array (apparently for backwards compatibility). And yes, I believe it's crashing when it creates one of these arrays, as they are twice the FFT size and 64-bit, so 512 MB. And that's just at the first copy. I've tried 16-32 GB for --max-old-space-size. The server has 64 GB of memory.

    – Caharpuka
    Nov 25 '18 at 18:01

  • OK right, well if you start off with the typed array, that's 128 MB, and then it copies that to a plain array, so that's another 256 MB, and so on.

    – Pointy
    Nov 25 '18 at 18:04

  • It still seems to be cutting out before reaching the actual memory limit. Could this be because of the size of the individual objects?

    – Caharpuka
    Nov 25 '18 at 18:09
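
Putting the numbers from the comments together, a quick back-of-envelope check (plain arithmetic, not library code):

    const N = 2 ** 25;               // input length
    const MB = 1024 * 1024;

    console.log((N * 4) / MB);       // Float32Array input:                     128 MB
    console.log((N * 8) / MB);       // the same values as 64-bit numbers:      256 MB
    console.log((2 * N * 8) / MB);   // interleaved complex (re, im) at 64-bit: 512 MB

A single 64-bit copy of the complex buffer already accounts for most of the ~600 MB the process reaches before it dies.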


















