Today’s CVE drop: a dozen vulnerabilities in vm2, the popular Node.js sandbox library, three of them at CVSS 9.8 — full sandbox escape, arbitrary code execution on the host. CVE-2026-24118 leverages JavaScript’s __lookupGetter__ to break out. CVE-2026-24120 bypasses a 2023 patch through promise species property exploitation. The individual bug names matter less than the pattern: vm2 has yet again failed at being a security boundary, and the hard truth is that it was probably always going to.
If you’re using vm2 today to “safely run untrusted JavaScript,” it is time — past time — to rip it out. This is the contrarian-but-correct take.
What vm2 promised, and why it can’t deliver
vm2 sits in the same Node.js process as your app and intercepts JavaScript object lookups to prevent the sandboxed code from touching the host. This is genuinely clever engineering. It’s also, in retrospect, a doomed approach. JavaScript’s prototype chain has too many escape hatches: __proto__, constructor, __lookupGetter__, async-context leaks, error-stack property accesses, the species mechanism, the iterator protocol. Every one of these has been used in a vm2 escape over the years. The library’s history is a several-year game of whack-a-mole: a researcher finds an escape, vm2 patches it, six months later someone finds the next one, repeat.
The maintainer was clear-eyed enough about this trajectory that he discontinued the project in 2023. The vm2 README on npm openly says it’s deprecated and recommends isolated-vm. That’s been there for two years. People kept using vm2 anyway because it was already in their dependencies, the migration to a different sandbox model would be work, and there was always the comforting hope that “the next patch will be the last one.”
Today’s CVE drop is the answer to that hope.
Why this class of bug keeps coming back
Same-process JavaScript sandboxes are fundamentally trying to enforce a security boundary at the language level inside a runtime that has no enforcement primitives at the language level. Node.js doesn’t isolate the V8 contexts the way browsers isolate cross-origin frames; vm2’s “context” is a JS object the V8 engine itself doesn’t treat specially. Every interaction across that boundary — every property lookup on a sandboxed object, every error thrown from sandboxed code — passes through V8 with full access to host objects. The library’s job is to intercept all of those interactions and check them. Miss one, and the boundary is gone.
How many interactions are there to check? Loosely: an unbounded number, because the prototype chain is recursive and JavaScript adds new ones with every spec revision. The species mechanism — Promise[Symbol.species] — exploited in CVE-2026-24120, was added to the language in ES2015. It took eleven years for someone to find a vm2 escape using it. There are surely more.
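To make the species mechanism concrete, here is a minimal, non-exploit sketch of what Symbol.species does: it lets subclass code tell built-in methods which constructor to use for derived objects — exactly the kind of indirect, spec-defined hook a same-process sandbox has to remember to intercept:

```javascript
// Symbol.species lets a Promise subclass redirect which constructor
// built-in methods like then() use for the promises they return.
class LoggedPromise extends Promise {
  // then()/catch() consult this static accessor via the species protocol.
  static get [Symbol.species]() {
    return Promise; // derived promises fall back to plain Promise
  }
}

const p = LoggedPromise.resolve(1).then((x) => x + 1);
console.log(p instanceof LoggedPromise); // false: species redirected construction
console.log(p instanceof Promise);       // true
```

A sandbox that wraps Promise but forgets this one accessor hands attacker code a constructor-selection hook it never vetted.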
What “untrusted JavaScript” actually means in practice
Three categories of vm2 user, in order of “yes you must migrate”:
- Multi-tenant code execution platforms — Replit-likes, online code interpreters, bot frameworks that run user-submitted JavaScript. If you’re running vm2 here, you have a remotely exploitable RCE today. Migrate this week.
- Plugin systems for your own product — your users write JavaScript that extends your SaaS. The “untrusted” parties are your customers, who shouldn’t be able to compromise the host. Same urgency as multi-tenant: migrate.
- Internal “dynamic config” or rule engines — vm2 used to evaluate JavaScript expressions in a config that only your team writes. Lower urgency, but the bigger question is whether vm2 was the right tool here at all. A real expression-evaluation library (expr-eval, jsep) is dramatically smaller, doesn’t try to be a sandbox, and won’t surprise you with a CVE drop.
The migration paths that actually work
- isolated-vm — same author as vm2’s recommended successor. Spawns a separate V8 isolate, which the V8 engine genuinely separates at the C++ level. Real boundary, real cost — slower startup per sandbox, more memory per call. Worth it.
- Subprocess + IPC — fork a Node process per untrusted run, with seccomp-bpf profile or a Linux user namespace, kill it after a timeout. More moving parts; ironclad isolation; no library-level CVE class to worry about.
- WebAssembly runtime — compile your sandboxed code to Wasm, run it in wasmtime or wasmer. Wasm has actual designed-from-day-one isolation. The trade-off is you’re no longer running JavaScript; you’re running whatever-language-compiles-to-Wasm.
- Don’t run untrusted code at all — if your “rule engine” use-case is just “evaluate this small expression,” use a small expression library. Never have to think about sandbox escapes again.
The check you can run today
cd ~/your-project
npm ls vm2
# If anything shows up — even transitively as a dependency-of-a-dependency
# — you have exposure. Check whether the dependency chain leads back to
# something running untrusted code.
# For monorepos / fleets:
find . -name package.json -exec grep -l '"vm2"' {} \;
If vm2 is in your tree only as a transitive dependency of a dev tool that compiles your TypeScript, you’re probably fine — that path doesn’t run untrusted JavaScript at runtime. If vm2 is in your production runtime path and the input to it can come from anywhere outside your team, you need to migrate.
The lesson, generalised
Same-process language-level sandboxes are very rarely the right tool for genuinely untrusted code. Browsers don’t sandbox cross-origin code by intercepting JS property lookups; they use process isolation and the operating system. Server-side, your toolbox is the same — process boundaries, namespaces, seccomp, Wasm runtimes. Anything that promises “we’ll keep you safe by checking property accesses” is, in retrospect, an optimistic claim that the language doesn’t have an undiscovered escape route.
JavaScript has too many. vm2 today; whatever-comes-next tomorrow. The CVE drops are the reminder. Migrate.
Source: The Hacker News — vm2 Node.js Library Vulnerabilities Enable Sandbox Escape and Arbitrary Code Execution (May 7, 2026).
