You call new Object(). 8 nanoseconds later, V8 has carved out space on the heap,
updated a pointer, and moved on. What actually happened — and how does any of it get cleaned up?
In the Trimurti metaphor, Brahma is allocation: fresh space is shaped into objects, arrays, strings, hidden classes, and environments. Vishnu is preservation: references form a sustaining graph, so long as some root can still point toward a value. Shiva, through V8's Orinoco collector, is not random destruction but lawful dissolution: the unreachable are marked absent from the living graph, swept away, and sometimes compacted so creation can begin again efficiently.
Chapter 01 — The Memory Landscape
Stack, Heap, and the Graph That Decides Life or Death
JavaScript memory is not one flat bag of bytes. The engine distinguishes transient execution state from long-lived dynamic allocations, and the collector reasons over reachability rather than "usage" in the human sense. If you understand the stack, the heap, and the reference graph, most GC behavior stops feeling magical.
The stack is the engine's short-term ledger. Each function call pushes an activation record containing bindings, return addresses, and metadata needed for execution. When the call returns, the frame is popped. That means stack memory is fast, predictable, and naturally reclaimed by control flow rather than by the garbage collector.
The heap is where dynamic structures live: objects, arrays, functions, closure environments, Maps, Sets, promises, and engine-internal backing stores. Heap allocation is more flexible and more expensive. The heap is also where fragmentation, promotion, remembered sets, write barriers, and compaction start to matter.
Neither stack nor heap alone determines whether an object lives. What matters is the reference graph. A global variable can point to an object. That object can point to another object. A closure can keep an environment alive. A DOM listener can keep a detached subtree alive by accident. GC begins from roots and walks edges; values not reachable from that walk are candidates for collection.
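The reachability rule can be made concrete with a tiny sketch. The names here are invented for illustration; `root` stands in for any GC root such as a global or an active stack binding.

```javascript
// Reachability is about paths from roots, not about "use".
let root = { child: { data: "still needed?" } };

let alias = root.child; // a second path to the same object
root = null;            // first path severed

// The child object is still reachable through `alias`,
// so the collector must keep it alive.
console.log(alias.data); // "still needed?"

// Only after the last path is cut does it become collectible:
// alias = null; // now no root can reach it
```

Note that severing one edge changed nothing: the collector cares only whether at least one path from the root set remains.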
Young Generation
Most objects die young. V8 leans into that empirical truth by placing fresh allocations into new space, typically split into semi-spaces for copying collection. This region is deliberately small so scavenges are cheap and frequent. If an object becomes garbage quickly, the engine recovers it with minimal work.
Minor GC in the young generation is optimized for throughput. Rather than tracing the entire heap, the scavenger copies live objects from one semi-space to the other and discards the rest wholesale. Because the region is compact and many objects die immediately after allocation, this strategy is usually faster than full-heap sweeping.
Surviving multiple minor collections has a cost signal: the object is probably not temporary. At that point V8 may promote it into old space. That promotion is one reason long-lived arrays, closure environments, and app-level state deserve scrutiny: once promoted, they participate in heavier major GC cycles.
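The contrast between die-young temporaries and promotion candidates can be sketched directly. This is an illustrative pattern, not a way to observe V8's spaces from JavaScript:

```javascript
// Short-lived: each `point` becomes garbage on the next iteration,
// so a scavenge can discard the whole batch cheaply from new space.
let sum = 0;
for (let i = 0; i < 100_000; i++) {
  const point = { x: i, y: i * 2 }; // dies young
  sum += point.x + point.y;
}

// Long-lived: every object pushed here survives scavenges and is a
// promotion candidate for old space, where major GC governs it.
const retained = [];
for (let i = 0; i < 1_000; i++) {
  retained.push({ x: i, y: i * 2 });
}

console.log(sum, retained.length); // 14999850000 1000
```

The first loop creates a hundred times more objects, yet the second loop is the one that exerts lasting heap pressure.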
Old Generation
Old space houses survivors: application state, caches, long-lived closures, framework internals, and persistent DOM-adjacent structures. The collector assumes these objects have higher survival rates, so it uses a more sophisticated tracing strategy with incremental and concurrent work to keep pause times down.
Because old space can become fragmented, simply sweeping dead objects is not always enough. Allocators prefer contiguous blocks; many survivors spread across the heap create holes. That is why major GC may compact selected pages, relocating live objects and updating references so future allocations remain efficient.
Promotion is not a badge of honor. If a bug holds onto event handlers, detached nodes, or overgrown caches, the leak often graduates into old space and becomes harder to notice because it dies slowly, not explosively. A stable but rising old-space footprint is one of the clearest production signals that some reference chain is never being cut.
Large Object Space
Some allocations are too large or too awkward for normal copying behavior. V8 uses large object space for values whose size would make frequent movement expensive. These allocations usually get their own chunks rather than living inside compact semi-spaces.
The tradeoff is straightforward: moving huge allocations around to compact memory can cost more than the fragmentation it solves. Large object space therefore behaves differently from the main young-generation story. You still need reachability for collection, but you should not assume the same cheap copy-and-discard mechanics apply.
In practice, giant arrays, oversized buffers, image data, and accidental mega-caches show up here. This is one reason "memory leak" does not always mean a million tiny objects. A small number of very large survivors can pressure the heap immediately and trigger major collection far earlier than expected.
Chapter 02 — Call Stack & Primitive Values
The Stack Is Yama's Perfect Ledger
Yama does not guess who is alive; he keeps exact account. The call stack has the same temperament. Every active execution context is recorded in order, every return unwinds cleanly, and nothing remains on the ledger once a frame has fulfilled its duty.
A JavaScript activation record contains more than local variables. Engines track scope links, operand state, return locations, and metadata required for deoptimization or exception handling. In optimized code, bindings may be stored in registers or stack slots differently from what source code suggests, but conceptually the frame still represents the currently active execution context.
Stack memory is fast because allocation is almost trivial: move the stack pointer. Reclaiming it is equally trivial:
move the pointer back when the function exits. That elegance is why short-lived primitives and call metadata belong here.
But speed comes with a limit. Stack depth is bounded, so runaway recursion produces a RangeError
even if the heap is mostly empty.
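A minimal demonstration, assuming a V8-based runtime such as Node, where exhausting stack depth raises a RangeError regardless of heap occupancy:

```javascript
// Unbounded recursion exhausts stack frames, not heap memory.
function descend(depth) {
  return descend(depth + 1); // each call pushes a new frame
}

let failure = null;
try {
  descend(0);
} catch (err) {
  failure = err; // "Maximum call stack size exceeded" in V8
}

console.log(failure instanceof RangeError); // true
```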
One persistent misconception is that "small objects live on the stack." In JavaScript, that is a misleading transplant from C-like languages. The binding itself is part of the frame, but regular objects and arrays live in heap-managed memory. What the stack stores for them is the reference, not the full object body.
let x = 42;          // primitive lives in the current stack frame
let s = "hello";     // primitive value tracked inline by the engine
let obj = { x: 1 };  // object body allocated on the heap
let arr = [1, 2, 3]; // array elements live in heap backing storage

// The variables x, s, obj, and arr are bindings in the stack frame.
// Only obj and arr point outward to heap allocations.

Strings deserve a nuance. JavaScript presents them as primitive values, but engines often optimize their backing representation in specialized ways: flat strings, cons strings, slices, or interned strings. None of that changes the language-level rule you reason with: primitive bindings behave value-like, while objects participate in reference reachability.
Stack overflow and garbage collection are therefore different failure modes. Overflow means execution nested too deeply. GC pressure means heap residency grew or collections became too expensive. Confusing the two leads to bad debugging instincts. If the error says maximum call stack size exceeded, the first suspect is recursive control flow, not leaked heap objects.
A useful mental model is this: the stack is the temporary script of the play, while the heap is the backstage warehouse. Frames are torn down as scenes finish; props remain only if some active actor or global registry still points to them.
Chapter 03 — Heap Allocation & Reference Chains
The Heap Preserves What the Graph Still Names
Vishnu sustains the world not by clutching everything equally, but by maintaining the web that keeps form coherent. References play that role in V8: if a path from a root to an object still exists, the object remains within the preserved cosmos.
Heap allocation begins when the engine needs dynamic storage for a value whose size, identity, or lifetime cannot be modeled as a simple inline binding. An object literal allocates a header plus property storage. An array allocates a header plus an elements backing store. A function allocates executable metadata, context links, and often a shared function info structure behind the scenes.
Reachability is the decisive property, not raw usage. If you no longer "care" about an object but some Map, closure, DOM listener, promise callback, or module singleton still references it, the collector must treat it as live. That is why tracing GC replaced naive reference counting: cycles like two objects pointing to each other would never free under pure counts, but tracing from roots can see that an isolated cycle is still unreachable.
Closures make this story concrete. When an inner function uses a variable from an outer scope, the engine cannot discard that binding with the outer frame. Instead, it lifts the relevant lexical environment into heap-managed storage so the returned function can keep reading and mutating it after the caller has already returned.
function makeCounter() {
  let count = 0; // env object lives on heap
  return function next() {
    return ++count;
  };
}

const counter = makeCounter();
counter(); // 1
counter(); // 2
The subtle cost of closures is not that they are inherently bad. It is that they extend lifetime. A closure capturing a large array, a DOM subtree, or a service object can keep all of it alive long after the original function finished. This is usually correct behavior, but it is also a common source of accidental retention in UI code and long-lived event systems.
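The lifetime-extension cost can be shown with a hedged sketch. The function names are invented; the point is what the returned closure's environment must keep alive:

```javascript
// ❌ Retention: the returned closure captures the entire rows array,
// so all 100k objects stay alive as long as the closure is reachable.
function summarizeBad() {
  const rows = new Array(100_000).fill(0).map((_, i) => ({ id: i }));
  return () => rows.length;
}

// ✅ Fix: compute what you need first and capture only that.
function summarizeGood() {
  const rows = new Array(100_000).fill(0).map((_, i) => ({ id: i }));
  const count = rows.length; // one number
  return () => count;        // rows becomes unreachable after return
}

const report = summarizeGood();
console.log(report()); // 100000
```

Both versions return the same answer; they differ only in what the closure's environment retains.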
V8 invests heavily in object layout predictability. Properties added in consistent order allow the engine to assign hidden classes or shapes, which help optimize inline caches and property access. The memory model and the execution model meet here: predictable heap layout does not just save time at allocation, it also improves the JIT's assumptions about object structure.
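A small sketch of the layout-predictability point. Hidden classes are not observable from JavaScript, so the comments describe what V8 is generally understood to do, not something the code can verify:

```javascript
// Same property names in the same order → instances share one hidden
// class/shape, so property access sites can stay monomorphic and fast.
function makePointConsistent(x, y) {
  return { x, y };
}

// Divergent initialization order creates distinct shapes for the
// "same" logical type, which defeats inline caches.
function makePointInconsistent(x, y, flip) {
  const p = {};
  if (flip) { p.y = y; p.x = x; }
  else      { p.x = x; p.y = y; }
  return p;
}

const a = makePointConsistent(1, 2);
const b = makePointConsistent(3, 4);
// a and b share a hidden class; the engine can cache property offsets.
console.log(a.x + b.y); // 5
```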
Think of the heap as a city, not a bag. Objects occupy neighborhoods with policies, generations, and allocation strategies. Reference edges are roads. Closures are portable sanctuaries carrying pieces of old scope into new time. The collector is urban planning under pressure: keep the surviving city navigable, reclaim abandoned blocks, and avoid pausing the whole civilization for too long.
Chapter 04 — Orinoco: Shiva's Incremental Dance
Tracing, Marking, Sweeping, and Compaction
Modern V8 does not wait for catastrophic heap exhaustion and then freeze the world for a monolithic cleanup. Orinoco spreads work across incremental slices, concurrent helpers, and generational heuristics so that memory reclamation can happen while the program keeps serving users.
Roots: Brahma's World-Tree of Reachability
In myth, a tree can hold worlds on its branches because its roots remain anchored in the unseen. GC roots serve that role. They are the trusted starting points from which the collector decides what still belongs to the living universe.
GC roots are engine-known entry points into the object graph. At a high level this includes active stack frames, global objects, module scope bindings, built-in structures, and references temporarily held in registers or internal handles. If a value is reachable from one of these roots, it must be preserved.
The stack matters because every active local binding can point into the heap. A single local variable named cache
may anchor thousands of objects. Likewise, a global singleton or framework store can dominate entire subgraphs of state,
which is why leaks in registries or root-level stores are so persistent.
Engines also maintain internal roots that ordinary JavaScript cannot see directly: built-in constructors, optimized code metadata, handles used by native integrations, and references required during compilation or deoptimization. GC must treat these as authoritative. The language abstraction says "roots are globals and stack." The engine reality is wider and more precise.
Mark Phase: Vishnu's Census of the Living Graph
Preservation is not sentiment; it is accounting. The mark phase walks outward from the roots and records which beings still participate in the world. Anything not visited is not punished. It is simply absent from the census of the living.
In the mark phase, the collector traverses references from every root to every reachable object. Conceptually, objects move through the tri-color abstraction: white means unvisited, gray means discovered but not fully scanned, black means discovered and fully scanned. This model prevents the collector from losing track of objects while work is incremental.
Incremental marking exists because full graph traversal can be too expensive to do in one pause on large heaps. Orinoco slices the work into smaller checkpoints and uses write barriers so mutations made by running JavaScript do not invalidate the collector's view. If your application keeps mutating object graphs during marking, those barriers record enough information to keep tracing correct.
Marking is about graph theory, not morality. A perfectly useless object that remains reachable will be marked black and preserved. A valuable object with no path from any root will remain white and eventually die. That is why leak prevention is fundamentally about severing references, not about asking the collector to infer developer intent.
Sweep and Compact: Shiva Dissolves and Reorders
Shiva's dissolution is rhythmic, not chaotic. The collector frees what is no longer reachable, then sometimes compacts what remains so future creation does not happen in a shattered landscape of holes and fragments.
During sweeping, the engine walks heap regions looking for objects that were never marked. Those cells are returned to free lists or otherwise made available for future allocation. In concurrent sweeping, helper threads may perform much of this cleanup while JavaScript continues running, reducing stop-the-world cost.
Compaction is the answer to fragmentation. If the heap contains many small free holes scattered around surviving objects, allocation becomes less efficient even if plenty of bytes technically remain. Compacting selected pages moves live objects together, updates references, and restores larger contiguous free regions.
Not every cycle compacts everything. Moving objects has a cost, especially for large allocations or pages where the benefit is low. V8 chooses strategies based on generation, fragmentation, allocation pressure, and pause budget. The real collector is opportunistic: free quickly when possible, compact when worthwhile, and keep latency acceptable for interactive applications.
let user = { id: 1, name: "Arjuna" };
let backup = user;

user = null;   // one edge removed
backup = null; // last strong reference gone

// On the next GC cycle the object is unreachable, so mark-sweep can reclaim it.
Setting a binding to null does not "free memory immediately." It simply removes one reference edge.
If that was the final strong path from the root set to the object, the object becomes collectible on a future GC cycle.
Timing belongs to the engine, not to user code.
This is an important distinction for performance debugging. Developers often expect memory to fall the instant they clear a variable. Instead, you may see memory plateau until the next minor or major collection, and you may see RSS or process memory remain high because allocators and operating systems do not necessarily return freed pages immediately to the OS.
The correct question is not "why didn't GC run at the exact moment I wanted?" but "is the graph now severed enough that the collector can reclaim this when it next decides to?" That framing aligns with how tracing collectors actually work and prevents a great deal of confusion in profiling sessions.
The tri-color abstraction divides objects into white, gray, and black sets. White objects have not been reached. Gray objects have been reached but still need their outgoing references scanned. Black objects have been reached and fully scanned. The collector repeatedly processes gray objects until none remain.
The invariant that matters is simple: no black object should point directly to a white object without the collector knowing about it. Incremental and concurrent collectors enforce that invariant with write barriers, which record mutations that would otherwise hide new reachable edges.
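The marking loop itself is simple enough to sketch. This is a toy simulation over an explicit edge map, not V8's implementation; all node names are invented:

```javascript
// Toy tri-color mark phase: `edges` maps each node name to the nodes
// it references. Unmapped colors mean white (never reached).
function mark(roots, edges) {
  const color = new Map();
  const gray = [...roots];
  for (const r of roots) color.set(r, "gray");

  while (gray.length > 0) {
    const node = gray.pop();              // take a discovered node
    for (const child of edges.get(node) ?? []) {
      if (!color.has(child)) {            // white → gray
        color.set(child, "gray");
        gray.push(child);
      }
    }
    color.set(node, "black");             // fully scanned
  }
  return color;
}

const edges = new Map([
  ["root", ["a"]],
  ["a", ["b"]],
  ["b", ["a"]],      // reachable cycle: still marked
  ["orphan", ["b"]], // entry to the graph, but nothing roots it
]);

const color = mark(["root"], edges);
console.log(color.get("b"));      // "black" — reachable, survives
console.log(color.get("orphan")); // undefined — white, collectible
```

Note how the cycle between "a" and "b" is handled for free: tracing visits each node once, so cycles never trap the collector the way they trap reference counting.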
Older collectors often paused the program for long contiguous marking or sweeping work. Orinoco shifted much of that effort into incremental slices and concurrent background threads. The result is lower latency under typical interactive workloads, especially on large old-space heaps.
Stop-the-world pauses still exist for certain boundaries, root scanning, synchronization points, and some evacuation work. The difference is that modern V8 tries hard to avoid making every full-heap collection one giant blocking event.
The Scavenger handles young-generation collections. It is tuned for objects that die quickly and commonly uses copying collection across semi-spaces. Major GC targets old-generation memory, where survivors are common and tracing, sweeping, and compaction become more important.
If you allocate many short-lived objects in a tight loop, you may see frequent minor collections with minimal impact. If your long-lived state keeps growing, the pressure shifts into major collection, where pause avoidance and fragmentation management matter much more.
Chapter 05 — Adharmic References
Three Classic Leak Patterns
Memory leaks in JavaScript are usually not failures of collection. They are failures of graph hygiene. The collector is tracing correctly. Your program is simply still holding a path to data that no longer serves any purpose.
Ghost Listeners That Refuse to Leave the Shrine
Some spirits linger because a ritual was left incomplete. Detached nodes do the same when a listener or closure still names them. From the collector's perspective, they are not ghosts at all. They are still reachable citizens.
Event listeners frequently leak because the target registry becomes an unexpected root path.
A callback attached to window, document, or another long-lived emitter may close over a DOM node,
service instance, or large state object. Removing the node from the document does not remove that listener edge.
This pattern is especially common in component systems that mount and unmount often. If setup happens in a lifecycle hook but teardown does not remove listeners, every mount leaves behind one more retained closure and one more retained data graph. The leak can stay invisible for a long time because each instance is small.
The fix is mechanical and therefore dependable: keep the same callback reference, unregister it during teardown, and null out other long-lived references when appropriate. The engine only needs the graph to become severable.
const panel = document.querySelector("#panel");

function onResize() {
  panel?.classList.add("ready");
}

window.addEventListener("resize", onResize);
panel?.remove();
// ❌ LEAK: listener still closes over panel, so the detached node stays reachable
const panel = document.querySelector("#panel");

function onResize() {
  panel?.classList.add("ready");
}

window.addEventListener("resize", onResize);
window.removeEventListener("resize", onResize);
panel?.remove();
// ✅ FIX: the listener edge is severed, so the node can become unreachable
Helpful Caches Become Hoarders
Caches are often the most socially acceptable leaks in a codebase. They begin as performance optimizations, then become unbounded maps keyed by strings, IDs, or object references that are never evicted. Because their growth is intentional-looking, teams can misread heap growth as normal application scale.
Closures intensify this problem when they capture the cache plus contextual state. A supposedly small memoized function can accidentally retain response payloads, parsed ASTs, detached nodes, or image blobs through a long-lived module scope. If no eviction policy exists, reachability becomes permanence.
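The unbounded-cache shape looks innocent in code, which is exactly the problem. A sketch with invented names, plus one minimal mitigation:

```javascript
// ❌ Unbounded: a module-level Map keyed by string grows forever.
const resultCache = new Map();
function expensiveParse(source) {
  if (!resultCache.has(source)) {
    resultCache.set(source, { parsed: source.toUpperCase() }); // stand-in work
  }
  return resultCache.get(source);
}

// Every distinct key is retained permanently: reachability becomes permanence.
for (let i = 0; i < 10_000; i++) expensiveParse(`query-${i}`);
console.log(resultCache.size); // 10000 entries, none evictable

// ✅ A minimal mitigation: cap the cache and evict the oldest entry.
// (Maps iterate in insertion order, so the first key is the oldest.)
function boundedSet(map, key, value, limit = 1_000) {
  if (map.size >= limit) {
    map.delete(map.keys().next().value);
  }
  map.set(key, value);
}
```

A real cache would want an LRU policy rather than oldest-inserted eviction, but even this crude cap converts a leak into a bounded cost.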
Use WeakMap when the cache key is an object whose lifetime should control the cached entry.
A weak reference does not keep the key alive, so once no strong references remain to that key, the entry can disappear with it.
This does not replace disciplined cache design, but it prevents a class of object-keyed retention bugs.
const cache = new WeakMap();

function remember(node) {
  if (!cache.has(node)) {
    cache.set(node, { measuredAt: Date.now() });
  }
  return cache.get(node);
}

// When node is no longer strongly reachable, its WeakMap entry disappears with it.
Detached Subtrees: Removed from the Page, Not from Memory
A detached DOM leak happens when a node is removed from the live document tree but remains strongly referenced from JavaScript. The node itself may be small, yet it often drags an entire subtree with style data, listeners, layout metadata, and framework-specific instance state.
Common sources include arrays storing old modal elements, debug registries that collect nodes for convenience, closures created by event delegation helpers, and libraries that keep bookkeeping structures long after a widget is destroyed. In heap snapshots, these show up as detached elements retaining unexpectedly large object graphs.
The fix is again about reference discipline. Remove listeners. Drop registry entries. Clear arrays. Avoid keeping "just in case" handles to nodes you intend to discard. The DOM disappearing visually is not enough; only the reference graph decides whether the subtree is eligible for collection.
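The registry pattern behind many detached-subtree leaks can be shown with plain objects standing in for DOM nodes (all names here are invented; the same logic applies to arrays of real elements):

```javascript
// A "just in case" registry: widgets are pushed on mount
// but never removed on unmount.
const debugRegistry = [];

function mountWidget(id) {
  const widget = { id, subtree: new Array(1_000).fill({ data: id }) };
  debugRegistry.push(widget); // extra edge: the registry anchors the widget
  return widget;
}

// ❌ Looks like teardown, but the registry entry still roots the widget.
function unmountWidgetLeaky(widget) {
  widget.subtree = null;
}

// ✅ Teardown that actually severs the edge.
function unmountWidgetFixed(widget) {
  const i = debugRegistry.indexOf(widget);
  if (i !== -1) debugRegistry.splice(i, 1);
}

const w = mountWidget("modal-1");
unmountWidgetFixed(w);
console.log(debugRegistry.length); // 0 — no root path remains
```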
Use the Memory panel's heap snapshot to inspect retainers and detached DOM trees. Use the allocation timeline when hunting leaks that grow over repeated interactions. If old-space usage rises after each mount/unmount loop, compare snapshots and inspect dominators that remain rooted unexpectedly.
Closing Reflection
What Leaves, What Remains
"As a person casts off worn-out garments and puts on others that are new, so the embodied self casts off worn-out bodies and enters into others that are new."
— Bhagavad Gita 2.22
Garbage collection is not an excuse to ignore memory. It is a contract: you do not manually free bytes, but you do control the graph that tells the engine what should remain. When references are honest, the collector is remarkably effective.
Good memory hygiene means thinking in ownership and lifetime, not just syntax. Who holds this callback? Which store owns this object? What tears it down? What should happen after unmount, after navigation, after request completion? Those are the questions that keep Brahma, Vishnu, and Shiva in balance inside your runtime.
If you want to see these phases unfold step by step, the visualizer below turns allocation, closure capture, mark-sweep, and leak repair into an explicit animation of stack frames, heap objects, and GC state.