Segment 1 — Core Fundamentals
The bedrock of every Node.js interview. These concepts explain why Node.js behaves the way it does — understanding them deeply separates a strong engineer from someone who just uses Node.js as a framework runner.
Event Loop Non-blocking I/O Concurrency Model Module System libuv CommonJS vs ESM
10 Questions · 4 Topics Covered · ~45m Study Time
🧠 Mental Model — What is Node.js, Really?

Before memorising phases and APIs, lock in this mental model. Node.js is not magic — it's one very efficient decision: don't wait, move on. While one request waits for a database response, serve ten others.

☕ The Café Waiter Analogy

Imagine a café with one highly efficient waiter (the event loop) and a kitchen full of chefs (OS + libuv threads).

  • The waiter takes your order and immediately moves to the next table — never stands watching the kitchen.
  • When food is ready, the kitchen rings a bell → the waiter picks it up and delivers it.
  • A traditional multi-threaded server hires one waiter per table — expensive, and most of them just stand waiting.

Node.js doesn't make I/O faster — it makes the time spent waiting productive.
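That model is easy to see in code. A minimal sketch (setTimeout standing in for real database calls; all names here are hypothetical): three simulated requests started back to back overlap their waits, so the total time is roughly one wait, not three.

```javascript
// Three simulated "requests" — setTimeout stands in for a real DB call.
// They are started back to back, so their 100ms waits overlap: total
// elapsed time is ~100ms, not ~300ms, because the single thread is
// free while each one waits.
function fakeDbQuery(id, ms) {
  return new Promise((resolve) => setTimeout(() => resolve(`result ${id}`), ms));
}

async function main() {
  const start = Date.now();
  const results = await Promise.all([
    fakeDbQuery(1, 100),
    fakeDbQuery(2, 100),
    fakeDbQuery(3, 100),
  ]);
  console.log(results.length, 'results in ~' + (Date.now() - start) + 'ms');
}
main();
```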

✅ Node.js excels at
  • Handling thousands of concurrent connections
  • HTTP APIs, database calls, file reads (I/O-bound)
  • Real-time apps — chat, live dashboards, notifications
  • Streaming large files without loading all into memory
❌ Node.js struggles with
  • CPU-heavy work: video encoding, ML inference, crypto
  • Workloads needing true parallel CPU threads
  • Long synchronous blocking operations on the main thread
The runtime stack, top to bottom:

  1. V8 Engine — executes your JavaScript. This is the one JS thread.
  2. Node.js Core APIs — C++ bindings that bridge JS to the operating system.
  3. libuv — cross-platform C library: the event loop lives here, plus a thread pool (4 threads by default) for file I/O and crypto.
  4. OS Kernel — actual hardware: disks, network cards, timers. Uses epoll (Linux), kqueue (macOS), IOCP (Windows) for async network I/O.
Why This Architecture Wins for Web Servers
A typical HTTP request spends ~95% of its time waiting (DB query, downstream API, disk read) and only ~5% doing computation. Node's single-threaded async model makes that 95% waiting time free for other requests — instead of tying up an OS thread.
Topic A — The Event Loop
1
What is the Node.js Event Loop? Explain its phases in order.
Hard Event Loop
🎯 Build the Mental Model First

The event loop is Node's task scheduler. Before the phases table, understand the big picture: the event loop is a loop that runs forever, checking queues in a fixed order and running callbacks.

🏁 Racetrack Analogy

Each "tick" of the event loop is one lap around a racetrack. The track has fixed pit stops in a fixed order. Your callbacks sit at different pit stops waiting to be picked up:

  • Timers pit stop — setTimeout / setInterval callbacks whose time has come
  • I/O pit stop (poll) — file / network callbacks that just completed in the background
  • Check pit stop — setImmediate callbacks
  • Microtask express lane — process.nextTick and Promises cut the queue between every pit stop
The Most Important Rule
The JS thread processes one callback at a time, to completion. While a callback runs, nothing else can run. This is why blocking the event loop affects every concurrent user.

The Event Loop is the mechanism that allows Node.js to perform non-blocking I/O operations despite JavaScript being single-threaded. It continuously checks whether there are tasks to execute and in what order.

Node.js uses libuv (a C library) under the hood to implement the event loop. Each "tick" of the loop passes through these 6 phases in order:

| # | Phase | What runs here | Key API |
| --- | --- | --- | --- |
| 1 | timers | Callbacks from setTimeout and setInterval whose delay has elapsed | setTimeout |
| 2 | pending callbacks | I/O callbacks deferred from the previous loop iteration (e.g. some TCP errors) | — |
| 3 | idle / prepare | Internal libuv use only — you cannot schedule work here directly | — |
| 4 | poll | Fetch new I/O events; execute I/O-related callbacks (file reads, network). The loop waits here if nothing else is pending. | fs.readFile |
| 5 | check | Callbacks scheduled with setImmediate | setImmediate |
| 6 | close callbacks | Close events — e.g. socket.on('close', ...) | .on('close') |

Since Node 11, Node.js drains two microtask queues after every callback completes (and therefore between every phase transition) before moving on:

  1. process.nextTick queue — always drains first (highest priority)
  2. Promise microtask queue — drains after nextTick
⚠ Common Trap
Recursively calling process.nextTick will starve I/O — the loop never advances past microtasks.
JavaScript
// What is the output order?
setTimeout(() => console.log('1. setTimeout'), 0);

setImmediate(() => console.log('2. setImmediate'));

process.nextTick(() => console.log('3. nextTick'));

Promise.resolve().then(() => console.log('4. Promise.then'));

console.log('5. synchronous');

/*
  Output:
  5. synchronous          ← sync code runs first (call stack)
  3. nextTick             ← nextTick queue (before promises)
  4. Promise.then         ← promise microtask queue
  1. setTimeout           ← timers phase (may swap with setImmediate
  2. setImmediate           depending on when loop starts)
*/
How to answer this in an interview

"The event loop is Node's scheduler. Sync code runs first. Then microtasks — nextTick before Promises. Then the loop cycles through its 6 phases: timers → pending callbacks → idle → poll → check → close. The poll phase is where Node waits for I/O. setImmediate fires in the check phase, always after I/O callbacks — that's its guarantee."

2
What is the difference between process.nextTick, setImmediate, and setTimeout(fn, 0)?
Medium Event Loop
| API | When it runs | Use case |
| --- | --- | --- |
| process.nextTick | After the current operation, before any I/O or timers — highest-priority microtask | Ensure a callback fires asynchronously but before anything else |
| Promise.then | After the nextTick queue empties — second-priority microtask | Standard async control flow |
| setImmediate | Check phase — guaranteed after I/O callbacks in the same loop iteration | Do something after I/O handlers finish |
| setTimeout(fn, 0) | Timers phase — minimum 1ms delay (OS-dependent); can slip past setImmediate outside I/O | Delay by at least one tick — less predictable than setImmediate |
JavaScript
const fs = require('fs');

// OUTSIDE an I/O callback — order of setTimeout vs setImmediate is UNDEFINED
setTimeout(() => console.log('timeout'), 0);
setImmediate(() => console.log('immediate'));
// Could print either order — depends on OS timer resolution

// INSIDE an I/O callback — setImmediate ALWAYS wins
fs.readFile(__filename, () => {
  setTimeout(() => console.log('timeout'), 0);
  setImmediate(() => console.log('immediate')); // always first
});
Practical Rule
Prefer setImmediate over setTimeout(fn, 0) when you need post-I/O ordering — it's deterministic. Use process.nextTick sparingly and only when you truly need the highest priority.
One-liner answer for interview

"nextTick fires before any I/O in the current microtask checkpoint — it's the highest priority. setImmediate fires in the check phase, after I/O. setTimeout(0) fires in the timers phase and has at least a 1ms minimum delay, so its ordering versus setImmediate is non-deterministic outside I/O callbacks."

3
What happens if you block the event loop? How do you detect and fix it?
Hard Event Loop

The event loop runs on a single thread. Every callback runs to completion before the next one starts. If a callback takes a long time — a CPU-heavy computation, a massive JSON.parse, a synchronous file read — no other request is served during that time.

⚠ Real-world Impact
One blocked event loop iteration in a production server delays every concurrent user. A 500ms block on a server handling 1,000 req/s stalls roughly 500 in-flight requests at once, more than enough to trip client timeouts.
Common culprits that block the loop:
  • Synchronous I/O: fs.readFileSync, fs.writeFileSync
  • Heavy computation: sorting millions of items, crypto without worker
  • Large JSON: JSON.parse of a 50MB response body
  • Regex with catastrophic backtracking (ReDoS)
  • Infinite or deep synchronous recursion
JavaScript
const fs = require('fs');

// ❌ BLOCKS the event loop — no request is served during this
function badRoute(req, res) {
  const data = fs.readFileSync('/huge-file.json');
  const parsed = JSON.parse(data);  // still on main thread
  res.send(parsed);
}

// ✅ Non-blocking — event loop is free while file is being read
async function goodRoute(req, res) {
  const data = await fs.promises.readFile('/huge-file.json');
  const parsed = JSON.parse(data); // JSON.parse still blocks! See below
  res.send(parsed);
}

// ✅ Offload CPU work to a worker thread
const { Worker } = require('worker_threads');
function parseInWorker(rawJson) {
  return new Promise((resolve, reject) => {
    const worker = new Worker(`
      const { workerData, parentPort } = require('worker_threads');
      parentPort.postMessage(JSON.parse(workerData));
    `, { eval: true, workerData: rawJson });
    worker.on('message', resolve);
    worker.on('error', reject);
  });
}
How to detect a blocked loop:
  • clinic doctor — visualizes event loop lag over time
  • perf_hooks: monitorEventLoopDelay() — measures lag in ms
  • --inspect + Chrome DevTools — CPU profile to find hot synchronous code
JavaScript
const { monitorEventLoopDelay } = require('perf_hooks');
const h = monitorEventLoopDelay({ resolution: 20 });
h.enable();

setInterval(() => {
  console.log(`Event loop lag: ${h.mean / 1e6}ms mean`);
}, 5000);
Topic B — Non-blocking I/O & libuv
4
How does Node.js achieve non-blocking I/O if JavaScript is single-threaded?
Hard libuv

JavaScript runs on a single thread — but the operating system and libuv's thread pool do the actual I/O work in the background. Node.js is the middleman that hands off work and gets notified when it's done.

Architecture
  ┌─────────────────────────────────────┐
  │   Your JavaScript Code (V8)         │  ← Single thread
  ├─────────────────────────────────────┤
  │   Node.js Core APIs (C++ bindings)  │
  ├─────────────────────────────────────┤
  │               libuv                 │
  │  ┌─────────────┐  ┌───────────────┐ │
  │  │ Event Loop  │  │  Thread Pool  │ │  ← 4 threads default
  │  │  (1 thread) │  │ (UV_THREAD-   │ │
  │  └─────────────┘  │  POOL_SIZE)   │ │
  │                   └───────────────┘ │
  ├─────────────────────────────────────┤
  │         Operating System            │
  │  (epoll/kqueue/IOCP - async I/O)    │
  └─────────────────────────────────────┘
  • Network I/O (TCP/UDP, HTTP) — uses OS-level async APIs (epoll on Linux, kqueue on macOS, IOCP on Windows). The OS tells libuv when a socket is readable/writable. No thread pool needed.
  • File system & DNS lookups — most OS file APIs are blocking, so libuv uses its thread pool (4 threads by default). A worker thread does the blocking call; when done it signals the event loop.
Thread Pool Size
You can increase libuv's thread pool: UV_THREADPOOL_SIZE=16 node app.js
This helps when many concurrent disk I/O or crypto operations queue up. The maximum is 1024 threads (libuv ≥ 1.30; older versions capped it at 128).
Flow
1. fs.readFile('data.txt', callback)
   → Registers callback, hands work to libuv thread pool

2. Event loop continues processing other events (non-blocking)

3. A libuv thread performs the blocking OS read() call

4. Thread completes → posts result to event loop's poll queue

5. Poll phase picks it up → calls your callback on the main thread
One-liner answer for interview

"JavaScript is single-threaded but Node.js uses libuv to delegate I/O work — network I/O goes through OS-level async interfaces like epoll, while file system work uses libuv's thread pool. When work completes, a callback is queued on the event loop and your JS code picks it up — never blocking the main thread."

5
Is Node.js truly single-threaded? What runs on multiple threads?
Medium libuv

Your JavaScript code runs on one thread — that's the V8 thread. But the Node.js process as a whole uses multiple threads:

  • 1 main thread — V8 + event loop (your JS)
  • 4+ libuv threads — file system, DNS, crypto, zlib
  • V8 internal threads — GC, JIT compilation (background)
  • Worker threads (optional) — you can spawn via worker_threads
JavaScript
const crypto = require('crypto');

// These 4 hashes run CONCURRENTLY on 4 libuv threads
// despite JavaScript initiating them "one after another"
for (let i = 0; i < 4; i++) {
  crypto.pbkdf2('password', 'salt', 100000, 64, 'sha512', (err, key) => {
    console.log(`Hash ${i} done`);
  });
}
// All 4 finish at roughly the same time — parallel on thread pool

// 5th one has to wait for a free thread (default pool = 4)
crypto.pbkdf2('password', 'salt', 100000, 64, 'sha512', (err, key) => {
  console.log('Hash 4 done — waited for a free thread');
});
Interview Tip
Interviewers love this question. The short answer: "single-threaded for JS execution, multi-threaded under the hood for I/O." Show you know the difference — it demonstrates real depth.
Topic C — Single-threaded Concurrency Model
6
Why is Node.js good for I/O-bound workloads but bad for CPU-bound workloads?
Medium Concurrency

In traditional multi-threaded servers (like Java), each request gets its own thread. With 10,000 concurrent connections you need 10,000 threads — enormous memory overhead and context-switching cost.

Node.js uses a single thread + event loop. While waiting for I/O (database, file, network), the thread serves other requests. 10,000 concurrent connections can be handled by a handful of threads if most time is spent waiting for I/O.

If a request requires heavy computation — image resizing, video encoding, complex cryptography, ML inference — that work occupies the main thread. All other requests wait.

JavaScript
// Simulating CPU work — naive fibonacci(40) blocks the event loop
// for a second or two on typical hardware
function fibonacci(n) {
  if (n <= 1) return n;
  return fibonacci(n - 1) + fibonacci(n - 2);
}

app.get('/slow', (req, res) => {
  const result = fibonacci(40); // ← blocks ALL requests while it runs
  res.send({ result });
});

// Fix: offload to worker thread
app.get('/fast', async (req, res) => {
  const result = await runInWorker(40); // non-blocking
  res.send({ result });
});
Options for moving CPU work off the main thread:
  • Worker Threads — worker_threads module for parallel JS execution
  • Child Processes — child_process.fork for separate Node processes
  • Native Addons — offload to C/C++ via N-API
  • Microservices — delegate to a Python/Go service better suited for compute
7
What is a callback, and what is "callback hell"? How do modern Node.js patterns solve it?
Easy Concurrency
📜 Historical Context — The Callback Era

Callbacks were the only async pattern in Node.js from v0.1 (2009) until Promises in ES6 (2015). Understanding them is still essential because:

  • Many Node.js core APIs still use callbacks (fs, crypto, dns)
  • EventEmitters are callbacks under the hood
  • util.promisify is your bridge between the callback world and async/await
  • Error-first convention — the first argument is always the error (null if none), so you can't accidentally ignore errors
  • Inversion of control — you hand your callback to another function and they decide when to call it; this makes bugs harder to trace in deeply nested callback code
  • Pyramid of doom — the visual shape of deeply nested callbacks: each async step adds another indent level until the code becomes unreadable
  • util.promisify — built-in Node utility that wraps any error-first callback function and returns a Promise-based version; the standard bridge to async/await
JavaScript — util.promisify bridge
const { promisify } = require('util');
const fs = require('fs');

// Old callback-style API
fs.readFile('data.txt', 'utf8', (err, data) => {
  if (err) throw err;
  console.log(data);
});

// Promisified — now usable with async/await (wrapped in an async
// function, since CommonJS has no top-level await)
const readFileAsync = promisify(fs.readFile);
async function main() {
  const data = await readFileAsync('data.txt', 'utf8');

  // Note: fs.promises already provides Promise-based versions natively:
  const data2 = await fs.promises.readFile('data.txt', 'utf8');
}

A callback is a function passed as an argument to another function, to be called when an async operation completes. Node.js follows the error-first callback convention: (err, result) => {}

JavaScript — Callback Hell
// ❌ Hard to read, error-prone, hard to maintain
fs.readFile('user.json', (err, userData) => {
  if (err) return handleError(err);
  db.getUser(userData.id, (err, user) => {
    if (err) return handleError(err);
    db.getOrders(user.id, (err, orders) => {
      if (err) return handleError(err);
      emailService.send(user.email, orders, (err) => {
        if (err) return handleError(err);
        console.log('Done!');  // deeply nested — "pyramid of doom"
      });
    });
  });
});
JavaScript — async/await Solution
// ✅ Same logic — flat, readable, error handled in one place
async function processUser() {
  try {
    const userData = JSON.parse(await fs.promises.readFile('user.json', 'utf8'));
    const user   = await db.getUser(userData.id);
    const orders = await db.getOrders(user.id);
    await emailService.send(user.email, orders);
    console.log('Done!');
  } catch (err) {
    handleError(err); // one catch handles all failures
  }
}
Topic D — Module System
8
What is the difference between CommonJS (require) and ES Modules (import)? When do you use each?
Medium Module System
| Feature | CommonJS (CJS) | ES Modules (ESM) |
| --- | --- | --- |
| Syntax | require() / module.exports | import / export |
| Loading | Synchronous — blocks | Asynchronous — non-blocking |
| Evaluation | Dynamic — can require() inside a function | Static — imports resolved at parse time |
| Tree-shaking | Not possible | Bundlers can eliminate dead code |
| Top-level await | Not supported | Supported |
| File extension | .js (with "type": "commonjs") | .mjs, or .js with "type": "module" |
| __dirname / __filename | Available | Not available — use import.meta.url |
JavaScript — CommonJS
// math.js
function add(a, b) { return a + b; }
module.exports = { add };

// app.js
const { add } = require('./math');
// Can also do this dynamically:
if (condition) {
  const utils = require('./utils'); // dynamic require — valid in CJS
}
JavaScript — ES Modules
// math.mjs
export function add(a, b) { return a + b; }

// app.mjs
import { add } from './math.mjs';  // must include extension

// Dynamic import — works in ESM too (returns a Promise)
const math = await import('./math.mjs'); // separate binding avoids re-declaring add

// __dirname equivalent in ESM
import { fileURLToPath } from 'url';
import { dirname } from 'path';
const __dirname = dirname(fileURLToPath(import.meta.url));
  • Use CJS for existing Node.js projects and when publishing to npm (better ecosystem compatibility)
  • Use ESM for new projects, browser-shared code, and when you need tree-shaking or top-level await
  • Set "type": "module" in package.json to make all .js files ESM
9
How does Node.js module caching work? What is a circular dependency?
Medium Module System

The first time you require('./foo'), Node.js loads, compiles, and executes it, then caches the result in require.cache (keyed by resolved filename). Every subsequent require('./foo') returns the cached exports object — the module is not re-executed.

JavaScript
// counter.js
let count = 0;
module.exports = {
  increment: () => ++count,
  get: () => count
};

// app.js
const a = require('./counter');
const b = require('./counter'); // same cached object

a.increment();
console.log(b.get()); // → 1 (a and b are the SAME object)

// Force a fresh load (rare — testing, hot reload)
delete require.cache[require.resolve('./counter')];
const c = require('./counter'); // fresh copy, count = 0

A circular dependency is when module A requires B, and B requires A. Node.js handles this by returning an incomplete (partial) exports object for the module currently being loaded.

JavaScript — Circular Dependency
// a.js
console.log('a.js loading');
const b = require('./b');          // triggers b.js to load
console.log('b.done =>', b.done);  // → true
module.exports = { done: true };

// b.js
console.log('b.js loading');
const a = require('./a');          // a is mid-load → gets {} (empty!)
console.log('a.done =>', a.done);  // → undefined (partial export)
module.exports = { done: true };
⚠ How to Fix Circular Dependencies
Restructure the code to extract shared logic into a third module that both A and B can import, breaking the cycle.
10
How does Node.js resolve modules? What is the resolution algorithm?
Medium Module System

When you call require('X'), Node.js resolves it using this algorithm:

  1. Is X a core module? (fs, path, http…) → return immediately
  2. Does X start with ./, ../, or /? → it's a file path:
    • Try X exactly
    • Try X.js, X.json, X.node
    • If X is a directory: honour its package.json "main" field, then try X/index.js, X/index.json, X/index.node
  3. Otherwise → look in node_modules folders, walking up the directory tree:
    • ./node_modules/X
    • ../node_modules/X
    • ../../node_modules/X … up to root
JavaScript
// See exactly where a module was loaded from:
console.log(require.resolve('express'));
// → /project/node_modules/express/index.js

// Inspect the full module cache:
console.log(Object.keys(require.cache));

// package.json "main" field controls which file is the entry
// package.json "exports" field (Node 12+) controls subpath exports
// package.json "type": "module" makes .js files use ESM
Pro Tip
The exports field in package.json (introduced in Node 12) overrides the old resolution and lets package authors control exactly which files are exposed — preventing internal paths from being imported directly.
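A minimal package.json sketch of that pattern (the package name and file paths are illustrative):

```json
{
  "name": "my-lib",
  "main": "./dist/index.cjs",
  "exports": {
    ".": {
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    },
    "./utils": "./dist/utils.mjs"
  }
}
```

With this exports map, consumers can only reach the listed entry points; a deep import like require('my-lib/dist/internal.js') throws, because internal paths are no longer exposed.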