
Watch mode

Bun supports two kinds of automatic reloading via CLI flags:

  • --watch mode, which hard restarts Bun's process when imported files change.
  • --hot mode, which soft reloads the code (without restarting the process) when imported files change.

--watch mode

Watch mode can be used with bun test or when running TypeScript, JSX, and JavaScript files.

To run a file in --watch mode:

bun --watch index.tsx

To run your tests in --watch mode:

bun --watch test

In --watch mode, Bun keeps track of all imported files and watches them for changes. When a change is detected, Bun restarts the process, preserving the same set of CLI arguments and environment variables used in the initial run. If Bun crashes, --watch will attempt to automatically restart the process.
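You can observe this yourself with a small script. The file name (restart-info.ts), flag, and environment variable below are hypothetical, chosen only to illustrate that each restart reuses the original arguments and environment:

```typescript
// restart-info.ts: a hypothetical file to observe --watch restarts.
// Run with:  LOG_LEVEL=debug bun --watch restart-info.ts --verbose
// Every time you save this file, Bun restarts it with the same CLI
// arguments and environment variables from the initial run.
console.log("args:", process.argv.slice(2));
console.log("LOG_LEVEL:", process.env.LOG_LEVEL);
console.log("restarted at:", new Date().toISOString());
```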

⚡️ Reloads are fast. The filesystem watchers you're probably used to wrap the native APIs in several layers of libraries, or worse, rely on polling.

Instead, Bun uses operating system native filesystem watcher APIs like kqueue or inotify to detect changes to files. Bun also applies a number of optimizations to enable it to scale to larger projects (such as setting a high rlimit for file descriptors, statically allocating file path buffers, and reusing file descriptors when possible).
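Node's standard library exposes both approaches, which makes the difference easy to see. This sketch (file paths and timings are arbitrary) contrasts fs.watchFile, which polls on an interval, with fs.watch, which hooks into the same class of OS facilities (inotify on Linux, kqueue on macOS) that Bun's watcher uses:

```typescript
// Polling vs. native watching: fs.watchFile checks mtime on a timer,
// while fs.watch asks the kernel to notify us only on real changes.
import { watch, watchFile, unwatchFile, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

const file = join(tmpdir(), "watch-demo.txt");
writeFileSync(file, "hello");

// Polling: stats the file every `interval` ms, even when nothing changed.
watchFile(file, { interval: 500 }, (curr, prev) => {
  console.log("poll saw change at", curr.mtime);
});

// Native: the kernel pushes an event only when a change actually happens.
const watcher = watch(file, (event) => {
  console.log("native watcher event:", event);
});

writeFileSync(file, "updated");

// Clean up so the process can exit.
setTimeout(() => {
  watcher.close();
  unwatchFile(file);
}, 1000);
```

The polling version burns a stat call twice a second per file regardless of activity, which is why it scales poorly to large projects.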

The following examples show Bun live-reloading a file as it is edited, with VSCode configured to save the file on each keystroke.

bun run --watch watchy.tsx
watchy.tsx
import { serve } from "bun";
console.log("I restarted at:", Date.now());

serve({
  port: 4003,

  fetch(request) {
    return new Response("Sup");
  },
});

In this example, Bun restarts the server each time the file is saved, logging a fresh timestamp on every restart.

Running bun test in watch mode with save-on-keypress enabled:

bun --watch test


--hot mode

Use bun --hot to enable hot reloading when executing code with Bun. This is distinct from --watch mode in that Bun does not hard-restart the entire process. Instead, it detects code changes and updates its internal module cache with the new code.

Note — This is not the same as hot reloading in the browser! Many frameworks provide a "hot reloading" experience, where you can edit & save your frontend code (say, a React component) and see the changes reflected in the browser without refreshing the page. Bun's --hot is the server-side equivalent of this experience. To get hot reloading in the browser, use a framework like Vite.

bun --hot server.ts

Starting from the entrypoint (server.ts in the example above), Bun builds a registry of all imported source files (excluding those in node_modules) and watches them for changes. When a change is detected, Bun performs a "soft reload". All files are re-evaluated, but all global state (notably, the globalThis object) is persisted.

server.ts
// make TypeScript happy
declare global {
  var count: number;
}

globalThis.count ??= 0;
console.log(`Reloaded ${globalThis.count} times`);
globalThis.count++;

// prevent `bun run` from exiting
setInterval(function () {}, 1000000);

If you run this file with bun --hot server.ts, you'll see the reload count increment every time you save the file.

bun --hot server.ts
Reloaded 1 times
Reloaded 2 times
Reloaded 3 times

Traditional file watchers like nodemon restart the entire process, so HTTP servers and other stateful objects are lost. By contrast, bun --hot is able to reflect the updated code without restarting the process.
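Because globalThis persists across soft reloads, expensive stateful objects can be created once and reused. The file name (cache.ts) and the choice of an in-memory Map below are hypothetical, a minimal sketch of the pattern:

```typescript
// cache.ts: a hypothetical sketch of state that survives soft reloads.
// Anything stashed on globalThis persists across `bun --hot` reloads,
// so an in-memory cache (or a database connection) is built only once.
declare global {
  var cache: Map<string, string> | undefined;
}

globalThis.cache ??= new Map();
globalThis.cache.set("lastReload", new Date().toISOString());
console.log("cache entries:", globalThis.cache.size);

export {};
```

A hard-restarting watcher would rebuild the Map from scratch on every change; under bun --hot the same Map instance accumulates entries across reloads.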

HTTP servers

This makes it possible, for instance, to update your HTTP request handler without shutting down the server itself. When you save the file, your HTTP server will be reloaded with the updated code without the process being restarted. This results in seriously fast refresh speeds.

server.ts
// make TypeScript happy
declare global {
  var count: number;
}

globalThis.count ??= 0;
globalThis.count++;

Bun.serve({
  fetch(req: Request) {
    return new Response(`Reloaded ${globalThis.count} times`);
  },
  port: 3000,
});

Note — In a future version of Bun, support for Vite's import.meta.hot is planned to enable better lifecycle management for hot reloading and to align with the ecosystem.

Implementation details