The Scenario
You have a Laravel + Inertia app running behind a load balancer, with multiple web servers managed by Laravel Forge and deployed via Envoyer. Each deployment runs the standard Envoyer pipeline on every server — git pull, composer install, npm install, npm run build.
It works. Until it doesn't.
The Problem
A user loads your app. The initial HTML comes from web-01 and references /build/assets/app.A1B2C3.js. On the next request, the load balancer routes the asset fetch to web-02, where npm run build produced /build/assets/app.X9Y8Z7.js. The file the browser asked for doesn't exist there.
Result: 404s on asset fetches, and a cascade of Inertia 409 version-mismatch responses forcing full page reloads. The page is technically "up", but broken, and users see blank screens. Sticky sessions only paper over the issue.
This is a well-known pain point in the Inertia community (see inertia-laravel#525 and inertia-laravel#712), and two things conspire to cause it:
- Vite's asset filenames include a content hash, and that hash can drift between machines even for byte-identical source. Vite itself has documented non-determinism.
- Inertia's `version()` defaults to an md5 of `public/build/manifest.json`, whose file ordering also varies per build.
So every server ends up with a different asset fingerprint for the same commit.
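The manifest-ordering half of the problem is easy to see in isolation: two manifests with identical entries but different key order produce different md5 hashes. A minimal Node sketch (the entry names are illustrative, borrowed from a typical Laravel layout):

```typescript
import { createHash } from 'node:crypto';

const md5 = (s: string) => createHash('md5').update(s).digest('hex');

// Same entries, different key order — as two servers might emit them.
const manifestA = JSON.stringify({
  'resources/js/app.tsx': { file: 'assets/app.js' },
  'resources/css/app.css': { file: 'assets/app.css' },
});
const manifestB = JSON.stringify({
  'resources/css/app.css': { file: 'assets/app.css' },
  'resources/js/app.tsx': { file: 'assets/app.js' },
});

console.log(md5(manifestA) === md5(manifestB)); // false — two "versions" of the same build
```

Inertia compares these fingerprints on every request, so any divergence between servers triggers a 409 even though the deployed code is identical.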
What Are Your Options?
There are a few options:
- Build once, deploy the artifact. Run `npm run build` in CI (or on a single build server) and ship the compiled `public/build/` to every web node. Clean, but it requires moving the build step out of Envoyer's per-server pipeline.
- Serve assets from a CDN or dedicated subdomain. Configure Laravel's `ASSET_URL` to point at the CDN. Also clean, but it's another piece of infrastructure, and you still need to decide where the authoritative build comes from.
- Make the version identifier stable. Replace Vite's content hash and Inertia's manifest hash with something that's guaranteed to be identical across all servers in a single deployment.
We use the third option. It requires zero infrastructure changes — Envoyer keeps building on each server, the load balancer keeps round-robining, and nothing about the deploy topology has to move.
The git commit SHA is already identical on every server for a given release; we just need the build and the runtime to consume it.
The Solution
Three small pieces: an Envoyer hook, the Inertia middleware, and vite.config.ts.
1. Write the SHA to version.txt before the build
Add a deployment hook in Envoyer named, for example, "add inertia version reference", running on all web servers, scheduled before npm install and npm run build:
cd {{ release }}
echo {{ sha }} > {{ release }}/version.txt
{{ sha }} and {{ release }} are Envoyer placeholders that resolve at deploy time. The file lives at the release root next to artisan.
2. Have Inertia read the SHA as its version
Override version() in app/Http/Middleware/HandleInertiaRequests.php:
public function version(Request $request): ?string
{
return trim(file_get_contents(base_path('version.txt')));
}
Every server in the fleet now reports the same Inertia version for the duration of that release — no more false-positive 409s from manifest-ordering noise.
3. Substitute the SHA into Vite's asset filenames
In vite.config.ts, read version.txt at build time and use it where Vite would otherwise inject a content hash:
import fs from 'node:fs';
import { defineConfig } from 'vite';
import laravel from 'laravel-vite-plugin';
import react from '@vitejs/plugin-react';
const buildVersion = fs.readFileSync('version.txt', 'utf8').trim();
export default defineConfig({
plugins: [
laravel({ input: ['resources/js/app.tsx'], refresh: true }),
react(),
],
build: {
rollupOptions: {
output: {
entryFileNames: `assets/[name].${buildVersion}.js`,
chunkFileNames: `assets/[name].${buildVersion}.js`,
assetFileNames: `assets/[name].${buildVersion}.[ext]`,
},
},
},
});
Same commit → same SHA → same filenames on every server. A file requested from web-01 exists at the identical path on web-02.
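Put differently, the output path is now a pure function of the entry name and the SHA, with no content hash involved. A tiny sketch (the `a1b2c3d` SHA is made up for illustration):

```typescript
// Pure function of entry name + build version — no content hash involved,
// so every machine building the same commit emits the same path.
const assetPath = (name: string, version: string, ext: string) =>
  `assets/${name}.${version}.${ext}`;

console.log(assetPath('app', 'a1b2c3d', 'js')); // assets/app.a1b2c3d.js
```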
Local development note
Locally there's no Envoyer hook, so version.txt won't exist, and both the middleware and Vite will crash trying to read it.
Create a version.txt at the project root containing a single string and commit it:
development
Running npm run build locally now produces app.development.js and friends, and the middleware reports "development" as the Inertia version.
On every deploy, the Envoyer hook overwrites this committed file with the actual commit SHA before the build runs.
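If you'd rather not commit a placeholder file, an alternative (a sketch, not what the setup above uses) is to fall back to a fixed marker in vite.config.ts when version.txt is missing; the middleware would need an equivalent fallback on the PHP side:

```typescript
import fs from 'node:fs';

// Fallback sketch: use version.txt when the Envoyer hook wrote it,
// otherwise a fixed local marker. An alternative to committing the file.
function readBuildVersion(path = 'version.txt'): string {
  try {
    return fs.readFileSync(path, 'utf8').trim();
  } catch {
    return 'development';
  }
}
```

The committed-file approach has the advantage that both PHP and Node read the same value with no branching; the fallback approach keeps version.txt out of the repository.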
SSR note
If you're building an SSR bundle (vite build --ssr), skip the SHA substitution for that bundle — the SSR output is loaded server-side, never served to the browser, and stable filenames just make the runtime loader's life easier:
export default defineConfig(({ isSsrBuild }) => {
if (isSsrBuild) {
return {
/* ...plugins... */
build: {
outDir: 'bootstrap/ssr',
manifest: false,
ssrManifest: true,
rollupOptions: {
output: {
entryFileNames: `[name].js`,
chunkFileNames: `[name].js`,
assetFileNames: `[name].[ext]`,
},
},
},
};
}
return {
/* ...plugins... */
build: {
outDir: 'public/build',
manifest: 'manifest.json',
rollupOptions: {
output: {
entryFileNames: `assets/[name].${buildVersion}.js`,
chunkFileNames: `assets/[name].${buildVersion}.js`,
assetFileNames: `assets/[name].${buildVersion}.[ext]`,
},
},
},
};
});
Wrapping Up
Three files, one deployment hook, problem gone. Envoyer keeps doing what it does — building on every server, activating the release atomically — but now every server in the fleet produces byte-identical asset filenames and reports the same Inertia version. A browser fetching an asset after a load-balanced hop lands on a path that exists, regardless of which server answers.
If you're hitting the same issue and want a second pair of eyes on your deployment pipeline, get in touch — we've shipped this pattern across several production fleets.

