squish: Client-Side Image Compression with WebAssembly Codecs
WebAssembly codec integration, parallel queue management, and performance engineering behind squish, a client-side image compressor
The compression of images before deployment remains one of those deceptively mundane tasks that, when handled carelessly, compounds into measurable damage: inflated bandwidth costs, degraded Core Web Vitals, and user experiences that deteriorate with every unoptimized asset served. The conventional approach delegates this responsibility to server-side services or third-party APIs, introducing latency, privacy concerns, and external dependencies that complicate otherwise straightforward workflows.
squish rejects that paradigm entirely. It is a browser-based image compression tool where every operation — decoding, transcoding, and encoding — executes exclusively on the client through WebAssembly codecs. No image ever leaves the user's device. No server receives, processes, or stores any pixel data. This is not a design convenience but an architectural guarantee: the application contains no backend, no API routes, and no server-side logic whatsoever.
The tool supports five output formats — AVIF, JPEG, JPEG XL, PNG, and WebP — each powered by production-grade codec implementations compiled to WebAssembly: libavif, MozJPEG, libjxl, OxiPNG, and libwebp. Batch processing with configurable quality settings and real-time compression statistics complete the feature set.
This document presents the architectural decisions, the WebAssembly integration strategy, and the concurrent processing pipeline that sustain the platform.
Live Platform
The production instance of squish, ready to compress images directly in the browser.
Source Code
The complete codebase under MIT license, open for audit and contribution.
Layered Architecture
The application follows a four-layer architecture that enforces strict separation of concerns, ensuring that each layer communicates only with its immediate neighbors and never bypasses the orchestration boundaries.
The Presentation Layer comprises four React components — DropZone for drag-and-drop file ingestion with MIME type validation, CompressionOptions for format selection and quality adjustment, ImageList for real-time status rendering with thumbnails, and DownloadAll for batch export. The Orchestration Layer centers on a single custom hook, useImageQueue, that manages the entire processing lifecycle with concurrency control. The Processing Layer handles codec dispatch and WebAssembly module initialization. The Codec Layer provides five WebAssembly-compiled encoders through the jSquash library, each wrapping a production-grade C/C++ implementation.
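The layer boundaries above can be sketched as a handful of TypeScript types. These names are hypothetical, chosen to mirror the prose rather than the actual squish source:

```typescript
// Hypothetical boundary types illustrating the four-layer contract;
// names mirror the article's prose, not the actual squish source.
type OutputType = "avif" | "jpeg" | "jxl" | "png" | "webp";

type ImageStatus = "pending" | "queued" | "processing" | "complete" | "error";

interface ImageFile {
  id: string;
  name: string;
  status: ImageStatus;
  outputType: OutputType;
  originalSize: number;
  compressedSize?: number; // set once processing completes
  error?: string;          // set only in the error state
}

// The orchestration layer hands the processing layer only raw bytes and
// options; the processing layer never touches React state.
interface CompressionOptions {
  quality: number; // 1–100; ignored by the lossless PNG path
}
```

The point of the sketch is the direction of dependency: presentation components render `ImageFile` objects, the orchestration hook mutates them, and the processing layer sees only buffers and options.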
WebAssembly Codec Integration
The core value proposition of squish resides in its codec architecture. Rather than relying on the browser's native <canvas> element for compression — which limits output to a small, browser-dependent set of formats (reliably only JPEG and PNG) at unpredictable quality levels — the application employs purpose-built WebAssembly modules that expose the same encoder implementations used by industry-standard tools.
| Format | Codec | Compression Mode | Default Quality |
|---|---|---|---|
| AVIF | libavif | Lossy, effort level 4 | 50% |
| JPEG | MozJPEG | Lossy | 75% |
| JPEG XL | libjxl | Lossy | 75% |
| PNG | OxiPNG | Lossless | — |
| WebP | libwebp | Lossy | 75% |
Lazy Module Loading
WebAssembly modules are substantial in size, and loading all five codecs at application startup would impose an unacceptable initial payload. The wasm.ts module implements a lazy-loading strategy that defers module initialization until the first compression request for a given format:
```typescript
const wasmModules = new Map<string, boolean>();

export async function ensureWasmLoaded(format: OutputType): Promise<void> {
  if (wasmModules.get(format)) return; // Already initialized
  // Dynamic import triggers the WASM download and compilation
  switch (format) {
    case 'avif': await import('@jsquash/avif'); break;
    case 'jpeg': await import('@jsquash/jpeg'); break;
    case 'jxl':  await import('@jsquash/jxl');  break;
    case 'png':  await import('@jsquash/png');  break;
    case 'webp': await import('@jsquash/webp'); break;
  }
  // Cached in the Map to prevent duplicate initialization
  wasmModules.set(format, true);
}
```

This pattern ensures that a user compressing exclusively to WebP never downloads the libavif, libjxl, or OxiPNG modules. The Map-based cache guarantees that each module initializes exactly once per session, regardless of how many images pass through that codec.
Decode and Encode Pipeline
The processing layer abstracts codec-specific APIs behind two unified functions. The decode function accepts a raw ArrayBuffer and a source format identifier, dispatching to the appropriate jSquash decoder to produce a standard ImageData object. The encode function accepts that ImageData along with the target format and quality options, returning a compressed ArrayBuffer:
```typescript
export async function decode(
  sourceType: string,
  fileBuffer: ArrayBuffer
): Promise<ImageData> {
  await ensureWasmLoaded(sourceType as OutputType);
  // Dispatches to @jsquash/avif, @jsquash/jpeg,
  // @jsquash/jxl, @jsquash/png, or @jsquash/webp
  // Returns standard ImageData with an RGBA pixel buffer
}
```

The decoder handles format detection through a combination of MIME type inspection and file extension analysis, with special handling for JPEG XL files, whose MIME type is not yet universally recognized by browsers.
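The detection logic described above might look like the following sketch. The function name and mapping table are illustrative, not squish's actual API:

```typescript
// Illustrative source-format detection: MIME type lookup with a
// file-extension fallback for the JPEG XL case described above.
const MIME_TO_TYPE: Record<string, string> = {
  "image/avif": "avif",
  "image/jpeg": "jpeg",
  "image/jxl": "jxl",
  "image/png": "png",
  "image/webp": "webp",
};

function detectSourceType(fileName: string, mimeType: string): string | null {
  if (mimeType in MIME_TO_TYPE) return MIME_TO_TYPE[mimeType];
  // JPEG XL files often arrive with an empty or generic MIME type because
  // browsers do not yet report image/jxl; fall back to the extension.
  const ext = fileName.split(".").pop()?.toLowerCase();
  switch (ext) {
    case "jxl": return "jxl";
    case "jpg":
    case "jpeg": return "jpeg";
    case "avif": return "avif";
    case "png": return "png";
    case "webp": return "webp";
    default: return null;
  }
}
```

Returning `null` for unrecognized inputs lets the caller surface a descriptive error instead of handing an undecodable buffer to a codec.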
```typescript
export async function encode(
  outputType: OutputType,
  imageData: ImageData,
  options: CompressionOptions
): Promise<ArrayBuffer> {
  await ensureWasmLoaded(outputType);
  // AVIF: { quality, effort: 4 }
  // JPEG: { quality } via MozJPEG
  // JXL:  { quality }
  // PNG:  lossless, quality parameter ignored
  // WebP: { quality }
}
```

Each encoder receives a format-specific option structure. AVIF includes an effort parameter set to 4, which balances encoding speed against compression efficiency. PNG operates as strictly lossless through OxiPNG, ignoring the quality slider entirely. The remaining formats accept a quality parameter ranging from 1 to 100.
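The option mapping can be isolated in a small pure function, sketched below. The exact option shapes belong to each @jsquash package; treat these as illustrative defaults rather than the canonical API:

```typescript
type OutputType = "avif" | "jpeg" | "jxl" | "png" | "webp";

interface EncodeOptions { quality?: number; effort?: number }

// Illustrative per-format option mapping, mirroring the comments in the
// encode excerpt above.
function buildEncodeOptions(outputType: OutputType, quality: number): EncodeOptions {
  switch (outputType) {
    case "avif":
      return { quality, effort: 4 }; // effort 4: speed vs. density balance
    case "png":
      return {};                     // OxiPNG is lossless; quality ignored
    default:
      return { quality };            // MozJPEG, libjxl, libwebp: 1–100
  }
}
```

Keeping this mapping in one place means adding a sixth format touches a single switch rather than every call site.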
Vite Configuration for WebAssembly
The build system requires deliberate configuration to preserve the dynamic import semantics that WebAssembly modules depend upon. The jSquash packages are explicitly excluded from Vite's dependency optimization, which would otherwise attempt to pre-bundle them and break their WASM loading mechanism:
```typescript
export default defineConfig({
  plugins: [react()],
  optimizeDeps: {
    exclude: [
      '@jsquash/avif', '@jsquash/jpeg', '@jsquash/jxl',
      '@jsquash/png', '@jsquash/webp'
    ]
  },
  build: {
    target: 'esnext',
    rollupOptions: {
      output: { format: 'es', inlineDynamicImports: true }
    }
  }
});
```

The inlineDynamicImports: true setting folds every dynamic import into a single ES-module chunk, so the codec imports resolve correctly in the production bundle, while the esnext target preserves the modern JavaScript features that the WASM integration requires.
Parallel Queue Management
The most architecturally significant component of the application is the useImageQueue hook, which implements a bounded parallel processing system that prevents browser resource exhaustion during batch operations.
Concurrency Model
The queue enforces a hard limit of three concurrent compression operations. This threshold was chosen to balance throughput against memory pressure — WebAssembly codec instances consume substantial heap memory, and running more than three simultaneously risks triggering browser out-of-memory conditions on devices with limited resources.
```typescript
const MAX_PARALLEL_PROCESSING = 3;

const processingCount = useRef(0);
const processingImages = useRef(new Set<string>());
```

The use of useRef rather than useState for tracking processing state is a deliberate performance optimization. The processing count and the set of in-flight image IDs change at high frequency during batch operations, and storing them as React state would trigger re-renders on every status transition — an unnecessary overhead that degrades perceived responsiveness.
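Stripped of React, the slot discipline amounts to a bounded counter plus a duplicate guard. A framework-free sketch (class and method names are illustrative; the real hook keeps this state in refs):

```typescript
// Framework-free sketch of the bounded-slot discipline the hook enforces.
const MAX_PARALLEL_PROCESSING = 3;

class ProcessingSlots {
  private count = 0;
  private inFlight = new Set<string>();

  // Returns true only if a slot is free and this image is not already
  // being processed; keeps concurrency bounded and rejects duplicates.
  tryAcquire(id: string): boolean {
    if (this.count >= MAX_PARALLEL_PROCESSING || this.inFlight.has(id)) {
      return false;
    }
    this.count++;
    this.inFlight.add(id);
    return true;
  }

  // Frees the slot when an image finishes or fails.
  release(id: string): void {
    if (this.inFlight.delete(id)) this.count--;
  }
}
```

Releasing in a `finally`-style path is what keeps the queue draining even when a codec throws.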
Processing Lifecycle
Each image transitions through five discrete states: pending, queued, processing, complete, and error.
When the user drops files, the App.tsx component creates ImageFile objects with pending status and uses requestAnimationFrame to defer queue processing until the UI has rendered the pending state. This prevents the visual freeze that would occur if compression began synchronously during the drop handler. Each image then enters the queue, and the processNextInQueue function launches up to three parallel operations with 100ms stagger intervals to distribute the initial memory allocation spike.
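The stagger computation itself is a one-liner; a minimal sketch, assuming a hypothetical helper (squish schedules the actual launches with timers inside the hook):

```typescript
// Illustrative helper: compute launch offsets so that up to three slots
// start 100 ms apart, spreading the initial WASM heap-allocation spike.
const STAGGER_MS = 100;

function launchDelays(pendingCount: number, maxParallel = 3): number[] {
  const slots = Math.min(pendingCount, maxParallel);
  return Array.from({ length: slots }, (_, i) => i * STAGGER_MS);
}
```

Dropping five images would thus schedule three launches at 0 ms, 100 ms, and 200 ms, with the remaining two waiting for a slot to free.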
Image Processing Flow
Within each processing slot, the pipeline executes a deterministic sequence: read the file as an ArrayBuffer, detect the source format, validate that the buffer is non-empty and produces valid ImageData dimensions, decode through the appropriate WebAssembly codec, encode to the target format with the specified quality, create a Blob and corresponding Object URL for preview rendering, and finally update the image state to complete with compression statistics.
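The sequence above condenses into a testable sketch with decode and encode injected as parameters. The helper and the stats shape are illustrative; the real pipeline calls the jSquash-backed functions described earlier:

```typescript
// Illustrative per-slot pipeline: validate -> decode -> encode -> report.
interface PixelData { width: number; height: number; data: Uint8Array }

async function processOne(
  buffer: ArrayBuffer,
  decode: (b: ArrayBuffer) => Promise<PixelData>,
  encode: (img: PixelData) => Promise<ArrayBuffer>,
): Promise<{ originalSize: number; compressedSize: number; savedPct: number }> {
  if (buffer.byteLength === 0) throw new Error("Empty file");
  const img = await decode(buffer);
  if (img.width === 0 || img.height === 0) throw new Error("Invalid image data");
  const out = await encode(img);
  // Compression statistics reported back to the UI layer.
  const savedPct = Math.round((1 - out.byteLength / buffer.byteLength) * 100);
  return { originalSize: buffer.byteLength, compressedSize: out.byteLength, savedPct };
}
```

Injecting the codec functions keeps the control flow (validation, error propagation, statistics) testable without loading any WASM.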
Error handling wraps the entire sequence in a try-catch that distinguishes between empty files, invalid image data, and codec failures, propagating descriptive error messages to the UI where they render alongside an AlertCircle icon in the image list.
Presentation Components
The UI layer comprises four purpose-built components, each responsible for a single interaction concern.
State Management Strategy
The application's state architecture reflects a deliberate trade-off between React's declarative model and the imperative performance requirements of high-throughput image processing.
The primary image array and compression options reside in useState hooks within App.tsx, triggering re-renders when images are added, removed, or updated — operations that the user should see reflected immediately in the UI. The processing internals, however — the concurrent operation counter, the set of in-flight image IDs, and the queue array management — employ useRef to avoid re-render cascading during rapid state transitions that occur entirely outside the user's perception.
This hybrid approach ensures that the UI remains responsive during batch operations involving dozens of images while the queue manager operates at full throughput without the overhead of React's reconciliation cycle on every status change.
Technology Stack
| Layer | Technology | Role |
|---|---|---|
| Framework | React 18.3 + TypeScript 5.5 | UI rendering with strict type safety |
| Build | Vite 5.4 | Development server, production bundling |
| Styling | Tailwind CSS 3.4 + shadcn/ui | Utility-first styling with Radix primitives |
| Icons | Lucide React | Consistent iconography across the interface |
| Theme | next-themes | System-aware dark mode with class-based toggling |
| AVIF Codec | @jsquash/avif — libavif | Advanced lossy/lossless image compression |
| JPEG Codec | @jsquash/jpeg — MozJPEG | Optimized JPEG encoding from Mozilla |
| JXL Codec | @jsquash/jxl — libjxl | Reference implementation of the next-generation JPEG XL format |
| PNG Codec | @jsquash/png — OxiPNG | Lossless PNG optimization in Rust via WASM |
| WebP Codec | @jsquash/webp — libwebp | Google's modern image format for the web |
Open Source and Community
squish is an open-source project published under the MIT license, reflecting the conviction that developer tools should be transparent, auditable, and freely adoptable without licensing friction. The decision to process all images exclusively on the client is not merely a technical optimization but a statement of principle: user data belongs to the user, and compression tooling should not require surrendering that data to external servers.
The value this project contributes to the community extends beyond the utility of compressing images in the browser. The WebAssembly codec integration pattern — lazy-loading production-grade C/C++ implementations through dynamic imports with proper Vite configuration — constitutes a reusable reference for any application that needs to bring native-performance computation to the browser. The bounded parallel queue with ref-based state management demonstrates how to handle high-throughput processing in React without sacrificing UI responsiveness. The four-layer architecture provides a clean template for building tools that combine complex processing logic with intuitive user interfaces.
The repository is available at github.com/geo-mena/squish, and the live platform at squish.tofi.pro, where every architectural decision documented in this article can be verified directly in the source code. Contributions — whether in the form of new codec integrations, performance optimizations, accessibility improvements, or additional output format support — are welcome and represent exactly the kind of collaboration that strengthens the ecosystem of privacy-respecting developer tools.