sharp Image Optimization — Automate Web Image Compression With One Command

If your website is slow, images are the first suspect. They typically account for 50-80% of total page weight. You don't need Photoshop — the sharp library handles WebP/AVIF conversion, resizing, and batch processing from the terminal. This guide covers everything from a basic one-file script to a full build pipeline integration that processes 200 images in 30 seconds.

What Is sharp?

sharp is a high-performance image processing library for Node.js. It uses libvips under the hood, making it 4-5x faster than ImageMagick. It's also the engine behind Next.js's built-in image optimization.

# Install
npm install sharp

# Or with other package managers
yarn add sharp
pnpm add sharp

Key features:

  • libvips-based: 4-5x faster than ImageMagick with lower memory usage
  • Format support: WebP, AVIF, PNG, JPEG, TIFF, GIF input and output
  • Transform operations: Resize, crop, rotate, blur, sharpen, composite
  • Streaming API: Memory-efficient processing for large images
  • Async/await: Modern Promise-based API
  • Next.js integration: Official image optimization engine for Next.js

WebP vs AVIF: Which Format to Choose

Both formats compress better than JPEG, but they have different trade-offs.

| Factor | WebP | AVIF | JPEG |
|---|---|---|---|
| Compression (same quality) | 25-35% smaller | 40-50% smaller | Baseline |
| Encoding speed | Fast | Slow (5-10x) | Very fast |
| Browser support | 97%+ | 92%+ | 100% |
| Transparency | Yes | Yes | No |
| Animation | Yes | Yes | No |
| Best for | General use (optimal now) | Static images, max compression | Fallback |

Practical recommendation: Use WebP as your primary format and adopt AVIF gradually. AVIF has better compression, but encoding is 5-10x slower and older Safari versions lack support. For build-time-sensitive projects, WebP alone is sufficient.
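In practice, "WebP primary, AVIF gradually" means generating both files and serving whichever the browser advertises in its Accept header. A minimal sketch of that decision (the helper name and the header check are our own, not part of sharp):

```javascript
// Hypothetical server-side helper: pick the best format the browser
// accepts. Browsers that support AVIF/WebP advertise it in the Accept
// request header (e.g. "image/avif,image/webp,*/*").
function pickFormat(acceptHeader = '') {
  if (acceptHeader.includes('image/avif')) return 'avif';
  if (acceptHeader.includes('image/webp')) return 'webp';
  return 'jpeg'; // universal fallback
}

console.log(pickFormat('image/avif,image/webp,*/*')); // avif
console.log(pickFormat('image/webp,*/*'));            // webp
console.log(pickFormat('*/*'));                       // jpeg
```

For static sites without a server, the `<picture>` element achieves the same fallback chain declaratively.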

Basic Compression Script

Start with a simple script that converts a single image to both WebP and AVIF:

// optimize.mjs
import sharp from 'sharp';
import path from 'path';

async function optimizeImage(inputPath) {
  const dir = path.dirname(inputPath);
  const name = path.basename(inputPath, path.extname(inputPath));

  // WebP conversion (quality 80, resize to max 1920px)
  await sharp(inputPath)
    .resize({ width: 1920, withoutEnlargement: true })
    .webp({ quality: 80 })
    .toFile(path.join(dir, `${name}.webp`));

  // AVIF conversion (quality 65, higher compression)
  await sharp(inputPath)
    .resize({ width: 1920, withoutEnlargement: true })
    .avif({ quality: 65 })
    .toFile(path.join(dir, `${name}.avif`));

  console.log(`Done: ${name}`);
}

// Usage: node optimize.mjs ./images/photo.jpg
optimizeImage(process.argv[2]).catch(err => {
  console.error(err.message);
  process.exit(1);
});

Example output:

$ node optimize.mjs ./public/images/hero.jpg

Original:  hero.jpg   2.4 MB
WebP:      hero.webp  320 KB  (87% reduction)
AVIF:      hero.avif  210 KB  (91% reduction)

The withoutEnlargement: true option is essential — it prevents upscaling images smaller than the target width. Enlarging an 800px icon to 1920px only increases file size without improving quality.
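The rule `withoutEnlargement` enforces is equivalent to capping the requested width at the source width, which a plain function makes explicit:

```javascript
// What withoutEnlargement amounts to: never exceed the source width.
function cappedWidth(sourceWidth, requestedWidth) {
  return Math.min(sourceWidth, requestedWidth);
}

console.log(cappedWidth(800, 1920));  // 800  — small image left alone
console.log(cappedWidth(4000, 1920)); // 1920 — large image downscaled
```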

Batch Processing Automation

Processing images one by one is fine for a few files, but for dozens or hundreds you need batch automation. This script processes every image in a folder:

// batch-optimize.mjs
import sharp from 'sharp';
import fs from 'fs';
import path from 'path';

const SUPPORTED = ['.jpg', '.jpeg', '.png', '.tiff'];

async function batchOptimize(inputDir, outputDir) {
  const files = fs.readdirSync(inputDir)
    .filter(f => SUPPORTED.includes(path.extname(f).toLowerCase()));

  console.log(`Processing ${files.length} files...`);
  fs.mkdirSync(outputDir, { recursive: true });

  let totalSaved = 0;
  for (const file of files) {
    const inputPath = path.join(inputDir, file);
    const name = path.basename(file, path.extname(file));
    const originalSize = fs.statSync(inputPath).size;

    await sharp(inputPath)
      .resize({ width: 1920, withoutEnlargement: true })
      .webp({ quality: 80 })
      .toFile(path.join(outputDir, `${name}.webp`));

    const newSize = fs.statSync(path.join(outputDir, `${name}.webp`)).size;
    const saved = originalSize - newSize;
    totalSaved += saved;

    console.log(`  ${file} → ${name}.webp (${(saved/1024).toFixed(0)}KB saved)`);
  }

  console.log(`\nTotal saved: ${(totalSaved/1024/1024).toFixed(1)}MB`);
}

batchOptimize('./public/images', './public/images/optimized');

Performance numbers from our testing: 200 images processed in 30 seconds (1920px WebP, sequential processing), with an average compression ratio of 85-90% from JPEG to WebP, using approximately 100MB of memory.
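The 30-second figure is for sequential processing. Because libvips does its encoding work off the JavaScript thread, running a few files in parallel can cut wall-clock time substantially. A minimal concurrency limiter (the helper below is our own sketch, not a sharp API):

```javascript
// Process items with at most `limit` tasks in flight. Useful for wrapping
// the per-file sharp pipeline without spawning hundreds of encoders at once.
async function mapWithConcurrency(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;
  // Each runner pulls the next unclaimed index until the list is exhausted.
  async function run() {
    while (next < items.length) {
      const i = next++;
      results[i] = await worker(items[i], i);
    }
  }
  const runners = Array.from({ length: Math.min(limit, items.length) }, run);
  await Promise.all(runners);
  return results;
}

// In the batch script, the loop body would become the worker, e.g.:
// await mapWithConcurrency(files, 4, file => optimizeOne(file));
```

A limit around the number of CPU cores is a reasonable starting point; going much higher mostly increases memory usage.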

Quality Tuning by Image Type

Different image types need different quality settings. Pushing quality too low creates visible artifacts, while too high negates compression benefits.

| Image Type | WebP Quality | AVIF Quality | Max Width |
|---|---|---|---|
| Hero/banner | 85 | 70 | 1920px |
| Blog thumbnail | 80 | 65 | 800px |
| Card image | 75 | 60 | 600px |
| Icon/logo | 90 | 80 | Original |
| Gallery/portfolio | 85 | 70 | 1440px |
| OG image | 80 | 65 | 1200px |

// profiles.mjs
import sharp from 'sharp';

const profiles = {
  hero:      { width: 1920, webpQ: 85, avifQ: 70 },
  thumbnail: { width: 800,  webpQ: 80, avifQ: 65 },
  card:      { width: 600,  webpQ: 75, avifQ: 60 },
  icon:      { width: null, webpQ: 90, avifQ: 80 },
};

async function optimize(input, profile = 'thumbnail') {
  const { width, webpQ, avifQ } = profiles[profile];
  const pipeline = sharp(input);

  if (width) pipeline.resize({ width, withoutEnlargement: true });

  await pipeline.clone().webp({ quality: webpQ })
    .toFile(input.replace(/\.[^.]+$/, '.webp'));
  await pipeline.clone().avif({ quality: avifQ })
    .toFile(input.replace(/\.[^.]+$/, '.avif'));
}

Tuning tip: Below quality 70, text in images starts showing visible blur. Photos can tolerate quality 60, but screenshots and diagrams should stay at 80+.
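Rather than passing a profile by hand for every file, one option is inferring it from a directory convention. The folder names below (`hero/`, `thumbs/`, `icons/`) are assumptions for this example, not anything sharp prescribes:

```javascript
// Hypothetical convention: map a file's path to a quality profile name.
function chooseProfile(filePath) {
  if (filePath.includes('/hero/')) return 'hero';
  if (filePath.includes('/thumbs/')) return 'thumbnail';
  if (filePath.includes('/icons/')) return 'icon';
  return 'card'; // conservative default for everything else
}

console.log(chooseProfile('public/images/hero/landing.jpg'));  // hero
console.log(chooseProfile('public/images/thumbs/post-1.png')); // thumbnail
```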

Build Pipeline Integration

Running the optimization script manually means eventually forgetting to run it. Integrate it into your build process so it runs automatically:

// package.json
{
  "scripts": {
    "optimize:images": "node scripts/batch-optimize.mjs",
    "prebuild": "npm run optimize:images",
    "build": "next build"
  }
}

Considerations for build integration:

  • Skip already-optimized images: Add a cache check (compare timestamps or hashes) to avoid redundant processing
  • Keep originals separate: Store source images in a raw folder to prevent accidental overwrites
  • CI/CD compatibility: Verify sharp's native module builds correctly in your CI environment (Alpine Linux may need additional packages)
  • Git strategy: Decide whether optimized images go in .gitignore (regenerated on build) or are committed (faster deploys)
  • Monitor build time: AVIF encoding is slow — consider WebP-only for CI if build time is critical

For Next.js projects specifically: the <Image> component handles runtime optimization, but pre-optimizing at build time reduces runtime server load and improves initial load speed. This is especially impactful for statically generated (SSG) sites.

Summary

  • sharp is the fastest Node.js image library — 4-5x faster than ImageMagick, used by Next.js internally
  • Use WebP as your primary format and adopt AVIF gradually for maximum compression
  • Batch processing handles 200 images in 30 seconds — eliminates manual optimization entirely
  • Tune quality by image type: hero 85, thumbnail 80, card 75 for WebP
  • Integrate into your build pipeline with a prebuild script so optimization never gets forgotten
  • Always use withoutEnlargement: true to prevent upscaling smaller images

Image optimization is the highest-impact performance improvement for most websites. With sharp, a single script replaces hours of manual Photoshop work and consistently delivers 85-90% file size reductions across your entire image library.