
Unlocking Speed: 10 Fastify Tips to Supercharge Your API

Practical, battle-tested tips to speed up your Fastify APIs - from schema-driven serialization and smart plugin use to caching, streaming, and measurement workflows that reveal the real bottlenecks.

By the end of this post you’ll know 10 concrete ways to squeeze more throughput and lower latency from your Fastify APIs. You’ll be able to prioritize changes, measure improvements, and deploy high‑performance endpoints that scale.

Fastify is already fast. But speed is not automatic - it’s deliberate. Below are 10 focused, practical tips you can apply now, each with why it matters, how to implement it, and what to watch out for.

1) Make schema your default (and reuse schemas)

Outcome: Faster validation and automatic, optimized serialization.

Why: Fastify compiles JSON schemas (via Ajv) and uses fast-json-stringify for response serialization when you provide route schemas. Compiled serializers are orders of magnitude faster than naive JSON.stringify on complex objects.

How:

// register Fastify with Ajv options if you need custom behavior
const fastify = require('fastify')({
  ajv: {
    customOptions: { removeAdditional: true },
  },
});

const userSchema = {
  $id: 'user',
  type: 'object',
  properties: {
    id: { type: 'integer' },
    name: { type: 'string' },
  },
};

fastify.addSchema(userSchema);

fastify.get(
  '/user/:id',
  {
    schema: {
      params: {
        type: 'object',
        properties: { id: { type: 'integer' } },
        required: ['id'],
      },
      response: { 200: { $ref: 'user#' } },
    },
  },
  async (request, reply) => {
    return { id: 1, name: 'Ada' };
  }
);

What to watch: Don’t generate schemas dynamically per request. Add them once and reuse via addSchema/$id to maximize the compile cache.

References: Fastify schema docs.

2) Use fast-json-stringify indirectly (via schemas) or directly when needed

Outcome: Much faster JSON serialization for large or frequent responses.

Why: fast-json-stringify compiles a serializer for a schema so stringification is optimized. Fastify uses it automatically when you provide response schemas - but you can also compile serializers for special cases.

How (direct compile):

const fastJson = require('fast-json-stringify');
const stringify = fastJson({
  type: 'object',
  properties: { id: { type: 'integer' }, name: { type: 'string' } },
});

// use stringify(obj) in hot codepaths
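To see why compiled serializers win, here is a toy sketch of the idea (not fast-json-stringify's actual implementation): the property names, order, and types are fixed once at compile time, so each call skips the per-object introspection that generic JSON.stringify must do.

```javascript
// Toy illustration of schema-compiled serialization: build the
// serializer function once from a schema, then reuse it per call.
function compileSerializer(schema) {
  // One pre-built code path per known property; no runtime introspection.
  const parts = Object.entries(schema.properties).map(([key, def]) => {
    if (def.type === 'string') {
      return (obj) => `"${key}":${JSON.stringify(obj[key])}`;
    }
    return (obj) => `"${key}":${obj[key]}`;
  });
  return (obj) => `{${parts.map((p) => p(obj)).join(',')}}`;
}

const stringifyUser = compileSerializer({
  type: 'object',
  properties: { id: { type: 'integer' }, name: { type: 'string' } },
});

stringifyUser({ id: 1, name: 'Ada' }); // '{"id":1,"name":"Ada"}'
```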

What to watch: Keep the schema accurate - properties missing from the schema are silently dropped from the output, and type mismatches can produce invalid JSON or runtime errors.

Reference: fast-json-stringify.

3) Control logging - use Pino and keep it async/low-volume in production

Outcome: Avoid IO-blocking log overhead while keeping structured logs.

Why: Synchronous or very verbose logging kills throughput. Fastify ships with Pino integration; Pino is very fast but you still need to set appropriate levels and avoid expensive serializers in hot paths.

How:

const fastify = require('fastify')({
  logger: {
    level: process.env.NODE_ENV === 'production' ? 'info' : 'debug',
    // for pretty dev output, use the pino-pretty transport; recent
    // Pino versions removed the old prettyPrint option
  },
});

What to watch: Avoid logging entire request/response bodies. Use sampling or debug-only verbose logs.

Reference: Pino docs.

4) Avoid blocking work - use streaming and non-blocking I/O

Outcome: Lower memory pressure and faster response start times.

Why: Buffering large files or doing CPU-bound work on the event loop blocks other requests.

How:

  • Stream file responses: reply.send(fs.createReadStream(path))
  • Stream database cursors where supported
  • Offload CPU-heavy tasks to worker threads or separate services

Example streaming:

const fs = require('fs');

fastify.get('/big', async (req, reply) => {
  const stream = fs.createReadStream('/var/data/huge.json');
  // return the reply so Fastify knows the response is being handled
  return reply.type('application/json').send(stream);
});

What to watch: Ensure proper backpressure handling and correct MIME types.

5) Use connection pooling and share DB clients via decoration

Outcome: Reduced latency for database access and fewer resource leaks.

Why: Creating a new DB connection per request is expensive. Create a pool once, share it across requests, and reuse clients.

How:

const fastifyPlugin = require('fastify-plugin');
const { Pool } = require('pg');

async function dbConnector(fastify, opts) {
  const pool = new Pool({ connectionString: process.env.DATABASE_URL });
  fastify.decorate('db', pool);
}

module.exports = fastifyPlugin(dbConnector);

What to watch: Close pools cleanly on shutdown, and create the pool once at startup - never per request or per repeated plugin registration.
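Closing cleanly can be as simple as wiring the pool's end() into the shutdown path. A minimal sketch - the stub signatures below match pg.Pool's real `end()`, and in Fastify you would typically do this in an onClose hook inside the same plugin that created the pool:

```javascript
// Sketch: close the pool exactly once when the process is told to stop.
// `pool.end()` is the real pg.Pool API; the rest is generic Node.
function registerShutdown(pool, proc = process) {
  const close = async () => {
    await pool.end(); // drain in-flight queries, close all connections
    proc.exit(0);
  };
  proc.once('SIGTERM', close); // e.g. Kubernetes pod termination
  proc.once('SIGINT', close);  // Ctrl+C in development
  return close; // returned so it can also be invoked manually
}
```

Inside a Fastify plugin, `fastify.addHook('onClose', () => pool.end())` achieves the same when the server is stopped via `fastify.close()`.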

6) Register plugins correctly - leverage encapsulation and fastify-plugin

Outcome: Predictable lifecycle and fewer accidental duplicates.

Why: Fastify uses encapsulation: plugins can isolate their decorations. Use fastify-plugin to share decorators across the app intentionally.

How:

  • Use @fastify/autoload (the current name of fastify-autoload) to structure and load plugins.
  • Use fastify-plugin when you want a plugin to decorate the root instance.

What to watch: Avoid registering the same plugin multiple times inadvertently (costly duplicate work).

References: fastify-plugin, @fastify/autoload.

7) Compress responses selectively and use caching

Outcome: Lower bandwidth and faster perceived client performance.

Why: Compression reduces bytes over the network but is CPU work; cache responses for repeated requests.

How:

// compress plugin (published as @fastify/compress in current versions)
await fastify.register(require('@fastify/compress'), { global: false });

// caching plugin (published as @fastify/caching in current versions)
await fastify.register(require('@fastify/caching'));

Then enable compression and caching only where they help, and set a sensible TTL for cache entries.
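To make the TTL idea concrete, here is a toy in-memory TTL cache - caching plugins do this (plus the HTTP header handling) for you, so treat this only as an illustration of the mechanism:

```javascript
// Toy in-memory TTL cache: entries expire ttlMs after being set.
function createTtlCache(ttlMs) {
  const entries = new Map();
  return {
    get(key) {
      const hit = entries.get(key);
      if (!hit) return undefined;
      if (Date.now() > hit.expires) { // stale: evict and report a miss
        entries.delete(key);
        return undefined;
      }
      return hit.value;
    },
    set(key, value) {
      entries.set(key, { value, expires: Date.now() + ttlMs });
    },
  };
}

const cache = createTtlCache(5000);
cache.set('/user/1', { id: 1, name: 'Ada' });
cache.get('/user/1'); // { id: 1, name: 'Ada' } until the 5s TTL lapses
```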

References: @fastify/compress, @fastify/caching.

8) Prefer route-level hooks and minimal global hooks

Outcome: Lower per-request overhead.

Why: Global hooks run for every route. If only a subset of routes needs a hook (e.g., auth), attach the hook at route-level to reduce unnecessary work.

How:

fastify.route({
  method: 'GET',
  url: '/private',
  preHandler: authMiddleware,
  handler: handler,
});

What to watch: Keep hook logic light; heavy operations should be async and cached when possible.
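One way to keep an auth hook light is to memoize verification results. A sketch - `verifyToken` is a hypothetical (possibly slow) async check standing in for whatever your auth backend does:

```javascript
// Sketch: cache token-verification results so the route-level hook
// only takes the slow path once per TTL window per token.
function memoizeAuth(verifyToken, ttlMs = 60_000) {
  const cache = new Map();
  return async function cachedVerify(token) {
    const hit = cache.get(token);
    if (hit && Date.now() < hit.expires) return hit.user; // fast path
    const user = await verifyToken(token); // slow path, once per TTL
    cache.set(token, { user, expires: Date.now() + ttlMs });
    return user;
  };
}

// Usage as a route-level preHandler (sketch):
// preHandler: async (req) => {
//   req.user = await cachedVerify(req.headers.authorization);
// }
```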

9) Use HTTP/2 and multi-process strategies (when appropriate)

Outcome: Better throughput and resource utilization on multi-core machines; HTTP/2 can reduce connection overhead.

Why: Node runs on a single core per process. Use clustering (pm2, Node cluster) or container orchestrators to run multiple instances. HTTP/2 adds multiplexing and header compression.

How:

  • For multi-core: use a process manager like pm2 or a Kubernetes Deployment with multiple replicas.
  • For HTTP/2 with Fastify: enable http2 in the server options and provide TLS credentials.

const fs = require('fs');
const fastify = require('fastify')({
  http2: true,
  https: { key: fs.readFileSync('key.pem'), cert: fs.readFileSync('cert.pem') },
});

What to watch: HTTP/2 benefits depend on your API patterns and TLS overhead. Test before rolling out.

10) Benchmark, profile, and make schema-driven serialization your standard - measurement-first

Outcome: The single biggest win is knowing where to spend effort and adopting schema-driven serialization and validation as a default pattern.

Why: Performance work without measurement is guesswork. Benchmarks expose hot endpoints, and profiling shows CPU/GC/memory hotspots. Schema-driven serialization (Fastify + Ajv + fast-json-stringify) is one of the fastest ways to reduce per-request CPU.

How:

  • Benchmark with load tools: autocannon or wrk. Example:
npx autocannon -c 100 -d 20 http://localhost:3000/user/1
  • Profile with Node’s profiler or Clinic.js to find CPU/Garbage-Collection hotspots.
  • Apply schema-first for routes, then re-run benchmarks and compare.
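When comparing before/after runs, summarize the latency samples rather than eyeballing raw numbers. A small helper (nearest-rank percentile, stdlib only):

```javascript
// Nearest-rank percentile over a list of latency samples (in ms):
// sort ascending, then take the observation at the p-th percentile rank.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const idx = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, idx)];
}

const latencies = [12, 15, 11, 90, 14, 13, 16, 12, 11, 250];
percentile(latencies, 50); // 13
percentile(latencies, 99); // 250
```

Tail percentiles (p95/p99) usually move long before the median does, which is why they are the numbers worth tracking across benchmark runs.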

What to watch: Track real-world workloads - synthetic benchmarks are guides, not gospel.

References: autocannon, Clinic.js.


Final word: start with measurement, adopt schema-driven validation + serialization as your default pattern, and remove blocking work. Do those three consistently and you’ll unlock the largest speed gains. Fastify gives you the tools - use them deliberately.
