Building Nano Libraries with AI

The npm ecosystem is full of bloated dependencies. A simple rate limiter might pull in dozens of transitive dependencies. This post tests a thesis: can AI write something better?

The Problem

Take rate limiting as an example: every popular package comes with its own tradeoffs, and its own baggage.

The Hypothesis

Modern AI can understand both the problem domain and the desired API shape. It can write focused code without the cruft that accumulates over years of maintenance.

The Result: nano-limit

200 lines. Zero dependencies. Concurrency limits, rate limits, per-key limits, priorities.

import { createLimit } from "@npclfg/nano-limit";

const limit = createLimit({
  concurrency: 5,
  rate: 60,
  interval: 60000,
});

await limit.acquire();
try {
  // do work
} finally {
  limit.release(); // release even if the work throws
}

That's it.
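The acquire/release pair above is, at its core, a counting semaphore. To show how little machinery that actually requires, here is a minimal sketch of that idea; the class and its internals are illustrative, not nano-limit's actual source:

```typescript
// Minimal counting semaphore: the core of a concurrency limiter.
// Illustrative sketch only -- not nano-limit's implementation.
class Semaphore {
  private available: number;
  private waiters: Array<() => void> = [];

  constructor(concurrency: number) {
    this.available = concurrency;
  }

  // Resolves immediately if a slot is free, otherwise queues the caller.
  acquire(): Promise<void> {
    if (this.available > 0) {
      this.available--;
      return Promise.resolve();
    }
    return new Promise((resolve) => this.waiters.push(resolve));
  }

  // Hands the freed slot to the next waiter, or returns it to the pool.
  release(): void {
    const next = this.waiters.shift();
    if (next) next();
    else this.available++;
  }
}
```

At most `concurrency` callers ever sit between `acquire()` and `release()` at once; everyone else waits in a FIFO queue.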

What Makes It Different

  1. No dependencies - Nothing to audit, nothing to break
  2. TypeScript native - Full inference, no @types needed
  3. Modern patterns - AbortSignal, async/await, no callbacks
  4. Feature complete - Priorities, rate limiting, per-key limits
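The `rate`/`interval` pair from the config earlier maps naturally onto a sliding-window check: allow at most `rate` acquisitions in any `interval`-millisecond window. Again, a sketch of that bookkeeping under assumed names, not the library's code:

```typescript
// Sliding-window rate check: at most `rate` calls per `interval` ms.
// Illustrative sketch only -- not nano-limit's implementation.
function createRateWindow(rate: number, interval: number) {
  const timestamps: number[] = [];

  return {
    // Returns true and records the call if we are under the limit.
    tryAcquire(now: number = Date.now()): boolean {
      // Drop timestamps that have aged out of the window.
      while (timestamps.length > 0 && now - timestamps[0] >= interval) {
        timestamps.shift();
      }
      if (timestamps.length >= rate) return false;
      timestamps.push(now);
      return true;
    },
  };
}
```

With `rate: 60, interval: 60000` this is "60 per minute", measured over a rolling window rather than a fixed clock minute.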

The Bigger Picture

This isn't about one library. It's about a new approach to building tools.

The old way: accumulate features, accumulate dependencies, accumulate complexity.

The new way: write exactly what you need, keep it small, keep it focused.

AI enables this by reducing the cost of writing from scratch. When you can generate a clean implementation in minutes, you don't need to inherit someone else's baggage.

Try It

npm install @npclfg/nano-limit

Or check out the full nano-* collection.