TypeScript Optional Parameter: My Hands-On Take

I’m Kayla, and I write a lot of TypeScript. I use optional parameters every week. Some days they save my bacon. Other days… they bite. Here’s my real take, with code I’ve shipped.

If you want an even deeper dive into how I think about this feature, you can skim my standalone deep-dive on optional parameters.

So, what’s an optional parameter?

It’s a function parameter you can skip. You mark it with a question mark. (The official TypeScript Handbook lays out the formal definition and rules if you want the canonical wording.)

For a broader set of practical TypeScript tips, I highly recommend the tutorials over at Improving Code.

function greet(name?: string) {
  if (name) {
    return `Hi, ${name}!`;
  }
  return "Hi!";
}

greet();        // "Hi!"
greet("Maya");  // "Hi, Maya!"

Simple, right? It feels like a tiny switch. On or off. Your call.
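Under the hood, that `?` just widens the parameter's type to `string | undefined` inside the function, which is why the `if` check works as a narrowing step. A tiny sketch (my own toy function, not from the handbook):

```typescript
function shout(name?: string): string {
  // in here, name has type: string | undefined
  if (name === undefined) {
    return "HEY!";
  }
  // after the check, TypeScript narrows name to plain string
  return `HEY, ${name.toUpperCase()}!`;
}

console.log(shout());       // "HEY!"
console.log(shout("maya")); // "HEY, MAYA!"
```

Same shape as greet, just with the narrowing spelled out.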

Where it helped me at work

I built a small pricing tool. Sometimes we pass a tax rate. Sometimes we don’t. I liked that the function stayed short and clean.

function addTax(amount: number, taxRate?: number) {
  const rate = taxRate ?? 0.07; // use 7% if not given
  return amount * (1 + rate);
}

addTax(100);      // 107
addTax(100, 0.2); // 120
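One detail worth calling out in that `??` line: a legit tax rate of 0 survives `??`, but the old `||` trick would clobber it. A sketch with two hypothetical variants:

```typescript
function addTaxNullish(amount: number, taxRate?: number): number {
  // ?? only falls back on undefined/null, so an explicit 0 is respected
  return amount * (1 + (taxRate ?? 0.07));
}

function addTaxOr(amount: number, taxRate?: number): number {
  // || falls back on ANY falsy value, so 0 silently becomes 7%
  return amount * (1 + (taxRate || 0.07));
}

console.log(addTaxNullish(100, 0)); // 100 (tax-free stays tax-free)
console.log(addTaxOr(100, 0));      // ~107 (the zero got swallowed)
```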

I also had a fetch wrapper with an optional callback. When our team did quick tests, we skipped the callback. When we ran full flows, we used it.

async function fetchJson(url: string, onDone?: (data: unknown) => void) {
  const res = await fetch(url);
  const data = await res.json();
  onDone?.(data); // call only if it exists
  return data;
}

In a React app, I pass an optional config to a helper. It keeps the prop code readable.

type ToastConfig = {
  delayMs?: number;
  kind?: "success" | "error" | "info";
};

function showToast(message: string, config?: ToastConfig) {
  const delay = config?.delayMs ?? 1500;
  const kind = config?.kind ?? "info";
  console.log(`[${kind}] ${message}`);
  setTimeout(() => { /* hide the toast after the delay */ }, delay);
}

You know what? It feels natural. Like saying, “Bring it if you have it. If not, we’re fine.”

The little twist with defaults

Here’s the thing: “optional” is not the same as a default value. You can have both, but they behave a bit differently.

// Optional, no default
function ping(timeoutMs?: number) {
  const t = timeoutMs ?? 3000;
  return `Waiting ${t}ms...`;
}

// Not optional, but has a default
function pong(timeoutMs: number = 3000) {
  return `Waiting ${timeoutMs}ms...`;
}

ping();      // okay, uses 3000
pong();      // also okay, uses 3000

ping(undefined); // also okay
pong(undefined); // okay, still 3000 (default kicks in)

I use defaults when I always want a value inside. I use optional when “missing” is part of the meaning.

A tiny pitfall that got me

One time I put an optional parameter before a required one. TypeScript yelled, and it was right.

// Don't do this
function bad(a?: number, b: number) {} // Error

// Do this
function good(a: number, b?: number) {}

Required ones go first. Keep the shape clear.
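If I genuinely want the skippable value first, the usual escape hatch is a required `| undefined` union instead of `?`; callers then have to write `undefined` out loud. A sketch (the function name is invented):

```typescript
// required position, but the value itself may be undefined
function scale(factor: number | undefined, value: number): number {
  return (factor ?? 1) * value;
}

console.log(scale(undefined, 10)); // 10 (noisy at the call site, but legal)
console.log(scale(2, 10));         // 20
// scale(10) would NOT compile: both arguments are required
```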

Another gotcha: undefined vs. empty

Optional means the param can be missing or undefined. But that’s not the same as an empty string or zero. I tripped on this once in a search bar.

function search(query?: string) {
  if (!query) {
    // Careful: this runs for "" and undefined (and null, if you union it in)
    return "No search. Try typing.";
  }
  return `You searched: ${query}`;
}

I learned to check the exact thing I care about.

function search2(query?: string) {
  if (query === undefined) {
    return "No search. Try typing.";
  }
  if (query.trim() === "") {
    return "Empty search. Please add text.";
  }
  return `You searched: ${query}`;
}

When you really mean “trust me, it’s defined,” you might reach for the exclamation mark !—I rant about that in this short piece. (If you want another quick primer with examples, the GeeksforGeeks guide on TypeScript optional parameters is a handy read.)

Feels a bit fussy, but it avoids weird bugs.

Optional with objects (and a small rant)

I love optional fields on config types. But you have to think about defaults.

type EmailOptions = {
  cc?: string[];
  bcc?: string[];
  track?: boolean;
};

function sendEmail(to: string, opts?: EmailOptions) {
  const cc = opts?.cc ?? [];
  const bcc = opts?.bcc ?? [];
  const track = opts?.track ?? true;
  // send it...
}

One day I forgot a safe default. A field came through as undefined. The mailer blew up. Logs were not cute. Now I always set a default inside the function.
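These days I sometimes fold all those `??` lines into one spread, so a forgotten default can't slip through; `Required<...>` makes the compiler verify that every field got one. A sketch (`withDefaults` is my own helper name, and note that a caller passing an explicit `{ cc: undefined }` would still sneak past it at runtime):

```typescript
type EmailOptions = {
  cc?: string[];
  bcc?: string[];
  track?: boolean;
};

// defaults written once; the caller's values win via the spread
const withDefaults = (opts?: EmailOptions): Required<EmailOptions> => ({
  cc: [],
  bcc: [],
  track: true,
  ...opts,
});

const o = withDefaults({ track: false });
console.log(o.track); // false
console.log(o.cc);    // [] (safe default, the mailer stays calm)
```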

Destructuring with optional parameters

This one feels neat but can be tricky to read. I used it in a CLI tool.

type Flags = {
  silent?: boolean;
  retry?: number;
};

function runTask({ silent = false, retry = 2 }: Flags = {}) {
  if (!silent) console.log("Running...");
  // ...
}

Here the whole object is optional, and each field is optional, with defaults. It looks tidy. But if your team is new to TS, leave a comment. Be kind to future you. That destructuring style feels close to having named constructor arguments in plain functions.


Overloads vs. a single optional

I used overloads when I wanted strict shapes for two call styles. It felt safer.

// overloaded
function readFile(path: string): Promise<string>;
function readFile(path: string, asJson: true): Promise<unknown>;
function readFile(path: string, asJson?: boolean) {
  // ...
}

// call sites
await readFile("data.txt");       // string
await readFile("data.json", true); // unknown (JSON)

If you’re curious about why I let the path stay a raw string instead of a branded type, I unpack that decision in my file-path argument deep dive.

If you only need “maybe there, maybe not,” a single optional is fine. If the meaning changes a lot, overloads read better.

Quick checks I use before I ship

  • Will missing mean something clear? If yes, use optional. If not, use a default.
  • Are defaults set inside the function? Future me will forget, so I add them.
  • Did I keep optional params after required ones? Yes? Good.
  • Am I checking for undefined when I mean “not given”? I try to be exact.
  • Is my team comfy with the syntax? If not, I write a short doc string.

Real bugs I hit (and fixed)

  • I treated 0 like “not given.” A sale price of 0 got skipped. Ouch. I switched to a strict check: value === undefined.
  • I forgot the question mark on a callback. Callers passed nothing. It crashed. I added the ? and used onDone?.() safely.
  • I used null in one place and undefined in another. Type got messy. I picked one and stuck with it. Peace returned.
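That first bullet (treating 0 like "not given") is worth two lines of code, because the broken version reads so innocently. A before/after sketch with made-up pricing functions:

```typescript
function priceLoose(salePrice?: number, regular = 100): number {
  // 0 is falsy, so a free item quietly costs full price
  return salePrice ? salePrice : regular;
}

function priceStrict(salePrice?: number, regular = 100): number {
  // only "not given" falls back; 0 means 0
  return salePrice !== undefined ? salePrice : regular;
}

console.log(priceLoose(0));  // 100 (the bug)
console.log(priceStrict(0)); // 0 (the fix)
```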

I used Spectral with TypeScript. Here’s how I made it sing.

Hi, I’m Kayla. I lint OpenAPI files a lot. I used Spectral with TypeScript on a real API at work last month. It saved me from silly errors. It also yelled at me when I deserved it. Fair.

If you want the extended back-story, check out my companion write-up — I used Spectral with TypeScript—here’s how I made it sing.

Let me explain how I set it up, what code I wrote, and what tripped me up. I’ll keep it plain. Real files. Real fixes.

Wait, what’s Spectral?

It’s a linter for JSON and YAML. It shines with OpenAPI. You add rules. It checks your files. It points to the line. It says what’s wrong. Nice and clear. It’s open-source too—the engine lives in the Spectral GitHub repository if you ever want to peek under the hood.

I used it two ways:

  • with the CLI (quick wins)
  • with TypeScript code (more control)

I like both. I switch based on the task. You know what? That’s the fun part.

If you're curious about sharpening your TypeScript tooling even further, I share additional tricks over on Improving Code.


The quick way: CLI + a tiny ruleset

First, I installed the CLI in my repo.

npm i -D @stoplight/spectral-cli @stoplight/spectral-rulesets

Then I made a .spectral.yaml at the root:

extends: "spectral:oas"

rules:
  # Turn one rule off (my team did not want it)
  operation-summary: off

  # Make contact info a gentle nudge
  info-contact: warn

  # Custom: no trailing slash in paths like /pets/
  path-no-trailing-slash:
    description: Paths should not end with a slash
    message: Remove the trailing slash
    recommended: true
    type: style
    given: $.paths[*]~
    then:
      function: pattern
      functionOptions:
        notMatch: "/$"

Then I ran it:

npx spectral lint openapi.yaml

I also added a script:

{
  "scripts": {
    "lint:api": "spectral lint openapi.yaml"
  }
}

For a deeper rundown of every CLI flag and step-by-step examples of custom rules, the official Spectral documentation has you covered.

That caught a bad path key in seconds. Low stress win.


The fun way: Programmatic use in TypeScript

I wanted a custom check and a clean JSON output for CI. So I used the core API.

Install the bits:

npm i -D @stoplight/spectral-core @stoplight/spectral-rulesets @stoplight/spectral-parsers

My tsconfig.json was simple:

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "ES2020",
    "moduleResolution": "Node",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "outDir": "dist"
  },
  "include": ["src"]
}

Now the TypeScript file. This lints one file, adds one custom rule, and prints results.

// src/lintApi.ts
import { readFile } from 'node:fs/promises';
import { Spectral, Document, RulesetDefinition } from '@stoplight/spectral-core';
import { Yaml } from '@stoplight/spectral-parsers';
import { oas } from '@stoplight/spectral-rulesets';

async function lint(filePath: string) {
  const spectral = new Spectral();

Spectral’s constructor is blissfully empty, but for classes that need many values, I've experimented with the pattern of named arguments and weighed the trade-offs here: I tried TypeScript named arguments for constructors—here’s my take.

  // Merge built-in OAS rules with one custom rule
  const ruleset: RulesetDefinition = {
    extends: [oas],
    rules: {
      'no-trailing-slash': {
        description: 'No trailing slash in path keys',
        message: 'Remove the trailing slash at the end',
        type: 'style',
        severity: 'warn',
        given: '$.paths[*]~',
        then: {
          function: 'pattern',
          functionOptions: { notMatch: '/$' }
        }
      }
    }
  };

The filePath: string argument above may look harmless, but if you accidentally hand it a directory or a malformed path, chaos follows. I share my guard-rails pattern in TypeScript file path argument—my hands-on take.

  await spectral.setRuleset(ruleset);

  const raw = await readFile(filePath, 'utf8');
  const doc = new Document(raw, Yaml, filePath);

  const results = await spectral.run(doc);

  // Sort for steady output in CI
  results.sort((a, b) => a.path.join('.').localeCompare(b.path.join('.')));

  console.log(JSON.stringify(results, null, 2));

  // non-zero exit on errors or warnings (DiagnosticSeverity: 0 = error, 1 = warning)
  const hasProblem = results.some(r => r.severity === 0 || r.severity === 1);
  if (hasProblem) {
    process.exitCode = 1;
  }
}

lint(process.argv[2] ?? 'openapi.yaml').catch(err => {
  console.error(err);
  process.exitCode = 1;
});

Notice the process.argv[2] ?? 'openapi.yaml' pattern; it lets callers omit an argument entirely. If optional parameters still feel fuzzy, my deep-dive is here: TypeScript optional parameter—my hands-on take.

Build and run:

npx tsc
node dist/lintApi.js openapi.yaml

It flagged /pets/ for me. I fixed the key to /pets. Clean run. Sweet.


A tiny custom function in TypeScript

I wanted tag names in kebab-case. So I wrote a small function.

Install one more piece if you want to write functions:

npm i -D @stoplight/spectral-functions

Now the function file:

// src/functions/kebabCase.ts
import type { IFunction, IFunctionResult } from '@stoplight/spectral-core';

export const kebabCase: IFunction = (targetVal): IFunctionResult[] => {
  const out: IFunctionResult[] = [];
  if (typeof targetVal !== 'string') return out;

  const ok = /^[a-z0-9]+(?:-[a-z0-9]+)*$/.test(targetVal);
  if (!ok) {
    out.push({
      message: 'Use kebab-case (like this-example)'
    });
  }
  return out;
};

Wire it into the ruleset:

// src/lintKebab.ts
import { readFile } from 'node:fs/promises';
import { Spectral, Document, RulesetDefinition } from '@stoplight/spectral-core';
import { Yaml } from '@stoplight/spectral-parsers';
import { oas } from '@stoplight/spectral-rulesets';
import { kebabCase } from './functions/kebabCase';

async function lint(filePath: string) {
  const spectral = new Spectral();

  const ruleset: RulesetDefinition = {
    extends: [oas],
    functions: { kebabCase },
    rules: {
      'tag-kebab-case': {
        description: 'Tags must be kebab-case',
        given: '$.tags[*].name',
        severity: 'warn',
        then: { function: 'kebabCase' }
      }
    }
  };

  await spectral.setRuleset(ruleset);

  const raw = await readFile(filePath, 'utf8');
  const doc = new Document(raw, Yaml, filePath);
  const results = await spectral.run(doc);
  console.log(JSON.stringify(results, null, 2));
}

lint(process.argv[2] ?? 'openapi.yaml').catch(console.error);

When a tag was PetStore, it warned me. I changed it to pet-store. Done.
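When a rule misfires, I've learned to test the regex on its own before blaming Spectral's plumbing; the pattern is the part doing the real work. A standalone sketch of the same kebab-case check:

```typescript
const isKebab = (s: string): boolean => /^[a-z0-9]+(?:-[a-z0-9]+)*$/.test(s);

console.log(isKebab("pet-store"));  // true
console.log(isKebab("PetStore"));   // false (uppercase)
console.log(isKebab("pet--store")); // false (empty segment)
console.log(isKebab("-pet"));       // false (leading dash)
```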


One more: lint a string in memory

Sometimes I get the file in a test. No disk. This works:

import { Spectral, Document } from '@stoplight/spectral-core';
import { Yaml } from '@stoplight/spectral-parsers';
import { oas } from '@stoplight/spectral-rulesets';

export async function lintFromString(name: string, yaml: string) {
  const spectral = new Spectral();
  await spectral.setRuleset(oas);
  const doc = new Document(yaml, Yaml, name);
  return spectral.run(doc);
}

My Take on TypeScript Versioning: Love, Fear, Repeat

Hi, I’m Kayla. I write apps for work and for fun. TypeScript is my daily buddy. Versioning, though? It can be sweet. It can also eat a whole Friday. Been there. I’ve talked about that exact love-fear-repeat cycle in more depth over in Love, Fear, Repeat, but here’s the quick version.

Let me explain what helps, what hurts, and what I actually do when a new version drops.

The quick picture

  • I pin my TypeScript version. No loose carets.
  • I upgrade on a branch, with tests ready.
  • I match my editor to my project’s TS. Always.
  • I keep an eye on ESLint, ts-node, Jest, and build tools, since they can lag.

Simple rules. They save me.


Pin it, or regret it

Here’s how my package.json looks on a real app I ship:

{
  "devDependencies": {
    "typescript": "5.5.4"
  }
}

I set npm to save tilde ranges, so patch bumps are fine, but minor bumps don’t sneak in.

npm config set save-prefix="~"

Why? One time I had "typescript": "^5.1.0". Next install jumped to 5.2. A new check hit my types. Guess what broke at 3 pm? Yeah.
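The mechanics of that surprise, spelled out: a caret range keeps the major and lets minors float. This is a toy checker I wrote for illustration (npm's real resolver also handles pre-releases and 0.x versions, which this ignores):

```typescript
// simplified: ^X.Y.Z accepts any version with the same major that is >= X.Y.Z
const inCaretRange = (version: string, base: string): boolean => {
  const [vMaj, vMin, vPat] = version.split(".").map(Number);
  const [bMaj, bMin, bPat] = base.split(".").map(Number);
  if (vMaj !== bMaj) return false;        // majors are the fence
  if (vMin !== bMin) return vMin > bMin;  // any newer minor is fair game
  return vPat >= bPat;
};

console.log(inCaretRange("5.2.0", "5.1.0")); // true (how 5.2 snuck in)
console.log(inCaretRange("6.0.0", "5.1.0")); // false
console.log(inCaretRange("5.1.7", "5.1.0")); // true (patches are fine)
```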

If you’re curious about everything that shipped in this release family, the official release notes lay it all out in detail.


Real story: 4.9 to 5.x and the “oh no” test run

I loved 4.9. The new satisfies operator? Chef’s kiss. I used it a lot:

const theme = {
  primary: "#1e90ff",
  spacing: 8,
} satisfies Record<string, string | number>;

Then I bumped to 5.0 on a feature branch. New goodies showed up (decorators, const type parameters). But my tests failed on CI. ts-jest didn’t match yet. I got type errors plus one weird compile time hit.

Fix was boring, but it worked:

  • Update ts-jest and jest.
  • Clear caches.
  • Run tsc --noEmit.

npm i -D ts-jest@latest jest@latest
npx jest --clearCache
npx tsc --noEmit

I also toggled a new flag that my team liked later:

{
  "compilerOptions": {
    "verbatimModuleSyntax": true
  }
}

It made imports cleaner, but also stricter. Two files needed changes. Annoying, but tidy in the end.


VS Code mismatch: the sneaky one

This one bit me more than once. My project used TS 5.5. VS Code used its own 5.0 copy. The editor screamed. The build didn’t. I thought I was losing it. It reminded me of the headaches I ran into while renaming a TypeScript field while keeping your JSDoc; tooling looks small until it bites.

The fix:

  • In VS Code, hit the TypeScript version in the status bar.
  • Pick “Use Workspace Version.”

Feels tiny. Saves hours. I now set this on every repo.


Monorepo chaos, meet pnpm overrides

On a big repo, some packages pulled in their own TypeScript. Oh no. Different versions. Different errors.

I forced one version with pnpm:

{
  "pnpm": {
    "overrides": {
      "typescript": "5.5.4"
    }
  }
}

After that, all packages lined up. CI got quiet. My coffee got warm again.

Yarn folks, I’ve used this too:

{
  "resolutions": {
    "typescript": "5.5.4"
  }
}

It’s not fancy. It’s stable.


tsconfig flags that move with versions

Some flags feel small but change your day.

Here’s a baseline I use for web apps:

{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "strict": true,
    "noUncheckedIndexedAccess": true,
    "exactOptionalPropertyTypes": true,
    "skipLibCheck": true
  }
}

  • exactOptionalPropertyTypes (from the 4.x era) is strict. It found bugs in my API layer. But it also made a few union types noisy. I keep it on now. If you want to see how the rule interacts with optional parameters, I broke it down step by step.
  • moduleResolution: "NodeNext" helped with ESM packages. It also forced me to fix imports that had loose extensions. Small pain, better build.

While we’re talking strictness, don’t forget about the notorious exclamation mark operator—knowing when to drop it keeps my diffs small.

And yeah, skipLibCheck: true is my calm button. It speeds builds, and I sleep fine.


Library work: serving two crowds with typesVersions

I ship a small date helper lib for my team. Some apps stayed on 4.7. Others ran 5.x. I split types like this:

{
  "types": "dist/index.d.ts",
  "typesVersions": {
    "<4.8": { "*": ["types/ts4/*"] },
    ">=4.8": { "*": ["types/ts5/*"] }
  }
}

It took one extra build step. But folks stopped pinging me with “types broke” messages. Worth it.

For more depth, the TypeScript docs have a succinct section on how to version declaration files that’s great reference material when you’re publishing packages.


Tooling that follows TS (or doesn’t)

This part matters more than people think.

  • typescript-eslint: I update it with TS. If ESLint rules lag, my CI fails. Not fun.
  • ts-node vs tsx: tsx has been smoother for me with ESM and newer TS. Less config, fewer sighs.
  • Vite, tsup, and SWC: Fast builds, but watch their TS peer ranges.
  • @types/* packages: Sometimes they need a bump when TS gets picky.

I now do upgrades in this order:

  1. TypeScript
  2. ESLint + typescript-eslint
  3. Test runner (Jest or Vitest)
  4. Build tool

Then I re-run the editor using the workspace TS. Then I breathe.


My upgrade playbook (the one I actually use)

If you want a step-by-step look at automating this flow in CI, check out this walkthrough on Improving Code — it pairs nicely with the checklist below.

  • Make a branch: feat/ts-5-6-upgrade
  • Update TS:
npm i -D typescript@5.6.3
npx tsc --version
  • Make sure VS Code uses workspace TS.
  • Run:
npx tsc --noEmit
npm run lint
npm test
  • Update typescript-eslint and friends if they complain.
  • Push. Open PR. Ask one teammate to try on their machine.

That’s it. Nothing fancy. Very safe.


Favorite feature hits by version (quick notes)

  • 3.7: Optional chaining and nullish coalescing. I still smile.
  • 4.1: Template literal types. I used them for route maps.
  • 4.9: satisfies. Great for config objects.
  • 5.0: Decorators and const type parameters. I used decorators in a Nest-like service.
  • 5.2: using declarations got support. I tested them in a file tool. Clean release paths.

Oh, and if you’ve ever been knee-deep in generics and suddenly realized you forgot a type argument, here’s what actually happens under the hood: forgetting a type parameter. Side note: 5.x also unlocked some neat ergonomics like using named arguments in constructors—great for those classes with mile-long parameter lists.

I don’t chase every flag. I add what I need. Then I ship.
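Since the 4.1 bullet is the one people ask me about most, here's the route-map trick in miniature (the entity names are invented for the example):

```typescript
type Entity = "user" | "post";
// every valid route starts with a known entity segment
type Route = `/${Entity}/${string}`;

const profile: Route = "/user/42";   // compiles
// const admin: Route = "/admin/42"; // error: "admin" is not an Entity

console.log(profile); // "/user/42"
```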


The good, the bad, the “why is CI red?”

What I love

  • Clear errors. Strong types. Fewer runtime “oops.”
  • Helpful flags that grow with the language.
  • Better editor smarts with each bump.

What bugs me

  • Tools lag. ESLint and test runners sometimes need time.
  • Editor mismatch. It’s too easy to miss, since the build can stay green while the editor screams.

Typescript: How I Add an Object to an Array Only When I Need To

I’m Kayla, and I write code for a living. I also review it like a picky friend. This tiny thing—“conditionally add object to array”—sounds simple. It bit me more than once. Extra items. Weird types. A crash on a Tuesday. You know what? I finally got a set of patterns that feel clean, safe, and not fussy.

Let me explain, and I’ll show real code I use in my projects. Pro tip: there’s a deeper dive with more examples over on Improving Code if you want to sharpen this skill.

For an even more focused tutorial, see my detailed walkthrough: TypeScript: How I Add an Object to an Array Only When I Need To.


The quick story

I was building a list for a shopping app. We had a base set of items. If the user had a coupon, we would add a “bonus” item. If not, we wouldn’t. Sounds easy. But my first pass left undefined in the array. React hated that. Typescript nagged me. I fixed it with a few simple tricks.

Here’s what I reach for now.


1) The plain old if + push (mutates the array)

Simple. Clear. Boring in a good way.

type Item = { id: string; name: string };

const items: Item[] = [{ id: "base-1", name: "Base" }];

const hasCoupon = true;
const bonus: Item = { id: "bonus-1", name: "Bonus" };

if (hasCoupon) {
  items.push(bonus);
}

console.log(items);
// [{ id: "base-1", name: "Base" }, { id: "bonus-1", name: "Bonus" }]

Why I like it:

  • Easy to read.
  • No trickery.

What to watch:

  • It changes the original array. That’s fine in many spots, but not in React state.

Tiny warning: don’t do hasCoupon && items.push(bonus) unless you love weird return values (push returns a number). It works, but it’s messy.
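Here's the weird return value I'm warning about, in the flesh: push hands back the array's new length, so the short-circuit expression evaluates to a number:

```typescript
const nums: number[] = [];
const flag = true;

// reads like a conditional add, but the expression's value is the new length
const result = flag && nums.push(42);

console.log(result); // 1 (the length, not the item)
console.log(nums);   // [42] (the add did happen, at least)
```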

If you’re wondering how a classic loop stacks up against these push patterns, I ran the numbers in my hands-on take on TypeScript’s forEach loop.


2) The spread trick (makes a new array)

This is my go-to in React setState, or when I want to keep things pure.

type Item = { id: string; name: string };

const base: Item[] = [{ id: "base-1", name: "Base" }];
const hasCoupon = false;
const bonus: Item = { id: "bonus-1", name: "Bonus" };

const items = [
  ...base,
  ...(hasCoupon ? [bonus] : []),
];

console.log(items);
// [{ id: "base-1", name: "Base" }]

Why it’s nice:

  • No mutation.
  • Reads like, “add this only if the flag is true.”

3) Build first, then filter out “maybe” values (with a safe filter)

Sometimes I build a list with a “maybe” object. Then I clean it up.

type Item = { id: string; name: string };

const base: Item = { id: "base-1", name: "Base" };
const hasCoupon = true;
const bonus: Item | undefined = hasCoupon ? { id: "bonus-1", name: "Bonus" } : undefined;

// A safe type guard so TS knows we removed undefined/null
const isDefined = <T>(x: T | undefined | null): x is T => x != null;

const items = [base, bonus].filter(isDefined);

console.log(items);
// [{ id: "base-1", name: "Base" }, { id: "bonus-1", name: "Bonus" }]

Note: People use filter(Boolean). It can drop values like 0 or "". That’s not always good. I use isDefined so types stay tight, and only null/undefined get removed. For an excellent exploration of this pattern, Ben Ilegbodu demonstrates practical strategies for filtering undefined elements from an array in TypeScript.
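To make that hazard concrete: Boolean drops every falsy value, while the null-check guard only drops the two "missing" ones:

```typescript
const isDefined = <T>(x: T | undefined | null): x is T => x != null;

const mixed = [0, 1, "", "a", undefined, null];

console.log(mixed.filter(Boolean));   // [1, "a"]  (0 and "" vanished too)
console.log(mixed.filter(isDefined)); // [0, 1, "", "a"]
```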


4) A tiny helper: pushIf

I use this when I need clarity. It keeps code neat in lists.

function pushIf<T>(arr: T[], cond: boolean, value: T) {
  if (cond) arr.push(value);
}

type Item = { id: string; name: string };

const items: Item[] = [{ id: "base-1", name: "Base" }];
const premiumUser = true;

pushIf(items, premiumUser, { id: "pro-1", name: "Pro Tips" });

console.log(items);
// [{ id: "base-1", name: "Base" }, { id: "pro-1", name: "Pro Tips" }]

It reads well in long builders. I like that.

And if you want an unfiltered opinion on where forEach still wins—or completely falls flat—check out my honest take on the TypeScript forEach loop.


5) State updates in React: keep it pure and tidy

This is where the spread pattern shines. I use it like this in a reducer or setState:

type Item = { id: string; name: string };

function addMaybeBonus(list: Item[], hasCoupon: boolean): Item[] {
  const bonus = { id: "bonus-1", name: "Bonus" } as const;
  return [...list, ...(hasCoupon ? [bonus] : [])];
}

const next = addMaybeBonus([{ id: "base-1", name: "Base" }], true);
// next: [{ id: "base-1", name: "Base" }, { id: "bonus-1", name: "Bonus" }]

Short and safe. React stays happy.


6) API payloads: build, then clean

When I build payloads, I add parts based on flags. Then I remove empty bits.

type Line = { sku: string; qty: number };
type Payload = { lines: Line[] };

function makePayload(hasGift: boolean): Payload {
  const base: Line = { sku: "A-100", qty: 1 };
  const gift: Line | undefined = hasGift ? { sku: "GIFT", qty: 1 } : undefined;

  const isDefined = <T>(x: T | undefined | null): x is T => x != null;

  const lines = [base, gift].filter(isDefined);
  return { lines };
}

console.log(makePayload(true).lines);
// [{ sku: "A-100", qty: 1 }, { sku: "GIFT", qty: 1 }]

This keeps the type of lines clean: it’s Line[], not (Line | undefined)[].


7) Guard the shape with satisfies (nice for TS 4.9+)

If I want Typescript to check the object shape at build time, I’ll use satisfies. It’s a neat little check.

type Item = { id: string; name: string };

const maybeBonus = {
  id: "bonus-1",
  name: "Bonus",
} satisfies Item;

// Later
const items: Item[] = [
  { id: "base-1", name: "Base" },
  ...(true ? [maybeBonus] : []),
];

It won’t change runtime, but it helps me catch mistakes early. You can see how the language team continues to refine these type-safety helpers in the official TypeScript 5.5 release notes.


What went wrong before (so you don’t repeat it)

  • I left undefined in arrays. Components crashed or rendered “blank” holes.
  • I used filter(Boolean) and lost real values like 0. Oops.
  • I called cond && arr.push(obj) and forgot it returns a number. That made a weird bug in a chain.

I deleted files in Node.js for a week — here’s what actually worked

I had one job: clean up junk. Logs, temp pics, old build folders. Simple, right? I thought so. Then Windows threw a fit. Linux shrugged. macOS just stared at me. You know what? I learned a lot.

Here’s how I deleted files in Node.js, what broke, and what I use now.

My setup (so you know where I’m coming from)

  • Node.js 20 on a MacBook Air (M2)
  • Windows 11 on a Lenovo laptop
  • A small photo tool and a CLI that watches logs
  • Lots of tiny files, and a few loud ones (big zips, cache folders)

I wrote scripts that ran after builds, and jobs that cleaned old uploads. I wanted quiet, fast, and safe.

When the file exists for sure, unlink feels nice and clean; this step-by-step guide on deleting a file from disk in Node.js mirrors my own starter approach.

import { promises as fs } from 'node:fs';

try {
  await fs.unlink('./tmp/test.txt');
  console.log('deleted');
} catch (err) {
  if (err.code === 'ENOENT') {
    console.log('file not found');
  } else {
    throw err;
  }
}

Good for single files. Clear error when the file is missing. But I got tired of checking. I just wanted, “try to delete and don’t scream.”

If you want a blow-by-blow account of exactly how a week of nothing but unlinks went, this narrative lines up eerily well with my own bruises.

My go-to now: fs.rm with force

fs.rm is my steady tool—and this breakdown of the fs.rm method shows why. It can remove files or folders. It also has a “don’t fuss” switch.

import { promises as fs } from 'node:fs';

// delete a file; no error if missing
await fs.rm('./output.txt', { force: true });

// delete a folder
await fs.rm('./build', { recursive: true, force: true });

  • force: true means “it’s okay if it’s gone already”
  • recursive: true is for folders

This cut my try/catch code a lot. It made my cleanup scripts feel calm.
For an extended walkthrough of robust file-handling patterns in Node, I highly recommend this concise article on ImprovingCode.

I spilled all the gritty details in my honest take on deleting files in Node.

Sync vs async: when I block on purpose

For small build steps, I sometimes go sync. It’s blunt, but it works, and I don’t have to juggle async.

import fs from 'node:fs';

fs.rmSync('./dist', { recursive: true, force: true });

I use this in a prebuild script. Short, blocking, done. For apps and servers though, I stay async.

Windows gotcha: file is “busy” (EPERM/EBUSY)

This one bit me. On Windows, if a file is still open, you can’t delete it. A log stream made me stare at EPERM at 1 a.m. The fix: close the file, then delete.

import fs from 'node:fs';
import { finished } from 'node:stream/promises';

const ws = fs.createWriteStream('log.txt');
ws.end('done'); // finish writing
await finished(ws); // wait for close on Windows
await fs.promises.rm('log.txt', { force: true });

If you’re writing, wait for the stream to finish. If a photo is open in another app, close the app first. Sounds silly, but that was the issue.


During one frustrated night I even toyed with just uninstalling Node altogether; if that’s the rabbit hole you’re staring at, this piece captures the journey.

Linux/macOS note: permissions

On my Ubuntu box, I saw EACCES once. The file was root-owned from a Docker run. I changed the owner, then removed it.

```bash
sudo chown -R $USER:$USER ./cache
node clean.js
```

If you see EACCES, check owners and modes. Quick fix with chmod or chown helps.

Real scripts I used

Here are the ones I run the most.

1. Clean logs after my watcher stops

```js
import { promises as fs } from 'node:fs';

async function cleanLogs() {
  await fs.rm('./logs/app.log', { force: true });
  await fs.rm('./logs/error.log', { force: true });
}

await cleanLogs();
```


2. Remove build folders before a fresh roll

```js
import { promises as fs } from 'node:fs';

await fs.rm('./dist', { recursive: true, force: true });
await fs.rm('./.cache', { recursive: true, force: true });
```
3. Delete old uploads (7 days or older)

```js
import { promises as fs } from 'node:fs';
import path from 'node:path';

const UPLOADS = './uploads';

async function cleanOldUploads(days = 7) {
  const maxAge = days * 24 * 60 * 60 * 1000;
  const files = await fs.readdir(UPLOADS, { withFileTypes: true });

  for (const entry of files) {
    const p = path.join(UPLOADS, entry.name);
    const stat = await fs.stat(p);
    const age = Date.now() - stat.mtimeMs;

    if (age > maxAge) {
      await fs.rm(p, { recursive: entry.isDirectory(), force: true });
    }
  }
}

await cleanOldUploads(7);
```


Small note: I use `path.join` so Windows paths don’t trip me.
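Here's what that buys you in one tiny demo — `join` picks the separator for whatever OS the script runs on, and the `posix`/`win32` variants pin a style if you ever need one:

```js
import path from 'node:path';

// join uses the platform separator, so the same code works everywhere.
const p = path.join('uploads', '2024', 'pic.jpg');

// Pinned variants, regardless of the host OS:
const posixStyle = path.posix.join('uploads', '2024', 'pic.jpg'); // forward slashes
const winStyle = path.win32.join('uploads', '2024', 'pic.jpg');   // backslashes
```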

## When Node alone wasn’t enough

It worked most of the time. But I have two add-ons I like.

- rimraf (for stubborn folders or older Node versions)
```js
import { rimraf } from 'rimraf';

await rimraf('node_modules/.cache');
```

It cleans deep trees without stress. I used it on a CI runner that had weird perms.

- trash (send to recycle bin instead of hard delete)

```js
import trash from 'trash';

await trash(['./Desktop/test.txt']);
```

For user-facing apps, I don’t hard delete. I move to trash. People make mistakes. Me too.

## Little things that saved me

- Delete after closing files. Streams must end first, or Windows blocks you.
- Use `force: true` if missing files are normal. Less noise.
- Use `recursive: true` for folders. `unlink` won’t touch a folder.
- Handle paths with `path.join`. It keeps slashes right on every system.
- Check codes: `ENOENT` (not found), `EPERM/EBUSY` (locked), `EACCES` (no perms).
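That last checklist item folds neatly into one helper. A sketch — `rmWithHints` and its hint strings are my own, not a Node API:

```js
import { rm } from 'node:fs/promises';

// Translate the usual failure codes into plain hints.
// Note: no `force: true` here, so a missing path surfaces as ENOENT.
async function rmWithHints(target) {
  try {
    await rm(target, { recursive: true });
    return 'removed';
  } catch (err) {
    switch (err.code) {
      case 'ENOENT': return 'not found (add force: true to ignore)';
      case 'EPERM':
      case 'EBUSY': return 'locked: close open streams/handles first';
      case 'EACCES': return 'no permission: check owner and mode';
      default: throw err;
    }
  }
}

const hint = await rmWithHints('./no-such-path-here');
```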

## A tiny CLI I keep around

This is my quick “nuke a path” script. I call it from npm scripts.

```js
#!/usr/bin/env node
import { rm } from 'node:fs/promises';

const target = process.argv[2];

if (!target) {
  console.error('Please pass a path: clean ./dist');
  process.exit(1);
}

try {
  await rm(target, { recursive: true, force: true });
  console.log(`Removed: ${target}`);
} catch (err) {
  console.error(`Failed: ${err.message}`);
  process.exit(2);
}
```

Package.json snippet:

```json
{
  "scripts": {
    "prebuild": "node clean.mjs ./dist && node clean.mjs ./.cache"
  }
}
```

Fast and boring. I like boring.

What I loved, what bugged me

The good:

  • fs.rm felt simple. One API for files and folders.
  • force: true made my cleanup quiet.
  • Cross-platform worked, once I closed streams.

The meh:

  • Windows locks can be fussy.
  • Permissions on Linux can surprise you after Docker runs.
  • The unlink vs rm split tripped me at first — unlink is files only, rm handles both.

I Used Next.js and Node.js. Here’s What Actually Worked For Me.

Hi, I’m Kayla. I build small sites and scrappy tools for real people—bakeries, clubs, a coffee cart in my neighborhood. I’ve used both Next.js and Node.js a lot. They’re not the same thing, and that’s the first big point.

Think of Node.js like the kitchen stove. It runs your code. If you want the nuts and bolts, the Node.js docs spell out every API in detail. Next.js is more like a meal kit for React pages. The Next.js docs show those recipes step by step. It gives you recipes, tools, and a nice layout so you can serve pages fast.

You know what? I reach for both, but for very different jobs.

What I Learned Fast

  • Node.js runs JavaScript on the server. It’s great for APIs, jobs, and real-time stuff.
  • Next.js is a React framework on top of Node.js. It shines for websites, dashboards, and SEO.

Simple, right? But the story gets better when we talk real work.

For a more narrative breakdown of how I balance the two stacks day-to-day, you can check out my in-depth write-up, I Used Next.js and Node.js. Here’s What Actually Worked For Me.

Real Project 1: A Bakery Site With Next.js

A local bakery asked for a cute site with a menu, blog posts, and seasonal promos. They wanted great Google results. They also wanted to change photos a lot. Fall pastries? They swap them weekly.

I built it with Next.js 14 (App Router), Tailwind, and a simple CMS. Here’s how it went:

  • Pages showed up fast because Next.js can render on the server. Search engines loved that.
  • The Image component handled big photos. The croissant pics looked crisp without being heavy.
  • I used file-based routing for menu pages. Folder names became routes. Very tidy.
  • We published posts with MDX. Write, save, done.
  • I deployed to Vercel. Zero fuss. Previews worked great for “Is this photo too dark?” moments.

A snag? I mixed server and client code at first. I tried to fetch data in a client component. It broke. Next.js wants data work in server components by default. I moved the fetch to a server piece, passed down the bits I needed, and it clicked.

Another hiccup: A map widget slowed down the homepage. I fixed it with a dynamic import. The map now loads only when needed. Easy win.
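Under the hood, that trick is just a dynamic `import()` that runs on first use. Here's the idea in plain Node terms — `node:path` stands in for the heavy map module:

```js
// Cache the module so the expensive load happens once, on first use.
let heavyModule = null;

async function getHeavyModule() {
  if (!heavyModule) {
    heavyModule = await import('node:path'); // stand-in for a big widget
  }
  return heavyModule;
}

const mod = await getHeavyModule();
const loaded = typeof mod.join === 'function';
```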

If you're leaning toward full-blown online sales, my experience building three different stores is summed up in I Built Three Stores With Node.js E-Commerce — Here’s My Honest Take.

Bottom line: Next.js felt like a comfy backpack for a content site. I didn’t sweat SEO. I didn’t write glue code all day. I shipped in a week.

Real Project 2: A Coffee Cart Order System With Node.js

My friend runs a coffee cart. Busy Saturdays. They wanted live orders, plus a tiny screen that blinks when a new latte comes in.

I built a Node.js server with Express and WebSocket. Front end was simple HTML with a dash of Alpine.js. No React here. It ran on a $5 DigitalOcean droplet with PM2 to keep it alive.

  • REST endpoints handled new orders and status updates.
  • A WebSocket channel pushed “Order up!” to a tablet by the espresso machine.
  • A cron job printed a morning prep list at 5:30 a.m.
  • I logged to a file, and rotated logs weekly. Nothing fancy.

A snag? CORS bit me when the small web app called the API from a different domain. I added the CORS middleware and set only the needed origin. Fixed.
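The middleware hides it, but the core of that fix is a single response header. A hand-rolled sketch with node:http — the origin is made up, and Node 18+ gives us global fetch to simulate the browser:

```js
import http from 'node:http';

const ALLOWED_ORIGIN = 'https://orders.example'; // hypothetical front-end domain

const server = http.createServer((req, res) => {
  // Only echo the CORS header for the one origin we trust.
  if (req.headers.origin === ALLOWED_ORIGIN) {
    res.setHeader('Access-Control-Allow-Origin', ALLOWED_ORIGIN);
  }
  res.setHeader('Content-Type', 'application/json');
  res.end(JSON.stringify({ ok: true }));
});

await new Promise((resolve) => server.listen(0, resolve));
const { port } = server.address();

// Simulate the browser sending its Origin header.
const res = await fetch(`http://127.0.0.1:${port}/orders`, {
  headers: { Origin: ALLOWED_ORIGIN },
});
const allowed = res.headers.get('access-control-allow-origin');
await res.json();
server.close();
```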

Another lesson: Don’t block the event loop. A slow PDF build froze things once. I moved it to a child process. Smooth since.
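The child-process move is one `execFile` call. A tiny sketch where the "slow job" is a one-liner script standing in for the PDF build:

```js
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';

const run = promisify(execFile);

// Run the heavy work in a separate Node process; the parent's
// event loop stays free while the child crunches.
const { stdout } = await run(process.execPath, [
  '-e',
  'console.log(6 * 7)', // stand-in for the slow PDF build
]);

const result = stdout.trim();
```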

I've gathered a few hard-won lessons about security too; if you’re curious about whether Node.js is “safe enough,” my candid answer is in Is Node.js Safe? My Real Take After Shipping Stuff With It.

This system has run for months with no drama. It’s small, cheap, and fast. Does it help with SEO? Not really. It’s not a content site. It’s a worker.

Head-to-Head Feelings

Here’s the thing: it’s not a fight. It’s more like choosing boots or running shoes.

  • Speed to ship: Next.js felt quicker for site pages and dashboards. Routing, images, and SSR are built in. With Node.js alone, I had to wire those parts by hand or add a bunch of packages.
  • Control: Node.js gave me total control for sockets, cron, queues, and long jobs. No guard rails, which I liked for system tasks.
  • Hosting and cost: Vercel made Next.js painless for me. But serverless timeouts can be tricky for long work. For Node.js, a tiny VPS plus PM2 was cheap and steady.
  • SEO: Next.js wins. Out of the box, pages render clean. Metas, sitemaps, all tidy.
  • Real-time: Both can do it, but I prefer plain Node.js with WebSocket or Socket.IO for long-lived connections.
  • Data layer: I used Prisma with Postgres on both. In Next.js, server actions felt neat for small forms. In Node.js, I used plain controllers and it stayed clear.

Need help deciding when to add TypeScript to the mix? My benchmark notes are in Node.js vs TypeScript — What I Actually Use and When.

When I Pick Each

  • I pick Next.js when:

    • I make a marketing site, blog, docs, or an admin dashboard.
    • I need fast pages and image handling.
    • I want easy preview links for clients.
  • I pick Node.js when:

    • I need a pure API, a worker, or a queue.
    • I run WebSockets all day.
    • I handle cron jobs, printers, or file processing.

Often I use both. Next.js for the front. Node.js for heavy lifting.

One More Real Example: Local Sports Club

I helped a small sports club. They wanted public pages, team rosters, and live scores on game nights.

  • Next.js handled the site, rosters, and sponsor pages. Server-rendered, nice and crawlable.
  • A Node.js service pushed live scores from the scorer’s tablet. It used WebSocket and a tiny Redis cache.
  • The Next.js app subscribed to the score feed, but only on the game page. Elsewhere, it stayed quiet and fast.

We tried to do live scores inside only Next.js at first. Serverless timeouts and cold starts made it feel laggy. Moving the live feed to a plain Node.js service fixed it.

Common Traps I Hit (And Fixes)

If you want more step-by-step fixes for issues like these, the tutorials on Improving Code are a goldmine.

  • Next.js: Mixing server and client code. Fix: Keep data fetch on the server, use client components only for interactive bits.
  • Next.js: Big third-party scripts hurt Core Web Vitals. Fix: Load them late, or only on pages that need them.
  • Node.js: CORS headaches. Fix: Allow only the domains you trust.
  • Node.js: Memory creep from large JSON. Fix: Stream when you can, and watch the heap with basic monitoring.
  • Node.js: Authentication rabbit holes. Fix: Use a proven strategy—here's the stack I landed on in I Tried Node.js Authentication So You Don’t Panic Later.
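On the memory-creep point: instead of `JSON.parse` on one giant blob, I stream line-delimited records. A sketch with a throwaway file (the file name and record shape are just for the demo):

```js
import { writeFile, rm } from 'node:fs/promises';
import { createReadStream } from 'node:fs';
import readline from 'node:readline';

// Build a small JSON-lines file to stand in for a huge export.
const FILE = './orders.jsonl';
const lines = Array.from({ length: 1000 }, (_, i) =>
  JSON.stringify({ id: i, total: i * 2 }),
);
await writeFile(FILE, lines.join('\n'));

// Stream it record by record: only one line sits in memory at a time.
const rl = readline.createInterface({
  input: createReadStream(FILE),
  crlfDelay: Infinity,
});

let count = 0;
let sum = 0;
for await (const line of rl) {
  const order = JSON.parse(line);
  count += 1;
  sum += order.total;
}

await rm(FILE, { force: true });
```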

My Take, Said Plain

If you’re building pages people read and share, Next.js feels right. It’s quick, tidy, and has your back on the web stuff you’d rather not wire up.

If you’re building the engine under the hood—APIs, workers, real-time pipes—Node.js feels right. It’s the stove. It runs hot and steady.

Most of my best projects use both. Front door with Next.js. Back room with Node.js. It’s not flashy. It just works. And that’s what my clients pay me for.

You know what? That bakery site still gets great traffic. And the coffee cart prints orders without a fuss. That’s my kind of proof.