My Honest Take on “nodejs delete file” (with real code I used)

I’m Kayla, and I build small web tools at home and at work. I delete files in Node.js a lot. Cache files. Old logs. Temp images after upload. Sounds boring, right? But if you mess it up, your app acts weird. I’ve been there. So here’s how it really felt, what worked, and what bit me.

By the way, I tested on macOS Sonoma and Windows 11. Node 18 LTS. VS Code. Nothing fancy.

The quick vibe

  • Deleting one file is easy and fast.
  • Deleting a folder works now with fs.rm. That used to be messy.
  • Windows likes to lock files. That tripped me more than once.
  • Error codes matter. ENOENT, EISDIR, EBUSY—know them and you’ll be fine.

You know what? Node keeps it simple, but not simple-minded.


My setup, so you know I’m not guessing

  • Node 18.17 on my MacBook Air (M2).
  • Node 18.18 on a Windows 11 desktop.
  • Apps: a small photo resizer, a log cleaner script, and a tiny API that uploads files, then clears temp stuff.

Alright, let me show you what I ran.

Deleting one file (the “no drama” path)

I use this when a user replaces a profile photo. I clear the old one.

import { promises as fs } from 'fs';
import path from 'path';

async function deleteFileSafe(filename) {
  const filePath = path.join(process.cwd(), 'uploads', filename);

  try {
    await fs.unlink(filePath);
    console.log('Deleted:', filePath);
  } catch (err) {
    if (err.code === 'ENOENT') {
      // File not found. That’s fine for my case.
      console.log('Already gone:', filePath);
    } else {
      // Anything else? I want to know.
      console.error('Delete failed:', err.code);
      throw err;
    }
  }
}

This worked well on both Mac and Windows. If the file doesn’t exist, I don’t cry about it. I just log and move on. If you’re hungry for every last option or flag you can tweak, the official Node.js File System docs spell them all out in one place.

Deleting a whole folder (yes, including stuff inside)

Old Node needed extra tools for this. Not now. I clean a cache folder after a build.

import { promises as fs } from 'fs';

async function wipeCache(dir) {
  await fs.rm(dir, { recursive: true, force: true });
  console.log('Cache wiped:', dir);
}

// Example
await wipeCache('./.cache');

  • recursive: true lets it clear all files and folders inside.
  • force: true makes it skip “file not found” errors.


Small warning: force: true can hide real mistakes. I once passed the wrong path and didn’t see it for a day. Ouch.
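Related tip: fs.rm can also retry sticky deletes on its own, via its maxRetries and retryDelay options. A minimal sketch (the throwaway temp folder is just for the demo; note the retry options only take effect when recursive is true):

```javascript
import { promises as fs } from 'fs';
import path from 'path';
import os from 'os';

// Make a throwaway folder with one file in it, then remove it.
const dir = await fs.mkdtemp(path.join(os.tmpdir(), 'cache-'));
await fs.writeFile(path.join(dir, 'entry.json'), '{}');

// On EBUSY, EPERM, and friends, Node retries with a linear backoff:
// retryDelay ms longer on each attempt, up to maxRetries attempts.
// These two options are ignored unless recursive is true.
await fs.rm(dir, {
  recursive: true,
  force: true,
  maxRetries: 3,
  retryDelay: 100,
});

console.log('gone:', dir);
```

On a sticky Windows box, this can replace a hand-rolled retry loop for folder wipes.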

Cleaning old logs (my weekly chore)

I run this on Sunday night. It keeps only one week of logs. Simple rules, clean space.

import { promises as fs } from 'fs';
import path from 'path';

async function cleanOldLogs(dir, days = 7) {
  const cutoff = Date.now() - days * 24 * 60 * 60 * 1000;
  const names = await fs.readdir(dir);

  for (const name of names) {
    const full = path.join(dir, name);
    const stat = await fs.stat(full);
    if (stat.isFile() && stat.mtimeMs < cutoff) {
      await fs.unlink(full);
      console.log('Removed old log:', name);
    }
  }
}

// Example
await cleanOldLogs('./logs', 7);

This one felt very “Node”. Small, clear, and it just works.

Temp files after upload (don’t forget these)

I upload large images to cloud storage. While the upload runs, I save a temp file. After a success, I clean it up.

Here’s the flow I used:

import { promises as fs } from 'fs';
import path from 'path';

// Pretend this uploads and returns true when done
async function fakeUploadToCloud(srcPath) {
  // ... do real upload work here ...
  return true;
}

async function handleUpload(tempName) {
  const tmpPath = path.join(process.cwd(), 'tmp', tempName);

  const ok = await fakeUploadToCloud(tmpPath);
  if (ok) {
    try {
      await fs.unlink(tmpPath);
      console.log('Temp cleared:', tmpPath);
    } catch (err) {
      console.error('Temp delete failed:', err.code);
    }
  }
}

Two notes from real life:

  • If you keep a read stream open on Windows, fs.unlink can throw EBUSY. Close streams first.
  • With big files, wait for the upload to fully finish. Premature delete can break stuff. Ask me how I know.
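The close-before-delete habit from the first note looks like this in practice. A minimal sketch (the temp file is a stand-in for a real upload; the key lines are destroying the stream and awaiting its 'close' event before unlink):

```javascript
import { createReadStream, promises as fs } from 'fs';
import { once } from 'events';
import path from 'path';
import os from 'os';

// Throwaway temp file standing in for an uploaded image.
const tmpPath = path.join(os.tmpdir(), `upload-${Date.now()}.bin`);
await fs.writeFile(tmpPath, Buffer.alloc(1024));

const stream = createReadStream(tmpPath);
// ... pipe the stream into an upload here ...

// Release the handle and wait for 'close' before deleting.
// On Windows, unlinking while the stream is open can throw EBUSY.
stream.destroy();
await once(stream, 'close');

await fs.unlink(tmpPath);
console.log('temp cleared:', tmpPath);
```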

The weird parts I ran into

  • EISDIR when I tried fs.unlink on a folder. My bad. Use fs.rm for folders.
  • EBUSY on Windows when the file was still open by another process (or my own stream). I fixed it by closing the handle and retrying after 100 ms.
  • EACCES on a CI box where the user didn’t have rights. I changed the folder owner and it was fine.
  • Paths with spaces on Windows worked, but I now always use path.join and path.resolve. It keeps things neat.

Here’s a tiny retry helper I used when files felt “sticky” on Windows:

async function retry(fn, tries = 3, waitMs = 100) {
  let lastErr;
  for (let i = 0; i < tries; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      await new Promise(r => setTimeout(r, waitMs));
    }
  }
  throw lastErr;
}

And then:

await retry(() => fs.unlink('C:\\temp\\locked.txt'));

It’s not fancy, but it saved me during a deploy.

A simple safety net (when I feel nervous)

If a delete feels risky, I “soft delete” first. I rename the file, then remove it later.

import { promises as fs } from 'fs';
import path from 'path';

async function softDelete(p) {
  // Step 1: rename now. The file is out of the way but recoverable.
  await fs.rename(p, p + '.to-delete');
}

async function purgeSoftDeleted(p) {
  // Step 2: run this later, from a cron or a worker.
  await fs.unlink(p + '.to-delete');
}

Why? It gives me a tiny window to undo mistakes. I used this on a folder with user photos. It helped once when a path bug hit production.

Speed talk, in plain words

Deleting many files? Promise.all is fast, but it can flood the disk. I saw some hiccups on my Windows box. So I keep it simple with a for…of and await. It’s slower, but smooth.

If you need a middle road, cap the number of parallel deletes. I sometimes run 5 at a time.
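That middle road can be sketched with a plain chunked loop (the batch size of 5 is just the number I mentioned; the demo folder of 12 files is made up for illustration):

```javascript
import { promises as fs } from 'fs';
import path from 'path';
import os from 'os';

// Delete paths at most `limit` at a time: fire one batch,
// wait for the whole batch, then start the next one.
async function deleteInBatches(paths, limit = 5) {
  for (let i = 0; i < paths.length; i += limit) {
    const batch = paths.slice(i, i + limit);
    await Promise.all(batch.map(p => fs.unlink(p)));
  }
}

// Demo against a throwaway folder of 12 files.
const dir = await fs.mkdtemp(path.join(os.tmpdir(), 'many-'));
const files = [];
for (let i = 0; i < 12; i++) {
  const p = path.join(dir, `f${i}.log`);
  await fs.writeFile(p, 'x');
  files.push(p);
}

await deleteInBatches(files, 5);
console.log('left:', (await fs.readdir(dir)).length); // left: 0
```

It is not as fast as one big Promise.all, but the disk never sees more than five deletes at once.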

Tools I tried (but I use core now)

  • rimraf: I used this before fs.rm got good. It was fine, but I don’t need it now on Node 16+.
  • del: Nice for globs, but I lean on fs.rm and a quick readdir these days.

Keeping fewer deps makes my build lighter. And it’s one less thing to keep up to date.

Little tips from my notebook

  • Always check the path you pass. I log it. Twice if I’m tired.
  • Close file streams before delete. Especially on Windows. Trust me.
  • Handle ENOENT as “okay” if your workflow tolerates missing files.
  • Use fs.rm for folders. fs.unlink for files. Don’t mix them.
  • For logs and temp stuff, set a schedule. I run a cron on Sunday night.


Final word

Node.js file delete feels solid now. It’s fast. The APIs are clear. Errors make sense once you’ve seen them a few times. I do wish Windows were less sticky with file locks, but closed streams and a small retry cover it.