When dealing with large text files, such as when parsing logs, it is usually preferable to iterate over the content line by line instead of reading everything into memory, for both efficiency and performance reasons. In Node.js this has always been a hassle, because the way to read line by line looks hacky and too low-level from an everyday user’s perspective. That’s no longer the case: thanks to a PR merged into Node.js, as of v18.11.0 there is a built-in FileHandle.prototype.readLines which makes it very convenient to use.

The old hassle way

import fs from 'node:fs';
import readline from 'node:readline';

const fileStream = fs.createReadStream('input.txt');

const rl = readline.createInterface({
  input: fileStream,
  crlfDelay: Infinity // treat every CR LF ('\r\n') as a single line break
});

for await (const line of rl) {
  console.log(`Line from file: ${line}`);
}

The now convenient way

import { open } from 'node:fs/promises';

const file = await open('input.txt');
for await (const line of file.readLines()) {
  console.log(`Line from file: ${line}`);
}

PS: There was some debate and discussion in the PR about whether this method should land in the Node.js core libraries at all. From an end-user perspective, I think it is a great addition: it makes handling large text files in Node.js more convenient, reduces the mental burden of remembering the readline incantation, and cuts the lines of code in half, even though it is just a wrapper method on top of readline.
