How to Use Streaming Parsing in JavaScript?

Last Updated : 21 May, 2024

Streaming parsing in JavaScript processes data incrementally, in chunks, without loading the entire input into memory. This makes it especially useful for large JSON files and network data streams, where reading everything at once can exhaust memory.
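For contrast, the usual non-streaming approach below buffers the whole file before parsing it; on very large files this is exactly what streaming parsing avoids (a minimal sketch, assuming the same data.json used later in this article):

JavaScript
const fs = require('fs');

// Non-streaming approach: the entire file is read into memory
// before JSON.parse runs, so memory usage grows with file size.
const raw = fs.readFileSync('data.json', 'utf8');
const data = JSON.parse(raw);

console.log('Loaded', data.length, 'records at once');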

Approach

  • Create a data.json file and an index.js file.
  • Initialize a package.json file and install the JSONStream package from npm (npm i JSONStream), which provides streaming JSON parsing.
  • In index.js, use the fs module to create a read stream for data.json and the JSONStream module to parse the JSON data incrementally.

Run the command below to create the package.json file:

npm init -y

Install the JSONStream dependency for streaming and parsing by running the following command:

npm i JSONStream  

Example: The example below shows how to use streaming parsing in JavaScript.

JavaScript
const fs = require('fs');
const JSONStream = require('JSONStream');

// Read data.json in chunks instead of loading it all at once.
const stream = fs.createReadStream('data.json', { encoding: 'utf8' });

// '*' emits every element of the top-level JSON array as it is parsed.
const parser = JSONStream.parse('*');

stream.pipe(parser);

// Fired once for each parsed array element.
parser.on('data', (obj) => {
    console.log('Parsed object:', obj);
});

// Fired when the whole input has been parsed.
parser.on('end', () => {
    console.log('Parsing complete.');
});

// Fired if the input is not valid JSON or the stream fails.
parser.on('error', (error) => {
    console.error('Error occurred:', error);
});
data.json
[
    {"id": 1, "name": "GFG1"},
    {"id": 2, "name": "GFG2"},
    {"id": 3, "name": "GFG3"}
]
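JSONStream is not limited to top-level arrays; its path syntax can also select elements nested inside an object. A minimal sketch, assuming a hypothetical users.json of the form { "records": [ ... ] }:

JavaScript
const fs = require('fs');
const JSONStream = require('JSONStream');

// Hypothetical file shaped like: { "records": [ {...}, {...} ] }
const stream = fs.createReadStream('users.json', { encoding: 'utf8' });

// 'records.*' emits each element of the "records" array.
const parser = JSONStream.parse('records.*');

stream.pipe(parser);

parser.on('data', (record) => {
    console.log('Parsed record:', record);
});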

Run the command below to execute index.js:

node index.js

Output:

Parsed object: { id: 1, name: 'GFG1' }
Parsed object: { id: 2, name: 'GFG2' }
Parsed object: { id: 3, name: 'GFG3' }
Parsing complete.
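The same technique applies to the network data streams mentioned earlier, since any readable stream can be piped into the parser. A minimal sketch, assuming a hypothetical endpoint that returns a large JSON array:

JavaScript
const https = require('https');
const JSONStream = require('JSONStream');

// Hypothetical endpoint responding with a large JSON array.
https.get('https://example.com/large-array.json', (res) => {
    const parser = JSONStream.parse('*');

    // Pipe the HTTP response stream straight into the parser.
    res.pipe(parser);

    parser.on('data', (obj) => {
        console.log('Parsed object:', obj);
    });

    parser.on('end', () => {
        console.log('Parsing complete.');
    });
});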
