I Have a .json File on Hard-drive, How to Read It?
Introduction to JSON
JavaScript Object Notation, referred to as JSON for short, is one of the most popular formats for storing and exchanging data over the internet. The simplicity of the JSON syntax makes it easy for both humans and machines to read and write.
Despite its name, the use of the JSON data format is not limited to JavaScript. Most programming languages implement data structures that you can easily convert to JSON and vice versa.
JavaScript, and therefore the Node.js runtime environment, is no exception. More often than not, this JSON data needs to be read from or written to a file for persistence. The Node runtime environment has the built-in fs module specifically for working with files.
This article is a comprehensive guide on how to use the built-in fs module to read and write data in JSON format. We shall also look at some third-party npm packages that simplify working with data in the JSON format.
Serializing and deserializing JSON
Serialization is the process of modifying an object or data structure to a format that is easy to store or transfer over the internet. You can recover the serialized data by applying the reverse process.
Deserialization refers to transforming the serialized data structure to its original format.
You will almost always need to serialize a JavaScript object to a JSON string in Node. You can do so with the JSON.stringify method before writing it to a storage device or transmitting it over the internet:
```javascript
const config = { ip: '1234.22.11', port: 3000 };
console.log(JSON.stringify(config));
```

On the other hand, after reading the JSON file, you will need to deserialize the JSON string to a plain JavaScript object using the JSON.parse method before accessing or manipulating the data:
```javascript
const config = JSON.stringify({ ip: '1234.22.11', port: 3000 });
console.log(JSON.parse(config));
```

JSON.stringify and JSON.parse are globally available methods in Node. You don't need to install or require them before use.
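A quick round trip shows that the two methods are inverses of each other, at least for data made up of JSON-serializable values (the variable names below are our own):

```javascript
// Serialize a plain object, then deserialize it back.
const original = { ip: '127.0.0.1', port: 3000 };

const serialized = JSON.stringify(original);
const deserialized = JSON.parse(serialized);

console.log(serialized);                       // a single-line JSON string
console.log(deserialized.ip, deserialized.port);
```

Note that the round trip produces a new object with the same shape, not the same object, and that values such as undefined, functions, and Date instances do not survive it unchanged.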
Introduction to the fs module
Because the fs module is built in, you don't need to install it. It provides functions that you can use to read and write data in JSON format, and much more.
Each function exposed by the fs module has a synchronous, callback, and promise-based form. The synchronous and callback variants of a method are accessible from the synchronous and callback API. The promise-based variant of a function is accessible from the promise-based API.
Synchronous API
The synchronous methods of the built-in fs module block the event loop and further execution of the remaining code until the operation has succeeded or failed. Generally, blocking the event loop is not something you want to do.
The names of all synchronous functions end with the Sync characters. For example, writeFileSync and readFileSync are both synchronous functions.
You can access the synchronous API by requiring fs:
```javascript
const fs = require('fs');

// Blocks the event loop
fs.readFileSync(path, options);
```

Callback API
Unlike the synchronous methods that block the event loop, the corresponding methods of the callback API are asynchronous. You'll pass a callback function to the method as the last argument.
The callback function is invoked with an Error object as the first argument if an error occurs. The rest of the arguments to the callback function depend on the fs method.
You can also access the methods of the callback API by requiring fs, just like the synchronous API:
```javascript
const fs = require('fs');

fs.readFile(path, options, callback);
```

Promise-based API
The promise-based API is asynchronous, like the callback API. It returns a promise, which you can manage via promise chaining or async/await.
You can access the promise-based API by requiring fs/promises:
```javascript
const fs = require('fs/promises');

fs.readFile(path)
  .then((data) => {
    // Do something with the data
  })
  .catch((error) => {
    // Do something if an error occurs
  });
```

We used the CommonJS syntax for accessing the modules in the code snippets above. We shall be using the CommonJS syntax throughout this article because Node treats JavaScript code as a CommonJS module by default. You can also use ES6 modules if you want.
According to the Node documentation, the callback API of the built-in fs module is more performant than the promise-based API. Therefore, most examples in this article will use the callback API.
How to read JSON files in Node.js
The Node runtime environment has the built-in require function and the fs module that you can use for loading or reading JSON files. Because require is globally available, you don't need to require it.
However, you will need to require the fs module before using it. I will discuss how to read JSON files using the built-in fs module and the require function in the following subsections.
How to load a JSON file using the global require function
You can use the global require function to synchronously load JSON files in Node. After loading a file using require, it is cached. Therefore, loading the file again using require will load the cached version. In a server environment, the file will be loaded again on the next server restart.
It is therefore advisable to use require for loading static JSON files such as configuration files that do not change often. Do not use require if the JSON file you load keeps changing, because it will cache the loaded file and use the cached version if you require the same file again. Your latest changes will not be reflected.
Assuming you have a config.json file with the following content:
```json
{
  "port": "3000",
  "ip": "127.00.12.3"
}
```

You can load the config.json file in a JavaScript file using the code below. require will always load the JSON data as a JavaScript object:
```javascript
const config = require('./config.json');
console.log(config);
```

How to read a JSON file using the fs.readFile method
You can use the readFile method to read JSON files. It asynchronously reads the contents of the entire file into memory, therefore it is not the most optimal method for reading large JSON files.
The readFile method takes three arguments. The code snippet below shows its function signature:
```javascript
fs.readFile(path, options, callback);
```
The first argument, path, is the file name or the file descriptor. The second is an optional object argument, and the third is a callback function. You can also pass a string as the second argument instead of an object. If you pass a string, then it has to be an encoding.
The callback function takes two arguments. The first argument is the error object if an error occurs, and the second is the serialized JSON data.
The code snippet below will read the JSON data in the config.json file and log it to the terminal:
```javascript
const fs = require('fs');

fs.readFile('./config.json', 'utf8', (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  console.log(JSON.parse(data));
});
```

Make sure to deserialize the JSON string passed to the callback function before you start working with the resulting JavaScript object.
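Keep in mind that JSON.parse throws a SyntaxError on malformed input, so it is worth guarding the parse step as well. A small sketch using a helper of our own (safeParse is not a built-in):

```javascript
// JSON.parse throws on malformed input, so guard it with try...catch.
function safeParse(jsonString) {
  try {
    return JSON.parse(jsonString);
  } catch (error) {
    console.log('Invalid JSON:', error.message);
    return null;
  }
}

console.log(safeParse('{"port":3000}')); // { port: 3000 }
console.log(safeParse('{port:3000}'));   // null — keys must be double-quoted in JSON
```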
How to read a JSON file using the fs.readFileSync method
readFileSync is another built-in method for reading files in Node, similar to readFile. The difference between the two is that readFile reads the file asynchronously while readFileSync reads the file synchronously. Therefore, readFileSync blocks the event loop and execution of the remaining code until all the data has been read.
To grasp the difference between synchronous and asynchronous code, you can read the article "Understanding asynchronous JavaScript".
Below is the function signature of fs.readFileSync:
```javascript
fs.readFileSync(path, options);
```
path is the path to the JSON file you want to read, and you can pass an object as the second argument. The second argument is optional.
In the code snippet below, we are reading JSON data from the config.json file using readFileSync:
```javascript
const { readFileSync } = require('fs');

const data = readFileSync('./config.json');
console.log(JSON.parse(data));
```

How to write to JSON files in Node.js
Just like reading JSON files, the fs module provides built-in methods for writing to JSON files.
You can use the writeFile and writeFileSync methods of the fs module. The difference between the two is that writeFile is asynchronous while writeFileSync is synchronous. Before writing a JSON file, make sure to serialize the JavaScript object to a JSON string using the JSON.stringify method.
How to write to JSON files using the fs.writeFile method
JSON.stringify will format your JSON data in a single line if you do not pass the optional formatting argument to the JSON.stringify method specifying how to format your JSON data.
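The difference is easy to see by comparing the output with and without the spacing argument:

```javascript
const config = { ip: '127.0.0.1', port: 3000 };

// Without the third argument, everything lands on a single line.
const compact = JSON.stringify(config);
console.log(compact); // {"ip":"127.0.0.1","port":3000}

// With an indent of 2, the output is pretty-printed across multiple lines.
const pretty = JSON.stringify(config, null, 2);
console.log(pretty);
```

The second argument (null here) is the optional replacer, which can filter or transform values before they are serialized.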
If the path you pass to the writeFile method is for an existing JSON file, the method will overwrite the data in the specified file. It will create a new file if the file does not exist:
```javascript
const { writeFile } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

writeFile(path, JSON.stringify(config, null, 2), (error) => {
  if (error) {
    console.log('An error has occurred ', error);
    return;
  }
  console.log('Data written successfully to disk');
});
```

How to write to JSON files using the fs.writeFileSync method
Unlike writeFile, writeFileSync writes to a file synchronously. If you use writeFileSync, you will block the execution of the event loop and the rest of the code until the operation is successful or an error occurs. It will create a new file if the path you pass doesn't exist, and overwrite it if it does.
In the code snippet below, we are writing to the config.json file. We are wrapping the code in try...catch so that we can catch any errors:
```javascript
const { writeFileSync } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

try {
  writeFileSync(path, JSON.stringify(config, null, 2), 'utf8');
  console.log('Data successfully saved to disk');
} catch (error) {
  console.log('An error has occurred ', error);
}
```

How to append a JSON file
Node doesn't have a built-in function for appending to or updating fields of an existing JSON file out of the box. You can, however, read the JSON file using the readFile method of the fs module, update it, and overwrite the JSON file with the updated JSON.
Below is a code snippet illustrating how to go about it:
```javascript
const { writeFile, readFile } = require('fs');

const path = './config.json';

readFile(path, (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  const parsedData = JSON.parse(data);
  parsedData.createdAt = new Date().toISOString();
  writeFile(path, JSON.stringify(parsedData, null, 2), (err) => {
    if (err) {
      console.log('Failed to write updated data to file');
      return;
    }
    console.log('Updated file successfully');
  });
});
```

How to read and write to JSON files using third-party npm packages
In this section, we shall look at the most popular third-party Node packages for reading and writing data in JSON format.
How to use the jsonfile npm package for reading and writing JSON files
jsonfile is a popular npm package for reading and writing JSON files in Node. You can install it using the command below:
```shell
npm i jsonfile
```
It is similar to the readFile and writeFile methods of the fs module, though jsonfile has some advantages over the built-in methods.
Some of the features of this package are as follows:
- It serializes and deserializes JSON out of the box
- It has a built-in utility for appending data to a JSON file
- Supports promise chaining
You can see the jsonfile package in action in the code snippet below:
```javascript
const jsonfile = require('jsonfile');

const path = './config.json';

jsonfile.readFile(path, (err, data) => {
  if (err) {
    console.log(err);
    return;
  }
  console.log(data);
});
```

You can also use promise chaining instead of passing a callback function like in the example above:
```javascript
const jsonfile = require('jsonfile');

const path = './config.json';

jsonfile
  .readFile(path)
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.log(err);
  });
```

How to use the fs-extra npm package for reading and writing JSON files
fs-extra is another popular Node package you can use to work with files. Though you can use this package for managing JSON files, it has methods whose functions extend beyond just reading and writing JSON files.
As its name suggests, fs-extra has all the functionality provided by the fs module and more. According to the documentation, you can use the fs-extra package instead of the fs module.
You need to first install fs-extra from npm before using it:
```shell
npm install fs-extra
```
The code below shows how you can read JSON files using the readJson method of the fs-extra package. You can use a callback function, promise chaining, or async/await:
```javascript
const fsExtra = require('fs-extra');

const path = './config.json';

// Using a callback
fsExtra.readJson(path, (error, config) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log(config);
});

// Using promise chaining
fsExtra
  .readJson(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function readJsonData() {
  try {
    const config = await fsExtra.readJson(path);
    console.log(config);
  } catch (error) {
    console.log(error);
  }
}

readJsonData();
```

The code below illustrates how you can write JSON data using the writeJson method:
```javascript
const { writeJson } = require('fs-extra');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

// Using a callback
writeJson(path, config, (error) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log('Data written to file successfully');
});

// Using promise chaining
writeJson(path, config)
  .then(() => {
    console.log('Data written to file successfully');
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function writeJsonData() {
  try {
    await writeJson(path, config);
    console.log('Data written to file successfully');
  } catch (error) {
    console.log(error);
  }
}

writeJsonData();
```

Just like the fs module, fs-extra has both asynchronous and synchronous methods. You don't need to stringify your JavaScript object before writing to a JSON file.
Similarly, you don't need to parse to a JavaScript object after reading a JSON file. The module does it for you out of the box.
How to use the bfj npm package for reading and writing JSON files
bfj is another npm package you can use for handling data in JSON format. According to the documentation, it was created for managing large JSON datasets.
bfj implements asynchronous functions and uses pre-allocated fixed-length arrays to try and alleviate issues associated with parsing and stringifying large JSON or JavaScript datasets – bfj documentation
You can read JSON data using the read method. The read method is asynchronous and it returns a promise.
Assuming you have a config.json file, you can use the following code to read it:
```javascript
const bfj = require('bfj');

const path = './config.json';

bfj
  .read(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });
```

Similarly, you can use the write method to write data to a JSON file:
```javascript
const bfj = require('bfj');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

bfj
  .write(path, config)
  .then(() => {
    console.log('Data has been successfully written to disk');
  })
  .catch((error) => {
    console.log(error);
  });
```

bfj has lots of functions that you can read about in the documentation. It was created purposely for handling large JSON data. It is also slower, so you should use it only if you are handling relatively large JSON datasets.
Conclusion
As explained in the above sections, JSON is one of the most popular formats for data exchange over the internet.
The Node runtime environment has the built-in fs module you can use to work with files in general. The fs module has methods that you can use to read and write to JSON files using the callback API, promise-based API, or synchronous API.
Because methods of the callback API are more performant than those of the promise-based API, as highlighted in the documentation, you are better off using the callback API.
In addition to the built-in fs module, several popular third-party packages such as jsonfile, fs-extra, and bfj exist. They have additional utility functions that make working with JSON files a cakewalk. On the flip side, you should evaluate the limitations of adding third-party packages to your application.
Source: https://blog.logrocket.com/reading-writing-json-files-nodejs-complete-tutorial/