JSON is a popular format for sharing data among applications written in different languages. In Node.js applications, JSON has become a convenient choice for storing data thanks to its uniformity and simplicity.
Node.js provides built-in modules that make it easy to work with JSON data. In this article, you'll learn to:
- Read JSON files from the disk.
- Write JSON data to a file.
- Use the fs module to interact with the filesystem.
- Use built-in methods like JSON.parse() and JSON.stringify() to convert data from and to JSON format.
- Use the global require() method to load a JSON file at startup.
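As a quick primer before we touch the filesystem, here is a minimal sketch of the two conversion methods everything below relies on: JSON.stringify() produces a JSON string from a JavaScript object, and JSON.parse() turns one back into an object:

const user = { name: 'John Doe', age: 27 }

// object -> JSON string
const json = JSON.stringify(user)
console.log(json) // {"name":"John Doe","age":27}

// JSON string -> object
const parsed = JSON.parse(json)
console.log(parsed.name) // John Doe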
Reading from a JSON file
Before I go into the details of reading a JSON file, let's first create a new JSON file called databases.json that holds the following JSON data:
databases.json
[
    {
        "name": "MySQL",
        "type": "RDBMS"
    },
    {
        "name": "MongoDB",
        "type": "NoSQL"
    },
    {
        "name": "Neo4j",
        "type": "Graph DB"
    }
]
The databases.json file is stored on disk and contains a JSON array of objects. We want to read this file and print the records to the console.
To read the JSON data from the above file, you can use the native fs module. This module provides methods to read, write, and watch files, along with many other functions for interacting with the filesystem. Since it is a native module, you don't need to install anything; just import it in your code by calling const fs = require('fs').
The fs module gives us two methods, fs.readFile() and fs.readFileSync(), that can be used to read data from a file. Both functions do the same thing, reading a file from disk; the only difference lies in how they are executed.
Read a JSON file using fs.readFile()
The fs.readFile() method reads data from a file asynchronously. It doesn't block the event loop while reading the file; instead, control moves on to the next line of code. Once the file data becomes available, fs.readFile() invokes the callback function passed to it as an argument.
To read the JSON data from the databases.json file using the fs.readFile() method, pass in the name of the file, an optional encoding type, and a callback function to receive the file data:
const fs = require('fs')

fs.readFile('./databases.json', 'utf8', (err, data) => {
    if (err) {
        console.log(`Error reading file from disk: ${err}`)
    } else {
        // parse the JSON string into a JavaScript object
        const databases = JSON.parse(data)

        // print all databases
        databases.forEach(db => {
            console.log(`${db.name}: ${db.type}`)
        })
    }
})
In the above example, since fs.readFile() returns the file contents as a plain string, we have to use JSON.parse() to parse it into a JavaScript object. Finally, we use forEach() to print all databases to the console.
Here is the output of the above code:
MySQL: RDBMS
MongoDB: NoSQL
Neo4j: Graph DB
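To see the non-blocking behavior for yourself, note that any code placed after the fs.readFile() call runs before the callback does. A tiny sketch where the log order is the point:

const fs = require('fs')

console.log('Before the read call')

fs.readFile('./databases.json', 'utf8', (err, data) => {
    if (err) throw err
    console.log('Inside the callback (runs last)')
})

console.log('After the read call (runs before the callback)')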
Read a JSON file using fs.readFileSync()
The fs.readFileSync() method reads data from a file synchronously. Unlike fs.readFile(), it blocks the event loop until all the data from the file is loaded. Instead of passing a callback, you only pass the name of the file (and an optional encoding) to fs.readFileSync(), as shown below:
const fs = require('fs')

try {
    const data = fs.readFileSync('./databases.json', 'utf8')

    // parse the JSON string into a JavaScript object
    const databases = JSON.parse(data)

    // print all databases
    databases.forEach(db => {
        console.log(`${db.name}: ${db.type}`)
    })
} catch (err) {
    console.log(`Error reading file from disk: ${err}`)
}
Although fs.readFileSync() has a clean syntax, you should never use it to read large files, as it blocks the event loop and can drastically impact your application's performance. It is helpful only for reading configuration files at application startup, before any other tasks are performed.
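For example, a minimal startup-configuration loader might look like the sketch below (the config.json file, its port field, and the default value are hypothetical, just for illustration):

const fs = require('fs')

// blocking here is fine: nothing else should run until the config is loaded
let config
try {
    config = JSON.parse(fs.readFileSync('./config.json', 'utf8'))
} catch (err) {
    console.log(`Could not load config, falling back to defaults: ${err}`)
    config = { port: 3000 }
}

console.log(`Server will listen on port ${config.port}`)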
Reading a JSON file with require()
Finally, the last way of reading a JSON file is with the global require() method. This approach is similar to what you use for loading Node.js modules, but it also works for loading JSON files. All you need to do is pass the JSON file path to require(); it synchronously reads and parses the file and returns a JavaScript object ready to be used:
const databases = require('./databases.json')

// print all databases
databases.forEach(db => {
    console.log(`${db.name}: ${db.type}`)
})
The require() method works much like fs.readFileSync() in that it reads the file synchronously, but it is a global method that can be called from anywhere. Moreover, it automatically parses the file content into a JavaScript object.
However, there are a few downsides to using the require() method:
- It only reads the file once and caches the result; requiring it again just returns the cached data (see the sketch below).
- The file must have the .json extension. Without an extension, the require() method won't treat it as a JSON file.
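Here is a small sketch demonstrating the caching behavior; both calls return the very same object, even if the file changes on disk between them:

const first = require('./databases.json')
const second = require('./databases.json')

// both variables point to the same cached object
console.log(first === second) // true

// if you really need a fresh read, delete the cache entry first
delete require.cache[require.resolve('./databases.json')]
const fresh = require('./databases.json')
console.log(first === fresh) // false: the file was re-read from disk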
Because of the above limitations, require() is only suitable for loading static configuration files that don't change often. For reading a dynamic file like databases.json, you should use the fs.readFile() method instead.
Writing to a JSON file
Just like the fs.readFile() and fs.readFileSync() methods, the fs module provides two more functions for writing data to files: fs.writeFile() and fs.writeFileSync().
As the names suggest, the fs.writeFileSync() method writes data to a file synchronously, while fs.writeFile() writes data to a file asynchronously.
Write to a JSON file using fs.writeFile()
To write JSON to a file using fs.writeFile(), just pass in the path of the file to write to, the JSON string that you want to write, an optional encoding type, and a callback function that will be executed after the file is written.
Note that if the file doesn't already exist, it will be created; if it does exist, it will be overwritten!
Here is an example:
const fs = require('fs')

let user = {
    name: 'John Doe',
    email: 'john.doe@example.com',
    age: 27,
    gender: 'Male',
    profession: 'Software Developer'
}

// convert the JavaScript object to a JSON string
const data = JSON.stringify(user)

// write file to disk
fs.writeFile('./user.json', data, 'utf8', err => {
    if (err) {
        console.log(`Error writing file: ${err}`)
    } else {
        console.log(`File is written successfully!`)
    }
})
In the above example, we are storing the user object in the user.json file.
Notice the JSON.stringify() method that converts the JavaScript object into a JSON string before saving it to disk. If you try to write an object to a file without stringifying it first, you won't get your data; depending on your Node.js version, the call will either throw a TypeError or leave you with the useless string below:
[object Object]
Now, if you execute the above code, you should see the following content in the user.json file:
{"name":"John Doe","email":"john.doe@example.com","age":27,"gender":"Male","profession":"Software Developer"}
Technically, that's all you need to write JSON to a file. However, the data is saved as one long single-line string in the file.
To pretty-print the JSON, change the JSON.stringify() call as follows:
// pretty-print the object as an indented JSON string
const data = JSON.stringify(user, null, 4)
Now, if you open the user.json file, you should see the following content:
{
    "name": "John Doe",
    "email": "john.doe@example.com",
    "age": 27,
    "gender": "Male",
    "profession": "Software Developer"
}
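As an aside, if you would rather have the write fail than silently overwrite an existing file, fs.writeFile() also accepts an options object with a flag property. A small sketch using the wx flag (write, but fail if the path already exists):

const fs = require('fs')

const data = JSON.stringify({ name: 'John Doe' }, null, 4)

// 'wx' opens the file for writing but fails if it already exists
fs.writeFile('./user.json', data, { encoding: 'utf8', flag: 'wx' }, err => {
    if (err && err.code === 'EEXIST') {
        console.log('user.json already exists, refusing to overwrite it')
    } else if (err) {
        console.log(`Error writing file: ${err}`)
    } else {
        console.log(`File is written successfully!`)
    }
})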
Write to a JSON file using fs.writeFileSync()
Finally, the last way to write data to a JSON file is the fs.writeFileSync() method. It writes data to a file synchronously and blocks the Node.js event loop until the file is written to disk. Take a look at the following example, which uses fs.writeFileSync() to write a JavaScript object to a file:
const fs = require('fs')

let user = {
    name: 'John Doe',
    email: 'john.doe@example.com',
    age: 27,
    gender: 'Male',
    profession: 'Software Developer'
}

try {
    // convert the JavaScript object to a JSON string
    const data = JSON.stringify(user, null, 4)

    // write file to disk
    fs.writeFileSync('./user.json', data, 'utf8')
    console.log(`File is written successfully!`)
} catch (err) {
    console.log(`Error writing file: ${err}`)
}
Updating a JSON file
Now that we have learned how to read and write JSON files, what if you want to update an existing JSON file?
We can combine these approaches to use our JSON files as a simple database. Whenever we want to update the JSON file, we can read the contents, change the data, and then write the new data back to the original file.
Here is an example that demonstrates how you can add another record to the databases.json file:
const fs = require('fs')

// read the file
fs.readFile('./databases.json', 'utf8', (err, data) => {
    if (err) {
        console.log(`Error reading file from disk: ${err}`)
    } else {
        // parse the JSON string into a JavaScript object
        const databases = JSON.parse(data)

        // add a new record
        databases.push({
            name: 'Postgres',
            type: 'RDBMS'
        })

        // write the new data back to the file
        fs.writeFile('./databases.json', JSON.stringify(databases, null, 4), err => {
            if (err) {
                console.log(`Error writing file: ${err}`)
            }
        })
    }
})
Now, if you execute the above code, you should see a new entry in databases.json, as shown below:
[
    {
        "name": "MySQL",
        "type": "RDBMS"
    },
    {
        "name": "MongoDB",
        "type": "NoSQL"
    },
    {
        "name": "Neo4j",
        "type": "Graph DB"
    },
    {
        "name": "Postgres",
        "type": "RDBMS"
    }
]
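If you prefer promises over nested callbacks, the same read-modify-write pattern can be expressed with the promise-based fs API (fs.promises, available in modern Node.js versions). A minimal sketch; the SQLite record is just an illustrative example:

const fs = require('fs').promises

async function addDatabase(record) {
    // read and parse the existing file
    const data = await fs.readFile('./databases.json', 'utf8')
    const databases = JSON.parse(data)

    // add the new record and write everything back
    databases.push(record)
    await fs.writeFile('./databases.json', JSON.stringify(databases, null, 4))
}

addDatabase({ name: 'SQLite', type: 'RDBMS' })
    .then(() => console.log(`File is updated successfully!`))
    .catch(err => console.log(`Error updating file: ${err}`))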
3rd-party libraries
If you don't want to manually parse or stringify JSON data each time you read or write to a JSON file, use the jsonfile module instead.
The jsonfile module wraps the fs module and the JSON object's methods, and exposes the same kind of methods as the fs module for reading and writing JSON files.
Type the following command in your project's root directory to install the jsonfile module:
$ npm install jsonfile --save
To read data from JSON files, the jsonfile module provides readFile() and readFileSync() methods. They are similar to those offered by the fs module, except that they automatically parse the contents of the file into a JavaScript object:
const jsonfile = require('jsonfile')

jsonfile.readFile('./databases.json', (err, databases) => {
    if (err) {
        console.log(`Error reading file from disk: ${err}`)
    } else {
        databases.forEach(db => {
            console.log(`${db.name}: ${db.type}`)
        })
    }
})
Similarly, to write data to a JSON file, you can use either the writeFile() or the writeFileSync() method:
const jsonfile = require('jsonfile')

let user = {
    name: 'John Doe',
    email: 'john.doe@example.com',
    age: 27,
    gender: 'Male',
    profession: 'Software Developer'
}

jsonfile.writeFile('./user.json', user, { spaces: 4 }, err => {
    if (err) {
        console.log(`Error writing file: ${err}`)
    } else {
        console.log(`File is written successfully!`)
    }
})
Conclusion
In this article, we looked at different ways to read and write JSON files, including the fs module, the require() method, and the jsonfile module, a 3rd-party module.
The fs module is a native module that provides functions for both reading and writing files. The fs.readFile() and fs.writeFile() methods can be used to read and write JSON files asynchronously. To interact with the filesystem synchronously, the fs.readFileSync() and fs.writeFileSync() methods are available.
You can also use the global require() method to synchronously read and parse a JSON file at startup. However, it caches the file data after the first read and can only be used for files with the .json extension.
If you want to learn more, take a look at what JSON actually is, and how you can read and write a JSON object to a file in Node.js.