JSON is a popular, lightweight, human-readable format for data interchange. Open-source libraries for parsing and consuming JSON are available for virtually every modern programming language.

Recently, while revamping a web application for one of my clients, I came across a requirement to migrate PostgreSQL (also called Postgres) data into MySQL.

In this post, I will explain how I exported PostgreSQL table data into JSON files. Later, I parsed the exported JSON files in a Spring Boot application to format the data and store it in the MySQL database.

Installation

I used the psql client to connect to the Postgres database and run the export statements. psql is a terminal-based front-end to Postgres. To install the Postgres client on a Debian-based system, run the following in your terminal:

$ sudo apt install postgresql-client
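The package name may differ on other platforms. Once installed, you can verify that the client is available and check its version:

$ psql --version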

Connect to Remote Postgres Database

Since our database was hosted on a remote server, we first had to update the firewall settings to allow remote connections to the Postgres database server. Run the following command in your terminal to connect to a remote server:

$ psql -h 245.45.45.220 -U db-user -d users-prod

The -h flag specifies the remote server IP (245.45.45.220 in our case), the -U flag specifies the database user, and the -d flag specifies the database name.
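As an aside, psql also accepts these parameters as a single connection URI, which can be handy in scripts. Assuming the server listens on the default Postgres port 5432, the equivalent command looks like this:

$ psql "postgresql://db-user@245.45.45.220:5432/users-prod"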

If the IP address and database user name are correct, the psql client will prompt you to enter the password for the specified database user.

If you do not know the database name, just omit the -d flag and database name from the above command. After connecting to the remote server, run the following commands to list and switch to your intended database:

# list all available databases
$ \l

# switch to a database
$ \c users-prod
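If you are ever unsure which database you are currently connected to, the \conninfo meta-command prints the connection details:

$ \conninfo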

Export Tables as JSON

Once connected to Postgres remotely, you can use the \d command to list all the relations (tables, views, and sequences) in the database:

$ \d

The above command will output the following:

                  List of relations
 Schema |            Name            | Type  |  Owner   
--------+----------------------------+-------+----------
 public | deals                      | table | postgres
 public | preferences                | table | postgres
 public | users                      | table | postgres
 public | users_roles                | table | postgres
(4 rows)
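Note that \d lists every kind of relation. If the output gets noisy, \dt restricts the listing to tables only:

# list tables only
$ \dt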

For this demo, we want to export the users table into a JSON file. Its structure can be inspected with the \d command:

$ \d users
                                            Table "public.users"
       Column       |            Type             | Collation | Nullable |              Default              
--------------------+-----------------------------+-----------+----------+-----------------------------------
 id                 | bigint                      |           | not null | nextval('users_id_seq'::regclass)
 first_name         | character varying           |           |          | 
 last_name          | character varying           |           |          | 
 username           | character varying           |           | not null | 
 email              | character varying           |           | not null | 
 password           | character varying           |           |          | 
 persistence_token  | character varying           |           |          | 
 confirmed          | boolean                     |           |          | false
 source             | character varying           |           |          | 
 created_at         | timestamp without time zone |           | not null | 
 updated_at         | timestamp without time zone |           | not null | 

There are a lot of columns in the users table. To start with the simplest example of dumping all the columns, run the following commands at the psql prompt:

$ \t
$ \a
$ \o users.json
$ SELECT array_to_json(array_agg(row_to_json(u))) FROM users u;

The first three commands enable tuples-only mode, switch to unaligned output format, and redirect query output to the file users.json. The last command uses the row_to_json function, which accepts a row value and returns it as a JSON object. array_agg is an aggregate function, like sum or count.

It aggregates the argument provided into a Postgres array. array_to_json takes a PostgreSQL array and returns a single JSON value. The exported data is saved to the current working directory in a file named users.json.
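Two quick notes. First, these psql settings persist for the rest of the session: running \t and \a again toggles their modes back off, and \o with no file name sends query output back to the terminal. Second, on Postgres 9.3 or newer, the json_agg aggregate is a shorter, equivalent way to produce the same JSON array; a minimal sketch:

$ SELECT json_agg(u) FROM users u;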

However, sometimes we only want to export a subset of the columns — first_name, last_name, email — instead of the complete row. We may even want to process the table data while exporting and create a custom column, for example name, by combining first_name and last_name.

There is no direct way to achieve this using row_to_json alone, but we can use the following subquery to select only the desired columns:

$ SELECT array_to_json(array_agg(row_to_json(r))) FROM (
    SELECT first_name, last_name, email, first_name || ' ' || last_name AS name
    FROM users
) r;

The above query will export a JSON array like the following:

[{"first_name":"John","last_name":"Brave","email":"john@gmail.com","name":"John Brave"},
{"first_name":"Lilly","last_name":"Dave","email":"hi@lillydave.net","name":"Lilly Dave"},
...]
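The same pattern works for filtering rows, not just columns. As a hypothetical example, to export only users whose accounts are confirmed, add a WHERE clause to the inner query:

$ SELECT array_to_json(array_agg(row_to_json(r))) FROM (
    SELECT first_name, last_name, email
    FROM users
    WHERE confirmed = true
) r;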

Next: To parse and consume JSON data in Spring Boot, check out the Processing JSON Data in Spring Boot tutorial.

That's all. Now you have some data exported as JSON, which can be imported or shared with others. Hopefully, you've learned a bit about psql, row_to_json, and other functions along the way.
