Postgres: How to back up and restore.

FYI: there is a great article on pg_dump and shipping backups to AWS S3.

Create a Postgres container.

We will be working inside the Postgres container.
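A minimal sketch of creating the container, assuming the official `postgres` image; the container name `pg`, the password, the image tag, and the `./datastore` host directory are all assumptions. The volume mount makes host files visible at `/datastore` inside the container, which the import steps below rely on.

```shell
# Run the official postgres image (names, tag, and password are assumptions).
# Mount a host directory so files dropped there appear at /datastore in the container.
docker run -d \
  --name pg \
  -e POSTGRES_PASSWORD=postgres \
  -v "$PWD/datastore":/datastore \
  postgres:16

# Open a shell inside the container; subsequent commands run here.
docker exec -it pg bash
```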

Data size of the public schema: ~200 MB.

Row counts: ~126k and ~250k.

Now download github_events and github_users from the sample CSV files and save them in the host directory as events.csv and users.csv respectively. These files will then be available inside the Docker container at /datastore.
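A sketch of the download step. The URLs below are placeholders, not the real sample-data links; substitute the actual locations of the github_users and github_events CSVs.

```shell
# Download the sample CSVs into the mounted host directory.
# Both URLs are placeholders -- replace them with the actual sample-data links.
cd datastore
curl -L -o users.csv  "https://example.com/github_users.csv"
curl -L -o events.csv "https://example.com/github_events.csv"
```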

Log in with psql and create the two tables.
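The DDL could look like the following sketch. The column lists are an assumption based on the commonly used GitHub sample dataset; match them to the headers of your CSVs before running.

```shell
# Create the two target tables (column lists are assumptions).
psql -U postgres <<'SQL'
CREATE TABLE github_users (
    user_id bigint,
    url text,
    login text,
    avatar_url text,
    gravatar_id text,
    display_login text
);

CREATE TABLE github_events (
    event_id bigint,
    event_type text,
    event_public boolean,
    repo_id bigint,
    payload jsonb,
    repo jsonb,
    user_id bigint,
    org jsonb,
    created_at timestamp
);
SQL
```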

Verify that the CSV files were imported into the tables successfully, then create a pg_dump of the public-schema tables.
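One way to load the CSVs, sanity-check the row counts, and dump the public schema; the dump file name is an assumption, and the `\copy` lines can be skipped if the data is already loaded.

```shell
# Load the CSVs via psql's client-side \copy, then check row counts.
psql -U postgres <<'SQL'
\copy github_users  FROM '/datastore/users.csv'  WITH (FORMAT csv)
\copy github_events FROM '/datastore/events.csv' WITH (FORMAT csv)
SELECT count(*) FROM github_users;
SELECT count(*) FROM github_events;
SQL

# Plain-SQL dump of the public schema only (output file name is an assumption).
pg_dump -U postgres --schema=public -f /datastore/public.sql postgres
```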

Modify a user record.
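For example, change one row so we can later confirm that the restore overwrites it. The column and `user_id` value below are illustrative.

```shell
# Change one user row; the restore should revert this.
psql -U postgres -c "UPDATE github_users SET display_login = 'changed' WHERE user_id = 1;"
```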

Update the dump file by adding the following lines to the top of the file. SET ROLE dba is only needed if the current user does not have permission to drop the schema; ask your DB admin for the name of the role.
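The added header could look like the sketch below (the role name `dba` is an example). Dropping and recreating the schema lets the restore rebuild everything from scratch instead of failing on existing objects.

```sql
SET ROLE dba;               -- only if the current user cannot drop the schema
DROP SCHEMA public CASCADE; -- remove existing tables so the restore starts clean
CREATE SCHEMA public;
```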


Time the pg restore from the dump file by prepending time to the command.
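A sketch of the timed restore, assuming the plain-SQL dump file from the earlier step:

```shell
# Restore the plain-SQL dump and measure how long it takes.
time psql -U postgres -f /datastore/public.sql postgres
```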

real means "wall-clock time"; user and sys show CPU time, split between user-space code (user) and kernel system calls (sys).

Speed up the restore by running it in a single transaction.
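psql's `--single-transaction` flag wraps the whole script in one BEGIN/COMMIT, which avoids a commit per statement and rolls everything back if any statement fails:

```shell
# Same restore, but wrapped in a single transaction.
time psql -U postgres --single-transaction -f /datastore/public.sql postgres
```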

Timing comparison (default restore vs single-transaction restore):

real (wall clock): 88s vs 68s

user: 0.26s vs 0.37s

sys: 1.67s vs 0.6s

Back up the database using the custom file format.
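A sketch using pg_dump's custom format (`-Fc`); it is compressed and, unlike a plain-SQL dump, must be restored with pg_restore, which can then select objects and run in parallel. The file name is an assumption.

```shell
# Dump in the custom archive format (compressed, pg_restore-only).
pg_dump -U postgres -Fc -f /datastore/public.dump postgres

# Restore with pg_restore; --clean drops existing objects first.
pg_restore -U postgres -d postgres --clean /datastore/public.dump
```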

Run the dump and restore with multiple jobs.
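A sketch of parallelizing with `-j`. Parallel pg_dump requires the directory format (`-Fd`); pg_restore accepts `-j` with either custom or directory archives. The job count and paths are assumptions.

```shell
# Parallel dump: directory format is required for -j.
pg_dump -U postgres -Fd -j 4 -f /datastore/dumpdir postgres

# Parallel restore from the same directory archive.
pg_restore -U postgres -d postgres -j 4 /datastore/dumpdir
```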

Postgres schema size
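One way to get the total on-disk size of the public schema, summing each table's total relation size (table plus indexes and TOAST):

```shell
# Total size of all tables in the public schema, pretty-printed.
psql -U postgres -c "
SELECT pg_size_pretty(sum(pg_total_relation_size(quote_ident(schemaname) || '.' || quote_ident(tablename))))
FROM pg_tables
WHERE schemaname = 'public';"
```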

List of tables

Describe a table
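The two inspection steps above map to psql meta-commands; the table name below is from this walkthrough.

```shell
# List tables visible on the search path.
psql -U postgres -c '\dt'

# Describe one table: columns, types, and indexes.
psql -U postgres -c '\d github_users'
```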