psql Introductory Notes

April 25, 2020 6 minutes

psql is the premier tool for interacting with PostgreSQL databases. Running ad-hoc SQL, executing complex psql scripts, restoring databases: you can do practically everything with it, and for some tasks it’s even the best tool available.

These are some of my notes about it.

Environment variables

Set environment variables to avoid specifying some things every time.

  • PGHOST
  • PGPORT
  • PGUSER

The password is best stored in the Password File (.pgpass).
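As a minimal sketch (the host, port, and user below are invented values), you could export these in your shell profile:

```shell
# Invented example values — substitute your own connection details.
export PGHOST=db.example.com
export PGPORT=5433
export PGUSER=report_user

# psql now picks these up, so plain `psql -d mydb` needs no -h/-p/-U flags.
echo "psql will connect to $PGHOST:$PGPORT as $PGUSER"
```

A matching ~/.pgpass line has the form hostname:port:database:username:password, and the file must be readable only by you (chmod 0600 ~/.pgpass) or psql will ignore it.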

Interactive psql

SQL statements are executed once terminated with a semicolon.

Typing \? lists the available psql meta-commands; typing \h followed by an SQL command (e.g. \h ALTER TABLE) shows that command’s syntax.

Non-interactive psql

Commands can be batched in files that contain a mixture of psql commands and SQL. Such a file can be executed by running psql -f file_here.

To execute SQL from the command line, we can use the -c flag. e.g.:

psql -d example_db -c "DROP TABLE students; CREATE TABLE bobby ();"

Scripts that contain psql commands usually have a .psql extension to differentiate them from pure SQL scripts.

psqlrc config

Usually named .psqlrc and placed in the home directory.

You can include any psql command into this file, and they will get executed upon launching psql. Each command must be on a single line without breaks.

Example:

\pset null 'NULL'
\encoding latin1
\set PROMPT1 '%n@%M:%>%x %/# '
\pset pager always
\timing on
\set qstats92 'SELECT usename, datname, left(query,100) || ''...'' As query FROM pg_stat_activity WHERE state != ''idle'';'

Settings can be changed on the fly during a psql session, but they’ll be lost unless set in the psqlrc file.

\unset can be used to remove a config variable or set it to the default value.

Variable names are case sensitive when using \set and \unset.

Timing command execution

Show or hide the time it took to execute each query after it finishes.

\timing on, \timing off.

Disable autocommit

Autocommit is on by default, meaning that each statement runs in its own transaction and is committed immediately, which makes it irreversible.

\set AUTOCOMMIT off disables this. You’ll then need to COMMIT; if you want your changes committed, or ROLLBACK; otherwise.

If you exit psql without committing your changes, they’ll be rolled back automatically.
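A minimal session sketch (the table and predicate are hypothetical):

```
\set AUTOCOMMIT off
DELETE FROM students WHERE score < 60;  -- runs inside an open transaction
ROLLBACK;                               -- changed your mind: nothing happened
```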

Shortcuts

\set eav 'EXPLAIN ANALYZE VERBOSE'

After this, typing :eav expands into the text we set:

:eav SELECT * FROM my_table;

HISTSIZE and HISTFILE

\set HISTSIZE 20 would let you recover the last 20 commands by using the arrow keys.

\set HISTFILE ~/.psql_history_:DBNAME would save the history of commands into the specified file.

Underrated functionality

Shell commands

\! ls would execute ls in the hosting shell and show the output in psql.

\watch

Appending \watch to a SQL statement repeats it every N seconds.

SELECT datname, query
FROM pg_stat_activity
WHERE state = 'active' AND pid != pg_backend_pid();
\watch 10

CTRL-X CTRL-C kills a running watch.

describe commands

  • List tables with \dt+: \dt+ pg_catalog.pg_t*
  • Details about a particular object: \d+ pg_ts_dict

And many more as described in \?.
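A few more that come up often (the patterns are just examples):

```
\l                -- list databases
\dn               -- list schemas
\df pg_catalog.*  -- list functions matching a pattern
\du               -- list roles
\di               -- list indexes
```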

\crosstabview

New in 9.6.

SELECT student, subject, AVG(score)::numeric(5,2) As avg_score
FROM test_scores
GROUP BY student, subject
ORDER BY student, subject
\crosstabview student subject avg_score

Gives us:

 student | algebra | calculus | chemistry | physics | scheme
---------+---------+----------+-----------+---------+--------
 alex    |   74.00 |    73.50 |     82.00 |   81.00 |
 leo     |   82.00 |    65.50 |     75.50 |   72.00 |
 regina  |   72.50 |    64.50 |     73.50 |   84.00 |  90.00
 sonia   |   76.50 |    67.50 |     84.00 |   72.00 |
(4 rows)

\crosstabview follows the query we want to crosstabulate. It takes three optional arguments; when they’re omitted, they’re inferred from the SELECT statement, provided it returns EXACTLY 3 columns.

  • The first argument is the row header.
  • The second argument is the column header.
  • The third argument is the value that goes inside each cell.

In the example, these could’ve been omitted, but they’re included for clarity.

NOTE: for reference, without \crosstabview, the results would’ve been returned as follows:

alex, algebra, 74.00
alex, calculus, 73.50
alex, chemistry, 82.00
alex, physics, 81.00
leo, algebra, 82.00
leo, calculus, 65.50
...
sonia, chemistry, 84.00
sonia, physics, 72.00

Dynamic SQL Execution with \gexec

Since 9.6, the \gexec command takes each cell of the query result and executes its contents as SQL. Null cells are skipped, and unless ON_ERROR_STOP is set, execution continues past errors.

SELECT
    'CREATE TABLE ' || board.name || '(misc integer)' As create,
    'INSERT INTO ' || board.name || ' VALUES(42) ' AS insert
FROM (VALUES ('diy'),('g')) AS board (name) \gexec

That would create the diy and g tables, and do an insert on them.

CREATE TABLE
INSERT 0 1
CREATE TABLE
INSERT 0 1

Importing and exporting data

The \copy command imports and exports data to a text file, which is tab-separated by default but can also use commas or any other single-character delimiter.

Importing data

  1. Create a staging schema that matches the incoming data. If there’s no certainty about the integrity of the data (e.g. random junk in a supposedly numeric-only column), it may be a good idea to use character types and recast later.
  2. Copy the data from the file into staging.
  3. Explore the data.
  4. Normalize the data as we learn more about it.
  5. Copy the data into our production schema when we’re ready.
\connect serious_database
\cd /serious_reports/
\copy staging.visitor_report_import FROM DEC_20_VISITOR_report_summarized.csv CSV

-- custom delimiter (default is tab)
\copy sometable FROM somefile.txt DELIMITER '|'

-- treat empty strings as NULL
\copy sometable FROM somefile.txt NULL AS ''

Don’t confuse \copy in psql with COPY in SQL: \copy operates with paths local to the machine where the psql client runs, while COPY operates on the server.
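As a sketch of the difference (paths and table are made up):

```
-- SQL COPY: the path is resolved on the server's filesystem, and reading
-- server files requires elevated privileges (e.g. a superuser role).
COPY sometable FROM '/var/lib/postgresql/somefile.txt';

-- psql \copy: the path is resolved on the machine running the psql client.
\copy sometable FROM somefile.txt
```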

Exporting data

By default, exports use TSV, mirroring imports. This doesn’t export headers of any kind, though, so the following example uses CSV for the headers. Note that \copy, like other meta-commands, must be written on a single line.

\connect serious_database
\copy (SELECT * FROM staging.visitor_report_import WHERE s01 ~ E'^[0-9]+') TO '/test.tab' WITH DELIMITER E'\t' CSV HEADER

You can surround each non-NULL column value with the quote character (double quote, ", by default) by using FORCE QUOTE *.

\connect serious_database
\copy staging.visitor_report_import TO '/test.csv' WITH CSV HEADER QUOTE '"' FORCE QUOTE *

As the default quote character is ", the QUOTE '"' clause in the previous example is superfluous.

Copying from external programs

https://www.depesz.com/2013/02/28/waiting-for-9-3-add-support-for-piping-copy-tofrom-an-external-program/
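In short: since 9.3, COPY accepts FROM PROGRAM / TO PROGRAM to pipe data through an external command. A sketch with invented table and file names:

```
-- Server-side: the program runs on the database server.
COPY sometable FROM PROGRAM 'gunzip -c /var/lib/postgresql/somefile.txt.gz';

-- With \copy, the program runs on the machine running psql.
\copy sometable FROM PROGRAM 'gunzip -c somefile.txt.gz'
```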

bonus: psql inside Docker containers

If for whatever reason you need to run psql inside a running Docker container, the easiest way to do so is to run an interactive command.

$ docker container exec -it {CONTAINER_ID} bash

-- inside container
# su {POSTGRES_USER}
# psql -d {POSTGRES_DB}

Alternatively, you can run psql directly from the host as follows (note that -W merely forces a password prompt; it doesn’t take the password as an argument):

$ docker container exec -it {CONTAINER_ID} psql -U {POSTGRES_USER} -W {POSTGRES_DB}

I’m not a fan of running databases inside containers, but sometimes you do during development, so there’s that.