Postgres Changes

Get up and running with Realtime's Postgres Changes feature

Realtime's Postgres Changes feature listens for database changes and sends them to clients. Clients are required to subscribe with a JWT dictating which changes they are allowed to receive based on the database's Row Level Security.

Anyone with access to a valid JWT signed with the project's JWT secret is able to listen to your database's changes, unless tables have Row Level Security enabled and policies in place.

Clients can choose to receive INSERT, UPDATE, DELETE, or * (all) events for all changes in a schema, for a specific table in a schema, or for rows where a column matches a specified value. Your clients should only listen to tables in the public schema, and you must first enable the tables you want your clients to listen to.

Quick start#

Let's explore how to implement Realtime Postgres Changes so you can integrate it into your use case.

1

Set up a Supabase project with a 'todos' table

Create a new project in the Supabase Dashboard.

After your project is ready, create a table in your Supabase database. You can do this with either the Table interface or the SQL Editor.

-- Create a table called "todos"
-- with a column to store tasks.
create table todos (
  id serial primary key,
  task text
);

2

Allow anonymous access

In this example we'll turn on Row Level Security for this table and allow anonymous access. In production, be sure to secure your application with the appropriate permissions.

-- Turn on security
alter table "todos"
  enable row level security;

-- Allow anonymous access
create policy "Allow anonymous access"
  on todos
  for select
  to anon
  using (true);

3

Enable Postgres replication

Go to your project's Replication settings, and under supabase_realtime, toggle on the tables you want to listen to.
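
If you prefer SQL, you can also add a table to the supabase_realtime publication directly. A minimal sketch, assuming the todos table from step 1:

alter publication supabase_realtime
  add table todos;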

4

Install the client

Install the Supabase JavaScript client.

npm install @supabase/supabase-js

5

Create the client

This client will be used to listen to Postgres changes.

import { createClient } from '@supabase/supabase-js'

const client = createClient(
  'https://<project>.supabase.co',
  '<your-anon-key>'
)

6

Listen to changes by schema

Listen to changes on all tables in the public schema by setting the schema property to 'public' and event name to *. The event name can be one of:

  • INSERT
  • UPDATE
  • DELETE
  • *

The channel name can be any string except 'realtime'.

const channelA = client
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Listen to changes by table

Listen to just inserts in the todos table by setting the table property to 'todos' and event name to INSERT.

const channelB = client
  .channel('table-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'todos',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Listen to changes by filter

Listen to changes in the todos table when a column's value equals a specified value. In this example, we only listen to inserts on todos where the row id is 1.

const channelC = client
  .channel('table-filter-changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'todos',
      filter: 'id=eq.1',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

More filters

Check out the full list of available filters.

7

Insert dummy data

Now we can add some data to our table, which will trigger the channelA, channelB, and channelC event handlers.

insert into todos (task)
values
('Change!');
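
Each handler receives a change payload. As a rough sketch (field names follow supabase-js v2; the values here are illustrative, not real output), an insert delivered to channelB looks roughly like this:

{
  schema: 'public',
  table: 'todos',
  commit_timestamp: '2021-01-01T00:00:00Z',
  eventType: 'INSERT',
  new: { id: 1, task: 'Change!' },
  old: {},
  errors: null
}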

Available filters#

Realtime offers filters so you can specify the data your client receives at a more granular level.

eq#

To listen to changes when a column's value in a table equals a client-specified value:

const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE',
      schema: 'public',
      table: 'messages',
      filter: 'body=eq.hey',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

note

This filter uses Postgres' =.

neq#

To listen to changes when a column's value in a table does not equal a client-specified value:

const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'messages',
      filter: 'body=neq.bye',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

note

This filter uses Postgres' !=.

lt#

To listen to changes when a column's value in a table is less than a client-specified value:

const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'messages',
      filter: 'id=lt.100',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

note

This filter uses Postgres' <, so it also works for non-numeric types, but make sure to check the expected comparison behavior for the column's data type.

lte#

To listen to changes when a column's value in a table is less than or equal to a client-specified value:

const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE',
      schema: 'public',
      table: 'profiles',
      filter: 'age=lte.65',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

note

This filter uses Postgres' <=, so it also works for non-numeric types, but make sure to check the expected comparison behavior for the column's data type.

gt#

To listen to changes when a column's value in a table is greater than a client-specified value:

const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'products',
      filter: 'quantity=gt.10',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

note

This filter uses Postgres' >, so it also works for non-numeric types, but make sure to check the expected comparison behavior for the column's data type.

gte#

To listen to changes when a column's value in a table is greater than or equal to a client-specified value:

const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'products',
      filter: 'quantity=gte.10',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

note

This filter uses Postgres' >=, so it also works for non-numeric types, but make sure to check the expected comparison behavior for the column's data type.

in#

To listen to changes when a column's value in a table equals any client-specified values:

const channel = supabase
  .channel('changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'colors',
      filter: 'name=in.(red, blue, yellow)',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

note

This filter uses Postgres' = ANY. Realtime allows a maximum of 100 values for this filter.

Combination changes#

To listen to different combinations of events, schemas, tables, and filters on the same channel:

const channel = supabase
  .channel('db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'messages',
      filter: 'body=eq.bye',
    },
    (payload) => console.log(payload)
  )
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'users',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Full old record#

By default, only the new record is sent. If you want to receive the old record (previous values) whenever you UPDATE or DELETE a record, set the replica identity of your table to full:

alter table messages
  replica identity full;

caution

RLS policies are not applied to DELETE statements. When RLS is enabled and replica identity is set to full on a table, the old record contains only the primary key(s).
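
With replica identity set to full, the previous column values arrive on the payload's old field for UPDATE (and, subject to the caution above, DELETE) events. A minimal sketch of reading them, assuming the messages table above:

const channel = supabase
  .channel('old-record-changes')
  .on(
    'postgres_changes',
    { event: 'UPDATE', schema: 'public', table: 'messages' },
    (payload) => {
      // payload.old carries the previous values; payload.new the current ones
      console.log('before:', payload.old)
      console.log('after:', payload.new)
    }
  )
  .subscribe()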

Private schemas#

Postgres Changes works out of the box for tables in the public schema. You can listen to tables in your private schemas by granting table SELECT permissions to the database role found in your access token. You can run a query similar to the following:

grant select on "non_private_schema"."some_table" to authenticated;
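
Once the role can read the table, subscribing works the same way as for public tables, just with a different schema value. A minimal sketch, reusing the schema and table names from the grant above:

const channel = supabase
  .channel('private-schema-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'non_private_schema',
      table: 'some_table',
    },
    (payload) => console.log(payload)
  )
  .subscribe()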

caution

We strongly encourage you to enable RLS and create policies for tables in private schemas. Otherwise, any role you grant access to will have unfettered read access to the table.

Custom tokens#

You may choose to sign your own tokens to customize claims that can be checked in your RLS policies.

Your project JWT secret is found with your Project API keys in your dashboard.
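
As one possible approach (not the only way to sign tokens), you can sign a custom JWT server-side with a library such as jsonwebtoken. The claims shown here, including the authenticated role and a one-hour expiry, are assumptions to adapt to your own RLS policies:

const jwt = require('jsonwebtoken')

// Sign a custom token with the project's JWT secret (server-side only).
const customToken = jwt.sign(
  {
    role: 'authenticated', // database role used for RLS checks
    sub: 'user-id-123', // example custom claim your policies can read
    exp: Math.floor(Date.now() / 1000) + 60 * 60, // 1 hour expiry
  },
  process.env.SUPABASE_JWT_SECRET
)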

caution

Do not expose the service_role token on the client because the role is authorized to bypass row-level security.

To use your own JWT with Realtime, make sure to set the token after instantiating the Supabase client and before connecting to a Channel.

const { createClient } = require('@supabase/supabase-js')

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_KEY, {})

// Set your custom JWT here
supabase.realtime.setAuth('your-custom-jwt')

const channel = supabase
  .channel('db-changes')
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'messages',
      filter: 'body=eq.bye',
    },
    (payload) => console.log(payload)
  )
  .subscribe()

Refreshed tokens#

You will need to refresh tokens on your own, but once generated, you can pass them to Realtime.

For example, if you're using the supabase-js v2 client then you can pass your token like this:

// Client setup

supabase.realtime.setAuth('fresh-token')
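
How you refresh the token is up to you. As a rough sketch, you could re-sign a short-lived token on an interval and push it to Realtime; signToken here is a hypothetical helper that returns a freshly signed JWT:

// Keep the Realtime connection authorized with a fresh token.
// signToken() is a hypothetical helper that returns a newly signed JWT.
setInterval(async () => {
  const freshToken = await signToken()
  supabase.realtime.setAuth(freshToken)
}, 30 * 60 * 1000) // every 30 minutes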

Limitations#

The current version of Realtime's Postgres Changes feature has some performance limitations to keep in mind. There can be a bottleneck on the database that limits the number of messages streamed to subscribed clients.

The polling query that fetches the changes is currently single-threaded. If your database changes frequently, the polling query may not be able to fetch the changes quickly enough; delivery will be delayed and may eventually time out. Additionally, there is a polling query per subscribed client, which means that each unique pair of filters and JWT (logged-in user) affects the performance of the polling query. In other words, RLS and filters can further limit the number of messages that can be streamed to subscribed clients.

From our testing and observation of the performance of the polling query, we have found that the following limits are safe to use:

| Database Add-on | Filters | RLS Usage | Concurrent Clients | Records per second per client | Messages per second (total) | Latency p95 (ms) |
|---|---|---|---|---|---|---|
| micro to medium | 🚫 | 🚫 | 500 | 10 | 5,000 | 257 |
| micro to medium | 🚫 | 🚫 | 1,000 | 10 | 10,000 | 800 |
| micro to medium | 🚫 | 🚫 | 5,000 | 2 | 10,000 | 1,120 |
| large and above | 🚫 | 🚫 | 1,000 | 10 | 10,000 | 261 |
| large and above | 🚫 | 🚫 | 5,000 | 2 | 10,000 | 952 |
| large and above | 🚫 | 🚫 | 10,000 | 1 | 10,000 | 949 |
| large and above | 🚫 | 🚫 | 200,000 | 0.04 (1 in 25 seconds) | 8,000 | 15,709 |
| large and above | ✅ | ✅ | 500 | 2 | 1,000 | 262 |
| large and above | ✅ | ✅ | 1,000 | 1 | 1,000 | 431 |
| large and above | ✅ | ✅ | 5,000 | 0.2 (1 in 5 seconds) | 1,000 | 702 |

If you want to use Realtime's Postgres Changes feature, be aware of these limitations. If you are using this feature at scale, consider using a separate table without RLS and filters for streaming changes to subscribed clients. Alternatively, you can use Realtime server-side only and re-stream the changes to your clients using Realtime Broadcast. Don't forget to run your own benchmarks to make sure the performance is acceptable for your use case.
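
As a rough sketch of the re-streaming idea, a single server-side subscriber could forward Postgres changes to clients over Broadcast (the channel and event names here are assumptions):

// Server side: one Postgres Changes subscription, fanned out over Broadcast.
const fanout = supabase.channel('todos-fanout')
fanout.subscribe()

supabase
  .channel('server-db-changes')
  .on(
    'postgres_changes',
    { event: '*', schema: 'public', table: 'todos' },
    (payload) =>
      fanout.send({ type: 'broadcast', event: 'todo-change', payload })
  )
  .subscribe()

// Client side: listen to Broadcast instead of Postgres Changes.
supabase
  .channel('todos-fanout')
  .on('broadcast', { event: 'todo-change' }, (payload) => console.log(payload))
  .subscribe()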

We are making many improvements to Realtime's Postgres Changes.

If you are uncertain about the performance of your use case, please reach out using the Support Form and we will be happy to help. We have a team of engineers who can advise you on the best solution for your use case.

More Realtime Quickstarts#