Realtime

Subscribing to Database Changes

Listen to database changes in real-time from your website or application.


You can use Supabase to subscribe to real-time database changes. There are two options available:

  1. Broadcast. This is the recommended method for scalability and security.
  2. Postgres Changes. This is a simpler method. It requires less setup, but does not scale as well as Broadcast.

Using Broadcast

To automatically send messages when a record is created, updated, or deleted, we can attach a Postgres trigger to any table. Supabase Realtime provides a realtime.broadcast_changes() function to use in conjunction with the trigger. This function broadcasts over a private channel, so a broadcast authorization RLS policy must be in place for clients to receive the messages.

Broadcast authorization

Realtime Authorization is required to receive Broadcast messages. The following example policy allows authenticated users to listen to messages on any topic:

```sql
create policy "Authenticated users can receive broadcasts"
on "realtime"."messages"
for select
to authenticated
using ( true );
```

Create a trigger function

Let's create a function that we can call any time a record is created, updated, or deleted. This function will make use of some of Postgres's native trigger variables. For this example, we want to have a topic with the name topic:<record id> to which we're going to broadcast events.

```sql
create or replace function public.your_table_changes()
returns trigger
security definer
language plpgsql
as $$
begin
  perform realtime.broadcast_changes(
    'topic:' || coalesce(NEW.id, OLD.id)::text, -- topic: built from the record id
    TG_OP,           -- event: the operation that triggered the function
    TG_OP,           -- operation
    TG_TABLE_NAME,   -- table that caused the trigger
    TG_TABLE_SCHEMA, -- schema of that table
    NEW,             -- new record: the record after the change
    OLD              -- old record: the record before the change
  );
  return null;
end;
$$;
```
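The function builds topic names as 'topic:' followed by the record id, and the client must subscribe with an identical string. A minimal sketch of the matching client-side helper (the name `buildTopic` is illustrative, not part of Supabase):

```javascript
// Build the topic name in the same shape the trigger broadcasts to:
// 'topic:' || <record id>
const buildTopic = (recordId) => `topic:${recordId}`

console.log(buildTopic(42)) // prints "topic:42"
```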

Create a trigger

Let's set up a trigger so the function is executed after any changes to the table.

```sql
create trigger handle_your_table_changes
after insert or update or delete
on public.your_table
for each row
execute function public.your_table_changes();
```

Listening on the client side

Finally, on the client side, listen to the topic topic:<record_id> to receive the events. Remember to set the channel as a private channel, since realtime.broadcast_changes uses Realtime Authorization.

```javascript
const gameId = 'id' // the record id used to build the topic name
await supabase.realtime.setAuth() // Needed for Realtime Authorization
const channel = supabase
  .channel(`topic:${gameId}`, {
    config: { private: true },
  })
  .on('broadcast', { event: 'INSERT' }, (payload) => console.log(payload))
  .on('broadcast', { event: 'UPDATE' }, (payload) => console.log(payload))
  .on('broadcast', { event: 'DELETE' }, (payload) => console.log(payload))
  .subscribe()
```
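When a subscription is no longer needed (for example, on component unmount), remove the channel so the client stops receiving events. A minimal sketch using supabase-js's removeChannel; the helper name `stopListening` is illustrative:

```javascript
// Remove a realtime channel when it is no longer needed. `client` is a
// supabase-js client and `channel` is a channel returned by client.channel().
const stopListening = async (client, channel) => {
  // removeChannel unsubscribes and resolves to 'ok', 'timed out', or 'error'
  return client.removeChannel(channel)
}
```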

Using Postgres Changes

Postgres Changes are simple to use, but have some limitations as your application scales. We recommend using Broadcast for most use cases.

Enable Postgres Changes

You'll first need to create a supabase_realtime publication and add the tables you want to subscribe to:

```sql
begin;

-- remove the supabase_realtime publication
drop publication if exists supabase_realtime;

-- re-create the supabase_realtime publication with no tables
create publication supabase_realtime;

commit;

-- add a table called 'messages' to the publication
-- (update this to match your tables)
alter publication supabase_realtime add table messages;
```

Streaming inserts

You can use the INSERT event to stream all new rows.

```javascript
const channel = supabase
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```

Streaming updates

You can use the UPDATE event to stream all updated rows.

```javascript
const channel = supabase
  .channel('schema-db-changes')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE',
      schema: 'public',
    },
    (payload) => console.log(payload)
  )
  .subscribe()
```
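Deletes follow the same pattern with the DELETE event. A sketch, wrapped in a function for illustration (the name `streamDeletes` is not part of Supabase; `supabase` is assumed to be an initialized client). Note that for DELETE events the old row is only fully populated when the table's replica identity is set to full:

```javascript
// Stream deleted rows: the same pattern as inserts/updates, with event: 'DELETE'.
const streamDeletes = (supabase) =>
  supabase
    .channel('schema-db-changes')
    .on(
      'postgres_changes',
      { event: 'DELETE', schema: 'public' },
      (payload) => console.log(payload)
    )
    .subscribe()
```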