ClickHouse
This tutorial shows how to set up a pipeline to stream traffic events to Upstash Kafka and analyze them with ClickHouse.
In this tutorial series, we will show how to build an end-to-end real-time analytics system. We will stream the traffic (click) events from our web application to Upstash Kafka, then analyze them in real time. We will implement one simple query with different stream processing tools:
Namely, we will query the number of page views from different cities in the last 15 minutes. We keep the query and scenario intentionally simple to make the series easy to understand, but you can easily extend the model for your more complex real-time analytics scenarios.
If you have not already set up the Kafka pipeline, see the first part of the series, where we set up our pipeline including Upstash Kafka and Cloudflare Workers (or Vercel).
In this part of the series, we will showcase how to use ClickHouse to run queries on data streamed from a Kafka topic.
ClickHouse Setup
You can create a managed service from ClickHouse Cloud with a 30-day free trial. Select your region and enter a name for your service. For simplicity, you can allow access to the service from anywhere. If you want to restrict access by IP address, here is the list of Upstash addresses that need permission:
Create a table
On the ClickHouse service screen, click `Open SQL console`. Then click `+` to open a new query window and run the following query to create a table:
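A minimal table definition could look like the sketch below. The table name `page_views` and the columns (`event_time`, `city`, `url`) are assumptions for this example; adapt them to the fields your web app actually sends.

```sql
-- Sketch: a simple MergeTree table for click events.
-- event_time defaults to the insert time, so the connector
-- only needs to supply city and url.
CREATE TABLE page_views
(
    event_time DateTime DEFAULT now(),
    city String,
    url String
)
ENGINE = MergeTree
ORDER BY event_time;
```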
Kafka Setup
We will create an Upstash Kafka cluster. Upstash offers serverless Kafka clusters with per-message pricing. Select the same (or the nearest) region as your ClickHouse region for the best performance.
Also create a topic whose messages will be streamed to ClickHouse.
Connector Setup
We will create a connector on the Upstash console. Select your cluster and click the `Connectors` tab. Select `Aiven JDBC Connector - Sink`.
Click `Next` to skip the Config step, as we will enter the configuration manually at the third (Advanced) step.
In the third step, copy and paste the config below into the text editor:
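Here is a sketch of what the connector config could look like. The values in angle brackets are placeholders, and the topic name (`page-views`), dead letter topic name, and table/column names are assumptions for this example:

```json
{
  "name": "kafka-clickhouse-connector",
  "connector.class": "io.aiven.connect.jdbc.JdbcSinkConnector",
  "topics": "page-views",
  "connection.url": "jdbc:clickhouse://<YOUR_SERVICE>.clickhouse.cloud:8443/default?ssl=true",
  "connection.user": "<YOUR_USER>",
  "connection.password": "<YOUR_PASSWORD>",
  "errors.deadletterqueue.topic.name": "dlq-page-views",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "value.converter.schemas.enable": "true"
}
```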
Replace the following attributes:
- `name`: Name your connector.
- `connection.password`: Copy this from your ClickHouse dashboard (`Connect` > `View connection string`).
- `connection.url`: Copy this from your ClickHouse dashboard (`Connect` > `View connection string`).
- `connection.user`: Copy this from your ClickHouse dashboard (`Connect` > `View connection string`).
- `errors.deadletterqueue.topic.name`: Give a name for your dead letter topic. It will be created automatically.
- `topics`: Enter the name of the topic that you have created.
Note that the `connection.url` should include `?ssl=true` as a parameter.
Click the `Connect` button to create the connector.
Test and Run
ClickHouse expects a schema together with the message payload. We need to go back to the setup step and update the message object to include the schema as below:
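As a sketch, the producer code in the Worker (or Vercel function) could wrap each event in the Kafka Connect JSON envelope (a `schema` object plus a `payload` object) that the JDBC sink connector's `JsonConverter` expects. The field names `city` and `url` are assumptions matching the example table from the earlier step:

```javascript
// Wrap a click event in the Kafka Connect JSON envelope
// ("schema" + "payload") expected by the JDBC sink connector.
// Field names (city, url) are assumptions for this example.
function toConnectRecord(event) {
  return {
    schema: {
      type: "struct",
      optional: false,
      fields: [
        { field: "city", type: "string", optional: false },
        { field: "url", type: "string", optional: false },
      ],
    },
    payload: { city: event.city, url: event.url },
  };
}

// Example: serialize a record before producing it to the topic.
const record = toConnectRecord({ city: "Istanbul", url: "/" });
console.log(JSON.stringify(record));
```

The serialized string is what you would send as the message body to your Upstash Kafka topic in place of the bare event object.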
It is not ideal to send the schema together with the payload; a schema registry is the solution to this. Upstash will launch a managed schema registry service soon.
After deploying the changes (to the Cloudflare Worker or Vercel function), visit your web app to generate traffic to Kafka.
Now, go to the ClickHouse console (`Connect` > `Open SQL console`). Click `page_views` (your table's name) in the left menu. You should see that the table is populated like below:
Also run the following query to get the most popular cities in the last 15 minutes:
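A query along these lines would do it, assuming the table has columns like `event_time` and `city` as in the example table definition:

```sql
-- Page views per city over the last 15 minutes, most popular first.
SELECT city, count() AS views
FROM page_views
WHERE event_time > now() - INTERVAL 15 MINUTE
GROUP BY city
ORDER BY views DESC;
```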
It should return something like below: