Introduction
Kafka Connect is a tool for streaming data between Apache Kafka and other systems without writing a single line of code. Via Kafka Sink Connectors, you can export your data to other storage systems. Via Kafka Source Connectors, you can pull data into your Kafka topics from other systems.
Kafka Connectors can be self-hosted, but that requires you to set up and maintain extra processes and machines. Upstash provides hosted versions of connectors for your Kafka cluster. This removes the burden of maintaining an extra system and can also improve performance, since the connectors run close to your cluster.
Pricing
Connectors are free to use. We don’t charge anything extra for connectors beyond the per-message pricing of Kafka topics. Check out Pricing for details on our per-message pricing.
Get Started
We will create a MongoDB source connector as an example. You can find examples for all supported connectors in the left sidebar under the Connectors section.
Create a Kafka Cluster
If you do not have a Kafka cluster and/or topic already, follow these steps to create one.
Create a MongoDB Database
Let’s prepare our MongoDB Atlas database. Go to MongoDB Atlas and register. Select Build Database and choose the Free Shared option for this example.
Proceed with Create Cluster; the defaults should be fine. If this is your first time, you will see the Security Quickstart screen.
Choose a username and password. You will need these later for the MongoDB connection string.
In the next screen, you will allow Upstash to connect to your MongoDB database, so be careful in this step. Select Cloud Environment and then IP Access List. Enter the following static Upstash IP addresses into the IP Access List.
From here, you will be redirected to the Database Deployments screen. Go to Connect and select Connect your application to find the MongoDB URI (connection string). Copy this string to use later when creating our Kafka connector. Don’t forget to replace the password placeholder with the password you selected earlier for your MongoDB user.
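The Atlas connection string typically looks like the following (a hypothetical example; your actual username, password, and cluster host will differ):

```
mongodb+srv://myUser:myPassword@cluster0.ab1cd.mongodb.net/?retryWrites=true&w=majority
```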
Create the Connector
Head over to console.upstash.com and select your Kafka cluster. Go to the Connectors tab, and create your first connector with the New Connector button.
Then choose MongoDB Source Connector for this example.
Choose a connector name and enter the MongoDB URI (connection string) that we prepared earlier in the Config screen. Other configurations are optional. We will skip them for now.
The Advanced screen is for any other configuration that the selected connector supports. At the top of this screen, you can find a link to the related documentation. For this example, we can proceed with what we have and click the Connect button directly.
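Under the hood, the form fields map to the connector’s configuration properties. A minimal sketch of what a MongoDB source connector configuration might look like (property names follow the MongoDB Kafka connector; the values here are hypothetical placeholders):

```json
{
  "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
  "connection.uri": "mongodb+srv://myUser:myPassword@cluster0.ab1cd.mongodb.net",
  "database": "myDatabase",
  "collection": "myCollection"
}
```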
Congratulations, you have created your first source connector to Kafka. Note that no topics will be created until some data is available in MongoDB.
See It In Action
With this setup, any data you insert into MongoDB will be available on your Kafka topic almost immediately. Let’s go to MongoDB and populate it with some data.
From the main Database screen, choose Browse Collections, and then click Add My Own Data. Create your database in the next screen.
Select Insert Document on the right, and let’s put some data here.
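For example, a minimal document like this (the fields are arbitrary; any valid JSON document will do):

```json
{
  "name": "Alice",
  "city": "Istanbul"
}
```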
Shortly, we should see a topic named DATABASE_NAME.COLLECTION_NAME, matching the database and collection names in MongoDB, created in the Upstash Console Kafka cluster.
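By default, the MongoDB source connector publishes the full change stream event for each operation, so an insert would produce a message roughly like this (field values are illustrative, and the exact shape depends on the connector’s settings):

```json
{
  "_id": { "_data": "..." },
  "operationType": "insert",
  "fullDocument": { "_id": { "$oid": "..." }, "name": "Alice", "city": "Istanbul" },
  "ns": { "db": "myDatabase", "coll": "myCollection" }
}
```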
After selecting the topic, you can go to the Messages section to see the latest events as they arrive from Kafka.
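You can also consume the topic programmatically. Here is a minimal sketch using the kafka-python library, assuming SASL_SSL with SCRAM-SHA-256 authentication; the bootstrap server, credentials, and topic name below are hypothetical placeholders, so copy the real values from your cluster’s details page in the Upstash Console:

```python
from kafka import KafkaConsumer  # pip install kafka-python

# Placeholder values; replace with the connection details shown
# for your cluster in the Upstash Console.
consumer = KafkaConsumer(
    "myDatabase.myCollection",
    bootstrap_servers="my-cluster-us1-kafka.upstash.io:9092",
    security_protocol="SASL_SSL",
    sasl_mechanism="SCRAM-SHA-256",
    sasl_plain_username="UPSTASH_KAFKA_USERNAME",
    sasl_plain_password="UPSTASH_KAFKA_PASSWORD",
    auto_offset_reset="earliest",
)

# Print each change event as it arrives from the connector.
for record in consumer:
    print(record.value.decode("utf-8"))
```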
Next
Check our list of available connectors and how to use them via the following links:
- MongoDB Source Connector
- MongoDB Sink Connector
- Debezium MongoDB Source Connector
- Debezium MySQL Source Connector
- Debezium PostgreSQL Source Connector
- Aiven JDBC Source Connector
- Aiven JDBC Sink Connector
- Google BigQuery Sink Connector
- Aiven Amazon S3 Sink Connector
- Aiven OpenSearch (Elasticsearch) Sink Connector
- Aiven HTTP Sink Connector
- Snowflake Sink Connector
If the connector that you need is not in this list, please add a request to our Roadmap.