
Heroku Streaming Data Connectors Are Now Generally Available


Posted by Scott Truitt October 08, 2020


This summer, we announced the beta release of our new streaming data connectors between Heroku Postgres and Apache Kafka on Heroku. These connectors make Change Data Capture (CDC) possible on Heroku with minimal effort. Anyone with a Private or Shield Space that contains both a Heroku Postgres and an Apache Kafka add-on can use Streaming Data Connectors today at no additional charge.
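Concretely, a connector captures row-level inserts, updates, and deletes from your Postgres tables and publishes each one to Kafka as a Debezium-formatted change event (Debezium powers these connectors, as noted below). As a rough sketch, an update to a row in the public.posts table from the example later in this post might produce an event whose payload looks something like this (abbreviated, with made-up values; per Debezium's documented envelope, "op" is "c", "u", "d", or "r" for create, update, delete, or snapshot read, and "before" is only populated when the table's replica identity allows it):

{
  "payload": {
    "op": "u",
    "before": null,
    "after": { "id": 42, "title": "Hello, world", "user_id": 7 },
    "source": { "schema": "public", "table": "posts" },
    "ts_ms": 1602115200000
  }
}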

Customers use connectors to build streaming data pipelines from Salesforce into external stores, such as a Snowflake data lake or an AWS Kinesis stream, for integration with other data sources. They also use them to refactor monoliths into microservices, implement event-based architectures, archive data in lower-cost storage services, and more.

Other customers use connectors to build a unified event feed from data in multiple Salesforce and Work.com orgs, providing a centralized Kafka-based event bus for acting on activity across all orgs. Multiple integrations are possible in this configuration, including Heroku apps running in dynos, Salesforce Flow, MuleSoft, and more.

And we’ve uncovered new opportunities for further enhancements and integrations in the months to come.

[Diagram: Heroku streaming data connectors]

We’ve also made multiple improvements since the beta: connectors now prevent lost events during Postgres maintenance and minimize lost events during a Postgres failover. We also added an update command for changing a connector's tables or excluded columns after initial provisioning, and upgraded Debezium to the latest 1.3 release.
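For example, adding a newly created table to an existing connector might look like this (a sketch: the connector name is hypothetical, and the flag names are assumptions based on the CLI plugin, so verify with heroku data:connectors:update --help):

heroku data:connectors:update gentle-connector-1234 \
  --add-table public.comments \
  --exclude-column public.comments.email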

It’s as easy as heroku data:connectors:create

To get started, make sure you have the latest CLI plugin. Then create a connector by identifying the Postgres source and Apache Kafka store by name, specifying which table(s) to include, and optionally listing which columns to exclude:

heroku data:connectors:create \
  --source postgresql-neato-98765 \
  --store kafka-lovely-12345 \
  --table public.posts --table public.users \
  --exclude-column public.users.password
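Provisioning runs asynchronously after the create command returns. As a hedged follow-up (the connector name is hypothetical, and the command names are assumptions from the same CLI plugin, so verify with heroku data:connectors --help):

heroku data:connectors:wait gentle-connector-1234
heroku data:connectors:info gentle-connector-1234

Once the connector is ready, each captured table is written to its own Kafka topic, which downstream consumers subscribe to like any other topic.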

See the full instructions and best practices for more detail.

Feedback Welcome

Streaming Data Connectors open a new frontier of data-driven development for our customers and for us. We look forward to seeing what you can do with them.

Ready to get started? Contact sales.

