Top 3 Reasons Why Kafka + SQL is an Amazing Combination
The combination of Kafka and SQL can be a game-changer for your business. It simplifies streaming deployments, accelerates streaming app time-to-market, and helps you unlock more value from your Kafka deployment.
Normalizing JSON Data in Kafka Topics? SQLStreamBuilder Input Transforms to the Rescue
Input Transforms allow you to write a JavaScript function that operates on each message after it's consumed from Kafka (or any other source) but before you write SQL against it.
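As a minimal sketch of the idea: the transform receives the raw message, reshapes it, and returns the JSON the SQL layer will see. The function and field names here are illustrative, not SQLStreamBuilder's exact API.

```javascript
// Hypothetical input transform: flatten a nested JSON payload into
// top-level fields before SQL sees it. The incoming message body is
// modeled as the `record` argument; the transform returns the
// rewritten JSON string.
function transform(record) {
  var msg = JSON.parse(record);
  var out = {
    id: msg.id,
    // lift nested fields to the top level so SQL can reference them directly
    city: msg.address ? msg.address.city : null,
    country: msg.address ? msg.address.country : null
  };
  return JSON.stringify(out);
}
```

With a transform like this in place, a query can reference `city` and `country` as plain columns instead of digging into nested JSON.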
JavaScript Functions in Flink with SQLStreamBuilder
JavaScript Functions allow you to create arbitrary functions and call them directly from SQL. You don't need to restart your system, stop your cluster, or compile anything. Just specify a function and get to business.
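A minimal sketch of what such a function body might look like, assuming the function name and signature are illustrative rather than SQLStreamBuilder's exact registration API:

```javascript
// Hypothetical user-defined function: convert an ISO-8601 timestamp
// string to epoch milliseconds. Once registered, it could be called
// from SQL roughly like:
//   SELECT TO_EPOCH(eventTime) FROM my_topic;
function TO_EPOCH(dateString) {
  // Date.parse handles ISO-8601 strings; getTime() yields epoch millis
  return new Date(dateString).getTime();
}
```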
Joining Kafka Streams Using SQL to Enrich and Route Data
Joins are an important and powerful part of the SQL language. You can perform joins in SQLStreamBuilder to enrich data and create entirely new streams of useful data.
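To make the enrichment concrete, here is a plain JavaScript sketch of what an inner join does to the data: each event is paired with the reference row sharing its key, roughly what `SELECT ... FROM events e JOIN users u ON e.user_id = u.id` would produce. The table and column names are illustrative, not from any specific deployment.

```javascript
// Inner-join enrichment, modeled over in-memory arrays:
// events carrying a user_id are enriched with the matching user's name.
function enrich(events, users) {
  // index the reference data by key for O(1) lookup
  var byId = {};
  users.forEach(function (u) { byId[u.id] = u; });
  // inner-join semantics: events without a matching reference row are dropped
  return events
    .filter(function (e) { return byId[e.user_id] !== undefined; })
    .map(function (e) {
      return { user_id: e.user_id, action: e.action, name: byId[e.user_id].name };
    });
}
```

In SQLStreamBuilder the join runs continuously over the stream rather than over finished arrays, but the per-row pairing is the same idea.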
SQLStreamBuilder October Feature Update
Our mission has remained clear: to build the best way to create and manage stream processing jobs using SQL, so you can work with your Kafka clusters and processing logic much like the databases you already know.
SQLStreamBuilder Feature Update
I thought I would share a couple of improvements and features we recently added to SQLStreamBuilder. If you aren't familiar, SQLStreamBuilder allows you to run streaming SQL jobs against streams of data, initially on Kafka.
Introducing SQLStreamBuilder
Today, we are launching SQLStreamBuilder (SSB): an interface to declare stream processing jobs using SQL. SQLStreamBuilder provides a powerful interface for writing SQL as well as managing SQL jobs.