
May Preview Release: Advancing KSQL and Schema Registry

Rohan Desai and Ran Ma

We are very excited to announce the Confluent Platform May 2018 Preview release! The May Preview introduces powerful new capabilities for KSQL and the Schema Registry UI. Read on to learn more, and remember to share your feedback and help shape Confluent software! You can do that by visiting the Confluent Community Slack (particularly the #ksql and #control-center channels) or by contributing to the KSQL project on GitHub, where you can file issues, submit pull requests, and join discussions.

Download The Preview Release


Confluent Control Center

Schema Registry

Schema Registry management has been one of the most requested features from our customers. In this preview release, we’re introducing the new Schema Registry UI, which allows users to see the schema for each topic, along with its version history, and easily compare a previous version of the schema with the current one.

The new UI is designed to help the operations team with schema management and allow beginners to learn about Schema Registry.

To access a topic’s schema, simply navigate to the new SCHEMA tab on the topic details page, or click ‘•••’ and select “Schema”.

Notice the “Value” and “Key” tabs, which show the schemas for both the key and the value of the messages in the topic. Click the “Version History” button to view all versions of the schema and compare any of them against the current version. You can also download the schema by clicking the “Download” button at the top right.

KSQL Editor Supports Autocompletion

We’ve added autocomplete to the KSQL query editor to help you compose queries faster. No more hunting for available streams, tables, or specific KSQL syntax: autocomplete suggests them as you type.

KSQL

INSERT INTO

INSERT INTO is a new statement that lets you write query output into an existing stream. INSERT INTO is currently not supported for tables. You can use INSERT INTO to merge output from multiple queries into a single output stream.

For example, suppose you are a retailer with separate streams for online and in-store sales. You want to compute your daily total sales for different items. You can use INSERT INTO to populate a stream for all sales and aggregate that stream:
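Here is a minimal sketch of what that could look like. The stream names, columns, and the daily tumbling window are illustrative, not taken from the original post:

CREATE STREAM all_sales AS
  SELECT item_id, quantity * price AS sale_amount
  FROM online_sales;

-- Merge the in-store sales into the same output stream.
INSERT INTO all_sales
  SELECT item_id, quantity * price AS sale_amount
  FROM store_sales;

-- Aggregate the combined stream into daily totals per item.
CREATE TABLE daily_sales_totals AS
  SELECT item_id, SUM(sale_amount) AS total_sales
  FROM all_sales
  WINDOW TUMBLING (SIZE 1 DAY)
  GROUP BY item_id;

Note that the target of INSERT INTO must already exist and its schema must match the query’s result schema, which is why both queries above project the same columns.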

CP Docker Images for KSQL

Confluent Platform Docker images are now available for the preview versions of both the KSQL server and KSQL CLI. You can use the confluentinc/cp-ksql-server image to deploy KSQL servers in interactive (default) or headless mode. You can use the confluentinc/cp-ksql-cli image to start a KSQL CLI session inside a Docker container.

Documentation for these images can be found at docs.confluent.io.

Going forward, we’ll continue to release these Docker images for each preview release as well as for each Confluent Platform stable release.

Topic and Schema Cleanup

The DROP statement for streams and tables now supports an option for also deleting the underlying Kafka topic and, for streams and tables in Avro format, the registered Avro schema. To have DROP clean up topics and schemas, add DELETE TOPIC to your DROP statement:
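For example (the stream name here is illustrative):

-- Drops the stream, deletes its backing Kafka topic, and, for Avro data, the registered schema.
DROP STREAM all_sales DELETE TOPIC;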

This lets you ensure you don’t leave topics and schemas around as you create and drop streams and tables, which is particularly helpful during iterative development and testing. If you do want to keep your topics and schemas, say for consumption by some other system, simply omit the DELETE TOPIC option from your statement:
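Again with an illustrative stream name:

-- Drops the stream but leaves its Kafka topic and registered schema in place.
DROP STREAM all_sales;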

Where to go from here

Try out the new Confluent Platform May 2018 Preview release and share your feedback! Here’s what you can do to get started:

Download The Preview Release

Subscribe to the Confluent Blog

