Confluent Control Center

The simplest way to operate and build applications with Apache Kafka

Confluent Control Center is a web-based graphical user interface that helps you operate and build event streaming applications with Apache Kafka. Expertly designed dashboards and operating consoles help you meet SLAs and administer key components of your event streaming platform as you scale across the enterprise.

Build pipelines and process streams

Control Center provides centralized management for all your connectors built on Kafka Connect and offers a graphical user interface for Confluent KSQL.

Kafka Connect and Connectors integration

Manage all of your running connectors built on Kafka Connect; a sketch of the underlying Connect REST call appears after the list below.

  • Create new source and sink connectors
  • Edit existing source and sink connectors
  • Support multiple Kafka Connect clusters at a time
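Under the hood, connector management goes through the Kafka Connect REST API. The following is a minimal Java sketch of registering a source connector directly against that API; the Connect host, connector name, and JDBC settings are illustrative placeholders, and the connector class assumes the Confluent JDBC connector plugin is installed.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CreateConnector {
    public static void main(String[] args) throws Exception {
        // Connector definition: the same kind of JSON Control Center builds from its UI form.
        String body = """
            {
              "name": "orders-jdbc-source",
              "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                "connection.url": "jdbc:postgresql://db:5432/shop",
                "mode": "incrementing",
                "incrementing.column.name": "id",
                "topic.prefix": "orders-"
              }
            }
            """;

        // POST /connectors registers the connector on the Connect cluster.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://connect:8083/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```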

Confluent KSQL integration

Develop event streaming applications using the KSQL interface built into Control Center; a sketch of an equivalent persistent query appears after the list below.

  • Create Kafka topics
  • Develop persistent queries
  • Support multiple KSQL clusters at a time
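As a rough sketch of what a persistent query looks like, the statement below could also be submitted to a KSQL server's REST endpoint. The stream and column names are hypothetical, the source stream pageviews is assumed to have been declared already, and the server host and port are placeholders.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SubmitPersistentQuery {
    public static void main(String[] args) throws Exception {
        // A persistent query: continuously filters one stream into a new stream
        // (backed by a new Kafka topic). Names and the filter condition are placeholders.
        String body = """
            { "ksql": "CREATE STREAM errors_only AS SELECT * FROM pageviews WHERE status >= 500;",
              "streamsProperties": {} }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://ksql-server:8088/ksql"))
                .header("Content-Type", "application/vnd.ksql.v1+json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```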

View and manage event streams

Control Center helps you observe and control all your Kafka topics and data in a centralized way.

Topic management and inspection

Start with a view of your topics and drill all the way down into the actual data flowing through them; an AdminClient sketch of the same operations appears after the list below.

  • Create and edit topics
  • View topic details (partitions, replication factor)
  • Browse message data
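The same topic operations are available programmatically through Kafka's AdminClient. The sketch below creates a topic and reads back the partition count and replication factor that Control Center displays; the bootstrap server, topic name, and sizing are placeholder values.

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.clients.admin.TopicDescription;

public class TopicAdmin {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Create a topic with 6 partitions and replication factor 3.
            admin.createTopics(Collections.singleton(new NewTopic("orders", 6, (short) 3)))
                 .all().get();

            // Read back the topic details.
            TopicDescription description = admin.describeTopics(Collections.singleton("orders"))
                    .all().get().get("orders");
            System.out.println("partitions=" + description.partitions().size()
                    + " replicationFactor=" + description.partitions().get(0).replicas().size());
        }
    }
}
```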

Schema management

Control Center integrates with Confluent Schema Registry so you can view the schemas associated with your topics, create and edit schemas, and validate new schema versions against the configured compatibility policy.
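For reference, the compatibility check maps onto Schema Registry's REST API. This is a minimal sketch assuming a hypothetical subject orders-value and a toy Avro schema; the Schema Registry host is a placeholder.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CheckCompatibility {
    public static void main(String[] args) throws Exception {
        // The candidate Avro schema, JSON-escaped inside the request envelope
        // that Schema Registry expects ({"schema": "<schema as a string>"}).
        String body = """
            {"schema": "{\\"type\\":\\"record\\",\\"name\\":\\"Order\\",\\"fields\\":[{\\"name\\":\\"id\\",\\"type\\":\\"long\\"}]}"}
            """;

        // Test the candidate against the latest registered version of the subject.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://schema-registry:8081/compatibility/subjects/orders-value/versions/latest"))
                .header("Content-Type", "application/vnd.schemaregistry.v1+json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The response reports whether the schema satisfies the configured compatibility
        // policy, e.g. {"is_compatible": true}.
        System.out.println(response.body());
    }
}
```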

View Kafka clusters at a glance

Control Center offers a global view of the most critical health, availability, and performance indicators for your Apache Kafka clusters, so you can understand your event streaming platform as a whole.

Kafka cluster health

Get an enterprise-ready summary of system health indicators, drill down on a per-broker basis, and receive alerts when anomalous events are detected; one of these indicators is sketched in code after the list below.

  • Health status for every cluster
  • Number of brokers, partitions and topics
  • Throughput (produced and consumed)
  • Under-replicated partitions
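Under-replicated partitions, for example, can also be computed directly with the AdminClient: a partition is under-replicated when its in-sync replica set is smaller than its assigned replica set. A minimal sketch, with the bootstrap server as a placeholder:

```java
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.TopicDescription;

public class UnderReplicatedPartitions {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Describe every topic in the cluster.
            Map<String, TopicDescription> topics =
                    admin.describeTopics(admin.listTopics().names().get()).all().get();

            // Count partitions whose in-sync replicas are fewer than their assigned replicas.
            long underReplicated = topics.values().stream()
                    .flatMap(t -> t.partitions().stream())
                    .filter(p -> p.isr().size() < p.replicas().size())
                    .count();

            System.out.println("Under-replicated partitions: " + underReplicated);
        }
    }
}
```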

Broker configuration

Simplify at-scale operations using a centralized dashboard to view broker configurations and modify them dynamically without resorting to rolling restarts.
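Dynamic broker configuration corresponds to Kafka's incrementalAlterConfigs API. A minimal sketch, assuming broker id 1 and a retention change as the example setting; only properties Kafka marks as dynamically updatable can be changed this way.

```java
import java.util.Collection;
import java.util.Collections;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.AlterConfigOp;
import org.apache.kafka.clients.admin.ConfigEntry;
import org.apache.kafka.common.config.ConfigResource;

public class DynamicBrokerConfig {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Target a single broker (id "1" is a placeholder).
            ConfigResource broker = new ConfigResource(ConfigResource.Type.BROKER, "1");

            // Set a dynamically updatable property; the change takes effect without a restart.
            AlterConfigOp op = new AlterConfigOp(
                    new ConfigEntry("log.retention.ms", "172800000"), // 2 days
                    AlterConfigOp.OpType.SET);

            Map<ConfigResource, Collection<AlterConfigOp>> update =
                    Collections.singletonMap(broker, Collections.singletonList(op));
            admin.incrementalAlterConfigs(update).all().get();
        }
    }
}
```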

Meet event streaming SLAs

Control Center tracks the key performance indicators for all of your Apache Kafka event streams and event streaming applications.

Message delivery and latency

Use intuitive charts to track message production and consumption over time, receive alerts, and troubleshoot issues; the client configuration that feeds these charts is sketched after the list below.

  • Message delivery (expected vs. actual)
  • Average latency
  • Average throughput
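Control Center's delivery monitoring is fed by Confluent Monitoring Interceptors configured on producing and consuming clients. A minimal producer sketch is shown below; it assumes the Confluent monitoring-interceptors library is on the classpath, and the bootstrap server and topic are placeholders. The consumer-side counterpart is MonitoringConsumerInterceptor.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class MonitoredProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // The monitoring interceptor reports per-client delivery metadata, which Control Center
        // aggregates into its expected-vs-actual delivery and latency charts.
        props.put(ProducerConfig.INTERCEPTOR_CLASSES_CONFIG,
                "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "key", "value"));
        }
    }
}
```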

Consumer lag

View how consumers are performing based on their committed offsets, spot slow consumers at a glance, and take proactive steps to keep performance high.
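Consumer lag is the difference between a partition's latest offset and the group's committed offset. A minimal AdminClient sketch, assuming a hypothetical group id orders-processor and a placeholder bootstrap server:

```java
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ListOffsetsResult.ListOffsetsResultInfo;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class ConsumerLag {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Committed offsets for the consumer group.
            Map<TopicPartition, OffsetAndMetadata> committed =
                    admin.listConsumerGroupOffsets("orders-processor")
                         .partitionsToOffsetAndMetadata().get();

            // Latest (end) offsets for the same partitions.
            Map<TopicPartition, OffsetSpec> request = committed.keySet().stream()
                    .collect(Collectors.toMap(tp -> tp, tp -> OffsetSpec.latest()));
            Map<TopicPartition, ListOffsetsResultInfo> latest =
                    admin.listOffsets(request).all().get();

            // Lag per partition = end offset - committed offset.
            committed.forEach((tp, offset) -> System.out.println(
                    tp + " lag=" + (latest.get(tp).offset() - offset.offset())));
        }
    }
}
```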


Case study

Streaming data enables RBC to be a data-driven organization

Royal Bank of Canada built a real-time, scalable, event-driven data architecture for its rapidly growing number of cloud, machine learning, and AI initiatives. Learn why RBC chose Confluent Platform as the foundation of this data architecture.

Watch the video
