46 tools from the Confluent Kafka MCP Server, categorised by risk level.
| Tool | Description | Risk |
|---|---|---|
| check-flink-statement-health | Perform health check for a Flink statement | 2/5 |
| consume-messages | Consume messages from Kafka topics | 3/5 |
| describe-flink-table | Get full schema details for a Flink table | 2/5 |
| detect-flink-statement-issues | Detect issues for a Flink statement | 2/5 |
| get-flink-statement-exceptions | Retrieve recent exceptions for a Flink statement | 2/5 |
| get-flink-statement-profile | Get Query Profiler data with metrics and issues | 2/5 |
| get-flink-table-info | Get table metadata via INFORMATION_SCHEMA | 2/5 |
| get-topic-config | Retrieve configuration details for a Kafka topic | 2/5 |
| list-clusters | Get all clusters in the environment | 2/5 |
| list-connectors | Retrieve a list of active connectors | 2/5 |
| list-environments | Get all environments in Confluent Cloud | 2/5 |
| list-flink-catalogs | List all catalogs in the Flink environment | 2/5 |
| list-flink-databases | List all databases in a Flink catalog | 2/5 |
| list-flink-statements | Retrieve a list of all Flink statements | 2/5 |
| list-flink-tables | List all tables in a Flink database | 2/5 |
| list-schemas | List all schemas in the Schema Registry | 2/5 |
| list-tableflow-catalog-integrations | List all catalog integrations | 2/5 |
| list-tableflow-regions | List all Tableflow regions | 2/5 |
| list-tableflow-topics | List all Tableflow topics | 2/5 |
| list-tags | Retrieve all tags from Schema Registry | 2/5 |
| list-topics | List all topics in the Kafka cluster | 2/5 |
| read-connector | Get information about a connector | 2/5 |
| read-environment | Get details of a specific environment | 2/5 |
| read-flink-statement | Read a Flink statement and its results | 2/5 |
| read-tableflow-catalog-integration | Read a catalog integration | 2/5 |
| read-tableflow-topic | Read a Tableflow topic | 2/5 |
| search-topics-by-name | List all topics matching a name pattern | 2/5 |
| search-topics-by-tag | List all topics with a specified tag | 2/5 |
| add-tags-to-topic | Assign existing tags to Kafka topics | 3/5 |
| alter-topic-config | Alter topic configuration in Confluent Cloud | 5/5 |
| create-connector | Create a new Kafka connector | 4/5 |
| create-tableflow-catalog-integration | Create a catalog integration | 4/5 |
| create-tableflow-topic | Create a Tableflow topic | 4/5 |
| create-topic-tags | Create new tag definitions in Confluent Cloud | 3/5 |
| create-topics | Create one or more Kafka topics | 4/5 |
| produce-message | Produce records to a Kafka topic | 4/5 |
| update-tableflow-catalog-integration | Update a catalog integration | 4/5 |
| update-tableflow-topic | Update a Tableflow topic | 4/5 |
| delete-connector | Delete a Kafka connector | 5/5 |
| delete-flink-statements | Delete a Flink statement | 5/5 |
| delete-tableflow-catalog-integration | Delete a catalog integration | 5/5 |
| delete-tableflow-topic | Delete a Tableflow topic | 5/5 |
| delete-tag | Delete a tag definition from Confluent Cloud | 4/5 |
| delete-topics | Delete Kafka topics and all messages | 5/5 |
| remove-tag-from-entity | Remove a tag from an entity | 3/5 |

The Confluent Kafka MCP server exposes 46 tools across 4 categories: Read, Write, Destructive, Execute.
To control these tools, use Intercept, the open-source MCP proxy. Write YAML rules for each tool — rate limits, argument validation, or deny rules — then run Intercept in front of the Confluent Kafka server.
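As an illustration only, a per-tool rule set might look like the sketch below. The field names (`tool`, `action`, `rate_limit`, `validate`) are assumptions chosen for readability, not Intercept's documented schema — consult the Intercept docs for the actual rule format:

```yaml
# Hypothetical policy sketch -- field names are illustrative,
# not taken from the Intercept documentation.
rules:
  # Block the highest-risk destructive tool outright (rated 5/5).
  - tool: delete-topics
    action: deny

  # Throttle message production to limit blast radius (rated 4/5).
  - tool: produce-message
    action: allow
    rate_limit: 10/minute

  # Validate arguments before a config change reaches the cluster (rated 5/5).
  - tool: alter-topic-config
    action: allow
    validate:
      topic: "^staging-.*"   # only staging topics may be altered
```

The pattern is the same for every tool in the table: deny what you never want an agent to do, rate-limit what is expensive, and constrain arguments on everything that mutates state.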
Confluent Kafka tools are categorised as Read (28), Write (10), Destructive (7), Execute (1). Each category has a recommended default policy.
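Those per-category defaults could be expressed in the same hypothetical style — again, the schema below is an assumption for illustration, not Intercept's actual configuration format:

```yaml
# Hypothetical category defaults -- schema is illustrative.
defaults:
  read: allow          # 28 tools: low-risk listing and inspection
  write: allow         # 10 tools: consider per-tool rate limits
  destructive: deny    # 7 tools: require explicit per-tool overrides
  execute: allow       # 1 tool: worth auditing every invocation
```

Defaults keep the policy short: only the tools that deviate from their category's posture need an explicit rule.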
Open source. One binary. Zero dependencies.
```shell
npx -y @policylayer/intercept
```