<spanid="streams-developer-guide-security"></span><h1>Streams Security<aclass="headerlink"href="#streams-security"title="Permalink to this headline"></a></h1>
<divclass="contents local topic"id="table-of-contents">
<p>Kafka Streams natively integrates with <a class="reference internal" href="../../kafka/security.html#kafka-security"><span class="std std-ref">Kafka’s security features</span></a> and supports all of the
client-side security features in Kafka. Streams leverages the <a class="reference internal" href="../../clients/index.html#kafka-clients"><span class="std std-ref">Java Producer and Consumer API</span></a>.</p>
<p>To secure your stream processing applications, configure the security settings in the corresponding Kafka producer
and consumer clients, and then specify the corresponding configuration settings in your Kafka Streams application.</p>
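<p>Security-related client settings that you add to your Streams configuration are passed through to the internal producer and consumer clients that Kafka Streams creates. The minimal sketch below illustrates this, along with the <code>StreamsConfig#producerPrefix()</code>/<code>StreamsConfig#consumerPrefix()</code> helpers for scoping a setting to only one client type; the truststore paths are hypothetical placeholders and imports are omitted for brevity:</p>
<div class="highlight-java"><div class="highlight"><pre><span></span>// Minimal sketch: client security settings in the Streams configuration.
Properties settings = new Properties();
// Applies to all internal clients (producers, consumers, admin).
settings.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SSL");
settings.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/path/to/client.truststore.jks");  // hypothetical path
// Scope a setting to the consumer only (producerPrefix() works analogously).
settings.put(StreamsConfig.consumerPrefix(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG),
             "/path/to/consumer.truststore.jks");                                           // hypothetical path
</pre></div></div>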
<p>Kafka supports cluster encryption and authentication, including a mix of authenticated and unauthenticated,
and encrypted and non-encrypted clients. Using security is optional.</p>
<p>Here are a few of the relevant client-side security features:</p>
<dlclass="docutils">
<dt>Encrypt data-in-transit between your applications and Kafka brokers</dt>
<dd>You can enable the encryption of the client-server communication between your applications and the Kafka brokers.
For example, you can configure your applications to always use encryption when reading and writing data to and from
Kafka. This is critical when reading and writing data across security domains such as internal network, public
internet, and partner networks.</dd>
<dt>Client authentication</dt>
<dd>You can enable client authentication for connections from your application to Kafka brokers. For example, you can
define that only specific applications are allowed to connect to your Kafka cluster.</dd>
<dt>Client authorization</dt>
<dd>You can enable client authorization of read and write operations by your applications. For example, you can define
that only specific applications are allowed to read from a Kafka topic. You can also restrict write access to Kafka
topics to prevent data pollution or fraudulent activities (see the sketch after this list).</dd>
</dl>
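<p>For illustration, here is a minimal sketch of granting read access on a single topic to a single application principal using the Kafka Admin client. The broker address, principal name, and topic name are hypothetical; imports and error handling are omitted for brevity:</p>
<div class="highlight-java"><div class="highlight"><pre><span></span>// Allow only the principal "User:team-app" to read from the topic "orders".
Properties props = new Properties();
props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.com:9093");
try (Admin admin = Admin.create(props)) {
    AclBinding readAcl = new AclBinding(
        new ResourcePattern(ResourceType.TOPIC, "orders", PatternType.LITERAL),
        new AccessControlEntry("User:team-app", "*", AclOperation.READ, AclPermissionType.ALLOW));
    admin.createAcls(Collections.singletonList(readAcl)).all().get();  // blocks until the ACL is created
}
</pre></div></div>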
<p>For more information about the security features in Apache Kafka, see <a class="reference internal" href="../../kafka/security.html#kafka-security"><span class="std std-ref">Kafka Security</span></a>.</p>
<spanid="streams-developer-guide-security-acls"></span><h2><aclass="toc-backref"href="#id1">Required ACL setting for secure Kafka clusters</a><aclass="headerlink"href="#required-acl-setting-for-secure-kafka-clusters"title="Permalink to this headline"></a></h2>
<p>When applications are run against a secured Kafka cluster, the principal running the application must have the ACL
<codeclass="docutils literal"><spanclass="pre">--cluster</span><spanclass="pre">--operation</span><spanclass="pre">Create</span></code> set so that the application has the permissions to create
<p>To avoid providing this permission to your application, you can create the required internal topics manually.
If the internal topics exist, Kafka Streams will not try to recreate them.
Note that the internal repartition and changelog topics must be created with the correct number of partitions; otherwise, Kafka Streams will fail on startup.
Each topic must be created with the same number of partitions as your input topic, or, if there are multiple input topics, with the maximum number of partitions across all input topics.
Additionally, changelog topics <em>must</em> be created with log compaction enabled; otherwise, your application might lose data.
You can find the names of the required internal topics via <code>Topology#describe()</code>.
All internal topics follow the naming pattern <code>&lt;application.id&gt;-&lt;operatorName&gt;-&lt;suffix&gt;</code>, where the <code>suffix</code> is either <code>repartition</code> or <code>changelog</code>.
Note that this naming pattern is not guaranteed in future releases; it is not part of the public API.</p>
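<p>As an illustration of creating an internal topic manually, here is a minimal sketch using the Kafka Admin client. The topic name (as it would appear in <code>Topology#describe()</code> output), partition count, replication factor, and broker address are hypothetical and must match your own setup; imports and error handling are omitted for brevity:</p>
<div class="highlight-java"><div class="highlight"><pre><span></span>Properties props = new Properties();
props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.com:9093");
try (Admin admin = Admin.create(props)) {
    // Changelog topics must be compacted and must match the partition count of the input topics.
    NewTopic changelog = new NewTopic("my-app-id-my-store-changelog", 4, (short) 3)
        .configs(Collections.singletonMap(TopicConfig.CLEANUP_POLICY_CONFIG,
                                          TopicConfig.CLEANUP_POLICY_COMPACT));
    admin.createTopics(Collections.singletonList(changelog)).all().get();  // blocks until the topic is created
}
</pre></div></div>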
<spanid="streams-developer-guide-security-example"></span><h2><aclass="toc-backref"href="#id2">Security example</a><aclass="headerlink"href="#security-example"title="Permalink to this headline"></a></h2>
<p>This example shows how to configure a Kafka Streams application to enable client authentication and to encrypt data-in-transit when
communicating with its Kafka cluster.</p>
<p>This example assumes that the Kafka brokers in the cluster already have security configured and that the necessary SSL
certificates are available to the application in local filesystem locations. For example, if you are using Docker,
you must also include these SSL certificates in the correct locations within the Docker image.</p>
<p>The snippet below shows the settings that enable client authentication and SSL encryption for data-in-transit between your
Kafka Streams application and the Kafka cluster it is reading from and writing to:</p>
<divclass="highlight-bash"><divclass="highlight"><pre><span></span><spanclass="c1"># Essential security settings to enable client authentication and SSL encryption</span>
<p>Configure these settings in the application for your <code class="docutils literal"><span class="pre">StreamsConfig</span></code> instance. These settings will encrypt any
data-in-transit that is being read from or written to Kafka, and your application will authenticate itself against the
Kafka brokers that it is communicating with. Note that this example does not cover client authorization.</p>
<divclass="highlight-java"><divclass="highlight"><pre><span></span><spanclass="c1">// Code of your Java application that uses the Kafka Streams library</span>
<p>If you incorrectly configure a security setting in your application, it will fail at runtime, typically right after you
start it. For example, if you enter an incorrect password for the <code class="docutils literal"><span class="pre">ssl.keystore.password</span></code> setting, an error message
similar to this would be logged and then the application would terminate:</p>