Mirror of Apache Kafka

Apache Kafka

See our web site (https://kafka.apache.org) for details on the project.

You need to have Gradle and Java installed.

Kafka requires Gradle 5.0 or higher.

Java 8 should be used for building in order to support both Java 8 and Java 11 at runtime.

Scala 2.12 is used by default; see below for how to use a different Scala version or all of the supported Scala versions.

First, bootstrap and download the wrapper:

cd kafka_source_dir
gradle

Once the wrapper has been downloaded, use ./gradlew for all of the commands below.

Build a jar and run it

./gradlew jar

Follow instructions in https://kafka.apache.org/documentation.html#quickstart
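As a rough, illustrative sketch of the quickstart when running directly from the source tree (the topic name and single-broker setup are assumptions; the linked documentation is authoritative):

./bin/zookeeper-server-start.sh config/zookeeper.properties   # terminal 1: start ZooKeeper
./bin/kafka-server-start.sh config/server.properties          # terminal 2: start a single broker
./bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic test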

Build source jar

./gradlew srcJar

Build aggregated javadoc

./gradlew aggregatedJavadoc

Build javadoc and scaladoc

./gradlew javadoc
./gradlew javadocJar # builds a javadoc jar for each module
./gradlew scaladoc
./gradlew scaladocJar # builds a scaladoc jar for each module
./gradlew docsJar # builds both (if applicable) javadoc and scaladoc jars for each module

Run unit/integration tests

./gradlew test # runs both unit and integration tests
./gradlew unitTest
./gradlew integrationTest

Force re-running tests without code change

./gradlew cleanTest test
./gradlew cleanTest unitTest
./gradlew cleanTest integrationTest

Running a particular unit/integration test

./gradlew clients:test --tests RequestResponseTest

Running a particular test method within a unit/integration test

./gradlew core:test --tests kafka.api.ProducerFailureHandlingTest.testCannotSendToInternalTopic
./gradlew clients:test --tests org.apache.kafka.clients.MetadataTest.testMetadataUpdateWaitTime

Running a particular unit/integration test with log4j output

Change the log4j setting in either clients/src/test/resources/log4j.properties or core/src/test/resources/log4j.properties

./gradlew clients:test --tests RequestResponseTest
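For example, to get more verbose client-side logs you might raise a logger level in clients/src/test/resources/log4j.properties along these lines before running the test above (which loggers to adjust is an assumption; pick the packages your test exercises):

# illustrative override: log all Kafka client classes at DEBUG
log4j.logger.org.apache.kafka=DEBUG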

Generating test coverage reports

Generate coverage reports for the whole project:

./gradlew reportCoverage

Generate coverage for a single module, e.g.:

./gradlew clients:reportCoverage

Building a binary release gzipped tar ball

./gradlew clean releaseTarGz

The above command will fail if you haven't set up the signing key. To bypass signing the artifact, you can run:

./gradlew clean releaseTarGz -x signArchives

The release file can be found inside ./core/build/distributions/.
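For instance, assuming the default Scala version and a snapshot build, the archive could be extracted and inspected like this (the exact file name depends on the Scala and Kafka versions you built with):

tar -xzf core/build/distributions/kafka_2.12-2.5.0-SNAPSHOT.tgz   # hypothetical file name
ls kafka_2.12-2.5.0-SNAPSHOT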

Cleaning the build

./gradlew clean

Running a task with one of the Scala versions available (2.11.x, 2.12.x or 2.13.x)

Note that if building the jars with a Scala version other than 2.12.x, you need to set the SCALA_VERSION environment variable or change it in bin/kafka-run-class.sh in order to run the quick start (see the example after the commands below).

You can pass either the major version (e.g. 2.12) or the full version (e.g. 2.12.7):

./gradlew -PscalaVersion=2.12 jar
./gradlew -PscalaVersion=2.12 test
./gradlew -PscalaVersion=2.12 releaseTarGz
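When running the quick start against jars built with a non-default Scala version, the SCALA_VERSION environment variable mentioned above can be set on the command line instead of editing bin/kafka-run-class.sh, for example (the 2.13.1 value is only illustrative):

SCALA_VERSION=2.13.1 ./bin/kafka-server-start.sh config/server.properties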

Running a task with all the Scala versions enabled by default

Append All to the task name:

./gradlew testAll
./gradlew jarAll
./gradlew releaseTarGzAll

Running a task for a specific project

This is for core, examples and clients

./gradlew core:jar
./gradlew core:test

Listing all gradle tasks

./gradlew tasks

Building IDE project

Note that this is not strictly necessary (IntelliJ IDEA has good built-in support for Gradle projects, for example).

./gradlew eclipse
./gradlew idea

The eclipse task has been configured to use ${project_dir}/build_eclipse as Eclipse's build directory. Eclipse's default build directory (${project_dir}/bin) clashes with Kafka's scripts directory and we don't use Gradle's build directory to avoid known issues with this configuration.

Publishing the jar for all versions of Scala and for all projects to Maven

./gradlew uploadArchivesAll

Please note that for this to work you should create/update ${GRADLE_USER_HOME}/gradle.properties (typically, ~/.gradle/gradle.properties) and assign the following variables:

mavenUrl=
mavenUsername=
mavenPassword=
signing.keyId=
signing.password=
signing.secretKeyRingFile=
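A filled-in example might look like the following (all of the values below are placeholders; a file:// URL can be handy for a dry run against a local repository):

mavenUrl=file:///tmp/test-maven-repo
mavenUsername=your-apache-id
mavenPassword=your-password
signing.keyId=ABCD1234
signing.password=your-key-passphrase
signing.secretKeyRingFile=/home/you/.gnupg/secring.gpg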

Publishing the streams quickstart archetype artifact to Maven

For the Streams archetype project, Gradle cannot be used to upload to Maven; instead, the mvn deploy command needs to be run from the quickstart folder:

cd streams/quickstart
mvn deploy

Please note that for this to work you should create/update the user Maven settings (typically, ${USER_HOME}/.m2/settings.xml) to assign the following variables:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
   xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
   xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                       https://maven.apache.org/xsd/settings-1.0.0.xsd">
...                           
<servers>
   ...
   <server>
      <id>apache.snapshots.https</id>
      <username>${maven_username}</username>
      <password>${maven_password}</password>
   </server>
   <server>
      <id>apache.releases.https</id>
      <username>${maven_username}</username>
      <password>${maven_password}</password>
    </server>
    ...
 </servers>
 ...

Installing the jars to the local Maven repository

./gradlew installAll

Building the test jar

./gradlew testJar

Determining how transitive dependencies are added

./gradlew core:dependencies --configuration runtime

Determining if any dependencies could be updated

./gradlew dependencyUpdates

Running code quality checks

There are two code quality analysis tools that we regularly run: spotbugs and checkstyle.

Checkstyle

Checkstyle enforces a consistent coding style in Kafka. You can run checkstyle using:

./gradlew checkstyleMain checkstyleTest

The checkstyle warnings will be found in reports/checkstyle/reports/main.html and reports/checkstyle/reports/test.html files in the subproject build directories. They are also printed to the console. The build will fail if Checkstyle fails.

Spotbugs

Spotbugs uses static analysis to look for bugs in the code. You can run spotbugs using:

./gradlew spotbugsMain spotbugsTest -x test

The spotbugs warnings will be found in reports/spotbugs/main.html and reports/spotbugs/test.html files in the subproject build directories. Use -PxmlSpotBugsReport=true to generate an XML report instead of an HTML one.

Common build options

The following options should be set with a -P switch, for example ./gradlew -PmaxParallelForks=1 test.

  • commitId: sets the build commit ID as .git/HEAD might not be correct if there are local commits added for build purposes.
  • mavenUrl: sets the URL of the maven deployment repository (file://path/to/repo can be used to point to a local repository).
  • maxParallelForks: limits the maximum number of processes for each task.
  • showStandardStreams: shows standard out and standard error of the test JVM(s) on the console.
  • skipSigning: skips signing of artifacts.
  • testLoggingEvents: unit test events to be logged, separated by comma. For example ./gradlew -PtestLoggingEvents=started,passed,skipped,failed test.
  • xmlSpotBugsReport: enable XML reports for spotBugs. This also disables HTML reports as only one can be enabled at a time.
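For example, several of these options can be combined in a single invocation (the values here are purely illustrative):

./gradlew -PmaxParallelForks=2 -PshowStandardStreams=true -PtestLoggingEvents=started,passed,skipped,failed test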

Running system tests

See tests/README.md.

Running in Vagrant

See vagrant/README.md.

Contribution

Apache Kafka is interested in building the community; we would welcome any thoughts or patches. You can reach us on the Apache mailing lists.

To contribute, follow the instructions here: https://kafka.apache.org/contributing.html