
MINOR: Use https instead of http in links (#6477)

Verified that the https links work.

I didn't update the license header in this PR since that touches
so many files. Will file a separate one for that.

Reviewers: Manikumar Reddy <manikumar.reddy@gmail.com>
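The "verified" step above can be scripted; below is a minimal, hypothetical sketch (not part of this commit) that spot-checks a handful of the updated https URLs using only the Python 3 standard library:

    # Hypothetical link check (assumes Python 3 and network access); the URL
    # list is just a sample of the links touched by this change.
    import urllib.request

    urls = [
        "https://kafka.apache.org/",
        "https://www.apache.org/",
        "https://www.vagrantup.com/",
        "https://docs.oracle.com/javase/8/docs/api/",
    ]

    for url in urls:
        # A HEAD request keeps the check lightweight; servers that reject HEAD
        # would need a GET fallback.
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request, timeout=10) as response:
            print(response.status, url)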
Ismael Juma committed 5 years ago (via GitHub)
Commit: 7d9e93ac6d
11 changed files (lines added + removed per file):
  1. CONTRIBUTING.md (2)
  2. NOTICE (2)
  3. README.md (8)
  4. build.gradle (8)
  5. doap_Kafka.rdf (20)
  6. gradle/buildscript.gradle (2)
  7. jmh-benchmarks/README.md (4)
  8. release_notes.py (6)
  9. tests/README.md (6)
  10. tests/bootstrap-test-env.sh (2)
  11. vagrant/README.md (2)

CONTRIBUTING.md (2)

@@ -1,6 +1,6 @@
 ## Contributing to Kafka
-*Before opening a pull request*, review the [Contributing](http://kafka.apache.org/contributing.html) and [Contributing Code Changes](https://cwiki.apache.org/confluence/display/KAFKA/Contributing+Code+Changes) pages.
+*Before opening a pull request*, review the [Contributing](https://kafka.apache.org/contributing.html) and [Contributing Code Changes](https://cwiki.apache.org/confluence/display/KAFKA/Contributing+Code+Changes) pages.
 It lists steps that are required before creating a PR.

NOTICE (2)

@@ -2,7 +2,7 @@ Apache Kafka
 Copyright 2019 The Apache Software Foundation.
 This product includes software developed at
-The Apache Software Foundation (http://www.apache.org/).
+The Apache Software Foundation (https://www.apache.org/).
 This distribution has a binary dependency on jersey, which is available under the CDDL
 License. The source code of jersey can be found at https://github.com/jersey/jersey/.

README.md (8)

@@ -1,8 +1,8 @@
 Apache Kafka
 =================
-See our [web site](http://kafka.apache.org) for details on the project.
+See our [web site](https://kafka.apache.org) for details on the project.
-You need to have [Gradle](http://www.gradle.org/installation) and [Java](http://www.oracle.com/technetwork/java/javase/downloads/index.html) installed.
+You need to have [Gradle](https://www.gradle.org/installation) and [Java](https://www.oracle.com/technetwork/java/javase/downloads/index.html) installed.
 Kafka requires Gradle 5.0 or higher.
@@ -19,7 +19,7 @@ Now everything else will work.
 ### Build a jar and run it ###
 ./gradlew jar
-Follow instructions in http://kafka.apache.org/documentation.html#quickstart
+Follow instructions in https://kafka.apache.org/documentation.html#quickstart
 ### Build source jar ###
 ./gradlew srcJar
@@ -209,4 +209,4 @@ See [vagrant/README.md](vagrant/README.md).
 Apache Kafka is interested in building the community; we would welcome any thoughts or [patches](https://issues.apache.org/jira/browse/KAFKA). You can reach us [on the Apache mailing lists](http://kafka.apache.org/contact.html).
 To contribute follow the instructions here:
-* http://kafka.apache.org/contributing.html
+* https://kafka.apache.org/contributing.html

build.gradle (8)

@@ -189,11 +189,11 @@ subprojects {
 pom.project {
 name 'Apache Kafka'
 packaging 'jar'
-url 'http://kafka.apache.org'
+url 'https://kafka.apache.org'
 licenses {
 license {
 name 'The Apache Software License, Version 2.0'
-url 'http://www.apache.org/licenses/LICENSE-2.0.txt'
+url 'https://www.apache.org/licenses/LICENSE-2.0.txt'
 distribution 'repo'
 }
 }
@@ -1420,7 +1420,7 @@ project(':connect:api') {
 javadoc {
 include "**/org/apache/kafka/connect/**" // needed for the `javadocAll` task
-options.links "http://docs.oracle.com/javase/7/docs/api/"
+options.links "https://docs.oracle.com/javase/8/docs/api/"
 }
 tasks.create(name: "copyDependantLibs", type: Copy) {
@@ -1699,5 +1699,5 @@ task aggregatedJavadoc(type: Javadoc) {
 classpath = files(projectsWithJavadoc.collect { it.sourceSets.main.compileClasspath })
 includes = projectsWithJavadoc.collectMany { it.javadoc.getIncludes() }
 excludes = projectsWithJavadoc.collectMany { it.javadoc.getExcludes() }
-options.links "http://docs.oracle.com/javase/7/docs/api/"
+options.links "https://docs.oracle.com/javase/8/docs/api/"
 }

doap_Kafka.rdf (20)

@@ -2,9 +2,9 @@
 <?xml-stylesheet type="text/xsl"?>
 <rdf:RDF xml:lang="en"
 xmlns="http://usefulinc.com/ns/doap#"
-xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
+xmlns:rdf="https://www.w3.org/1999/02/22-rdf-syntax-ns#"
-xmlns:asfext="http://projects.apache.org/ns/asfext#"
+xmlns:asfext="https://projects.apache.org/ns/asfext#"
-xmlns:foaf="http://xmlns.com/foaf/0.1/">
+xmlns:foaf="https://xmlns.com/foaf/0.1/">
 <!--
 Licensed to the Apache Software Foundation (ASF) under one or more
 contributor license agreements. See the NOTICE file distributed with
@@ -21,22 +21,22 @@
 See the License for the specific language governing permissions and
 limitations under the License.
 -->
-<Project rdf:about="http://kafka.apache.org/">
+<Project rdf:about="https://kafka.apache.org/">
 <created>2014-04-12</created>
 <license rdf:resource="http://usefulinc.com/doap/licenses/asl20" />
 <name>Apache Kafka</name>
-<homepage rdf:resource="http://kafka.apache.org/" />
+<homepage rdf:resource="https://kafka.apache.org/" />
-<asfext:pmc rdf:resource="http://kafka.apache.org" />
+<asfext:pmc rdf:resource="https://kafka.apache.org" />
 <shortdesc>Apache Kafka is a distributed, fault tolerant, publish-subscribe messaging.</shortdesc>
 <description>A single Kafka broker can handle hundreds of megabytes of reads and writes per second from thousands of clients. Kafka is designed to allow a single cluster to serve as the central data backbone for a large organization. It can be elastically and transparently expanded without downtime. Data streams are partitioned and spread over a cluster of machines to allow data streams larger than the capability of any single machine and to allow clusters of co-ordinated consumers. Kafka has a modern cluster-centric design that offers strong durability and fault-tolerance guarantees. Messages are persisted on disk and replicated within the cluster to prevent data loss. Each broker can handle terabytes of messages without performance impact.</description>
 <bug-database rdf:resource="https://issues.apache.org/jira/browse/KAFKA" />
-<mailing-list rdf:resource="http://kafka.apache.org/contact.html" />
+<mailing-list rdf:resource="https://kafka.apache.org/contact.html" />
-<download-page rdf:resource="http://kafka.apache.org/downloads.html" />
+<download-page rdf:resource="https://kafka.apache.org/downloads.html" />
 <programming-language>Scala</programming-language>
-<category rdf:resource="http://projects.apache.org/category/big-data" />
+<category rdf:resource="https://projects.apache.org/projects.html?category#big-data" />
 <repository>
 <SVNRepository>
-<location rdf:resource="http://git-wip-us.apache.org/repos/asf/kafka.git"/>
+<location rdf:resource="https://gitbox.apache.org/repos/asf/kafka.git"/>
 <browse rdf:resource="https://github.com/apache/kafka"/>
 </SVNRepository>
 </repository>

gradle/buildscript.gradle (2)

@@ -17,7 +17,7 @@ repositories {
 repositories {
 // For license plugin.
 maven {
-url 'http://dl.bintray.com/content/netflixoss/external-gradle-plugins/'
+url 'https://dl.bintray.com/content/netflixoss/external-gradle-plugins/'
 }
 }
 }

jmh-benchmarks/README.md (4)

@@ -1,11 +1,11 @@
 ###JMH-Benchmark module
-This module contains benchmarks written using [JMH](http://openjdk.java.net/projects/code-tools/jmh/) from OpenJDK.
+This module contains benchmarks written using [JMH](https://openjdk.java.net/projects/code-tools/jmh/) from OpenJDK.
 Writing correct micro-benchmarks is Java (or another JVM language) is difficult and there are many non-obvious pitfalls (many
 due to compiler optimizations). JMH is a framework for running and analyzing benchmarks (micro or macro) written in Java (or
 another JVM language).
-For help in writing correct JMH tests, the best place to start is the [sample code](http://hg.openjdk.java.net/code-tools/jmh/file/tip/jmh-samples/src/main/java/org/openjdk/jmh/samples/) provided
+For help in writing correct JMH tests, the best place to start is the [sample code](https://hg.openjdk.java.net/code-tools/jmh/file/tip/jmh-samples/src/main/java/org/openjdk/jmh/samples/) provided
 by the JMH project.
 Typically, JMH is expected to run as a separate project in Maven. The jmh-benchmarks module uses

release_notes.py (6)

@@ -98,16 +98,16 @@ if __name__ == "__main__":
 print "<h1>Release Notes - Kafka - Version %s</h1>" % version
 print """<p>Below is a summary of the JIRA issues addressed in the %(version)s release of Kafka. For full documentation of the
-release, a guide to get started, and information about the project, see the <a href="http://kafka.apache.org/">Kafka
+release, a guide to get started, and information about the project, see the <a href="https://kafka.apache.org/">Kafka
 project site</a>.</p>
 <p><b>Note about upgrades:</b> Please carefully review the
-<a href="http://kafka.apache.org/%(minor)s/documentation.html#upgrade">upgrade documentation</a> for this release thoroughly
+<a href="https://kafka.apache.org/%(minor)s/documentation.html#upgrade">upgrade documentation</a> for this release thoroughly
 before upgrading your cluster. The upgrade notes discuss any critical information about incompatibilities and breaking
 changes, performance changes, and any other changes that might impact your production deployment of Kafka.</p>
 <p>The documentation for the most recent release can be found at
-<a href="http://kafka.apache.org/documentation.html">http://kafka.apache.org/documentation.html</a>.</p>""" % { 'version': version, 'minor': minor_version_dotless }
+<a href="https://kafka.apache.org/documentation.html">https://kafka.apache.org/documentation.html</a>.</p>""" % { 'version': version, 'minor': minor_version_dotless }
 for itype, issues in by_group:
 print "<h2>%s</h2>" % itype
 print "<ul>"

tests/README.md (6)

@@ -357,7 +357,7 @@ For a tutorial on how to setup and run the Kafka system tests, see
 https://cwiki.apache.org/confluence/display/KAFKA/tutorial+-+set+up+and+run+Kafka+system+tests+with+ducktape
 * Install Virtual Box from [https://www.virtualbox.org/](https://www.virtualbox.org/) (run `$ vboxmanage --version` to check if it's installed).
-* Install Vagrant >= 1.6.4 from [http://www.vagrantup.com/](http://www.vagrantup.com/) (run `vagrant --version` to check if it's installed).
+* Install Vagrant >= 1.6.4 from [https://www.vagrantup.com/](https://www.vagrantup.com/) (run `vagrant --version` to check if it's installed).
 * Install system test dependencies, including ducktape, a command-line tool and library for testing distributed systems. We recommend to use virtual env for system test development
 $ cd kafka/tests
@@ -401,12 +401,12 @@ Preparation
 In these steps, we will create an IAM role which has permission to create and destroy EC2 instances,
 set up a keypair used for ssh access to the test driver and worker machines, and create a security group to allow the test driver and workers to all communicate via TCP.
-* [Create an IAM role](http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user.html). We'll give this role the ability to launch or kill additional EC2 machines.
+* [Create an IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user.html). We'll give this role the ability to launch or kill additional EC2 machines.
 - Create role "kafkatest-master"
 - Role type: Amazon EC2
 - Attach policy: AmazonEC2FullAccess (this will allow our test-driver to create and destroy EC2 instances)
-* If you haven't already, [set up a keypair to use for SSH access](http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-key-pairs.html). For the purpose
+* If you haven't already, [set up a keypair to use for SSH access](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-key-pairs.html). For the purpose
 of this quickstart, let's say the keypair name is kafkatest, and you've saved the private key in kafktest.pem
 * Next, create a EC2 security group called "kafkatest".

tests/bootstrap-test-env.sh (2)

@@ -36,7 +36,7 @@ echo "Checking Vagrant installation..."
 vagrant_version=`vagrant --version | egrep -o "[0-9]+\.[0-9]+\.[0-9]+"`
 bad_vagrant=false
 if [ "$(version $vagrant_version)" -lt "$(version 1.6.4)" ]; then
-echo "Found Vagrant version $vagrant_version. Please upgrade to 1.6.4 or higher (see http://www.vagrantup.com for details)"
+echo "Found Vagrant version $vagrant_version. Please upgrade to 1.6.4 or higher (see https://www.vagrantup.com for details)"
 bad_vagrant=true
 else
 echo "Vagrant installation looks good."

vagrant/README.md (2)

@@ -3,7 +3,7 @@
 Using Vagrant to get up and running.
 1) Install Virtual Box [https://www.virtualbox.org/](https://www.virtualbox.org/)
-2) Install Vagrant >= 1.6.4 [http://www.vagrantup.com/](http://www.vagrantup.com/)
+2) Install Vagrant >= 1.6.4 [https://www.vagrantup.com/](https://www.vagrantup.com/)
 3) Install Vagrant Plugins:
 $ vagrant plugin install vagrant-hostmanager
