
Commit 1b64d17

[DP-1901] - Convert Wurstmeister Kafka image to Bitnami for Kafka-go (#1255)
* [DP-1901] - Convert Wurstmeister Kafka image to Bitnami for Kafka-go
* [DP-1901] - removing duplicate env in config
* [DP-1901] - adding KAFKA_VERSION
* [DP-1901] - FIXING KAFKA_VERSION
* [DP-1901] - minor fixtures to KAFKA_VERSION
* [DP-1901] - minor fixtures in lint
* [DP-1901] - fixing KAFKA_VERSION to 0.10.2.1
* [DP-1901] - minor fixtures to KAFKA_VERSION
* [DP-1901] - fixing zookeeper connect
* [DP-1901] - fixing KAFKA_VERSION to 0.10.2.1
* [DP-1901] - fixing kafka-011
* [DP-1901] - fixing kafka-011 environment
* [DP-1901] - fixing zookeeper kafka-011
* [DP-1901] - fixing KAFKA_VERSION kafka-011
* [DP-1901] - fixing KAFKA_VERSION kafka-011
* [DP-1901] - fixing KAFKA_VERSION kafka-011
* [DP-1901] - Adding AUTHORIZER kafka-011
* [DP-1901] - reset kafka-011
* [DP-1901] - bitnami for kafka-011
* [DP-1901] - bitnami for kafka-011 zookeeper fixtures
* [DP-1901] - fixtures to circleci and creating docker_compose_versions folder
* [DP-1901] - zookeeper fix
* [DP-1901] - fixtures to circleci. removed unsupported kafka
* [DP-1901] - fixtures to circleci 2.3.1. fixing examples folder
* [DP-1901] - examples docker-compose fix to bitnami
* [DP-1901] - minor README.md fixtures
* [DP-1901] - minor README.md fixtures
* [DP-1901] - minor README.md fixtures
* [DP-1901] - minor README.md fixtures
* [DP-1901] - Grammatical fixtures in README.md
* [DP-1901] - Adding support for v281 and v361 in circleci
* [DP-1901] - touch README.md for circleci trigger
* [DP-1901] - Creating v361docker and modify circleci
* [DP-1901] - Creating v361 docker and modify circleci
* [DP-1901] - touch README.md for circleci trigger
* [DP-1901] - removing v361 from circleci
1 parent 2af3101 commit 1b64d17

15 files changed: +625 -250 lines changed

.circleci/config.yml

Lines changed: 125 additions & 142 deletions
Large diffs are not rendered by default.

README.md

Lines changed: 6 additions & 1 deletion
@@ -108,7 +108,7 @@ if err := conn.Close(); err != nil {
 ```

 ### To Create Topics
-By default kafka has the `auto.create.topics.enable='true'` (`KAFKA_AUTO_CREATE_TOPICS_ENABLE='true'` in the wurstmeister/kafka kafka docker image). If this value is set to `'true'` then topics will be created as a side effect of `kafka.DialLeader` like so:
+By default kafka has the `auto.create.topics.enable='true'` (`KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE='true'` in the bitnami/kafka kafka docker image). If this value is set to `'true'` then topics will be created as a side effect of `kafka.DialLeader` like so:
 ```go
 // to create topics when auto.create.topics.enable='true'
 conn, err := kafka.DialLeader(context.Background(), "tcp", "localhost:9092", "my-topic", 0)
@@ -797,3 +797,8 @@ KAFKA_VERSION=2.3.1 \
 KAFKA_SKIP_NETTEST=1 \
 go test -race ./...
 ```
+
+(or) to clean up the cached test results and run tests:
+```
+go clean -cache && make test
+```
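For context, the topic auto-creation snippet touched by the first hunk, written out as a complete program — a minimal sketch assuming kafka-go's `DialLeader` signature, not part of this diff:

```go
package main

import (
	"context"

	kafka "github.com/segmentio/kafka-go"
)

func main() {
	// With auto.create.topics.enable='true' on the broker, dialing the leader
	// of a not-yet-existing topic/partition creates the topic as a side effect.
	conn, err := kafka.DialLeader(context.Background(), "tcp", "localhost:9092", "my-topic", 0)
	if err != nil {
		panic(err.Error())
	}
	defer conn.Close()
}
```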

docker-compose-241.yml

Lines changed: 0 additions & 32 deletions
This file was deleted.

docker-compose.010.yml

Lines changed: 0 additions & 29 deletions
This file was deleted.

docker-compose.yml

Lines changed: 36 additions & 28 deletions
@@ -1,34 +1,42 @@
-version: "3"
+# See https://hub.docker.com/r/bitnami/kafka/tags for the complete list.
+version: '3'
 services:
+  zookeeper:
+    container_name: zookeeper
+    hostname: zookeeper
+    image: bitnami/zookeeper:latest
+    ports:
+      - 2181:2181
+    environment:
+      ALLOW_ANONYMOUS_LOGIN: yes
   kafka:
-    image: wurstmeister/kafka:2.12-2.3.1
+    container_name: kafka
+    image: bitnami/kafka:2.3.1-ol-7-r61
     restart: on-failure:3
     links:
-    - zookeeper
+      - zookeeper
     ports:
-    - 9092:9092
-    - 9093:9093
+      - 9092:9092
+      - 9093:9093
     environment:
-      KAFKA_VERSION: '2.3.1'
-      KAFKA_BROKER_ID: '1'
-      KAFKA_CREATE_TOPICS: 'test-writer-0:3:1,test-writer-1:3:1'
-      KAFKA_DELETE_TOPIC_ENABLE: 'true'
-      KAFKA_ADVERTISED_HOST_NAME: 'localhost'
-      KAFKA_ADVERTISED_PORT: '9092'
-      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
-      KAFKA_AUTO_CREATE_TOPICS_ENABLE: 'true'
-      KAFKA_MESSAGE_MAX_BYTES: '200000000'
-      KAFKA_LISTENERS: 'PLAINTEXT://:9092,SASL_PLAINTEXT://:9093'
-      KAFKA_ADVERTISED_LISTENERS: 'PLAINTEXT://localhost:9092,SASL_PLAINTEXT://localhost:9093'
-      KAFKA_SASL_ENABLED_MECHANISMS: 'PLAIN,SCRAM-SHA-256,SCRAM-SHA-512'
-      KAFKA_AUTHORIZER_CLASS_NAME: 'kafka.security.auth.SimpleAclAuthorizer'
-      KAFKA_ALLOW_EVERYONE_IF_NO_ACL_FOUND: 'true'
-      KAFKA_OPTS: "-Djava.security.auth.login.config=/opt/kafka/config/kafka_server_jaas.conf"
-      CUSTOM_INIT_SCRIPT: |-
-        echo -e 'KafkaServer {\norg.apache.kafka.common.security.scram.ScramLoginModule required\n username="adminscram"\n password="admin-secret";\n org.apache.kafka.common.security.plain.PlainLoginModule required\n username="adminplain"\n password="admin-secret"\n user_adminplain="admin-secret";\n };' > /opt/kafka/config/kafka_server_jaas.conf;
-        /opt/kafka/bin/kafka-configs.sh --zookeeper zookeeper:2181 --alter --add-config 'SCRAM-SHA-256=[password=admin-secret-256],SCRAM-SHA-512=[password=admin-secret-512]' --entity-type users --entity-name adminscram
-
-  zookeeper:
-    image: wurstmeister/zookeeper
-    ports:
-    - 2181:2181
+      KAFKA_CFG_BROKER_ID: 1
+      KAFKA_CFG_DELETE_TOPIC_ENABLE: 'true'
+      KAFKA_CFG_ADVERTISED_HOST_NAME: 'localhost'
+      KAFKA_CFG_ADVERTISED_PORT: '9092'
+      KAFKA_CFG_ZOOKEEPER_CONNECT: zookeeper:2181
+      KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE: 'true'
+      KAFKA_CFG_MESSAGE_MAX_BYTES: '200000000'
+      KAFKA_CFG_LISTENERS: 'PLAINTEXT://:9092,SASL_PLAINTEXT://:9093'
+      KAFKA_CFG_ADVERTISED_LISTENERS: 'PLAINTEXT://localhost:9092,SASL_PLAINTEXT://localhost:9093'
+      KAFKA_CFG_SASL_ENABLED_MECHANISMS: 'PLAIN,SCRAM-SHA-256,SCRAM-SHA-512'
+      KAFKA_CFG_AUTHORIZER_CLASS_NAME: 'kafka.security.auth.SimpleAclAuthorizer'
+      KAFKA_CFG_ALLOW_EVERYONE_IF_NO_ACL_FOUND: 'true'
+      KAFKA_INTER_BROKER_USER: adminplain
+      KAFKA_INTER_BROKER_PASSWORD: admin-secret
+      KAFKA_BROKER_USER: adminplain
+      KAFKA_BROKER_PASSWORD: admin-secret
+      ALLOW_PLAINTEXT_LISTENER: yes
+    entrypoint:
+      - "/bin/bash"
+      - "-c"
+      - /opt/bitnami/kafka/bin/kafka-configs.sh --zookeeper zookeeper:2181 --alter --add-config "SCRAM-SHA-256=[password=admin-secret-256],SCRAM-SHA-512=[password=admin-secret-512]" --entity-type users --entity-name adminscram; exec /entrypoint.sh /run.sh
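For reference, bringing up this stack and running the suite uses the commands already shown in the README diff above (a sketch; the KAFKA_VERSION/KAFKA_SKIP_NETTEST invocation is taken from that hunk's context):

```
docker-compose up -d
KAFKA_VERSION=2.3.1 \
  KAFKA_SKIP_NETTEST=1 \
  go test -race ./...
```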

docker_compose_versions/README.md

Lines changed: 152 additions & 0 deletions
@@ -0,0 +1,152 @@
# Bitnami Kafka

This document outlines how to create a docker-compose file for a specific Bitnami Kafka version.

## Steps to create docker-compose

- Refer to the [docker-hub Bitnami Kafka tags](https://hub.docker.com/r/bitnami/kafka/tags) and sort by NEWEST to locate the preferred image, for example: `2.7.0`.
- There is documentation in the [main branch](https://github.com/bitnami/containers/blob/main/bitnami/kafka/README.md) for environment config setup information; refer to the `Notable Changes` section.
- Sometimes there is a need to understand how the setup is done. To locate the appropriate Kafka release in the [bitnami/containers](https://github.com/bitnami/containers) repo, go through the [kafka commit history](https://github.com/bitnami/containers/commits/main/bitnami/kafka).
- Once a commit is located, refer to its README.md, Dockerfile, entrypoint, and init scripts to understand how the environment variables map to server.properties settings. Alternatively, you can spin up the required Kafka image and inspect the mapping inside the container.
- Ensure you follow the environment variable conventions in your docker-compose. Without the proper environment variables, the Kafka cluster either cannot start or starts with undesired configs. For example, since Kafka version 2.3, all server.properties docker-compose environment configs start with `KAFKA_CFG_<config_with_underscore>` (see the sketch after this list).
- Older versions of Bitnami Kafka have different conventions and expose only a limited set of docker-compose environment variables for the configs needed in server.properties.
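As an illustration of the `KAFKA_CFG_` convention (a sketch using two settings from this repo's docker-compose.yml; the mapping follows the prefix rule described above), a compose environment entry maps onto a server.properties key:

```
environment:
  # becomes auto.create.topics.enable=true in server.properties
  KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE: 'true'
  # becomes message.max.bytes=200000000 in server.properties
  KAFKA_CFG_MESSAGE_MAX_BYTES: '200000000'
```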
In kafka-go, for all the test cases to succeed, the Kafka cluster should have the following server.properties, along with a relevant kafka_jaas.conf referenced in KAFKA_OPTS. The goal is to ensure that the docker-compose file generates the server.properties below.

server.properties
```
advertised.host.name=localhost
advertised.listeners=PLAINTEXT://localhost:9092,SASL_PLAINTEXT://localhost:9093
advertised.port=9092
auto.create.topics.enable=true
broker.id=1
delete.topic.enable=true
group.initial.rebalance.delay.ms=0
listeners=PLAINTEXT://:9092,SASL_PLAINTEXT://:9093
log.dirs=/kafka/kafka-logs-1d5951569d78
log.retention.check.interval.ms=300000
log.retention.hours=168
log.segment.bytes=1073741824
message.max.bytes=200000000
num.io.threads=8
num.network.threads=3
num.partitions=1
num.recovery.threads.per.data.dir=1
offsets.topic.replication.factor=1
port=9092
sasl.enabled.mechanisms=PLAIN,SCRAM-SHA-256,SCRAM-SHA-512
socket.receive.buffer.bytes=102400
socket.request.max.bytes=104857600
socket.send.buffer.bytes=102400
transaction.state.log.min.isr=1
transaction.state.log.replication.factor=1
zookeeper.connect=zookeeper:2181
zookeeper.connection.timeout.ms=6000
```
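For reference, the kafka_jaas.conf that the compose entrypoints in this commit write with `echo -e` expands to roughly the following (adminscram/adminplain and the admin-secret values are the test credentials used throughout these files):

```
KafkaServer {
org.apache.kafka.common.security.scram.ScramLoginModule required
 username="adminscram"
 password="admin-secret";
 org.apache.kafka.common.security.plain.PlainLoginModule required
 username="adminplain"
 password="admin-secret"
 user_adminplain="admin-secret";
 };
```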
## run docker-compose and test cases

run docker-compose
```
# docker-compose -f ./docker_compose_versions/docker-compose-<kafka_version>.yml up -d
```

run test cases
```
# go clean -cache; KAFKA_SKIP_NETTEST=1 KAFKA_VERSION=<a.b.c> go test -race -cover ./...;
```

## Various Bitnami Kafka version issues observed in circleci

### Kafka v101, v111, v201, v211 and v221

In the kafka-go repo, all the tests require sasl.enabled.mechanisms to be PLAIN,SCRAM-SHA-256,SCRAM-SHA-512 on the Kafka cluster.

It has been observed that Kafka v101, v111, v201, v211 and v221, which are used for the circleci build, have issues with SCRAM.

There is no way to override the config sasl.enabled.mechanisms, which causes the Kafka cluster to start up with PLAIN only.

Some attempts were made to override sasl.enabled.mechanisms (a sketch of the entrypoint approach follows this list):
- Modified the entrypoint in docker-compose to append the relevant sasl.enabled.mechanisms config to server.properties before running entrypoint.sh. This failed for Kafka v101, v111, v201, v211 and v221: once the Kafka server starts, server.properties gets appended with the default value of sasl.enabled.mechanisms, so the cluster starts without PLAIN,SCRAM-SHA-256,SCRAM-SHA-512.
- Mounted a docker-compose volume for server.properties. This also failed for Kafka v101, v111, v201, v211 and v221 for the same reason: once the Kafka server starts, server.properties gets appended with the default value of sasl.enabled.mechanisms, so the cluster starts without PLAIN,SCRAM-SHA-256,SCRAM-SHA-512.
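The entrypoint-based attempt mirrors the override used by `docker-compose-011.yml` later in this commit; roughly:

```
# sketch: append the desired config to server.properties, then hand off to the
# image's own entrypoint (paths as used by the Bitnami images in this commit)
echo -e '\nsasl.enabled.mechanisms=PLAIN,SCRAM-SHA-256,SCRAM-SHA-512' >> /opt/bitnami/kafka/config/server.properties
exec /app-entrypoint.sh /run.sh
```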
NOTE:
- Kafka v101, v111, v201, v211 and v221 have no docker-compose files, since we need SCRAM for the kafka-go test cases to succeed.
- There is no Bitnami Kafka image for v222, hence testing has been performed on v221.

### Kafka v231

In Bitnami Kafka v2.3, all server.properties docker-compose environment configs start with `KAFKA_CFG_<config_with_underscore>`. However, this version does not pick up the custom-populated kafka_jaas.conf.

After a lot of debugging, it was noticed that there aren't enough privileges to create the kafka_jaas.conf. Hence the environment variables below need to be added to docker-compose so that the kafka_jaas.conf gets generated. This issue is not observed after Kafka v2.3.

```
KAFKA_INTER_BROKER_USER: adminplain
KAFKA_INTER_BROKER_PASSWORD: admin-secret
KAFKA_BROKER_USER: adminplain
KAFKA_BROKER_PASSWORD: admin-secret
```

There is a docker-compose file `docker-compose-231.yml` in the folder `kafka-go/docker_compose_versions` for reference.

## References

For reference, here are some of the older Kafka version commits from the [kafka commit history](https://github.com/bitnami/containers/commits/main/bitnami/kafka). For Kafka versions with no commit history, the data is populated with the latest version available for the tag.

### Kafka v010: docker-compose reference: `kafka-go/docker_compose_versions/docker-compose-010.yml`
- [tag](https://hub.docker.com/r/bitnami/kafka/tags?page=1&ordering=last_updated&name=0.10.2.1)
- [kafka commit](https://github.com/bitnami/containers/tree/c4240f0525916a418245c7ef46d9534a7a212c92/bitnami/kafka)

### Kafka v011: docker-compose reference: `kafka-go/docker_compose_versions/docker-compose-011.yml`
- [tag](https://hub.docker.com/r/bitnami/kafka/tags?page=1&ordering=last_updated&name=0.11.0)
- [kafka commit](https://github.com/bitnami/containers/tree/7724adf655e4ca9aac69d606d41ad329ef31eeca/bitnami/kafka)

### Kafka v101: docker-compose reference: N/A
- [tag](https://hub.docker.com/r/bitnami/kafka/tags?page=1&ordering=last_updated&name=1.0.1)
- [kafka commit](https://github.com/bitnami/containers/tree/44cc8f4c43ead6edebd3758c8df878f4f9da82c2/bitnami/kafka)

### Kafka v111: docker-compose reference: N/A
- [tag](https://hub.docker.com/r/bitnami/kafka/tags?page=1&ordering=last_updated&name=1.1.1)
- [kafka commit](https://github.com/bitnami/containers/tree/cb593dc98c2eb7a39f2792641e741d395dbe50e7/bitnami/kafka)

### Kafka v201: docker-compose reference: N/A
- [tag](https://hub.docker.com/r/bitnami/kafka/tags?page=1&ordering=last_updated&name=2.0.1)
- [kafka commit](https://github.com/bitnami/containers/tree/9ff8763df265c87c8b59f8d7ff0cf69299d636c9/bitnami/kafka)

### Kafka v211: docker-compose reference: N/A
- [tag](https://hub.docker.com/r/bitnami/kafka/tags?page=1&ordering=last_updated&name=2.1.1)
- [kafka commit](https://github.com/bitnami/containers/tree/d3a9d40afc2b7e7de53486538a63084c1a565d43/bitnami/kafka)

### Kafka v221: docker-compose reference: N/A
- [tag](https://hub.docker.com/r/bitnami/kafka/tags?page=1&ordering=last_updated&name=2.2.1)
- [kafka commit](https://github.com/bitnami/containers/tree/f132ef830d1ba9b78392ec4619174b4640c276c9/bitnami/kafka)

### Kafka v231: docker-compose reference: `kafka-go/docker_compose_versions/docker-compose-231.yml`
- [tag](https://hub.docker.com/r/bitnami/kafka/tags?page=1&ordering=last_updated&name=2.3.1)
- [kafka commit](https://github.com/bitnami/containers/tree/ae572036b5281456b0086345fec0bdb74f7cf3a3/bitnami/kafka)
docker_compose_versions/docker-compose-010.yml

Lines changed: 39 additions & 0 deletions
@@ -0,0 +1,39 @@
# See https://hub.docker.com/r/bitnami/kafka/tags for the complete list.
version: '3'
services:
  zookeeper:
    container_name: zookeeper
    hostname: zookeeper
    image: bitnami/zookeeper:latest
    ports:
      - 2181:2181
    environment:
      ALLOW_ANONYMOUS_LOGIN: yes
  kafka:
    container_name: kafka
    image: bitnami/kafka:0.10.2.1
    restart: on-failure:3
    links:
      - zookeeper
    ports:
      - 9092:9092
      - 9093:9093
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_DELETE_TOPIC_ENABLE: 'true'
      KAFKA_ADVERTISED_HOST_NAME: 'localhost'
      KAFKA_ADVERTISED_PORT: '9092'
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: 'true'
      KAFKA_MESSAGE_MAX_BYTES: '200000000'
      KAFKA_LISTENERS: 'PLAINTEXT://:9092,SASL_PLAINTEXT://:9093'
      KAFKA_ADVERTISED_LISTENERS: 'PLAINTEXT://localhost:9092,SASL_PLAINTEXT://localhost:9093'
      KAFKA_SASL_ENABLED_MECHANISMS: 'PLAIN,SCRAM-SHA-256,SCRAM-SHA-512'
      KAFKA_AUTHORIZER_CLASS_NAME: 'kafka.security.auth.SimpleAclAuthorizer'
      KAFKA_ALLOW_EVERYONE_IF_NO_ACL_FOUND: 'true'
      KAFKA_OPTS: "-Djava.security.auth.login.config=/opt/bitnami/kafka/config/kafka_server_jaas.conf"
      ALLOW_PLAINTEXT_LISTENER: yes
    entrypoint:
      - "/bin/bash"
      - "-c"
      - echo -e 'KafkaServer {\norg.apache.kafka.common.security.scram.ScramLoginModule required\n username="adminscram"\n password="admin-secret";\n org.apache.kafka.common.security.plain.PlainLoginModule required\n username="adminplain"\n password="admin-secret"\n user_adminplain="admin-secret";\n };' > /opt/bitnami/kafka/config/kafka_server_jaas.conf; /opt/bitnami/kafka/bin/kafka-configs.sh --zookeeper zookeeper:2181 --alter --add-config 'SCRAM-SHA-256=[password=admin-secret-256],SCRAM-SHA-512=[password=admin-secret-512]' --entity-type users --entity-name adminscram; exec /app-entrypoint.sh /start-kafka.sh
docker_compose_versions/docker-compose-011.yml

Lines changed: 36 additions & 0 deletions
@@ -0,0 +1,36 @@
# See https://hub.docker.com/r/bitnami/kafka/tags for the complete list.
version: '3'
services:
  zookeeper:
    container_name: zookeeper
    hostname: zookeeper
    image: bitnami/zookeeper:latest
    ports:
      - 2181:2181
    environment:
      ALLOW_ANONYMOUS_LOGIN: yes
  kafka:
    container_name: kafka
    image: bitnami/kafka:0.11.0-1-r1
    restart: on-failure:3
    links:
      - zookeeper
    ports:
      - 9092:9092
      - 9093:9093
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_DELETE_TOPIC_ENABLE: 'true'
      KAFKA_ADVERTISED_HOST_NAME: 'localhost'
      KAFKA_ADVERTISED_PORT: '9092'
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENERS: 'PLAINTEXT://:9092,SASL_PLAINTEXT://:9093'
      KAFKA_ADVERTISED_LISTENERS: 'PLAINTEXT://localhost:9092,SASL_PLAINTEXT://localhost:9093'
      KAFKA_ALLOW_EVERYONE_IF_NO_ACL_FOUND: 'true'
      KAFKA_OPTS: "-Djava.security.auth.login.config=/opt/bitnami/kafka/config/kafka_server_jaas.conf"
      ALLOW_PLAINTEXT_LISTENER: "yes"
    entrypoint:
      - "/bin/bash"
      - "-c"
      # 0.11.0 image is not honoring some configs required in server.properties
      - echo -e '\nsasl.enabled.mechanisms=PLAIN,SCRAM-SHA-256,SCRAM-SHA-512\nmessage.max.bytes=200000000\nauto.create.topics.enable=true\nport=9092' >> /opt/bitnami/kafka/config/server.properties; echo -e 'KafkaServer {\norg.apache.kafka.common.security.scram.ScramLoginModule required\n username="adminscram"\n password="admin-secret";\n org.apache.kafka.common.security.plain.PlainLoginModule required\n username="adminplain"\n password="admin-secret"\n user_adminplain="admin-secret";\n };' > /opt/bitnami/kafka/config/kafka_server_jaas.conf; /opt/bitnami/kafka/bin/kafka-configs.sh --zookeeper zookeeper:2181 --alter --add-config 'SCRAM-SHA-256=[password=admin-secret-256],SCRAM-SHA-512=[password=admin-secret-512]' --entity-type users --entity-name adminscram; exec /app-entrypoint.sh /run.sh
