
kong deploy failed when loading from local private hub in China #10126

@huanghaiqing1

Description

What happened?

Hello, I want to deploy kubernetes/dashboard locally. I can't use "helm upgrade --install kubernetes-dashboard kubernetes-dashboard/kubernetes-dashboard --create-namespace --namespace kubernetes-dashboard" directly, because the related images are not accessible from my site. So I mirrored the related images into a local private hub and adjusted values.yaml for the deployment, but the kong pod still fails to run in my k8s cluster. Any suggestions? Below are the related settings for your reference:

Deploy command:
helm upgrade --install kubernetes-dashboard kubernetes-dashboard/kubernetes-dashboard --create-namespace --namespace kubernetes-dashboard -f values.yaml
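
The adjusted values.yaml itself is not shown here; the image overrides in it look along these lines (a sketch only: the exact key paths depend on the kubernetes-dashboard chart version, so verify them against the chart's bundled values.yaml):

kong:
  image:
    repository: k8sma:5000/kong    # key paths assumed from the chart v7 layout
    tag: "3.6"
auth:
  image:
    repository: k8sma:5000/kubernetesui/dashboard-auth
    tag: "1.2.2"
api:
  image:
    repository: k8sma:5000/kubernetesui/dashboard-api
    tag: "1.10.1"
web:
  image:
    repository: k8sma:5000/kubernetesui/dashboard-web
    tag: "1.6.0"
metricsScraper:
  image:
    repository: k8sma:5000/kubernetesui/dashboard-metrics-scraper
    tag: "1.2.1"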

kubectl describe output for the failing kong pod:

Name: kubernetes-dashboard-kong-98bbbb69b-trbcd
Namespace: kubernetes-dashboard
Priority: 0
Service Account: kubernetes-dashboard-kong
Node: k8swb/192.168.31.114
Start Time: Fri, 24 Jan 2025 15:00:54 +0800
Labels: app=kubernetes-dashboard-kong
app.kubernetes.io/component=app
app.kubernetes.io/instance=kubernetes-dashboard
app.kubernetes.io/managed-by=Helm
app.kubernetes.io/name=kong
app.kubernetes.io/version=3.6
helm.sh/chart=kong-2.38.0
pod-template-hash=98bbbb69b
version=3.6
Annotations: cni.projectcalico.org/containerID: 6010c141e95b01291c0144ed72c2455ccfe9f873915edd7ed5a35dd8c212a595
cni.projectcalico.org/podIP: 192.2.239.146/32
cni.projectcalico.org/podIPs: 192.2.239.146/32
kuma.io/gateway: enabled
kuma.io/service-account-token-volume: kubernetes-dashboard-kong-token
traffic.sidecar.istio.io/includeInboundPorts:
Status: Running
IP: 192.2.239.146
IPs:
IP: 192.2.239.146
Controlled By: ReplicaSet/kubernetes-dashboard-kong-98bbbb69b
Init Containers:
clear-stale-pid:
Container ID: containerd://e1d37e9ba084cd4d0fde15493f20a250e9964adf393d76c63e41afe7e047b838
Image: k8sma:5000/kong:3.6
Image ID: k8sma:5000/kong@sha256:ec2910c74bc16d05d5dcd2fdde6ff366797cb08d64be55dfa94d9eb1220c8a3e
Port: <none>
Host Port: <none>
SeccompProfile: RuntimeDefault
Command:
rm
-vrf
$KONG_PREFIX/pids
State: Terminated
Reason: Completed
Exit Code: 0
Started: Thu, 17 Apr 2025 15:56:49 +0800
Finished: Thu, 17 Apr 2025 15:56:49 +0800
Ready: True
Restart Count: 22
Environment:
KONG_ADMIN_ACCESS_LOG: /dev/stdout
KONG_ADMIN_ERROR_LOG: /dev/stderr
KONG_ADMIN_GUI_ACCESS_LOG: /dev/stdout
KONG_ADMIN_GUI_ERROR_LOG: /dev/stderr
KONG_ADMIN_LISTEN: 127.0.0.1:8444 http2 ssl, [::1]:8444 http2 ssl
KONG_CLUSTER_LISTEN: off
KONG_DATABASE: off
KONG_DECLARATIVE_CONFIG: /kong_dbless/kong.yml
KONG_DNS_ORDER: LAST,A,CNAME,AAAA,SRV
KONG_LUA_PACKAGE_PATH: /opt/?.lua;/opt/?/init.lua;;
KONG_NGINX_WORKER_PROCESSES: 1
KONG_PLUGINS: off
KONG_PORTAL_API_ACCESS_LOG: /dev/stdout
KONG_PORTAL_API_ERROR_LOG: /dev/stderr
KONG_PORT_MAPS: 443:8443
KONG_PREFIX: /kong_prefix/
KONG_PROXY_ACCESS_LOG: /dev/stdout
KONG_PROXY_ERROR_LOG: /dev/stderr
KONG_PROXY_LISTEN: 0.0.0.0:8443 http2 ssl, [::]:8443 http2 ssl
KONG_PROXY_STREAM_ACCESS_LOG: /dev/stdout basic
KONG_PROXY_STREAM_ERROR_LOG: /dev/stderr
KONG_ROUTER_FLAVOR: traditional
KONG_STATUS_ACCESS_LOG: off
KONG_STATUS_ERROR_LOG: /dev/stderr
KONG_STATUS_LISTEN: 0.0.0.0:8100, [::]:8100
KONG_STREAM_LISTEN: off
Mounts:
/kong_dbless/ from kong-custom-dbless-config-volume (rw)
/kong_prefix/ from kubernetes-dashboard-kong-prefix-dir (rw)
/tmp from kubernetes-dashboard-kong-tmp (rw)
Containers:
proxy:
Container ID: containerd://9eeef46129c112835de7a1463c1bfba226984192c67cc9cb4076d0d357ebc085
Image: k8sma:5000/kong:3.6
Image ID: k8sma:5000/kong@sha256:ec2910c74bc16d05d5dcd2fdde6ff366797cb08d64be55dfa94d9eb1220c8a3e
Ports: 8443/TCP, 8100/TCP
Host Ports: 0/TCP, 0/TCP
SeccompProfile: RuntimeDefault
State: Waiting
Reason: CrashLoopBackOff
Last State: Terminated
Reason: Error
Exit Code: 1
Started: Thu, 17 Apr 2025 16:13:02 +0800
Finished: Thu, 17 Apr 2025 16:13:03 +0800
Ready: False
Restart Count: 2215
Liveness: http-get http://:status/status delay=5s timeout=5s period=10s #success=1 #failure=3
Readiness: http-get http://:status/status/ready delay=5s timeout=5s period=10s #success=1 #failure=3
Environment:
KONG_ADMIN_ACCESS_LOG: /dev/stdout
KONG_ADMIN_ERROR_LOG: /dev/stderr
KONG_ADMIN_GUI_ACCESS_LOG: /dev/stdout
KONG_ADMIN_GUI_ERROR_LOG: /dev/stderr
KONG_ADMIN_LISTEN: 127.0.0.1:8444 http2 ssl, [::1]:8444 http2 ssl
KONG_CLUSTER_LISTEN: off
KONG_DATABASE: off
KONG_DECLARATIVE_CONFIG: /kong_dbless/kong.yml
KONG_DNS_ORDER: LAST,A,CNAME,AAAA,SRV
KONG_LUA_PACKAGE_PATH: /opt/?.lua;/opt/?/init.lua;;
KONG_NGINX_WORKER_PROCESSES: 1
KONG_PLUGINS: off
KONG_PORTAL_API_ACCESS_LOG: /dev/stdout
KONG_PORTAL_API_ERROR_LOG: /dev/stderr
KONG_PORT_MAPS: 443:8443
KONG_PREFIX: /kong_prefix/
KONG_PROXY_ACCESS_LOG: /dev/stdout
KONG_PROXY_ERROR_LOG: /dev/stderr
KONG_PROXY_LISTEN: 0.0.0.0:8443 http2 ssl, [::]:8443 http2 ssl
KONG_PROXY_STREAM_ACCESS_LOG: /dev/stdout basic
KONG_PROXY_STREAM_ERROR_LOG: /dev/stderr
KONG_ROUTER_FLAVOR: traditional
KONG_STATUS_ACCESS_LOG: off
KONG_STATUS_ERROR_LOG: /dev/stderr
KONG_STATUS_LISTEN: 0.0.0.0:8100, [::]:8100
KONG_STREAM_LISTEN: off
KONG_NGINX_DAEMON: off
Mounts:
/kong_dbless/ from kong-custom-dbless-config-volume (rw)
/kong_prefix/ from kubernetes-dashboard-kong-prefix-dir (rw)
/tmp from kubernetes-dashboard-kong-tmp (rw)
Conditions:
Type Status
PodReadyToStartContainers True
Initialized True
Ready False
ContainersReady False
PodScheduled True
Volumes:
kubernetes-dashboard-kong-prefix-dir:
Type: EmptyDir (a temporary directory that shares a pod's lifetime)
Medium:
SizeLimit: 256Mi
kubernetes-dashboard-kong-tmp:
Type: EmptyDir (a temporary directory that shares a pod's lifetime)
Medium:
SizeLimit: 1Gi
kubernetes-dashboard-kong-token:
Type: Projected (a volume that contains injected data from multiple sources)
TokenExpirationSeconds: 3607
ConfigMapName: kube-root-ca.crt
ConfigMapOptional: <nil>
DownwardAPI: true
kong-custom-dbless-config-volume:
Type: ConfigMap (a volume populated by a ConfigMap)
Name: kong-dbless-config
Optional: false
QoS Class: BestEffort
Node-Selectors: <none>
Tolerations: node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
Type Reason Age From Message
---- ------ ---- ---- -------
Normal Pulled 14d (x42 over 14d) kubelet Container image "k8sma:5000/kong:3.6" already present on machine
Warning BackOff 13d (x1170 over 14d) kubelet Back-off restarting failed container proxy in pod kubernetes-dashboard-kong-98bbbb69b-trbcd_kubernetes-dashboard(c60daf2a-f90d-4c0b-8ec9-9c4767a19213)
Normal SandboxChanged 19m (x2 over 20m) kubelet Pod sandbox changed, it will be killed and re-created.
Normal Pulled 19m kubelet Container image "k8sma:5000/kong:3.6" already present on machine
Normal Created 19m kubelet Created container clear-stale-pid
Normal Started 19m kubelet Started container clear-stale-pid
Normal Pulled 18m (x3 over 19m) kubelet Container image "k8sma:5000/kong:3.6" already present on machine
Normal Created 18m (x3 over 19m) kubelet Created container proxy
Normal Started 18m (x3 over 19m) kubelet Started container proxy
Warning BackOff 1s (x103 over 19m) kubelet Back-off restarting failed container proxy in pod kubernetes-dashboard-kong-98bbbb69b-trbcd_kubernetes-dashboard(c60daf2a-f90d-4c0b-8ec9-9c4767a19213)
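
The proxy container exits with code 1 within about a second of starting, so the real error should be in the container logs rather than in the events above. It can be pulled from the previous (crashed) attempt like this:

kubectl logs -n kubernetes-dashboard kubernetes-dashboard-kong-98bbbb69b-trbcd -c proxy --previous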

Docker images in the private hub:

REPOSITORY TAG IMAGE ID CREATED SIZE
kubernetesui/dashboard-web 1.6.0 96b21277cbef 5 months ago 188MB
k8sma:5000/kubernetesui/dashboard-web 1.6.0 96b21277cbef 5 months ago 188MB
kubernetesui/dashboard-api 1.10.1 aa69cebab7a8 5 months ago 54.6MB
k8sma:5000/kubernetesui/dashboard-api 1.10.1 aa69cebab7a8 5 months ago 54.6MB
k8sma:5000/kubernetesui/dashboard-auth 1.2.2 45a495c0887d 5 months ago 48MB
kubernetesui/dashboard-auth 1.2.2 45a495c0887d 5 months ago 48MB
kubernetesui/dashboard-metrics-scraper 1.2.1 46e3f823d18f 5 months ago 38.2MB
k8sma:5000/kubernetesui/dashboard-metrics-scraper 1.2.1 46e3f823d18f 5 months ago 38.2MB
kong 3.6 6e99fd0ebd1e 10 months ago 297MB
k8sma:5000/kong 3.6 6e99fd0ebd1e 10 months ago 297MB
k8sma:5000/library/kong 3.6 6e99fd0ebd1e 10 months ago 297MB
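
The images were mirrored into the hub with the usual pull/tag/push flow, roughly like this for each image (kong shown; the kubernetesui/* images were handled the same way):

docker pull kong:3.6
docker tag kong:3.6 k8sma:5000/kong:3.6
docker push k8sma:5000/kong:3.6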

What did you expect to happen?

kong should deploy and run without warnings in my k8s cluster.

How can we reproduce it (as minimally and precisely as possible)?

NAME READY STATUS RESTARTS AGE
kubernetes-dashboard-api-9b8464959-2g5kf 1/1 Running 22 (30m ago) 99d
kubernetes-dashboard-auth-657444bc9f-d7d6s 1/1 Running 22 (30m ago) 99d
kubernetes-dashboard-kong-98bbbb69b-trbcd 0/1 CrashLoopBackOff 2217 (3m43s ago) 99d
kubernetes-dashboard-metrics-scraper-74bfb95c9b-5l6wm 1/1 Running 22 (30m ago) 99d
kubernetes-dashboard-web-7b469dc74c-2494l 1/1 Running 22 (30m ago) 99d
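
The listing above was presumably captured with the standard pod query:

kubectl get pods -n kubernetes-dashboard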

Anything else we need to know?

No response

What browsers are you seeing the problem on?

No response

Kubernetes Dashboard version

1.6.0

Kubernetes version

Client Version: v1.30.2
Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
Server Version: v1.30.8

Dev environment

No response

Labels

kind/bug