Trouble migrating from configmap to access entries #3424

@zoumengguang

Description

Hi,

We are currently experiencing the following problem while attempting to upgrade the Terraform EKS module from v17.24 to v20.34.0, which adds support for access entries that we would like to migrate to. Although we are using the access_entries field and passing the list of access entries to be created, and terraform plan correctly shows those entries as pending creation, the plan still fails with the following error message:

Error: configmaps "aws-auth" is forbidden: User "system:serviceaccount:jenkins:default" cannot get resource "configmaps" in API group "" in the namespace "kube-system"

For context, we run Terraform from Jenkins pipeline jobs to perform infrastructure updates. I have confirmed that the corresponding access entry resources module.eks.aws_eks_access_entry.this["arn:aws:iam::[AWS_ACCOUNT_ID]:user/jenkins"] and module.eks.aws_eks_access_policy_association.this["arn:aws:iam::[AWS_ACCOUNT_ID]:user/jenkins_0"] are marked for creation in terraform plan, as seen in the screenshots below. This problem only occurs when upgrading an existing cluster; we do not see the error when creating a new cluster with the v20.34.0 module from the beginning, even without the enable_cluster_creator_admin_permissions parameter. I have also tried setting enable_cluster_creator_admin_permissions to true for the update, but I assume this only takes effect at cluster creation time, granting the original creating identity cluster-admin permissions.
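One migration step that may be relevant here (a sketch, assuming the v17-era state still contains the module-managed aws-auth ConfigMap — in v20 the ConfigMap handling moved out of the main module into a separate aws-auth submodule, so a leftover state entry can make plan try to read the ConfigMap via the Kubernetes provider; the exact state address below is an assumption to verify first):

```shell
# Confirm whether the legacy module-managed aws-auth ConfigMap is still in state:
terraform state list | grep -i aws_auth

# If so, drop it from state before planning against v20. The address below is
# hypothetical -- use whatever the previous command printed. `state rm` only
# forgets the resource; it does not touch the live ConfigMap in the cluster.
terraform state rm 'module.eks.kubernetes_config_map.aws_auth[0]'
```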

Regarding this being a possible RBAC issue, I have examined the cluster's provisioned resources and there doesn't seem to be any problem in that respect:

% kubectl auth can-i get configmap/aws-auth --as=system:serviceaccount:jenkins:default -n kube-system

yes
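Worth noting: `kubectl auth can-i --as` performs a server-side RBAC check via impersonation, so it can answer "yes" even when the actual request fails for a different reason (stale token, different cluster endpoint, or an RBAC binding that did not exist at the moment the provider made its request). A sketch of a direct check with the ServiceAccount's own mounted token, assuming it is run from inside the Jenkins pod:

```shell
# Exercise the same credentials the Terraform Kubernetes provider would use,
# rather than an impersonated RBAC query (path assumes an in-cluster pod):
TOKEN=$(cat /var/run/secrets/kubernetes.io/serviceaccount/token)
kubectl --token="$TOKEN" get configmap aws-auth -n kube-system
```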

I have searched for previous issues and although they are related, the solutions suggested in those did not seem to solve our problem.

Thanks.

  • [x] ✋ I have searched the open/closed issues and my issue is not listed.

Versions

  • Module version [Required]:

    • EKS: v17.24.0 -> v20.34.0
  • Provider version(s):

    • AWS: v5.90.1
    • Kubernetes: v2.38.0
  • Terraform version:

    • v1.8.3

Reproduction Code [Required]

module "eks" {
  source = "XXX" // omitted

  cluster_name = var.cluster_short_name
  cluster_fqdn = var.cluster_name
  cluster_version = var.cluster_version
  cluster_enabled_log_types = var.cluster_enabled_log_types
  subnet_ids = module.vpc.private_subnets
  tags = merge(
    local.tags,
    {
      "KubernetesCluster" = var.cluster_short_name
    },
  )
  vpc_id = module.vpc.vpc_id
  self_managed_node_groups = local.self_managed_node_groups_format_policies
  cluster_additional_security_group_ids = [aws_security_group.worker_bastion.id, aws_security_group.efs.id]
  self_managed_node_group_defaults = local.self_managed_node_group_defaults
  cluster_timeouts = {
    delete = var.cluster_delete_timeout
    create = var.cluster_create_timeout
  }
  create_iam_role = true
  cluster_endpoint_private_access = true
  cluster_endpoint_public_access  = true
  enable_irsa = true
  authentication_mode = "API_AND_CONFIG_MAP"
  access_entries = local.merged_access_entries
  create_kms_key = true
  cluster_encryption_config = {
    resources = ["secrets"]
  }
}
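The local.merged_access_entries value is omitted from the config above; for reference, the module's access_entries input expects a map roughly of this shape (a sketch with hypothetical names and ARNs, not the actual values used here):

```hcl
# Hypothetical illustration of the shape consumed by access_entries;
# the real local.merged_access_entries is not shown in this report.
locals {
  merged_access_entries = {
    jenkins = {
      principal_arn = "arn:aws:iam::111122223333:user/jenkins"
      policy_associations = {
        admin = {
          policy_arn = "arn:aws:eks::aws:cluster-access-policy/AmazonEKSClusterAdminPolicy"
          access_scope = {
            type = "cluster"
          }
        }
      }
    }
  }
}
```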

If there is any other required information from the config please let me know.

Steps to reproduce the behavior:

Cluster infra updates are performed in a Jenkins job, which creates a separate workspace for each update operation; the workspace is subsequently deleted after completion or failure.

  • Tried updating the EKS module version from v17.24.0 -> v20.34.0 to adopt access entries
  • Ran terraform plan, which fails with the above error message and does not finish executing

Expected behavior

Expected terraform plan to finish and all listed access entries to be created without issue.

Actual behavior

Terraform plan fails with

Error: configmaps "aws-auth" is forbidden: User "system:serviceaccount:jenkins:default" cannot get resource "configmaps" in API group "" in the namespace "kube-system"
[Screenshots: terraform plan output showing the access entry resources marked for creation]
