Multi-Cluster Support for EKS and RKE2 Clusters (Rancher) #8035
Labels
- kind/question — An issue that reports a question about the project
- stale — Automatic label applied to issues that have gone inactive, to be closed if there is no further action
I have read the multi-cluster documentation, but I need support with the specific configuration. Please help...
In my use case, the management cluster is an EKS cluster with Kubeapps installed on it. In addition, I have multiple EKS clusters (in AWS accounts different from the management cluster's account) and on-premises RKE2 clusters, all of which I manage as downstream clusters in Rancher.
From reading the documentation, I understand that I need to maintain the authProxy and clusters sections of the Helm chart values. For the clusters section, I was able to obtain the necessary information about the clusters other than the management cluster using AWS CLI commands, as shown below.
AWS CLI
Helm Chart values.yaml
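Roughly, the commands and the resulting clusters section look like this (cluster names, endpoints, and CA data are placeholders, not my real values):

```yaml
# Sketch of the clusters section in values.yaml (placeholder values).
# The endpoint and CA bundle for each EKS cluster can be fetched with:
#   aws eks describe-cluster --name prod-eks --query cluster.endpoint --output text
#   aws eks describe-cluster --name prod-eks --query cluster.certificateAuthority.data --output text
clusters:
  - name: default            # the management cluster that Kubeapps runs on
  - name: prod-eks           # downstream EKS cluster (different AWS account)
    apiServiceURL: https://EXAMPLE1234567890.gr7.ap-northeast-1.eks.amazonaws.com
    certificateAuthorityData: LS0tLS1CRUdJTi...   # base64 CA bundle from describe-cluster
  - name: onprem-rke2        # downstream RKE2 cluster managed by Rancher
    apiServiceURL: https://rke2.example.internal:6443
    certificateAuthorityData: LS0tLS1CRUdJTi...
```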
However, it seems that I cannot perform a Helm upgrade without the authProxy section. I believe the settings described under "Configuring your Kubernetes API servers for OIDC" in the documentation are what is needed here.
Unfortunately, my knowledge of OIDC is limited, and I could not work it out even after reading the documentation. What steps should I follow to configure the EKS and RKE2 clusters and fill in the authProxy section?
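For context, this is my understanding of what the authProxy section should roughly look like, based on the chart's values; all IDs, secrets, and the issuer URL are placeholders, and the exact flag key may differ between chart versions:

```yaml
# Sketch of the authProxy section (placeholder values).
# Kubeapps fronts the dashboard with an OAuth2/OIDC proxy.
authProxy:
  enabled: true
  provider: oidc
  clientID: kubeapps                 # client registered with the IdP
  clientSecret: REPLACE_ME
  cookieSecret: REPLACE_ME           # random base64 string for cookie encryption
  extraFlags:
    - --oidc-issuer-url=https://idp.example.com/   # the IdP's issuer URL
```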
Also, is it possible to use an IAM IdP for this purpose?
Additionally, for every cluster, whether on-premises or EKS, we have registered the URL of the IAM IdP. (For the on-premises clusters, we have set up the IdP in S3.) Given that, can we now configure the authProxy section for multi-cluster?
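For the EKS clusters, the IdP registration was done along these lines (a sketch with placeholder names, not the exact command we ran):

```shell
# Hypothetical sketch: associate an OIDC identity provider with an EKS cluster
# so its API server accepts tokens issued by that IdP.
aws eks associate-identity-provider-config \
  --cluster-name prod-eks \
  --oidc identityProviderConfigName=kubeapps-oidc,issuerUrl=https://idp.example.com/,clientId=kubeapps,usernameClaim=email
```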