Make hubs compatible with Z2JH 1.0.0 #414

Closed
9 of 11 tasks
consideRatio opened this issue May 16, 2021 · 4 comments · Fixed by #439
Labels
Enhancement An improvement to something or creating something new.

Comments

@consideRatio (Contributor) commented May 16, 2021

Background

Z2JH 1.0.0 introduces breaking changes, so we must make some adjustments.

Expected timeline

End of May.

Steps to complete this goal

  • Verify outcomes of schema validation using instructions here.
    • Verified outcomes of using basehub and its default values (-> relocate singleuser.admin / cloudResources)
    • Verified outcomes of using daskhub and its default values (-> relocate singleuser.admin / cloudResources)
    • Manually verified the cluster specific deployment configuration (-> relocate homepage)
  • Verify we work against k8s 1.17+ clusters
  • Verify we use a helm client versioned 3.5+
  • Acknowledge that deleting users will now also delete the user's associated PVC
  • Acknowledge that pods in the JupyterHub helm chart no longer have predefined resource requests (cpu/memory). See Don't set default resource requests jupyterhub/zero-to-jupyterhub-k8s#2034.
  • Relocate cloudResources to custom.cloudResources (?) where Z2JH 1.0.0 won't complain about a passed value it doesn't recognize.
  • Relocate singleuser.admin to custom.singleuserAdmin (?) where Z2JH 1.0.0 won't complain about a passed value it doesn't recognize.
  • Relocate homepage to custom.homepage or perhaps global.2i2c.homepage where Z2JH 1.0.0 won't complain about a passed value it doesn't recognize. Note that global variables are not made accessible from get_config, though such a feature would be acceptable to implement, I think. (See the sketch below this list.)
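To make the relocation concrete, here is a minimal sketch of how a hub's values could look afterwards, assuming everything lands under jupyterhub.custom (the only place the strict Z2JH 1.0.0 schema tolerates arbitrary extra keys). The key name singleuserAdmin and the empty contents are placeholders, not settled choices:

```yaml
# Sketch only: key names under custom are ours to choose.
jupyterhub:
  custom:
    cloudResources: {}      # was jupyterhub.cloudResources
    singleuserAdmin: {}     # was (jupyterhub.)singleuser.admin
    homepage: {}            # was jupyterhub.homepage
```

Anything under custom remains reachable from the hub via get_config("custom.<key>"), which is what makes custom a workable home for these values, unlike global.*, which get_config does not expose today.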
@choldgraf added the 🏷️ CI/CD and Enhancement labels on May 16, 2021
yuvipanda added a commit to yuvipanda/pilot-hubs that referenced this issue May 16, 2021
With https://github.com/jupyterhub/zero-to-jupyterhub-k8s/pull/2200,
helm will default to not allowing arbitrary extra properties anywhere
except under jupyterhub.custom. This helps us move towards 1.0.

Ref 2i2c-org#414
@yuvipanda (Member) commented May 16, 2021

As usual, a very thorough and well-thought-out issue, @consideRatio. THANK YOU.

I've verified that all our current clusters are on at least k8s 1.19. I opened #412 to decide ongoing policy.

Our GitHub Actions workflows use https://github.com/Azure/setup-helm, which installs the latest stable version of helm, so that's 3.5.
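If we ever want to guarantee the 3.5+ requirement rather than track whatever is latest, setup-helm accepts an explicit version input; a small sketch, with the step name being illustrative:

```yaml
# Illustrative workflow step: pin helm instead of relying on latest stable.
- name: Install helm
  uses: azure/setup-helm@v1
  with:
    version: v3.5.4   # any 3.5+ release satisfies the Z2JH 1.0.0 requirement
```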

I opened #415 for custom.cloudResources. I've some more thinking to do for singleuser.admin.

We use a single pre-configured PVC (https://github.com/2i2c-org/pilot-hubs/blob/master/hub-templates/basehub/templates/nfs-pvc.yaml) for all users and don't use dynamic provisioning at all, so deletion of PVCs doesn't affect us. https://pilot-hubs.2i2c.org/en/latest/topic/storage-layer.html has more info about our home directory storage.
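For readers unfamiliar with that setup, the pattern is roughly the following (a simplified sketch using Z2JH's static storage options; the PVC name and subPath are illustrative, and the shared PVC itself is created separately by nfs-pvc.yaml rather than by the spawner):

```yaml
# Simplified sketch: every user pod mounts the same pre-created, NFS-backed PVC,
# so there are no per-user PVCs for Z2JH 1.0.0's user-deletion behavior to remove.
jupyterhub:
  singleuser:
    storage:
      type: static
      static:
        pvcName: home-nfs          # illustrative name for the shared PVC
        subPath: "home/{username}" # each user gets a subdirectory, not a PVC
```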

@consideRatio (Contributor, Author) commented

@yuvipanda I scanned the cluster-specific configs and found jupyterhub.homepage was referenced as well; that will also cause issues.

@yuvipanda (Member) commented

#417 just moved the homepage stuff under custom; I think that's good enough for now.

@yuvipanda (Member) commented

> I've some more thinking to do for singleuser.admin.

We should just move it under custom for now, I think.
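If it does move under custom, the hub can still read the value at runtime through get_config; a rough sketch under the assumption that the key is named custom.singleuserAdmin and consumed from extraConfig (what we actually do with the value is left open):

```yaml
jupyterhub:
  custom:
    singleuserAdmin: {}   # hypothetical new home for what was singleuser.admin
  hub:
    extraConfig:
      01-singleuser-admin: |
        # z2jh.get_config reads chart values mounted into the hub pod,
        # including anything placed under custom.
        from z2jh import get_config
        admin_config = get_config("custom.singleuserAdmin", {})
        # ...apply admin_config to spawner/profile settings as needed...
```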
