[EVENT] ICESat-2 Hackweek 2023 #2889
Comments
@consideRatio, feel free to ask @scottyhq any further questions/details needed to fulfill the above checkpoint.
@scottyhq I have some questions:

Authorization and machine types

Guessing what may be practical, I suggest that a new GitHub team is created, and that this team is allowed to start on a 16 CPU / 128 GB machine by default, with perhaps 4, 8, or 16 GB of memory requested by default.

Pre-started nodes and pre-downloaded images

If you want, we can optimize the startup time by ensuring nodes are already started and images are already downloaded on them. This would have machines on standby, incurring more cost. Do you wish for this @scottyhq, and if so, for how many users should we guarantee a fast startup time by having capacity pre-allocated? Also for this, it's relevant that we know what image is to be used ahead of time, as pre-downloading images must be specified via config rather than the configurator at https://hub.cryointhecloud.com/services/configurator/. Let me know what image(s) are to be used if you want pre-started nodes (see the sketch below for how this is typically expressed in config).
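For context, a minimal sketch of how image pre-pulling and pre-started placeholder capacity are typically expressed in Zero to JupyterHub (z2jh) style Helm values. The image name, tag, and replica count below are placeholders, not the actual CryoCloud configuration managed by 2i2c:

```yaml
# Sketch only -- assumes standard z2jh chart options; values are illustrative placeholders.
prePuller:
  continuous:
    enabled: true        # pull the user image onto new nodes as they join the cluster
scheduling:
  userPlaceholder:
    enabled: true
    replicas: 4          # keep roughly 4 users' worth of capacity idling on standby
singleuser:
  image:
    name: example.org/icesat2-hackweek/image   # placeholder -- actual image still to be decided
    tag: "2023.03.13"                          # placeholder tag
```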
Thanks for your assistance @consideRatio !
Yes. All CryoCloud JupyterHub Access is going through a Google Form as described here https://book.cryointhecloud.com/content/Getting_Started.html#getting-started. I don't know who all has access to transfer form responses to the GitHub Team (@tsnow03, can you clarify)? We also have these two GitHub Teams in another Org. I don't know if cross-org permissions are possible but I added you to the 2023 organizers team in that Org: (https://github.com/orgs/ICESAT-2HackWeek/teams/2023-participants, https://github.com/orgs/ICESAT-2HackWeek/teams/2023_orgteam)
I personally think a default of 2 CPU / 16 GB would be good for this event, and it's nice to have the 4 CPU / 32 GB option. The current default of 0.5 CPU / 4 GB feels low. Is that easy to adjust for the event?
I think this would be nice (having a +1 spot ready to go at all times next week) but not necessary. I think people are willing to wait 5 minutes and check email, etc., while things spin up. My understanding is that the configurator is not currently being used on CryoCloud, so whatever image is specified in the 2i2c config is what is being used.
Decision on machine type

With users starting up 16 GB or possibly 32 GB servers, it would fit 8 or 4 users on a 128 GB machine, or 32 or 16 users on a 512 GB machine. Waiting for startup is typically something done per machine, so fitting more users on a machine improves the startup experience in general. My current guesstimate on a good compromise is to aim for around 10-40 users per node, so I'm planning on using a 512 GB machine for the attendees.

Decision on profile list config

I've added a new entry that is shown as the first and default entry for users of the team; a rough sketch of the general shape of such an entry follows after this comment.

Expected outcomes
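For illustration only, a hypothetical sketch of the kind of profile list entry described above, assuming a 512 GB AWS node type (r5.16xlarge) and the 2 CPU / 16 GB default discussed earlier. The actual entry added to the CryoCloud configuration may differ, and limiting it to the hackweek GitHub team is handled in the hub's own config rather than shown here:

```yaml
# Hypothetical sketch of a z2jh singleuser.profileList entry; not the actual CryoCloud config.
singleuser:
  profileList:
    - display_name: "ICESat-2 Hackweek 2023"
      description: "2 CPU / 16 GB RAM on a shared 512 GB node"
      default: true                      # shown first and pre-selected
      kubespawner_override:
        cpu_guarantee: 2
        mem_guarantee: 16G
        mem_limit: 16G
        node_selector:
          node.kubernetes.io/instance-type: r5.16xlarge   # assumption: a 512 GB AWS node type
    - display_name: "Large: 4 CPU / 32 GB RAM"
      kubespawner_override:
        cpu_guarantee: 4
        mem_guarantee: 32G
        mem_limit: 32G
        node_selector:
          node.kubernetes.io/instance-type: r5.16xlarge
```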
The event is done and it was awesome :) Thank you 2i2c for providing such a reliable and useful service!
@consideRatio, can you take care of the after-event task, please? Thanks!
Great to hear @scottyhq, thanks for the followup! I'll remove the "ICESAT-2 Hackweek" user server choice at this point, right @scottyhq? We hope that your hub worked out well for you! We are trying to understand where we can improve our hub infrastructure and setup around events, and would love any feedback that you're willing to give. Would you mind answering the following questions? If not, just let us know and that is no problem!
Sorry for the delay, just getting back from vacation. Some answers below:
Yes! It was great. We had about 50 people using the Hub every day for a week, startup times were fast, and we didn't encounter any issues.
This hub has
Real-time collaboration. GPU access. Report Docker Image Info (2i2c-org/features#16).
We relied on this JupyterHub for introducing ~60 scientists to cloud computing and data-proximate computing with public datasets in AWS us-west-2 (in particular NASA's ICESat-2 archive). Teams of 4-8 scientists were able to hit the ground running with a curated Python environment and easily configurable computing resources (CPU, RAM) to run interactive tutorials and sprint on projects for one week. More here: https://github.com/ICESAT-2HackWeek/ICESat-2-Hackweek-2023
The JupyterHub was really fantastic. Thanks again 2i2c!
Summary
ICESat-2 Hackweek 2023 is focused on cloud computing with NASA ICESat-2 data (https://icesat-2-2023.hackweek.io)
Event Info
Hub info
Task List
Before the event
👉Template message to send to community representative
During and after event
👉Template debrief to send to community representative