Exploring re-homing OPEA from "LF AI and Data" to "CNCF" #302
Would all OPEA TSC members please comment on this issue with +1 if they approve or -1 if they disapprove, and share any other thoughts. While we brought this up at our Feb 2025 TSC meeting, this discussion serves to formally record all votes.
/vote
👍🏻
👍
👍
If you're a steering committee member, please make sure to put your name in the comment or have it associated with your GitHub account, as only TSC members' votes count here.
Please let us keep the focus on re-homing OPEA. Thank you for your cooperation.
👍
Melissa McKay, JFrog: +1 binding
When OPEA was first launched in May 2024, "Linux Foundation AI and Data" appeared to be the best home given its AI focus and the fact that OPEA was all about GenAI. Since then, CNCF, recognizing that a large portion of future cloud workloads will be AI, and GenAI more specifically, has launched an AI Workgroup that is active with regular meetings, published the "Cloud Native and AI Whitepaper" (https://www.cncf.io/reports/cloud-native-artificial-intelligence-whitepaper/), is planning an AI security whitepaper, and is even interested in offering a Kubernetes-based AI playground.
With its AI inference and RAG focus, OPEA has been designed from day one to be cloud native: its components are all delivered as containerized microservices, deployment recipes are provided for single-node Docker and for Kubernetes clusters, and it encapsulates cloud best practices for scaling, load balancing, ingress control, and authentication and authorization, to name a few.
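For illustration only, here is a minimal sketch of what such a Kubernetes deployment recipe for one containerized microservice can look like; the service name, image, and port below are hypothetical placeholders, not actual OPEA manifests:

```yaml
# Hypothetical sketch: one OPEA-style microservice exposed behind a ClusterIP Service.
# Names, image, and port are placeholders, not taken from the OPEA repositories.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-embedding-svc
spec:
  replicas: 2                      # scaled horizontally like any stateless cloud-native workload
  selector:
    matchLabels:
      app: example-embedding-svc
  template:
    metadata:
      labels:
        app: example-embedding-svc
    spec:
      containers:
        - name: embedding
          image: example.org/opea/embedding:latest   # placeholder image reference
          ports:
            - containerPort: 6000
---
apiVersion: v1
kind: Service
metadata:
  name: example-embedding-svc
spec:
  selector:
    app: example-embedding-svc
  ports:
    - port: 6000
      targetPort: 6000
```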
In this sense, with the increased focus on AI in CNCF, re-homing OPEA in CNCF would create greater synergies with CNCF's large developer base, industry involvement, and conference opportunities. OPEA was most recently presented at KubeCon NA 2024 and KubeCon India 2024. OPEA GenAIExamples would be a great hands-on practice workload for the CNCF AI playground. OPEA would also be a great test vehicle for upcoming Kubernetes scheduler features for gang scheduling and batch scheduling, and for using OCI-compliant container layers as persistent volumes.
We do acknowledge that the CNCF project approval and graduation process is long and meticulous: the sandbox, incubating, and graduated milestones require meeting governance, inclusiveness, contributor diversity, code quality, and production adoption requirements. These are all quality metrics that any good project should meet, and thus a bar worth striving for.