Can I submit our benchmark result (v0.5 based) now? #63
Pei-Feng,
* I just completed the test of our storage system with mlperf/storage v0.5 in the CLOSED division. I'd like to submit our results. [ . . . ] Do I still have a chance to submit our benchmark results now? If yes, how do I submit? Thank you.
The way that MLCommons works is that each WG has “open windows” when submissions are accepted, and the Storage WG’s next window opens on the 17th of June. The WG is now finalizing the new workloads, the Rules, and the infrastructure for the v1.0 version of MLPerf Storage (the last version was v0.5). The best way to get in sync with what’s going on is to join the WG <https://mlcommons.org/working-groups/benchmarks/storage/>. On our weekly calls (every Friday 8:00am-9:00am Pacific time) we debate how to improve the benchmark and exchange suggestions on how to optimize the performance of the storage system under the benchmark.
Compared to the v0.5 submission round, the v1.0 version brings two new workloads (ResNet-50 and CosmoFlow), retains the Unet-3D workload, and deprecates the BERT workload. It also moves from simulating NVIDIA V100 GPUs to simulating A100 and H100 GPUs. The H100 GPUs are roughly 10x faster than the V100s, so the load on the storage system has increased dramatically since the v0.5 version.
There are several legal documents and agreements that need to be signed before you can submit MLPerf results (e.g., the MLCommons trademark use agreement), so it is advisable to start that process as soon as you can. The onboarding process when you join the WG will cover some of that, and there will be a few meetings for those who intend to submit results where we go through the logistics (e.g., agreements) and processes (e.g., CLOSED versus OPEN submissions, private peer review, results embargo dates) for submitting.
Let me know if you have any questions, and welcome to the WG!
Thanks,
Curtis Anderson
Co-Chair, MLPerf Storage WG
Thank you so much, Curtis! It sounds quite good that ResNet and CosmoFlow are included in v1.0! I'll redo the benchmark tests with our storage system and submit our results to the working group in the next open window.
https://mlcommons.org/benchmarks/storage/ shows benchmark results from a couple of storage vendors, such as DDN and Weka. I just completed the test of our storage system with mlperf/storage v0.5 in the CLOSED division. I'd like to submit our results. It seems that the submission should be started from https://submissions-ui.mlcommons.org/index, but this URL is not accessible now.
Do I still have a chance to submit our benchmark results now? If yes, how do I submit? Thank you.