Machine Learning for openEO #441
base: draft
Conversation
Variants discussed in the ML meeting:
Some things from the ML meeting: regularization may consist of several steps (each mapped to openEO processes):
-> combine these into a new openEO process with the arguments that are commonly used and reasonable defaults (see the sketch below)
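As an illustration only, here is a rough sketch of how such a combined process could be invoked through the openeo Python client's generic process() helper. The process name "regularize", its arguments, the backend URL and the collection id are placeholders for whatever the meeting outcome turns into, not existing openEO processes:

```python
import openeo

# Sketch only: "regularize" and its arguments are placeholders for the combined
# regularization process discussed above; backend URL and collection id are examples.
connection = openeo.connect("https://openeo.example")
cube = connection.load_collection("SENTINEL2_L2A", bands=["B04", "B08"])

regularized = cube.process(
    "regularize",
    data=cube,
    frequency="P10D",   # assumed argument: target temporal spacing
    method="linear",    # assumed argument: interpolation / gap-filling method
)
```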
Force-pushed from 22be7a9 to e854271
@m-mohr, is there a reason why the keyword fit is used in the naming convention instead of train?
Just to align with fit_curve, I guess. Train is also fine...
Cool
Looks good. Approving this now. We can continuously improve during implementations if need be.
@m-mohr, would renaming these two to use an "ml" prefix be a possible alternative?
Why? The current proposal follows the load_* and save_result schema.
Cool, it makes sense to follow the existing schema. My initial thought was that it would be easier for a general user if most of the ML operations started with a common prefix, i.e. "ml_".
The STAC ML Model extension may get deprecated in favor of https://github.com/crim-ca/dlm-extension |
Sure, thanks. I just saw the notification about it; I will follow up with them.
{
  "id": "save_ml_model",
  "summary": "Save a ML model",
  "description": "Saves a machine learning model as part of a batch job.\n\nThe model will be accompanied by a separate STAC Item that implements the [ml-model extension](https://github.com/stac-extensions/ml-model).",
The model will be accompanied by a separate STAC Item that
What does "accompanied" practically mean? Should there be an additional job result asset? Or should this be a job result link item?
The reason I'm asking is that we want to streamline the detection of the model's URL on the client side.
E.g. see Open-EO/openeo-python-client#576, where we currently have a highly implementation-specific hack:
ml_model_metadata_url = [
    link
    for link in links
    if 'ml_model_metadata.json' in link['href']
][0]['href']
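For comparison, a minimal sketch of a slightly more defensive version of that lookup. It is still a heuristic on the href (the "ml_model_metadata.json" filename is implementation-specific), which is exactly why a clarified meaning of "accompanied" in the spec would help:

```python
# Sketch only: still matches on an implementation-specific filename in the href.
# A clarified spec (dedicated asset role or link relation) would remove this guesswork.
def find_ml_model_metadata_url(links: list) -> str:
    candidates = [
        link["href"]
        for link in links
        if "ml_model_metadata.json" in link.get("href", "")
    ]
    if not candidates:
        raise ValueError("No ML model metadata link found in job result links")
    return candidates[0]
```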
Good question, I guess we should clarify that.
On the other hand, please note that this PR is implicitly outdated as the ML Model extension in STAC is likely going to be replaced by another extension. So this generally needs more work (which I have no plans to do anytime soon).
ML Model extension in STAC is likely going to be replaced by another extension.
Can you point to the new one, @m-mohr?
Potentially interesting for "bring your own model": https://onnx.ai/