The following is taken from the --help output of the OOD CLI:
usage: ood.py [-h] [--framework FRAMEWORK] --model_src_path MODEL_SRC_PATH
              --model_metadata MODEL_METADATA [--data_uri DATA_URI]
              [--data_metadata DATA_METADATA]
              [--inference_service INFERENCE_SERVICE]
              --model_dest_path MODEL_DEST_PATH
              [--cos_service_endpoint COS_SERVICE_ENDPOINT]
              [--cos_bucket COS_BUCKET]
              [--cos_auth_endpoint COS_AUTH_ENDPOINT]
              [--cos_apikey COS_APIKEY]
              [--cos_service_instance_id COS_SERVICE_INSTANCE_ID]

Adds out-of-distribution score generation to an existing pytorch or tensorflow
model. A single model is read from Cloud Object Storage (COS) or the file
system, and modified to include an 'ood' capability. Optionally, the OOD
scores generated by the new model can be normalized into a fixed range of 0 to
1 by providing a data set that is used to identify a range of unnormalized OOD
scores. The unnormalized scores are used to develop a scaling function (based
on min/max scores). Producing a normalized model generally takes more time
because inferencing on the given data set is required. When normalizing, and
thus inferencing is required, an inference service must be specified. The
inference service may be a local in-memory service or a kserve-based service.
Once model modification is completed, the model is stored back to the source
of the loaded model (i.e. COS or file system) under a specified path.

optional arguments:
  -h, --help            show this help message and exit

Required named arguments:
  --framework FRAMEWORK
                        The modeling framework to use. One of pytorch or tf.
                        Can also be set using the OOD_FRAMEWORK env var.
  --model_src_path MODEL_SRC_PATH
                        The file system location of the model file. Can also
                        be set using the OOD_MODEL_SRC_PATH env var.
  --model_metadata MODEL_METADATA
                        Framework-specific metadata dictionary describing the
                        model, especially the 'arch' type. Can also be set
                        using the OOD_MODEL_METADATA env var.
  --model_dest_path MODEL_DEST_PATH
                        The file system location (currently a directory)
                        where the new model is to be stored. Can also be set
                        by the OOD_MODEL_DEST_PATH env var.

Optional named arguments:
  --data_uri DATA_URI   Specifies the location of the data set used in
                        normalizing OOD scores. If not specified,
                        normalization will not be performed. Can also be
                        specified with the OOD_DATA_URI env var.
  --data_metadata DATA_METADATA
                        Specifies the information about the data set using a
                        dictionary (i.e. image size, normalization, etc.).
                        Generally required if a data URI is given. Can also
                        be set by the OOD_DATA_METADATA env var.
  --inference_service INFERENCE_SERVICE
                        The type of inference service to use when normalizing
                        the data. One of in-memory or kserve. Can also be set
                        by the OOD_INFERENCE_SERVICE env var. When specifying
                        kserve, additional env vars (url, storage,
                        credentials, etc.) must be specified to configure the
                        kserve connection.
  --cos_service_endpoint COS_SERVICE_ENDPOINT
                        Defines Cloud Object Storage (COS) as the model
                        storage, both for loading and storing of the new
                        model. The value given is a URI for the service. If
                        not specified, file system-based model storage is
                        used. Can also be set by the OOD_COS_SERVICE_ENDPOINT
                        env var.
  --cos_bucket COS_BUCKET
                        Defines the COS bucket to retrieve/store models
                        from/to. Only used if the COS service endpoint is
                        set. Can also be set by the OOD_COS_BUCKET env var.
  --cos_auth_endpoint COS_AUTH_ENDPOINT
                        Defines the COS authentication endpoint URI. Only
                        used if the COS service endpoint is set. Can also be
                        set by the OOD_COS_AUTH_ENDPOINT env var.
  --cos_apikey COS_APIKEY
                        Defines the COS authentication API key. Only used if
                        the COS service endpoint is set. Can also be set by
                        the OOD_COS_APIKEY env var.
  --cos_service_instance_id COS_SERVICE_INSTANCE_ID
                        Defines the COS service instance URI. Only used if
                        the COS service endpoint is set. Can also be set by
                        the OOD_COS_SERVICE_INSTANCE_ID env var.

Examples:

With normalization using the file system...

  python ood.py --framework pytorch \
      --model_src_path resnet50.pt \
      --data_uri flower_photos_small.tar.gz \
      --data_metadata "{'img_height': 224, 'img_width': 224, 'batch_size': 32, \
          'normalize': [[0.485, 0.456, 0.406], [0.229, 0.224, 0.225]]}" \
      --model_metadata "{'type': 'pytorch', 'arch': 'resnet50'}" \
      --inference_service in-memory \
      --model_dest_path ood_resnet50.pt

Without normalization using the file system...

  python ood.py --framework pytorch \
      --model_src_path resnet50.pt \
      --model_metadata "{'type': 'pytorch', 'arch': 'resnet50'}" \
      --model_dest_path ood_resnet50.pt

Without normalization using COS...

  python ood.py --framework pytorch \
      --model_src_path resnet50.pt \
      --model_metadata "{'type': 'pytorch', 'arch': 'resnet50'}" \
      --model_dest_path ood_resnet50.pt \
      --cos_service_endpoint https://s3.us-east.cloud-object-storage.appdomain.cloud \
      --cos_bucket mybucket \
      --cos_service_instance_id crn:v1:bluemix:public:cloud-object-storage:global:a/74c6d94213d643b5b53bfbeb3c1e8de0:f9de716e-5b49-41a0-8c10-e0cd4127b192:: \
      --cos_auth_endpoint https://iam.cloud.ibm.com/oidc/token \
      --cos_apikey <YOUR APIKEY HERE>
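As the help text notes, every flag can also be supplied through its corresponding OOD_* environment variable, which can be convenient in containerized or scripted deployments. A minimal sketch, assuming only that each variable is read exactly as named in the help text and using the same values as the "without normalization" example above:

  # Equivalent to the file-system example above, but configured via env vars
  export OOD_FRAMEWORK=pytorch
  export OOD_MODEL_SRC_PATH=resnet50.pt
  export OOD_MODEL_METADATA="{'type': 'pytorch', 'arch': 'resnet50'}"
  export OOD_MODEL_DEST_PATH=ood_resnet50.pt

  python ood.py

The same pattern applies to the COS settings (OOD_COS_SERVICE_ENDPOINT, OOD_COS_BUCKET, OOD_COS_AUTH_ENDPOINT, OOD_COS_APIKEY, OOD_COS_SERVICE_INSTANCE_ID) when the model is loaded from and stored to COS.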