GITBOOK-6: No subject
Gonmeso authored and gitbook-bot committed Feb 20, 2024
1 parent 353dfc6 commit fa53627
Showing 7 changed files with 216 additions and 56 deletions.
2 changes: 2 additions & 0 deletions docs/SUMMARY.md
@@ -20,10 +20,12 @@

* [Cairo](frameworks/cairo/README.md)
* [Transpile](frameworks/cairo/transpile.md)
* [Deploy](frameworks/cairo/deploy.md)
* [Prove](frameworks/cairo/prove.md)
* [Verify](frameworks/cairo/verify.md)
* [EZKL](frameworks/ezkl/README.md)
* [Transpile](frameworks/ezkl/transpile.md)
* [Deploy](frameworks/ezkl/deploy.md)
* [Prove](frameworks/ezkl/prove.md)
* [Verify](frameworks/ezkl/verify.md)

3 changes: 2 additions & 1 deletion docs/frameworks/cairo/README.md
@@ -7,6 +7,7 @@ In the realm of Giza, Cairo serves as the backbone for generating provable machi
## Cairo Framework Features

1. Transpilation Process: Dive into Giza's streamlined process to seamlessly convert ONNX machine learning models into Cairo code. This transformation ensures optimal compatibility and functionality within the Cairo ecosystem, leveraging the power of [✨Orion✨](https://github.com/gizatechxyz/orion). More on this in the [transpile documentation for cairo](transpile.md).
2. Creating Verifiable Proofs: Use the transpiled model to generate the `casm.json` file used to create a proof and validate the correctness and reliability of your transformed models. Discover how Giza harnesses Cairo's robust features to create evidence ensuring the credibility of your machine learning outputs. More on this in the [prove documentation for cairo](prove.md).
2. Creating Verifiable Proofs: Use the transpiled model to generate the trace and memory files used to create a proof and validate the correctness and reliability of your transformed models. Discover how Giza harnesses Cairo's robust features to create evidence ensuring the credibility of your machine learning outputs. More on this in the [prove documentation for cairo](prove.md).
3. Deploying verifiable models: deploy a verifiable model ready to accept requests, abstracting proof creation away from you. The model is offered as an easily available API to ease integration and usage. For further information, look at the [deploy documentation for cairo](deploy.md).

[Cairo official documentation](https://docs.cairo-lang.org/)
67 changes: 67 additions & 0 deletions docs/frameworks/cairo/deploy.md
@@ -0,0 +1,67 @@
# Deploy

To deploy a model, you must first have a version of that model. If you have not yet created a version, please refer to the [versions](../../resources/versions.md) documentation.

To create a new service, users can employ the `deploy` command. This command facilitates the deployment of a machine learning service ready to accept predictions at the `/cairo_run` endpoint, providing a straightforward method for deploying and using machine learning capabilities.

```
> giza deployments deploy --model-id 1 --version-id 1 model.sierra
▰▰▰▰▰▱▱ Creating deployment!
[giza][2024-02-07 12:31:02.498] Deployment is successful ✅
[giza][2024-02-07 12:31:02.501] Deployment created with id -> 1 ✅
[giza][2024-02-07 12:31:02.502] Deployment created with endpoint URL: https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app 🎉
```

If a model is fully compatible, the sierra file is not needed and the model can be deployed without including it in the command:

```
> giza deployments deploy --model-id 1 --version-id 1
▰▰▰▰▰▱▱ Creating deployment!
[giza][2024-02-07 12:31:02.498] Deployment is successful ✅
[giza][2024-02-07 12:31:02.501] Deployment created with id -> 1 ✅
[giza][2024-02-07 12:31:02.502] Deployment created with endpoint URL: https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app 🎉
```

{% hint style="danger" %}
For a partially compatible model, the sierra file must be provided; otherwise an error will be shown.
{% endhint %}

### Example request

Now our service is ready to accept predictions at the provided endpoint URL. To test this, we can use the `curl` command to send a POST request to the endpoint with a sample input.

```
> curl -X POST https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app/cairo_run \
    -H "Content-Type: application/json" \
    -d '{
          "args": "[2 2] [1 2 3 4]"
        }' | jq
{
  "result": [0.1234],
  "request_id": "b14bfbcf250b404192765d9be0811c9b"
}
```

There is an extra argument, `job_size`, that can be used in each request to specify the size of the proving job so that it has more CPU and memory available to generate the proof. An example:

```
> curl -X POST https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app/cairo_run \
    -H "Content-Type: application/json" \
    -d '{
          "args": "[2 2] [1 2 3 4]",
          "job_size": "M"
        }'
```

Available sizes are `S`, `M`, `L`, and `XL`, each with different usage limits.

## Download the proof

We can download the proof using the `download-proof` command available for the deployments: 

<pre class="language-sh"><code class="lang-sh"><strong>❯ giza deployments download-proof --model-id 1 --version-id 1 --deployment-id 1 --proof-id "b14bfbcf250b404192765d9be0811c9b"
</strong>[giza][2024-02-20 15:40:48.560] Getting proof from deployment 1 ✅
[giza][2024-02-20 15:40:49.288] Proof downloaded to zk.proof ✅
</code></pre>

The `proof id` used is the `request_id` returned in the response.
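
As a quick end-to-end sketch (illustrative only, reusing the example endpoint URL and IDs above and assuming `jq` is installed), the `request_id` from the prediction response can be captured and passed straight to `download-proof`:

```sh
# Sketch only: endpoint URL and IDs reuse the example values from this page.
# Capture the request_id returned by the /cairo_run prediction.
REQUEST_ID=$(curl -s -X POST https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app/cairo_run \
    -H "Content-Type: application/json" \
    -d '{"args": "[2 2] [1 2 3 4]"}' | jq -r '.request_id')

# Use it as the proof id; the proving job may take a while, so the proof
# might only be downloadable once the job has finished.
giza deployments download-proof --model-id 1 --version-id 1 --deployment-id 1 --proof-id "$REQUEST_ID"
```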
17 changes: 5 additions & 12 deletions docs/frameworks/ezkl/README.md
@@ -6,16 +6,9 @@

## EZKL Framework Features

1. Transpilation Process
1. Transpilation Process: this refers to performing the `setup` for the model, where an ONNX model and a set of input data are used to generate the `circuit settings`, `proving key`, and `verification key` files that are later used to perform the inference. In Giza we aim to provide an easy way to perform this step by providing compute resources and a simple command-line interface. More on this in the [transpile documentation for ezkl](transpile.md).
2. Creating Verifiable Proofs: create a proof using the outputs generated by the setup process. We handle this step for you by loading the necessary files and performing the proof generation. More on this in the [prove documentation for ezkl](prove.md).
3. Verifying Proofs: verify the proof generated by an `ezkl` version. We manage the execution of the verification process, as well as the compute, for you. More on this in the [verify documentation for ezkl](verify.md).
4. Deploying verifiable models: deploy a verifiable model ready to accept requests, abstracting proof creation away from you. The model is offered as an easily available API to ease integration and usage. For further information, look at the [deploy documentation for ezkl](deploy.md).

The transpilation process refers to performing the `setup` for the model where an ONNX model and a set of input data are used to generate the `circuit settings`, `proving key` and `verification key` files that are used to perform the inference. In Giza we aim to provide an easy way to perform this step providing compute resources and a simple command line interface. More on this in the [transpile documentation for ezkl](transpile.md).

1. Creating Verifiable Proofs

Create a proof using the generated outputs of the setup process, we handle this step for you by loading the necessary files and performing the proof generation for you. More on this in the [prove documentation for ezkl](prove.md).

1. Verifying Proofs

Verify the proof generated by an `ezkl` version, here we manage the execution of the verification process and the compute as well for you. More on this in the [verify documentation for ezkl](verify.md)

For detailed information check the [ezkl repository](https://github.com/zkonduit/ezkl)
For more detailed information about the framework, check the [ezkl repository](https://github.com/zkonduit/ezkl).
85 changes: 85 additions & 0 deletions docs/frameworks/ezkl/deploy.md
@@ -0,0 +1,85 @@
# Deploy

To deploy a model, you must first have a version of that model. If you have not yet created a version, please refer to the [versions](../../resources/versions.md) documentation.

To create a new service, users can employ the `deploy` command. This command facilitates the deployment of a machine learning service ready to accept predictions at the `/predict` endpoint, providing a straightforward method for deploying and using machine learning capabilities. As we are using `EZKL`, we need to add `--framework EZKL` (or `-f EZKL` for short) to the command:

{% code overflow="wrap" %}
```shell
> giza deployments deploy --model-id 1 --version-id 1 --framework EZKL
▰▰▰▰▰▱▱ Creating deployment!
[giza][2024-02-07 12:31:02.498] Deployment is successful ✅
[giza][2024-02-07 12:31:02.501] Deployment created with id -> 1 ✅
[giza][2024-02-07 12:31:02.502] Deployment created with endpoint URL: https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app 🎉
```
{% endcode %}

## Example Request

Now the model is available to generate predictions and proofs of those predictions. The schema of the data is the same as the one used to create the `input.json` needed to create a version; for a linear regression it would be:

```json
{
  "input_data": [
    [
      0.12177091836929321,
      0.7048522233963013
    ]
  ]
}
```

To execute a prediction using **cURL**:

```sh
curl https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app/predict \
  -H "Content-Type: application/json" -d '{
    "input_data": [
      [
        0.12177091836929321,
        0.7048522233963013
      ]
    ]
  }' | jq
```

This yields the following response:

```json
{
  "prediction": [
    [
      4.53125
    ]
  ],
  "request_id": "d0564505755944b8bef9292d980f3e27"
}
```

There is an extra argument, `job_size`, that can be used in each request to specify the size of the proving job so that it has more CPU and memory available to generate the proof. An example:

```sh
curl https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app/predict \
  -H "Content-Type: application/json" -d '{
    "input_data": [
      [
        0.12177091836929321,
        0.7048522233963013
      ]
    ],
    "job_size": "M"
  }' | jq
```

Available sizes are `S`, `M`, `L`, and `XL`, each with different usage limits.

## Download the proof

We can download the proof using the `download-proof` command available for the deployments:

<pre class="language-sh"><code class="lang-sh"><strong>❯ giza deployments download-proof --model-id 1 --version-id 1 --deployment-id 1 --proof-id "d0564505755944b8bef9292d980f3e27"
</strong>[giza][2024-02-20 15:40:48.560] Getting proof from deployment 1 ✅
[giza][2024-02-20 15:40:49.288] Proof downloaded to zk.proof ✅
</code></pre>

The `proof id` used is the `request_id` returned in the response.
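
As with the Cairo deployments, the flow can be scripted end to end. The sketch below (illustrative only, reusing the example endpoint URL and IDs above and assuming `jq` is installed) captures the `request_id` from `/predict` and feeds it to `download-proof`:

```sh
# Sketch only: endpoint URL and IDs reuse the example values from this page.
REQUEST_ID=$(curl -s https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app/predict \
    -H "Content-Type: application/json" \
    -d '{"input_data": [[0.12177091836929321, 0.7048522233963013]]}' | jq -r '.request_id')

# The request_id doubles as the proof id; the proof can be downloaded once the proving job has finished.
giza deployments download-proof --model-id 1 --version-id 1 --deployment-id 1 --proof-id "$REQUEST_ID"
```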
96 changes: 53 additions & 43 deletions docs/resources/deployments.md
@@ -1,63 +1,31 @@
# Deployments

Deployments in our platform provide a mechanism for creating services that accept predictions via a designated endpoint. These services, based on existing platform versions, leverage Cairo under the hood to ensure provable inferences. Using the CLI, users can effortlessly deploy and retrieve information about these machine learning services.
Deployments in our platform provide a mechanism for creating services that accept predictions via a designated endpoint. These services, based on existing platform versions, leverage Cairo and EZKL under the hood to ensure provable inferences. Using the CLI, users can effortlessly deploy and retrieve information about these machine learning services.

## Deploying a model

{% hint style="info" %}
**Note:** This is explained extensively in the deploy documentation ([Orion Cairo](../frameworks/cairo/deploy.md) and [EZKL](../frameworks/ezkl/deploy.md)).
{% endhint %}

To deploy a model, you must first have a version of that model. If you have not yet created a version, please refer to the [versions](versions.md) documentation.

To create a new service, users can employ the `deploy` command. This command facilitates the deployment of a machine learning service ready to accept predictions at the `/cairo_run` endpoint, providing a straightforward method for deploying and utilizing machine learning capabilities.
To create a new service, users can employ the `deploy` command. This command facilitates the deployment of a machine learning service ready to accept predictions at a specific endpoint, providing a straightforward method for deploying and using machine learning capabilities.

{% code overflow="wrap" %}
```console
> giza deployments deploy --model-id 1 --version-id 1 model.sierra
▰▰▰▰▰▱▱ Creating deployment!
[giza][2024-02-07 12:31:02.498] Deployment is successful ✅
[giza][2024-02-07 12:31:02.501] Deployment created with id -> 1 ✅
[giza][2024-02-07 12:31:02.502] Deployment created with endpoint URL: https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app 🎉
```
{% endcode %}

If a model is fully compatible the sierra file is not needed and can be deployed without using it in the command:
Now a version will be deployed and ready to use as an API. For specific documentation on deploying with the different frameworks, see:

```
> giza deployments deploy --model-id 1 --version-id 1
▰▰▰▰▰▱▱ Creating deployment!
[giza][2024-02-07 12:31:02.498] Deployment is successful ✅
[giza][2024-02-07 12:31:02.501] Deployment created with id -> 1 ✅
[giza][2024-02-07 12:31:02.502] Deployment created with endpoint URL: https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app 🎉
```

{% hint style="danger" %}
For a partially compatible model the sierra file must be provided, if not an error will be shown.
{% endhint %}

### Example request

Now our service is ready to accept predictions at the provided endpoint URL. To test this, we can use the `curl` command to send a POST request to the endpoint with a sample input.

```console
> curl -X POST https://deployment-gizabrain-38-1-53427f44-dagsgas-ew.a.run.app/cairo_run \
-H "Content-Type: application/json" \
-d '{
"args": "[\"2\", \"2\", \"2\", \"4\", \"1\", \"2\", \"3\", \"4\"]"
}' | jq
{
"result": [
{
"value": {
"val": [
1701737587,
1919382893,
1869750369,
1852252262,
1864395887,
1948284015,
1231974517
]
}
}
]
}
```
* [Orion Cairo deploy](../frameworks/cairo/deploy.md)
* [EZKL deploy](../frameworks/ezkl/deploy.md)

## Listing deployments

@@ -97,3 +65,45 @@ For retrieving detailed information about a specific deployment, users can utili
"version_id": 1
}
```

## Listing proofs

You can list all of the proofs generated by a deployment using the `list-proofs` command.

```sh
❯ giza deployments list-proofs --model-id 1 --version-id 1 --deployment-id 1
[giza][2024-02-20 15:55:08.032] Getting proofs from deployment 1 ✅
[
  {
    "id": 1,
    "job_id": 1,
    "metrics": {
      "proving_time": 17.00862145423889
    },
    "created_date": "2024-02-19T15:34:52.363483"
  },
  {
    "id": 2,
    "job_id": 2,
    "metrics": {
      "proving_time": 17.20691967010498
    },
    "created_date": "2024-02-19T17:21:03.310809"
  }
]
```
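
If you only need the proving times, a small `jq` filter over that output works well. The snippet below is a sketch that assumes the JSON array shown above has been saved to a hypothetical `proofs.json` file (for example by copying the JSON portion of the command output):

```sh
# Print each proof id with its proving time, assuming proofs.json holds the JSON array above.
jq -r '.[] | "proof \(.id): \(.metrics.proving_time) s"' proofs.json
```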

## Download a Proof

To download a proof, the `download-proof` command is available to help with this. Proofs can be retrieved either by `proof_id` or by the `request_id` returned in the prediction.

<pre class="language-sh"><code class="lang-sh"><strong>❯ giza deployments download-proof --model-id 1 --version-id 1 --deployment-id 1 --proof-id "d0564505755944b8bef9292d980f3e27"
</strong>[giza][2024-02-20 15:40:48.560] Getting proof from deployment 1 ✅
[giza][2024-02-20 15:40:49.288] Proof downloaded to zk.proof ✅
</code></pre>

The proof id used above is the `request_id` returned in the response, but the integer `proof_id` can be used as well:

<pre><code><strong>❯ giza deployments download-proof --model-id 1 --version-id 1 --deployment-id 1 --proof-id 1
</strong>[giza][2024-02-20 15:40:48.560] Getting proof from deployment 1 ✅
[giza][2024-02-20 15:40:49.288] Proof downloaded to zk.proof ✅
</code></pre>
2 changes: 2 additions & 0 deletions docs/welcome/installation.md
@@ -0,0 +1,2 @@
# Installation
