chore: Concrete-ML -> Concrete ML
jfrery committed Dec 17, 2024
1 parent 333c46d commit 4922992
Showing 18 changed files with 73 additions and 73 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/ci_timing.yaml
@@ -1,4 +1,4 @@
-# This workflow uses GitHub CLI to get timings of last 50 runs of Concrete-ML main CI
+# This workflow uses GitHub CLI to get timings of last 50 runs of Concrete ML main CI
# and send it to slack and add it as an artifact on the workflow
name: CML build time
on:
4 changes: 2 additions & 2 deletions .github/workflows/release.yaml
@@ -288,7 +288,7 @@ jobs:
tags: true

# This action creates docker and pypi images directly on the AWS EC2 instance
-# The 'PRIVATE_RELEASE_IMAGE_BASE' variable is kept here in case Concrete-ML starts to publish
+# The 'PRIVATE_RELEASE_IMAGE_BASE' variable is kept here in case Concrete ML starts to publish
# private nightly releases one day. Currently, release candidates and actual releases are all
# done through the 'PUBLIC_RELEASE_IMAGE_BASE' image. The private image is also used to list all
# tags easily
@@ -471,7 +471,7 @@ jobs:
echo "" >> "${SECRETS_FILE}"
echo "SECRETS_FILE=${SECRETS_FILE}" >> "$GITHUB_ENV"
-- name: Build Docker Concrete-ML Image
+- name: Build Docker Concrete ML Image
if: ${{ success() && !cancelled() }}
uses: docker/build-push-action@48aba3b46d1b1fec4febb7c5d0c644b249a11355
with:
18 changes: 9 additions & 9 deletions docs/advanced_examples/DecisionTreeRegressor.ipynb
@@ -6,9 +6,9 @@
"id": "5755bc04",
"metadata": {},
"source": [
"# Decision Tree Regression Using Concrete-ML\n",
"# Decision Tree Regression Using Concrete ML\n",
"\n",
"In this tutorial, we show how to create, train and evaluate a decision tree regression model using Concrete-ML library.\n",
"In this tutorial, we show how to create, train and evaluate a decision tree regression model using Concrete ML library.\n",
"\n"
]
},
@@ -18,16 +18,16 @@
"id": "2c256087-c16a-4249-9c90-3f4863938385",
"metadata": {},
"source": [
"### Introducing Concrete-ML\n",
"### Introducing Concrete ML\n",
"\n",
"> Concrete-ML is an open-source, privacy-preserving, machine learning inference framework based on fully homomorphic encryption (FHE).\n",
"> Concrete ML is an open-source, privacy-preserving, machine learning inference framework based on fully homomorphic encryption (FHE).\n",
"> It enables data scientists without any prior knowledge of cryptography to automatically turn machine learning models into their FHE equivalent,using familiar APIs from Scikit-learn and PyTorch.\n",
"> <cite>&mdash; [Zama documentation](../README.md)</cite>\n",
"\n",
"This tutorial does not require a deep understanding of the technology behind concrete-ML.\n",
"Nonetheless, newcomers might be interested in reading introductory sections of the official documentation such as:\n",
"\n",
"- [What is Concrete-ML](../README.md)\n",
"- [What is Concrete ML](../README.md)\n",
"- [Key Concepts](../getting-started/concepts.md)\n",
"\n",
"In the tutorial, we will be using the following terminology:\n",
@@ -233,10 +233,10 @@
"source": [
"## Training A Decision Tree\n",
"\n",
"ConcreteDecisionTreeRegressor is the Concrete-ML equivalent of scikit-learn's DecisionTreeRegressor.\n",
"ConcreteDecisionTreeRegressor is the Concrete ML equivalent of scikit-learn's DecisionTreeRegressor.\n",
"It supports the same parameters and a similar interface, with the extra capability of predicting directly on ciphertext without the need to decipher it, thus preservacy privacy.\n",
"\n",
"Currently, Concrete-ML models must be trained on plaintext. To see how it works, we train a DecisionTreeRegressor with default parameters and estimate its accuracy on test data. Note here that predictions are done on plaintext too, but soon, we will predict on ciphertext."
"Currently, Concrete ML models must be trained on plaintext. To see how it works, we train a DecisionTreeRegressor with default parameters and estimate its accuracy on test data. Note here that predictions are done on plaintext too, but soon, we will predict on ciphertext."
]
},
{
@@ -479,7 +479,7 @@
"source": [
"## Predicting on Ciphertext\n",
"If the predictions are similar although slightly less accurate, the real advantage of ConcreteML is privacy.\n",
"We now show how we can perform prediction on ciphertext with Concrete-ML, so that the model does not need to decipher the data at all to compute its estimate."
"We now show how we can perform prediction on ciphertext with Concrete ML, so that the model does not need to decipher the data at all to compute its estimate."
]
},
{
@@ -798,7 +798,7 @@
"Once the model is carefully trained and quantized, it is ready to be deployed and used in production. Here are some useful links on the subject:\n",
" \n",
" - [Inference in the Cloud](../getting-started/cloud.md) summarize the steps for cloud deployment\n",
" - [Production Deployment](../guides/client_server.md) offers a high-level view of how to deploy a Concrete-ML model in a client/server setting.\n",
" - [Production Deployment](../guides/client_server.md) offers a high-level view of how to deploy a Concrete ML model in a client/server setting.\n",
" - [Client Server in Concrete ML](./ClientServer.ipynb) provides a more hands-on approach as another tutorial."
]
}
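
For readers skimming this diff, the DecisionTreeRegressor notebook above boils down to: train on plaintext with a scikit-learn-like API, compile, then predict on encrypted data. The following is a minimal sketch of that workflow; the synthetic dataset, the import alias, and the hyper-parameters (`max_depth=4`, `n_bits=6`) are illustrative assumptions, not values taken from the notebook.

```python
from sklearn.datasets import make_regression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Alias mirrors the notebook's "ConcreteDecisionTreeRegressor" naming (assumed import).
from concrete.ml.sklearn import DecisionTreeRegressor as ConcreteDecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=4, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Train on plaintext, exactly as with scikit-learn's DecisionTreeRegressor.
model = ConcreteDecisionTreeRegressor(max_depth=4, n_bits=6, random_state=0)
model.fit(X_train, y_train)

# Clear (quantized) predictions, useful to check accuracy before moving to FHE.
y_pred_clear = model.predict(X_test)
print("R2 in the clear:", r2_score(y_test, y_pred_clear))

# Compile to an FHE circuit, then predict on encrypted data (a couple of samples only,
# since FHE inference is much slower than clear inference).
model.compile(X_train)
y_pred_fhe = model.predict(X_test[:2], fhe="execute")
print("FHE predictions:", y_pred_fhe)
```
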
4 changes: 2 additions & 2 deletions docs/advanced_examples/LinearSVR.ipynb
@@ -88,7 +88,7 @@
"\n",
"\n",
"def get_concrete_plot_config(mse_score=None):\n",
" label = \"Concrete-ML\"\n",
" label = \"Concrete ML\"\n",
" if mse_score is not None:\n",
" label += f\", {'$MSE$'}={mse_score:.4f}\"\n",
" return {\"c\": \"orange\", \"linewidth\": 2.5, \"label\": label}"
@@ -646,7 +646,7 @@
"y_pred_sklearn = sklearn_rgs.predict(X_test)\n",
"print(f\"Execution time: {(time.time() - time_begin) / len(X_test):.4f} seconds per sample\")\n",
"\n",
"# Now predict using clear quantized Concrete-ML model on testing set\n",
"# Now predict using clear quantized Concrete ML model on testing set\n",
"time_begin = time.time()\n",
"y_preds_quantized = concrete_rgs.predict(X_test)\n",
"print(f\"Execution time: {(time.time() - time_begin) / len(X_test):.4f} seconds per sample\")"
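
The LinearSVR fragments above time scikit-learn (`sklearn_rgs`) against the clear, quantized Concrete ML model (`concrete_rgs`). A self-contained sketch of that comparison is given below; the synthetic dataset, the 8-bit quantization setting, and the regressor hyper-parameters are assumptions for illustration.

```python
import time

from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVR as SklearnLinearSVR

from concrete.ml.sklearn import LinearSVR as ConcreteLinearSVR

X, y = make_regression(n_samples=500, n_features=3, noise=5.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Float scikit-learn baseline.
sklearn_rgs = SklearnLinearSVR(max_iter=10_000)
sklearn_rgs.fit(X_train, y_train)

# Quantized Concrete ML equivalent (n_bits=8 is an assumed setting).
concrete_rgs = ConcreteLinearSVR(n_bits=8, max_iter=10_000)
concrete_rgs.fit(X_train, y_train)

# Compare per-sample latency of clear predictions, as the notebook does.
for name, model in [("scikit-learn", sklearn_rgs), ("Concrete ML (clear, quantized)", concrete_rgs)]:
    time_begin = time.time()
    model.predict(X_test)
    print(f"{name}: {(time.time() - time_begin) / len(X_test):.6f} seconds per sample")
```
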
8 changes: 4 additions & 4 deletions docs/advanced_examples/RegressorComparison.ipynb
@@ -210,7 +210,7 @@
" # Instantiate the model\n",
" model = regressor()\n",
"\n",
" # Train the model and retrieve both the Concrete-ML model and its equivalent one from\n",
" # Train the model and retrieve both the Concrete ML model and its equivalent one from\n",
" # scikit-learn\n",
" # If the model is a NeuralNetClassifier, instantiate a scikit-learn MLPClassifier\n",
" # separately in order to be able to be able to compare the results with a float model\n",
@@ -249,7 +249,7 @@
" time_end = time.time()\n",
" print(f\"Key generation time: {time_end - time_begin:.2f} seconds\")\n",
"\n",
" # Compute the predictions in FHE using the Concrete-ML model\n",
" # Compute the predictions in FHE using the Concrete ML model\n",
" time_begin = time.time()\n",
" concrete_y_pred = concrete_model.predict(X_poly_test[:1], fhe=\"execute\")\n",
" time_end = time.time()\n",
@@ -276,15 +276,15 @@
" bitwidth = circuit.graph.maximum_integer_bit_width()\n",
"\n",
" # Plot the predictions\n",
" ax.plot(X_test, concrete_y_pred, c=\"blue\", linewidth=2.5, label=\"Concrete-ML\")\n",
" ax.plot(X_test, concrete_y_pred, c=\"blue\", linewidth=2.5, label=\"Concrete ML\")\n",
"\n",
" # Plot the predictions\n",
" ax.plot(X_test, sklearn_y_pred, c=\"red\", linewidth=2.5, label=\"scikit-learn\")\n",
"\n",
" ax.text(\n",
" 0.5,\n",
" 0.80,\n",
" f\"Concrete-ML R2: {concrete_score:.2f}\\n scikit-learn R2: {sklearn_score:.2f}\\n\",\n",
" f\"Concrete ML R2: {concrete_score:.2f}\\n scikit-learn R2: {sklearn_score:.2f}\\n\",\n",
" transform=ax.transAxes,\n",
" fontsize=12,\n",
" va=\"top\",\n",
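
The RegressorComparison cells above reference the compiled circuit's maximum bit-width, key generation, and a single prediction with `fhe="execute"`. The condensed sketch below strings those steps together on an assumed LinearRegression model and synthetic data; the explicit `circuit.client.keygen(force=False)` call is an assumption, since the keygen line itself is elided from this hunk.

```python
import time

from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

from concrete.ml.sklearn import LinearRegression as ConcreteLinearRegression

X, y = make_regression(n_samples=200, n_features=2, noise=10.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

concrete_model = ConcreteLinearRegression(n_bits=8)
concrete_model.fit(X_train, y_train)

# Compilation returns the FHE circuit; its integer bit-width largely drives FHE latency.
circuit = concrete_model.compile(X_train)
print("Maximum integer bit-width:", circuit.graph.maximum_integer_bit_width())

# Generate keys explicitly so the first encrypted prediction is not penalized.
time_begin = time.time()
circuit.client.keygen(force=False)
print(f"Key generation time: {time.time() - time_begin:.2f} seconds")

# Compute one prediction in FHE, as the notebook does on a single test sample.
time_begin = time.time()
concrete_y_pred = concrete_model.predict(X_test[:1], fhe="execute")
print(f"FHE inference time: {time.time() - time_begin:.2f} seconds")
```
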
36 changes: 18 additions & 18 deletions docs/advanced_examples/SVMClassifier.ipynb
@@ -6,12 +6,12 @@
"id": "d07c3896",
"metadata": {},
"source": [
"# Support Vector Machine (SVM) classification using Concrete-ML\n",
"# Support Vector Machine (SVM) classification using Concrete ML\n",
"\n",
" In this tutorial, we show how to create, train, and evaluate a Support Vector Machine (SVM) model using Concrete-ML library for a classification task.\n",
" In this tutorial, we show how to create, train, and evaluate a Support Vector Machine (SVM) model using Concrete ML library for a classification task.\n",
"\n",
"It is cut in 2 parts:\n",
"1. a quick setup of a LinearSVC model with Concrete-ML\n",
"1. a quick setup of a LinearSVC model with Concrete ML\n",
"2. a more in-depth approach taking a closer look to the concrete-ml specifics\n"
]
},
@@ -30,23 +30,23 @@
"id": "d3654d52",
"metadata": {},
"source": [
"### Concrete-ML and useful links\n",
"### Concrete ML and useful links\n",
"\n",
"> Concrete-ML is an open-source, privacy-preserving, machine learning inference framework based on fully homomorphic encryption (FHE). It enables data scientists without any prior knowledge of cryptography to automatically turn machine learning models into their FHE equivalent, using familiar APIs from Scikit-learn and PyTorch.\n",
"> Concrete ML is an open-source, privacy-preserving, machine learning inference framework based on fully homomorphic encryption (FHE). It enables data scientists without any prior knowledge of cryptography to automatically turn machine learning models into their FHE equivalent, using familiar APIs from Scikit-learn and PyTorch.\n",
"> \n",
"> <cite>&mdash; [Zama documentation](../README.md)</cite>\n",
"\n",
"This tutorial does not require any knowledge of Concrete-ML. Newcomers might nonetheless be interested in reading some of the introductory sections of the official documentation, such as:\n",
"This tutorial does not require any knowledge of Concrete ML. Newcomers might nonetheless be interested in reading some of the introductory sections of the official documentation, such as:\n",
"\n",
"- [What is Concrete-ML](../README.md)\n",
"- [What is Concrete ML](../README.md)\n",
"- [Key Concepts](../getting-started/concepts.md)\n",
"\n",
"### Support Vector Machine\n",
"\n",
"SVM is a machine learning algorithm for classification and regression. LinearSVC is an efficient implementation of SVM\n",
"that works best when the data is linearly separable. In this tutorial, we use the [pulsar star dataset](https://www.kaggle.com/datasets/colearninglounge/predicting-pulsar-starintermediate) to determine whether a neutron star can be classified as a pulsar star.\n",
"\n",
"Concrete-ML exposes a LinearSVC class which implements the\n",
"Concrete ML exposes a LinearSVC class which implements the\n",
"[scikit-learn LinearSVC](https://scikit-learn.org/stable/modules/generated/sklearn.svm.LinearSVC.html) interface, so you should feel right at home.\n",
"\n",
"### Setup code\n",
@@ -342,9 +342,9 @@
"id": "12e827d0",
"metadata": {},
"source": [
"## Part 1: Train a simple model with Concrete-ML\n",
"## Part 1: Train a simple model with Concrete ML\n",
"\n",
"The following code quickly scaffolds a Concrete-ML LinearSVC code, which should sound familiar.\n"
"The following code quickly scaffolds a Concrete ML LinearSVC code, which should sound familiar.\n"
]
},
{
@@ -403,7 +403,7 @@
}
],
"source": [
"# Perform the same steps with the Concrete-ML LinearSVC implementation\n",
"# Perform the same steps with the Concrete ML LinearSVC implementation\n",
"svm_concrete = ConcreteLinearSVC(max_iter=100, n_bits=8)\n",
"svm_concrete.fit(X_train, y_train)\n",
"# plot the boundary\n",
@@ -468,15 +468,15 @@
"\n",
"#### Simplicity of execution\n",
"\n",
"For a high-level use-case, Concrete-ML offers a very similar interface to scikit-learn. The main difference is *a model needs to be compiled to allow execution in FHE*.\n",
"For a high-level use-case, Concrete ML offers a very similar interface to scikit-learn. The main difference is *a model needs to be compiled to allow execution in FHE*.\n",
"\n",
"#### Model Accuracy\n",
"\n",
"Concrete-ML prediction accuracy can be slightly worse than a regular scikit-learn implementation. This is because of [quantization](../explanations/quantization.md): number precision needs to be fixed-size for the model to be evaluated in FHE. This can be alleviated down to where the accuracy difference is none or negligible (which is the case here with a 8 bit size).\n",
"Concrete ML prediction accuracy can be slightly worse than a regular scikit-learn implementation. This is because of [quantization](../explanations/quantization.md): number precision needs to be fixed-size for the model to be evaluated in FHE. This can be alleviated down to where the accuracy difference is none or negligible (which is the case here with a 8 bit size).\n",
"\n",
"#### Execution time\n",
"\n",
"The execution speed can be slower in Concrete-ML, especially during compilation and FHE inference phases, because enabling FHE operations uses more resources than regular inference on plain data. However, the speed can be improved by decreasing the precision of the data and model's weights thanks to the n_bits parameter. But, depending on the project, there is a trade-off between a slower but more accurate model and a faster but less accurate model."
"The execution speed can be slower in Concrete ML, especially during compilation and FHE inference phases, because enabling FHE operations uses more resources than regular inference on plain data. However, the speed can be improved by decreasing the precision of the data and model's weights thanks to the n_bits parameter. But, depending on the project, there is a trade-off between a slower but more accurate model and a faster but less accurate model."
]
},
{
@@ -536,7 +536,7 @@
"\n",
"### Step b: quantize the model\n",
"\n",
"So far most of Concrete-ML specificities have conveniently been avoided for the sake of simplicity. The first Concrete-ML specific step of developping a model is to quantize it, which soberly means to turn the model into an integer equivalent.\n",
"So far most of Concrete ML specificities have conveniently been avoided for the sake of simplicity. The first Concrete ML specific step of developping a model is to quantize it, which soberly means to turn the model into an integer equivalent.\n",
"\n",
"Although it is strongly encouraged to read the [Zama introduction to quantization](../explanations/quantization.md), the key takeaway is **a model needs to be reduced to a *discrete*, smaller set in order for the encryption to happen**. Otherwise the data becomes too large to be manipulated in FHE. \n",
"\n",
@@ -764,7 +764,7 @@
"- the model itself\n",
"- the hardware executing the model\n",
"\n",
"Setting up a model in Concrete-ML requires some additional work compared to standard models. For instance, users must select the quantization bit-width for both the model's weight and input data, which can be complex and time-consuming while using real FHE inference. However, Concrete-ML provides an FHE simulation mode that allows users to identify optimal hyper-parameters with the best trade-off between latency and performance.\n",
"Setting up a model in Concrete ML requires some additional work compared to standard models. For instance, users must select the quantization bit-width for both the model's weight and input data, which can be complex and time-consuming while using real FHE inference. However, Concrete ML provides an FHE simulation mode that allows users to identify optimal hyper-parameters with the best trade-off between latency and performance.\n",
"\n",
"> Testing FHE models on very large data-sets can take a long time. Furthermore, not all models are compatible with FHE constraints out-of-the-box. Simulation using the FHE simulation allows you to execute a model that was quantized, to measure the accuracy it would have in FHE, but also to determine the modifications required to make it FHE compatible.\n",
">\n",
@@ -849,13 +849,13 @@
"source": [
"## Conclusion\n",
"\n",
"Setting up FHE with Concrete-ML on a LinearSVC model is very simple, in the regard that Concrete-ML provides an implementation of the [scikit-learn LinearSVC interface](https://scikit-learn.org/stable/modules/generated/sklearn.svm.LinearSVC.html). As a matter of fact, a working FHE model can be setup with just a few lines of code.\n",
"Setting up FHE with Concrete ML on a LinearSVC model is very simple, in the regard that Concrete ML provides an implementation of the [scikit-learn LinearSVC interface](https://scikit-learn.org/stable/modules/generated/sklearn.svm.LinearSVC.html). As a matter of fact, a working FHE model can be setup with just a few lines of code.\n",
"\n",
"Setting up a model with FHE benefits nonetheless from some additional work. For LinearSVC models, the main point is to select a relevant bit-size for [quantizing](../explanations/quantization.md) the model. Some additional tools can smooth up the development workflow, such as alleviating the [compilation](../explanations/compilation.md) time by making use of the [FHE simulation](../explanations/compilation.md#fhe-simulation) \n",
"\n",
"Once the model is carefully trained and quantized, it is ready to be deployed and used in production. Here are some useful links that cover this subject:\n",
"- [Inference in the Cloud](../getting-started/cloud.md) summarize the steps for cloud deployment\n",
"- [Production Deployment](../guides/client_server.md) offers a high-level view of how to deploy a Concrete-ML model in a client/server setting.\n",
"- [Production Deployment](../guides/client_server.md) offers a high-level view of how to deploy a Concrete ML model in a client/server setting.\n",
"- [Client Server in Concrete ML](ClientServer.ipynb) provides a more hands-on approach as another tutorial."
]
}
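
The SVMClassifier notebook above is the most detailed of the four tutorials touched by this rename: a quick scikit-learn-like setup, quantization via `n_bits`, compilation, and FHE simulation versus real FHE execution. Below is a minimal end-to-end sketch of those steps; the dataset and split are assumptions, while `ConcreteLinearSVC(max_iter=100, n_bits=8)` matches the call shown in the diff.

```python
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

from concrete.ml.sklearn import LinearSVC as ConcreteLinearSVC

X, y = make_classification(n_samples=400, n_features=8, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Part 1: scikit-learn-like setup, with n_bits controlling the quantization bit-width.
svm_concrete = ConcreteLinearSVC(max_iter=100, n_bits=8)
svm_concrete.fit(X_train, y_train)

# The extra step compared to scikit-learn: compile the quantized model to an FHE circuit.
svm_concrete.compile(X_train)

# FHE simulation: runs the quantized computation without encryption, so it is fast and
# convenient for tuning n_bits before paying the cost of real FHE inference.
y_pred_simulated = svm_concrete.predict(X_test, fhe="simulate")
print("Simulated FHE accuracy:", accuracy_score(y_test, y_pred_simulated))

# Real FHE execution on a few encrypted samples.
y_pred_fhe = svm_concrete.predict(X_test[:3], fhe="execute")
print("Encrypted predictions:", y_pred_fhe)
```

The simulate/execute split mirrors the trade-off the notebook discusses: simulation is used to pick `n_bits` quickly, and only the final configuration is run under real encryption.
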
