diff --git a/docs/advanced_examples/FullyConnectedNeuralNetworkOnMNIST.ipynb b/docs/advanced_examples/FullyConnectedNeuralNetworkOnMNIST.ipynb
index 32c0c88b5..dcff2e5f4 100644
--- a/docs/advanced_examples/FullyConnectedNeuralNetworkOnMNIST.ipynb
+++ b/docs/advanced_examples/FullyConnectedNeuralNetworkOnMNIST.ipynb
@@ -310,12 +310,7 @@
     "\n",
     "A Concrete ML model needs to be compiled on an input-set, usually the train set or one of its sub-set, before being able to predict. This step creates an FHE circuit, which essentially saves elements found in the model's inference (graph of operations, shapes, bit-width precisions, etc.) needed for the compiler when executing the predictions in FHE during the `predict` method. \n",
     "\n",
-    "The maximum bit-width that can be reached by any values (inputs, weights, accumulators) in this circuit is currently 16-bits. If this limit is exceeded, the compilation fails and the user needs to change some of the model's parameters (e.g., decrease the number of quantization bits or decrease `module__n_accum_bits`). \n",
-    "\n",
-    ""
+    "The maximum bit-width that can be reached by any values (inputs, weights, accumulators) in this circuit is currently 16-bits. If this limit is exceeded, the compilation fails and the user needs to change some of the model's parameters (e.g., decrease the number of quantization bits or decrease `module__n_accum_bits`). \n"
    ]
   },
   {
diff --git a/script/make_utils/local_link_check.py b/script/make_utils/local_link_check.py
index 29d53e2b4..006e2e058 100644
--- a/script/make_utils/local_link_check.py
+++ b/script/make_utils/local_link_check.py
@@ -159,8 +159,12 @@ def main():
         bad = lc.check_links(tmp_file_name, ext=".*")
         if bad:
             for err_link in bad:
+                # Skip links to CML internal issues
+                if "zama-ai/concrete-ml-internal" in err_link[1]:
+                    continue
+
                 errors.append(
-                    f"{path}/cell{cell_id} contains "
+                    f"{path}/cell:{cell_id} contains "
                     f"a link to file '{err_link[1]}' that can't be found"
                 )
     os.unlink(tmp_file_name)
diff --git a/use_case_examples/titanic/KaggleTitanic.ipynb b/use_case_examples/titanic/KaggleTitanic.ipynb
index f36516a24..43b9aa0fa 100644
--- a/use_case_examples/titanic/KaggleTitanic.ipynb
+++ b/use_case_examples/titanic/KaggleTitanic.ipynb
@@ -9,7 +9,7 @@
     "\n",
     "This notebook introduces a Privacy-Preserving Machine Learning (PPML) solution to the [Kaggle Titanic competition](https://www.kaggle.com/c/titanic/) using the [Concrete ML](https://docs.zama.ai/concrete-ml) open-source framework. Its main ambition is to show that [Fully Homomorphic Encryption](https://en.wikipedia.org/wiki/Homomorphic_encryption) (FHE) can be used for protecting data when using a Machine Learning model to predict outcomes without degrading its performance. In this example, a [XGBoost](https://xgboost.readthedocs.io/en/stable/) classifier model will be considered as it achieves near state-of-the-art accuracy.\n",
-    "With inspiration from the [ppxgboost repository](https://github.com/awslabs/privacy-preserving-xgboost-inference/blob/master/example/Titanic.ipynb), which is \"Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. SPDX-License-Identifier: Apache-2.0\".\n",
+    "With inspiration from the [ppxgboost repository](https://github.com/awslabs/privacy-preserving-xgboost-inference/blob/main/examples/Titanic.ipynb), which is \"Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. SPDX-License-Identifier: Apache-2.0\".\n",
     "\n",
     "It also took some ideas from several upvoted public notebooks, including [this one](https://www.kaggle.com/code/startupsci/titanic-data-science-solutions/notebook) from Manav Sehgal and [this one](https://www.kaggle.com/code/ldfreeman3/a-data-science-framework-to-achieve-99-accuracy#Step-3:-Prepare-Data-for-Consumption) from LD Freeman."
    ]
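
For context on the docs text touched in the first hunk: below is a minimal sketch of the compile-then-predict flow it describes, using Concrete ML's scikit-learn-style XGBClassifier (the same model family as the Titanic notebook). The import path and method names follow Concrete ML's documented API, but the data and parameter values are illustrative assumptions, and the FHE predict keyword has varied between releases, so this is a sketch rather than the notebooks' actual code.

import numpy

from concrete.ml.sklearn import XGBClassifier

# Illustrative stand-in data (8 features, binary labels), not a real dataset
X_train = numpy.random.rand(100, 8)
y_train = numpy.random.randint(0, 2, size=100)

# n_bits controls quantization; lowering it (or an accumulator parameter such
# as module__n_accum_bits on the neural-net models) is how a user stays under
# the 16-bit circuit limit mentioned in the docs text
model = XGBClassifier(n_bits=6, n_estimators=20, max_depth=3)
model.fit(X_train, y_train)

# Compilation on an input-set (here, the train set) builds the FHE circuit;
# it fails if any value in the circuit would exceed 16 bits
model.compile(X_train)

# Run inference in FHE (recent releases use fhe="execute"; older ones
# used execute_in_fhe=True)
predictions = model.predict(X_train[:2], fhe="execute")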