
Commit

fix: skip check internal links, fix kaggle link
andrei-stoian-zama committed Apr 19, 2024
1 parent fa135da commit 91ddaac
Showing 3 changed files with 7 additions and 8 deletions.
@@ -310,12 +310,7 @@
     "\n",
     "A Concrete ML model needs to be compiled on an input-set, usually the train set or one of its sub-set, before being able to predict. This step creates an FHE circuit, which essentially saves elements found in the model's inference (graph of operations, shapes, bit-width precisions, etc.) needed for the compiler when executing the predictions in FHE during the `predict` method. \n",
     "\n",
-    "The maximum bit-width that can be reached by any values (inputs, weights, accumulators) in this circuit is currently 16-bits. If this limit is exceeded, the compilation fails and the user needs to change some of the model's parameters (e.g., decrease the number of quantization bits or decrease `module__n_accum_bits`). \n",
-    "\n",
-    "<!--- \n",
-    "Make it compile in non-VL when 2037 is done\n",
-    "FIXME: https://github.com/zama-ai/concrete-ml-internal/issues/2307 \n",
-    "-->"
+    "The maximum bit-width that can be reached by any values (inputs, weights, accumulators) in this circuit is currently 16-bits. If this limit is exceeded, the compilation fails and the user needs to change some of the model's parameters (e.g., decrease the number of quantization bits or decrease `module__n_accum_bits`). \n"
    ]
   },
   {
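The 16-bit limit described in the notebook text above can be illustrated with a small, self-contained sketch. This is plain Python, not the actual Concrete ML compiler: the constant name, helper functions, and the example values are all hypothetical, and only the idea (every value in the circuit must fit in a maximum bit-width, or compilation fails) comes from the text.

```python
# Hypothetical sketch of the 16-bit accumulator constraint described above.
# This is NOT the Concrete ML compiler; it only illustrates why exceeding
# the bit-width limit forces the user to lower quantization parameters.

MAX_BIT_WIDTH = 16  # current limit mentioned in the notebook text


def bit_width(value: int) -> int:
    """Number of bits needed to represent a non-negative integer."""
    return max(1, value.bit_length())


def check_circuit(values):
    """Return the values whose bit-width exceeds the limit (empty list = OK)."""
    return [v for v in values if bit_width(v) > MAX_BIT_WIDTH]


# 65535 fits in exactly 16 bits, but a quantized accumulator reaching
# 70000 needs 17 bits: a real compilation would fail here, prompting a
# smaller number of quantization bits (e.g. via module__n_accum_bits).
too_wide = check_circuit([255, 4095, 65535, 70000])
```

Lowering the number of quantization bits shrinks every intermediate value, which is why it is the suggested fix when this check fails.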
6 changes: 5 additions & 1 deletion script/make_utils/local_link_check.py
@@ -159,8 +159,12 @@ def main():
         bad = lc.check_links(tmp_file_name, ext=".*")
         if bad:
             for err_link in bad:
+                # Skip links to CML internal issues
+                if "zama-ai/concrete-ml-internal" in err_link[1]:
+                    continue
+
                 errors.append(
-                    f"{path}/cell{cell_id} contains "
+                    f"{path}/cell:{cell_id} contains "
                     f"a link to file '{err_link[1]}' that can't be found"
                 )
         os.unlink(tmp_file_name)
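The filtering added in this hunk can be sketched in isolation. The sketch below is hypothetical: the `filter_errors` function, its arguments, and the sample data do not exist in the repository, and the `(text, url)` shape of the `bad` items is an assumption inferred from the `err_link[1]` indexing above.

```python
# Sketch of the skip logic introduced by this commit: drop broken-link
# reports that point at the private concrete-ml-internal issue tracker,
# since those URLs are expected to be unreachable from a public checkout.
# `bad` items are assumed to be (text, url) pairs, as err_link[1] suggests.

INTERNAL_MARKER = "zama-ai/concrete-ml-internal"


def filter_errors(bad, path, cell_id):
    errors = []
    for err_link in bad:
        if INTERNAL_MARKER in err_link[1]:
            continue  # internal issue link: not a real broken-link error
        errors.append(
            f"{path}/cell:{cell_id} contains "
            f"a link to file '{err_link[1]}' that can't be found"
        )
    return errors
```

Filtering before appending, rather than post-processing the `errors` list, keeps the skip decision next to the link it applies to.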
2 changes: 1 addition & 1 deletion use_case_examples/titanic/KaggleTitanic.ipynb
@@ -9,7 +9,7 @@
     "\n",
     "This notebook introduces a Privacy-Preserving Machine Learning (PPML) solution to the [Kaggle Titanic competition](https://www.kaggle.com/c/titanic/) using the [Concrete ML](https://docs.zama.ai/concrete-ml) open-source framework. Its main ambition is to show that [Fully Homomorphic Encryption](https://en.wikipedia.org/wiki/Homomorphic_encryption) (FHE) can be used for protecting data when using a Machine Learning model to predict outcomes without degrading its performance. In this example, a [XGBoost](https://xgboost.readthedocs.io/en/stable/) classifier model will be considered as it achieves near state-of-the-art accuracy.\n",
     "\n",
-    "With inspiration from the [ppxgboost repository](https://github.com/awslabs/privacy-preserving-xgboost-inference/blob/master/example/Titanic.ipynb), which is \"Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. SPDX-License-Identifier: Apache-2.0\".\n",
+    "With inspiration from the [ppxgboost repository](https://github.com/awslabs/privacy-preserving-xgboost-inference/blob/main/examples/Titanic.ipynb), which is \"Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. SPDX-License-Identifier: Apache-2.0\".\n",
     "\n",
     "It also took some ideas from several upvoted public notebooks, including [this one](https://www.kaggle.com/code/startupsci/titanic-data-science-solutions/notebook) from Manav Sehgal and [this one](https://www.kaggle.com/code/ldfreeman3/a-data-science-framework-to-achieve-99-accuracy#Step-3:-Prepare-Data-for-Consumption) from LD Freeman."
    ]
