diff --git a/menu/navigation.json b/menu/navigation.json
index f4f4a68a1d..d922e6bfd6 100644
--- a/menu/navigation.json
+++ b/menu/navigation.json
@@ -745,6 +745,10 @@
"label": "Deploy a model",
"slug": "create-deployment"
},
+ {
+ "label": "Import a custom model",
+ "slug": "import-custom-model"
+ },
{
"label": "Monitor a deployment",
"slug": "monitor-deployment"
diff --git a/pages/managed-inference/how-to/import-custom-model.mdx b/pages/managed-inference/how-to/import-custom-model.mdx
new file mode 100644
index 0000000000..3b8441d2ca
--- /dev/null
+++ b/pages/managed-inference/how-to/import-custom-model.mdx
@@ -0,0 +1,48 @@
+---
+meta:
+ title: How to import custom models into Managed Inference
+ description: Learn how to import your custom models into Scaleway's Managed Inference platform.
+content:
+ h1: How to import custom models into Managed Inference
+ paragraph: Learn how to import your custom models into Scaleway's Managed Inference platform.
+tags: managed-inference ai-data import custom model
+dates:
+ validation: 2025-03-27
+ posted: 2025-03-27
+categories:
+ - ai-data
+---
+
+Scaleway provides a selection of common models for deployment from the Scaleway console. If you need a specific model, you can import it into Managed Inference directly from Hugging Face or a Scaleway Object Storage bucket.
+
+
+<Message type="note">
+  This feature is currently in **beta stage** and will evolve in the future.
+</Message>
+
+- A Scaleway account logged into the [console](https://console.scaleway.com).
+- [Owner](/identity-and-access-management/iam/concepts/#owner) status or [IAM permissions](/identity-and-access-management/iam/concepts/#permission) to perform actions in your Organization.
+
+1. Click **Managed Inference** in the **AI & Data** section of the side menu in the [Scaleway console](https://console.scaleway.com/) to access the dashboard.
+2. Click **Deploy a model** to launch the model deployment wizard.
+3. In the **Choose a model** section, select **Custom model**. If you have not imported a model yet, click **Import a model** to start the model import wizard.
+4. Choose an upload source:
+ - **Hugging Face**: Pull the model from Hugging Face.
+ - **Object Storage**: This feature is coming soon.
+5. Enter your Hugging Face access token, which must have READ access to the repository.
+
+   <Message type="tip">
+     [Learn how to generate a Hugging Face access token](https://huggingface.co/docs/hub/security-tokens).
+   </Message>
+
+6. Enter the name of the Hugging Face repository to pull the model from.
+
+   <Message type="note">
+     Ensure you have access to gated models if applicable. Refer to the [Hugging Face documentation](https://huggingface.co/docs/hub/en/models-gated) for details.
+   </Message>
+
+7. Choose a name for your model. The name must be unique within your Organization and Project and cannot be changed later.
+8. Click **Verify import** to check your Hugging Face credentials and ensure model compatibility.
+9. Review the summary of your import, which includes:
+ - Context size by node type.
+ - Quantization options.
+ - Estimated cost.
+ Once you have checked these details, click **Begin import** to finalize the process.
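+
+Before starting the wizard, you can optionally confirm that your access token can read the target repository. The sketch below is a non-authoritative example using only the Python standard library and the public Hugging Face Hub model-info endpoint; the repository name and token are placeholders, not real credentials.
+
+```python
+# Hedged sketch: check that a Hugging Face token can read a repository
+# before importing it. repo_id and token below are placeholders.
+import json
+import urllib.error
+import urllib.request
+
+HF_API = "https://huggingface.co/api/models/"
+
+def repo_info_request(repo_id: str, token: str) -> urllib.request.Request:
+    """Build an authenticated request for the Hub's model-info endpoint."""
+    return urllib.request.Request(
+        HF_API + repo_id,
+        headers={"Authorization": f"Bearer {token}"},
+    )
+
+if __name__ == "__main__":
+    req = repo_info_request("mistralai/Mistral-7B-v0.1", "hf_xxx")
+    try:
+        with urllib.request.urlopen(req) as resp:
+            print("Token can read:", json.load(resp)["id"])
+    except urllib.error.HTTPError as err:
+        # 401/403 usually indicates a missing READ scope or gated access.
+        print("Access check failed with HTTP", err.code)
+```
+
+A successful response confirms the token can read the repository; a `401` or `403` points to a token scope or gated-access problem to resolve before importing.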
+
+Your imported model will now appear in the model library. You can proceed to [deploy your model on Managed Inference](/ai-data/managed-inference/how-to/create-deployment/).
\ No newline at end of file