---
meta:
  title: How to import custom models into Managed Inference
  description: Learn how to import your custom models into Scaleway's Managed Inference platform.
content:
  h1: How to import custom models into Managed Inference
  paragraph: Learn how to import your custom models into Scaleway's Managed Inference platform.
tags: managed-inference ai-data import custom model
dates:
  validation: 2025-03-27
  posted: 2025-03-27
categories:
  - ai-data
---

Scaleway provides a selection of common models for deployment from the Scaleway console. If you need a specific model, you can import it into Managed Inference directly from Hugging Face or a Scaleway Object Storage bucket.

<Message type="note">
  This feature is currently in **beta stage** and will evolve in the future.
</Message>

<Macro id="requirements" />
- A Scaleway account logged into the [console](https://console.scaleway.com).
- [Owner](/identity-and-access-management/iam/concepts/#owner) status or [IAM permissions](/identity-and-access-management/iam/concepts/#permission) to perform actions in your Organization.

1. Click **Managed Inference** in the **AI & Data** section of the side menu in the [Scaleway console](https://console.scaleway.com/) to access the dashboard.
2. Click **Deploy a model** to launch the model deployment wizard.
3. In the **Choose a model** section, select **Custom model**. If you have not imported a custom model yet, click **Import a model** to start the model import wizard.
4. Choose an upload source:
   - **Hugging Face**: Pull the model from Hugging Face.
   - **Object Storage**: This feature is coming soon.
5. Enter your Hugging Face access token, which must have READ access to the repository. A quick way to check the token and repository before importing is shown in the first sketch after this list.
   <Message type="note">
     [Learn how to generate a Hugging Face access token](https://huggingface.co/docs/hub/security-tokens).
   </Message>
6. Enter the name of the Hugging Face repository to pull the model from.
   <Message type="note">
     Ensure you have access to gated models if applicable. Refer to the [Hugging Face documentation](https://huggingface.co/docs/hub/en/models-gated) for details.
   </Message>
7. Choose a name for your model. The name must be unique within your Organization and Project and cannot be changed later.
8. Click **Verify import** to check your Hugging Face credentials and ensure model compatibility.
9. Review the summary of your import, which includes:
   - Context size by node type.
   - Quantization options (the second sketch after this list gives a rough idea of how quantization affects memory footprint).
   - Estimated cost.
   Once you have reviewed the summary, click **Begin import** to finalize the process.
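
The following sketch is an optional, local sanity check you can run before starting the import wizard. It assumes the `huggingface_hub` Python package and uses placeholder values for the token and repository name: it confirms the token is valid, that it can read the repository (including gated models you have been granted access to), and lists the files it finds. The **Verify import** step in the console remains the authoritative compatibility check.

```python
# pip install huggingface_hub
from huggingface_hub import HfApi

HF_TOKEN = "hf_xxx"                 # placeholder: your READ-scoped access token
REPO_ID = "my-org/my-custom-model"  # placeholder: the Hugging Face repository to import

api = HfApi(token=HF_TOKEN)

# 1. Confirm the token is valid and see which account it belongs to.
print("Token belongs to:", api.whoami()["name"])

# 2. Confirm the token can read the repository. This call fails for
#    private or gated repositories you do not have access to.
files = api.list_repo_files(REPO_ID)

# 3. Quick sanity check: most transformer checkpoints ship a config.json
#    and safetensors weight files.
print("config.json present:", "config.json" in files)
print("weight files:", [f for f in files if f.endswith(".safetensors")])
```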
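
The context size and quantization options shown in the summary are tied to how much memory the model occupies on the chosen node type. As a rough rule of thumb (an approximation for orientation only, not Scaleway's sizing or pricing logic), the weight footprint is the parameter count multiplied by the bytes per parameter of the chosen precision:

```python
# Back-of-the-envelope estimate of model weight memory at different precisions.
# It ignores the KV cache, activations, and runtime overhead, which also
# consume GPU memory on the node.
BYTES_PER_PARAM = {"fp16/bf16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_footprint_gb(n_params: float, precision: str) -> float:
    """Approximate weight memory in GB for a given parameter count."""
    return n_params * BYTES_PER_PARAM[precision] / 1e9

n_params = 8e9  # e.g. an 8B-parameter model
for precision in BYTES_PER_PARAM:
    print(f"{precision:>9}: ~{weight_footprint_gb(n_params, precision):.0f} GB of weights")
```

A smaller weight footprint leaves more GPU memory for the KV cache, which is what allows larger context sizes on a given node type.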

Your imported model will now appear in the model library. You can proceed to [deploy your model on Managed Inference](/ai-data/managed-inference/how-to/create-deployment/).