[DRAFT] docs(genapi): add hugging face library #4748

Draft · wants to merge 1 commit into `main`
`menu/navigation.json` — 4 additions, 0 deletions (`@@ -987,6 +987,10 @@`):

```json
{
  "label": "Use function calling",
  "slug": "use-function-calling"
},
{
  "label": "Connect through Hugging Face library",
  "slug": "connect-through-hugging-face-library"
}
],
"label": "How to",
```
New file — 108 additions (`@@ -0,0 +1,108 @@`):
---
meta:
  title: How to connect using the Hugging Face library
  description: Learn how to interact with Generative APIs using the Hugging Face library.
content:
  h1: How to connect using the Hugging Face library
  paragraph: Learn how to interact with Generative APIs using the Hugging Face library.
tags: generative-apis hugging-face library
dates:
  validation: 2025-04-01
  posted: 2025-04-01
---
<Macro id="requirements" />

- A Scaleway account logged into the [console](https://console.scaleway.com)
- [Owner](/iam/concepts/#owner) status or [IAM permissions](/iam/concepts/#permission) allowing you to perform actions in the intended Organization
- A valid [API key](/iam/how-to/create-api-keys/) for API authentication
- Node.js installed on your local machine
- Scaleway credentials or Hugging Face credentials with the appropriate access rights (both connection methods are covered below)

## Steps to connect using the Hugging Face library

1. Create a new directory on your local machine where you will store all your project files.

2. Open a terminal in your project directory and run the following command to install the Hugging Face inference library:
```bash
npm install @huggingface/inference
```

3. Create a new file named `main.js` in your project directory and add the following code to it. The script uses ES module syntax and top-level `await`, so your `package.json` must contain `"type": "module"` (alternatively, name the file `main.mjs`):
```js
import { InferenceClient } from '@huggingface/inference';

// Pass your Scaleway secret key as the access token
const client = new InferenceClient(process.env.SCW_SECRET_KEY);

const out = await client.chatCompletion({
  provider: "scaleway",
  // endpointUrl is not supported with third-party providers
  // endpointUrl: "https://api.scaleway.ai/b409cb09-756c-430f-a8e8-748f88ef4bad",
  // model: "meta-llama/Meta-Llama-3-8B-Instruct",
  model: "meta-llama/Llama-3.3-70B-Instruct",
  messages: [{ role: "user", content: "Tell me about Scaleway." }],
  max_tokens: 512,
  temperature: 0.1,
});

console.log(out.choices[0].message.content);
```

4. Execute the script by running the following command in your terminal:
```bash
node main.js
```
The model's response should be displayed in your terminal.
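The steps above can be sketched end-to-end as a shell session. The directory name and key value are placeholders; `npm pkg set type="module"` configures the project as an ES module so that top-level `await` works:

```shell
mkdir hf-inference-demo && cd hf-inference-demo   # hypothetical directory name
npm init -y
npm pkg set type="module"            # enable ES modules / top-level await
npm install @huggingface/inference
export SCW_SECRET_KEY="scw-secret-xxx"   # placeholder: use your real secret key
node main.js
```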

### Using stream completion

To stream the response token by token as it is generated, modify your script as follows:
```js
import { InferenceClient } from '@huggingface/inference';

const client = new InferenceClient(process.env.SCW_SECRET_KEY);

for await (const chunk of client.chatCompletionStream({
  model: "meta-llama/Llama-3.3-70B-Instruct",
  provider: "scaleway",
  messages: [{ role: "user", content: "Tell me about Scaleway." }],
  max_tokens: 512,
})) {
  // Print each token as it arrives, without adding a newline per chunk
  process.stdout.write(chunk.choices[0].delta.content ?? "");
}
```
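Each streamed chunk carries only a token-sized delta, and the final chunk may carry none. The sketch below uses mock chunk objects (hypothetical data, shaped like what `chatCompletionStream` yields) to show how the deltas concatenate into the full reply:

```javascript
// Mock chunks standing in for what the streaming API yields
const chunks = [
  { choices: [{ delta: { content: "Scaleway " } }] },
  { choices: [{ delta: { content: "is a cloud " } }] },
  { choices: [{ delta: { content: "provider." } }] },
  { choices: [{ delta: {} }] }, // final chunk may carry no content
];

// Accumulate the deltas, skipping empty ones
let reply = "";
for (const chunk of chunks) {
  reply += chunk.choices[0].delta.content ?? "";
}

console.log(reply); // "Scaleway is a cloud provider."
```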

### Using Hugging Face tokens

You can also authenticate using Hugging Face tokens. Set the `HF_TOKEN` environment variable to a valid Hugging Face access token and modify your script accordingly:
```js
import { InferenceClient } from '@huggingface/inference';

// Pass your Hugging Face token as the access token
const client = new InferenceClient(process.env.HF_TOKEN);

const out = await client.chatCompletion({
  provider: "scaleway",
  model: "meta-llama/Llama-3.3-70B-Instruct",
  messages: [{ role: "user", content: "Tell me about Scaleway." }],
  max_tokens: 512,
  temperature: 0.1,
});

console.log(out.choices[0].message.content);
```

Depending on your environment, passing the token explicitly to the `InferenceClient` constructor may not be necessary if the `HF_TOKEN` environment variable is set correctly:
```js
import { InferenceClient } from '@huggingface/inference';

// No token passed explicitly; relies on the HF_TOKEN environment variable
const client = new InferenceClient();

const out = await client.chatCompletion({
  provider: "scaleway",
  model: "meta-llama/Llama-3.3-70B-Instruct",
  messages: [{ role: "user", content: "Tell me about Scaleway." }],
  max_tokens: 512,
  temperature: 0.1,
});

console.log(out.choices[0].message.content);
```
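Since either credential works, a small hypothetical helper (not part of the Hugging Face library) can select whichever one is set in the environment:

```javascript
// Hypothetical helper: prefer the Scaleway key, fall back to a Hugging Face token
function resolveToken(env) {
  return env.SCW_SECRET_KEY ?? env.HF_TOKEN ?? null;
}

console.log(resolveToken({ SCW_SECRET_KEY: "scw-xxx" })); // "scw-xxx"
console.log(resolveToken({ HF_TOKEN: "hf-xxx" }));        // "hf-xxx"
```

Passing the resolved token to `new InferenceClient(...)` then works with either account type.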