llm-watsonx


An IBM watsonx.ai plugin for llm.

Installation

Install this plugin in the same environment as LLM:

llm install llm-watsonx
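
To check that the plugin was picked up, you can list LLM's installed plugins (a quick sanity check, not a required step):

llm plugins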

Configuration

You will need to set the following environment variables:

export WATSONX_API_KEY=
export WATSONX_PROJECT_ID=

Optionally, if your watsonx instance is not in the us-south region, also set:

export WATSONX_URL=
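
As a rough sketch, a configured shell might look like the following; the key and project ID values are placeholders, and the eu-de URL is only one example of a non-us-south endpoint, so substitute your own credentials and region:

export WATSONX_API_KEY="your-ibm-cloud-api-key"
export WATSONX_PROJECT_ID="your-watsonx-project-id"
export WATSONX_URL="https://eu-de.ml.cloud.ibm.com"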

Usage

Get a list of commands:

llm watsonx --help

Models

See all available models:

llm watsonx list-models

See all generation options:

llm watsonx list-model-options

Example

llm -m watsonx/meta-llama/llama-3-8b-instruct \
    -o temperature .4 \
    -o max_new_tokens 250 \
    "What is IBM watsonx?"

Chat Example

llm chat -m watsonx/meta-llama/llama-3-8b-instruct \
    -o max_new_tokens 1000 \
    -s "You are an assistant for a CLI (command line interface). Provide and help give unix commands to help users achieve their tasks."

Embeddings

See all available embedding models:

llm watsonx list-embedding-models

Example

cat README.md | llm embed -m watsonx/ibm/slate-30m-english-rtrvr
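
Embeddings can also be stored and searched with LLM's built-in collection commands; this is a sketch using a hypothetical collection name (docs) and item ID (readme):

llm embed docs readme -m watsonx/ibm/slate-30m-english-rtrvr -c "$(cat README.md)"
llm similar docs -c "How do I install this plugin?"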
