Add WatsonX support #10238
Conversation
Thanks @baptistebignaud, looking great! Last thing: it'd be great to add a demo notebook showing how to use the class.
Thanks @baskaryan for the reply, I added the notebook.
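A minimal sketch of what such a demo-notebook cell could look like (the import path and constructor arguments are assumptions based on the diff in this PR, not verified against the merged code):

```python
# Possible demo-notebook cell. The import path and constructor arguments are
# assumptions based on the PR diff; the real class may differ.
from langchain.llms import Watsonx  # hypothetical import path

llm = Watsonx(model_name="tiiuae/falcon-40b")  # default model from the diff

# LLM subclasses are callable on a plain prompt string.
print(llm("Summarize what watsonx.ai is in one sentence."))
```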
def _llm_type(self) -> str:
    return "watsonx"

def _call(
Could you match the method signature exactly:

def _call(
    self,
    prompt: str,
    stop: Optional[List[str]] = None,
    run_manager: Optional[CallbackManagerForLLMRun] = None,
    **kwargs: Any,
) -> str:
    """Run the LLM on the given prompt and input."""
class Watsonx(LLM):
    """WatsonX LLM wrapper."""

    model_name: str = "tiiuae/falcon-40b"
Would you be able to document the parameters? Especially ones whose interpretation might not be obvious. Here's a good example of the format:

max_tokens_to_sample: int = Field(default=256, alias="max_tokens")
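For illustration, the Watsonx fields could be documented in that format roughly as follows (only `model_name` comes from the diff; the other fields and their defaults are assumptions, not the PR's actual parameters):

```python
from pydantic import Field

from langchain.llms.base import LLM


class Watsonx(LLM):
    """WatsonX LLM wrapper."""

    model_name: str = "tiiuae/falcon-40b"
    """Model id to call on watsonx.ai, e.g. ``tiiuae/falcon-40b``."""

    max_new_tokens: int = Field(default=256, alias="max_tokens")
    """Maximum number of tokens to generate (illustrative field, not in the PR)."""

    temperature: float = 0.7
    """Sampling temperature; higher values give more varied output (illustrative)."""

    # _llm_type, _call, and the rest of the class are omitted here.
```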
    **kwargs: Any,
) -> str:
    try:
        import genai
Package source: https://ibm.github.io/ibm-generative-ai/index.html
Looks like a name collision with: https://pypi.org/project/genai/ ?
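Given that possible collision, a common pattern is to guard the import and point users at the intended distribution in the error message. A sketch under that assumption (not the PR's actual code; the watsonx.ai call itself is omitted):

```python
from typing import Any, List, Optional

from langchain.callbacks.manager import CallbackManagerForLLMRun
from langchain.llms.base import LLM


class Watsonx(LLM):
    """WatsonX LLM wrapper (partial sketch)."""

    model_name: str = "tiiuae/falcon-40b"

    @property
    def _llm_type(self) -> str:
        return "watsonx"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> str:
        """Run the LLM on the given prompt and input."""
        try:
            # The "genai" import name is provided by the "ibm-generative-ai"
            # distribution; an unrelated "genai" package also exists on PyPI.
            import genai  # noqa: F401
        except ImportError as exc:
            raise ImportError(
                "Could not import genai. Install the intended SDK with "
                "`pip install ibm-generative-ai`."
            ) from exc
        # The actual watsonx.ai call is omitted; it depends on the SDK's API.
        raise NotImplementedError("Model invocation not shown in this sketch.")
```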
Please hold off on this connector, since there is work in progress to use the official watsonx.ai library instead of the experimental one. @baptistebignaud, please close the PR.
Great to hear that official action has been taken on the watsonx.ai side; I'll close the PR.
This is a connector to use an LLM from WatsonX. It requires the Python SDK "ibm-generative-ai".
(It might not be perfect since it is my first PR on a public repository 😄)