
Add WatsonX support #10238

Closed

Conversation

baptistebignaud

It is a connector to use an LLM from WatsonX.
It requires the Python SDK "ibm-generative-ai".

(It might not be perfect since it is my first PR on a public repository 😄)


@dosubot dosubot bot added Ɑ: models Related to LLMs or chat model modules 🤖:enhancement A large net-new component, integration, or chain. Use sparingly. The largest features labels Sep 5, 2023
@baskaryan
Collaborator

Thanks @baptistebignaud, looking great! Last thing: it'd be great to add a demo notebook showing how to use the class to docs/extras/integrations/llms

@baskaryan baskaryan added the needs documentation PR needs to be updated with documentation label Sep 6, 2023
@baptistebignaud
Author

Thanks @baskaryan for the reply, I added the notebook.

    def _llm_type(self) -> str:
        return "watsonx"

    def _call(
Collaborator


Could you match the method signature exactly:

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> str:
        """Run the LLM on the given prompt and input."""
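For context, here is a self-contained sketch of what a class matching that signature could look like. This is a toy stand-in, not LangChain's actual base class: the class name, the echo body, and the stop-sequence truncation are all illustrative assumptions, and `run_manager` is typed as `Any` in place of `CallbackManagerForLLMRun`.

```python
from typing import Any, List, Optional


class WatsonxSketch:
    """Toy stand-in for the LLM base class, to illustrate the signature only."""

    def _llm_type(self) -> str:
        return "watsonx"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[Any] = None,  # CallbackManagerForLLMRun in LangChain
        **kwargs: Any,
    ) -> str:
        """Run the LLM on the given prompt and input."""
        # A real connector would call the watsonx client here;
        # this echo is just a placeholder.
        text = f"echo: {prompt}"
        # Truncate at the first stop sequence, a common convention.
        if stop:
            for s in stop:
                idx = text.find(s)
                if idx != -1:
                    text = text[:idx]
        return text
```

With this sketch, `WatsonxSketch()._call("hello", stop=["ll"])` returns `"echo: he"`.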

class Watsonx(LLM):
    """WatsonX LLM wrapper."""

    model_name: str = "tiiuae/falcon-40b"
Collaborator


Would you be able to document the parameters? Especially ones whose interpretation might not be obvious.

Here's a good example of the format:

max_tokens_to_sample: int = Field(default=256, alias="max_tokens")
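To illustrate the kind of documentation being asked for, here is a minimal sketch using a plain dataclass. The parameter names and defaults below are assumptions for illustration, not the PR's actual fields:

```python
from dataclasses import dataclass


@dataclass
class WatsonxParams:
    """Illustrative parameter block; names and defaults are assumptions."""

    # Model id on watsonx, e.g. from the provider's model catalog.
    model_name: str = "tiiuae/falcon-40b"
    # Hard cap on the number of generated tokens.
    max_new_tokens: int = 256
    # Sampling temperature; 0.0 means (near-)greedy decoding.
    temperature: float = 0.0
```

The point is that each attribute carries a one-line comment (or a `Field(description=...)` when using pydantic) explaining what it controls.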

        **kwargs: Any,
    ) -> str:
        try:
            import genai
Collaborator


Package source: https://ibm.github.io/ibm-generative-ai/index.html

Looks like a name collision with: https://pypi.org/project/genai/ ?

@LukaszCmielowski

LukaszCmielowski commented Oct 27, 2023

Please hold off on this connector, since there is work in progress to use the official watsonx.ai library instead of the experimental ibm-generative-ai one. The watsonx.ai product team will open the PR and I will share the link. ETA: end of this month.

@baptistebignaud please close the PR.

@baptistebignaud
Author

Great to hear that action has been officially taken by watsonx.ai; I'll close the PR.

@baptistebignaud baptistebignaud deleted the add_watsonX branch October 29, 2023 21:03