
Fix multimodal #319

Draft · wants to merge 6 commits into base: main
Conversation

@fm1320 (Contributor) commented Jan 16, 2025

What does this PR do?

Fixes #<issue_number>

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?

@@ -229,7 +231,7 @@ def convert_inputs_to_api_kwargs(
         self,
         input: Optional[Any] = None,
         model_kwargs: Dict = {},
-        model_type: ModelType = ModelType.UNDEFINED,
+        model_type: ModelType = ModelType.UNDEFINED,  # Now required in practice
Member:
Please default to LLM
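A minimal sketch of what the reviewer is asking for: default `model_type` to `ModelType.LLM` instead of `ModelType.UNDEFINED`, so callers that omit it still get a working text path. The `ModelType` enum and the body of the function below are simplified stand-ins for the library's real implementation, not its actual code.

```python
from enum import Enum, auto
from typing import Any, Dict, Optional


class ModelType(Enum):
    # Simplified stand-in for the library's ModelType enum.
    UNDEFINED = auto()
    LLM = auto()
    IMAGE_GENERATION = auto()


def convert_inputs_to_api_kwargs(
    input: Optional[Any] = None,
    model_kwargs: Optional[Dict] = None,
    model_type: ModelType = ModelType.LLM,  # default to LLM, per the review
) -> Dict:
    # Merge caller-supplied kwargs, then shape the input by model type.
    api_kwargs = dict(model_kwargs or {})
    if model_type == ModelType.LLM:
        api_kwargs["messages"] = [{"role": "user", "content": input}]
    elif model_type == ModelType.IMAGE_GENERATION:
        api_kwargs["prompt"] = input
    else:
        raise ValueError(f"model_type must be set, got {model_type}")
    return api_kwargs
```

With this default, `convert_inputs_to_api_kwargs("hello")` produces chat-style kwargs without the caller ever naming a model type.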

        response = self.sync_client.images.edit(**api_kwargs)
-    else:
-        # Image variation
+    elif operation == "variation":
Member:
create_variation
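The reviewer's point is that the variation branch should call `images.create_variation`, not `images.edit` (both method names exist on the openai-python images resource). A hedged sketch of the corrected sync dispatch, using a stub client in place of the real `openai.OpenAI` so the shape is testable offline:

```python
class _Images:
    # Stub mirroring the openai images resource's method names.
    def edit(self, **kwargs):
        return {"op": "edit", **kwargs}

    def create_variation(self, **kwargs):
        return {"op": "variation", **kwargs}


class StubClient:
    """Hypothetical stand-in for a sync OpenAI client."""

    def __init__(self):
        self.images = _Images()


def call_image_api(client, operation: str, **api_kwargs):
    if operation == "edit":
        return client.images.edit(**api_kwargs)
    elif operation == "variation":
        # Variations go through create_variation, not edit.
        return client.images.create_variation(**api_kwargs)
    raise ValueError(f"unsupported image operation: {operation}")
```

Routing each operation to its own endpoint keeps the branch explicit and makes an unsupported operation fail loudly instead of silently falling through.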

response = await self.async_client.images.edit(**api_kwargs)
elif operation == "variation":
Member:
create_variation
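The async branch in the hunk above has the same issue: it awaits `images.edit` even when `operation == "variation"`. A sketch of the corrected async dispatch, again with a stub standing in for the real async OpenAI client:

```python
import asyncio


class _AsyncImages:
    # Stub mirroring the async openai images resource; real calls are awaited.
    async def edit(self, **kwargs):
        return {"op": "edit", **kwargs}

    async def create_variation(self, **kwargs):
        return {"op": "variation", **kwargs}


class AsyncStubClient:
    """Hypothetical stand-in for an async OpenAI client."""

    def __init__(self):
        self.images = _AsyncImages()


async def acall_image_api(client, operation: str, **api_kwargs):
    if operation == "edit":
        return await client.images.edit(**api_kwargs)
    elif operation == "variation":
        # The async path must also use create_variation, not edit.
        return await client.images.create_variation(**api_kwargs)
    raise ValueError(f"unsupported image operation: {operation}")
```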

@@ -100,6 +91,7 @@ def __init__(
         # args for the cache
         cache_path: Optional[str] = None,
         use_cache: bool = False,
+        model_type: ModelType = ModelType.LLM,  # Add model_type parameter with default
Member:
Don't update it here; let it be controlled in the model client.
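The reviewer is suggesting that `model_type` stay an attribute of the model client rather than a `Generator.__init__` parameter. A minimal sketch of that ownership split, assuming a hypothetical `FakeModelClient` in place of the library's real client classes:

```python
from enum import Enum, auto
from typing import Optional


class ModelType(Enum):
    # Simplified stand-in for the library's ModelType enum.
    UNDEFINED = auto()
    LLM = auto()


class FakeModelClient:
    """Hypothetical client: the client, not the Generator, owns model_type."""

    model_type = ModelType.LLM


class Generator:
    def __init__(self, model_client, name: Optional[str] = None):
        self.name = name or self.__class__.__name__
        # Read model_type from the client rather than taking it as a
        # constructor argument, so each client controls its own endpoint family.
        self.model_type = model_client.model_type
```

This keeps the Generator's signature unchanged and makes the client the single source of truth for what kind of model it talks to.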

@@ -133,6 +124,7 @@ def __init__(
         CallbackManager.__init__(self)

         self.name = name or self.__class__.__name__
+        self.model_type = model_type  # Use the passed model_type instead of getting from client
Member:
delete this
