Commit

Update dev-requirements.txt
markbackman committed Dec 21, 2024
1 parent 5216d41 commit c170639
Showing 3 changed files with 13 additions and 7 deletions.
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -23,6 +23,11 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 and initialize it to start the flow.
 - All examples have been updated to align with the API changes.
 
+### Fixed
+
+- Fixed an issue where importing the Flows module would require OpenAI,
+  Anthropic, and Google LLM modules.
+
 ## [0.0.9] - 2024-12-08
 
 ### Changed
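
For context on the fix noted in the changelog entry above: before this change, importing the Flows module pulled in all three provider services, so an environment with only one provider installed failed at import time. The snippet below is an illustrative sketch, not code from this commit; the pipecat import path and the FlowManager import are assumptions about typical usage, and only the one provider actually used needs to be installed.

# Illustrative only: with this fix, importing pipecat_flows no longer requires
# the Anthropic or Google SDKs when only OpenAI is installed.
from pipecat.services.openai import OpenAILLMService  # assumed import path; the one provider installed
from pipecat_flows import FlowManager  # no longer imports the other providers at import time
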
3 changes: 2 additions & 1 deletion dev-requirements.txt
@@ -4,4 +4,5 @@ pytest~=8.3.2
 pytest-asyncio~=0.23.5
 pytest-cov~=4.1.0
 ruff~=0.6.7
-setuptools~=72.2.0
+setuptools~=72.2.0
+python-dotenv~=1.0.1
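
python-dotenv, added to the dev requirements here, is typically used to load provider API keys from a local .env file when running examples or tests. A minimal usage sketch follows; the environment variable name is hypothetical, not taken from this repository.

import os

from dotenv import load_dotenv  # provided by the python-dotenv package added above

load_dotenv()  # read key=value pairs from a .env file in the working directory, if present
openai_api_key = os.getenv("OPENAI_API_KEY")  # hypothetical variable name, for illustration
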
12 changes: 6 additions & 6 deletions src/pipecat_flows/adapters.py
@@ -282,8 +282,8 @@ def create_adapter(llm) -> LLMAdapter:
         if isinstance(llm, OpenAILLMService):
             logger.debug("Creating OpenAI adapter")
             return OpenAIAdapter()
-    except ImportError:
-        pass
+    except ImportError as e:
+        logger.debug(f"OpenAI import failed: {e}")
 
     # Try Anthropic
     try:
@@ -292,8 +292,8 @@ def create_adapter(llm) -> LLMAdapter:
         if isinstance(llm, AnthropicLLMService):
             logger.debug("Creating Anthropic adapter")
             return AnthropicAdapter()
-    except ImportError:
-        pass
+    except ImportError as e:
+        logger.debug(f"Anthropic import failed: {e}")
 
     # Try Google
     try:
@@ -302,8 +302,8 @@ def create_adapter(llm) -> LLMAdapter:
         if isinstance(llm, GoogleLLMService):
             logger.debug("Creating Google adapter")
             return GeminiAdapter()
-    except ImportError:
-        pass
+    except ImportError as e:
+        logger.debug(f"Google import failed: {e}")
 
     # If we get here, either the LLM type is not supported or the required dependency is not installed
     llm_type = type(llm).__name__
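
The three hunks above apply the same change to each provider branch of create_adapter: each provider's service class is imported lazily inside a try block, and an ImportError is now logged at debug level instead of being silently swallowed, which makes a missing optional dependency easier to diagnose. Below is a self-contained sketch of that pattern; the base class, the adapter class, the final error handling, and the pipecat import path are simplified stand-ins rather than the module's actual definitions.

import logging

logger = logging.getLogger(__name__)


class LLMAdapter:  # stand-in base class, for illustration only
    pass


class OpenAIAdapter(LLMAdapter):  # stand-in adapter, for illustration only
    pass


def create_adapter(llm) -> LLMAdapter:
    # Try OpenAI: the provider import happens lazily, inside the function,
    # so merely importing this module does not require the OpenAI dependency.
    try:
        from pipecat.services.openai import OpenAILLMService  # assumed optional-dependency path

        if isinstance(llm, OpenAILLMService):
            logger.debug("Creating OpenAI adapter")
            return OpenAIAdapter()
    except ImportError as e:
        # The change in this commit: record why the branch was skipped
        # instead of passing silently.
        logger.debug(f"OpenAI import failed: {e}")

    # ...the Anthropic and Google branches follow the same pattern...
    raise ValueError(f"Unsupported LLM type or missing dependency: {type(llm).__name__}")
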
