Fleshed out MetricsFrames and their various types to enforce better #468
Changes from 2 commits
@@ -0,0 +1,34 @@ (new file)

```python
from typing import Optional

from pydantic import BaseModel
```

> **Review comment:** Since this is a new module, we need an empty `__init__.py`.
```python
class MetricsData(BaseModel):
    processor: str


class TTFBMetricsData(MetricsData):
    value: float
    model: Optional[str]
```

> **Review comment** (on `model`): Maybe we can initialize this to `None`.
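For context, the reviewer's suggestion would amount to giving the optional field an explicit `None` default. A minimal sketch (the class names mirror the diff; the processor string is a made-up placeholder, and in Pydantic v2 the explicit default is required, while v1 implied it for `Optional` fields):

```python
from typing import Optional

from pydantic import BaseModel


class MetricsData(BaseModel):
    processor: str


# Sketch of the suggestion: an explicit default lets callers omit "model".
class TTFBMetricsData(MetricsData):
    value: float
    model: Optional[str] = None


# "SomeTTFBProcessor" is a hypothetical name, not a pipecat identifier.
ttfb = TTFBMetricsData(processor="SomeTTFBProcessor", value=0.42)
print(ttfb.model)  # None
```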
```python
class ProcessingMetricsData(MetricsData):
    value: float
    model: Optional[str]
```

> **Review comment** (on `model`): Same as above.

```python
class LLMUsageMetricsData(MetricsData):
    model: str
    prompt_tokens: int
    completion_tokens: int
    total_tokens: int


class CacheUsageMetricsData(LLMUsageMetricsData):
    cache_read_input_tokens: int
    cache_creation_input_tokens: int


class TTSUsageMetricsData(MetricsData):
    processor: str
    model: Optional[str]
    value: int
```
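To illustrate what these typed payloads buy over plain dicts, here is a hedged sketch: the classes are copied locally from the diff, and the processor and model strings are made-up placeholders, not pipecat identifiers.

```python
from typing import Optional

from pydantic import BaseModel


# Local copies of two classes from the diff, for illustration only.
class MetricsData(BaseModel):
    processor: str


class LLMUsageMetricsData(MetricsData):
    model: str
    prompt_tokens: int
    completion_tokens: int
    total_tokens: int


# Construction validates field types, and serialization is uniform
# across all MetricsData subclasses.
usage = LLMUsageMetricsData(
    processor="ExampleLLMService",  # hypothetical name
    model="example-model",          # hypothetical name
    prompt_tokens=120,
    completion_tokens=30,
    total_tokens=150,
)
payload = usage.dict()  # .model_dump() on Pydantic v2
print(payload["total_tokens"])  # 150
```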
Second file:
@@ -11,6 +11,7 @@

```diff
 from abc import abstractmethod
 from typing import AsyncGenerator, List, Optional, Tuple

+from attr import has
 from pipecat.frames.frames import (
     AudioRawFrame,
     CancelFrame,
```

> **Review comment** (on `from attr import has`): note to self: remove (not sure why this got added).
@@ -497,7 +498,7 @@ async def process_frame(self, frame: Frame, direction: FrameDirection):

```diff
         await self.push_frame(frame, direction)
         await self.start_processing_metrics()
         await self.process_generator(self.run_image_gen(frame.text))
-        await self.stop_processing_metrics()
+        await self.stop_processing_metrics(self._model if hasattr(self, "_model") else None)
     else:
         await self.push_frame(frame, direction)
```
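A side note on the guard used above: `obj._model if hasattr(obj, "_model") else None` can be written more compactly with `getattr`'s default argument. A minimal sketch with hypothetical stand-in classes (not pipecat types):

```python
class WithModel:
    def __init__(self):
        self._model = "example-model"  # hypothetical value


class WithoutModel:
    pass


def model_or_none(obj):
    # Equivalent to: obj._model if hasattr(obj, "_model") else None
    return getattr(obj, "_model", None)


print(model_or_none(WithModel()))     # example-model
print(model_or_none(WithoutModel()))  # None
```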
@@ -519,6 +520,6 @@ async def process_frame(self, frame: Frame, direction: FrameDirection):

```diff
         if isinstance(frame, VisionImageRawFrame):
             await self.start_processing_metrics()
             await self.process_generator(self.run_vision(frame))
-            await self.stop_processing_metrics()
+            await self.stop_processing_metrics(self._model if hasattr(self, "_model") else None)
         else:
             await self.push_frame(frame, direction)
```
> **Review comment:** I had the same changes from autopep8, and I think they broke something: with this change, the output gets messed up with carriage returns. What I do before committing is revert just these changes. Very annoying, but I haven't found a better way.
> **Review comment:** This one actually seems right to me (or almost right; it's wrapping at 108 😕). Shouldn't it wrap at 100 chars? Meanwhile, I agree: something is up with the formatter, because it seems inconsistent.
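If the project is meant to wrap at 100 characters, one way to make autopep8 behave consistently is to pin the limit in configuration instead of relying on defaults. A sketch, assuming autopep8's documented `--max-line-length` option and its `[tool.autopep8]` section in `pyproject.toml`:

```toml
# pyproject.toml (sketch; section name per autopep8's docs)
[tool.autopep8]
max_line_length = 100
```

The equivalent one-off command would be `autopep8 --in-place --max-line-length 100 <file>`; whether this also cures the carriage-return issue mentioned above is untested.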