
Implement BetterTransformer alternative #278

Closed
wants to merge 3 commits

Conversation

chainyo
Contributor

@chainyo chainyo commented Oct 13, 2023

This PR isn't meant to be merged as-is; it illustrates how BetterTransformer could replace faster-whisper for transcription optimization. Before it can be merged, it needs more work on:

  • Handle extra languages
  • Make the transcription batch_size adaptive based on the detected hardware
  • Use the vocab with the new model implementation
  • Handle multi-GPU contexts
  • Rewrite the config and environment variables
  • Remove unused API parameters that were specific to faster-whisper
  • Update the VadService to avoid using faster-whisper
  • Remove faster-whisper from the dependencies
  • Fix the multi-channel and live implementations
  • Update the schemas in the model.py file
  • Remove unused imports (a simple lint command should work)
  • Fix the tests
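
As one illustration of the adaptive batch_size item above, a minimal sketch of a hardware-aware helper. The function name, thresholds, and returned sizes are assumptions for illustration only; they are not part of this PR:

```python
from typing import Optional


def pick_batch_size(total_vram_gb: Optional[float]) -> int:
    """Pick a transcription batch size from the detected GPU memory.

    Hypothetical helper: the VRAM thresholds and batch sizes below are
    illustrative assumptions, not values taken from the PR. In practice
    the available memory could come from e.g. torch.cuda.mem_get_info().
    """
    if total_vram_gb is None:  # no GPU detected, fall back to CPU
        return 1
    if total_vram_gb >= 24:
        return 32
    if total_vram_gb >= 12:
        return 16
    if total_vram_gb >= 8:
        return 8
    return 4
```

A helper like this would let the service choose a sensible default at startup instead of hard-coding a batch size tuned for one machine.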

@chainyo chainyo added api Everything related to the API implementation transcription Everything related to the transcription part labels Oct 13, 2023
@chainyo chainyo linked an issue Oct 13, 2023 that may be closed by this pull request
2 participants