Pull requests: huggingface/transformers
#34870 Update the Python version in the Chinese README to match the English README (opened Nov 22, 2024 by vansin)
#34868 Skip eetq test if it attempts to import shard_checkpoint (opened Nov 21, 2024 by MekkCyber)
#34866 Fix support for image processors modifications in modular (opened Nov 21, 2024 by yonigozlan)
#34865 Skipping aqlm non working inference tests till fix merged (opened Nov 21, 2024 by MekkCyber)
#34858 🧹 Remove deprecated RotaryEmbedding parts in the Attention layers (opened Nov 21, 2024 by Cyrilvallez)
#34857 smol improvements to support more flexible usage (opened Nov 21, 2024 by andimarafioti)
#34856 Remove quantization related config from dequantized model (opened Nov 21, 2024 by konradkalita)
#34854 [CI] Skip EETQ tests while package is broken with latest transformers (opened Nov 21, 2024 by BenjaminBossan)
#34853 Grounding DINO Processor standardization [Processing, run-slow, Vision] (opened Nov 21, 2024 by qubvel)
#34852 Fix torch.onnx.export of Qwen2-VL vision encoder [Multimodal, ONNX, run-slow, Vision] (opened Nov 21, 2024 by xenova)
#34845 Add Flex Attention for Mistral along with refactoring (opened Nov 21, 2024 by OmarManzoor)
#34844 Comments update for better reading (opened Nov 21, 2024 by JohannFaust666)
#34836 Add optimized PixtralImageProcessorFast [Multimodal, optimization, Processing, Vision] (opened Nov 20, 2024 by mgoin)
#34828 Tiny typos in gemma2_modular.py after flex_attention introduction (opened Nov 20, 2024 by MekkCyber)