oobabooga
5b8da154b7
Update llama.cpp
2026-03-24 09:34:59 -07:00
oobabooga
843de8b8a8
Update exllamav3 to 0.0.26
2026-03-19 18:49:36 -07:00
oobabooga
0f5053c0fb
requirements: Update pymupdf
2026-03-17 17:59:06 -07:00
oobabooga
5992e088fa
Update the custom gradio wheels
2026-03-16 19:34:37 -07:00
oobabooga
44810751de
Update llama.cpp
2026-03-16 06:21:14 -07:00
oobabooga
f8ff7cf99e
Update the custom gradio wheels
2026-03-15 14:12:59 -07:00
oobabooga
c7953fb923
Add ROCm version to portable package filenames
2026-03-14 09:44:37 -07:00
oobabooga
c908ac00d7
Replace html2text with trafilatura for better web content extraction
After this change, a lot of boilerplate is removed from web pages, saving tokens in agentic loops.
2026-03-14 09:29:17 -07:00
oobabooga
cb88066d15
Update llama.cpp
2026-03-13 13:17:41 -07:00
oobabooga
e50b823eee
Update llama.cpp
2026-03-13 06:22:28 -07:00
oobabooga
d0b72c73c0
Update diffusers to 0.37
2026-03-13 03:43:02 -07:00
oobabooga
5ddc1002d2
Update ExLlamaV3 to 0.0.25
2026-03-13 02:40:17 -07:00
oobabooga
bb00d96dc3
Use a new gr.DragDrop element for Sampler priority + update gradio
2026-03-11 19:35:12 -03:00
oobabooga
24977846fb
Update AMD ROCm from 6.4 to 7.2
2026-03-11 13:14:26 -07:00
oobabooga
7a63a56043
Update llama.cpp
2026-03-11 12:53:19 -07:00
oobabooga
15792c3cb8
Update ExLlamaV3 to 0.0.24
2026-03-09 20:31:05 -07:00
oobabooga
aa634c77c0
Update llama.cpp
2026-03-06 21:00:36 -08:00
oobabooga
2beaa4b971
Update llama.cpp
2026-03-06 14:39:35 -08:00
oobabooga
3323dedd08
Update llama.cpp
2026-03-06 06:30:01 -08:00
oobabooga
36dbc4ccce
Remove unused colorama and psutil requirements
2026-03-06 06:28:35 -08:00
oobabooga
0e0e3ceb97
Update the custom gradio wheels
2026-03-06 05:46:08 -08:00
oobabooga
8be444a559
Update the custom gradio wheels
2026-03-05 21:05:15 -08:00
oobabooga
1729fb07b9
Update llama.cpp
2026-03-05 21:04:24 -08:00
oobabooga
2f08dce7b0
Remove ExLlamaV2 backend
- archived upstream: 7dc12af3a8
- replaced by ExLlamaV3, which has much better quantization accuracy
2026-03-05 14:02:13 -08:00
oobabooga
438e59498e
Update ExLlamaV3 to v0.0.23
2026-03-05 10:24:31 -08:00
oobabooga
6a08e79fa5
Update the custom gradio wheels
2026-03-04 18:22:50 -08:00
oobabooga
83cc207ef7
Update the custom gradio wheels
2026-03-04 14:31:18 -08:00
oobabooga
0ffb75de7c
Update Transformers to 5.3.0
2026-03-04 11:12:54 -08:00
oobabooga
22141679e3
Update the custom gradio wheels
2026-03-04 10:01:31 -08:00
oobabooga
f010aa1612
Replace PyPDF2 with pymupdf for PDF text extraction
pymupdf produces cleaner text (e.g. no concatenated words in headers),
handles encrypted and malformed PDFs that PyPDF2 failed on, and
supports non-Latin scripts.
2026-03-04 06:43:37 -08:00
oobabooga
11dc6fdfce
Update the custom gradio wheels
2026-03-04 06:04:33 -08:00
oobabooga
7d42b6900e
Update the custom gradio wheels
2026-03-04 05:47:59 -08:00
oobabooga
c0bff831e3
Update custom gradio wheels
2026-03-03 17:21:18 -08:00
oobabooga
e9f22813e4
Replace gradio with my gradio 4.37.2 fork
2026-03-03 16:51:27 -08:00
dependabot[bot]
3519890c8e
Bump flask-cloudflared from 0.0.14 to 0.0.15 in /requirements/full (#7380)
2026-03-03 21:41:51 -03:00
dependabot[bot]
9c604628a0
Bump flask-cloudflared from 0.0.14 to 0.0.15 in /requirements/portable (#7382)
2026-03-03 21:41:46 -03:00
oobabooga
fbd2acfa19
Remove triton-windows from non-CUDA requirements
2026-03-03 16:16:55 -08:00
oobabooga
5fd79b23d1
Add CUDA 13.1 portable builds
2026-03-03 15:36:41 -08:00
oobabooga
b8fcc8ea32
Update llama.cpp, remove noavx2 builds, add ROCm Windows portable builds
2026-03-03 15:27:19 -08:00
oobabooga
38d0eeefc0
Update dependencies: torch 2.9.1, transformers 5.2, exllamav3 0.0.22, accelerate 1.12, huggingface-hub 1.5
2026-03-03 12:01:02 -08:00
oobabooga
ddd74324fe
Update PyTorch to 2.9.1 and ROCm to 6.4
2026-03-03 11:38:52 -08:00
oobabooga
efc72d5c32
Update Python from 3.11 to 3.13
2026-03-03 11:03:26 -08:00
dependabot[bot]
cae1fef42d
Bump triton-windows in /requirements/full (#7368)
2026-01-14 21:30:59 -03:00
oobabooga
d79cdc614c
Update llama.cpp
2026-01-08 11:24:15 -08:00
oobabooga
332fd40653
Update llama.cpp
2026-01-07 19:06:23 -08:00
dependabot[bot]
50a35b483c
Update bitsandbytes requirement in /requirements/full (#7353)
2026-01-06 15:27:23 -03:00
dependabot[bot]
45fbec0320
Update torchao requirement in /requirements/full (#7356)
2026-01-06 15:27:10 -03:00
oobabooga
b0968ed8b4
Update flash-linear-attention
2026-01-06 10:26:43 -08:00
oobabooga
bb3b7bc197
Update llama.cpp
2026-01-06 10:23:58 -08:00
oobabooga
09d88f91e8
Update llama.cpp
2025-12-19 21:00:13 -08:00