Author | Commit | Message | Date
dependabot[bot] | fcb592885a | Merge 20886dfc3a into aecbc5a8ac | 2026-01-29 22:33:55 +01:00
oobabooga | aecbc5a8ac | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2026-01-28 08:30:28 -08:00
oobabooga | c54e8a2b3d | Try to spawn llama.cpp on port 5001 instead of random port | 2026-01-28 08:23:55 -08:00
oobabooga | dc2bbf1861 | Refactor thinking block detection and add Solar Open support | 2026-01-28 08:21:34 -08:00
dependabot[bot] | 20886dfc3a | Update transformers requirement in /requirements/full | 2026-01-26 21:40:54 +00:00
    Updates the requirements on [transformers](https://github.com/huggingface/transformers) to permit the latest version.
    - [Release notes](https://github.com/huggingface/transformers/releases)
    - [Commits](https://github.com/huggingface/transformers/compare/v4.57.0...v5.0.0)
    (updated-dependencies: dependency-name: transformers, dependency-version: 5.0.0, dependency-type: direct:production)
    Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] | cae1fef42d | Bump triton-windows in /requirements/full (#7368) | 2026-01-14 21:30:59 -03:00
q5sys (JT) | 7493fe7841 | feat: Add a dropdown to save/load user personas (#7367) | 2026-01-14 20:35:08 -03:00
jakubartur | 21b979c02a | Fix code block copy button on HTTP (Clipboard API fallback) (#7358) | 2026-01-14 19:34:21 -03:00
oobabooga | a731861127 | Update README | 2026-01-13 15:38:32 -08:00
oobabooga | d79cdc614c | Update llama.cpp | 2026-01-08 11:24:15 -08:00
oobabooga | 332fd40653 | Update llama.cpp | 2026-01-07 19:06:23 -08:00
dependabot[bot] | 50a35b483c | Update bitsandbytes requirement in /requirements/full (#7353) | 2026-01-06 15:27:23 -03:00
dependabot[bot] | 45fbec0320 | Update torchao requirement in /requirements/full (#7356) | 2026-01-06 15:27:10 -03:00
oobabooga | b0968ed8b4 | Update flash-linear-attention | 2026-01-06 10:26:43 -08:00
oobabooga | 36747cf99c | Lint | 2026-01-06 10:24:34 -08:00
oobabooga | 2fcbadec67 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2026-01-06 10:24:07 -08:00
oobabooga | bb3b7bc197 | Update llama.cpp | 2026-01-06 10:23:58 -08:00
Sergey 'Jin' Bostandzhyan | 6e2c4e9c23 | Fix loading models which have their eos token disabled (#7363) | 2026-01-06 11:31:10 -03:00
oobabooga | a2ed640aa6 | UI: Improved border color for tables + hr | 2025-12-21 15:38:48 -03:00
oobabooga | 1066fe8c21 | UI: Improve table styles (more minimalistic) | 2025-12-21 15:32:02 -03:00
oobabooga | 9530d3a6d8 | UI: Improve hr (horizontal separator) style | 2025-12-21 15:30:54 -03:00
oobabooga | 09d88f91e8 | Update llama.cpp | 2025-12-19 21:00:13 -08:00
oobabooga | 6e8fb0e7b1 | Update llama.cpp | 2025-12-14 13:32:14 -08:00
oobabooga | 9fe40ff90f | Update exllamav3 to 0.0.18 | 2025-12-10 05:37:33 -08:00
oobabooga | 8e762e04b4 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2025-12-09 05:27:43 -08:00
oobabooga | aa16266c38 | Update llama.cpp | 2025-12-09 03:19:23 -08:00
dependabot[bot] | 85269d7fbb | Update safetensors requirement in /requirements/full (#7323) | 2025-12-08 17:58:27 -03:00
dependabot[bot] | c4ebab9b29 | Bump triton-windows in /requirements/full (#7346) | 2025-12-08 17:56:07 -03:00
oobabooga | 502f59d39b | Update diffusers to 0.36 | 2025-12-08 05:08:54 -08:00
oobabooga | e7c8b51fec | Revert "Use flash_attention_2 by default for Transformers models" (reverts commit 85f2df92e9) | 2025-12-07 18:48:41 -08:00
oobabooga | b758059e95 | Revert "Clear the torch cache between sequential image generations" (reverts commit 1ec9f708e5) | 2025-12-07 12:23:19 -08:00
oobabooga | 1ec9f708e5 | Clear the torch cache between sequential image generations | 2025-12-07 11:49:22 -08:00
oobabooga | 3b8369a679 | Update llama.cpp | 2025-12-07 11:18:36 -08:00
oobabooga | 058e78411d | docs: Small changes | 2025-12-07 10:16:08 -08:00
oobabooga | 17bd8d10f0 | Update exllamav3 to 0.0.17 | 2025-12-07 09:37:18 -08:00
oobabooga | 85f2df92e9 | Use flash_attention_2 by default for Transformers models | 2025-12-07 06:56:58 -08:00
oobabooga | 1762312fb4 | Use random instead of np.random for image seeds (makes it work on Windows) | 2025-12-06 20:10:32 -08:00
oobabooga | 160a25165a | docs: Small change | 2025-12-06 08:41:12 -08:00
oobabooga | f93cc4b5c3 | Add an API example to the image generation tutorial | 2025-12-06 08:33:06 -08:00
oobabooga | c026dbaf64 | Fix API requests always returning the same 'created' time | 2025-12-06 08:23:21 -08:00
oobabooga | 194e4c285f | Update llama.cpp | 2025-12-06 08:14:48 -08:00
oobabooga | 1c36559e2b | Add a News section to the README | 2025-12-06 07:05:00 -08:00
oobabooga | 02518a96a9 | Lint | 2025-12-06 06:55:06 -08:00
oobabooga | 0100ad1bd7 | Add user_data/image_outputs to the Gradio allowed paths | 2025-12-06 06:39:30 -08:00
oobabooga | 6411142111 | docs: Small changes | 2025-12-06 06:36:16 -08:00
oobabooga | 455dc06db0 | Serve the original PNG images in the UI instead of webp | 2025-12-06 05:43:00 -08:00
oobabooga | 1a9ed1fe98 | Fix the height of the image output gallery | 2025-12-06 05:21:26 -08:00
oobabooga | 17b12567d8 | docs: Small changes | 2025-12-05 14:15:15 -08:00
oobabooga | e20b2d38ff | docs: Add VRAM measurements for Z-Image-Turbo | 2025-12-05 14:12:08 -08:00
oobabooga | 6ca99910ba | Image: Quantize the text encoder for lower VRAM | 2025-12-05 13:08:46 -08:00