151b552bc3  2025-12-01 18:24:02 -08:00  oobabooga       Decrease the resolution step to allow for 1368
322aab3410  2025-12-01 18:20:47 -08:00  oobabooga       Increase the image_steps maximum
f46f49e26c  2025-12-01 18:18:15 -08:00  oobabooga       Initial Qwen-Image support
225b8c326b  2025-12-01 17:13:16 -08:00  oobabooga       Try to not break portable builds
5fb1380ac1  2025-12-01 17:09:32 -08:00  oobabooga       Handle URLs like https://huggingface.co/Qwen/Qwen-Image
7dfb6e9c57  2025-12-01 17:05:42 -08:00  oobabooga       Add quantization options (bnb and quanto)
a7808f7f42  2025-12-01 16:06:02 -08:00  oobabooga       Make filenames always have the same size
748e2e55fd  2025-12-01 15:44:31 -08:00  oobabooga       Add steps/second info to log message
6a7209a842  2025-12-01 15:41:58 -08:00  oobabooga       Add PNG metadata, add pagination to Gallery tab
b4738beaf8  2025-12-01 13:59:10 -08:00  oobabooga       Remove the seed UI element
9b07a83330  2025-12-01 10:51:12 -08:00  oobabooga       Populate the history gallery by default
e301dd231e  2025-12-01 10:49:22 -08:00  oobabooga       Remove some emojis
5b385dc546  2025-12-01 10:48:55 -08:00  oobabooga       Make the image galleries taller
b42192c2b7  2025-12-01 10:43:42 -08:00  oobabooga       Implement settings autosaving
41618cf799  2025-12-01 09:35:22 -08:00  oobabooga       Merge branch 'dev' into image_generation
5327bc9397  2025-11-28 22:48:05 -03:00  oobabooga       Update modules/shared.py
                                                        Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
cecb172d2c  2025-11-27 18:29:32 -08:00  oobabooga       Add the code for 4-bit quantization
742db85de0  2025-11-27 18:23:36 -08:00  oobabooga       Hardcode 8-bit quantization for now
822e74ac97  2025-11-27 18:15:15 -08:00  oobabooga       Lint
30d1f502aa  2025-11-27 16:37:03 -08:00  oobabooga       More informative download message
74eedf6050  2025-11-27 16:28:40 -08:00  oobabooga       Remove the CFG slider
9e33c6bfb7  2025-11-27 15:56:58 -08:00  oobabooga       Add missing files
666816a773  2025-11-27 15:48:53 -08:00  oobabooga       Small fixes
21f992e7f7  2025-11-27 15:42:11 -08:00  oobabooga       Organize the UI
148a5d1e44  2025-11-27 15:32:01 -08:00  oobabooga       Keep things more modular
0adda7a5c5  2025-11-27 14:39:21 -08:00  oobabooga       Lint
aa074409cb  2025-11-27 14:38:50 -08:00  oobabooga       Better events for the dimensions
be799ba8eb  2025-11-27 14:25:49 -08:00  oobabooga       Lint
a873692234  2025-11-27 14:24:35 -08:00  oobabooga       Image generation now functional
2f11b3040d  2025-11-27 13:53:46 -08:00  oobabooga       Add functions
aa63c612de  2025-11-27 13:46:54 -08:00  oobabooga       Progress on model loading
164c6fcdbf  2025-11-27 13:44:07 -08:00  oobabooga       Add the UI structure
4ad2ad468e  2025-11-27 10:10:11 -08:00  oobabooga       Add basic structure
400bb0694b  2025-11-21 16:56:02 -03:00  GodEmperor785   Add slider for --ubatch-size for llama.cpp loader, change defaults for better MoE performance (#7316)
8f0048663d  2025-11-21 07:09:16 -08:00  oobabooga       More modular HTML generator
0d4eff284c  2025-11-19 05:23:43 -08:00  oobabooga       Add a --cpu-moe model for llama.cpp
6871484398  2025-10-28 16:48:04 -03:00  Trenten Miller  fix: Rename 'evaluation_strategy' to 'eval_strategy' in training
a156ebbf76  2025-10-15 13:15:01 -07:00  oobabooga       Lint
c871d9cdbd  2025-10-15 13:05:41 -07:00  oobabooga       Revert "Same as 7f06aec3a1 but for exllamav3_hf"
                                                        This reverts commit deb37b821b.
b5a6904c4a  2025-10-14 20:47:01 -07:00  oobabooga       Make --trust-remote-code immutable from the UI/API
308e726e11  2025-10-12 23:00:11 -03:00  mamei16         log error when llama-server request exceeds context size (#7263)
655c3e86e3  2025-10-11 17:00:25 -07:00  oobabooga       Fix "continue" missing an initial space in chat-instruct/chat modes
c7dd920dc8  2025-10-11 14:12:05 -07:00  oobabooga       Fix metadata leaking into branched chats
78ff21d512  2025-10-10 15:21:08 -07:00  oobabooga       Organize the --help message
0d03813e98  2025-10-09 21:01:13 -03:00  oobabooga       Update modules/chat.py
                                                        Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
deb37b821b  2025-10-09 13:02:38 -07:00  oobabooga       Same as 7f06aec3a1 but for exllamav3_hf
7f06aec3a1  2025-10-09 11:24:25 -07:00  oobabooga       exllamav3: Implement the logits function for /v1/internal/logits
218dc01b51  2025-10-09 10:59:34 -07:00  oobabooga       Add fallbacks after 93aa7b3ed3
282aa19189  2025-10-09 09:26:35 -07:00  oobabooga       Safer profile picture uploading
93aa7b3ed3  2025-10-09 08:49:44 -07:00  oobabooga       Better handle multigpu setups with transformers + bitsandbytes