oobabooga | ffef3c7b1d | Image: Make the LLM Variations prompt configurable | 2025-12-04 10:44:35 -08:00
oobabooga | 2793153717 | Image: Add LLM-generated prompt variations | 2025-12-04 08:10:24 -08:00
oobabooga | 9d07d3a229 | Make portable builds functional again after b3666e140d | 2025-12-02 10:06:57 -08:00
oobabooga | b3666e140d | Add image generation support (#7328) | 2025-12-02 14:55:38 -03:00
GodEmperor785 | 400bb0694b | Add slider for --ubatch-size for llama.cpp loader, change defaults for better MoE performance (#7316) | 2025-11-21 16:56:02 -03:00
oobabooga | 0d4eff284c | Add a --cpu-moe option for llama.cpp | 2025-11-19 05:23:43 -08:00
oobabooga | b5a6904c4a | Make --trust-remote-code immutable from the UI/API | 2025-10-14 20:47:01 -07:00
oobabooga | 13876a1ee8 | llama.cpp: Remove the --flash-attn flag (it's always on now) | 2025-08-30 20:28:26 -07:00
oobabooga | dbabe67e77 | ExLlamaV3: Enable the --enable-tp option, add a --tp-backend option | 2025-08-17 13:19:11 -07:00
oobabooga | d86b0ec010 | Add multimodal support (llama.cpp) (#7027) | 2025-08-10 01:27:25 -03:00
oobabooga | 498778b8ac | Add a new 'Reasoning effort' UI element | 2025-08-05 15:19:11 -07:00
oobabooga | 1d1b20bd77 | Remove the --torch-compile option (it doesn't do anything currently) | 2025-07-11 10:51:23 -07:00
oobabooga | 6c2bdda0f0 | Transformers loader: replace use_flash_attention_2/use_eager_attention with a unified attn_implementation | 2025-07-09 18:39:37 -07:00
  Closes #7107
oobabooga | 92ec8dda03 | Fix chat history getting lost if the UI is inactive for a long time (closes #7109) | 2025-07-04 06:04:04 -07:00
oobabooga | 645463b9f0 | Add fallback values for theme colors | 2025-06-19 11:28:12 -07:00
oobabooga | aa44e542cb | Revert "Safer usage of mkdir across the project" | 2025-06-17 07:11:59 -07:00
  This reverts commit 0d1597616f.
oobabooga | 0d1597616f | Safer usage of mkdir across the project | 2025-06-17 07:09:33 -07:00
oobabooga | de24b3bb31 | Merge the Default and Notebook tabs into a single Notebook tab (#7078) | 2025-06-16 13:19:29 -03:00
oobabooga | bc2b0f54e9 | Only save extensions settings on manual save | 2025-06-15 15:53:16 -07:00
oobabooga | 2dee3a66ff | Add an option to include/exclude attachments from previous messages in the chat prompt | 2025-06-12 21:37:18 -07:00
oobabooga | 004fd8316c | Minor changes | 2025-06-11 07:49:51 -07:00
oobabooga | 570d5b8936 | Only save extensions on manual save | 2025-06-11 07:39:49 -07:00
oobabooga | 27140f3563 | Revert "Don't save active extensions through the UI" | 2025-06-11 07:25:27 -07:00
  This reverts commit df98f4b331.
LawnMauer | bc921c66e5 | Load js and css sources in UTF-8 (#7059) | 2025-06-10 22:16:50 -03:00
oobabooga | df98f4b331 | Don't save active extensions through the UI | 2025-06-09 20:28:16 -07:00
  Prevents command-line activated extensions from becoming permanently active due to autosave.
oobabooga | eefbf96f6a | Don't save truncation_length to user_data/settings.yaml | 2025-06-08 22:14:56 -07:00
oobabooga | 0b8d2d65a2 | Minor style improvement | 2025-06-08 18:11:27 -07:00
oobabooga | f81b1540ca | Small style improvements | 2025-06-08 15:19:25 -07:00
oobabooga | 84f66484c5 | Make it optional to convert long pasted content to an attachment | 2025-06-08 09:31:38 -07:00
oobabooga | 6436bf1920 | More UI persistence: presets and characters (#7051) | 2025-06-08 01:58:02 -03:00
oobabooga | 35ed55d18f | UI persistence (#7050) | 2025-06-07 22:46:52 -03:00
oobabooga | 2d263f227d | Fix the chat input reappearing when the page is reloaded | 2025-06-06 22:38:20 -07:00
oobabooga | bb409c926e | Update only the last message during streaming + add back dynamic UI update speed (#7038) | 2025-06-02 09:50:17 -03:00
oobabooga | 298d4719c6 | Multiple small style improvements | 2025-05-30 11:32:24 -07:00
oobabooga | 27641ac182 | UI: Make message editing work the same for user and assistant messages | 2025-05-28 17:23:46 -07:00
oobabooga | 077bbc6b10 | Add web search support (#7023) | 2025-05-28 04:27:28 -03:00
Underscore | 5028480eba | UI: Add footer buttons for editing messages (#7019) | 2025-05-28 00:55:27 -03:00
  Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
Underscore | 355b5f6c8b | UI: Add message version navigation (#6947) | 2025-05-27 22:54:18 -03:00
  Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
Daniel Dengler | c25a381540 | Add a "Branch here" footer button to chat messages (#6967) | 2025-05-20 11:07:40 -03:00
oobabooga | 9ec46b8c44 | Remove the HQQ loader (HQQ models can be loaded through Transformers) | 2025-05-19 09:23:24 -07:00
oobabooga | 126b3a768f | Revert "Dynamic Chat Message UI Update Speed (#6952)" (for now) | 2025-05-18 12:38:36 -07:00
  This reverts commit 8137eb8ef4.
oobabooga | 3fa1a899ae | UI: Fix gpu-layers being ignored (closes #6973) | 2025-05-13 12:07:59 -07:00
mamei16 | 8137eb8ef4 | Dynamic Chat Message UI Update Speed (#6952) | 2025-05-05 18:05:23 -03:00
oobabooga | 475e012ee8 | UI: Improve the light theme colors | 2025-05-05 06:16:29 -07:00
oobabooga | 1dd4aedbe1 | Fix the streaming_llm UI checkbox not being interactive | 2025-04-29 05:28:46 -07:00
oobabooga | d10bded7f8 | UI: Add an enable_thinking option to enable/disable Qwen3 thinking | 2025-04-28 22:37:01 -07:00
oobabooga | 943451284f | Fix the Notebook tab not loading its default prompt | 2025-04-26 18:25:06 -07:00
oobabooga | d9de14d1f7 | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00
oobabooga | d4b1e31c49 | Use --ctx-size to specify the context size for all loaders | 2025-04-25 16:59:03 -07:00
  Old flags are still recognized as alternatives.
oobabooga | d35818f4e1 | UI: Add a collapsible thinking block to messages with <think> steps (#6902) | 2025-04-25 18:02:02 -03:00