oobabooga | 2dee3a66ff | Add an option to include/exclude attachments from previous messages in the chat prompt | 2025-06-12 21:37:18 -07:00
oobabooga | 004fd8316c | Minor changes | 2025-06-11 07:49:51 -07:00
oobabooga | 570d5b8936 | Only save extensions on manual save | 2025-06-11 07:39:49 -07:00
oobabooga | 27140f3563 | Revert "Don't save active extensions through the UI" | 2025-06-11 07:25:27 -07:00
    This reverts commit df98f4b331.
LawnMauer | bc921c66e5 | Load js and css sources in UTF-8 (#7059) | 2025-06-10 22:16:50 -03:00
oobabooga | df98f4b331 | Don't save active extensions through the UI | 2025-06-09 20:28:16 -07:00
    Prevents command-line activated extensions from becoming permanently active due to autosave.
oobabooga | eefbf96f6a | Don't save truncation_length to user_data/settings.yaml | 2025-06-08 22:14:56 -07:00
oobabooga | 0b8d2d65a2 | Minor style improvement | 2025-06-08 18:11:27 -07:00
oobabooga | f81b1540ca | Small style improvements | 2025-06-08 15:19:25 -07:00
oobabooga | 84f66484c5 | Make it optional to paste long pasted content to an attachment | 2025-06-08 09:31:38 -07:00
oobabooga | 6436bf1920 | More UI persistence: presets and characters (#7051) | 2025-06-08 01:58:02 -03:00
oobabooga | 35ed55d18f | UI persistence (#7050) | 2025-06-07 22:46:52 -03:00
oobabooga | 2d263f227d | Fix the chat input reappearing when the page is reloaded | 2025-06-06 22:38:20 -07:00
oobabooga | bb409c926e | Update only the last message during streaming + add back dynamic UI update speed (#7038) | 2025-06-02 09:50:17 -03:00
oobabooga | 298d4719c6 | Multiple small style improvements | 2025-05-30 11:32:24 -07:00
oobabooga | 27641ac182 | UI: Make message editing work the same for user and assistant messages | 2025-05-28 17:23:46 -07:00
oobabooga | 077bbc6b10 | Add web search support (#7023) | 2025-05-28 04:27:28 -03:00
Underscore | 5028480eba | UI: Add footer buttons for editing messages (#7019) | 2025-05-28 00:55:27 -03:00
    Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
Underscore | 355b5f6c8b | UI: Add message version navigation (#6947) | 2025-05-27 22:54:18 -03:00
    Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
Daniel Dengler | c25a381540 | Add a "Branch here" footer button to chat messages (#6967) | 2025-05-20 11:07:40 -03:00
oobabooga | 9ec46b8c44 | Remove the HQQ loader (HQQ models can be loaded through Transformers) | 2025-05-19 09:23:24 -07:00
oobabooga | 126b3a768f | Revert "Dynamic Chat Message UI Update Speed (#6952)" (for now) | 2025-05-18 12:38:36 -07:00
    This reverts commit 8137eb8ef4.
oobabooga | 3fa1a899ae | UI: Fix gpu-layers being ignored (closes #6973) | 2025-05-13 12:07:59 -07:00
mamei16 | 8137eb8ef4 | Dynamic Chat Message UI Update Speed (#6952) | 2025-05-05 18:05:23 -03:00
oobabooga | 475e012ee8 | UI: Improve the light theme colors | 2025-05-05 06:16:29 -07:00
oobabooga | 1dd4aedbe1 | Fix the streaming_llm UI checkbox not being interactive | 2025-04-29 05:28:46 -07:00
oobabooga | d10bded7f8 | UI: Add an enable_thinking option to enable/disable Qwen3 thinking | 2025-04-28 22:37:01 -07:00
oobabooga | 943451284f | Fix the Notebook tab not loading its default prompt | 2025-04-26 18:25:06 -07:00
oobabooga | d9de14d1f7 | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00
oobabooga | d4b1e31c49 | Use --ctx-size to specify the context size for all loaders | 2025-04-25 16:59:03 -07:00
    Old flags are still recognized as alternatives.
oobabooga | d35818f4e1 | UI: Add a collapsible thinking block to messages with <think> steps (#6902) | 2025-04-25 18:02:02 -03:00
|
oobabooga
|
98f4c694b9
|
llama.cpp: Add --extra-flags parameter for passing additional flags to llama-server
|
2025-04-25 07:32:51 -07:00 |
|
oobabooga
|
e99c20bcb0
|
llama.cpp: Add speculative decoding (#6891)
|
2025-04-23 20:10:16 -03:00 |
|
oobabooga
|
ae02ffc605
|
Refactor the transformers loader (#6859)
|
2025-04-20 13:33:47 -03:00 |
|
oobabooga
|
8144e1031e
|
Remove deprecated command-line flags
|
2025-04-18 06:02:28 -07:00 |
|
oobabooga
|
ae54d8faaa
|
New llama.cpp loader (#6846)
|
2025-04-18 09:59:37 -03:00 |
|
oobabooga
|
5bcd2d7ad0
|
Add the top N-sigma sampler (#6796)
|
2025-03-14 16:45:11 -03:00 |
|
oobabooga
|
0360f54ae8
|
UI: add a "Show after" parameter (to use with DeepSeek </think>)
|
2025-02-02 15:30:09 -08:00 |
|
oobabooga
|
a5d64b586d
|
Add a "copy" button below each message (#6654)
|
2025-01-11 16:59:21 -03:00 |
|
oobabooga
|
83c426e96b
|
Organize internals (#6646)
|
2025-01-10 18:04:32 -03:00 |
|
oobabooga
|
7157257c3f
|
Remove the AutoGPTQ loader (#6641)
|
2025-01-08 19:28:56 -03:00 |
|
oobabooga
|
c0f600c887
|
Add a --torch-compile flag for transformers
|
2025-01-05 05:47:00 -08:00 |
|
oobabooga
|
11af199aff
|
Add a "Static KV cache" option for transformers
|
2025-01-04 17:52:57 -08:00 |
|
oobabooga
|
4b3e1b3757
|
UI: add a "Search chats" input field
|
2025-01-02 18:46:40 -08:00 |
|
oobabooga
|
b051e2c161
|
UI: improve a margin for readability
|
2024-12-17 19:58:21 -08:00 |
|
Diner Burger
|
addad3c63e
|
Allow more granular KV cache settings (#6561)
|
2024-12-17 17:43:48 -03:00 |
|
oobabooga
|
c43ee5db11
|
UI: very minor color change
|
2024-12-17 07:59:55 -08:00 |
|
oobabooga
|
d769618591
|
Improved UI (#6575)
|
2024-12-17 00:47:41 -03:00 |
|
oobabooga
|
93c250b9b6
|
Add a UI element for enable_tp
|
2024-10-01 11:16:15 -07:00 |
|
Philipp Emanuel Weidmann
|
301375834e
|
Exclude Top Choices (XTC): A sampler that boosts creativity, breaks writing clichés, and inhibits non-verbatim repetition (#6335)
|
2024-09-27 22:50:12 -03:00 |
|