oobabooga | cc757f6226 | Small style improvements to the chat tab | 2025-06-15 08:32:06 -07:00
oobabooga | b279460a81 | Improve the wpp style | 2025-06-15 08:25:07 -07:00
oobabooga | e8dc7b0ee9 | Bump exllamav3 to 0.0.4 | 2025-06-15 08:15:29 -07:00
oobabooga | 4fc254c1dd | Optimize syntax highlighting on long conversations | 2025-06-15 08:13:13 -07:00
oobabooga | 609c3ac893 | Optimize the end of generation with llama.cpp | 2025-06-15 08:03:27 -07:00
oobabooga | db7d717df7 | Remove images and links from websearch results (This reduces noise a lot) | 2025-06-14 20:00:25 -07:00
oobabooga | e263dbf852 | Improve user input truncation | 2025-06-14 19:43:51 -07:00
oobabooga | 09606a38d3 | Truncate web search results to at most 8192 tokens | 2025-06-14 19:37:32 -07:00
oobabooga | ad0be25c46 | Update llama.cpp | 2025-06-14 15:00:14 -07:00
oobabooga | 7c0225931a | Merge branch 'main' into dev | 2025-06-14 14:59:37 -07:00
oobabooga | 1c1cf09a59 | Update workflows | 2025-06-14 14:52:49 -07:00
oobabooga | 58c3b549ba | Merge branch 'main' into dev | 2025-06-14 10:16:13 -07:00
oobabooga | 8e9c0287aa | UI: Fix edge case where gpu-layers slider maximum is incorrectly limited | 2025-06-14 10:12:11 -07:00
oobabooga | 8e0ef5b419 | Hide the header bar on Ctrl+S | 2025-06-14 09:09:46 -07:00
oobabooga | 1d23159837 | Increase the size of the enlarged character profile picture | 2025-06-14 08:45:59 -07:00
oobabooga | d2da40b0e4 | Remember the last selected chat for each mode/character | 2025-06-14 08:25:00 -07:00
oobabooga | 879fa3d8c4 | Improve the wpp style & simplify the code | 2025-06-14 07:14:22 -07:00
oobabooga | 09eb326486 | Merge README.md changes from dev branch | 2025-06-13 07:46:43 -07:00
oobabooga | dfab11f0b5 | Update README | 2025-06-13 07:45:42 -07:00
oobabooga | 9a2353f97b | Better log message when the user input gets truncated | 2025-06-13 05:44:02 -07:00
oobabooga | 322cd28e24 | Update README | 2025-06-13 01:27:33 -07:00
oobabooga | 7cb650237c | Update the README | 2025-06-13 01:12:52 -07:00
oobabooga | aab28398ef | Update README | 2025-06-13 01:06:44 -07:00
oobabooga | 5ba52967ac | Update README | 2025-06-13 01:04:41 -07:00
oobabooga | b58e80cb99 | Update README | 2025-06-13 01:02:11 -07:00
Miriam | f4f621b215 | Ensure estimated VRAM is updated when switching between different models (#7071) | 2025-06-13 02:56:33 -03:00
oobabooga | f337767f36 | Add error handling for non-llama.cpp models in portable mode | 2025-06-12 22:17:39 -07:00
oobabooga | a25a1fc8d0 | Disable message action icons during streaming for better performance | 2025-06-12 22:01:02 -07:00
oobabooga | 2dee3a66ff | Add an option to include/exclude attachments from previous messages in the chat prompt | 2025-06-12 21:37:18 -07:00
oobabooga | 2cfb77d16f | Merge pull request #7070 from oobabooga/dev (Merge dev branch) | 2025-06-12 12:38:47 -03:00
oobabooga | b4d2a00e20 | Update README | 2025-06-12 08:35:33 -07:00
oobabooga | 9ff5961853 | Merge pull request #7067 from oobabooga/dev (Merge dev branch) | 2025-06-11 11:58:52 -03:00
oobabooga | 9d6a7f1bcf | Minor changes | 2025-06-11 07:55:35 -07:00
oobabooga | 004fd8316c | Minor changes | 2025-06-11 07:49:51 -07:00
oobabooga | 570d5b8936 | Only save extensions on manual save | 2025-06-11 07:39:49 -07:00
oobabooga | 27140f3563 | Revert "Don't save active extensions through the UI" (This reverts commit df98f4b331.) | 2025-06-11 07:25:27 -07:00
oobabooga | 2ebc8ff252 | Merge pull request #7065 from oobabooga/dev (Merge dev branch) | 2025-06-11 01:09:06 -03:00
oobabooga | 13a5288d01 | Fix an error when upgrading from CUDA 12.4 to CUDA 12.8 | 2025-06-10 21:08:18 -07:00
oobabooga | 801db438b0 | Undo changes to portable builds | 2025-06-10 19:55:40 -07:00
oobabooga | 00fbbd6f57 | Undo changes to portable builds | 2025-06-10 19:54:42 -07:00
oobabooga | e8041069e2 | Merge pull request #7064 from oobabooga/dev (Merge dev branch) | 2025-06-10 23:43:10 -03:00
oobabooga | fe0685a742 | New attempt | 2025-06-10 19:42:22 -07:00
oobabooga | 036976aeb8 | Merge pull request #7063 from oobabooga/dev (Merge dev branch) | 2025-06-10 23:35:22 -03:00
oobabooga | 43fc170224 | Fix the Windows workflow | 2025-06-10 19:34:41 -07:00
oobabooga | e9a433832e | Merge pull request #7062 from oobabooga/dev (Merge dev branch) | 2025-06-10 23:26:21 -03:00
oobabooga | a86a5a026e | Fix the GitHub Actions workflows | 2025-06-10 19:25:22 -07:00
oobabooga | 1e96dcf369 | Merge pull request #7057 from oobabooga/dev (Merge dev branch) | 2025-06-10 23:08:44 -03:00
oobabooga | 552cb09f09 | Do not bump Transformers to 4.52 on CUDA 12.8 (Performance is slow, and the older version works fine with torch 2.7.) | 2025-06-10 18:45:42 -07:00
LawnMauer | bc921c66e5 | Load js and css sources in UTF-8 (#7059) | 2025-06-10 22:16:50 -03:00
oobabooga | 4cf39120fc | Fix chat area sometimes not scrolling up to edit message | 2025-06-10 18:03:00 -07:00