Commit graph

4442 commits

Author SHA1 Message Date
oobabooga 041248cc9f Update llama.cpp 2025-05-15 20:10:02 -07:00
oobabooga 5534d01da0 Estimate the VRAM for GGUF models + autoset gpu-layers (#6980) 2025-05-16 00:07:37 -03:00
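Commit 5534d01da0 adds VRAM estimation for GGUF models and automatic selection of `--gpu-layers`. A minimal sketch of the general idea, not the actual implementation: given a per-layer memory estimate and the free VRAM, offload as many layers as fit. The function name, the fixed overhead, and the per-layer figure are all assumptions for illustration.

```python
def autoset_gpu_layers(total_layers, vram_free_mib, per_layer_mib, overhead_mib=500):
    """Pick the largest number of layers whose estimated footprint fits in free VRAM.

    Hypothetical helper: real estimation would derive per-layer size from the
    GGUF metadata (quantization type, hidden size, context length, KV cache).
    """
    usable = max(vram_free_mib - overhead_mib, 0)
    layers = min(total_layers, usable // per_layer_mib)
    return int(layers)

# With 8 GiB free and ~200 MiB per layer, a 32-layer model fits entirely.
print(autoset_gpu_layers(32, 8192, 200))
```

The same routine caps the result at the model's layer count, so small models always get full offload while large ones degrade gracefully.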
oobabooga c4a715fd1e UI: Move the LoRA menu under "Other options" 2025-05-13 20:14:09 -07:00
oobabooga 035cd3e2a9 UI: Hide the extension install menu in portable builds 2025-05-13 20:09:22 -07:00
oobabooga 2826c60044 Use logger for "Output generated in ..." messages 2025-05-13 14:45:46 -07:00
oobabooga 3fa1a899ae UI: Fix gpu-layers being ignored (closes #6973) 2025-05-13 12:07:59 -07:00
oobabooga c375b69413 API: Fix llama.cpp generating after disconnect, improve disconnect detection, fix deadlock on simultaneous requests 2025-05-13 11:23:33 -07:00
oobabooga 62c774bf24 Revert "New attempt" (this reverts commit e7ac06c169) 2025-05-13 06:42:25 -07:00
oobabooga e7ac06c169 New attempt 2025-05-10 19:20:04 -07:00
oobabooga 0c5fa3728e Revert "Fix API failing to cancel streams (attempt), closes #6966" (this reverts commit 006a866079) 2025-05-10 19:12:40 -07:00
oobabooga 006a866079 Fix API failing to cancel streams (attempt), closes #6966 2025-05-10 17:55:48 -07:00
oobabooga 47d4758509 Fix #6970 2025-05-10 17:46:00 -07:00
oobabooga 4920981b14 UI: Remove the typing cursor 2025-05-09 20:35:38 -07:00
oobabooga 8984e95c67 UI: More friendly message when no model is loaded 2025-05-09 07:21:05 -07:00
oobabooga 2bde625d57 Update README 2025-05-09 00:19:25 -07:00
oobabooga 512bc2d0e0 UI: Update some labels 2025-05-08 23:43:55 -07:00
oobabooga f8ef6e09af UI: Make ctx-size a slider 2025-05-08 18:19:04 -07:00
oobabooga bf7e4a4597 Docs: Add a tool/function calling example (from https://github.com/oobabooga/text-generation-webui/pull/6827#issuecomment-2854716960) 2025-05-08 16:12:07 -07:00
oobabooga 9ea2a69210 llama.cpp: Add --no-webui to the llama-server command 2025-05-08 10:41:25 -07:00
oobabooga 3bc2ec2b11 Fix #6965 2025-05-08 10:34:09 -07:00
oobabooga 1c7209a725 Save the chat history periodically during streaming 2025-05-08 09:46:43 -07:00
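Commit 1c7209a725 saves the chat history periodically while tokens stream, rather than only at the end. A minimal sketch of how such saves might be throttled so the history file isn't rewritten on every token; the class, interval value, and callback are hypothetical, not the repository's actual code.

```python
import time

class PeriodicSaver:
    """Throttle a save callback to at most once per interval (assumed 8 s)."""

    def __init__(self, save_fn, interval=8.0):
        self.save_fn = save_fn
        self.interval = interval
        self.last_save = 0.0  # monotonic timestamp of the last save

    def maybe_save(self, history):
        now = time.monotonic()
        if now - self.last_save >= self.interval:
            self.save_fn(history)
            self.last_save = now
            return True
        return False

saved = []
saver = PeriodicSaver(lambda h: saved.append(list(h)))
saver.maybe_save(["hello"])  # first call saves immediately
```

Calling `maybe_save` on every streamed chunk then gives bounded disk I/O while still preserving most of a reply if the process dies mid-generation.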
oobabooga a1b3307b66 Bump llama.cpp 2025-05-08 08:58:43 -07:00
Jonas fa960496d5 Tools support for OpenAI compatible API (#6827) 2025-05-08 12:30:27 -03:00
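Commit fa960496d5 adds tool (function) calling to the OpenAI-compatible API. A minimal sketch of what a tools request body could look like, following the standard OpenAI function-calling schema; the model name and the `get_weather` tool are placeholders, not part of the PR.

```python
import json

def build_tools_request(user_message):
    """Build a hypothetical /v1/chat/completions request body with one tool."""
    return {
        "model": "local-model",  # placeholder; the server uses the loaded model
        "messages": [{"role": "user", "content": user_message}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool for illustration
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }

body = build_tools_request("What's the weather in Paris?")
print(json.dumps(body, indent=2))
```

If the model decides to call the tool, the response would carry a `tool_calls` entry instead of plain text, which the client executes and feeds back as a `tool` role message.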
Scott Z ed6e16191d Docker fix for NVIDIA (#6964) 2025-05-08 12:21:52 -03:00
oobabooga 13a434f351 Bump exllamav3 2025-05-08 08:06:07 -07:00
oobabooga a2ab42d390 UI: Remove the exllamav2 info message 2025-05-08 08:00:38 -07:00
oobabooga 348d4860c2 UI: Create a "Main options" section in the Model tab 2025-05-08 07:58:59 -07:00
oobabooga d2bae7694c UI: Change the ctx-size description 2025-05-08 07:26:23 -07:00
oobabooga b28fa86db6 Default --gpu-layers to 256 2025-05-06 17:51:55 -07:00
oobabooga 760b4dd115 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2025-05-06 14:02:57 -07:00
oobabooga e4fb2475d2 UI: Multiple small style improvements (light/dark themes) 2025-05-06 14:02:15 -07:00
Downtown-Case 5ef564a22e Fix model config loading in shared.py for Python 3.13 (#6961) 2025-05-06 17:03:33 -03:00
oobabooga c4f36db0d8 llama.cpp: remove tfs (it doesn't get used) 2025-05-06 08:41:13 -07:00
oobabooga 05115e42ee Set top_n_sigma before temperature by default 2025-05-06 08:27:21 -07:00
oobabooga 1927afe894 Fix top_n_sigma not showing for llama.cpp 2025-05-06 08:18:49 -07:00
oobabooga 605cc9ab14 Update exllamav3 2025-05-06 06:43:35 -07:00
oobabooga 89590adc14 Update llama.cpp 2025-05-06 06:41:17 -07:00
oobabooga d1c0154d66 llama.cpp: Add top_n_sigma, fix typical_p in sampler priority 2025-05-06 06:38:39 -07:00
oobabooga cbef35054c UI: CSS fix 2025-05-05 17:46:09 -07:00
Evgenii Novikov 4e8f628d3c docker: App uid typo in other docker composes (#6958) 2025-05-05 20:05:15 -03:00
oobabooga 530223bf0b UI: Fix the hover menu colors 2025-05-05 16:03:43 -07:00
oobabooga 76f947e3cf UI: Minor style change 2025-05-05 15:58:29 -07:00
Alireza Ghasemi 99bd66445f SuperboogaV2: minor update to avoid json serialization errors #6945 2025-05-05 19:04:06 -03:00
Evgenii Novikov 987505ead3 docker: Fix app uid typo in cpu docker compose (#6957) 2025-05-05 19:03:33 -03:00
oobabooga 941e0663da Update README 2025-05-05 14:18:16 -07:00
oobabooga f82667f0b4 Remove more multimodal extension references 2025-05-05 14:17:00 -07:00
oobabooga 85bf2e15b9 API: Remove obsolete multimodal extension handling (multimodal support will be added back once it's implemented in llama-server) 2025-05-05 14:14:48 -07:00
mamei16 8137eb8ef4 Dynamic Chat Message UI Update Speed (#6952) 2025-05-05 18:05:23 -03:00
oobabooga 53d8e46502 Ensure environment isolation in portable installs 2025-05-05 12:28:17 -07:00
oobabooga bf5290bc0f Fix the hover menu in light theme 2025-05-05 08:04:12 -07:00