| Name | Last commit message | Last commit date |
| --- | --- | --- |
| grammar | | |
| block_requests.py | | |
| callbacks.py | Refactor the transformers loader (#6859) | 2025-04-20 13:33:47 -03:00 |
| chat.py | Save the chat history right after sending a message | 2025-05-04 18:52:01 -07:00 |
| deepspeed_parameters.py | | |
| evaluate.py | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00 |
| exllamav2.py | Lint | 2025-04-26 19:29:08 -07:00 |
| exllamav2_hf.py | Fix CFG with ExLlamaV2_HF (closes #6937) | 2025-04-30 18:43:45 -07:00 |
| exllamav3_hf.py | ExLlamaV3_HF: Change max_chunk_size to 256 | 2025-05-04 20:37:15 -07:00 |
| extensions.py | | |
| github.py | | |
| gradio_hijack.py | Prevent Gradio from saying 'Thank you for being a Gradio user!' | 2025-04-26 18:14:57 -07:00 |
| html_generator.py | UI: Add padding to only show the last message/reply after sending a message | 2025-05-04 18:13:29 -07:00 |
| llama_cpp_server.py | Minor fix after df7bb0db1f | 2025-05-05 05:00:20 -07:00 |
| loaders.py | UI: Add an enable_thinking option to enable/disable Qwen3 thinking | 2025-04-28 22:37:01 -07:00 |
| logging_colors.py | | |
| logits.py | Fix getting the llama.cpp logprobs for Qwen3-30B-A3B | 2025-04-30 06:48:32 -07:00 |
| LoRA.py | Refactor the transformers loader (#6859) | 2025-04-20 13:33:47 -03:00 |
| metadata_gguf.py | | |
| models.py | Use --ctx-size to specify the context size for all loaders | 2025-04-25 16:59:03 -07:00 |
| models_settings.py | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00 |
| presets.py | Add a --portable flag to hide things in portable mode | 2025-05-02 16:34:29 -07:00 |
| prompts.py | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00 |
| relative_imports.py | | |
| sampler_hijack.py | Fix the exllamav2_HF and exllamav3_HF loaders | 2025-04-21 18:32:23 -07:00 |
| sane_markdown_lists.py | | |
| shared.py | Dynamic Chat Message UI Update Speed (#6952) | 2025-05-05 18:05:23 -03:00 |
| tensorrt_llm.py | Use --ctx-size to specify the context size for all loaders | 2025-04-25 16:59:03 -07:00 |
| text_generation.py | Dynamic Chat Message UI Update Speed (#6952) | 2025-05-05 18:05:23 -03:00 |
| torch_utils.py | Refactor the transformers loader (#6859) | 2025-04-20 13:33:47 -03:00 |
| training.py | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00 |
| transformers_loader.py | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00 |
| ui.py | Dynamic Chat Message UI Update Speed (#6952) | 2025-05-05 18:05:23 -03:00 |
| ui_chat.py | Optimize the Chat tab (#6948) | 2025-05-04 18:58:37 -03:00 |
| ui_default.py | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00 |
| ui_file_saving.py | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00 |
| ui_model_menu.py | Minor fix after df7bb0db1f | 2025-05-05 05:00:20 -07:00 |
| ui_notebook.py | | |
| ui_parameters.py | Dynamic Chat Message UI Update Speed (#6952) | 2025-05-05 18:05:23 -03:00 |
| ui_session.py | Fix saving settings to settings.yaml | 2025-04-26 18:20:00 -07:00 |
| utils.py | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00 |