| File | Last commit message | Last commit date |
| --- | --- | --- |
| AutoGPTQ_loader.py | Add --no_use_cuda_fp16 param for AutoGPTQ | 2023-06-23 12:22:56 -03:00 |
| block_requests.py | Block a cloudfare request | 2023-07-06 22:24:52 -07:00 |
| callbacks.py | Make stop_everything work with non-streamed generation (#2848) | 2023-06-24 11:19:16 -03:00 |
| chat.py | Add "save defaults to settings.yaml" button (#3574) | 2023-08-14 11:46:07 -03:00 |
| ctransformers_model.py | Various ctransformers fixes (#3556) | 2023-08-13 23:09:03 -03:00 |
| deepspeed_parameters.py | Fix typo in deepspeed_parameters.py (#3222) | 2023-07-24 11:17:28 -03:00 |
| evaluate.py | Sort some imports | 2023-06-25 01:44:36 -03:00 |
| exllama.py | Credit turboderp | 2023-08-06 13:43:15 -07:00 |
| exllama_hf.py | Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) | 2023-08-06 17:22:48 -03:00 |
| extensions.py | Unify the 3 interface modes (#3554) | 2023-08-13 01:12:15 -03:00 |
| github.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00 |
| GPTQ_loader.py | Remove unused import | 2023-08-10 00:10:14 -05:00 |
| html_generator.py | Fix a CSS conflict | 2023-08-13 19:24:09 -07:00 |
| llama_attn_hijack.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00 |
| llamacpp_hf.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00 |
| llamacpp_model.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00 |
| loaders.py | Various ctransformers fixes (#3556) | 2023-08-13 23:09:03 -03:00 |
| logging_colors.py | Add menus for saving presets/characters/instruction templates/prompts (#2621) | 2023-06-11 12:19:18 -03:00 |
| LoRA.py | Allow --lora to use an absolute path | 2023-08-10 10:03:12 -07:00 |
| models.py | Add ctransformers support (#3313) | 2023-08-11 14:41:33 -03:00 |
| models_settings.py | Refactor everything (#3481) | 2023-08-06 21:49:27 -03:00 |
| monkey_patch_gptq_lora.py | Revert "Remove GPTQ-for-LLaMa monkey patch support" | 2023-08-10 08:39:41 -07:00 |
| presets.py | Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) | 2023-08-06 17:22:48 -03:00 |
| prompts.py | Add "send to" buttons for instruction templates | 2023-08-13 18:35:45 -07:00 |
| relative_imports.py | Add ExLlama+LoRA support (#2756) | 2023-06-19 12:31:24 -03:00 |
| RWKV.py | Add ExLlama support (#2444) | 2023-06-16 20:35:38 -03:00 |
| sampler_hijack.py | Fix: Mirostat fails on models split across multiple GPUs | 2023-08-05 13:45:47 -03:00 |
| shared.py | Add "save defaults to settings.yaml" button (#3574) | 2023-08-14 11:46:07 -03:00 |
| text_generation.py | Add ctransformers support (#3313) | 2023-08-11 14:41:33 -03:00 |
| training.py | Revert "Remove GPTQ-for-LLaMa monkey patch support" | 2023-08-10 08:39:41 -07:00 |
| ui.py | Add "save defaults to settings.yaml" button (#3574) | 2023-08-14 11:46:07 -03:00 |
| ui_chat.py | Add "save defaults to settings.yaml" button (#3574) | 2023-08-14 11:46:07 -03:00 |
| ui_default.py | Add "save defaults to settings.yaml" button (#3574) | 2023-08-14 11:46:07 -03:00 |
| ui_file_saving.py | Unify the 3 interface modes (#3554) | 2023-08-13 01:12:15 -03:00 |
| ui_model_menu.py | Add ctransformers support (#3313) | 2023-08-11 14:41:33 -03:00 |
| ui_notebook.py | Add "save defaults to settings.yaml" button (#3574) | 2023-08-14 11:46:07 -03:00 |
| ui_parameters.py | Add "save defaults to settings.yaml" button (#3574) | 2023-08-14 11:46:07 -03:00 |
| ui_session.py | Add "save defaults to settings.yaml" button (#3574) | 2023-08-14 11:46:07 -03:00 |
| utils.py | Add "send to" buttons for instruction templates | 2023-08-13 18:35:45 -07:00 |