Commit graph

2087 commits

Author SHA1 Message Date
oobabooga 373555c4fb Fix loading some histories (thanks kaiokendev) 2023-07-03 22:19:28 -07:00
Panchovix 10c8c197bf Add Support for Static NTK RoPE scaling for exllama/exllama_hf (#2955) 2023-07-04 01:13:16 -03:00
jllllll 1610d5ffb2 Bump exllama module to 0.0.5 (#2993) 2023-07-04 00:15:55 -03:00
FartyPants eb6112d5a2 Update server.py - clear LORA after reload (#2952) 2023-07-04 00:13:38 -03:00
oobabooga 7e8340b14d Make greetings appear in --multi-user mode 2023-07-03 20:08:14 -07:00
oobabooga 4b1804a438 Implement sessions + add basic multi-user support (#2991) 2023-07-04 00:03:30 -03:00
FartyPants 1f8cae14f9 Update training.py - correct use of lora_names (#2988) 2023-07-03 17:41:18 -03:00
FartyPants c23c88ee4c Update LoRA.py - avoid potential error (#2953) 2023-07-03 17:40:22 -03:00
FartyPants 33f56fd41d Update models.py to clear LORA names after unload (#2951) 2023-07-03 17:39:06 -03:00
FartyPants 48b11f9c5b Training: added trainable parameters info (#2944) 2023-07-03 17:38:36 -03:00
Turamarth14 847f70b694 Update html_generator.py (#2954) 2023-07-02 01:43:58 -03:00
With version 10.0.0 of Pillow the constant Image.ANTIALIAS has been removed. Instead Image.LANCZOS should be used.
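The Pillow note above amounts to a one-line constant swap. A minimal sketch of that kind of fix, assuming a hypothetical thumbnail helper (the function name and size are illustrative, not taken from html_generator.py):

```python
from PIL import Image

def make_thumbnail(image: Image.Image, size=(300, 300)) -> Image.Image:
    # Image.ANTIALIAS was removed in Pillow 10.0.0; Image.LANCZOS
    # (an alias of Image.Resampling.LANCZOS) is the drop-in replacement.
    image.thumbnail(size, Image.LANCZOS)  # previously: Image.ANTIALIAS
    return image
```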
ardfork 3c076c3c80 Disable half2 for ExLlama when using HIP (#2912) 2023-06-29 15:03:16 -03:00
missionfloyd ac0f96e785 Some more character import tweaks. (#2921) 2023-06-29 14:56:25 -03:00
oobabooga 5d2a8b31be Improve Parameters tab UI 2023-06-29 14:33:47 -03:00
oobabooga 79db629665 Minor bug fix 2023-06-29 13:53:06 -03:00
oobabooga 3443219cbc Add repetition penalty range parameter to transformers (#2916) 2023-06-29 13:40:13 -03:00
oobabooga c6cae106e7 Bump llama-cpp-python 2023-06-28 18:14:45 -03:00
oobabooga 20740ab16e Revert "Fix exllama_hf gibbersh above 2048 context, and works >5000 context. (#2913)" 2023-06-28 18:10:34 -03:00
This reverts commit 37a16d23a7.
jllllll 7b048dcf67 Bump exllama module version to 0.0.4 (#2915) 2023-06-28 18:09:58 -03:00
Panchovix 37a16d23a7 Fix exllama_hf gibbersh above 2048 context, and works >5000 context. (#2913) 2023-06-28 12:36:07 -03:00
oobabooga 63770c0643 Update docs/Extensions.md 2023-06-27 22:25:05 -03:00
matatonic da0ea9e0f3 set +landmark, +superhot-8k to 8k length (#2903) 2023-06-27 22:05:52 -03:00
missionfloyd 5008daa0ff Add exception handler to load_checkpoint() (#2904) 2023-06-27 22:00:29 -03:00
oobabooga c95009d2bd Merge remote-tracking branch 'refs/remotes/origin/main' 2023-06-27 18:48:17 -03:00
oobabooga 67a83f3ad9 Use DPM++ 2M Karras for Stable Diffusion 2023-06-27 18:47:35 -03:00
FartyPants ab1998146b Training update - backup the existing adapter before training on top of it (#2902) 2023-06-27 18:24:04 -03:00
Minecrafter20 40bbd53640 Add custom prompt format for SD API pictures (#1964) 2023-06-27 17:49:18 -03:00
missionfloyd cb029cf65f Get SD samplers from API (#2889) 2023-06-27 17:31:54 -03:00
GuizzyQC d7a7f7896b Add SD checkpoint selection in sd_api_pictures (#2872) 2023-06-27 17:29:27 -03:00
oobabooga 7611978f7b Add Community section to README 2023-06-27 13:56:14 -03:00
oobabooga 22d455b072 Add LoRA support to ExLlama_HF 2023-06-26 00:10:33 -03:00
oobabooga b7c627f9a0 Set UI defaults 2023-06-25 22:55:43 -03:00
oobabooga c52290de50 ExLlama with long context (#2875) 2023-06-25 22:49:26 -03:00
oobabooga 9290c6236f Keep ExLlama_HF if already selected 2023-06-25 19:06:28 -03:00
oobabooga 75fd763f99 Fix chat saving issue (closes #2863) 2023-06-25 18:14:57 -03:00
FartyPants 21c189112c Several Training Enhancements (#2868) 2023-06-25 15:34:46 -03:00
oobabooga 95212edf1f Update training.py 2023-06-25 12:13:15 -03:00
oobabooga 1f5ea451c9 Merge remote-tracking branch 'refs/remotes/origin/main' 2023-06-25 02:14:19 -03:00
oobabooga f31281a8de Fix loading instruction templates containing literal '\n' 2023-06-25 02:13:26 -03:00
matatonic 68ae5d8262 more models: +orca_mini (#2859) 2023-06-25 01:54:53 -03:00
oobabooga f0fcd1f697 Sort some imports 2023-06-25 01:44:36 -03:00
oobabooga 365b672531 Minor change to prevent future bugs 2023-06-25 01:38:54 -03:00
oobabooga e6e5f546b8 Reorganize Chat settings tab 2023-06-25 01:10:20 -03:00
matatonic b45baeea41 extensions/openai: Major docs update, fix #2852 (critical bug), minor improvements (#2849) 2023-06-24 22:50:04 -03:00
oobabooga ebfcfa41f2 Update ExLlama.md 2023-06-24 20:25:34 -03:00
jllllll bef67af23c Use pre-compiled python module for ExLlama (#2770) 2023-06-24 20:24:17 -03:00
oobabooga a70a2ac3be Update ExLlama.md 2023-06-24 20:23:01 -03:00
oobabooga b071eb0d4b Clean up the presets (#2854) 2023-06-24 18:41:17 -03:00
oobabooga cec5fb0ef6 Failed attempt at evaluating exllama_hf perplexity 2023-06-24 12:02:25 -03:00
快乐的我531 e356f69b36 Make stop_everything work with non-streamed generation (#2848) 2023-06-24 11:19:16 -03:00