Commit graph

811 commits

Author SHA1 Message Date
oobabooga ed30bd3216 Update bug_report_template.yml 2023-03-13 18:14:54 -03:00
oobabooga aee3b53fb3 Update bug_report_template.yml 2023-03-13 18:14:31 -03:00
oobabooga 7dbc071e5a Delete bug_report.md 2023-03-13 18:09:58 -03:00
oobabooga 69d4b818b7 Create bug_report_template.yml 2023-03-13 18:09:37 -03:00
oobabooga 0a75584706 Create issue templates 2023-03-13 18:07:08 -03:00
oobabooga 7ab45fb54a Merge pull request #296 from luiscosio/patch-1
Fix for issue #282
2023-03-13 14:45:58 -03:00
Luis Cosio 435a69e357 Fix for issue #282
RuntimeError: Tensors must have same number of dimensions: got 3 and 4
2023-03-13 11:41:35 -06:00
oobabooga 66b6971b61 Update README 2023-03-13 12:44:18 -03:00
oobabooga ddea518e0f Document --auto-launch 2023-03-13 12:43:33 -03:00
oobabooga d97bfb8713 Update README.md 2023-03-13 12:39:33 -03:00
oobabooga 372363bc3d Fix GPTQ load_quant call on Windows 2023-03-13 12:07:02 -03:00
oobabooga bdff37f0bb Update README.md 2023-03-13 11:05:51 -03:00
oobabooga b6098e9ccb Merge pull request #275 from stefanhamburger/patch-1
Fix: tuple object does not support item assignment
2023-03-13 11:01:31 -03:00
oobabooga 72757088fa Create FUNDING.yml 2023-03-13 10:55:00 -03:00
oobabooga 0c224cf4f4 Fix GALACTICA (#285) 2023-03-13 10:32:28 -03:00
stefanhamburger 91c2a8e88d Fix: tuple object does not support item assignment 2023-03-13 07:42:09 +01:00
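The `TypeError: 'tuple' object does not support item assignment` that this commit fixes appears whenever code tries to mutate a tuple in place. A minimal sketch of the usual remedy (the variable names and values are illustrative, not taken from the repository):

```python
# Tuples are immutable, so assigning to an index raises TypeError.
settings = ("llama-7b", 0.7, 200)  # illustrative values only

try:
    settings[1] = 0.9  # TypeError: 'tuple' object does not support item assignment
except TypeError:
    # The usual fix: copy into a mutable list, modify, convert back.
    as_list = list(settings)
    as_list[1] = 0.9
    settings = tuple(as_list)

print(settings)  # ('llama-7b', 0.9, 200)
```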
oobabooga 2c4699a7e9 Change a comment 2023-03-13 00:20:02 -03:00
oobabooga 0a7acb3bd9 Remove redundant comments 2023-03-13 00:12:21 -03:00
oobabooga 77294b27dd Use str(Path) instead of os.path.abspath(Path) 2023-03-13 00:08:01 -03:00
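Note that this commit reverses `fda376d9c3` ("Use os.path.abspath() instead of str()") from the previous day, further down the log. The two conversions are not interchangeable, as this sketch shows (the path itself is hypothetical):

```python
import os
from pathlib import Path

p = Path("models") / "llama-7b"  # hypothetical relative path

# str() keeps the path exactly as constructed: relative stays relative.
assert str(p) == os.path.join("models", "llama-7b")

# os.path.abspath() resolves it against the current working directory,
# so the result depends on where the script is launched from.
resolved = os.path.abspath(p)
assert os.path.isabs(resolved)
assert resolved.endswith(str(p))
```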
oobabooga b9e0712b92 Fix Open Assistant 2023-03-12 23:58:25 -03:00
oobabooga 1ddcd4d0ba Clean up silero_tts
This should only be used with --no-stream.

The shared.still_streaming implementation was faulty by design:
output_modifier should never be called when streaming is already over.
2023-03-12 23:42:49 -03:00
oobabooga a95592fc56 Add back a progress indicator to --no-stream 2023-03-12 20:38:40 -03:00
oobabooga d168b6e1f7 Update README.md 2023-03-12 17:54:07 -03:00
oobabooga 54e8f0c31f Update README.md 2023-03-12 16:58:00 -03:00
oobabooga cebe8b390d Remove useless "substring_found" variable 2023-03-12 15:50:38 -03:00
oobabooga 4bcd675ccd Add *Is typing...* to regenerate as well 2023-03-12 15:23:33 -03:00
oobabooga 3375eaece0 Update README 2023-03-12 15:01:32 -03:00
oobabooga c7aa51faa6 Use a list of eos_tokens instead of just a number
This might be the cause of LLaMA ramblings that some people have experienced.
2023-03-12 14:54:58 -03:00
oobabooga 17210ff88f Update README.md 2023-03-12 14:31:24 -03:00
oobabooga 441e993c51 Bump accelerate, RWKV and safetensors 2023-03-12 14:25:14 -03:00
oobabooga d8bea766d7 Merge pull request #192 from xanthousm/main
Add text generation stream status to shared module, use for better TTS with auto-play
2023-03-12 13:40:16 -03:00
oobabooga 4066ab4c0c Reorder the imports 2023-03-12 13:36:18 -03:00
oobabooga 4dc1d8c091 Update README.md 2023-03-12 12:46:53 -03:00
oobabooga 901dcba9b4 Merge pull request #263 from HideLord/main
Fixing compatibility with GPTQ repository
2023-03-12 12:42:08 -03:00
oobabooga fda376d9c3 Use os.path.abspath() instead of str() 2023-03-12 12:41:04 -03:00
HideLord 8403152257 Fixing compatibility with GPTQ repo commit 2f667f7da051967566a5fb0546f8614bcd3a1ccd. Expects string and breaks on 2023-03-12 17:28:15 +02:00
HideLord a27f98dbbc Merge branch 'main' of https://github.com/HideLord/text-generation-webui 2023-03-12 16:51:04 +02:00
oobabooga f3b00dd165 Merge pull request #224 from ItsLogic/llama-bits
Allow users to load 2, 3 and 4 bit llama models
2023-03-12 11:23:50 -03:00
oobabooga 89e9493509 Update README 2023-03-12 11:23:20 -03:00
oobabooga 65dda28c9d Rename --llama-bits to --gptq-bits 2023-03-12 11:19:07 -03:00
oobabooga fed3617f07 Move LLaMA 4-bit into a separate file 2023-03-12 11:12:34 -03:00
oobabooga 0ac562bdba Add a default prompt for OpenAssistant oasst-sft-1-pythia-12b #253 2023-03-12 10:46:16 -03:00
oobabooga 78901d522b Remove unused imports 2023-03-12 08:59:05 -03:00
oobabooga 35c14f31b2 Merge pull request #259 from hieultp/patch-1
Fix typo error in LLaMa prompts
2023-03-12 08:52:02 -03:00
oobabooga 3c25557ef0 Add tqdm to requirements.txt 2023-03-12 08:48:16 -03:00
Phuoc-Hieu Le 781c09235c Fix typo error in script.py 2023-03-12 15:21:50 +07:00
Xan 9276af3561 clean up 2023-03-12 19:06:24 +11:00
Xan b3e10e47c0 Fix merge conflict in text_generation
- Need to update `shared.still_streaming = False` before the final `yield formatted_outputs`, shifted the position of some yields.
2023-03-12 18:56:35 +11:00
Xan d4afed4e44 Fixes and polish
- Change wav naming to be completely unique using timestamp instead of message ID, stops browser using cached audio when new audio is made with the same file name (eg after regenerate or clear history).
- Make the autoplay setting actually disable autoplay.
- Make Settings panel a bit more compact.
- Hide html errors when audio file of chat history is missing.
- Add button to permanently convert TTS history to normal text messages
- Changed the "show message text" toggle to affect the chat history.
2023-03-12 17:56:57 +11:00
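The wav-naming change in the first bullet above rests on a simple property: a timestamp-based name yields a fresh URL every time, so the browser cannot serve cached audio after a regenerate. A minimal sketch, with a hypothetical helper name and output directory:

```python
import time
from pathlib import Path

def unique_wav_name(prefix="msg", out_dir=Path("outputs")):
    """Illustrative: embedding a millisecond timestamp guarantees a fresh
    file name (and URL), defeating the browser's audio cache."""
    return out_dir / f"{prefix}-{int(time.time() * 1000)}.wav"

a = unique_wav_name()
time.sleep(0.002)  # ensure the millisecond clock has advanced
b = unique_wav_name()
assert a != b  # regenerated audio never collides with a cached name
```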
oobabooga ad14f0e499 Fix regenerate (provisory way) 2023-03-12 03:42:29 -03:00