Commit graph

259 commits

Author SHA1 Message Date
oobabooga 18e0ec955e Improve some descriptions in --help 2023-02-23 10:11:58 -03:00
oobabooga c72892835a Don't show *-np models in the list of choices 2023-02-22 11:38:16 -03:00
oobabooga 044b963987 Add stop parameter for flexgen (#105) 2023-02-22 11:23:36 -03:00
oobabooga ea21a22940 Remove redundant preset 2023-02-22 01:01:26 -03:00
oobabooga b8b3d4139c Add --compress-weight parameter 2023-02-22 00:43:21 -03:00
oobabooga eef6fc3cbf Add a preset for FlexGen 2023-02-21 23:33:15 -03:00
oobabooga 311404e258 Reuse disk-cache-dir parameter for flexgen 2023-02-21 22:11:05 -03:00
oobabooga f3c75bbd64 Add --percent flag for flexgen 2023-02-21 22:08:46 -03:00
oobabooga b83f51ee04 Add FlexGen support #92 (experimental) 2023-02-21 21:00:06 -03:00
oobabooga 444cd69c67 Fix regex bug in loading character jsons with special characters 2023-02-20 19:38:19 -03:00
oobabooga d7a738fb7a Load any 13b/20b/30b model in 8-bit mode when no flags are supplied 2023-02-20 15:44:10 -03:00
oobabooga 77846ceef3 Minor change 2023-02-20 15:05:48 -03:00
oobabooga e195377050 Deprecate torch dumps, move to safetensors (they load even faster) 2023-02-20 15:03:19 -03:00
oobabooga 14ffa0b418 Fix line breaks in --chat mode 2023-02-20 13:25:46 -03:00
SillyLossy ded890c378 Escape regexp in message extraction 2023-02-19 12:55:45 +02:00
oobabooga 8c9dd95d55 Print the softprompt metadata when it is loaded 2023-02-19 01:48:23 -03:00
oobabooga f79805f4a4 Change a comment 2023-02-18 22:58:40 -03:00
oobabooga d58544a420 Some minor formatting changes 2023-02-18 11:07:55 -03:00
oobabooga 0dd41e4830 Reorganize the sliders some more 2023-02-17 16:33:27 -03:00
oobabooga 6b9ac2f88e Reorganize the generation parameters 2023-02-17 16:18:01 -03:00
oobabooga 596732a981 The soft prompt length must be considered here too 2023-02-17 12:35:30 -03:00
oobabooga edc0262889 Minor file uploading fixes 2023-02-17 10:27:41 -03:00
oobabooga 243244eeec Attempt at fixing greyed out files on iphone 2023-02-17 10:17:15 -03:00
oobabooga a226f4cddb No change, so reverting 2023-02-17 09:27:17 -03:00
oobabooga 40cb9f63f6 Try making Colab happy (tensorflow warnings) 2023-02-17 09:23:11 -03:00
oobabooga aeddf902ec Make the refresh button prettier 2023-02-16 21:55:20 -03:00
oobabooga 21512e2790 Make the Stop button work more reliably 2023-02-16 21:21:45 -03:00
oobabooga 08805b3374 Force "You" in impersonate too 2023-02-16 13:24:13 -03:00
oobabooga d7db04403f Fix --chat chatbox height 2023-02-16 12:45:05 -03:00
oobabooga 589069e105 Don't regenerate if no message has been sent 2023-02-16 12:32:35 -03:00
oobabooga 405dfbf57c Force your name to be "You" for pygmalion (properly) 2023-02-16 12:16:12 -03:00
oobabooga 7bd2ae05bf Force your name to be "You" for pygmalion (This allows you to customize your displayed name.) 2023-02-15 21:32:53 -03:00
oobabooga 3746d72853 More style fixes 2023-02-15 21:13:12 -03:00
oobabooga 6f213b8c14 Style fix 2023-02-15 20:58:17 -03:00
oobabooga ccf10db60f Move stuff into tabs in chat mode 2023-02-15 20:55:32 -03:00
oobabooga a55e8836f6 Bump gradio version (It looks uglier, but the old one was bugged and unstable.) 2023-02-15 20:20:56 -03:00
oobabooga 0e89ff4b13 Clear the persistent history after clicking on "Clear history" 2023-02-15 16:49:52 -03:00
oobabooga b3bcd2881d Implement regenerate/impersonate the proper way (fixes #78) 2023-02-15 14:39:26 -03:00
oobabooga 5ee9283cae Mention BLIP 2023-02-15 13:53:38 -03:00
oobabooga 8d3b3959e7 Document --picture option 2023-02-15 13:50:18 -03:00
oobabooga 2eea0f4edb Minor change 2023-02-15 12:58:11 -03:00
oobabooga 3c31fa7079 Simplifications 2023-02-15 12:46:11 -03:00
oobabooga 80fbc584f7 Readability 2023-02-15 11:38:44 -03:00
oobabooga b397bea387 Make chat history persistent 2023-02-15 11:30:38 -03:00
oobabooga 7be372829d Set chat prompt size in tokens 2023-02-15 10:18:50 -03:00
oobabooga 8c3ef58e00 Use BLIP directly + some simplifications 2023-02-14 23:55:46 -03:00
SillyLossy a7d98f494a Use BLIP to send a picture to model 2023-02-15 01:38:21 +02:00
oobabooga d910d435cd Consider the softprompt in the maximum prompt length calculation 2023-02-14 12:06:47 -03:00
oobabooga 8b3bb512ef Minor bug fix (soft prompt was being loaded twice) 2023-02-13 23:34:04 -03:00
oobabooga 7739a29524 Some simplifications 2023-02-13 18:48:32 -03:00