Updated Usage (markdown)

AbdBarho 2022-11-26 15:26:32 +01:00
parent 80bee131e0
commit a3786a288f

@@ -24,7 +24,7 @@ By default: `--medvram` is given, which allows you to use this model on a 6GB GPU. However, some features might not be available in this mode. [You can find the full list of cli arguments here.](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Command-Line-Arguments-and-Settings)
### Custom models
Put the weights in the folder `data/StableDiffusion`; you can then change the model from the settings tab.
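For instance, dropping a downloaded checkpoint into place might look like this (a sketch; `model.ckpt` is a placeholder filename, not a file shipped with the repo):

```shell
# Sketch: stage a custom checkpoint where the container expects it.
# "model.ckpt" stands in for whatever weights file you downloaded.
mkdir -p data/StableDiffusion
touch model.ckpt            # stand-in for a real download
mv model.ckpt data/StableDiffusion/
ls data/StableDiffusion
```

After the file is in place, restart the container and pick the model from the settings tab.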
### General Config
There are multiple files in `data/config/auto`, such as `config.json` and `ui-config.json`, which contain additional config for the UI.
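As a rough sketch (the exact key names depend on your UI version, so treat these as assumptions), `ui-config.json` maps individual UI elements to their default values:

```json
{
  "txt2img/Sampling Steps/value": 30,
  "txt2img/Width/value": 512,
  "txt2img/Height/value": 512
}
```

Changes take effect after a restart of the container.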
@@ -34,16 +34,7 @@ put your scripts in `data/config/auto/scripts` and restart the container
### Extensions
You can use the UI to install extensions, or you can put your extensions in `data/config/auto/extensions`. There is also the option to create a script `data/config/auto/startup.sh`, which will be called on container startup, in case you want to install any additional dependencies for your extensions or anything else.
An example of your `startup.sh` might look like this:
```sh
@@ -58,16 +49,5 @@ done
I maintain neither the UI nor the extension; I can't help you.
# `auto-cpu`
CPU instance of the above, some stuff might not work, use at your own risk.
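If you want to try the CPU variant, the container is typically started via its compose profile (a sketch, assuming the repo's standard `docker compose` profile setup):

```shell
# Start the CPU-only service; the profile name is assumed from the repo layout.
docker compose --profile auto-cpu up --build
```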
# `hlky`
By default: `--optimized-turbo` is given, which allows you to use this model on a 6GB GPU. However, some features might not be available in this mode. [You can find the full list of cli arguments here.](https://github.com/sd-webui/stable-diffusion-webui/blob/2236e8b5854092054e2c30edc559006ace53bf96/scripts/webui.py)
# `lstein`
This fork might require a preload to work, see [#72](https://github.com/AbdBarho/stable-diffusion-webui-docker/issues/72)