stable-diffusion-webui-forge/launch.py
layerdiffusion d38e560e42 Implement some rethinking of the LoRA system
1. Add an option that lets users run the UNet in fp8/GGUF while keeping LoRAs in fp16.
2. FP16 LoRAs need no patching at all; LoRAs in other precisions are re-patched only when the LoRA weight changes (a caching sketch follows the file listing below).
3. FP8 UNet + fp16 LoRA is now available in Forge (and, for the moment, essentially only in Forge). This also resolves some “LoRA effect is too subtle” problems.
4. Significantly speed up all GGUF models (in async mode) by using an independent thread (CUDA stream) to dequantize and compute at the same time, even when the low-bit weights are already on the GPU (a stream-overlap sketch follows below).
5. Treat “online LoRA” as a module, similar to ControlLoRA, so that it moves to the GPU together with the model when sampling, achieving both a significant speedup and proper low-VRAM management.
2024-08-19 04:31:59 -07:00
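
Item 4 above overlaps weight dequantization with compute by issuing the dequant work on its own CUDA stream. Below is a minimal sketch of that idea in plain PyTorch; it is not Forge's implementation, and `forward_with_async_dequant`, `dequant`, and the `(qweight, scale)` layer tuples are illustrative placeholders standing in for the real GGUF/fp8 machinery.

import torch
import torch.nn.functional as F


def dequant(qweight, scale):
    # Illustrative stand-in for GGUF/fp8 dequantization (here: int8 -> fp16).
    return qweight.to(torch.float16) * scale


def forward_with_async_dequant(quantized_layers, x):
    """Run linear layers sequentially while the *next* layer's low-bit weights
    are dequantized on a side CUDA stream, so compute and dequant overlap."""
    side_stream = torch.cuda.Stream()

    # Dequantize the first layer's weight ahead of time on the side stream.
    with torch.cuda.stream(side_stream):
        next_weight = dequant(*quantized_layers[0])

    for i in range(len(quantized_layers)):
        # The compute stream must not touch the weight before its dequant finishes.
        torch.cuda.current_stream().wait_stream(side_stream)
        weight = next_weight

        # Start dequantizing the following layer while this one computes.
        if i + 1 < len(quantized_layers):
            with torch.cuda.stream(side_stream):
                next_weight = dequant(*quantized_layers[i + 1])

        x = F.linear(x, weight)

        # Tell the caching allocator this side-stream tensor is also used here,
        # so its memory is not recycled while the matmul is still in flight.
        weight.record_stream(torch.cuda.current_stream())

    return x


if torch.cuda.is_available():
    layers = [
        (torch.randint(-127, 127, (512, 512), dtype=torch.int8, device="cuda"), 0.01)
        for _ in range(8)
    ]
    x = torch.randn(1, 512, dtype=torch.float16, device="cuda")
    y = forward_with_async_dequant(layers, x)

The point of the pattern is that the wait happens on the compute stream rather than as a blocking host-side synchronize, so the GPU can dequantize layer i+1 while it is still multiplying with layer i.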


# import faulthandler
# faulthandler.enable()

from modules import launch_utils

args = launch_utils.args
python = launch_utils.python
git = launch_utils.git
index_url = launch_utils.index_url
dir_repos = launch_utils.dir_repos

commit_hash = launch_utils.commit_hash
git_tag = launch_utils.git_tag

run = launch_utils.run
is_installed = launch_utils.is_installed
repo_dir = launch_utils.repo_dir

run_pip = launch_utils.run_pip
check_run_python = launch_utils.check_run_python
git_clone = launch_utils.git_clone
git_pull_recursive = launch_utils.git_pull_recursive
list_extensions = launch_utils.list_extensions
run_extension_installer = launch_utils.run_extension_installer
prepare_environment = launch_utils.prepare_environment
configure_for_tests = launch_utils.configure_for_tests
start = launch_utils.start


def main():
    if args.dump_sysinfo:
        # Write system info to a file and exit instead of launching the UI.
        filename = launch_utils.dump_sysinfo()
        print(f"Sysinfo saved as {filename}. Exiting...")
        exit(0)

    launch_utils.startup_timer.record("initial startup")

    with launch_utils.startup_timer.subcategory("prepare environment"):
        # Install and verify dependencies unless the user skipped this step.
        if not args.skip_prepare_environment:
            prepare_environment()

    if args.test_server:
        configure_for_tests()

    if args.forge_ref_a1111_home:
        # Optionally reference an existing A1111 checkout given on the command line.
        launch_utils.configure_forge_reference_checkout(args.forge_ref_a1111_home)

    start()


if __name__ == "__main__":
    main()
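
Items 1-3 of the commit message above describe keeping LoRA factors in fp16 while the UNet runs in fp8/GGUF, and re-patching only when the LoRA weight changes. The following is a minimal caching sketch of that rule, assuming "weight" means the per-LoRA strength multiplier; `CachedLoraPatcher` and its fields are hypothetical names, not Forge's API.

import torch


class CachedLoraPatcher:
    """Illustrative only: apply a LoRA delta in fp16 on top of a low-bit base
    weight, and rebuild the patched weight only when the strength changes."""

    def __init__(self, base_weight, lora_up, lora_down):
        self.base_weight = base_weight              # fp8 or dequantized GGUF tensor
        self.lora_up = lora_up.to(torch.float16)    # LoRA factors stay in fp16
        self.lora_down = lora_down.to(torch.float16)
        self._cached_strength = None
        self._cached_weight = None

    def patched_weight(self, strength):
        # Unchanged strength: reuse the previously patched weight, no re-patch.
        if strength == self._cached_strength:
            return self._cached_weight

        # Merge in fp16 so low-bit rounding does not swallow a subtle LoRA.
        delta = strength * (self.lora_up @ self.lora_down)
        self._cached_weight = self.base_weight.to(torch.float16) + delta
        self._cached_strength = strength
        return self._cached_weight

With a cache shaped like this, a LoRA whose strength never changes is patched at most once per weight, and changing the strength in the prompt triggers a single re-patch rather than one per sampling step.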