
Installation Step List for Win10 users

Hoooooooocc opened this issue 1 year ago

This looks like an outstanding project, thanks for your great work. For users like me who are newcomers to the field of artificial intelligence, it is still difficult to deploy. Could you provide a step-by-step installation list for Windows 10 users?

My path "main_v1\app.py" is equivalent to "lollms-webui\app.py". This is one of the big problems I have encountered (when I run "python main_v1\app.py"):

  a few lines about: ' xxx connect : OK '


  Personalities zoo found in your personal space.
  Pulling last personalities zoo ⠇  fatal: unable to access 'https://github.com/ParisNeo/lollms_personalities_zoo.git/': OpenSSL SSL_r
  Models zoo found in your personal space.
  Pulling last Models zoo ⠧  fatal: unable to access 'https://github.com/ParisNeo/models_zoo.git/': Failed to connect to github.com po
  server
  No binding selected        
  [[[ app.py lollms_path ]]]  Global paths configuration Path: global_paths_cfg.yaml
  Personal Configuration Path: D:\Projs\LLM_LoUI\main_v1\configs\configs
  Personal Data Path: D:\Projs\LLM_LoUI\main_v1\configs\data
  Personal Databases Path: D:\Projs\LLM_LoUI\main_v1\configs\discussion_databases   
  Personal Skills Path: D:\Projs\LLM_LoUI\main_v1\configs\skill_databases
  Personal Models Path: D:\Projs\LLM_LoUI\main_v1\configs\models
  Personal Uploads Path: D:\Projs\LLM_LoUI\main_v1\configs\uploads
  Personal Log Path: D:\Projs\LLM_LoUI\main_v1\configs\logs
  Personal outputs Path: D:\Projs\LLM_LoUI\main_v1\configs\outputs
  Bindings Zoo Path: D:\Projs\LLM_LoUI\main_v1\configs\zoos\bindings_zoo
  Personalities Zoo Path: D:\Projs\LLM_LoUI\main_v1\configs\zoos\personalities_zoo
  Personal user infos path: D:\Projs\LLM_LoUI\main_v1\configs\user_infos
  Personal trainers path: D:\Projs\LLM_LoUI\main_v1\configs\trainers
  Personal gptqlora trainer path: D:\Projs\LLM_LoUI\main_v1\configs\trainers\gptqlora
  Personal services path: D:\Projs\LLM_LoUI\main_v1\configs\services
  Personal STT services path: D:\Projs\LLM_LoUI\main_v1\configs\services\stt
  Personal TTS services path: D:\Projs\LLM_LoUI\main_v1\configs\services\tts
  Personal TTI services path: D:\Projs\LLM_LoUI\main_v1\configs\services\tti
  Personal TTM services path: D:\Projs\LLM_LoUI\main_v1\configs\services\ttm
  Applications zoo path: D:\Projs\LLM_LoUI\main_v1\configs\apps_zoo
  Couldn't load personality. Please verify your configuration file at D:\Projs\LLM_LoUI\main_v1\configs\configs or use the next menu t
  Binding returned this exception : The 'lollms' distribution was not found and is required by the application
  Traceback (most recent call last):
    File "D:\Projs\LLM_LoUI\main_v1\lollms\app.py", line 711, in mount_personality
      personality = PersonalityBuilder(self.lollms_paths, self.config, self.model, self, callback=callback).build_personality(id)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "D:\Projs\LLM_LoUI\main_v1\lollms\personality.py", line 5334, in build_personality
      self.personality = AIPersonality(
                         ^^^^^^^^^^^^^^
    File "D:\Projs\LLM_LoUI\main_v1\lollms\personality.py", line 156, in __init__
      self._version = pkg_resources.get_distribution('lollms').version
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "D:\Projs\LLM_LoUI\env\Lib\site-packages\pkg_resources\__init__.py", line 542, in get_distribution
      dist = get_provider(dist)  # type: ignore[assignment]
             ^^^^^^^^^^^^^^^^^^
    File "D:\Projs\LLM_LoUI\env\Lib\site-packages\pkg_resources\__init__.py", line 424, in get_provider
      return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
                                              ^^^^^^^^^^^^^^^^^^^^^^^^^
    File "D:\Projs\LLM_LoUI\env\Lib\site-packages\pkg_resources\__init__.py", line 1062, in require
      needed = self.resolve(parse_requirements(requirements))
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "D:\Projs\LLM_LoUI\env\Lib\site-packages\pkg_resources\__init__.py", line 889, in resolve
      dist = self._resolve_dist(
             ^^^^^^^^^^^^^^^^^^^
    File "D:\Projs\LLM_LoUI\env\Lib\site-packages\pkg_resources\__init__.py", line 930, in _resolve_dist
      raise DistributionNotFound(req, requirers) 
  (remaining output omitted)

Hoooooooocc, Dec 25 '24

Thanks for your message. To install lollms on Windows, you can just use the provided .bat installer. Make sure you have already installed git. Put the .bat file inside an empty folder where you want to install lollms, then run it. It should clone the repo and install the dependencies, etc.
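For reference, a minimal sketch of that route from a Windows command prompt (the installer filename win_install.bat below is an assumption; check the repository for the exact name shipped with your release):

rem sketch only: the installer filename is an assumption, verify it in the repo
mkdir lollms
cd lollms
rem copy the downloaded .bat installer into this folder, then run it:
win_install.bat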

The other way would be to clone the repo with:

git clone --recurse-submodules https://github.com/ParisNeo/lollms-webui.git

Make sure you have Python 3.11 installed, or you can create a conda environment with Python 3.11 and activate it.
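If you go the conda route, a typical sequence would look like this (the environment name lollms is just an example):

conda create -n lollms python=3.11
conda activate lollms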

Install the requirements:

pip install -r requirements.txt
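This should also pull in the lollms core package, whose absence is what produced the "The 'lollms' distribution was not found" error in your log. If it is still missing after installing the requirements, installing it directly may help (assuming the package is published under that name):

pip install lollms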

Then you can run it with:

python app.py

Best regards

ParisNeo, Dec 25 '24