
WeasyPrint could not import some external libraries.

Open bbecausereasonss opened this issue 2 years ago • 11 comments

Win11x64 Python 3.10 Conda

WeasyPrint could not import some external libraries. Please carefully follow the installation steps before reporting an issue: https://doc.courtbouillon.org/weasyprint/stable/first_steps.html#installation https://doc.courtbouillon.org/weasyprint/stable/first_steps.html#troubleshooting


Process SpawnProcess-1:
Traceback (most recent call last):
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\multiprocessing\process.py", line 314, in _bootstrap
    self.run()
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\_subprocess.py", line 76, in subprocess_started
    target(sockets=sockets)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\server.py", line 61, in run
    return asyncio.run(self.serve(sockets=sockets))
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\asyncio\base_events.py", line 649, in run_until_complete
    return future.result()
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\server.py", line 68, in serve
    config.load()
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\config.py", line 473, in load
    self.loaded_app = import_from_string(self.app)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\importer.py", line 21, in import_from_string
    module = importlib.import_module(module_str)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\xxxx\Deep\gpt-researcher\main.py", line 8, in <module>
    from agent.run import WebSocketManager
  File "C:\Users\xxxx\Deep\gpt-researcher\agent\run.py", line 7, in <module>
    from agent.research_agent import ResearchAgent
  File "C:\Users\xxxx\Deep\gpt-researcher\agent\research_agent.py", line 7, in <module>
    from actions.web_scrape import async_browse
  File "C:\Users\xxxx\Deep\gpt-researcher\actions\web_scrape.py", line 23, in <module>
    import processing.text as summary
  File "C:\Users\xxxx\Deep\gpt-researcher\processing\text.py", line 11, in <module>
    from md2pdf.core import md2pdf
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\md2pdf\__init__.py", line 7, in <module>
    from .core import md2pdf # noqa
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\md2pdf\core.py", line 5, in <module>
    from weasyprint import HTML, CSS
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\weasyprint\__init__.py", line 387, in <module>
    from .css import preprocess_stylesheet # noqa isort:skip
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\weasyprint\css\__init__.py", line 25, in <module>
    from . import computed_values, counters, media_queries
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\weasyprint\css\computed_values.py", line 11, in <module>
    from ..text.ffi import ffi, pango, units_to_double
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\weasyprint\text\ffi.py", line 428, in <module>
    gobject = _dlopen(
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\weasyprint\text\ffi.py", line 417, in _dlopen
    return ffi.dlopen(names[0]) # pragma: no cover
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\cffi\api.py", line 150, in dlopen
    lib, function_cache = _make_ffi_library(self, name, flags)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\cffi\api.py", line 832, in _make_ffi_library
    backendlib = _load_backend_lib(backend, libname, flags)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\cffi\api.py", line 827, in _load_backend_lib
    raise OSError(msg)
OSError: cannot load library 'gobject-2.0-0': error 0x7e. Additionally, ctypes.util.find_library() did not manage to locate a library called 'gobject-2.0-0'
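For anyone debugging this on Windows: the OSError comes from cffi failing to `dlopen` the GTK/Pango runtime, so a quick sanity check is to ask `ctypes.util.find_library()` (the same lookup the error message mentions) which of those DLLs Python can actually see. A rough sketch; the first name comes straight from the error above, the rest are libraries WeasyPrint typically loads on Windows, so treat the exact list as an assumption:

```python
# Rough diagnostic: which of the native libraries WeasyPrint dlopen()s
# are discoverable from this Python environment?
import ctypes.util

candidates = [
    "gobject-2.0-0",      # named in the OSError above
    "pango-1.0-0",        # the rest are typical WeasyPrint dependencies
    "pangocairo-1.0-0",   # on Windows -- adjust if your versions differ
    "pangoft2-1.0-0",
    "fontconfig-1",
    "harfbuzz",
]
for name in candidates:
    path = ctypes.util.find_library(name)
    print(f"{name}: {path or 'NOT FOUND'}")
```

If everything prints NOT FOUND, the GTK runtime either isn't installed or its `bin` directory isn't on PATH for the shell that launches uvicorn.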

bbecausereasonss avatar Jul 11 '23 11:07 bbecausereasonss

Once I followed the WeasyPrint installation instructions and restarted my computer, I now get a warning.

(process:21572): GLib-GIO-WARNING **: 07:56:17.463: Unexpectedly, UWP app 'Microsoft.ScreenSketch_11.2305.25.0_x64__8wekyb3d8bbwe' (AUMId 'Microsoft.ScreenSketch_8wekyb3d8bbwe!App') supports 29 extensions but has no verbs

bbecausereasonss avatar Jul 11 '23 11:07 bbecausereasonss

Also.

An error occurred while processing the url https://hbr.org/2023/05/a-new-approach-to-building-your-personal-brand: File is not a zip file

An error occurred while processing the url https://brenits.com/fundamentals-of-developing-a-strong-personal-brand/: Message: 'chromedriver.exe' executable may have wrong permissions. Please see https://chromedriver.chromium.org/home

An error occurred while processing the url https://hbr.org/2022/02/whats-the-point-of-a-personal-brand: Message: unknown error: Chrome failed to start: was killed.
(unknown error: DevToolsActivePort file doesn't exist)
(The process started from chrome location C:\Program Files\Google\Chrome\Application\chrome.exe is no longer running, so ChromeDriver is assuming that Chrome has crashed.)
Stacktrace:
Backtrace:
    GetHandleVerifier [0x00BFA813+48355]
    (No symbol) [0x00B8C4B1]
    (No symbol) [0x00A95358]
    (No symbol) [0x00AB3621]
    (No symbol) [0x00AB0579]
    (No symbol) [0x00AE0C55]
    (No symbol) [0x00AE093C]
    (No symbol) [0x00ADA536]
    (No symbol) [0x00AB82DC]
    (No symbol) [0x00AB93DD]
    GetHandleVerifier [0x00E5AABD+2539405]
    GetHandleVerifier [0x00E9A78F+2800735]
    GetHandleVerifier [0x00E9456C+2775612]
    GetHandleVerifier [0x00C851E0+616112]
    (No symbol) [0x00B95F8C]
    (No symbol) [0x00B92328]
    (No symbol) [0x00B9240B]
    (No symbol) [0x00B84FF7]
    BaseThreadInitThunk [0x76817D59+25]
    RtlInitializeExceptionChain [0x77C9B79B+107]
    RtlClearBits [0x77C9B71F+191]

An error occurred while processing the url https://www.coursera.org/learn/personal-branding: Message: unknown error: Chrome failed to start: was killed.
(unknown error: DevToolsActivePort file doesn't exist)
(The process started from chrome location C:\Program Files\Google\Chrome\Application\chrome.exe is no longer running, so ChromeDriver is assuming that Chrome has crashed.)
Stacktrace:
Backtrace:
    GetHandleVerifier [0x00BFA813+48355]
    (No symbol) [0x00B8C4B1]
    (No symbol) [0x00A95358]
    (No symbol) [0x00AB3621]
    (No symbol) [0x00AB0579]
    (No symbol) [0x00AE0C55]
    (No symbol) [0x00AE093C]
    (No symbol) [0x00ADA536]
    (No symbol) [0x00AB82DC]
    (No symbol) [0x00AB93DD]
    GetHandleVerifier [0x00E5AABD+2539405]
    GetHandleVerifier [0x00E9A78F+2800735]
    GetHandleVerifier [0x00E9456C+2775612]
    GetHandleVerifier [0x00C851E0+616112]
    (No symbol) [0x00B95F8C]
    (No symbol) [0x00B92328]
    (No symbol) [0x00B9240B]
    (No symbol) [0x00B84FF7]
    BaseThreadInitThunk [0x76817D59+25]
    RtlInitializeExceptionChain [0x77C9B79B+107]
    RtlClearBits [0x77C9B71F+191]

ERROR: Exception in ASGI application
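The chromedriver failures above ("wrong permissions", "DevToolsActivePort file doesn't exist") are usually a ChromeDriver/Chrome version mismatch or Chrome refusing to start from a background process. As a generic Selenium sketch (not gpt-researcher's actual browser setup), these flags commonly get a headless Chrome session to start:

```python
# Hedged, generic Selenium example -- not the project's actual scraping code.
# These flags are the usual workarounds for the "DevToolsActivePort file
# doesn't exist" crash when Chrome is launched from a service or subprocess.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless=new")           # "--headless" on older Chrome
options.add_argument("--no-sandbox")             # often needed under WSL/containers
options.add_argument("--disable-dev-shm-usage")  # avoid small /dev/shm crashes
options.add_argument("--remote-debugging-port=9222")

driver = webdriver.Chrome(options=options)       # chromedriver must match Chrome's major version
try:
    driver.get("https://example.com")
    print(driver.title)
finally:
    driver.quit()
```

It may also be worth checking that Windows hasn't blocked the downloaded `chromedriver.exe` (right-click → Properties → Unblock) and that its major version matches the installed Chrome.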

bbecausereasonss avatar Jul 11 '23 11:07 bbecausereasonss

Same error here. Windows 11 64-bit, Python 3.11.3.

veramarvin avatar Jul 11 '23 12:07 veramarvin

Yep, was using pip. Will try a Conda install.

bbecausereasonss avatar Jul 11 '23 17:07 bbecausereasonss

Hey @bbecausereasonss, did installing with Conda solve your issue?

assafelovic avatar Jul 12 '23 05:07 assafelovic

Hey @bbecausereasonss, did installing with Conda solve your issue?

Unfortunately not... I installed via `conda install -c conda-forge weasyprint` and I'm still getting the same errors. Might try WSL later...

(gpt-researcher) PS C:\Users\xxxx\Deep\gpt-researcher> uvicorn main:app --reload
INFO:     Will watch for changes in these directories: ['C:\Users\xxxx\Deep\gpt-researcher']
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [20856] using StatReload
Process SpawnProcess-1:
Traceback (most recent call last):
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\multiprocessing\process.py", line 314, in _bootstrap
    self.run()
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\_subprocess.py", line 76, in subprocess_started
    target(sockets=sockets)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\server.py", line 61, in run
    return asyncio.run(self.serve(sockets=sockets))
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\asyncio\base_events.py", line 649, in run_until_complete
    return future.result()
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\server.py", line 68, in serve
    config.load()
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\config.py", line 473, in load
    self.loaded_app = import_from_string(self.app)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\importer.py", line 24, in import_from_string
    raise exc from None
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\uvicorn\importer.py", line 21, in import_from_string
    module = importlib.import_module(module_str)
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\xxxx\Deep\gpt-researcher\main.py", line 8, in <module>
    from agent.run import WebSocketManager
  File "C:\Users\xxxx\Deep\gpt-researcher\agent\run.py", line 7, in <module>
    from agent.research_agent import ResearchAgent
  File "C:\Users\xxxx\Deep\gpt-researcher\agent\research_agent.py", line 7, in <module>
    from actions.web_scrape import async_browse
  File "C:\Users\xxxx\Deep\gpt-researcher\actions\web_scrape.py", line 23, in <module>
    import processing.text as summary
  File "C:\Users\xxxx\Deep\gpt-researcher\processing\text.py", line 11, in <module>
    from md2pdf.core import md2pdf
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\md2pdf\__init__.py", line 7, in <module>
    from .core import md2pdf # noqa
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\md2pdf\core.py", line 5, in <module>
    from weasyprint import HTML, CSS
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\weasyprint\__init__.py", line 388, in <module>
    from .html import ( # noqa isort:skip
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\weasyprint\html.py", line 26, in <module>
    from .images import SVGImage
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\weasyprint\images.py", line 16, in <module>
    from PIL import Image, ImageFile, ImageOps
  File "C:\Users\xxxx\anaconda3\envs\gpt-researcher\lib\site-packages\PIL\Image.py", line 103, in <module>
    from . import _imaging as core
ImportError: DLL load failed while importing _imaging: The specified module could not be found.
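The Pillow `_imaging` failure is usually a mixed install: a pip wheel and a conda-forge build of Pillow (or their native DLLs) ending up in the same environment. A small check to run inside the `gpt-researcher` env, just a sketch:

```python
# Hedged diagnostic: confirm which Pillow this environment imports and
# whether its C extension (_imaging) loads. If the import below fails with
# the same "DLL load failed" error, remove one of the duplicate Pillow
# installs (e.g. keep the conda-forge build and uninstall the pip wheel).
import PIL
print("Pillow version :", PIL.__version__)
print("Imported from  :", PIL.__file__)

from PIL import Image  # this line triggers the "_imaging" import
print("C extension loaded OK")
```

If the printed path points into a pip-installed copy while WeasyPrint came from conda-forge (or vice versa), reinstalling both from the same channel tends to resolve it.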

bbecausereasonss avatar Jul 12 '23 13:07 bbecausereasonss

@bbecausereasonss I had better results after installing WeasyPrint with Homebrew, per another thread (https://github.com/assafelovic/gpt-researcher/issues/25): `brew install weasyprint`

(apologies, not relevant for Windows users)

hunterphillips avatar Jul 12 '23 22:07 hunterphillips

I'm on Windows, using Conda (there is no Homebrew).

bbecausereasonss avatar Jul 12 '23 22:07 bbecausereasonss

I'm using Conda as well and, interestingly, the error I got is different:

OSError: cannot load library 'pangoft2-1.0-0': error 0x7e. Additionally, ctypes.util.find_library() did not manage to locate a library called 'pangoft2-1.0-0'
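`pangoft2-1.0-0` is the same failure mode as the `gobject-2.0-0` error earlier in the thread: the GTK native DLLs aren't on the loader's search path. If the GTK runtime is installed somewhere on disk, one workaround is to add its `bin` folder to the DLL search path before WeasyPrint is imported. A sketch; the path below is an assumption, point it at wherever your GTK `bin` directory actually lives:

```python
# Sketch: make the GTK runtime visible to this process before importing
# WeasyPrint. The path is a guess at a common install location -- adjust it.
import os

gtk_bin = r"C:\Program Files\GTK3-Runtime Win64\bin"  # hypothetical location
if os.path.isdir(gtk_bin):
    os.add_dll_directory(gtk_bin)  # Python 3.8+ on Windows only

from weasyprint import HTML  # should now find gobject/pango/pangoft2
HTML(string="<p>hello</p>").write_pdf("hello.pdf")
```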

I'm able to get this working in WSL, but I'm running into new issues. It thinks the model gpt-4 doesn't exist. This seems to have heaps of bugs; not sure I want to keep fixing these issues:

INFO:     127.0.0.1:36556 - "GET /businessAnalystAgentAvatar.png HTTP/1.1" 404 Not Found
INFO:     ('127.0.0.1', 36562) - "WebSocket /ws" [accepted]
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/uvicorn/protocols/websockets/wsproto_impl.py", line 249, in run_asgi
    result = await self.app(self.scope, self.receive, self.send)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/fastapi/applications.py", line 289, in __call__
    await super().__call__(scope, receive, send)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 149, in __call__
    await self.app(scope, receive, send)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/starlette/routing.py", line 341, in handle
    await self.app(scope, receive, send)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/starlette/routing.py", line 82, in app
    await func(session)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/fastapi/routing.py", line 324, in app
    await dependant.call(**values)
  File "/home/chaoscreater/gpt-researcher/main.py", line 50, in websocket_endpoint
    await manager.start_streaming(task, report_type, agent, websocket)
  File "/home/chaoscreater/gpt-researcher/agent/run.py", line 38, in start_streaming
    report, path = await run_agent(task, report_type, agent, websocket)
  File "/home/chaoscreater/gpt-researcher/agent/run.py", line 50, in run_agent
    await assistant.conduct_research()
  File "/home/chaoscreater/gpt-researcher/agent/research_agent.py", line 137, in conduct_research
    search_queries = await self.create_search_queries()
  File "/home/chaoscreater/gpt-researcher/agent/research_agent.py", line 89, in create_search_queries
    result = await self.call_agent(prompts.generate_search_queries_prompt(self.question))
  File "/home/chaoscreater/gpt-researcher/agent/research_agent.py", line 76, in call_agent
    answer = create_chat_completion(
  File "/home/chaoscreater/gpt-researcher/agent/llm_utils.py", line 48, in create_chat_completion
    response = send_chat_completion_request(
  File "/home/chaoscreater/gpt-researcher/agent/llm_utils.py", line 69, in send_chat_completion_request
    result = openai.ChatCompletion.create(
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/openai/api_requestor.py", line 298, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/openai/api_requestor.py", line 700, in _interpret_response
    self._interpret_response_line(
  File "/home/chaoscreater/.local/lib/python3.10/site-packages/openai/api_requestor.py", line 763, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: The model: `gpt-4` does not exist
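A quick way to tell whether an API key actually has gpt-4 enabled (ChatGPT Plus and API access are separate) is to list the models the key can see. A sketch using the same pre-1.0 `openai` SDK that appears in the traceback:

```python
# List the models this API key can call. If no "gpt-4" entries appear,
# the InvalidRequestError above is expected until OpenAI enables API access.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]
model_ids = sorted(m["id"] for m in openai.Model.list()["data"])
print("\n".join(model_ids))
print("gpt-4 available:", any(m.startswith("gpt-4") for m in model_ids))
```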

chaoscreater avatar Jul 13 '23 07:07 chaoscreater

Hi @chaoscreater, this error simply means you don't have API access to gpt-4 yet. You will have access by the end of the month, as per this post by OpenAI: https://help.openai.com/en/articles/7102672-how-can-i-access-gpt-4. In the meantime, please refer to this post for a workaround: https://github.com/assafelovic/gpt-researcher/issues/14

Hope it helps!

rotemweiss57 avatar Jul 14 '23 04:07 rotemweiss57

Hi @chaoscreater, this error simply means you don't have API access to gpt-4 yet. You will have access by the end of the month, as per this post by OpenAI: https://help.openai.com/en/articles/7102672-how-can-i-access-gpt-4. In the meantime, please refer to this post for a workaround: #14

Hope it helps!

I've had access to GPT-4 for ages. I'm on ChatGPT Plus, I'm using an OpenAI API key, and all my other apps and projects use GPT-4 just fine. I even have access to the latest Code Interpreter plugin.

Anyway, I finally got it working. I had to change the model to gpt-3.5 and tweak the prompt as per here: https://github.com/assafelovic/gpt-researcher/issues/14#issuecomment-1629836842
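For anyone who wants to keep gpt-4 as the default but not crash when the key can't use it, the same model swap can be done as a fallback. A minimal sketch against the pre-1.0 `openai` SDK; the helper name and structure are illustrative, not gpt-researcher's actual `llm_utils` code:

```python
# Illustrative fallback: try gpt-4, retry with gpt-3.5-turbo when the API
# says the model doesn't exist for this key. Not the project's real code.
import openai

def chat(messages, model="gpt-4", fallback="gpt-3.5-turbo", **kwargs):
    try:
        return openai.ChatCompletion.create(model=model, messages=messages, **kwargs)
    except openai.error.InvalidRequestError as err:
        if "does not exist" in str(err):
            return openai.ChatCompletion.create(model=fallback, messages=messages, **kwargs)
        raise

resp = chat([{"role": "user", "content": "Say hi"}])
print(resp["choices"][0]["message"]["content"])
```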

It kinda sucks doing this in WSL. Some of the errors don't necessarily show up in the WSL shell; I had to set up xrdp to remote into my WSL distro and view the errors from a terminal there. Only then did I find out what the error was, which is how I discovered the gpt-4 model error, etc. I'd prefer to have all of this running locally via Python on Windows.

chaoscreater avatar Jul 15 '23 03:07 chaoscreater