
Fails to parse local repository

Open ripcurlx opened this issue 11 months ago • 4 comments

Hi!

I tried to run the project locally to support the development of an Ollama integration, but I wasn't able to get a local repo parsed with the current functionality.

I followed the guide and the server is running, but every time I try to parse a local repository with:

curl -X POST 'http://localhost:8001/api/v1/parse' \
  -H 'Content-Type: application/json' \
  -d '{
    "repo_path": "[MY_REPO_PATH]",
    "branch_name": "main"
  }'

it returns

curl: (52) Empty reply from server

and the server log is

INFO:root:Development mode enabled. Using Mock Authentication.
INFO:root:Development mode enabled. Using environment variable for API key.
INFO:sentence_transformers.SentenceTransformer:Use pytorch device_name: mps
INFO:sentence_transformers.SentenceTransformer:Load pretrained SentenceTransformer: all-MiniLM-L6-v2
[2025-01-29 12:40:17 +0100] [94464] [ERROR] Worker (pid:95339) was sent SIGABRT!
/Users/[MY_USERNAME]/.pyenv/versions/3.10.16/lib/python3.10/multiprocessing/resource_tracker.py:224: UserWarning: resource_tracker: There appear to be 1 leaked semaphore objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
[2025-01-29 12:40:17 +0100] [95491] [INFO] Booting worker with pid: 95491
INFO:app.celery.celery_app:Connecting to Redis at: redis://127.0.0.1:6379/0
INFO:app.celery.celery_app:Successfully connected to Redis
[nltk_data] Downloading package punkt_tab to /Users/[MY_USERNAME]/Document
[nltk_data]     s/Workspace/Python/potpie/venv/lib/python3.10/site-
[nltk_data]     packages/llama_index/core/_static/nltk_cache...
[nltk_data]   Package punkt_tab is already up-to-date!
INFO:app.celery.tasks.parsing_tasks:Parsing tasks module loaded
INFO:root:Development mode enabled. Skipping Firebase setup.
Dummy user already exists
INFO:root:Dummy user created
[2025-01-29 12:40:21 +0100] [95491] [INFO] Started server process [95491]
[2025-01-29 12:40:21 +0100] [95491] [INFO] Waiting for application startup.
INFO:root:System prompts initialized successfully
[2025-01-29 12:40:22 +0100] [95491] [INFO] Application startup complete.

Do you have any idea what is configured wrong on my side?

Thanks!

ripcurlx avatar Jan 29 '25 11:01 ripcurlx

i have the same issue.

wikando-mz avatar Feb 04 '25 10:02 wikando-mz

@ripcurlx @wikando-mz I was also facing the same issue. If you are on a Mac with an M1/M2/M3 chip, you have to replace the following code in inference_service.py:

self.embedding_model = SentenceTransformer("all-MiniLM-L6-v2")

with

self.embedding_model = SentenceTransformer("all-MiniLM-L6-v2", device='cpu')

This worked for me.
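
In case it helps, here is a minimal, untested sketch (not code from the repo) of how the device could be chosen automatically instead of hard-coding 'cpu'; pick_embedding_device is a hypothetical helper, and the CUDA branch is only there for completeness:

import torch
from sentence_transformers import SentenceTransformer

def pick_embedding_device() -> str:
    # Prefer CUDA when available; otherwise stay on CPU.
    # MPS is skipped on purpose, since it is what crashes the worker (SIGABRT) here.
    return "cuda" if torch.cuda.is_available() else "cpu"

embedding_model = SentenceTransformer("all-MiniLM-L6-v2", device=pick_embedding_device())
print(embedding_model.encode(["hello world"]).shape)  # (1, 384)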

chikka avatar Feb 07 '25 14:02 chikka

Hi @ripcurlx, did you try the solution from @chikka above? I was not able to reproduce your issue. Can you provide more details about your machine?

dhirenmathur avatar Feb 11 '25 05:02 dhirenmathur

I was getting the same issue, but tried @chikka's solution. Parsing is working for me now.

I'm on an M3 Mac.

stevengritz avatar Feb 13 '25 19:02 stevengritz