SerpBear spends credits in ScrapingRobot, but no results are shown in the SerpBear dashboard
Hi,
I'm using SerpBear installed on my cloud server, with a free ScrapingRobot account. I added a keyword and waited a few minutes; in ScrapingRobot I can see that credits have been spent, but in the SerpBear dashboard I do not see any data for the keywords I added.
Can you please check the server log and see if there are any errors of any kind?
I will, no problem.
[0] Listening on port 3000
[0] GET /api/settings
[0] GET /api/domains?withstats=true
[0] GET /api/settings
[0] [ERROR] Getting App Settings. [Error: ENOENT: no such file or directory, open '/app/data/failed_queue.json'] {
[0]   errno: -2,
[0]   code: 'ENOENT',
[0]   syscall: 'open',
[0]   path: '/app/data/failed_queue.json'
[0] }
[0] GET /api/domains?withstats=true
[0] domains: 0
[0] GET /api/domains?withstats=true
[0] GET /api/settings
[0] GET /api/settings
[0] GET /api/domains?withstats=true
[0] domains: 0
[0] GET /api/settings
[0] GET /api/settings
[0] GET /api/settings
[0] PUT /api/settings
[0] GET /api/settings
[0] GET /api/settings
[0] GET /api/domains?withstats=true
[0] domains: 0
[0] POST /api/domains
[0] GET /api/domains?withstats=true
[0] domains: 1
[0] GET /api/settings
[0] GET /api/domains
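For context on the ENOENT entry above: a missing failed-queue file usually just means no scrape has failed yet, and queue readers typically treat a missing file as an empty queue rather than an error. Here is a minimal defensive sketch of that pattern in Python; the path matches the log, but the `read_failed_queue` helper is hypothetical and is not SerpBear's actual code:

```python
import json
import os


def read_failed_queue(path="/app/data/failed_queue.json"):
    """Return the failed-scrape queue, treating a missing file
    as an empty queue instead of raising ENOENT."""
    if not os.path.exists(path):
        return []
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```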
When you restart the server, does your data get lost? How did you set up SerpBear on your cloud server?
The data was not lost after restart.
Here is what was done, as per the documentation:

docker pull towfiqi/serpbear
docker volume create serpbear_data
docker run -d -p 3000:3000 -v serpbear_data:/opt/serp/data --restart unless-stopped \
  -e NEXT_PUBLIC_APP_URL='http://localhost:3000' -e USER='xxxxx' -e PASSWORD='xxxxx' \
  -e SECRET='xxxxxxx' -e APIKEY='xxxxxxx' --name serpbear towfiqi/serpbear
So the keywords appear fine, but when the keyword is refreshed, the position is not being updated? Can you please try updating the position of a keyword, check the log, and see if there are any errors?
Unfortunately, from the moment I started working with it, I have never seen any keyword positions in the table. Screenshot attached.
Hi. Any update? I also contacted the ScrapingRobot team, and this is their response: We've taken a look at your data from the past 7 days and we see that you sent in 14 scraping requests. All these requests ended in status code 200 responses and would have returned content to you. I've attached the JSON that we receive when running the requests to this target URL - https://www.google.com/search?num=100&hl=he&q=%D7%9E%D7%A7%D7%A8%D7%A8%20%D7%99%D7%99%D7%9F
Here is the API call to our system that retrieves this information: https://api.scrapingrobot.com/?token=YOUR_API_TOKEN&url=https://www.google.com/search?num=100&hl=he&q=%D7%9E%D7%A7%D7%A8%D7%A8%20%D7%99%D7%99%D7%9F
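As a side note, the percent-encoded `q=` value in those URLs is just the UTF-8 encoding of the Hebrew keyword ScrapingRobot received, and the round trip can be verified with Python's standard library:

```python
from urllib.parse import quote, unquote

# The keyword exactly as it appears in ScrapingRobot's logs.
keyword = "מקרר יין"

# UTF-8 percent-encoding; quote() turns the space into %20.
encoded = quote(keyword)
print(encoded)               # %D7%9E%D7%A7%D7%A8%D7%A8%20%D7%99%D7%99%D7%9F
print(unquote(encoded) == keyword)  # True
```

The encoded value matches the `q=` parameter in the API call above, so the request itself carried valid UTF-8.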
Looks like it's an encoding issue. SerpBear is passing the search term as מקרר יין ("wine cooler"), which does not match your keyword. Can you please write down the keyword in Hebrew here? I need the exact Hebrew text to debug this issue.
OK, I used an OCR app to extract the Hebrew text from the screenshot you shared: תוכנה לניהול פרויקטים ("project management software").
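When two keywords render identically but fail to match, comparing their raw code points usually exposes invisible characters (directional marks, non-breaking spaces) that OCR or copy-paste can introduce. A small debugging helper, assuming nothing about SerpBear's internals; the `inspect` function and the tainted sample string are illustrative:

```python
import unicodedata


def inspect(s):
    """List each character with its code point and Unicode name."""
    return [(c, f"U+{ord(c):04X}", unicodedata.name(c, "<unnamed>"))
            for c in s]


clean = "יין"
tainted = "י\u200fין"  # same text with a hidden RIGHT-TO-LEFT MARK inserted

# The two strings render identically but compare unequal:
print(clean == tainted)  # False
print(inspect(tainted))  # the U+200F entry reveals the hidden mark
```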
Just scraped the keyword with Scrapingrobot and it's working fine.
Can you please try removing and adding the keyword again?
Done
So is it working now?
Seems like the same issue. I can see the keyword in the table, but the position is not there. Screenshot attached.
When you are adding the keyword, are you copy-pasting it into the keyword field? If you are pasting it from somewhere, make sure you paste it as "Plain Text" by pressing Ctrl + Shift + V.
No, I typed it manually.
Since I cannot reproduce the issue in my instance of SerpBear, is it possible to send me your SQLite database? If it is, kindly send it to ~removed~
Checking
Attached.
@AviDor Since you replied through email, your attachment was not uploaded to GitHub, and I can't see it. Please reply directly in this thread so that I can download the attachment. You can also send it to me at the email address I shared previously.
Done
I just tried to open the database. It looks like it got corrupted somehow. Can you please zip the database and send it to me again?
I ran tests with your database both locally and with Docker, using ScrapingRobot, and everything seems to work fine. I'm not sure how to recreate the issue. Can you please destroy your SerpBear instance and recreate it?
Done. Same problem.
Maybe I don't understand how it should work. I should see the position of a specific keyword for a domain, right?
By the way, now I am sure it is a language issue, because when I added monday.com as a domain with keywords in English, everything works like a charm. Screenshot attached.
Hi, any updates?
As I guessed earlier, it's an encoding issue, which probably only occurs on the OS you are using, though I can't be sure. It's very hard to debug. I have tested the same keywords on both Linux and Windows, and they work fine on both.
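One common OS-dependent culprit for this kind of mismatch is Unicode normalization: some platforms and input methods emit decomposed (NFD) text, which compares unequal to the composed (NFC) form even though both render the same. The example below uses an accented Latin string because plain Hebrew letters have no decomposed form; normalizing both sides before comparison is a standard way to rule this out (this is a general sketch, not a confirmed diagnosis of this issue):

```python
import unicodedata

composed = "caf\u00e9"     # 'café' with a single precomposed é (NFC)
decomposed = "cafe\u0301"  # 'café' as 'e' + COMBINING ACUTE ACCENT (NFD)

# Identical on screen, unequal byte-for-byte:
print(composed == decomposed)  # False

# Normalizing to NFC before comparing makes them equal:
print(unicodedata.normalize("NFC", decomposed) == composed)  # True
```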