sandbox-grounded-qa
Mock SerpApi
Since the free plan at SerpApi only allows 100 requests per month, I would like to suggest that one could mock the SerpApi for testing and development.
Inside
qa/search.py#get_results_paragraphs_multi_process(...)
maybe we could add something like
if mock:
results = mock_search()
else:
results = serp_api_search(search_term, serp_api_token, url)
WDYT?
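To make the idea above concrete, here is a minimal sketch of such a dispatch. The fixture file name and the helper names are assumptions; `serp_api_search` stands in for the existing function in qa/search.py and is only stubbed here.

```python
import json

# Assumed fixture location; adjust to wherever the dump is stored.
MOCK_RESULTS_FILE = "serpapi_search_results.json"

def serp_api_search(search_term, serp_api_token, url):
    # Placeholder for the real live call in qa/search.py.
    raise NotImplementedError("live SerpApi call")

def mock_search(file_name=MOCK_RESULTS_FILE):
    """Return canned SerpApi results from a local JSON dump."""
    with open(file_name) as f:
        return json.load(f)

def get_results(search_term, serp_api_token, url, mock=False):
    # Dispatch to the fixture when mocking; otherwise hit SerpApi.
    if mock:
        return mock_search()
    return serp_api_search(search_term, serp_api_token, url)
```

During development one would call `get_results(term, token, url, mock=True)` and never consume API quota.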
I just realized an even simpler option is to dump SerpApi results into a file (e.g. serpapi_search_results.json) and retrieve it from there within
qa/search.py#serp_api_search(...)
- response = serp_api_google_search(search_term, serp_api_token, url)
- results = response.get_dict()
+ if verbosity > 1:
+     pretty_print("OKGREEN", "Load JSON file: serpapi_search_results.json")
+ with open('serpapi_search_results.json') as json_file:
+     results = json.load(json_file)
+
+ #response = serp_api_google_search(search_term, serp_api_token, url)
+ #results = response.get_dict()
Either of those would be great. There is currently a runtime cache on the calls to SerpApi, so if you make the same request during a single runtime it won't actually go to SerpApi, but a longer-term cache would be better.
Also, extending this to take not just SerpApi but an arbitrary search backend would be great! Then it could use local databases or something.
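The longer-term cache mentioned above could look something like the following sketch: a small disk cache keyed by a hash of the search term, so repeated queries survive across runs. The cache directory and function names are assumptions, not part of the repo.

```python
import hashlib
import json
import os

CACHE_DIR = ".serpapi_cache"  # assumed location for cached responses

def cached_search(search_term, fetch):
    """Persist search responses on disk so repeated queries across
    runs never hit the network. `fetch` is a zero-argument callable
    that performs the real request (e.g. a lambda wrapping the
    existing serp_api_search call)."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    key = hashlib.sha256(search_term.encode("utf-8")).hexdigest()
    path = os.path.join(CACHE_DIR, f"{key}.json")
    if os.path.exists(path):
        # Cache hit: load the stored JSON instead of calling out.
        with open(path) as f:
            return json.load(f)
    results = fetch()
    with open(path, "w") as f:
        json.dump(results, f)
    return results
```

Hashing the term keeps file names filesystem-safe regardless of what characters the query contains.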
Agreed!
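One way to support an arbitrary search backend, as suggested above, would be a small interface that SerpApi, a local JSON dump, or a local database could all implement. This is a sketch under assumed names, not the project's actual API.

```python
import json
from typing import Protocol

class SearchBackend(Protocol):
    """Minimal interface any retrieval source could implement."""
    def search(self, query: str) -> dict: ...

class SerpApiBackend:
    """Live backend; would wrap the existing serp_api_search call."""
    def __init__(self, token, url):
        self.token, self.url = token, url
    def search(self, query):
        raise NotImplementedError("would call serp_api_search here")

class LocalJsonBackend:
    """Serves canned results from a dumped SerpApi JSON file."""
    def __init__(self, file_name):
        with open(file_name) as f:
            self.results = json.load(f)
    def search(self, query):
        return self.results

def retrieve(query, backend: SearchBackend):
    # Callers depend only on the interface, not on SerpApi.
    return backend.search(query)
```

Switching retrieval would then be a matter of passing a different backend object instead of editing a `file_name` variable in qa/search.py.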
I have done a prototype implementation using a dumped SerpApi JSON, where one has to switch the file_name variable at
https://github.com/wyona/sandbox-grounded-qa/blob/michael_main/qa/search.py#L77
oh awesome! do you want to make a PR for it?
happy to, though I will try to think of a better way to switch the retrieval
I made a fork which doesn't use serpapi https://github.com/chirag127/sandbox-grounded-qa/
can you explain a bit how your code works?
https://github.com/chirag127/sandbox-grounded-qa/blob/main/qa/search.py
Well it uses a scraper
You can read the readme on how to use it
thanks, found it :-) https://github.com/chirag127/Search-Engines-Scraper