SerpScrap
csv_writer.py: CsvWriter cannot handle '?' in URLs
Traceback (most recent call last):
  File "miniconda3/lib/python3.7/site-packages/serpscrap/csv_writer.py", line 14, in write
    w.writerow(row)
  File "miniconda3/lib/python3.7/csv.py", line 155, in writerow
    return self.writer.writerow(self._dict_to_list(rowdict))
  File "miniconda3/lib/python3.7/csv.py", line 151, in _dict_to_list
    + ", ".join([repr(x) for x in wrong_fields]))
ValueError: dict contains fields not in fieldnames: 'encoding', 'text_raw', 'url', 'status', 'meta_robots', 'meta_title', 'last_modified'
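For context, this `ValueError` is simply how `csv.DictWriter` reports a row dict whose keys were not declared in `fieldnames` (its default `extrasaction` is `'raise'`). A minimal standalone reproduction, with made-up field names rather than SerpScrap's actual columns:

```python
import csv
import io

# DictWriter was told to expect only 'url' and 'title'.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["url", "title"])
writer.writeheader()

# This row carries an extra key ('status') that is not in fieldnames,
# so writerow() raises ValueError, just like in the traceback above.
row = {"url": "https://example.com/?q=test", "status": "200"}
try:
    writer.writerow(row)
    error = None
except ValueError as exc:
    error = str(exc)
```

Note that the `?` in the URL itself is harmless here; the crash is driven entirely by the mismatch between the row's keys and the declared `fieldnames`.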
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "example_serp_urls.py", line 12, in
I think the problem is that the Google search result pages have changed since the last working version. For Lithuanian searches the scraper scrapes only YouTube results, and csv_writer.py writes those down. If there are no YouTube results, there is nothing to write and the code raises the error above.
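If the root cause is rows whose key sets differ (some result types carry extra fields like 'encoding' or 'text_raw'), one defensive fix is to derive the CSV header from the union of keys across all rows instead of from the first row only. A sketch under that assumption; `write_rows` and its signature are hypothetical, not SerpScrap's actual API:

```python
import csv

def write_rows(path, rows):
    """Write a list of dicts to CSV, tolerating rows with differing keys.

    The header is the union of all keys, in first-seen order, so a row
    with extra fields no longer triggers ValueError; restval fills the
    columns a given row is missing.
    """
    fieldnames = []
    for row in rows:
        for key in row:
            if key not in fieldnames:
                fieldnames.append(key)
    with open(path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.DictWriter(fh, fieldnames=fieldnames, restval="")
        writer.writeheader()
        writer.writerows(rows)
```

An alternative, if dropping the extra fields is acceptable, is to keep the original header and pass `extrasaction='ignore'` to `DictWriter`, which silently skips unknown keys instead of raising.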