SerpScrap

csv_writer.py csvWriter cannot handle ? in urls

Open · fcbits opened this issue 5 years ago · 1 comment

Traceback (most recent call last):
  File "miniconda3/lib/python3.7/site-packages/serpscrap/csv_writer.py", line 14, in write
    w.writerow(row)
  File "miniconda3/lib/python3.7/csv.py", line 155, in writerow
    return self.writer.writerow(self._dict_to_list(rowdict))
  File "miniconda3/lib/python3.7/csv.py", line 151, in _dict_to_list
    + ", ".join([repr(x) for x in wrong_fields]))
ValueError: dict contains fields not in fieldnames: 'encoding', 'text_raw', 'url', 'status', 'meta_robots', 'meta_title', 'last_modified'

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "example_serp_urls.py", line 12, in scrap.as_csv('/tmp/outputurls') File "miniconda3/lib/python3.7/site-packages/serpscrap/serpscrap.py", line 148, in as_csv writer.write(file_path + '.csv', self.results) File "/miniconda3/lib/python3.7/site-packages/serpscrap/csv_writer.py", line 17, in write raise Exception Exception

fcbits commented Nov 01 '19 06:11

I think the problem is that Google's search result pages have changed since the last working version. For Lithuanian searches the scraper only picks up YouTube results, and csv_writer.py writes those out. If there are no YouTube results, there is nothing to write and the code raises an error.

MindaugasVaitkus2 commented Oct 14 '20 08:10
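A possible workaround while this is unresolved: build the fieldnames from the union of keys across all result dicts before constructing the DictWriter, so rows with extra page-scrape fields no longer trigger the ValueError. This is only a sketch against the standard library's csv module, not SerpScrap's actual csv_writer.py; the function name and arguments are made up for illustration:

```python
import csv

def write_results_csv(path, rows):
    # Collect the union of keys across all result dicts, preserving the
    # order in which keys are first seen.
    fieldnames = []
    for row in rows:
        for key in row:
            if key not in fieldnames:
                fieldnames.append(key)

    with open(path, "w", newline="", encoding="utf-8") as f:
        # restval="" fills columns that a given row does not have,
        # so heterogeneous rows can share one header.
        w = csv.DictWriter(f, fieldnames=fieldnames, restval="")
        w.writeheader()
        w.writerows(rows)
```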