google-images-download
Cannot open url
Can anyone help me with this?
It looks like the program cannot access the internet. Check for any proxy/VPN software or anything else that could alter your connection.
This is simply a matter of the URL not being reachable from your machine (try opening it in a regular browser). Routing through a VPN or proxy and trying again may get you past it.
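If you want to rule that out, here is a minimal sketch (the proxy address below is only a placeholder, not something from the library) that checks whether the Google Images URL is reachable through an explicit HTTPS proxy, using the same `urllib` machinery the library uses:

```python
import urllib.request

# Placeholder proxy address -- replace with whatever your VPN/proxy exposes locally.
PROXY = "127.0.0.1:1080"

def can_reach(url, proxy=PROXY):
    """Return True if the URL is reachable through the given HTTPS proxy."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"https": proxy})
    )
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        resp = opener.open(req, timeout=10)
        return resp.getcode() == 200
    except Exception as exc:
        print("Request failed:", exc)
        return False

print(can_reach("https://www.google.com/search?q=cats&tbm=isch"))
```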
This issue is caused by a missing or untrusted SSL certificate.
Steps to solve:
On macOS, go to Macintosh HD > Applications > Python 3.6 (or whatever version of Python you're using) and double-click the "Install Certificates.command" file.
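On Windows or Linux there is no Install Certificates.command, but a rough equivalent (assuming the `certifi` package is installed; the URL below is just an example) is to hand `urllib` an SSL context built from certifi's CA bundle:

```python
import ssl
import urllib.request

import certifi

# Build an SSL context that trusts certifi's CA bundle instead of the
# (possibly missing or outdated) system certificate store.
context = ssl.create_default_context(cafile=certifi.where())

req = urllib.request.Request(
    "https://www.google.com/search?q=cats&tbm=isch",
    headers={"User-Agent": "Mozilla/5.0"},
)
with urllib.request.urlopen(req, context=context) as resp:
    print(resp.getcode(), len(resp.read()))
```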
Same issue, I'm on Windows
Same issue. I'm also on Windows. Any fixes? It was fully functional two weeks ago and I didn't change anything in my environment.
Same issue here, also on Windows. certifi is installed and Google's certificate is in its bundle.
I have the same problem on CentOS 6. It was all working fine a few weeks ago; now it doesn't download images and always gives the error "Could not open URL. Please check your internet connection and/or ssl settings. If you are using proxy, make sure your proxy settings is configured correctly." I have tried various Python fixes to bypass SSL certificate verification, but no luck.
```python
# Patched _find_last() and download_page() from google_images_download.py.
# Relies on the module's existing imports (sys, ssl, urllib/urllib2, URLError).

def _find_last(self, string, str):
    # Return the index of the last occurrence of str in string, or -1 if absent.
    last_position = -1
    while True:
        position = string.find(str, last_position + 1)
        if position == -1:
            return last_position
        last_position = position

# Downloading entire Web Document (Raw Page Content)
def download_page(self, url):
    version = (3, 0)
    cur_version = sys.version_info
    if cur_version >= version:  # If the current version of Python is 3.0 or above
        try:
            # NOTE: hard-coded local proxy; change 127.0.0.1:51593 to your own
            # proxy address/port, or remove these three lines if you don't use one.
            proxy_support = urllib.request.ProxyHandler({'https': '127.0.0.1:51593'})
            opener = urllib.request.build_opener(proxy_support)
            urllib.request.install_opener(opener)
            headers = {}
            headers['User-Agent'] = ("Mozilla/5.0 (Windows NT 10.0; WOW64) "
                                     "AppleWebKit/537.36 (KHTML, like Gecko) "
                                     "Chrome/78.0.3904.17 Safari/537.36")
            req = urllib.request.Request(url, headers=headers)
            resp = urllib.request.urlopen(req)
            respData = str(resp.read())
            respDatas = self._extract_data_pack(respData)
            respDatas = respDatas[:self._find_last(respDatas, ",")]
            return self._image_objects_from_pack(respDatas), self.get_all_tabs(respData)
        except Exception as e:
            print("Could not open URL. Please check your internet connection and/or ssl settings \n"
                  "If you are using proxy, make sure your proxy settings is configured correctly")
            sys.exit()
    else:  # If the current version of Python is 2.x
        try:
            headers = {}
            headers['User-Agent'] = ("Mozilla/5.0 (X11; Linux i686) "
                                     "AppleWebKit/537.17 (KHTML, like Gecko) "
                                     "Chrome/24.0.1312.27 Safari/537.17")
            req = urllib2.Request(url, headers=headers)
            try:
                response = urllib2.urlopen(req)
            except URLError:  # Handling SSL certificate failure
                context = ssl._create_unverified_context()
                response = urlopen(req, context=context)
            page = response.read()
            return self._image_objects_from_pack(self._extract_data_pack(page)), self.get_all_tabs(page)
        except:
            print("Could not open URL. Please check your internet connection and/or ssl settings \n"
                  "If you are using proxy, make sure your proxy settings is configured correctly")
            sys.exit()
    return "Page Not found"
```
Here's my solution
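For anyone trying the patch, a quick sanity check using the package's Python API (the keywords and limit here are arbitrary examples) is:

```python
from google_images_download import google_images_download

response = google_images_download.googleimagesdownload()
# Small query just to confirm download_page() can reach Google again.
paths = response.download({"keywords": "cats", "limit": 5})
print(paths)
```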
@tzq865 Thanks for your solution, but it doesn't work for me.