
Scrape empty lists

Open liyifu0302 opened this issue 3 years ago • 3 comments

Hi joey,

Recently I found that I cannot scrape anything from LinkedIn; most of the pulled information comes back empty, both for person and company pages. Any suggestions? The code and output are below. Many thanks!

Output 1:

```
Jaime Gilberto Adrián Zúñiga Espinoza About [] Experience [] Education [] Interest [] Accomplishments [] Contacts []
```

Output 2:

```json
{"name": "Google", "about_us": null, "specialties": null, "website": null, "industry": null, "company_type": "Google", "headquarters": null, "company_size": null, "founded": null, "affiliated_companies": [], "employees": [null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null]}
```
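One thing that helps when a scrape silently degrades like this is to detect the all-null result before storing it. A minimal sketch (`looks_empty` is a hypothetical helper, not part of linkedin_scraper; the field names just mirror the JSON above):

```python
import json


def looks_empty(record: dict, threshold: float = 0.8) -> bool:
    """Return True when most fields are None/empty, i.e. the scrape likely failed."""
    values = [v for k, v in record.items() if k != "name"]
    empty = sum(
        1
        for v in values
        if v in (None, [], "")
        or (isinstance(v, list) and v and all(e is None for e in v))
    )
    return bool(values) and empty / len(values) >= threshold


# A trimmed version of the Company output above: everything but the name is null.
scraped = json.loads(
    '{"name": "Google", "about_us": null, "specialties": null, "website": null,'
    ' "industry": null, "headquarters": null, "employees": [null, null, null]}'
)
print(looks_empty(scraped))  # True
```

Gating on this before writing results out at least turns a silent failure into a loud one.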

Code 1:

```python
import os

from linkedin_scraper import Person, actions
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

chrome_options = Options()
chrome_options.add_argument("--headless")
driver = webdriver.Chrome(
    r"C:\Research_Software\Anaconda\envs\tf\Scripts\chromedriver.exe",
    options=chrome_options,
)

email = os.getenv("XXX my email")
password = os.getenv("XXX password")
actions.login(driver, email, password)  # if email and password aren't given, it'll prompt in the terminal
person = Person("https://www.linkedin.com/in/adrian0350", contacts=[], driver=driver)
print(person)
```

Code 2:

```python
import os

from linkedin_scraper import Person, Company, actions
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

driver = webdriver.Chrome(r"C:\Research_Software\Anaconda\envs\tf\Scripts\chromedriver.exe")
email = os.getenv("[email protected]")
password = os.getenv("8792950liyifuc")
actions.login(driver, email, password)  # if email and password aren't given, it'll prompt in the terminal
company = Company(
    "https://ca.linkedin.com/company/google",
    driver=driver,
    get_employees=True,
    close_on_complete=False,
    scrape=False,
)
driver.implicitly_wait(3)
company.scrape(close_on_complete=False)
print(company)
```

liyifu0302 avatar Apr 24 '22 04:04 liyifu0302

I'm getting this issue too. Any solution?

JamesSatherley avatar Apr 27 '22 22:04 JamesSatherley

same here

dmzoneill avatar Jan 19 '23 21:01 dmzoneill

This code won’t work. LinkedIn changes their HTML very often, which makes it practically impossible to scrape reliably.

JamesSatherley avatar Jan 20 '23 07:01 JamesSatherley
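Since the root cause is that hard-coded selectors go stale whenever LinkedIn reshuffles its markup, scrapers that need to keep limping along usually try a list of candidate selectors rather than a single one. A minimal sketch of that fallback pattern; the selector strings and the `page` dict are illustrative stand-ins, not real LinkedIn markup or the linkedin_scraper API:

```python
from typing import Optional


def first_match(page: dict, candidates: list) -> Optional[str]:
    """Return the text for the first candidate selector found on the page.

    `page` stands in for a parsed DOM: selector string -> extracted text.
    """
    for selector in candidates:
        if selector in page:
            return page[selector]
    return None  # every known selector is stale: treat as "layout changed"


# The old selector is gone; the fallback still finds the renamed one.
page = {"div.summary-text-v2": "About me..."}
print(first_match(page, ["p.summary-text", "div.summary-text-v2"]))  # About me...
```

Returning `None` explicitly (instead of raising deep inside the scrape) makes it easy to log which profile and which selector list failed, which is the first thing you need when the markup shifts again.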