Tweets-Scraper-twitter-Selenium
# Tweets Scraper from Twitter in Python Using Selenium

Recently Twitter has stopped providing its API to everyone, so I wrote a simple Python script that uses Selenium to scrape all the tweets from a user and store them in a CSV file.
## Extracting tweets from a user on Twitter
Bot/Scraper
Tweets Scraper from Twitter in Python
## Table of Contents

- About the Project
  - Built With
- Getting Started
  - Prerequisites
  - Installation
- Usage
- Roadmap
- Contributing
- License
- Contact
## About The Project

Code

Output Data

## Built With
## Prerequisites

## Installation
- Clone the repo

      git clone https://github.com/Zeeshanahmad4/LinkedIn-Jobs-Scraper.git

- Install the required Python packages (Python 3 itself must already be installed; it is not a pip package):

      pip install selenium
      pip install requests
## Usage

Twitter only shows the most recent ~300 tweets on a user's timeline. To build a URL that covers all of a user's past posts, follow the question and answer on Stack Exchange, run the query in Twitter's search bar, copy the URL of the search results page, and paste it into the tweet_scraper.py file.
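The full script is not included in the repo, but a minimal sketch of the approach might look like this. The element selector, the chromedriver setup, and the `from:jack` query are assumptions for illustration, not the actual `Tweet_Scraper.py`:

```python
import csv
import time
from urllib.parse import quote


def build_search_url(query: str) -> str:
    """Build a Twitter search-results URL for a query such as 'from:username'."""
    return "https://twitter.com/search?q=" + quote(query) + "&f=live"


def scrape_tweets(url: str, scrolls: int = 10, outfile: str = "data.csv") -> None:
    """Open the search URL, scroll to load more tweets, save unique texts to CSV."""
    # Selenium imports are kept local so build_search_url works without it installed.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()  # assumes chromedriver is on PATH
    driver.get(url)
    seen = set()
    for _ in range(scrolls):
        # At the time of writing, each tweet sits in an <article> element;
        # this is an assumption and may break as Twitter's markup changes.
        for card in driver.find_elements(By.TAG_NAME, "article"):
            text = card.text
            if text:
                seen.add(text)
        driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
        time.sleep(2)  # give newly loaded tweets time to render
    driver.quit()

    with open(outfile, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["tweet"])
        for tweet in seen:
            writer.writerow([tweet])


if __name__ == "__main__":
    # Hypothetical example query; paste the search URL described above instead.
    scrape_tweets(build_search_url("from:jack"))
```

Pasting the advanced-search URL from the step above in place of `build_search_url(...)` lets the same loop walk older tweets than the timeline shows.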
## Contents

    ├── README.md
    ├── Tweet_Scraper.py (full bot; this file is confidential, please contact me for more info)
    ├── data.csv (sample data)
## Roadmap
See the open issues for a list of proposed features (and known issues).
## Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
## License

Distributed under the MIT License. See LICENSE for more information.
## Contact me