twitter-api-client

Implementation of X/Twitter v1, v2, and GraphQL APIs

Results: 58 twitter-api-client issues, sorted by recently updated

Just made a simple async fork of this lib using httpx's AsyncClient instead of Client. Let me know if anyone wants to help contribute, or maybe there is something we should provide in the...
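For anyone curious what that swap looks like, here is a minimal sketch of the sync-to-async change using only httpx's public API; the `fetch_json` helper and the example URL are placeholders, not part of this library or the fork.

```python
import asyncio
import httpx

# Minimal sketch: a blocking httpx.Client call becomes an awaited call on
# httpx.AsyncClient. fetch_json and the URL below are illustrative placeholders.
async def fetch_json(url: str, params: dict | None = None) -> dict:
    async with httpx.AsyncClient(timeout=30) as client:
        r = await client.get(url, params=params)
        r.raise_for_status()
        return r.json()

if __name__ == "__main__":
    data = asyncio.run(fetch_json("https://httpbin.org/json"))
    print(data)
```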

```
[1] Exception in thread Thread-2:
[1] Traceback (most recent call last):
[1]   File "/usr/local/lib/python3.11/site-packages/twitter/search.py", line 105, in backoff
[1]     self.logger.warning(f'{YELLOW}{e.get("message")}{RESET}')
[1]     ^^^^^^^^^^^^^^^^^^^
[1] AttributeError: 'NoneType' object has no attribute 'warning'
```
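The crash happens because `self.logger` is `None` when the failing line calls `.warning()` on it. A call-site guard along the lines of the sketch below would avoid the AttributeError; `log_warning` is just an illustrative helper, not something the library provides.

```python
# Sketch of a guarded logging call for the failing line in backoff():
# only use the logger if one was configured, otherwise fall back to print
# so the backoff message is still visible.
def log_warning(logger, message: str) -> None:
    if logger is not None:
        logger.warning(message)
    else:
        print(message)
```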

I see that the tweets response JSON from the scraper is still in random order, just like a guest session, instead of the chronological order a logged-in session should return. Is this expected or...

This is my simplified code:

```python
# read data
raw_tweets_data = self.scraper.tweets([data['user_id']], limit=limit, cursor=data['cursor'])
data['cursor'] = get_cursor(raw_tweets_data)
# save data
```

The first scrape works correctly - it scrapes the x latest...
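For context, a cursor-driven loop built on the same pattern might look like the sketch below; it is hypothetical, the objects are passed in rather than taken from the library, and starting with `cursor=None` on the first call is an assumption, not documented behaviour.

```python
# Hypothetical pagination loop around the pattern above; the scraper object,
# get_cursor helper, user_id and limit are passed in as arguments.
def scrape_pages(scraper, get_cursor, user_id, limit, max_pages=10):
    pages, cursor = [], None  # assumes the first call accepts cursor=None
    for _ in range(max_pages):
        page = scraper.tweets([user_id], limit=limit, cursor=cursor)
        pages.append(page)
        cursor = get_cursor(page)
        if not cursor:  # stop once no further cursor is returned
            break
    return pages
```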

When I use pagination for Twitter followers, it shows `Cannot parse JSON response 'str' object has no attribute 'json'`. I think it's related to how the paginated responses are merged.
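The error message suggests some items in the merged pagination output are plain strings rather than response objects. A defensive conversion like the hypothetical helper below (not part of the library) would at least show which shape each item has.

```python
import json

# Hypothetical normaliser for mixed pagination results: some items may be raw
# JSON strings, some response-like objects with a .json() method, some dicts.
def to_dict(item):
    if isinstance(item, str):
        return json.loads(item)   # raw JSON text
    if hasattr(item, "json"):
        return item.json()        # requests/httpx-style response object
    return item                   # already parsed
```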

I noticed that in the Search class, if `debug=False` (which is the default), the client fails to handle some errors. This is because:
- If `debug=False`, `self.logger == None`
- While some...
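One way to make the error paths safe regardless of `debug` would be to fall back to a standard-library logger with a `NullHandler` instead of leaving `self.logger` as `None`; this is only a sketch of that idea, not the library's current behaviour.

```python
import logging

# Sketch: instead of setting self.logger to None when debug is off, return a
# real Logger. With only a NullHandler attached, records are dropped (unless
# the application configures handlers elsewhere), but .warning() never raises.
def make_logger(debug: bool) -> logging.Logger:
    logger = logging.getLogger("twitter.search")
    if not debug and not logger.handlers:
        logger.addHandler(logging.NullHandler())
    return logger
```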

For example:

```python
from twitter.search import Search

search = Search(email, username, password)
res = search.run([{"query": "list:1563846174129496065", "category": "Latest"}], 20)
len(res[0])  # 40
```

---

Additionally, the pagination halting condition checks...
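Until the halting condition is adjusted, a post-hoc trim is a simple workaround; `trim` below is a hypothetical helper, and the 40-vs-20 figure comes from the example above.

```python
# Workaround sketch: if run() can overshoot the requested limit by a page
# (40 results when 20 were asked for, as above), slice each batch afterwards.
def trim(results: list[list], limit: int) -> list[list]:
    return [batch[:limit] for batch in results]

# e.g. trim(res, 20) on the res returned in the snippet above
```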

Use multiple accounts to crawl and automatically switch to an available account.
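As a starting point, rotation can be as simple as cycling through a list of credential sets and rebuilding the client from the next set whenever the current account hits a rate limit; everything in the sketch below (names, credentials) is hypothetical and not a feature the library currently exposes.

```python
from itertools import cycle

# Hypothetical round-robin credential pool; the values are placeholders.
ACCOUNTS = [
    ("email1@example.com", "user1", "pass1"),
    ("email2@example.com", "user2", "pass2"),
]
_pool = cycle(ACCOUNTS)

def next_credentials() -> tuple[str, str, str]:
    """Return the next (email, username, password) triple in round-robin order."""
    return next(_pool)
```

The caller would catch the rate-limit error, fetch the next triple, and construct a fresh Scraper or Search instance from it.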