== (OSCON) Twitter Graph
A small Python sample app that continuously loads tweets for a given Twitter search into a Neo4j instance, plus a web app that renders them with vivagraph.js (WebGL).
You can read the http://neo4j.com/blog/oscon-twitter-graph/[blog post for details].
There is also a http://www.neo4j.org/graphgist?12b6cd13f1f1120f6099[GraphGist] explaining the model and some use-case queries in detail.
image::http://dev.assets.neo4j.com.s3.amazonaws.com/wp-content/uploads/2014/07/image04.png[width=400]
The application uses the official https://github.com/neo4j/neo4j-python-driver[Neo4j-Python-Driver], which speaks the binary Bolt protocol, so it only works with Neo4j 3.x and later.
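As a rough illustration of how the importer can write tweets into Neo4j over Bolt with that driver, here is a minimal sketch. The labels, relationship type, and properties (+User+, +Tweet+, +POSTED+, +screen_name+) are assumptions for this example, not necessarily the app's actual model; see the GraphGist above for the real one.

[source,python]
----
# Minimal sketch, not the actual import.py: connect over Bolt and MERGE a tweet.
# Connection details and the User/Tweet model are illustrative assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def save_tweet(session, tweet):
    # MERGE keeps the import idempotent when the same tweet is seen again
    session.run("MERGE (u:User {screen_name: $user}) "
                "MERGE (t:Tweet {id: $id}) SET t.text = $text "
                "MERGE (u)-[:POSTED]->(t)",
                user=tweet["user"]["screen_name"],
                id=tweet["id"], text=tweet["text"])

with driver.session() as session:
    save_tweet(session, {"id": 1, "text": "Hello #oscon",
                         "user": {"screen_name": "neo4j"}})
----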
== Usage
=== Import
Start your Neo4j server or provision a hosted instance, e.g. at graphenedb.com or addons.heroku.com/graphenedb.
----
export NEO4J_URL=http://localhost:7474/db/data
export TWITTER_BEARER=TWITTER_BEARER_TOKEN

python import.py
----
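Both settings are read from the environment. As orientation, the top of such an importer might look like the sketch below; the search query and the use of the +requests+ library are assumptions, not necessarily what import.py does.

[source,python]
----
# Sketch only: read the configuration from the environment and fetch one page
# of results from Twitter's v1.1 standard search API.
import os
import requests  # assumed HTTP client, not necessarily used by import.py

NEO4J_URL = os.environ["NEO4J_URL"]
TWITTER_BEARER = os.environ["TWITTER_BEARER"]

resp = requests.get(
    "https://api.twitter.com/1.1/search/tweets.json",
    params={"q": "oscon OR neo4j", "count": 100},
    headers={"Authorization": "Bearer " + TWITTER_BEARER},
)
tweets = resp.json().get("statuses", [])
print("fetched %d tweets" % len(tweets))
----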
You can create a https://dev.twitter.com/docs/auth/application-only-auth[bearer token] for yourself by sending this request to Twitter:

----
curl -XPOST -u consumer_key:consumer_secret 'https://api.twitter.com/oauth2/token?grant_type=client_credentials'
----
This will yield:

----
{"token_type":"bearer","access_token":"....bearer token...."}
----
=== Webapp
----
python app.py 8080
open http://localhost:8080
----
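To see how such a web app can expose the graph to vivagraph.js, here is a minimal sketch assuming Flask and the Bolt driver; the actual app.py may be structured differently.

[source,python]
----
# Minimal sketch, not the actual app.py: serve the tweet graph as JSON that a
# vivagraph.js front end could render. Flask and the node/relationship names
# are assumptions for illustration.
import os
import sys
from flask import Flask, jsonify
from neo4j import GraphDatabase

app = Flask(__name__)
driver = GraphDatabase.driver(os.environ.get("NEO4J_URL", "bolt://localhost:7687"),
                              auth=("neo4j", "password"))

@app.route("/graph")
def graph():
    with driver.session() as session:
        result = session.run(
            "MATCH (u:User)-[:POSTED]->(t:Tweet) "
            "RETURN u.screen_name AS user, t.id AS tweet LIMIT 1000")
        links = [{"source": r["user"], "target": r["tweet"]} for r in result]
    return jsonify(links=links)

if __name__ == "__main__":
    # the port is passed on the command line, as in `python app.py 8080`
    app.run(port=int(sys.argv[1]) if len(sys.argv) > 1 else 8080)
----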
=== Push to Heroku
The application already contains a +Procfile+ for Heroku, so it runs on the port Heroku provides.
----
git init
heroku apps:create my-twitter-graph
heroku config:set NEO4J_URL=http://.....
git push heroku master
----
You can use a Neo4j instance on http://graphenedb.com[GrapheneDB] for your experiment or create an EC2 instance yourself using the http://neo4j.org/develop/cloud[CloudFormation template].
The import is not started automatically; you can enable the worker in the +Procfile+ to run the import automatically and continuously.
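For reference, a +Procfile+ along these lines covers both processes (a sketch; the file shipped with the repository may differ). On Heroku the worker is then enabled by scaling it, e.g. +heroku ps:scale worker=1+.

----
web: python app.py $PORT
worker: python import.py
----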