
Graduate project: a Weibo spider for mining interesting information from the social network, such as whether people there tend to be happy or sad.

5 weibo_spider issues

Running `python util.py` reports: BeautifulSoup([your markup], "lxml") markup_type=markup_type)) Traceback (most recent call last): File "util.py", line 134, in main() File "util.py", line 120, in main firstInfo = getUniversityStudent(mainid, school) File "util.py", line...
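The leading fragment is the standard BeautifulSoup warning about no parser being specified explicitly; the underlying exception is cut off in the preview, but the warning itself goes away once a parser is named. A minimal sketch of that fix (the function name is illustrative, not taken from util.py):

```
from bs4 import BeautifulSoup

def parse_page(html):
    # Naming the parser explicitly ("lxml" here, installed with
    # `pip install lxml`) silences the "No parser was explicitly
    # specified" warning that BeautifulSoup prints otherwise.
    return BeautifulSoup(html, "lxml")
```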

Error: None Traceback (most recent call last): File "weiboSpider.py", line 37, in get_username selector = etree.HTML(html) File "src/lxml/lxml.etree.pyx", line 3161, in lxml.etree.HTML (src/lxml/lxml.etree.c:82353) File "src/lxml/parser.pxi", line 1819, in lxml.etree._parseMemoryDocument (src/lxml/lxml.etree.c:124533)...
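The `Error: None` prefix suggests `get_username` was handed an empty response, and `lxml.etree.HTML` cannot parse `None`. A minimal guard, written as a sketch (the error message and the xpath are made up, not weiboSpider.py's actual code):

```
from lxml import etree

def get_username(html):
    # lxml refuses to parse None or an empty string, which matches the
    # "Error: None" preview above; fail with a clearer message instead.
    if not html:
        raise RuntimeError("empty response from weibo.cn; check the cookie and network")
    selector = etree.HTML(html)
    # Illustrative xpath only; the real expression may differ.
    return selector.xpath("//title/text()")
```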

Hello, I am also a senior this year. I'd like to ask: in your code, are the id and cookie simply the login id and cookie of a single user account? More importantly, how do you verify that the users you crawl are all students of Anhui Medical University?
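For context, a common pattern in weibo.cn crawlers is to send one logged-in account's cookie with every request and then filter users by the school listed on their profile page. The sketch below only illustrates that idea; the URL, the placeholder uid, and the containment check are assumptions, not this project's actual logic:

```
# -*- coding: utf-8 -*-
import requests

COOKIE = 'cookie copied from a logged-in weibo.cn browser session'
HEADERS = {'Cookie': COOKIE, 'User-Agent': 'Mozilla/5.0'}

def is_target_school(profile_html, school=u'安徽医科大学'):
    # Best-effort check: keep a user only if the school name shows up
    # in the education section of their profile page.
    return school in profile_html

resp = requests.get('https://weibo.cn/1234567890/info', headers=HEADERS)  # placeholder uid
print(is_target_school(resp.text))
```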

The dependencies are incomplete: the code uses a mysqldb package that cannot be found on PyPI (the commonly used package is pymysql), and pylab cannot be found either. In addition, at plot.py line 119:

```
plt.text(1500, 60, u"安徽医科大学", color='black', size=36,
         horizontalalignment='right', verticalalignment='top')
plt.text(1500, 40, u'性别与微博数量的比较', color='.75', size=12,
         ha='right', va='top')
plt.savefig('pic/sex2totalcontent.png')
plt.close()
```

the school name 安徽医科大学 (Anhui Medical University) is written directly into the code.
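For what it's worth, `pylab` is not a standalone PyPI package (it ships with matplotlib), and pymysql can stand in for MySQLdb without touching the existing import sites. A sketch under those assumptions; the function and parameter names below are made up rather than taken from plot.py:

```
import pymysql
pymysql.install_as_MySQLdb()  # lets existing `import MySQLdb` statements keep working

import matplotlib.pyplot as plt

def plot_sex_vs_total(school_name, out_path='pic/sex2totalcontent.png'):
    # The school label is passed in rather than hard-coded, so the plot
    # is reusable for universities other than 安徽医科大学.
    plt.text(1500, 60, school_name, color='black', size=36,
             horizontalalignment='right', verticalalignment='top')
    plt.text(1500, 40, u'性别与微博数量的比较', color='.75', size=12,
             ha='right', va='top')
    plt.savefig(out_path)
    plt.close()
```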

I'm still not really sure how to use GitHub yet; please forgive me if anything here is inappropriate.