3 issues of PrivateThink
What parameters are needed? I need your help, thanks!
Hi, my Python environment is 3.5, so using stanford_corenlp_pywrapper gives an error. I found that reader.py uses tokenizer.tokenize. Can I replace tokenizer.tokenize(sent_str) with word_tokenize(sent_str)? The method in question:

def tokenize(self, sent_str): sent_str = " ".join(sent_str.split("-"))...
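In case it helps, here is a minimal sketch of the swap being asked about, assuming the rest of reader.py only expects a list of token strings back. The Reader class wrapper is hypothetical (only the tokenize method body mirrors the snippet quoted above), and NLTK's word_tokenize needs the punkt data to be downloaded first.

```python
# Sketch: replace the CoreNLP wrapper call with NLTK's word_tokenize,
# assuming callers only need a plain list of token strings.
from nltk.tokenize import word_tokenize  # requires nltk.download('punkt')


class Reader:  # hypothetical wrapper; only tokenize() mirrors the question
    def tokenize(self, sent_str):
        # Keep the hyphen-splitting step from the original snippet.
        sent_str = " ".join(sent_str.split("-"))
        # Swap tokenizer.tokenize(sent_str) for NLTK's word_tokenize.
        return word_tokenize(sent_str)


print(Reader().tokenize("state-of-the-art parsing"))
# ['state', 'of', 'the', 'art', 'parsing']
```

On Python 3.5 this avoids the CoreNLP dependency entirely, though the resulting tokenization may differ from CoreNLP's in some cases.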