
Python interface to CoreNLP using a bidirectional server-client interface.

9 python-stanford-corenlp issues, sorted by recently updated

`%XX` is removed from the text when `XX` is hexadecimal, which looks like a URL-escape issue (ref. https://github.com/stanfordnlp/CoreNLP/issues/784). Passing a URL-encoded string returns the expected result. `>>> [r.originalText for...`
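As a workaround until the upstream issue is fixed, the literal `%` characters can be percent-encoded before the text is sent to the server. This is a minimal standard-library sketch of the idea (the call to the CoreNLP server itself is assumed and not shown):

```python
from urllib.parse import unquote

def escape_percent(text: str) -> str:
    # Encode only the percent sign, so sequences like "%2F" in the raw
    # text are not misread as URL escapes and stripped by the server.
    return text.replace("%", "%25")

raw = "a 5%2F discount"              # "%2F" here is literal text, not an escape
escaped = escape_percent(raw)        # -> "a 5%252F discount"
assert unquote(escaped) == raw       # URL-decoding recovers the original text
```

If the server URL-decodes its input exactly once, the annotated tokens should then match the original text.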

Thanks for the great code! I encounter this problem every time my code finishes running. The code does complete, but it's pretty weird: nothing bad happens while it is running, but a...

Getting this error **ReadTimeout(e, request=request) requests.exceptions.ReadTimeout: HTTPConnectionPool(host='localhost', port=9000): Read timed out. (read timeout=30)** when trying to execute `import corenlp` with `text = "Chris wrote a simple sentence that he parsed with...`

Hi, If I do `annotation = client.annotate(text)`, `annotation.sentence[0]` will contain tokens and a parseTree, but the parseTree leaves are just plain text. No way to access the corresponding token.
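Since the parse-tree leaves carry only surface text, a common workaround is to pair the leaves with `sentence.token` by left-to-right position (the i-th leaf is the i-th token). An illustrative sketch with a toy tree class standing in for the protobuf node (the real field names may differ):

```python
class Tree:
    """Toy stand-in for a protobuf ParseTree node."""
    def __init__(self, value, children=()):
        self.value = value
        self.child = list(children)

def leaves(tree):
    # Yield leaf nodes in left-to-right order.
    if not tree.child:
        yield tree
    else:
        for c in tree.child:
            yield from leaves(c)

def align(tree, tokens):
    # Pair each leaf with the token at the same position in the sentence.
    return list(zip(leaves(tree), tokens))

t = Tree("S", [Tree("NP", [Tree("Chris")]), Tree("VP", [Tree("wrote")])])
pairs = align(t, ["Chris", "wrote"])
```

With the real protobuf objects, `tokens` would be `annotation.sentence[0].token`, giving each leaf access to the full token (lemma, POS, character offsets, etc.).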

`def __init__(self, start_server=True, endpoint="http://localhost:9000", timeout=15000)` — set `start_server=False` by default, because if the server is started externally there is no need to provide CORENLP_HOME; the client can just send a request to...

Hi, I need a binarized parse tree. How do I use the CoreNLP server to get one?

Dear all, is it possible to pass a constituency parse directly to Tregex? Many thanks, Valery

Hey @arunchaganty , @jekbradbury and @bmccann recently discovered a huge performance oversight in another tokenization library by @jekbradbury. Namely, [string interning](https://en.wikipedia.org/wiki/String_interning) improved DecaNLP performance by something like 100x. It dawned...
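For reference, the string-interning trick mentioned above can be sketched in plain Python with `sys.intern`: repeated token strings share a single object, so equality checks and dict/set lookups on tokens become cheap pointer comparisons.

```python
import sys

# Intern every token; duplicates collapse to one shared string object.
tokens = [sys.intern(tok) for tok in "the cat sat on the mat the".split()]

# All copies of "the" are now the same object, so identity comparison works
# and hashing is computed once per distinct string.
a, b = tokens[0], tokens[4]
assert a is b
```

In a tokenizer this pays off because natural-language text has a small vocabulary relative to its length, so most allocations and comparisons hit already-interned strings.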

enhancement

Running the following code leads to a `google.protobuf.message.DecodeError: Error parsing message`. This is, admittedly, a very weird document: a Chinese(?) text snuck into my English pipeline and caused this error....

bug