Amir Plivatsky
> Where is that loop? It would be better if that loop were part of the dictionary setup... `set_connector_length_limits()`, invoked from `prepare_to_parse()`. I have not yet implemented the idea...
> Yes. It would be even better to assign a cost, so that longer links are allowed but have a high cost. But then, how can we prune these connectors?
> I'm confused. I think you are saying that you want to add an LL link-length limit to set_connector_length_limits, right? Because there isn't one there now. I indeed would like...
> If very much desired, a 4-byte version is possible too: I actually forgot the `gword_set` field, which needs about 10 bits. So the 4-byte version of `Connector_struct` is not...
The `gword_set` field in `Connector_struct` is used in the fast-matcher to filter out mismatching alternatives. The purpose of introducing this filtering was to eliminate the mostly insane-morphism linkages of the...
Are `xalloc()` / `xfree()` (and `exalloc()` / `exfree()`), which were designed for memory-usage tracking and limiting, still needed? **Pros:** - Complex/long sentences may 'kill' the machine due to too much...
Most of the allocations are for `Table_connector`, used for the `do_count()` memoization. In long sentences in the `fix-long` batch there are > 100M allocations. Generally (even for the `fixes` batch) a...
@s-e-r-g , The problem of huge memory+CPU usage when parsing long Russian sentences has been partially solved in PR #673, which was applied a short time ago. On my computer...
On Linux I just had to add these functions to linkgrammar.def - there was no need to define them as public API. If so, isn't this linkgrammar.def addition enough also...
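For context, a module-definition file lists the symbols a shared library exports, so adding an entry there exposes a function without declaring it in the public headers. A minimal sketch of what such an addition might look like (the helper name below is purely hypothetical, not an actual link-grammar symbol):

```
LIBRARY liblink-grammar
EXPORTS
    ; ... existing public API entries ...
    ; newly exported internal helper (illustrative name only)
    some_internal_test_helper
```

Whether this suffices on all platforms depends on the build: on Windows the `.def` file controls DLL exports, while on Linux default symbol visibility may already export the function, which could explain why no further change was needed.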
Here is a summary of the problems that are still open after the latest fixes: 1. Additional costs are shown in `!disjuncts` of the standard parser. I traced this to be...