
ensure_alive must not affect CoreNLPClient when init with StartServer.DONT_START

Open mariano22 opened this issue 3 years ago • 5 comments

Motivational problem: We can use CoreNLPClient purely as a client interface to an already existing standalone CoreNLP server by passing the argument start_server=StartServer.DONT_START.

But in this case, if the server is down, a request will hang in self.ensure_alive(): https://github.com/stanfordnlp/stanza/blob/011b6c4831a614439c599fd163a9b40b7c225566/stanza/server/client.py#L446

Example

import stanza.server as corenlp
from stanza.server import CoreNLPClient

client = CoreNLPClient(start_server=corenlp.StartServer.DONT_START, endpoint='http://non-existing-server.com:9050', annotators=['parse'], timeout=5000)
ann_sent = client.annotate('sarasa', output_format='json')
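Until the behaviour changes, a caller can probe the external server before constructing the client, so an unreachable endpoint fails fast instead of hanging. A minimal sketch using only the standard library; `server_is_alive` is a hypothetical helper name, and it assumes the CoreNLP server's `/live` health endpoint:

```python
import urllib.request
import urllib.error

def server_is_alive(endpoint, timeout=2):
    """Best-effort probe of an external CoreNLP server's /live endpoint.

    Returns False quickly instead of hanging when the server is unreachable.
    """
    try:
        with urllib.request.urlopen(endpoint + "/live", timeout=timeout):
            return True
    except (urllib.error.URLError, OSError, ValueError):
        return False

# An endpoint with nothing listening fails fast rather than blocking.
print(server_is_alive("http://localhost:1", timeout=1))
```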

Solution alternatives

  • (recommended) Do not call ensure_alive in CoreNLPClient when start_server=StartServer.DONT_START. If the class was not responsible for creating the server instance, it should not be responsible for waiting until the server is alive. In this case we want to use CoreNLPClient purely as a client interface, without the RobustService functionality.
  • Make CHECK_ALIVE_TIMEOUT configurable in RobustService.__init__ and let the user set it from CoreNLPClient.__init__.
  • Make CHECK_ALIVE_TIMEOUT configurable in RobustService.ensure_alive and derive it from the timeout value in CoreNLPClient._request (not recommended at all, as it would change the semantics of timeout).
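The recommended alternative amounts to a small guard before the liveness wait. An illustrative sketch with simplified stand-ins (not the actual stanza source; the real classes take many more arguments) of how CoreNLPClient could skip ensure_alive when it did not start the server:

```python
from enum import Enum

class StartServer(Enum):
    FORCE_START = 0
    TRY_START = 1
    DONT_START = 2

class RobustService:
    CHECK_ALIVE_TIMEOUT = 120  # seconds the real class waits for the server

    def __init__(self, start_server):
        self.start_server = start_server
        self.waited_for_server = False

    def ensure_alive(self):
        # In stanza this blocks (up to CHECK_ALIVE_TIMEOUT) until the server
        # answers; here we only record that the wait would have happened.
        self.waited_for_server = True

class CoreNLPClient(RobustService):
    def annotate(self, text):
        # Proposed fix: only wait for the server if this client started it.
        if self.start_server is not StartServer.DONT_START:
            self.ensure_alive()
        return {"text": text}

client = CoreNLPClient(start_server=StartServer.DONT_START)
client.annotate("sarasa")
print(client.waited_for_server)  # False: no liveness wait for DONT_START
```

With DONT_START the request goes straight to the external server, so a down server surfaces as an ordinary connection error with the caller's own timeout, rather than a long block in ensure_alive.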

If you agree that the current behaviour should be changed, choose any of the solutions and I will create the PR.

mariano22 avatar Jun 24 '22 17:06 mariano22

Thank you for the offer of a PR!

On Fri, Jun 24, 2022 at 10:44 AM John Bauer @.***> wrote:

That does make sense... perhaps just not checking is the best approach?


AngledLuffa avatar Jun 24 '22 17:06 AngledLuffa

Huh, only part of that made it through. I was saying that not checking for alive at the time of client creation sounds reasonable.

AngledLuffa avatar Jun 24 '22 17:06 AngledLuffa

Yes, the alternatives are the different solutions I propose; I recommend the first one.

So can I go ahead with the PR?

Note that this is a backward-incompatible change: if a user created CoreNLPClient with start_server=StartServer.DONT_START and is relying on the self.ensure_alive() functionality, they might notice a difference.

It's a rare case; as I said, I think everyone who uses start_server=StartServer.DONT_START is relying on a standalone server (sometimes an external one), and I don't think they want CoreNLPClient to ensure that server is alive.

mariano22 avatar Jun 24 '22 21:06 mariano22

Yes, thanks


AngledLuffa avatar Jun 24 '22 21:06 AngledLuffa

Thank you for the quick responses! I created this: https://github.com/stanfordnlp/stanza/pull/1061

mariano22 avatar Jun 26 '22 20:06 mariano22

Fix integrated into 1.4.1. Thanks!

AngledLuffa avatar Sep 14 '22 19:09 AngledLuffa

Great, thank you for your kind guidance!


mariano22 avatar Sep 17 '22 22:09 mariano22

That does make sense... perhaps just not checking is the best approach?


AngledLuffa avatar Oct 11 '22 07:10 AngledLuffa