Support to turn off auto refresh

barunpuri opened this issue 1 year ago · 4 comments

Describe the bug It's not a bug, but an improvement request.

Environment Spark version: 3.2 / platform: standalone

To Reproduce Steps to reproduce the behavior:

  1. Submit a Spark query
  2. Open the Spark history page
  3. Open DataFlint
  4. The page refreshes automatically

Expected behavior The ability to turn off auto-refresh

Screenshots

https://github.com/dataflint/spark/assets/12489812/28db053b-d0d5-4c39-8e39-737864e3295d

As you can see, the data keeps updating.

barunpuri avatar Feb 01 '24 08:02 barunpuri

Hi @barunpuri!

The source of the issue is not "auto refresh": when DataFlint is in history server mode it doesn't refresh (there is no reason to). What you are seeing is a server error, and the error message is not indicative (it's the same error shown when DataFlint runs on the Spark driver and the server stops responding, for example because the Spark session ended).

Can you please open the inspect window in your browser and share the output of the console and network tabs? For any failed network request, please show the error.

Thanks!

menishmueli avatar Feb 01 '24 09:02 menishmueli

Well, I see. Thank you for the response. The request to the history server failed.

Request: GET http://___history_server___/proxy/___application_id____/api/v1/applications/___application_id____/sql?offset=0&length=1000&planDescription=false

Response:

<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 500 Request failed.</title>
</head>
<body><h2>HTTP ERROR 500 Request failed.</h2>
<table>
<tr><th>URI:</th><td>/api/v1/applications/___application_id____/sql</td></tr>
<tr><th>STATUS:</th><td>500</td></tr>
<tr><th>MESSAGE:</th><td>Request failed.</td></tr>
<tr><th>SERVLET:</th><td>org.glassfish.jersey.servlet.ServletContainer-49154699</td></tr>
</table>
<hr><a href="https://eclipse.org/jetty">Powered by Jetty:// 9.4.43.v20210629</a><hr/>

</body>
</html>
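
For reference, here is a minimal sketch of hitting the same REST endpoint directly (bypassing the proxy path in the URL above), which can confirm that the 500 comes from the history server itself rather than from DataFlint. It assumes Python 3 with the requests package; the host and application id are placeholders for the values redacted above.

```python
# Minimal sketch: reproduce the failing history-server REST call directly.
# HISTORY_SERVER and APP_ID are placeholders for the redacted values above.
import requests

HISTORY_SERVER = "http://___history_server___"  # history server base URL (placeholder)
APP_ID = "___application_id___"                 # Spark application id (placeholder)

url = f"{HISTORY_SERVER}/api/v1/applications/{APP_ID}/sql"
params = {"offset": 0, "length": 1000, "planDescription": "false"}

resp = requests.get(url, params=params, timeout=30)
print(resp.status_code)   # 500 here reproduces the failure outside DataFlint
print(resp.text[:500])    # first part of the HTML error page, if any
```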

barunpuri avatar Feb 01 '24 09:02 barunpuri

The request that is failing with 500 is a GET request to an official Spark REST API endpoint (source: https://spark.apache.org/docs/latest/monitoring.html).

I have gotten multiple reports so far that this request sometimes fails in Spark 3.2; it seems there is a bug in Spark that was fixed in later versions.

If possible, you could upgrade your history server to the latest Spark version (3.5), and I believe that would solve the problem. The history server is backward compatible, so it can still show runs that were submitted with Spark 3.2.

Could you also please look at your history server logs and share the exception that resulted in error 500?
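
If it helps with narrowing down the logs, here is a rough sketch (same assumptions as the earlier snippet: Python with requests, placeholder host) that lists the applications the history server knows about and probes the /sql endpoint for each, so you can see whether the 500 is specific to certain runs before searching the history server logs for the exception.

```python
# Rough sketch: probe the /sql endpoint for every application the history
# server knows about, to see which runs trigger the 500.
import requests

HISTORY_SERVER = "http://___history_server___"  # history server base URL (placeholder)

apps = requests.get(f"{HISTORY_SERVER}/api/v1/applications", timeout=30).json()
for app in apps:
    r = requests.get(
        f"{HISTORY_SERVER}/api/v1/applications/{app['id']}/sql",
        params={"offset": 0, "length": 1000, "planDescription": "false"},
        timeout=30,
    )
    print(app["id"], r.status_code)
```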

menishmueli avatar Feb 01 '24 09:02 menishmueli

Thank you for your kind response. For now, it's hard to test with 3.5 because a different team manages it. So this issue can be closed as resolved. Thank you.

barunpuri avatar Feb 02 '24 00:02 barunpuri