
Caused by: io.circe.DecodingFailure$$anon$2: Double: DownField(max_temp)

Open hariramesh9a opened this issue 6 years ago • 3 comments


The issue happens when there is no data for the queried intervals and the granularity is minutes/seconds, e.g. with DoubleMaxAggregation(fieldName = "temperature", name = "max_temp")

Caused by: io.circe.DecodingFailure$$anon$2: Double: DownField(max_temp)

Full log:

Execution exception[[anon$2: Double: DownField(max_temp)]]
  at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:251)
  at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:178)
  at play.core.server.AkkaHttpServer$$anonfun$1.applyOrElse(AkkaHttpServer.scala:382)
  at play.core.server.AkkaHttpServer$$anonfun$1.applyOrElse(AkkaHttpServer.scala:380)
  at scala.concurrent.Future.$anonfun$recoverWith$1(Future.scala:417)
  at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
  at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
  at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
  at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
Caused by: io.circe.DecodingFailure$$anon$2: Double: DownField(max_temp)

val query = TimeSeriesQuery(
  aggregations = List(
    DoubleMaxAggregation(fieldName = "temperature", name = "max_temp"),
    DoubleMinAggregation(fieldName = "temperature", name = "min_temp"),
    DoubleMaxAggregation(fieldName = "torque", name = "max_torque"),
    DoubleMinAggregation(fieldName = "torque", name = "min_torque"),
    DoubleMaxAggregation(fieldName = "humidity", name = "max_humidity"),
    DoubleMinAggregation(fieldName = "humidity", name = "min_humidity")
  ),
  granularity = gran,
  intervals = List(intervalStr)
).execute()

query.map(_.results).foreach(println(_))

val result = query.map(_.series[TimeseriesCount].map(x => TimeseriesRes(x._1.format(formatter), x._2.head)))

The results foreach works fine, but series throws the error above.

hariramesh9a avatar Feb 01 '19 02:02 hariramesh9a
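
Note: the reported message matches what circe produces when a JSON null is decoded into a non-optional Double, which would be consistent with Druid returning null aggregation values for buckets that contain no data. The following standalone circe sketch (not from the thread; the Row names are made up) reproduces the same kind of failure and shows that an Option[Double] field tolerates the null:

import io.circe.generic.auto._
import io.circe.parser.decode

case class Row(max_temp: Double)
case class RowOpt(max_temp: Option[Double])

// A bucket without data could come back as {"max_temp": null}.
val json = """{"max_temp": null}"""

// Left(DecodingFailure(Double, List(DownField(max_temp)))) -- the same
// "Double: DownField(max_temp)" failure as in the report above.
println(decode[Row](json))

// Right(RowOpt(None)) -- the optional field decodes the null without failing.
println(decode[RowOpt](json))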

Hi @hariramesh9a

I cannot reproduce the issue. I wrote the following code:

import java.time.format.DateTimeFormatter
import ing.wbaa.druid._
import ing.wbaa.druid.definitions._
import io.circe.generic.auto._
import io.circe.syntax._
import io.circe._
import scala.concurrent.Future


implicit val druidConf = DruidConfig(host = "127.0.0.1", port = 8082)
implicit val system = DruidClient.system
implicit val materializer = DruidClient.materializer
implicit val ec = system.dispatcher


val formatter = DateTimeFormatter.ofPattern("dd/MM/yyyy")


// The case class field names match the aggregation names in the query,
// so that circe can decode each time bucket into a TimeseriesCount.
case class TimeseriesCount(
  max_temp: Double,
  min_temp: Double,
  max_torque: Double,
  min_torque: Double,
  max_humidity: Double,
  min_humidity: Double
)

case class TimeseriesRes(date: String, record: TimeseriesCount)


val query: DruidQuery = TimeSeriesQuery(
    aggregations = List(
      DoubleMaxAggregation(fieldName = "temperature", name = "max_temp"),
      DoubleMinAggregation(fieldName = "temperature", name = "min_temp"),
      DoubleMaxAggregation(fieldName = "torque", name = "max_torque"),
      DoubleMinAggregation(fieldName = "torque", name = "min_torque"),
      DoubleMaxAggregation(fieldName = "humidity", name = "max_humidity"),
      DoubleMinAggregation(fieldName = "humidity", name = "min_humidity")
    ),
    granularity = GranularityType.Minute,
    intervals = List("2011-06-01/2017-06-01")
  )

val request: Future[DruidResponse] = query.execute()

request.map(_.results).foreach(println(_))

// according to @hariramesh9a the following code should fail
val result: Future[Iterable[TimeseriesRes]] = request.map{ response =>
  response.series[TimeseriesCount].map{ case (zdt, entries) =>
    TimeseriesRes(formatter.format(zdt), entries.head)
  }
}

result.foreach(_.foreach(println))

Which version of Scruid are you using? I am testing on the latest v2.1.0 with Circe v0.10.1.

anskarl avatar Feb 14 '19 07:02 anskarl
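
Note: if the failure is indeed caused by empty buckets decoding to null, a possible workaround (a sketch only, not confirmed in this thread) is to make the fields of the result case class optional, so that empty buckets decode to None instead of failing:

// Hypothetical variant of the case class above with optional fields.
case class TimeseriesCountOpt(
  max_temp: Option[Double],
  min_temp: Option[Double],
  max_torque: Option[Double],
  min_torque: Option[Double],
  max_humidity: Option[Double],
  min_humidity: Option[Double]
)

// Same call shape as in the snippet above, only the target type changes.
val optionalSeries = request.map(_.series[TimeseriesCountOpt])
optionalSeries.foreach(_.foreach(println))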

@hariramesh9a ping!

Fokko avatar Feb 27 '19 11:02 Fokko