Task not written to database.
I can publish and process a task using the example you provide.
However, when I publish from my own application, the task does not appear to be written to my DB.
This is the log entry I am seeing: logging/logging.go:129 Task published {"logger": "bokchoy", "component": "queue", "queue": {"name": "suspend", "consumers_count": 2}, "duration": "6.290392ms", "task": {"id": "01DV2D28B0MQJHBF4QX1EFD6Y3", "name": "suspend", "status": "waiting", "payload": "{crn:v1:staging:public:is:us-south-1:a/78d00cc88d254fba849cdd8a8e4f1618::instance:dc4b93ef-2c02-4e4b-bd4c-35deb6c39c6e}", "max_retries": 3, "ttl": "3m0s", "timeout": "3m0s", "published_at": "2019-12-02T04:14:30.240Z", "retry_intervals": "30s, 30s, 30s"}}
Doing a get with the task ID does not return anything.
[git:master] thoas belgariade ~/Sites/golang/src/github.com/thoas/bokchoy
$ go run examples/main.go -run publish
2019-12-02T09:53:29.388+0100 DEBUG logging/logging.go:129 Connecting to redis (client)... {"logger": "bokchoy"}
2019-12-02T09:53:29.392+0100 DEBUG logging/logging.go:129 Connected to redis (client) {"logger": "bokchoy"}
2019-12-02T09:53:29.393+0100 DEBUG logging/logging.go:129 Task published {"logger": "bokchoy", "component": "queue", "queue": {"name": "tasks.message", "consumers_count": 0}, "duration": "190.909µs", "task": {"id": "01DV2X135G7S2J9XF57YDFH5EW", "name": "tasks.message", "status": "waiting", "payload": "{hello}", "max_retries": 3, "ttl": "3m0s", "timeout": "3m0s", "published_at": "2019-12-02T08:53:29.392Z", "retry_intervals": "1m0s, 2m0s, 3m0s"}}
2019/12/02 09:53:29 <Task name=tasks.message id=01DV2X135G7S2J9XF57YDFH5EW, status=waiting, published_at=2019-12-02 08:53:29.392948 +0000 UTC> published
[git:master] thoas belgariade ~/Sites/golang/src/github.com/thoas/bokchoy
$ go run examples/main.go -run get -task-id 01DV2X135G7S2J9XF57YDFH5EW
2019-12-02T09:53:47.131+0100 DEBUG logging/logging.go:129 Connecting to redis (client)... {"logger": "bokchoy"}
2019-12-02T09:53:47.135+0100 DEBUG logging/logging.go:129 Connected to redis (client) {"logger": "bokchoy"}
2019-12-02T09:53:47.135+0100 DEBUG logging/logging.go:129 Task retrieved {"logger": "bokchoy", "component": "queue", "queue": {"name": "tasks.message", "consumers_count": 0}, "duration": "175.007µs", "task": {"id": "01DV2X135G7S2J9XF57YDFH5EW", "name": "tasks.message", "status": "waiting", "payload": "map[data:hello]", "max_retries": 3, "ttl": "3m0s", "timeout": "3m0s", "published_at": "2019-12-02T08:53:29.000Z", "retry_intervals": "1m0s, 2m0s, 3m0s"}}
2019/12/02 09:53:47 <Task name=tasks.message id=01DV2X135G7S2J9XF57YDFH5EW, status=waiting, published_at=2019-12-02 08:53:29 +0000 UTC> retrieved
[git:master] thoas belgariade ~/Sites/golang/src/github.com/thoas/bokchoy
You will need to provide more information; it works properly with the examples.
Hi, thanks.
This is the code: publish, get, and save all work fine, but my handler is never invoked and the total count is not incremented.
task, err := app.SuspendQueue.Publish(ctx, message1{Crn: i.resourceCrn.String()},
	bokchoy.WithMaxRetries(3),
	bokchoy.WithRetryIntervals([]time.Duration{30 * time.Second, 30 * time.Second, 30 * time.Second}))
if err != nil {
	logger.WithError(err).Error("Error publishing suspend message.")
}
logger.Info("published suspend " + task.String())

err = app.SuspendQueue.Save(ctx, task)
if err != nil {
	logger.WithError(err).Error("Error saving suspend message.")
}

task, err = app.SuspendQueue.Get(ctx, task.ID)
if err != nil {
	logger.WithError(err).Error("Error getting suspend message.")
}
logger.Info("get suspend " + task.String())

qs, _ := app.SuspendQueue.Count(ctx)
riaaslogger.Get(nil).Info("TOTAL: " + strconv.Itoa(qs.Total))

app.SuspendQueue.HandleFunc(func(r *bokchoy.Request) error {
	riaaslogger.Get(nil).Info("Received suspend message 2")
	return nil
})
return err
This is the log:
Dec 2 08:44:11 vpc-service-broker-c7855fd67-2jwpr vpc-service-broker DEBUG logging/logging.go:129 Task published {"logger": "bokchoy", "component": "queue", "queue": {"name": "suspend", "consumers_count": 2}, "duration": "5.184552ms", "task": {"id": "01DV3H37RJMQJHBF4QX1EFD6Y3", "name": "suspend", "status": "waiting", "payload": "{crn:v1:staging:public:is:us-south-1:a/c1cf76af9e6046058d0f1b562eb0ca67::instance:7dbc41ab-0108-4ffb-9b30-79fd6666f74f}", "max_retries": 3, "ttl": "3m0s", "timeout": "3m0s", "published_at": "2019-12-02T14:44:11.154Z", "retry_intervals": "30s, 30s, 30s"}}
Dec 2 08:44:11 vpc-service-broker-c7855fd67-2jwpr vpc-service-broker info published suspend <Task name=suspend id=01DV3H37RJMQJHBF4QX1EFD6Y3, status=waiting, published_at=2019-12-02 14:44:11.154764821 +0000 UTC>
Dec 2 08:44:11 vpc-service-broker-c7855fd67-2jwpr vpc-service-broker DEBUG logging/logging.go:129 Task saved {"logger": "bokchoy", "component": "queue", "queue": {"name": "suspend", "consumers_count": 2}, "duration": "5.132302ms", "task": {"id": "01DV3H37RJMQJHBF4QX1EFD6Y3", "name": "suspend", "status": "waiting", "payload": "{crn:v1:staging:public:is:us-south-1:a/c1cf76af9e6046058d0f1b562eb0ca67::instance:7dbc41ab-0108-4ffb-9b30-79fd6666f74f}", "max_retries": 3, "ttl": "3m0s", "timeout": "3m0s", "published_at": "2019-12-02T14:44:11.154Z", "retry_intervals": "30s, 30s, 30s"}}
Dec 2 08:44:11 vpc-service-broker-c7855fd67-2jwpr vpc-service-broker DEBUG logging/logging.go:129 Task retrieved {"logger": "bokchoy", "component": "queue", "queue": {"name": "suspend", "consumers_count": 2}, "duration": "5.019503ms", "task": {"id": "01DV3H37RJMQJHBF4QX1EFD6Y3", "name": "suspend", "status": "waiting", "payload": "map[crn:crn:v1:staging:public:is:us-south-1:a/c1cf76af9e6046058d0f1b562eb0ca67::instance:7dbc41ab-0108-4ffb-9b30-79fd6666f74f]", "max_retries": 3, "ttl": "3m0s", "timeout": "3m0s", "published_at": "2019-12-02T14:44:11.000Z", "retry_intervals": "30s, 30s, 30s"}}
Dec 2 08:44:11 vpc-service-broker-c7855fd67-2jwpr vpc-service-broker info get suspend <Task name=suspend id=01DV3H37RJMQJHBF4QX1EFD6Y3, status=waiting, published_at=2019-12-02 14:44:11 +0000 UTC>
Dec 2 08:44:11 vpc-service-broker-c7855fd67-2jwpr vpc-service-broker info TOTAL: 0
Hi there, I'm also experiencing this on one particular queue; every other queue writes to the database and is triggered.