
v2.3.X: Unique Incoming Messages Incorrectly Flagged as "Duplicated" and Not Sent to Webhook

Open sluy opened this issue 1 month ago • 34 comments

Welcome!

  • [x] Yes, I have searched for similar issues on GitHub and found none.

What did you do?

Hello Evolution API Team,

We are experiencing a critical issue with version v2.3.5 using the evoapicloud/evolution-api:v2.3.5 Docker image.

Bug Description

The API is incorrectly identifying some new, unique incoming messages as duplicates. When this occurs, the message is immediately discarded, leading to two major problems:

The console logs the error Duplicated ignored: [MESSAGE_ID].

The message is never emitted through the messages_upsert event to our configured webhook.

We confirm that these are not actual duplicate messages; they are completely new, unique messages being wrongfully discarded by the system.

Our current presumption is that the issue is related to the message ID cache management in Redis, which seems to be prematurely caching the message ID or misidentifying it as already processed upon initial receipt.
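To illustrate the failure mode we suspect, here is a minimal TypeScript sketch. This is not evolution-api's actual code; the function name, the Map standing in for Redis, and the TTL are all illustrative. It shows how a cache that writes the message ID on first sight can discard a unique message if the same delivery passes through the check twice (for example via a retry or a second listener):

```typescript
// Hypothetical sketch of a "check then set" dedup cache.
// Assumption: the real implementation uses Redis; a Map stands in here.
type Cache = Map<string, number>; // messageId -> expiry timestamp (ms)

// Returns true if the ID is considered a duplicate. Note that the ID is
// cached immediately on first sight, so any re-check of the SAME delivery
// within the TTL wrongly reports a duplicate.
function seenBefore(cache: Cache, messageId: string, ttlMs: number, now: number): boolean {
  const expiry = cache.get(messageId);
  if (expiry !== undefined && expiry > now) return true; // flagged duplicate
  cache.set(messageId, now + ttlMs);
  return false;
}

const cache: Cache = new Map();
const id = "3EB0ABC123"; // illustrative WhatsApp message ID

const first = seenBefore(cache, id, 60_000, Date.now());  // first sight: not a duplicate
const second = seenBefore(cache, id, 60_000, Date.now()); // same delivery re-checked

console.log(first, second); // false true -> the unique message is discarded
```

If the receive path ever evaluates the check more than once per delivery, this pattern produces exactly the symptom we see: a brand-new message logged as `Duplicated ignored` and never emitted to the webhook.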

Steps to Reproduce

1. Run v2.3.5 with a correctly configured Redis instance for caching.

2. Receive a new, unique message from a contact.

3. Observe the API logs (the issue is intermittent and only affects some messages).
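Because the issue is intermittent, counting the marker line in the container logs is the quickest way to see how often messages are being dropped. The sketch below runs the same grep against a sample log file (the container name and message IDs are illustrative; in practice you would pipe `docker logs <container>` into the grep):

```shell
# Sample of the log output (IDs are illustrative):
printf '%s\n' \
  "Duplicated ignored: 3EB0ABC123" \
  "messages.upsert emitted: 3EB0DEF456" \
  > /tmp/evolution-sample.log

# Count how many incoming messages were dropped as duplicates:
grep -c "Duplicated ignored" /tmp/evolution-sample.log
```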

Current Behavior

The log displays Duplicated ignored: [MESSAGE_ID] for a message that is not a duplicate.

The messages_upsert event is not triggered, and the message data is not sent to the webhook.

Expected Behavior

All new and unique messages should be processed correctly.

The messages_upsert event should be reliably triggered, and the message data sent to the webhook endpoint.

Workaround (Temporary Solution)

The issue is immediately resolved by disabling the Redis cache in the Evolution API configuration. When the instance runs without Redis, all messages are processed correctly.

Note: We recognize this is not a sustainable long-term solution as it removes the intended protection against actual duplicate messages, especially after a server restart.
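For reference, this is the environment change we applied. The variable names below follow the Evolution API v2 cache settings as we understand them; verify them against your own deployment's `.env` before applying:

```shell
# .env fragment (illustrative values)

# Temporary workaround: disable the Redis cache entirely.
CACHE_REDIS_ENABLED=false

# Previous (problematic) configuration, kept for reference:
# CACHE_REDIS_ENABLED=true
# CACHE_REDIS_URI=redis://redis:6379/6
```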

Thank you for your prompt attention to this matter. This bug is significantly impacting our production operations.

What did you expect?

The new messages should have been processed, and the messages_upsert event should have been reliably sent to our webhook.

What did you observe instead of what you expected?

The system incorrectly flagged the new messages as "Duplicated," logged Duplicated ignored: [MESSAGE_ID], and failed to send the event to the webhook.

Screenshots/Videos

No response

Which version of the API are you using?

2.3.X

What is your environment?

Linux

Other environment specifications

No response

If applicable, paste the log output

No response

Additional Notes

No response

sluy · Oct 20 '25 17:10