[Bug]: Voting on posts and comments is very slow

Open nick-delirium opened this issue 2 years ago • 5 comments

Requirements

  • [X] Is this a bug report? For questions or discussions use https://lemmy.ml/c/lemmy_support
  • [X] Did you check to see if this issue already exists?
  • [X] Is this only a single bug? Do not put multiple bugs in one issue.
  • [ ] Is this a UI / front end issue? Use the lemmy-ui repo.

Summary

Basically what the title says. This happens with both the WebSocket and HTTP clients; here is a screenshot from the lemmy.world instance: image

While voting takes 2+ minutes, everything else seems to be working a lot faster.

Steps to Reproduce

  1. Vote on post or comment
  2. Wait
  3. Wait
  4. Get response

Technical Details

Latest Chrome, tested on lemmy.ml and lemmy.world

Version

BE 0.17.4

Lemmy Instance URL

lemmy.world

nick-delirium avatar Jun 18 '23 20:06 nick-delirium

Sorry there isn't a whole lot of info; I'm developing a mobile app and just spotted that voting on entries takes too much time. :)

nick-delirium avatar Jun 18 '23 20:06 nick-delirium

Database writes are having performance problems throughout the app. Issue #3061 covers slow post creation, which is also a database write.

RocketDerp avatar Jun 18 '23 21:06 RocketDerp

Lemmy.world deployed critical fixes today that have made a huge difference in their server CPU usage; they can now run the site on a single lemmy_server instance, where they had been running multiples to try and cope with the overload. That said, upvoting and creating comments, the SQL INSERT transactions, are still slow. Part of the problem is that a lot of uncached SQL queries run inside the Rust code, such as checking on every single vote whether the local site has downvotes disabled. It also queues outbound federation activity within the path of a single vote, which I suspect is also going on with comments and posts.
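The uncached settings lookup described above could be handled with a simple in-process cache. A minimal Rust sketch under assumed names (this is not Lemmy's actual code), assuming the setting changes rarely enough that a process-wide cache is acceptable:

```rust
use std::sync::OnceLock;

// Hypothetical sketch: cache the local site's "downvotes enabled" setting
// in memory instead of issuing a SELECT on every single vote. The setting
// changes rarely, so a process-wide OnceLock (or a short-TTL cache, if the
// value must be refreshable) avoids one database round-trip per vote.
static DOWNVOTES_ENABLED: OnceLock<bool> = OnceLock::new();

fn downvotes_enabled(load_from_db: impl FnOnce() -> bool) -> bool {
    // Only the first caller runs `load_from_db`; everyone else reads the cache.
    *DOWNVOTES_ENABLED.get_or_init(load_from_db)
}

fn main() {
    let mut db_hits = 0;
    // First call hits the "database"...
    let a = downvotes_enabled(|| {
        db_hits += 1;
        true
    });
    // ...subsequent calls are served from the cache.
    let b = downvotes_enabled(|| {
        db_hits += 1;
        true
    });
    assert!(a && b);
    assert_eq!(db_hits, 1);
}
```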

INSERT INTO "comment_like" ("person_id", "comment_id", "post_id", "score") VALUES ($1, $2, $3, $4) ON CONFLICT ("comment_id", "person_id") DO UPDATE SET "person_id" = $5, "comment_id" = $6, "post_id" = $7, "score" = $8 RETURNING "comment_like"."id", "comment_like"."person_id", "comment_like"."comment_id", "comment_like"."post_id", "comment_like"."score", "comment_like"."published"

The design uses the SQL RETURNING of the comment_like, doing basically a full set of database steps: DELETE the old vote, INSERT the new vote, fetch the comment. comment_like is by far the biggest write activity on the site; queuing these and applying them in batches would seem a good target for improving overall database scaling and the end-user experience.
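The queue-and-batch idea could be sketched as an in-memory buffer that coalesces votes keyed on the same (comment_id, person_id) pair the ON CONFLICT clause targets, so a periodic flush turns many single-row upserts into one multi-row statement. A hedged Rust sketch with hypothetical names, not Lemmy's actual code:

```rust
use std::collections::HashMap;

// One pending comment_like row, mirroring the columns in the upsert above.
#[derive(Debug, Clone, PartialEq)]
struct Vote {
    person_id: i32,
    comment_id: i32,
    post_id: i32,
    score: i16,
}

#[derive(Default)]
struct VoteBuffer {
    // Keyed on (comment_id, person_id), matching the ON CONFLICT target,
    // so a rapid up/down/up from one user collapses to a single row.
    pending: HashMap<(i32, i32), Vote>,
}

impl VoteBuffer {
    fn push(&mut self, vote: Vote) {
        self.pending.insert((vote.comment_id, vote.person_id), vote);
    }

    // Drain everything accumulated so far; a scheduled flush would turn
    // this batch into one multi-row INSERT ... ON CONFLICT statement.
    fn drain_batch(&mut self) -> Vec<Vote> {
        self.pending.drain().map(|(_, v)| v).collect()
    }
}

fn main() {
    let mut buf = VoteBuffer::default();
    buf.push(Vote { person_id: 1, comment_id: 10, post_id: 5, score: 1 });
    // Same user re-votes on the same comment: overwrites the pending row.
    buf.push(Vote { person_id: 1, comment_id: 10, post_id: 5, score: -1 });
    buf.push(Vote { person_id: 2, comment_id: 10, post_id: 5, score: 1 });
    let batch = buf.drain_batch();
    assert_eq!(batch.len(), 2); // 2 rows to write instead of 3 statements
}
```

The trade-off is that votes become eventually consistent on the server side, which is why the flush interval would have to stay short.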

RocketDerp avatar Jul 05 '23 14:07 RocketDerp

The biggest performance gain here is in offloading a lot of the work done in SQL triggers, which run synchronously, into periodic scheduled jobs. Deleting a comment_like, for example, fires these:

Triggers:
  comment_aggregates_score AFTER INSERT OR DELETE ON comment_like FOR EACH ROW EXECUTE FUNCTION comment_aggregates_score()
  person_aggregates_comment_score AFTER INSERT OR DELETE ON comment_like FOR EACH ROW EXECUTE FUNCTION person_aggregates_comment_score()

That person_aggregates update isn't so important that it needs to happen instantly; it could be moved to a scheduled job.
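The scheduled-job alternative amounts to recomputing the aggregate from the raw vote rows on a timer instead of maintaining it per-row in a trigger. A minimal Rust sketch of that recomputation (hypothetical shapes; `(author_id, score)` pairs stand in for rows joined from comment_like):

```rust
use std::collections::HashMap;

// Hypothetical sketch: rather than a per-row trigger bumping
// person_aggregates on every comment_like write, a periodic job
// recomputes each person's comment score from the raw votes.
fn recompute_comment_scores(votes: &[(i32, i16)]) -> HashMap<i32, i64> {
    let mut totals: HashMap<i32, i64> = HashMap::new();
    for &(author_id, score) in votes {
        *totals.entry(author_id).or_insert(0) += i64::from(score);
    }
    totals
}

fn main() {
    // Two upvotes and one downvote for author 1, one upvote for author 2.
    let votes = [(1, 1), (1, 1), (1, -1), (2, 1)];
    let totals = recompute_comment_scores(&votes);
    assert_eq!(totals[&1], 1);
    assert_eq!(totals[&2], 1);
}
```

The vote write path then touches only comment_like itself; the displayed person score lags by at most one job interval.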

dessalines avatar Jul 06 '23 12:07 dessalines

I ran a loop voting against lemmy.ml every 15 seconds back on June 18, when this issue was opened... it was indeed taking massive amounts of time back then, often over 15 seconds per vote.

Today lemmy.ml is responding in under 1 second, often 0.75 seconds, even as low as 0.55 seconds (and a few went as high as 1.45 seconds). Maybe the async federation calls in Lemmy 0.18.2 have largely fixed this since June 18?
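A timing loop like the one described can be sketched as follows (a hypothetical Rust sketch, not the actual script used; the real probe would issue the vote API request where the closure body is):

```rust
use std::time::{Duration, Instant};

// Time a single probe. `send_vote` stands in for the real HTTP request
// to the instance's vote endpoint, which is elided here.
fn time_request<F: FnOnce()>(send_vote: F) -> Duration {
    let start = Instant::now();
    send_vote();
    start.elapsed()
}

fn main() {
    for _ in 0..3 {
        let elapsed = time_request(|| {
            // A real probe would issue the vote API call here.
        });
        println!("vote round-trip: {:?}", elapsed);
        // std::thread::sleep(Duration::from_secs(15)); // 15s pacing, as in the real run
    }
}
```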

RocketDerp avatar Jul 11 '23 13:07 RocketDerp

Indeed, voting and posting are a lot faster on almost every instance I tested this morning. I guess this issue can be considered fixed now, thank you all :)

nick-delirium avatar Jul 12 '23 10:07 nick-delirium