
Support for Emcache as a new cache backend

Open pfreixes opened this issue 4 years ago • 5 comments

Hi,

First of all, thanks for the work done here. Having a facade that handles the complexity of using one or another cache backend while exposing them under the same functionality is a nice fit!

Let me also make the disclaimer that I'm the author of Emcache, so any opinion I give you will always be biased by that fact.

As I said, I've been actively working on a new Memcached client driver based on asyncio which tries to address some of the functionality missing from Aiomcache. The most relevant items:

  • Support for a multi cluster context, where there are several nodes participating within the same cluster
  • Concept of node healthiness, with the optional feature of purging unhealthy nodes
  • Full metrics coverage for understanding how the driver is behaving on the client side
  • etc

I was thinking that Aiocache might also provide support for using Emcache as a new cache backend. I would be more than happy to work on the PR to add this support, but before starting on that PR I would like to hear your opinion.

Also, it's relevant to mention that Emcache is still considered beta and has not yet been fully tested in production. At the same time, a few methods that are currently exposed by Aiocache, like increment, are still not implemented - I'm planning to implement the missing commands that Aiocache requires for the next version, which would be 0.3.0b0.

WDYT?

pfreixes avatar Jun 07 '20 19:06 pfreixes

A pull request would be nice.

rspadim avatar Jun 18 '20 02:06 rspadim

Hey @pfreixes I'm happy to review that change with the following conditions:

  • emcache supports the same functionality that aiomcache currently brings. As in, they should be interchangeable without user friction
  • We have a nice way for the users to select one or the other (always with a default which for now should be aiomcache) and it's documented nicely
  • This is implementation-specific, but I would keep the same interface and just create a new backend that talks to emcache, then use one or the other depending on what the user installs (see the sketch below)
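
A minimal sketch of that last point, assuming the backend subclasses aiocache's BaseCache and overrides its private _get/_set/_delete hooks (signatures simplified here), and that emcache exposes create_client, MemcachedHostAddress, get, set and delete as described in its README - all of these names should be double-checked against the real code:

```python
import emcache
from aiocache.base import BaseCache


class EmcacheBackend(BaseCache):
    """Hypothetical aiocache backend that delegates to an emcache client."""

    def __init__(self, endpoints=None, **kwargs):
        super().__init__(**kwargs)
        # One or more memcached nodes; multi-node clusters are emcache's main selling point.
        self.endpoints = endpoints or [emcache.MemcachedHostAddress("localhost", 11211)]
        self._client = None

    async def _connect(self):
        if self._client is None:
            self._client = await emcache.create_client(self.endpoints)
        return self._client

    async def _get(self, key, encoding="utf-8", **kwargs):
        client = await self._connect()
        item = await client.get(key if isinstance(key, bytes) else key.encode())
        if item is None:
            return None
        return item.value.decode(encoding) if encoding else item.value

    async def _set(self, key, value, ttl=0, **kwargs):
        client = await self._connect()
        if isinstance(value, str):
            value = value.encode()
        await client.set(key if isinstance(key, bytes) else key.encode(), value, exptime=ttl or 0)
        return True

    async def _delete(self, key, **kwargs):
        client = await self._connect()
        await client.delete(key if isinstance(key, bytes) else key.encode())
        return 1
```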

argaen avatar Sep 02 '20 16:09 argaen

Hello everyone! We had some problems with aiomcache and Python 3.9, so I wrote a backend to use emcache instead, and it works well for us. Here it is:

https://gist.github.com/Sinkler/0b6389ff8ced9e1c0d82e2c478862a08

Sinkler avatar Aug 20 '21 15:08 Sinkler

Nice work! Maybe you could provide a PR to support it officially at the aiocache level /cc @argaen

pfreixes avatar Aug 20 '21 19:08 pfreixes

I read that this project doesn't have maintainers now, but if someone revives it I will create a PR.

Actually, it was an emergency situation with aiomcache for us, so we decided to change the cache backend, but at the moment we are migrating to bare aioredis 🙃

Sinkler avatar Aug 21 '21 09:08 Sinkler

I'm not sure adding more backends to aiocache directly is a good idea. It makes things more complex and adds more maintenance burden for us.

I think it's best we stick to the most popular libraries (especially those we also maintain, aiomcache is one of ours, and aioredis was before it migrated to the redis project).

@pfreixes I think the best thing to do is to include the cache class in your project somewhere. Then someone should be able to just do aiocache.Cache(emcache.aiocache.EmcacheCache) or similar.

If you're happy with that, I'll happily take a PR to link to your project in the README or somewhere. (Although, I'm also curious if anything could be done to bring the improvements of emcache to aiomcache, so people don't need to choose between the libraries in the first place...)
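
A minimal sketch of the usage suggested above, assuming the backend class ships inside the emcache project; neither the emcache.aiocache module nor EmcacheCache exists today, so the names are purely illustrative:

```python
import asyncio

import aiocache
import emcache.aiocache  # hypothetical module shipped by the emcache project


async def main():
    # The Cache factory is handed the backend class, as suggested above.
    cache = aiocache.Cache(emcache.aiocache.EmcacheCache)
    await cache.set("greeting", "hello")
    assert await cache.get("greeting") == "hello"


asyncio.run(main())
```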

Dreamsorcerer avatar Jan 02 '23 00:01 Dreamsorcerer

@Dreamsorcerer if you don't have enough resources and are spreading yourselves thin, perhaps a better move would be to bring aiomcache's functionality (I see it has interesting handling of custom types) into emcache and continue working on that.

I am currently working on using memcache for my project, and evaluated aiomcache, emcache and pymemcache and this is what I found:

  • pymemcache - has the most features and seems to be the most active (it even has a feature for migrating from one cluster to another), but it's only synchronous
  • emcache - less functionality than the above, but it supports async; the interface could be a little more pythonic, and I wish it supported the meta protocol, service discovery, etc.
  • aiomcache - when trying it I discovered it has no clustering support and can only work with a single memcached node. This might be fine for a POC, but any real use requires rendezvous hashing (see the sketch below).
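
For context, a minimal illustration of rendezvous (highest-random-weight) hashing, the kind of client-side node selection referred to above: each key goes to whichever node scores highest for it, so adding or removing a node only remaps the keys that node owned. The node names are made up.

```python
import hashlib


def pick_node(key: str, nodes: list[str]) -> str:
    """Return the node with the highest hash score for this key."""
    def score(node: str) -> int:
        digest = hashlib.md5(f"{node}:{key}".encode()).digest()
        return int.from_bytes(digest, "big")

    return max(nodes, key=score)


nodes = ["cache-1:11211", "cache-2:11211", "cache-3:11211"]
print(pick_node("user:42", nodes))  # always the same node for the same key
```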

Looking at memcached support in Python, I feel like there's not enough interest in memcache (it seems Redis has snatched it), so perhaps the number of clients should be consolidated.

The sad thing is that none of these projects is as featureful as the Java or PHP clients. I wish the memcached project would do what Redis did and make an official library.

takeda avatar Aug 05 '23 16:08 takeda

As I suggested above, I think emcache support can be added directly within emcache, rather than expecting this project to maintain a dozen different backends. I think it should be extensible, rather than a monolithic project.

I've not heard anything back on that suggestion, but if someone creates an adapter for aiocache, we can start listing all the 3rd-party options on the README.

(And if you'd like to try adding cluster support to aiomcache, that'd also be great to have).

Dreamsorcerer avatar Aug 05 '23 22:08 Dreamsorcerer

This is my problem as a user. I have existing code that makes calls to the AWS API. Over time the code got slower, because the number of objects it manages increased, which increased the number of API calls. I'm trying to speed it up by adding a caching layer that caches some responses. I got a sprint allocated to work on it, and unfortunately I don't have much control over that.

Now I'm trying to figure out which package can get me closest to the goal. pymemcache is the best, but it is synchronous and my code is async; emcache is the closest; aiomcache looks nice, but since it has no clustering support, and adding that on its own is a non-trivial project, I can't use it. So I'm planning to go with emcache.

None of those 3 supports auto discovery (https://docs.aws.amazon.com/AmazonElastiCache/latest/mem-ug/AutoDiscovery.AddingToYourClientLibrary.html), so I still have some additional work to do. I plan to monkeypatch emcache, and if I have time I can make a PR for it.
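
For reference, a rough sketch of the discovery step the AWS doc above describes: ask the cluster's configuration endpoint for the current node list with "config get cluster", then hand those nodes to whichever client you use. The response layout assumed here (a CONFIG header line, a config-version line, then space-separated "hostname|ip|port" entries) follows the AWS docs and should be verified against your engine version:

```python
import asyncio


async def discover_nodes(config_endpoint: str, port: int = 11211) -> list[tuple[str, int]]:
    """Query an ElastiCache configuration endpoint for the cluster's node list."""
    reader, writer = await asyncio.open_connection(config_endpoint, port)
    writer.write(b"config get cluster\r\n")
    await writer.drain()
    raw = await reader.readuntil(b"END\r\n")
    writer.close()
    await writer.wait_closed()

    lines = raw.decode().splitlines()
    # lines[0] = "CONFIG cluster ..." header, lines[1] = config version,
    # lines[2] = space-separated "hostname|ip|port" entries.
    nodes = []
    for entry in lines[2].split():
        hostname, _ip, node_port = entry.split("|")
        nodes.append((hostname, int(node_port)))
    return nodes
```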

Anyway, ideally it would be great to add this functionality to all of them, but that's 3x the amount of work.

I actually chose memcached for this because I'm stubborn. I think memcached is in principle closer to what I need, but I'm sure Redis would work perfectly fine; I could probably also cache the data in a database table, or even DynamoDB or S3, but I want to use the right tool for the job.

What I like about Redis is that you go to their page, choose Python, and there's the official redis package (as I understand it, it originates from aioredis), and the entire Redis/Python community has settled on it.

I believe memcache needs the same thing. The community is too small to have multiple clients, and since they still feel somewhat unfinished, it discourages people from using memcached. As a user, I think all the clients should merge, unless one offers something drastically different that can be easily added (like a new way of interacting).

BTW: I arrived at this repo because at first it looked like yet another client that could talk to memcache that I didn't know about, but now I understand that it is just a common interface for different caches.

takeda avatar Aug 06 '23 00:08 takeda