Feature Request - Caching
It would be nice to have caching without having to add additional moving parts to our infrastructure. Some features I'd like to see: cache by annotation/self-config, LRU with max memory/item limits, and a max duration.
It seems like it would be fairly simple to add an LRU caching middleware to Traefik. The tricky part would be the configuration side: being able to specify which paths, duration, max memory, etc.
Has this been discussed?
Perhaps we could use https://github.com/lox/httpcache
Love this, but it might be a pain to maintain.
IMHO, we could use an existing caching solution like Varnish or Nginx. I don't know how to configure this at this point (Docker Swarm Mode here); we just need to know how to run it.
Any docker-compose setup, anyone?
+1
> It seems like it would be fairly simple to add an LRU caching middleware to Traefik. The tricky part would be the configuration side: being able to specify which paths, duration, max memory, etc.
@rrichardson As a start, a memory limit should be enough to configure such a middleware. An HTTP backend can control caching via response headers. Things could get complicated when you need to adjust hashing, purging, or cookie handling.
:+1: to simple caching that just has a single memory-limit setting and honours Cache-Control/Edge-Control/Expires. Anything more than that would be too complicated and probably better suited to something like Varnish.
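To make that concrete, here is a rough, non-authoritative sketch of what such a middleware could look like in Go: a single memory limit, LRU eviction, and a TTL taken from the backend's `Cache-Control: max-age` header. None of the names below are Traefik APIs; this is just an illustration of the idea.

```go
package cachemw

import (
	"container/list"
	"net/http"
	"net/http/httptest"
	"strconv"
	"strings"
	"sync"
	"time"
)

type entry struct {
	key, contentType string
	body             []byte
	expires          time.Time
}

// lruCache is a memory-bounded LRU store; maxBytes is the single memory-limit knob.
type lruCache struct {
	mu                 sync.Mutex
	maxBytes, curBytes int64
	order              *list.List               // front = most recently used
	items              map[string]*list.Element // key -> element holding *entry
}

func (c *lruCache) get(key string) (*entry, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	el, ok := c.items[key]
	if !ok {
		return nil, false
	}
	e := el.Value.(*entry)
	if time.Now().After(e.expires) { // TTL came from the backend's Cache-Control header
		c.remove(el)
		return nil, false
	}
	c.order.MoveToFront(el)
	return e, true
}

func (c *lruCache) set(e *entry) {
	c.mu.Lock()
	defer c.mu.Unlock()
	if old, ok := c.items[e.key]; ok {
		c.remove(old)
	}
	c.curBytes += int64(len(e.body))
	c.items[e.key] = c.order.PushFront(e)
	for c.curBytes > c.maxBytes && c.order.Len() > 0 { // evict least recently used entries
		c.remove(c.order.Back())
	}
}

func (c *lruCache) remove(el *list.Element) {
	e := el.Value.(*entry)
	c.curBytes -= int64(len(e.body))
	delete(c.items, e.key)
	c.order.Remove(el)
}

// Middleware caches 200 responses to GET requests that carry Cache-Control: max-age,
// bounded by a single maxBytes memory limit.
func Middleware(maxBytes int64, next http.Handler) http.Handler {
	cache := &lruCache{maxBytes: maxBytes, order: list.New(), items: map[string]*list.Element{}}
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		key := r.Host + r.URL.RequestURI()
		if e, ok := cache.get(key); ok && r.Method == http.MethodGet {
			w.Header().Set("Content-Type", e.contentType)
			w.Write(e.body)
			return
		}
		rec := httptest.NewRecorder() // buffer the upstream response so it can be stored
		next.ServeHTTP(rec, r)
		res := rec.Result()
		ttl := maxAge(res.Header.Get("Cache-Control"))
		if r.Method == http.MethodGet && res.StatusCode == http.StatusOK && ttl > 0 {
			cache.set(&entry{key: key, contentType: res.Header.Get("Content-Type"), body: rec.Body.Bytes(), expires: time.Now().Add(ttl)})
		}
		for k, v := range res.Header { // pass the response through unchanged
			w.Header()[k] = v
		}
		w.WriteHeader(res.StatusCode)
		w.Write(rec.Body.Bytes())
	})
}

func maxAge(cc string) time.Duration {
	for _, d := range strings.Split(cc, ",") {
		d = strings.TrimSpace(d)
		if strings.HasPrefix(d, "max-age=") {
			if secs, err := strconv.Atoi(strings.TrimPrefix(d, "max-age=")); err == nil {
				return time.Duration(secs) * time.Second
			}
		}
	}
	return 0
}
```

Usage would then be something like `Middleware(64<<20, backendHandler)` for a 64 MiB cache. The hard parts the comments above point out (purging, cookies, Vary, streaming responses) are deliberately not handled here.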
Via a docker stack, it would be nice to have Varnish between the webapp and Traefik.
My use case would be Traefik with Let's Encrypt and multiple dockerised web apps behind it, with Traefik generating the config automatically.
I'd like to cache recently used static assets like I can in nginx. I can't put Varnish behind Traefik because it would require something to route requests to the right apps, and I can't put Varnish in front because everything from Traefik forward is HTTPS.
On 14 Sep 2017 02:02, "Pascal Andy" [email protected] wrote:
> Via a docker stack, it would be nice to have Varnish between the webapp and Traefik.
With caching we could also have a grace period, in which Traefik would serve static pages even if the backend is down. This would eliminate the need for having Nginx/Varnish in front, at least in my scenario.
Just my 2 cents, relevant for my use cases and possibly already mentioned in some way in a previous post:

- I'd like to "microcache" the responses of backends for a short period of time in order to relieve them (mainly when the application itself doesn't cache), e.g.:
  - a response to a GET request that resolves as 200 is cached for 5 seconds
  - a response that returns a 301 is cached for 10 minutes
- When appropriate (again, mainly 200 responses to GET requests), a cached result is returned to the client when the backends are timing out or return a 50x (see the sketch after this list).
- As I depend on Traefik's ACME client and TLS handling, I cannot use nginx for such caching in front of Traefik.
- Putting several nginx instances between Traefik and the backends implies a bigger scaling footprint, and configuration becomes tedious.
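To illustrate the microcache and grace-period ideas above, here is a hedged Go sketch using plain net/http (not Traefik code). The per-status TTLs and the unbounded map store are illustrative only; backend timeouts are assumed to surface as 50x responses from the handler being wrapped.

```go
package microcache

import (
	"net/http"
	"net/http/httptest"
	"sync"
	"time"
)

type cached struct {
	status   int
	body     []byte
	header   http.Header
	storedAt time.Time
	ttl      time.Duration
}

func (c cached) fresh() bool { return time.Since(c.storedAt) < c.ttl }

// ttlFor implements the per-status policy described above.
func ttlFor(status int) time.Duration {
	switch status {
	case http.StatusOK:
		return 5 * time.Second
	case http.StatusMovedPermanently:
		return 10 * time.Minute
	default:
		return 0 // everything else is not cached
	}
}

func Microcache(next http.Handler) http.Handler {
	var mu sync.Mutex
	store := map[string]cached{} // NOTE: unbounded; a real middleware needs eviction

	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.Method != http.MethodGet {
			next.ServeHTTP(w, r)
			return
		}
		key := r.Host + r.URL.RequestURI()

		mu.Lock()
		entry, ok := store[key]
		mu.Unlock()
		if ok && entry.fresh() {
			writeCached(w, entry) // fresh hit: relieve the backend entirely
			return
		}

		rec := httptest.NewRecorder()
		next.ServeHTTP(rec, r)
		res := rec.Result()

		// Grace period: if the backend answers 50x and we still hold a copy, serve it stale.
		if res.StatusCode >= 500 && ok {
			writeCached(w, entry)
			return
		}

		if ttl := ttlFor(res.StatusCode); ttl > 0 {
			mu.Lock()
			store[key] = cached{status: res.StatusCode, body: rec.Body.Bytes(), header: res.Header, storedAt: time.Now(), ttl: ttl}
			mu.Unlock()
		}
		for k, v := range res.Header {
			w.Header()[k] = v
		}
		w.WriteHeader(res.StatusCode)
		w.Write(rec.Body.Bytes())
	})
}

func writeCached(w http.ResponseWriter, c cached) {
	for k, v := range c.header {
		w.Header()[k] = v
	}
	w.WriteHeader(c.status)
	w.Write(c.body)
}
```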
I would also really, really appreciate it if Traefik were able to cache! Please implement this :)
In the meantime, if you're looking for a simple but fully working example of how to use Nginx as a cache in front of Traefik, have a look at https://github.com/jonashackt/traefik-cache-nginx-spring-boot
This is a much-needed feature for production environments.
I would also like to +1 this feature request. A single pane for load balancing + caching that is easy to configure and docker/kubernetes-friendly would be a game changer.
I agree with @joeldeteves. I'm currently testing Traefik without anything in front of it and preparing for production use, but the caching part is important. With nginx in front, we can easily handle caching of static content right there.
Are we sure Traefik is the right place to do this? Varnish and Nginx do cache out of the box.
Should we instead work on a great docker-compose template that makes those two elements work together in a transparent way?
I'm definitely not sure about it, but it would be great to have a simple way to activate caching directly in the load balancer, just like in nginx.
I'm not sure how it would work with a docker service in between handling the cache, so I'd appreciate any examples or templates for that as well. As long as it's easy to use, I'm happy. :)
@pascalandy IMO the power of traefik comes from labels. Being able to define caching behavior based on labels would be extremely powerful.
> I'm not sure how it would work with a docker service in between handling the cache, so I'd appreciate any examples or templates for that as well.

I have no idea either.
This feature could again be a killer feature for Træfik!
Seriously, enough. This issue has been open since 2016. There are a lot of subscribers like me who have notifications turned on, and most comments say the same thing. It is understood that this would be a killer feature, an awesome feature, that YOU also want it, etc. So unless you have a PR or something constructive to bring to this discussion, I suggest you just use the thumbs-up on the original post and the subscribe button to get an update when this might be closed.
Cool feature! It must be configurable by file type and origin, and allow disabling logging and passing other header rules, e.g.:
```
location ~ \.(js|css|png|jpg|jpeg|gif|ico|html|woff|woff2|ttf|svg|eot|otf)$ {
    add_header "Access-Control-Allow-Origin" "*";
    expires 1M;
    access_log off;
    add_header Cache-Control "public";
}
```
Yep - originally requested back in 2017 and lots of chat in 2018. Still, no word from the Official Team 😢
Would love some acknowledgement .. even if it is a "nope".
They decided it's a priority/P3 on Jun 6, 2017. You can disagree with that priority, but you can't say they didn't say anything. They are highly responsive, and they can't be highly responsive on every feature.
> Still, no word from the Official Team. Would love some acknowledgement .. even if it is a "nope".
Sorry @pascalandy - I meant via some text/post comment update, not just a tag that feels like it's lost in the backlog.
I should have been more verbose with my comment above. Sorry if that came off as rude :( soz.
Don't know if this helps everyone, but I built a configurable nginx proxy container which can easily be used in conjunction with Traefik. Set the UPSTREAM env variable to the app URL in your internal (overlay) network, then use the same Host rule as for your app service but with additional PathPrefix or Path rules.
Seems to work flawlessly in my swarms.
@PureKrome well, it's very complex (it's much more than @gp187's code snippet shows). Given that it's far from being a core feature, I'm not surprised it's still not there. Just use nginx, Varnish, or a CDN (if possible).
Hi,
Is it possible to implement caching functionality? For example, like this: https://github.com/victorspringer/http-cache
Greetings! There is a plugin for HTTP caching. While it's file-based, there are certainly opportunities for other implementations to be built that use fast data stores (Redis, Memcached, etc.).
https://pilot.traefik.io/plugins/270947801855164928/simple-cache
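For anyone exploring that route: Traefik's middleware plugins follow a convention of exporting a `CreateConfig` function and a `New(ctx, next, config, name)` constructor. The skeleton below is a minimal, illustrative sketch of a caching plugin in that shape; the unbounded map store and the fixed TTL are placeholders and not how the plugin linked above actually works.

```go
// Package cacheplugin sketches the shape of a Traefik middleware plugin
// following the CreateConfig/New convention; the caching itself is a toy.
package cacheplugin

import (
	"context"
	"net/http"
	"net/http/httptest"
	"sync"
	"time"
)

// Config is what users would set in their dynamic configuration for this middleware.
type Config struct {
	TTLSeconds int `json:"ttlSeconds,omitempty"`
}

// CreateConfig returns the default configuration; Traefik calls this before New.
func CreateConfig() *Config {
	return &Config{TTLSeconds: 10}
}

type response struct {
	body    []byte
	status  int
	expires time.Time
}

type cachePlugin struct {
	next  http.Handler
	name  string
	ttl   time.Duration
	mu    sync.Mutex
	store map[string]response // unbounded toy store, for illustration only
}

// New is called by Traefik with the next handler in the chain and the user's config.
func New(ctx context.Context, next http.Handler, config *Config, name string) (http.Handler, error) {
	return &cachePlugin{
		next:  next,
		name:  name,
		ttl:   time.Duration(config.TTLSeconds) * time.Second,
		store: map[string]response{},
	}, nil
}

func (c *cachePlugin) ServeHTTP(w http.ResponseWriter, r *http.Request) {
	if r.Method != http.MethodGet {
		c.next.ServeHTTP(w, r)
		return
	}
	key := r.Host + r.URL.RequestURI()

	c.mu.Lock()
	cached, ok := c.store[key]
	c.mu.Unlock()
	if ok && time.Now().Before(cached.expires) {
		w.WriteHeader(cached.status)
		w.Write(cached.body)
		return
	}

	rec := httptest.NewRecorder() // buffer the downstream response so it can be stored
	c.next.ServeHTTP(rec, r)
	if rec.Code == http.StatusOK {
		c.mu.Lock()
		c.store[key] = response{body: rec.Body.Bytes(), status: rec.Code, expires: time.Now().Add(c.ttl)}
		c.mu.Unlock()
	}
	for k, v := range rec.Header() {
		w.Header()[k] = v
	}
	w.WriteHeader(rec.Code)
	w.Write(rec.Body.Bytes())
}
```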
Another plugin will be released soon with better integration, based on my cache system called Souin. A minimal version, more efficient and as simple as possible to configure, is under development. The full version handles Redis and in-memory storage, and makes it easy to introduce your own cache provider since you just implement the required interface. Cf. this repository: https://github.com/darkweak/souin
Hello, any news on caching?
I'm currently working on RFC support and shared cache support too. Then I'm waiting for support of the unsafe lib to publish my cache system as a plugin.