
Feeds no longer refresh automatically after update to 2.0.33.

Open Fr0stykiller opened this issue 2 years ago • 12 comments

Hi! After upgrading to 2.0.33 I've noticed that feeds no longer refresh automatically. They do refresh after a manual trigger through the Feeds menu.

Expected behavior: refresh every minute, or after accessing the Unread tab.

I'm deploying Miniflux through docker-compose with a Postgres database. I've enabled debug mode but didn't find anything obvious. Let me know if you need any additional info/logs. Thanks!

Fr0stykiller avatar Oct 08 '21 10:10 Fr0stykiller

I have the same issue. I initially thought it was caused by trying Watchtower for the first time, a service that updates Docker containers when new versions come out. But that probably isn't related; I still have the env vars that would enable it:

SCHEDULER_ENTRY_FREQUENCY_MAX_INTERVAL=1440
SCHEDULER_ENTRY_FREQUENCY_MIN_INTERVAL=5
SCHEDULER_SERVICE=true
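
For anyone comparing setups, these variables would typically live under the `environment:` key of the Miniflux service in docker-compose. This is only an illustrative sketch: the service name, image tag, and the `DEBUG` line are assumptions, not taken from my actual file:

```yaml
services:
  miniflux:                                    # service name is illustrative
    image: miniflux/miniflux:2.0.33
    environment:
      DEBUG: 1                                 # optional: verbose logs while troubleshooting
      SCHEDULER_SERVICE: "true"
      SCHEDULER_ENTRY_FREQUENCY_MIN_INTERVAL: 5
      SCHEDULER_ENTRY_FREQUENCY_MAX_INTERVAL: 1440
```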

This never happened to me before the latest version.

dakotahp avatar Oct 09 '21 18:10 dakotahp

I have the same issue: the feeds refresh correctly when I hit the button on the Feeds page, but they sometimes fail to refresh in the background. I am on 2.0.33 as well. I have tried many different settings; here are my current related environment variables (as set in a docker-compose file):

POLLING_PARSING_ERROR_LIMIT: 0
POLLING_FREQUENCY: 5
POLLING_SCHEDULER: entry_frequency
DISABLE_SCHEDULER_SERVICE: 0
SCHEDULER_ENTRY_FREQUENCY_MAX_INTERVAL: 30

ozls avatar Oct 25 '21 09:10 ozls

@fguillot Is there something we can all check for more info since you can't replicate? Something in the logs, some running process in the container?

I'm going to try spinning up a new setup on a different machine and see whether the fetching stalls there. Likely it won't. Then I'll copy the database over to the new setup to see if anything in the data is causing it.

dakotahp avatar Oct 25 '21 21:10 dakotahp

After running a newer Docker image for Miniflux, the problem with refreshing seems to be gone. Not 100% sure whether this is because of the newer Miniflux image or the newer Postgres one ¯\_(ツ)_/¯

@dakotahp You also mentioned running Watchtower in your deployment. Does your issue persist after an image update?

Fr0stykiller avatar Oct 29 '21 07:10 Fr0stykiller

No update (or downgrade) fixed the issue for me, unfortunately.

On the other hand, I managed to make it work correctly for a few days, but now I'm back to this issue. My best guess: it's a job conflict that arises only on low-resource servers. When the polling frequency is too high, one fetching job has no time to finish before the next one starts. When the polling frequency is too low, the number of items to fetch grows and the same problem occurs.

In my understanding, I got it working by randomly finding a delicate equilibrium, but then I upset it by adding feeds and changing settings. This would especially be the case if you have heavily reduced the amount of workers in your configuration, or if you have a very small low-powered machine. This is a really naive guess and I did not dive into the code of the project, so I may be way off target. Nevertheless, I think it would be helpful if others who encounter this bug could share information about their hardware.

My hardware: RPI 4 with 4 GB of memory running Raspbian + already heavily burdened by other hosted services

ozls avatar Oct 30 '21 18:10 ozls

I am also using this on a Raspberry Pi 4 after migrating from a remote VPS. This issue came up shortly after migrating everything to the Pi and updating to 2.0.33. The change to a Pi sounds more plausible to me as the cause than anything in 2.0.33. What you are saying rings true: the polling process dying due to lack of resources, with one job expecting another to have finished before it actually did.

Without knowing how the polling functionality works, I don't have the time to tinker with settings and make assumptions about it. I'm going to assume this project isn't great for a Raspberry Pi once you have a certain number of feeds and/or little spare capacity on the machine. I've been planning on getting a used Dell Optiplex for self-hosted services so I don't have to accumulate Pis; that will likely be my fix in the future.

It would be nice if the creator would chime in about how polling works and what can go wrong with limited resources.

dakotahp avatar Nov 25 '21 20:11 dakotahp

The same problem here. I use MF as a hosted service, and only some feeds refresh, sporadically. I have to hit the button under Feeds manually every time to get all feeds updated.

MF version: 2.0.34

playforvoices1 avatar Mar 04 '22 10:03 playforvoices1

I can confirm this problem with the paid hosted service. Even after 2 hours, none of my feeds are updated automatically: (screenshot attached)

Updating manually does the trick.

beerisgood avatar Apr 21 '22 08:04 beerisgood

I have the same issue on 2.0.37 with a Raspberry Pi 4, currently trying out different config settings. Refreshing individual feeds manually works without issues. Did others manage to get the refreshing working with the entry_frequency scheduler? The "Refresh all feeds in the background" button seems to work fine. I am using a Docker installation.

Edit: How do I troubleshoot this? Does a successful refresh show up in the logs the same way cleanups do?

wi18b088 avatar Jun 05 '22 18:06 wi18b088

I have the same issue as @wi18b088. My version is 2.0.38.

Yuan68 avatar Aug 24 '22 23:08 Yuan68

For whatever it is worth, I bought a Lenovo ThinkCentre M700 tiny PC for about $200 (with plenty of RAM and an i7 processor) and moved my services to it from the Pi. Everything works great, the machine has plenty of grunt to handle all of the services, and Miniflux works as it should. While I wish there were some resolution on this issue, I am using my Raspberry Pis differently now and am happy with Miniflux on a more suitable machine that isn't already taxed.

dakotahp avatar Oct 11 '22 20:10 dakotahp