I use Newsbeuter to read RSS and Atom feeds. On a server, I have a cronjob that runs newsbeuter -x reload. This is useful for archiving sites that I don't want to read regularly. On my local machine, I have a smaller list of URLs that I check regularly.
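For reference, the server-side cronjob is just a periodic call to the reload command. A crontab entry might look like the following (the hourly schedule is an assumption, not necessarily the one I use):

```shell
# Hypothetical crontab entry: fetch all feeds at the top of every hour.
# `-x reload` runs the reload non-interactively and exits.
0 * * * * newsbeuter -x reload
```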
My dotfiles repository contains my
I have found that after the cache file reaches about 20MB, starting newsbeuter becomes incredibly slow. To work around this, I create a new cache as follows:
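A minimal sketch of the size check that triggers this rotation, assuming the default cache location (the path and the `.old` suffix are my assumptions here, not fixed names):

```shell
# Sketch: move the cache aside once it passes ~20MB, the size at which
# startup became slow for me. Pass the cache path as the argument.
rotate_cache() {
  cache="$1"
  limit=$((20 * 1024 * 1024))   # ~20MB in bytes
  if [ -f "$cache" ] && [ "$(wc -c < "$cache")" -gt "$limit" ]; then
    mv "$cache" "$cache.old"    # corresponds to step (1) below
  fi
}

# Example: rotate_cache "$HOME/.newsbeuter/cache.db"
```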
1. I rename the old cache to
2. I open newsbeuter and refresh all feeds. These will be stored in a new cache called
3. This also gives me a chance to notice broken feeds (e.g. a changed URL, or a site that no longer exists, which is hard to distinguish from a merely inactive site), because broken feeds will not have downloaded any articles.
4. I mark all feeds as read in this new cache.
5. I open newsbeuter with the old cache, using the -c flag to select it.
6. I refresh all feeds, then process any new articles here. This is the last time I use the old cache.
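Put together, the rotation might look like this as a shell session (the cache file names are assumptions; the interactive steps, marking feeds read and processing articles, happen inside newsbeuter itself):

```shell
# Hypothetical session sketching the steps above.
mv ~/.newsbeuter/cache.db ~/.newsbeuter/cache-old.db   # step (1)
newsbeuter                                # steps (2)-(4): refresh all feeds
                                          # into a fresh cache.db, spot broken
                                          # feeds, mark everything read
newsbeuter -c ~/.newsbeuter/cache-old.db  # steps (5)-(6): refresh and process
                                          # remaining articles in the old cache
```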
Doing step (4) before step (6) is important. Otherwise, in the time between processing the feeds and the next reload with the new cache, some site may post a new article, which you would then erroneously mark as read without ever seeing it.