@command{urls2feeds.zsh} won't touch already existing directories and will
warn if some of them have disappeared from @file{urls}.
+@item Check configuration options
+
+@file{cmd/env.rc} contains a list of various options you can override
+with environment variables: @command{curl}, @command{wget},
+@command{zstd} and @command{parallel} command invocations, the
+@code{User-Agent}, the number of download/parse jobs run in parallel
+and so on.
+
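+For example, to override an option for a single run, set the
+corresponding environment variable on the command line. The variable
+name below is purely illustrative: look into @file{cmd/env.rc} for the
+real ones.
+
+@example
+$ cat cmd/env.rc                      # see what can be overridden
+$ SOME_OPTION=value ./feeds-parse.zsh # hypothetical variable name
+@end example
+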
@item Download your feed(s) data
@example
$ ./feeds-parse.zsh # to parse all feeds in parallel
@end example
+@item Download-n-parse
+
+You can also download and parse the feeds immediately:
+
+@example
+$ ./feeds-dnp.zsh
+@end example
+
@item Quick overview of the news:
@example
@item Run Mutt
@example
-$ ./feeds-browse.zsh
+$ ./feeds-browse.sh
@end example
That will read all feed titles and create a sourceable @file{mutt.rc}
configuration file with predefined helpers and @code{mailboxes}
-commands. Mutt will be started in mailboxes browser mode (I will skip
-many entries):
+commands.
+
+That configuration contains @code{auto_view text/html}, which expects a
+proper @file{mailcap} configuration file with a @code{text/html} entry
+to exist. Mutt has some built-in default search paths for it, but you
+can override them with the @env{$MAILCAPS} environment variable. There
+is an example in @file{contrib/mailcap}.
+
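+For example, a minimal @code{text/html} @file{mailcap} entry could look
+like the following (the path and the converter here are just an
+illustration, the bundled @file{contrib/mailcap} may differ), and you
+can point Mutt at it explicitly:
+
+@example
+$ cat ~/.mailcap
+text/html; lynx -dump -force_html %s; copiousoutput
+$ MAILCAPS=~/.mailcap ./feeds-browse.sh
+@end example
+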
+Mutt will be started in mailboxes browser mode (I will skip many entries):
@verbatim
1 N [ 1|101] 2021-02-17 20:41 Cryptology ePrint Archive/
@example
$ ./feeds-clear.zsh
+$ cmd/clear.zsh feeds/FEED # to clear a single feed
@end example
will clear everything exceeding the quantity limit. You can set that
Many feeds include links to so-called enclosures, like audio files for
podcasts. While your mail is not yet processed by the MUA and its
@file{new/} messages are still there, you can run the enclosure
downloading process, which
-uses @url{https://www.gnu.org/software/wget/, GNU Wget}. Specify the
-directory where your enclosures should be placed. Each enclosure's
-filename is more or less filesystem-friendly with the current timestamp
-in it.
+uses @url{https://www.gnu.org/software/wget/, GNU Wget}. Each
+enclosure's filename is more or less filesystem-friendly with the
+current timestamp in it.
@example
-$ mkdir path/to/enclosures
-$ ./feeds-encs.zsh path/to/enclosures
+$ ./feeds-encs.zsh
[...]
-traffic.libsyn.com_monsterfeet_grue_018.mp3-20220218-152822
+monsterfeet.com_grue.rss/encs/20220218-152822-traffic.libsyn.com_monsterfeet_grue_018.mp3
+www.astronews.ru_astronews.xml/encs/20220219-115710-www.astronews.ru_news_2022_20220216125238.jpg
[...]
-$ file path/to/enclosures/traffic.libsyn.com_monsterfeet_grue_018.mp3-20220218-152822
-path/to/...: Audio file with ID3 version 2.2.0, contains:MPEG ADTS, layer III, v1, 96 kbps, 44.1 kHz, Monaural
+$ file feeds/**/encs/*
+monsterfeet.com_grue.rss/encs/20220218-152822-traffic.libsyn.com_monsterfeet_grue_018.mp3:
+ Audio file with ID3 version 2.2.0, contains:MPEG ADTS, layer III, v1, 96 kbps, 44.1 kHz, Monaural
+www.astronews.ru_astronews.xml/encs/20220219-115710-www.astronews.ru_news_2022_20220216125238.jpg:
+ JPEG image data, JFIF standard 1.01, ...
@end example
-@command{feeds-encs.zsh} do not parallelize jobs, because enclosure are
-often heavy enough to satiate your Internet link.
+@command{feeds-encs.zsh} does not parallelize jobs, because enclosures
+are often heavy enough to saturate your Internet link. @command{wget}'s
+progress is also printed both to stderr and to @file{feeds/FEED/encs.log}.
+
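+For example, to follow a particular feed's download progress (the feed
+directory name is just an illustration):
+
+@example
+$ tail -f feeds/monsterfeet.com_grue.rss/encs.log
+@end example
+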
+Of course, you can also download only a single feed's enclosures:
+
+@example
+$ cmd/encs.zsh path/to/FEED [optional overridden destination directory]
+@end example
@end table