@end example
or convert Newsboat @file{urls} file (containing many lines with URLs)
-with @file{urls2feeds.zsh} to subdirectories hierarchy:
+with @command{urls2feeds.zsh} into a hierarchy of subdirectories:
@example
$ ./urls2feeds.zsh < ~/.newsboat/urls
http://blog.stargrave.org/russian/feed.atom
@end example
-@file{urls2feeds.zsh} won't touch already existing directories and will
+@command{urls2feeds.zsh} won't touch already existing directories and will
warn if some of them disappeared from @file{urls}.
@item Download your feed(s) data
(@strong{N}) and old-but-unread (@strong{O}) messages as read. You will
see left tag-marks near each message to understand what was touched.
+@item Press @code{o} to open link and enclosure URLs
+
+Do it in pager mode and your message will be piped to
+@command{cmd/x-urlview.sh}, which will show all @code{X-URL}
+and @code{X-Enclosure} links.
+
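+You could also run it by hand; a sketch, assuming the script reads a
+single message on standard input (@code{MESSAGE} is a placeholder for
+an existing message filename):
+
+@example
+$ ./cmd/x-urlview.sh < feed/FEED/cur/MESSAGE
+@end example
+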
@item Index your messages
@example
@item Cleanup excess number of messages
+By default (controlled by @env{$FEEDER_MAX_ITEMS}) only 100 entries
+are processed. The parser only appends entries; it does not remove
+obsolete ones.
+
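+To keep more entries, the variable can be overridden when running the
+parsing step (a sketch; the script name @command{feeds-parse.zsh} is an
+assumption here, as is its honouring of the variable):
+
+@example
+$ FEEDER_MAX_ITEMS=500 ./feeds-parse.zsh
+@end example
+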
@example
$ ./feeds-clear.zsh
@end example
-That will remove all messages in all feeds @file{cur/} directory that is
-not first hundred of ones, ordered by @code{mtime}. Pay attention that
-@file{new/} directory is not touched, so you won't loose completely new
-and unread messages when you are on vacation and left @command{cron}-ed
-workers. @command{cmd/feed2mdir/feed2mdir} command by default has
-@option{-max-entries 100} option set.
+will clear everything exceeding the quantity limit. You can set that
+limit on a per-feed basis. For example, @code{echo 50 > feed/FEED/max}.
+A value of 0 means no limit: all messages are kept.
+
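+For example, to keep at most fifty messages in a single feed
+(@code{FEED} is a placeholder for the feed's directory name):
+
+@example
+$ echo 50 > feed/FEED/max
+$ ./feeds-clear.zsh
+@end example
+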
+Pay attention that the @file{new/} directory is not touched, so you
+won't lose completely new and unread messages when you are on vacation
+and have left @command{cron}-ed workers running.
@item If you want to clean download state