💾 Archived View for tilde.team › ~smokey › cgi-bin › daily-digest.gmi captured on 2024-06-16 at 13:03:56. Gemini links have been rewritten to link to archived content


Daily Digest Bash Script

This is a bash script that collects a day's worth of links from Gemini aggregation feeds like Antenna and converts the content into an EPUB file for e-readers and PDAs.

Acknowledgements

The original script was created by kelbot and can be found here:

Kelbot's Script

That script uses gmni; this script is a fork that uses gcat instead.

Part 1 of my log originally mentioning script

Part 2 of my log finalizing the script

REQUIREMENTS:

gcat (a command-line Gemini client)
ebook-convert (ships with Calibre)
GNU date and xargs (the script relies on date -d and xargs -d)

Source

#!/usr/bin/bash
# Defining time variables
TODAY=$(date +%Y-%m-%d)
YESTERDAY=$(date -d yesterday +%Y-%m-%d)
TWODAYS=$(date -d "2 days ago" +%Y-%m-%d)

# feed sources
ANTENNA="gemini://warmedal.se/~antenna/filter/SBGElEWlRYOQbVaKYElminMkCNyskFlc"
COMITIUM="gemini://gemini.cyberbot.space/feed.gmi"
NEWS="gemini://rawtext.club/~sloum/geminews/npr/"

# making temporary text files for gcat to output data into

echo making temporary text files...
tmp_d=$(mktemp -d)
tmpcomitium1="$tmp_d/comitium1.txt"
tmpcomitium2="$tmp_d/comitium2.txt"
tmpantenna1="$tmp_d/antenna1.txt"
tmpantenna2="$tmp_d/antenna2.txt"
tmpnews="$tmp_d/news.txt"
echo finished!

# remove the temp files after script finishes
trap 'rm -rf -- "$tmp_d"' EXIT

# making sure the dailydigests folder exists in the home directory

echo generating and cleaning dailydigests folder...
mkdir -p "$HOME/dailydigests"

# deleting possible previous contents for a fresh directory.

rm -f "$HOME/dailydigests/"*

# comitium feed digest for today

echo "creating today's comitium epub..."
gcat "$COMITIUM" | tail -n +9 | awk -v today="## $TODAY" -v yesterday="## $YESTERDAY" '$0 ~ today {flag=1} $0 ~ yesterday {flag=0} flag {print $2}' | awk '/gemini/ {print}' | xargs -d '\n' sh -c 'for arg do gcat "$arg"; echo -e "\n<========================>\n"; done' _ > "$tmpcomitium1"
ebook-convert "$tmpcomitium1" "$HOME/dailydigests/$TODAY-comitium.epub" --title="$TODAY-Gemini"
echo "finished, enjoy your ebook!"

# comitium feed digest for yesterday
echo "creating yesterday's comitium feed epub..."
gcat "$COMITIUM" | tail -n +9 | awk -v twodays="## $TWODAYS" -v yesterday="## $YESTERDAY" '$0 ~ yesterday {flag=1} $0 ~ twodays {flag=0} flag {print $2}' | awk '/gemini/ {print}' | xargs -d '\n' sh -c 'for arg do gcat "$arg"; echo -e "\n<========================>\n"; done' _ > "$tmpcomitium2"
ebook-convert "$tmpcomitium2" "$HOME/dailydigests/$YESTERDAY-comitium.epub" --title="$YESTERDAY-Gemini"
echo "finished, enjoy your ebook!"

# Antenna feed digest for today
echo "creating today's antenna feed epub..."
gcat "$ANTENNA" | grep "$TODAY" | awk '!/(\.mp4|\.mp3)/ {print $2}' | xargs -d '\n' sh -c 'for arg do gcat "$arg"; echo -e "\n<========================>\n"; done' _ > "$tmpantenna1"
ebook-convert "$tmpantenna1" "$HOME/dailydigests/$TODAY-antenna.epub" --title="$TODAY-antenna"
echo "finished, enjoy your ebook!"

# Antenna feed digest for yesterday
echo "creating yesterday's antenna feed epub..."
gcat "$ANTENNA" | grep "$YESTERDAY" | awk '!/(\.mp4|\.mp3)/ {print $2}' | xargs -d '\n' sh -c 'for arg do gcat "$arg"; echo -e "\n<========================>\n"; done' _ > "$tmpantenna2"
ebook-convert "$tmpantenna2" "$HOME/dailydigests/$YESTERDAY-antenna.epub" --title="$YESTERDAY-antenna"
echo "finished, enjoy your ebook!"

# NPR News for today
echo "creating today's NPR News epub..."
gcat "$NEWS" | awk -v news="$NEWS" '{sub("=> ",news)} $1 ~ /\.gmi/ {print $1}' | xargs -n1 gcat > "$tmpnews"
ebook-convert "$tmpnews" "$HOME/dailydigests/$TODAY-news.epub" --title="$TODAY-news"
echo "finished, enjoy your ebook!"

Notes on arguments used:

| tail -n +9 |

tail -n +9 - This pipes the output of the comitium page to tail, and -n +9 tells it to print everything starting from line 9 (that is, to skip the first 8 lines). That was just an easy way of getting rid of the top portion of the comitium page that isn't links.
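A small demonstration with made-up input: keep everything from line 3 onward, skipping the 2 header lines.

```shell
# Hypothetical 4-line page: the first two lines are a header we don't want.
printf 'header\nblank\nlink 1\nlink 2\n' | tail -n +3
# prints:
# link 1
# link 2
```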

| awk -v today="## $TODAY" -v yesterday="## $YESTERDAY" '$0 ~ today {flag=1} $0 ~ yesterday {flag=0} flag {print $2}'

The first awk portion: -v is how you set variables within awk itself. The next part uses an awk "flag": the flag is turned on when awk encounters the first variable and turned off when it encounters the second, so only the lines between those two headings have their second field printed.
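Here is the flag technique on a tiny hypothetical feed excerpt (made-up dates and URLs). Note that the "today" heading itself also gets its second field printed, which is one reason the script follows this with a /gemini/ filter.

```shell
# Flag on at today's heading, off at yesterday's; print field 2 in between.
printf '%s\n' '## 2022-07-16' '=> gemini://a.example/post.gmi Post A' \
              '## 2022-07-15' '=> gemini://b.example/old.gmi Old B' |
awk -v today="## 2022-07-16" -v yesterday="## 2022-07-15" \
    '$0 ~ today {flag=1} $0 ~ yesterday {flag=0} flag {print $2}'
# prints:
# 2022-07-16
# gemini://a.example/post.gmi
```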

| awk '/gemini/ {print}'

The second awk command prints only the lines containing the string "gemini". The first awk's print $2 already stripped the => part of each link line, so this filter discards any leftover non-URL output (like stray date headings) so that all that gets passed to the next command is the actual URLs.
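For example, given a stray date line mixed in with a URL (hypothetical values), only the URL line survives:

```shell
# Keep only lines that contain "gemini".
printf '2022-07-16\ngemini://a.example/post.gmi\n' | awk '/gemini/ {print}'
# prints:
# gemini://a.example/post.gmi
```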

| xargs -d '\n' sh -c 'for arg do gcat "$arg"; echo -e "\n<========================>\n"; done' _ > $tmpcomitium1

xargs is a command that lets you pass multiple arguments from the output of one command to another command. xargs is running gcat on each link one at a time. So the loop is just running the gcat command then printing the separator, then doing it again with the next link.

> -d '\n' tells xargs to use newlines as the delimiter (separator) between arguments instead of the default (blanks and newlines, with quote processing), because the preceding commands output one link per line.
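A quick illustration with made-up input (note that -d is a GNU xargs option): with -d '\n' each whole line becomes one argument, even when it contains spaces, whereas the default splitting would break "one two" into two separate arguments.

```shell
# -n1 runs echo once per argument so we can see the boundaries.
printf 'one two\nthree\n' | xargs -d '\n' -n1 echo arg:
# prints:
# arg: one two
# arg: three
```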

> sh -c 'for arg do gcat "$arg"; echo -e ....

sh -c tells xargs to run the following command in a shell. The next part is the command that gets run for the arguments: a loop that outputs a gemini post with gcat, prints those ==== as a visual separator between posts, then does the same with the next link, and so on until it's out of links. The trailing _ fills in $0 for the spawned shell, so the links land in $1, $2, and so on, which the bare "for arg" loop iterates over.
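The same shape as the script's loop, with echo standing in for gcat so the example needs no network (input lines are hypothetical):

```shell
# "_" becomes $0 inside the sh -c shell; the piped lines become $1, $2, ...
# and "for arg" with no "in" list iterates over those positional parameters.
printf 'first\nsecond\n' |
xargs -d '\n' sh -c 'for arg do echo "fetch $arg"; echo "<==>"; done' _
# prints:
# fetch first
# <==>
# fetch second
# <==>
```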

> $tmpcomitium1 redirects all of that text into a file instead of printing it out in the terminal.

Variables

The date command is a standard unix program that can output date and time in various ways.

The +%Y-%m-%d part is a format string telling the date command how to lay out its output (four-digit year, month, and day separated by hyphens) so that it matches the date format used in the gemini feeds.
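For instance, with a fixed date (so the output is reproducible; -d here is GNU date's date-parsing option, and 2024-06-16 is just an arbitrary example):

```shell
date -d "2024-06-16" +%Y-%m-%d   # the ISO-style format the feeds use
date -d "2024-06-16" +%d/%m/%Y   # the same day in a different layout
# prints:
# 2024-06-16
# 16/06/2024
```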