💾 Archived View for tilde.team › ~smokey › cgi-bin › daily-digest.gmi captured on 2024-06-16 at 13:03:56. Gemini links have been rewritten to link to archived content
⬅️ Previous capture (2022-07-16)
-=-=-=-=-=-=-
This is a bash script that collects a day's worth of links from Gemini aggregation feeds like Antenna and converts the content into an EPUB file for e-readers and PDAs.
The original script was created by kelbot and can be found here
That script uses gmni. This script is a fork that uses gcat instead.
Part 1 of my log, originally mentioning the script
Part 2 of my log, finalizing the script
```
#!/usr/bin/bash

# Defining time variables
TODAY=$(date +%Y-%m-%d)
YESTERDAY=$(date -d yesterday +%Y-%m-%d)
TWODAYS=$(date -d "2 days ago" +%Y-%m-%d)

# Feed sources
ANTENNA="gemini://warmedal.se/~antenna/filter/SBGElEWlRYOQbVaKYElminMkCNyskFlc"
COMITIUM="gemini://gemini.cyberbot.space/feed.gmi"
NEWS="gemini://rawtext.club/~sloum/geminews/npr/"

# Making temporary text files for gcat to output data into
echo "making temporary text files..."
tmp_d=$(mktemp -d)
tmpcomitium1="$tmp_d/comitium1.txt"
tmpcomitium2="$tmp_d/comitium2.txt"
tmpantenna1="$tmp_d/antenna1.txt"
tmpantenna2="$tmp_d/antenna2.txt"
tmpnews="$tmp_d/news.txt"
echo "finished!"

# Remove the temp files after the script finishes
trap 'rm -rf -- "$tmp_d"' EXIT

# Making sure the dailydigests folder exists in the home directory
echo "generating and cleaning dailydigests folder..."
mkdir -p "$HOME/dailydigests"
# Deleting possible previous contents for a fresh directory
rm -f "$HOME"/dailydigests/*

# Comitium feed digest for today
echo "creating today's comitium epub..."
gcat "$COMITIUM" | tail -n +9 \
  | awk -v today="## $TODAY" -v yesterday="## $YESTERDAY" \
      '$0 ~ today {flag=1} $0 ~ yesterday {flag=0} flag {print $2}' \
  | awk '/gemini/ {print}' | xargs -d
```

(The archived capture is truncated here; the rest of the script was not preserved.)
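The two-flag awk stage above can be tested offline without gcat or network access. This is a minimal sketch: the feed sample below is invented for illustration, but the awk filter itself is copied from the script. It turns the flag on at today's `## date` heading, off at yesterday's, and keeps only the gemini:// URLs in between:

```shell
#!/usr/bin/env bash
# Invented sample of a comitium-style feed: date headings followed by link lines.
TODAY=2024-06-16
YESTERDAY=2024-06-15
feed=$(cat <<'EOF'
## 2024-06-16
=> gemini://example.org/post-a.gmi Post A
=> gemini://example.org/post-b.gmi Post B
## 2024-06-15
=> gemini://example.org/old.gmi Old post
EOF
)
# flag=1 from today's heading until yesterday's heading; print the second
# field (the URL on "=>" link lines), then keep only gemini:// lines.
printf '%s\n' "$feed" \
  | awk -v today="## $TODAY" -v yesterday="## $YESTERDAY" \
      '$0 ~ today {flag=1} $0 ~ yesterday {flag=0} flag {print $2}' \
  | awk '/gemini/ {print}'
# prints:
# gemini://example.org/post-a.gmi
# gemini://example.org/post-b.gmi
```

Note that the second awk pass also discards the heading line itself (its second field is the date, which does not contain "gemini"), so only the link URLs survive.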