
# On Raspberry Pi
### 20180824

I gave one of my RPi's to my son to use as a PC for school (he has an Xbox for games and media). So I was left with RPoD (my server for http/gopher), RPoA (my personal computer; I am typing this now in vim via ssh to RPoA from an iPad Pro), and RPoJ, which had a hosed FreeBSD installation.

I reimaged RPoJ today with Manjaro Linux (an Arch-based distro). I picked the minimalist image and am looking for something to do with it. I am going to swap this Pi out in a few days, as I ordered a new Raspberry Pi 3B+ and a 128 GB microSD card. This is also my experimental test drive of Manjaro, to see if I want to use it on the new Pi...

I have thought of installing Nextcloud, but I just don't need it. I have an scp client on my phone and can hit the external storage connected to my Pi's from anywhere. I use text files for everything, so I don't need to back up apps. I really could do without my personal phone, as I rarely use it except as a music player. I routinely rotate the music on my phone against my master library, which lives on a large drive connected to one of the Pi's.

I think I am looking for something new to integrate into my usage pattern. RPoA has xorg and twm, so the new Pi doesn't need a GUI. A good addition might be a cron-based scp of my home directories, www, and gopher (live and chroot... man, I need to get after finishing that). I also run a monthly full dd image of each Pi to an external device so they can be swapped like nothing happened. The scheduled scp would fill in the gaps.
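
Sketching that out, the crontab side could be as little as two entries. The host name, paths, and times below are placeholders rather than my actual layout (note the escaped % signs, which cron otherwise treats as newlines):

```
# nightly 02:30: copy home, www, and gopher trees to the other Pi
# ("rpoj" and every path here is illustrative; key-based ssh assumed)
30 2 * * * scp -qr /home/me /var/www /var/gopher rpoj:/srv/backup/

# monthly full-card image to the external drive, per the dd routine above
0 3 1 * * dd if=/dev/mmcblk0 of=/mnt/ext/pi-$(date +\%Y\%m).img bs=4M
```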

I also want to screw around with netcat and create services that do useful things. Maybe a password-protected program I could hit with telnet, type a phlog entry into, and have it trigger the html gen and a git push/pull to RPoD?
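
A toy sketch of that idea, not a real service: a netcat listener fed through a fifo, so prompts flow out while keystrokes flow in. The port, password, file layout, and the "phlog-gen" helper are all made-up placeholders.

```
#!/bin/sh
PORT=7070
PASS=letmein
FIFO=/tmp/phlogd.fifo

mkfifo -m 600 "$FIFO" 2>/dev/null

handle() {
  printf 'password: '
  read -r given
  [ "$given" = "$PASS" ] || { echo 'denied.'; return; }
  out=$HOME/phlog/$(date +%Y%m%d).post
  echo 'entry (end with a single "." on its own line):'
  while read -r line; do
    [ "$line" = "." ] && break
    printf '%s\n' "$line" >> "$out"
  done
  ~/bin/phlog-gen && git -C "$HOME/phlog" push   # placeholder pipeline
  echo 'posted.'
}

while true; do
  # prompts go out through the fifo; client input pipes into handle
  # (tr strips telnet's CR; some netcat variants want "nc -l $PORT")
  nc -l -p "$PORT" < "$FIFO" | tr -d '\r' | handle > "$FIFO"
done
```

Cleartext passwords over telnet are obviously only fit for the LAN, but that's half the fun of a toy service.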

Maybe an information harvester/aggregator? Cron and curl, massaged with sed for easy consumption... ad-free and clean. Maybe feed it the URLs of www content and the ID each site uses for its main content div?
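
One way it might hang together, assuming a made-up feeds file of "URL div-id" pairs. The grab-the-div part is the fragile bit: sed on HTML only works when the div's open and close tags land on their own lines, which plenty of real pages break.

```
#!/bin/sh
FEEDS=$HOME/.harvest.urls        # e.g.: https://example.com/news main-content
OUT=$HOME/harvest/$(date +%Y%m%d).txt
mkdir -p "$(dirname "$OUT")"

while read -r url id; do
  printf '== %s ==\n' "$url" >> "$OUT"
  # clip from the named div to the next closing tag, then strip markup
  curl -s "$url" \
    | sed -n "/<div id=\"$id\"/,/<\/div>/p" \
    | sed -e 's/<[^>]*>//g' -e '/^[[:space:]]*$/d' >> "$OUT"
done < "$FEEDS"
```

Drop that in cron next to the backup job and the day's reading would be waiting as plain text.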

In any case, time available is limited, but there always seems to be time to mess around. Got some thinking to do.