Two Paradigms of Personal Computing

Author: riverlong

Score: 42

Comments: 9

Date: 2020-10-28 05:53:16


________________________________________________________________________________

fragsworth wrote at 2020-10-28 22:04:48:

I am increasingly avoiding the use of my mobile phone, mostly because of how much it interferes with your life. I don't know how true this is for everyone else, but I have found plenty of desktop applications that do exactly what I want and little else (shells, browsers, websites, Discord, Steam, Spotify, work software, etc.). And I sit at my desk for most of the day.

My phone is at my desk. I turn off all notifications and sounds, and only allow it to ring from people on my contacts, and I otherwise don't use it. It's basically a chat computer that sits on my desk. I'll take it with me on any major trip anywhere, but otherwise when I leave, I try to not bring my phone.

I don't know how people stand to live with these devices and the irritating software that runs on them. The expectation now is that anything you install will irritate you. People aren't properly valuing their time, but some of them are catching on and have stopped installing new software...

musicale wrote at 2020-10-28 23:35:56:

> I don't know how people stand to live with these devices and the irritating software that runs on them

I saw the beginning of the end when a dialog box popped up saying something like "Microsoft Word requires your attention." This represented (to me at the time) a fundamental misunderstanding of the purpose of office software and who it was working for. Apparently Microsoft (and Apple who implemented whatever irritating API they were using) decided that software should be in charge of the user, instead of the reverse.

Now with the attention economy, there is little pretense regarding who the software primarily serves (vendors and advertisers) and who it exerts dominion over (users).

jasonv wrote at 2020-10-28 22:19:53:

I agree with you.

My partner left her phone home the other night when we went out to run errands. It felt like a pretty big step on her part.

I've been considering reformatting my phone and installing a bare minimum of add-on apps. Feels cleaner, and I have way too many apps on my phone to go and delete them individually.

Jtsummers wrote at 2020-10-28 22:25:30:

I turned off all notifications except calendar reminders (only created deliberately by me anyway), phone calls, SMS/iMessage, and WhatsApp messages (how I communicate with my wife and most of her family). That made the phone _much_ less annoying without needing to delete _everything_. Everything else is more like email: something I check a few times a day, not something that notifies me every other second. None of it is urgent, so it's at my leisure.

ZeteticElench wrote at 2020-10-29 07:56:55:

There is an annoying trend of companies moving to mobile-only apps. Samsung Health, for example, used to have a nice website, but they dropped it, and now you can only view your data on your mobile phone or in an emulator.

mwcampbell wrote at 2020-10-28 23:52:06:

My concern is that children aren't getting exposed to the transparent kind of personal computing, only the opaque/magical kind. In that respect I think my generation was lucky (I was born in 1980). Maybe I can do something about this for my nieces and nephew (6 years old and under). But will the next generation in general even know what they're missing?

7thaccount wrote at 2020-10-29 00:58:20:

Raspberry Pi and other boards like the BBC Microbit, and Arduino are stepping in to help kids learn computing from Linux to hardware/robotics.

I'm also hoping that the Commander X16 project takes off. It's essentially a new Commodore 64 with better specs that is supposed to be affordable and easy to use and fun. That would make for a great first computer.

fragsworth wrote at 2020-10-29 01:26:18:

Programming in Python is already largely opaque/magical, as most people who use it don't actually understand its inner workings.

I'm not yet convinced that similar things won't surface in other ways.
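A small illustration of that opacity, using only the standard library's `dis` module: Python source is compiled to bytecode for a virtual machine, a hidden layer that most Python programmers never inspect.

```python
import dis

# Even a trivial function compiles down to CPython bytecode.
# `dis` makes that normally invisible layer visible.
def add(a, b):
    return a + b

dis.dis(add)  # prints instructions like LOAD_FAST and a binary-add op
```

The exact opcodes printed vary by CPython version, which itself underlines the point: the "inner workings" are an implementation detail most users never see.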

theon144 wrote at 2020-10-29 00:39:57:

I feel like this is a super surface-level analysis that's based on just the "epiphenomena" of the two paradigms instead of the actual forces driving the distinction (i.e. control).

> I sympathize greatly with this view. For the past five years, I have exclusively run Arch Linux. I love the early-2000s style of personal computing: text-heavy interfaces, words rather than icons, uniform keyboard shortcuts everywhere. [...] but I sit on a clacky IBM keyboard to write code and blog posts in a terminal that hasn’t changed in twenty years.

This is just such a misguided way to frame the difference - inspectability and control of your device do not in any way imply having to use outdated technology, which is what this says if I take it literally. Text-heavy interfaces do not intrinsically attach to the idea of the "extension of self", any more than ThinkPads enable UNIX wizardry. Conversely, a laptop with a touchbar or an OS with a voice assistant does not imply a lack of control, or "becoming a part of self".

Come to it, Windows XP was in no way more customizable or inspectable than the current crop of operating systems; it was already a full, pre-packaged black box - and the distinction between Android and iPhone is almost entirely superficial. Sure, you can switch your launcher, and _some_ devices are easier to "root" (i.e. circumvent black-box measures), but the differences end there. When I hold my Android phone I am absolutely holding a magic wand, just of a different grain.

I do think that there are two opposing trends, but I struggle to see the distinction as an "extension of self" / "part of self" approach - unless, of course, the whole idea is that the latter become a "part of you" because they offer you so little control you have no choice but mold yourself around them - in which case I fail to see how that constitutes a separate paradigm fulfilling a real desire.

I think the reason this distinction appears is that "part of self" devices are generally more successful, owing to a mutually reinforcing tendency: platforms are locked down because it benefits the vendors, and these devices often have superior UX because, being profitable, they receive more developer attention. The result is well-polished "magic-wand" devices that are user-hostile - and the conflation of the two. Android runs on a myriad of different devices because it is not as locked down, but as a result, the average Android phone receives a thousandth of the care a new iPhone does.

Neither is actually a consequence of user desires except at the broadest level, which is why I don't think the split is as deep as the author claims. The level of control a user seeks from their computing devices is a spectrum, so splitting it into two opposing approaches feels arbitrary and ultimately unproductive to me: it solidifies the misconception that a capable, general-purpose device is somehow impossible, and that having control over a device requires reverting decades of progress.