"Multi-Screen" is too broad a term. I have caught myself sometimes drifting in thoughts because there was something else going on other than work. Having two different agendas and juggling them using two or three different _work environments_ is the attention killer for me.
As consequence I have bought a KVM switch and hooked it up to my main screen. Anything else not related to the current agenda gets shut off, or muted, or plainly ignored no matter what happens there. Works so far for me.
This is exactly why I've gone back to using just one (admittedly big-ass 32") screen, whereas before I had three (one was attached to a separate device where builds occurred). I do have one other display, but it's stacked high enough up on my VESA pole above %PRIMARY_DISPLAY% that I have to tilt my head up quite a bit to look at it. A note-taking app, a small monitoring dashboard, email, and a music player live up there.
This means if it's not on my main screen, it's not on my mind until I make the conscious decision to move something over to it, or I need to drag something off of it. It's a behavior I had to force myself to train and become disciplined at.
If it needs to come down to the main screen, it should under no circumstances STAY there if I'm not working on it.
Benefits: attention is more intentional and deliberate. I can quickly glance up at something if needed, but I'm not straining my neck, because looking up happens so infrequently.
Intentionality is king.
(edit: the only real problem with this particular bit of home office design is that my MacBook stays shut in a vertical dock, and since I haven't made myself buy an external webcam, there are times I have to unhook things just to bring the MacBook out, because some people are weirdly hung up on being able to 'see everyone's faces' on Zoom calls just to conduct an outage post mortem. Annoying--but not annoying enough to make me spend even $20 on an external webcam.)
This headline is clickbaity, especially for an HN audience. The article clearly states that the problem is consuming multiple media sources simultaneously, which AFAIK is more or less _completely_ unrelated to having more than one screen.
Having more than one screen is something you need _when the one thing you are working on requires too much relevant information for a single screen_. That's the opposite.
The word "screen" here is a poor choice. You could just as easily be "multi-screening" by watching a picture in picture video while reading a web-site. Or the reverse, in their frame of reference, web development where you have one display/ output screen and one code screen is _not_ multi-screening.
Poor word choice because it's not as catchy as saying "Trying to consume two types of media at once" which is what they seem to be pointing to.
Exactly. I clicked through because I thought, "oh, shit, I cannot even get work done without two. Sometimes I use three!" I've been had!
I've heard some anecdotes that multiple monitors do reduce focus, though [0]. Still, I'm on the fence: the alternative to multiple monitors is often lots of workspaces, and switching between those can feel just as jarring as looking to my left or right. Using i3, I often forget which workspace I left something on and spend 2 or 3 seconds cycling through every open workspace.
[0]: https://michael.stapelberg.ch/posts/2020-05-23-desk-setup/
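Regarding the i3 workspace hunting: a couple of stock i3 config directives cut the cycling down a lot. A minimal sketch, assuming a fairly default setup; the workspace names, window class, and keybindings below are made-up examples, not anyone's actual config:

```
# ~/.config/i3/config -- minimal sketch, names and bindings are examples
set $mod Mod4

# Name workspaces and pin apps to them, so everything has a predictable home
set $ws_code "1: code"
set $ws_web  "2: web"
assign [class="Firefox"] $ws_web

bindsym $mod+1 workspace $ws_code
bindsym $mod+2 workspace $ws_web

# Jump between the two most recently used workspaces instead of cycling through all of them
bindsym $mod+Tab workspace back_and_forth
workspace_auto_back_and_forth yes
```

With the toggle bound, "where did I leave that?" is usually one keypress away rather than a scan through every open workspace.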
Personally I'm very fond of three screens, it lets me split information by level of attention required: the center one is for what I'm focused on, left screen is additional information I might need (like documentation or console output), right screen for things I don't really need to pay attention to (given my peripheral vision isn't as strong on the right).
The positions are set in such a way that I can glance at the left without turning my head, whereas the right side requires me to switch focus / turn my head to give it my full attention.
I think it depends on the task. If I'm writing something I have only cursory familiarity with (which is very often), it's kind of indispensable to have another screen, because to rapidly implement code, test whether I actually got it right, and refer to both the output and the docs, I need at least:
1. Browser (sometimes two, one for docs the other for stack overflow / googlin')
2. Execution shell
3. Editor (also frequently doubled - if there are imports or more than one file involved)
4. (occasionally) utility shell - for traversing the project directory and assessing changes
...having these split up by workspace just feels super clunky. Having them 10 degrees of eye rotation away is way better.
"Multi-screening" in this case means multiple devices, not 2+ displays of the same device, although I have my suspicions about that.
This sounds like it is talking about people using a phone and watching TV at the same time, not using multiple monitors at work as the title led me to think.
Multi-screening? Maybe. It's a misnomer, though. Multi-crap apps open across multiple devices, I'll buy. It WILL kill your productivity. But take away my multi-monitor setup on my desktop or take away my virtual desktops and I'll go for your jugular.
I recently switched to a PinePhone and got all those god awful smartphone apps out of my life.
I'm actually functional again! It's worth the unreliable calls and weekend spent hacking a power management OpenRC service together. That whole smartphone thing is built around finding creative ways to pull you away from useful things.
When I was dealing with large Excel spreadsheets, having a large enough surface was a 2x speed increase. No scrolling.
Dude, it’s just a screen, you don’t have to channel your inner edgy teenager to look like an anime character. Chill.
It’s obviously hyperbole meant in jest
Results showed that attention lapses in the moment prior to remembering impacted on behavioural and neural memory signals and were associated with greater likelihood of forgetting.
Can anyone translate this for me? This sounds like someone throwing a bunch of big words together to me.
I don't have a Nature subscription, but this seems like another non-study: 80 adults in a contrived experiment with flashing images, then given a "questionnaire" from which conclusions are drawn...
There is a (common) theory which states: Memory isn't like a disk drive. When you remember, the brain doesn't just read something from disk.
Instead, when you remember, the information you're remembering is re-summarized and written back, possibly at higher levels of 'compression'. I put that in scare quotes, but when the brain uses references to other memories to reduce the space taken by any single one, compression is what it is.
The theory here, then, is that distraction at (forgive the computer-esque terms) the exact time a memory is being recompressed and written back out can disrupt that process and cause it not to be stored properly.
This checks out. Bringing up a memory, distracting yourself and then just... telling yourself it's unimportant... is, indeed, one way to deliberately forget something you'd rather forget.
Used beneficially, you can forget spoilers for a novel. You could even attempt to forget trauma this way, though I don't believe that's nearly as likely to work.
If you just happen to get distracted for other reasons, though? You might forget something you'd rather not.
This effect can have serious implications for the criminal justice system.
Suppose two men were in an altercation with a woman, and during the altercation one of the men briefly flashed a knife. One man was black and one was white.
If the first person to question a witness asks something like "When in the altercation did the black man flash the knife?", there is a chance that the witness will remember the black man flashing the knife even if it was actually the white man who did so. If the first questioner instead just asked the witness to describe the altercation and any weapons that were used, that same witness would have been much more likely to correctly place the knife with the white man.
There have been quite a few experiments about this sort of thing, not only achieving altered memories but even going so far as to give people memories of whole events that did not happen.
BTW, expectations also can alter how memories are initially recorded. Even without someone asking a bad question like "When did the black man flash the knife?", if the white man is dressed in a suit and the black man is dressed like however the witness imagines a street thug would be dressed, there is a decent chance they will remember the black man as having the knife. If other stuff is going on and the flashing of the knife is just a passing detail, it gets remembered where it makes the most sense.
In our current criminal justice system in the US there are nowhere near sufficient safeguards to make sure that poorly chosen questions will not alter witness memory, or to even detect that it might have happened. I don't know if any other countries handle this better than the US.
Thanks, that makes sense. The computer terms helped :)
If somebody distracts you while you're trying to remember something, you won't remember it as well.
As a chess player, I was considered to have a good memory. It was indeed very good, but multi-branch version control sabotaged it. Now I'm constantly thinking: "I know I fixed that code before... but on which branch?"
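If it helps, plain git can usually answer the "which branch?" question without checking anything out. A rough sketch; the search string and commit hash are placeholders:

```
# Search every branch for commits whose diff adds or removes the code in question
git log --all --oneline -S 'the code I fixed'

# Then list all the branches that contain a given commit
git branch --all --contains <commit-sha>
```

The -S "pickaxe" search is slower than a plain log, but it finds the fix even when you can't remember the commit message.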
Sounds like you need to merge more often.
Just use master branch.
master is so last month. they call it main now on GitHub =)
I consider "mainline" more clear, using it in a currently unreleased project.
A bit of a tangent, so forgive my rant, but I'll contribute a few patterns that come to mind that have something to do with messing up one's memory (this could be subjective):
1) Reinvention of software UIs and moving things around from version to version; sometimes the change is legitimate, but very frequently it is not. This is similar to shuffling content in textbooks so that students cannot use the previous version of the textbook.
2) UI bloat: all options are compressed and displayed on the screen at once, and the many features that aren't aimed at you make it hard not to get distracted from what you are doing. Combine that with #1.
3) The opposite of 2, where things are hidden and the interface is very unintuitive. It makes the user feel powerless and insecure.
4) Fast technology cycles and too much innovation in too short an interval. Sometimes that innovation is pure recycling, coupled with hype.
5) The decline of print media, together with note-taking with a pen on paper. The endless possibilities that come with digital note-taking and content consumption come with the explosion of information we're subjected to. We can store more but we remember less. Search engines contributed to the outsourcing of our memory. Of course, there are plenty of advantages in the digital realm; I'm only pointing out a few ways that I feel made my memory much worse over the years. As I said, all of this could be subjective, and my age (40) as well as my genetics can be factors. But I've noticed that I make less and less effort to memorize things in general.
Nothing new here. Context switching will drain your brain.
I genuinely wonder how hard it would be to break these developed habits.
While the article specifically points to the detriments of using multiple forms of digital media at the same time, does it also apply to using one form of digital media while performing analog tasks, like, say, listening to a podcast while working out?
So much of my life seems to be structured around interacting with digital media in some way as I go about my day. I'm aware the study only shows that a correlation exists and that there isn't necessarily any hard evidence as yet, but it does seem like something that will be hard to work out of my routine, if that ends up being the case.
What's the distinction between multi-screening and multitasking in general? Is there something about "media" multitasking that specifically affects memory?
Multi-tasking is good because it implies you're working harder. Multi-screens are bad because they imply you're playing more.
While I generally believe this to be true, and I want this study to be evidence that it _is_ true, they fall into the same trap as so many other studies: they cannot demonstrate a causal link.
It could just simply be that people with worse memories are more likely to enjoy media multi-tasking than people with better memories.
In short, correlation != causation.
I was thinking screen and tmux and slightly panicked.
Learn Unix commands and cron. Automate everything. Use virtual desktops as separate work environments, _switching_ only after finishing the first job, not to flip back and forth between windows.
Or, if you use systemd, you can use timers.
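For anyone who hasn't set these up before, a minimal sketch of both; the script path, schedule, and unit names are just examples. First the cron version:

```
# crontab -e: run a script every weekday at 18:00
0 18 * * 1-5  $HOME/bin/cleanup.sh
```

And roughly the same thing as a systemd user timer (two small files):

```
# ~/.config/systemd/user/cleanup.service
[Unit]
Description=Periodic cleanup

[Service]
Type=oneshot
ExecStart=%h/bin/cleanup.sh
```

```
# ~/.config/systemd/user/cleanup.timer
[Unit]
Description=Run cleanup every weekday at 18:00

[Timer]
OnCalendar=Mon..Fri 18:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with "systemctl --user enable --now cleanup.timer". Persistent=true makes systemd catch up on a run it missed while the machine was off, which plain cron won't do.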
The title made me think it was about dual/triple monitor setups.
I grew up playing video games and watching TV at the same time; it never impacted me back then, but phones do... hmmm.
What a shocker!