_the biggest pain point for software developers is the small amount of RAM_
Software developers thinking that 16GB is a small amount of RAM is why all of our software is fat barges of bloat now. Y'all, this prophecy is entirely self-fulfilling. Please stop.
This is ridiculous nonsense written by, presumably, someone who isn't a professional software developer.
Software developers often have to run several emulators (e.g. Docker Desktop, Android, and iOS emulators), hundreds of Chrome tabs, compilers, _heavy_ ol' IDEs, _Microsoft freakin' Teams_, and a bunch of other stuff concurrently. 16 GB is _just_ enough for many workloads, but it's easy to see use cases that would exceed it.
None of that is necessary to develop software, and when you do want it, most of it doesn’t even have to run on your laptop.
When did people become so helpless? I was offloading interactive workloads via X11 back in the 90s, and these days it’s to a near-instantly provisioned public cloud instance. If someone can’t get by in a pinch with vim and a command prompt, then I don’t particularly want them in my crew.
There's a corollary to that old dictum, "a bad worker blames their tools", and it's "a good worker chooses good tools".
> _When did people become so helpless? I was offloading interactive workloads via X11 back in the 90s, and these days it’s to a near-instantly provisioned public cloud instance. If someone can’t get by in a pinch with vim and a command prompt, then I don’t particularly want them in my crew._
Well back in my day we had to timeshare the punch card machine and our punch cards only had zeros. Ones were too wasteful and anyone who asked for them would have been fed to the lusers.
I can only urge you to consider an upgrade from punched media to the new-fangled latest in magnetic drum storage, even if only in order to build a fuller appreciation for _The Story of Mel, a Real Programmer_ [1]
[1] http://www.cs.utah.edu/~elb/folklore/mel.html
I can survive hunting and farming my own food.
Unfortunately, I can't get rid of Teams and IntelliJ at work.
I use Xfce on Linux, and 32 GB of RAM is just enough to not encounter problems in my day-to-day job; imagine those poor folks using something like macOS or Windows...
At home I have a laptop with 4 GB of RAM; it's usable, and I use it to watch YouTube videos and as a media player.
On Ubuntu, the 4 GB of RAM is constantly in use and the system also uses no less than 4 GB of swap space (thanks, fast SSDs, for making my life better).
Having used a Linux-based system and macOS, I can tell you it's those poor folks using Linux who suffer most from bloated desktop applications. Recent macOS versions are really good at efficiently using the available memory, so a 16 GB MacBook performs better under memory pressure than a 32 GB Linux box.
Examples?
Except for the JVM/Electron-based apps that I am forced to use, all the Linux-native applications are as good as on any other system.
Anecdotally, Firefox on Linux is the only incarnation of Firefox I feel comfortable using; I think it's somewhere between not great and terrible on other systems. I use WebKit-based browsers on other OSes.
The difference being that I can tune the OS to work as I intend, and the developer tools are some of the best in their categories; that's not true for the other two.
macOS is known for being slower and using more RAM for many developer tasks (Docker, for example), but there are things you can only do on a Mac, iOS apps being one, and, honestly, I can't imagine using Xcode with a maximum of 16 GB of RAM for the next few years.
You might develop on macOS, but then you deploy the artifacts to some Linux box, of course, because there is no macOS server offering; I wouldn't use one anyway if I had the option.
Windows as a server is not that bad either.
I've worked with all of them in the past 25 years; the Linux desktop experience is not as polished as it could be, but it's certainly not slow or bloated in general.
Yes, GNOME 40 is not great, and GNOME 3 probably killed Linux on the desktop, but I don't use it, so...
Even Plasma looks super snappy compared to macOS these days.
Claiming that Apple can effectively double your RAM is a strong claim that, in my experience, is not supported by any evidence.
Perhaps the feeling of Apple systems being faster comes from the fact that there are only relatively recent Apple machines around, and it's rare to see an 8-year-old Apple laptop still in use. Meanwhile, my 7-year-old laptop with 64 GB of RAM cost as much as an M1 Pro but can still do its job relatively well, because I can put a lot of things in RAM, reducing the pressure on the rest of the I/O subsystem. (I feel I have to specify that its power efficiency is an order of magnitude worse than an M1's; if battery life matters to you, Apple is ahead of the competition right now.)
But it's no secret that one of the strongest features of Linux is giving new life to old systems.
I still remember when I upgraded from Snow Leopard to High Sierra and everything became slower, sometimes unusably so.
After 10.6.8 (the best version of the OS I've ever used), macOS has only gotten worse, in my opinion.
If you don't want to give me a decent machine, I don't think I want to be on your crew.
Anyway, I generally work for big corporates that don't give the developers the ability to spin up arbitrary cloud workloads from their laptops.
Oh, you’ll get a decent machine, but as the top comment conveys, developers that think this is a license to bloat need some firm correction, or simply weeding out.
Not merely because their undisciplined code will be substantially more expensive to run, especially at scale, but also in particular because it’s correlated to being a useless towel in a crisis.
> Oh, you’ll get a decent machine, but as the top comment conveys, developers that think this is a license to bloat need some firm correction, or simply weeding out.
The thing is, I don't think a lot of businesses care about whether the software they create is bloated or not. Performance isn't valued as much as features or functionality. Teams, for example, makes repeated multi-MB REST requests where WebSockets would probably be much more efficient. Until the business incentives change, I don't think we should expect much change (and maybe that does start with the developers).
Eh, my work gave me a laptop and created a service for devs (virtual desktop) in the cloud.
The laptop can't really be used (the security clampdown is pretty tight; you can't install anything useful), the virtual desktop would die with one command (memory exhaustion), and I had to go to their office to bring it back up.
While reading this, and really doing nothing compared to a work day, I have:
- 24 Chrome tabs open with news
- 17 Firefox tabs
- VMware Workstation finishing an install of the new Fedora 35
- IntelliJ compiling a Rust program
- Ableton Live running in the background, playing a mix from last week
- A PowerPoint presentation for next week that looks like it would use half the resources of AWS us-west-1
- CLion, while trying to see why a C++ example does not compile, and Visual Studio Code to clean up a CloudFormation template...
If you run Ableton to listen to mixes, I encourage you to export/bounce the mixes and to listen to those instead.
It offloads the CPU and has the benefit that you have it exported somewhere in case old plugins vanish, chaos happens, etc.
There's a miscommunication between your comment and GP. You may use 16GB RAM as a developer but that doesn't mean the average user of your product is equipped equivalently.
>_This is ridiculous nonsense written by, presumably, someone who isn't a professional software developer. Software developers often have to run several emulators_
That reality is part of the problem the parent wrote about...
This just proves the parent commenter right. 'Software developers often have to run...' - yes! That is the problem! People _shouldn't_ have to run all of that, but they do. If people thought about what they really needed rather than just buying more hardware, the world would be a better place.
(Also, I doubt you _need_ hundreds of Chrome tabs + heavy IDEs; a couple of tabs + <insert lightweight editor of choice> works just fine)
I'm so glad there are people like you to tell me the best way to do my job without even knowing what I do - it must be a real skill.
Yes, it would be possible to curate my tabs better, but I often have quite a few things on the boil at any one time and taking time out to carefully prune my tabs holds up delivery.
I actually tried to subsist on an 8GB M1 for a few months. I managed to survive, but context switching became painfully slow, as I'd have to shut down large parts of my toolchain and carefully prune browser tabs. It was a serious productivity drain.
It turns out that the heavy IDE is heavy for a reason, it provides useful capabilities! The kind of capabilities one needs as a professional software developer. When one is working on a foreign code base spanning thousands of files, it's genuinely handy to have some tooling to help you find your way around. VS Code tries, but it's generally pretty useless in my experience.
> I actually tried to subsist on a 8GB M1 for a few months, I managed to survive, but context switching became painfully slow as I'd have to shut down large parts of my tool chain and carefully prune off browser tabs. It was a serious productivity drain.
A friend recommended I try such a RAM diet, so I went from 64 GB to a 4 GB laptop. Unlike you, I've found it helped me focus.
So I've kept this laptop as a "productivity" laptop - when I work on a tight deadline, I use it, because there will be 0 risk of context switching :)
I think what you found helpful was the ability to pause a session, then return to it later. I think the ability to do this without buying a second laptop would be very useful.
This is such a typical Apple fanboi type response - _you are the one using it wrong_.
It’s not for everybody, but for helping manage tab overload the tab groups added to Safari in Monterey have been a massive boon for me. At least in my case most tabs fit fairly well into one of ten or so groups, and so management is as simple as switching my main Safari window to the group relevant at that point in time. The unused groups are slept after a certain period of inactivity, which means that most of the time Safari is only consuming the resources required by a single set of tabs. It’s pretty nice.
I had an auto-discarder plugin for Chrome to similar effect. I also tried switching to Firefox, which also improved matters somewhat. I don't think I'd have survived with an 8GB device at all otherwise!
However, doing something as basic as loading up a large Java project (like Kafka) in IntelliJ would cause the laptop to die a horrible death when it tried to resolve the Maven dependencies.
There’s no way having hundreds of chrome tabs open is actually beneficial to you.
Or I can just do whatever is most productive for my workflow and not have to worry about RAM, not worry about managing tabs, not worry about what windows I left open that I might or might not need later, not worry about the RAM consumption of different apps I want to use, etc.; that's also an option. Dunno why you're making it such a huge deal; convenience and productivity are, after all, the goal.
Okay, from a data analysis / data scientist perspective: how do I load my ~60GB dataset into memory for analysis? Also, swapping to disk will waste my time.
The storage read speed for the M1 is ~3.4 GB/s. The write speed is 2.8GB/s. Unless you are looking at all of your 60GB dataset in one go, clearly the speed at which storage can be accessed is fast enough to do batch processing on the data fairly seamlessly. Even at 2GB/s that's your entire dataset in 30s.
https://eclecticlight.co/2020/12/12/how-fast-is-the-ssd-insi...
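To illustrate, here is a minimal sketch of that kind of batched, out-of-core pass using NumPy's memmap, where the OS pages slices in from the SSD on demand instead of holding all ~60GB in RAM (the file path, dtype, and chunk size below are hypothetical):
```
# Out-of-core batch processing sketch: the OS pages 256 MB slices in from
# the SSD on demand; peak RAM use stays near one chunk, not the full ~60 GB.
import numpy as np

# Hypothetical flat file of float32 values; shape is inferred from file size.
data = np.memmap("dataset.f32", dtype=np.float32, mode="r")

chunk = 64 * 2**20  # 64M elements = 256 MB per batch
total, count = 0.0, 0
for start in range(0, data.shape[0], chunk):
    block = np.asarray(data[start:start + chunk])  # reads only this slice
    total += block.sum(dtype=np.float64)
    count += block.size

print("mean:", total / count)
```
At ~3 GB/s of sequential read, a full pass over 60 GB takes on the order of 20-30 seconds, in line with the arithmetic above.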
Disk speeds on the 2021 MacBook Pros are ~7-8 GB/s on the M1 Pro / Max, so those are even faster, which improves my ability to swap out even large amounts of data above 64 GB.
I own two fully spec'd M1 Max MacBook Pros (8TB SSD). I find it hard to believe that anyone is getting 8 GB/s reads or writes to disk. Both of mine consistently write at a maximum of ~7.3 GB/s and read at a maximum of ~5.7 GB/s.
The M1 Max is a monster, and either of mine easily outperforms any other computer I have ever used, but I don’t think it’s capable of 8GB/s reads or writes to disk.
Why should an Android developer not "have to" run an Android emulator?
> Why should an Android developer not "have to" run an Android emulator?
Why should they? We don't run PC emulators or macOS emulators to develop software. Only Android and iOS are unable to function as real OSes.
We actually do run those emulators (or at least virtualization) when developing across platforms.
With root, Android could even make a decent host platform for doing development, but that doesn't obviate the need for emulators and virtualization systems.
How do you test builds that work across multiple services locally without using a ton of ram?
What do you do for a living?
Also: you can't avoid a lot of it: it's PART OF THE JOB.
Also: next time people complain about JS, Java, Electron, etc.? Remember this.
Not to mention a bunch of virtual machines, especially if virtual machines are what you develop.
Or they just prefer easy and fast ways of developing apps (looking at Electron et al.), which bloat memory usage. It goes unnoticed since they have plenty of memory themselves.
Teams is literally one of them. Well, it is dropping Electron in the future.
macOS has really good memory management though (and you can improve Linux's by a great margin, if you don't have much RAM, by enabling zram).
People should be literally at software houses' headquarters with pitchforks and torches, revolting against such wastes of resources.
It won't help. There seems to be a trend in software development: if the program has an address space of 64 GB, the program shall use it. It does not matter that the program will not run alone on the system.
Some years ago, when the first version of Crysis was released, it could not even run on the fastest computers available but it was celebrated as a real achievement. This is the spirit.
Hundreds of chrome tabs? lol, ya, that’s on them.
I agree. But isn't part of the design conceit that the storage is fast enough that swapping isn't as painful as it would otherwise be?
> _Software developers thinking that 16GB is a small amount of RAM_ […]
It is… for development. Which is what software developers do and what they often buy hardware for.
Whether it's too little for the apps that they develop to run on users' machines is a separate question.
The problem with bloat today is not that developers have powerful computers. It is mostly that the latest trend is to ship software with its entire runtime environment.
Electron is the most glaring example: it essentially ships an entire browser with your app, which we may as well call an OS nowadays. So in the end, if you have 10 Electron apps, you have 10 copies of Chrome in memory, in addition to your OS's main browser. They are all the same except for a few minor differences. Sure, it solves "DLL hell", but at an extreme cost in efficiency.
> It is mostly that the latest trend is to ship software with its entire runtime environment.
I think this is a side effect of the main trend, which is write once, run anywhere. There are a handful of frameworks aiming to do that, and it is difficult to accomplish while relying on native APIs. Also, I'd suggest it's a trend for economic reasons rather than because people think Electron is technically superior to native APIs. The economic reason is that even a medium/large company can just hire a sole developer or two to spin up an Electron app that loads an existing web application, and it works everywhere and is "good enough." The alternative is hiring lots of people to build a native app that users only might think is better.
I don't think most Electron apps exist because people wanted a technically superior solution to native applications.
Writing apps for the browser is one thing; I don't particularly like it because it lacks the level of integration native apps have, but performance-wise it is not that bad. Browsers are well optimized and offer many interesting features out of the box.
For me, the problem is that an Electron app ships with its own browser instead of using the one the system already has.
Another example, mostly in the Linux world, is using containers with their own subsystems instead of using system libraries. Snap, AppImage, and Flatpak do that, especially Snap, which I think is the only one that doesn't support external dependencies.
I understand the appeal, but that's really wasteful.
> Software developers thinking
Please restrict this to web developers. My decade-old ThinkPad X200s with a two-core Core 2 Duo, 8 GB of RAM, and an SSD can do most things very well, except browsing websites.
Loading a large book in PDF format? no problem.
Opening ten years of email in Claws Mail? sure thing.
Opening the same amount of e-mail in thunderbird? A little sweaty, but why not?
Playing mp3 files over bluetooth? Yeah!
Wanna write a document in libreoffice writer? As long as you're a bit patient.
Chat with other people using Telegram or Hexchat? No problem.
... What? You want to browse the web? F--k you.
> Software developers thinking that 16GB is a small amount of RAM is why all of our software is fat barges of bloat now
No, that's not the reason.
Anyway, it's 2021; 16 GB max for a new laptop is factually low.
Laptops had 16 GB max more than 10 years ago.
I think it's fair to criticize the specs of one of the most expensive pieces of hardware on the market when they are somewhat underwhelming and when even new game consoles have the same amount of RAM, at a fraction of the price, with only one thing to do.
Edit: I'm not comparing performance or energy efficiency here, but a Dell with a Ryzen 5700 and 16 GB of RAM (upgradable) costs €829; an M1 with the same amount of RAM and less storage space costs €1,659 on the Apple website.
I think Apple could have done better in this regard.
The MacBook Air at $999 is Apple's entry-level Mac. If you want more than 16GB, you need to go M1 Pro or higher.
The M1 is their first-generation Mac SoC, still using LPDDR4; if you want 32GB, you will have to wait for the M2 MacBook Air, which should have LPDDR5 and allow 32GB. That's assuming Apple allows such a configuration; they may decide to save cost and stick with LPDDR4.
The M1 Pro and Max have a wider memory bus and controller, with LPDDR5.
You could build a laptop with 128GB if you wanted to, just at the expense of battery life.
If we don't have a borderline fair comparison between products, why don't we compare to a Chromebook? I think most if not all people should agree the build quality of the MacBook Air M1 is better than 99% of Dell's laptops.
I don't want to sound like an Apple apologist, but that is just an objective truth. If the argument is about pricing, then Apple isn't a brand that sells products on low margins. Consoles are sold at small or even negative margins. They aren't even the same business model. That is a marketing and pricing strategy question.
And almost no one realises it, partly because 99.999% of people, including but not limited to HN, aren't working in consumer electronics or the supply chain: the price of DRAM hasn't moved in the past 10 years, and LPDDR memory is actually more expensive per GB.
> The MacBook Air at $999 is Apple's entry-level Mac
That's for the 8GB model, and it's €1,429 on the Italian Apple store.
> I think most if not all people should agree the build quality of the MacBook Air M1 is better than 99% of Dell's laptops
It doesn't actually matter, honestly.
If people need more RAM and don't want to spend a fortune on the pro Apple models, they have to look elsewhere.
An ASUS ROG with 32 GB of RAM and a discrete Nvidia GPU costs more or less the same as an M1 Air with 16 GB (the maximum, I want to stress; that's all you are going to get out of it, forever).
My car has the same horsepower (or luggage space, or windshield size, or more seats, etc.) as a Porsche.
For some reason nobody here brings that up. Considering that AAPL is the most valuable company on the planet, they might just have a good understanding of their market.
AAPL is serving their market well, and they make enough money each year to buy the entire Dell company. _You_ may prefer Dell, and many people obviously do, but I don't think AAPL needs to stuff 64 GB of RAM into everything.
> My car has the same horsepower (or luggage space, or windshield size, or more seats, etc.) as a Porsche.
We agree.
Porsche's and Apple's base offerings are overpriced for what they actually offer.
A pearl-white 959 is one thing, but a Macan?
In this particular case, the M1 is not comparable to a Porsche, but more to a luxury hotel that has a very tiny bathroom or a limit of one cup of coffee at breakfast.
When you consciously spend the kind of money a Mac costs, you really expect the best money can buy.
> You may prefer Dell, and many people obviously do
You got it backwards though.
16 GB of maximum RAM for a laptop in 2021 doesn't even match the Dell offering...
Imagine if the aforementioned Porsche had a limiter that let you use only 50% of the accelerating power unless you bought the more expensive "pro" model.
That is basically the same car, only with the limiter removed.
Anyway, nobody said anything about Apple's ability to make money; it probably says more about the people who buy Macs than about Apple itself.
The article simply says that the best laptop on the market can't max out at 16 GB of RAM.
Which is true.
> The article simply says that the best laptop in the market can't have 16gb of maximum available RAM. Which is true.
False, Air is NOT the 'best'; that is the Pro, urm Pro-Max or whatever marketing verbiage Apple is spitting out now.
Air is the low-end model, so makes sense to make it 16Gb max (though that config is /hard/ to find retail). Reports say that M1-native code uses less memory, & memory usage is fast & efficient. So safe to say 16Gb with M1 ~= 20Gb Intel. So a 16Gb M1 Air is great /value/ machine, but not 'best'.
1st gen M1 Pro with max 16Gb, that was poor decision making IMHO.
The thing is that Apple is kind of moving towards making RAM the level 4/5 cache. It’s becoming more and more important to tie RAM closely to CPU to reduce power consumption and lower latency. That’s naturally going to make RAM size grow more slowly than it did before and make it non-upgradable.
I think the solution is to increase the amount of caching RAM on NVM storage, so the NVM acts more like RAM used to.
So I really don’t think what Apple is doing with regards to RAM is a problem. But they should have a user replaceable NVM slot. Perhaps tiered with a soldered on NVM chip so you can cache some of the data on a low power, low latency NVM.
It's just not enough when it's a multi-service backend that you want to run on your local machine, plus an IDE, a browser, and a few well-known Electron/React Native apps that play music and let you chat with your colleagues.
> _and a few well-known Electron/React Native apps that play music and let you chat with your colleagues._
This is the whole point: devs with 32 GB/64 GB machines looking at their own app taking 1 GB aren't going to realize that for many people this is 25% of the available non-OS RAM (an 8 GB system with 4 GB free after the kernel and everything else).
I understand why you're saying that, but when you try to develop in Xcode running a simulator and at the same time try to use your machine normally (keeping some tabs open, listening to some music, etc.), suddenly 8 GB seems like very little.
Moreover, when you develop for a platform like iOS, you have very limited control over how fat your application will be.
I was really struggling to run all the stuff for web development on a 16GB Dell XPS. The database, the text editor with plugins, and the browser all consumed too much. I don't know why, but I have even more running on my M1 MacBook, I have never bothered to check the RAM, and I don't have my system crawling to a halt every day.
I think some of this is very smart and fast swapping because I do sometimes notice a tiny hang when I use a window I haven’t touched in a while.
I don’t know if the M1 is the best laptop, but it is the best I have used by a wide margin.
The memory needed to efficiently compile an application isn’t the same as the memory needed to run it.
> The memory needed to efficiently compile an application isn’t the same as the memory needed to run it.
This is becoming more and more false. See Firefox for example.
You need 8GB of memory to efficiently compile Firefox, but you can run it in a lot less than that. Fundamentally when compiling the application you aren't running the application, you're running the compiler. They are different pieces of software with independent resource requirements. Even if they happen to be similar, that's really just a coincidence.
A tradition in desktop software (mostly third-party) was to develop for "future hardware" from the moment you started developing, as Moore's law was a good rule of thumb for predicting the future.
But for various reasons, RAM and CPU performance stagnated without a lot of change for around 10 years (until recently, with Ryzen 5000, Intel 12th gen, and the Apple M1).
Still, I find it funny that you could buy a 16GB 13" MacBook in 2012 as the top model... and today 16GB is still the max capacity of the line. For a point of reference, the 2002 PowerBook's max RAM capacity was 1GB.
For comparison, the whole OrgPad.com server currently uses 8 GB of RAM and 2 vCPUs. Every developer has 24+ GB of RAM, with all the main ones having 32 GB. This is because production uses "just" nginx, the JVM with the application server, PostgreSQL, MinIO, and a deduplicating backup using borg. The screenshot server is an extra VM; it runs headless Google Chrome with Puppeteer on a 1 vCPU, 2 GB RAM node.
We developers use a desktop operating system and programs with a GUI; that by itself uses at least an extra 1 GB of RAM. We have an IDE, like IntelliJ, which uses about 3 GB of RAM on my machine; there are web browsers, known memory hogs; there are e-mail clients, video-chat apps, and all of the production stuff besides backup too. Some of it is recompiling stuff live, e.g. shadow-cljs. There is also Spotify, or occasionally a video call or screen recording, e.g. using OBS Studio, which uses even more memory temporarily. With 32 GB of RAM there is a healthy reserve to make the experience smooth, but 16 GB doesn't cut it anymore.
So we use about half the amount of RAM eBay used on their Sun E10Ks in 1999 just for development, and that is OK. The RAM costs perhaps €100 now. Saving €50 really isn't worth it if I am getting hangs because of swap and the GCs doing their best everywhere.
But do you really need them all at the same time? Spotify and a video call with screen recording at the same time? Screen recording and IDE at the same time? You can present your code without heavy IDEs if you are using screen recording for that. You can also replace Spotify with spotifyd, for example, which takes 1/100 of the RAM or less. I could also open all my apps at the same time and complain that there is no memory. It might be convenient to keep apps open in the background, but you rarely need them at the same time. SSDs are pretty fast these days. Just open apps when needed.
Yes, we need all of that at the same time. Having Spotify running or not doesn't really change the equation much. Not having an IDE when you want to show something in development or want to do a quick Hangout over some code is a bummer.
Every new tool changes your workflow and is a distraction that doesn't benefit the customer or our work-life balance at all. Some people on our team run Windows because they feel at home there. I am not sure you can do a quick 'choco install spotifyd' and be done with it.
SSDs are fast, but switching windows is faster. People just forget what a hassle it is to wait 5 seconds here and 5 seconds there. We are used to hot-code reloading and a basically instantaneous response more than 95% of the time when developing. That is the power of Clojure/ClojureScript. It eats a bit more RAM in development. We can live with that as long as the RAM fits in a normal laptop and doesn't cost a fortune. We save much, much more by not running on Heroku/AWS. :-)
> Screen recording and IDE at the same time?
If I'm presenting some code, I'd prefer to do it in a familiar environment (i.e. my IDE of choice), so that I don't get lost, so that I know the keyboard shortcuts, and so that I don't add more stress and fumbling to the presentation. Also, everyone on my team uses IntelliJ IDEA, so presenting with VS Code might add more confusion or might make it impossible to show IDEA-specific stuff (e.g. run configurations or the debugger).
I dunno man, are you sure about this kind of supply-side thinking? I'm much more inclined to assume demand-side causality here. Which is to say: software has become "fat barges of bloat" because the /consumers/ (not producers) of said software are okay with it.
I hope you are aware that the world has evolved. We just don't run the compiler on bare metal anymore. Thanks to the latest UX, UI (GUI is obsolete), ML and AI advances the compiler runs in a web browser which runs in a VM, which runs in a container. And browsing the Slackware changelog is still a challenge on latest Edge /s
16GB isn't a lot when developing, but that doesn't mean your final code should need 16GB to run. My old desktop has 32GB and my laptop has 128GB, both at their motherboard limits.
Apple: "16GB is enough for reasonable people"
Every professor at uni: "Ten days is plenty of time to complete this project."
I’m not in the business of not using RAM. I’m in the business of making money. If you don’t want to use RAM, apply market pressure in that direction. The truth is, you have very little market power, so I genuinely don’t give a fuck.
I’m not interested in selling a $1.99 product to a guy with a $199 device. I’m interested in selling a $199 product to a guy with a $3999 device.
That’s the guy I’m going to listen to.
The problem is: you will be selling a $199 product to a guy with a $3999 device, but the product will be used on a $399 device, because the guy who bought your product is not the same one using it.
> I’m not interested in selling a $1.99 product to a guy with a $199 device. I’m interested in selling a $199 product to a guy with a $3999 device.
I love the way you're thinking (and presenting it)
> That’s the guy I’m going to listen to.
You and me both!
Saving a few dollars on RAM is penny wise, pound foolish.
The M1 Air is absolutely the best _computer_ (not just laptop) I have ever used. I've used laptops since about the year 2000, and custom desktops throughout this time too. I built a Ryzen 3600 desktop computer in 2020, and a year later this ultra-thin laptop just blew it away. It was cheaper too, and didn't make noise OR heat! All tasks were quicker to accomplish: transcoding video, compiling programs, opening programs, the lot, _BESIDES_ gaming. The battery life was the cherry on top... I mostly use emacs in my day and I can get _20 hours_ from a charge. It was unbelievable to finally have a nearly perfect machine which I could slip into my backpack and hardly notice.
The press on the Apple ARM processors is well deserved, and I've also been thoroughly enjoying my new M1 Pro processor this week.
I own an M1 MBA with 16GB of RAM and it's a lovely little machine. I generally do SRE work and it works out ok for that for the most part. Intellij Idea, VS Code, a terminal and a browser.
Coming from an i9 16" MBP, the difference in usability is massive; I hated those fans so much. I do miss the larger screen size though; that, coupled with the lack of ports (just 2 Type-C, I'd like more), made me consider getting one of the new 14". The battery life is so insanely overkill in this laptop that I'd be willing to compromise on it for a bigger display, perhaps for the first time ever.
Would highly, highly recommend it for traveling though.
> Would highly, highly recommend it for traveling though.
I travel with a ThinkPad Nano: smaller, lighter, and it has cellular wireless.
Yet it can be stuffed: I put a 2TB NVMe drive (Sabrent) in mine, and I'm now waiting for larger NVMe drives to hit the market.
I suppose it all comes down to OS preference more than anything. I've been using macOS for a long time and wouldn't feel comfortable with something else.
Owning an iPhone and an iPad doesn't help with escaping the walled garden, for sure.
90% of my time is spent in a browser and an editor these days; the OS is not a major factor in choosing a computer for me anymore.
I admit that is true, but I tried switching to PopOS and failed.
Non-software:
- Trackpad quality
- Screen quality (I was given a 1080p 13" touchscreen; the res should be enough, but the digitizer grid is bad)
- meme but no audio
- Not as great battery life
Software:
- so much shortcut switching. Will probably be better with time, but still a problem.
- No default menubar (I realize I can swap DE but.. ain't got time for that)
- no GPU accel on browsers. YouTube stuck on 720/1080.
- Karabiner is a hard luxury to give up, finally found kmonad. ~equal?
- Having to find replacements for all my QoL utilities (Alfred, Amethyst, Caffeine/Amphetamine, Quick Look)
UI slowdowns, awkward multiple desktops, and general slowness is definitely a factor that made me leave Windows though.
We must live in different worlds then. I love how Windows works with different-DPI screens, and remembers the screen layouts. Add to that WSL and AHK, and I have no shame in saying Windows is the OS I'm most productive with (even if I'm trying Linux again during this long weekend! I want to give it a chance!)
The deal breaker: I love wireless projection (shortcut: Win-K), which works all the time, unlike flimsy HDMI / micro-HDMI / DP cables that require switching the input on the screen.
Believe it or not, at a recent meeting we had to use Zoom locally because none of the 3 laptops (2 Macs, and my Windows ThinkPad) could reliably project to the big screen next to the drawboard.
A few specific issues I had all the time:
- Alt-Tab takes between 0.1 and 2 seconds to work, apparently at random. Windows are skipped, also at random, so I can't predict which window I'll get. I can't even switch to another window and back; most of the time, the order will get messed up.
- Bringing up the notification pane and the overview with all desktops sometimes works instantly, sometimes stutters and takes 3 seconds. The animations are always half-baked and awful to follow.
- Switching desktops with Ctrl+Win+Arrows sometimes seems to re-organize all windows at random, for no reason at all.
These issues usually show up after a few weeks of heavy use following an install. I don't have any third-party software that messes with Explorer. Re-installing fixes them for a few days or weeks. That was on a Dell XPS 15 with a Core i7 and 16GB of RAM, and it also happens with my larger, well-cooled workstation at the office.
macOS has its own multiple-desktop weirdnesses, but in general it's polished and works predictably on my M1 MBA. Windows was just painful to use, mostly because of the above issues. It seems like UI features have a _lot_ of inefficiencies that Microsoft can't or won't factor out.
Also, Windows 10 desktops can't be re-organized, which I imagine would be trivial to implement, and makes the macOS desktops much more practical.
Maybe your time…
I too have an X1 Nano and it’s a nice machine in a lot of ways — feels well built, light as a feather, looks nice, great matte 16:10 (!!) screen, great keyboard, and the trackpoint is excellent for mousing in space constricted settings (like planes). I too like that its storage can be expanded (though I wish the WWAN slot could be used for a second NVMe SSD, as it can on other machines).
The only complaint I have with it is battery life and its propensity for getting warm when doing anything even remotely demanding. I even went with the slower, lower power CPU and its battery life is still middling, and plugging it into an external display is enough to make it fire up its fans.
I wish I could swap its CPU out for something more efficient. Tiger Lake is supremely mediocre relative to current Ryzen and M-series offerings.
> though I wish the WWAN slot could be used for a second NVMe SSD, as it can on other machines
I'm working on that. Two options: removing the whitelist from the BIOS, or hacking an NVMe firmware to impersonate a whitelisted WWAN card (a small boot-time delay before answering requests could also be sufficient).
The latter might be easier, as the firmware update path seems to have more vulnerabilities. Also, NVMe drives are cheaper, in case I mess up and can't reflash with flashrom for one reason or another.
If anyone here works for a storage company and could make a firmware with a given PCI id or a 10 seconds delay before showing up on the bus, please get in touch!
> I wish I could swap its CPU out for something more efficient. Tiger Lake is supremely mediocre relative to current Ryzen and M-series offerings
Same; I want an X12 with an AMD CPU, or a Xeon, because even if the latter is a power hog, at least I'll have ECC!
Had a 32GB Mac Pro for the job (I use it like a desktop, basically).
And a few Lenovo X1 Carbons that can be carried around (including for travel).
The X1 Carbon is 14", which is large enough for real coding on the road; the Nano is 13", which seems a bit too small for coding to me. I feel 14" is the perfect size for both daily work and travel.
I have a P1 Gen 3 (because it's OLED and 4K), and while I love it at home or in the office, I don't like travelling with it. 14" is just too big.
I may be smaller than you, but to me 12" is the ideal for travelling, as the laptop fits in my purse!
I'm thinking about getting the X12 detachable as my next travel computer, but I'm waiting for next year's revision in the hope it'll get Xeon or AMD CPU options.
I got one and it’s so effortlessly fast, I can just open it and do stuff in one smooth motion
Coming from a 2014 MBP, the trackpad is a HUGE downgrade (not nearly as responsive), and macOS is for sure getting worse with each generation too. Other than that, it's pretty good.
A side note: I've been wanting to move to Linux; the only thing stopping me is the local music management & mobile syncing workflow, if anyone has any recommendations.
> the trackpad is a HUGE downgrade (not nearly as responsive)
Finally, someone pointed this out. But then again, the majority of people aren't picky about keyboards, key distances, or trackpads for some reason.
Okay, same situation right here. Seriously, if anyone has suggestions, especially for the mobile syncing workflow with Linux, I would love to hear them. The seamless message-notification experience with Apple is what holds me hostage.
Apple does this quite nicely. I don't have Mac equipment, but I can see the integration is good where the grass is greener. On Linux it is replicated using the websites for Messages (Android)/WhatsApp/Telegram, and Signal has an app.
When I used to have an iPhone (4S days), I loved the move to Android (Pixel), which gave me so much freedom. Cloud apps weren't restricted to Apple PCs (I think this has changed now), and I could just plug into my Linux laptop and drag and drop files to read on a journey. Also, tethering always worked nicely on Linux.
Have you tried KDE Connect? Despite the name, it doesn't require KDE
I just wish programs would stop randomly crashing all the time. I am not sure if it's connected to the dozen memory leaks or a consequence of those.
I have two MacBook Airs and I can't remember the last time a program crashed unexpectedly. Either you have some kind of hardware problem, or some really dodgy software installed.
That must suck. Crashes are not part of my experience on either of my 16GB M1s nor on my 32GB M1 Max.
The only software that I found vexing is Docker, so I don't use it. This is major and is a good reason to be wary of these Macs. I gave up on Docker on the M1 six months ago. Things may be better now.
16GB was fine for a secondary machine. 32GB is now enough for me to be productive as a primary machine.
Uhh, no Docker sounds huge for a lot of companies.
Docker works fine now. It took a few months for them to release an M1 update.
Just use the proper variant: podman
I see new Safari tabs crash and close immediately after I open them. It's been the most annoying issue, along with some pages rendering the first screen and staying blank below it for 2 minutes.
Lots of issues running Docker, and other strange segfaults in VMs.
I think the price/performance ratio depends on whether your use case works with the 8/256 or 8/512 configurations. The moment you need to go 16/512 or more, the price difference to the 14" is not that large, especially considering the better display, with the first improvement in resolution since October 2012, so over 9 years.
I recently bought a 16/1TB Air over a similarly specced 14" Pro. I can see the advantages of the Pro, but the Air also has things going for it: smaller and lighter, better battery life, 100% silent in absolutely every use case, no moving parts. I love this computer. It's without a doubt the best one I've ever owned, even better to me than the 2012 15" Pro was for its time.
However, due to the bloat of modern software, I might have to upgrade to a 32GB machine in the not-so-distant future. Hopefully Air-class machines won't be limited to 16GB for too much longer.
I’ve had the 16” M1 MBP for a month and have yet to hear the fans. With my 16” i9 MBP the fans were a very regular thing for the same workload.
Fast, responsive, silent and incredible battery life make it easy to call it the best machine I’ve owned.
True for the Pro at $1600, but 16/512 for the Air is $1400 against $2000 for the 14". Is it worth $600 for 2x GPU, ~2x CPU, 2x memory bandwidth, a mini-LED display + size, great speakers, etc.? Probably. But that $600 can feed my family for a month, and the actual performance difference for my workload would probably be +25% instead.
I only have Vivaldi, MS Outlook, and MS Teams open at the moment. Memory consumed? 7.49 GB. What does that tell you? 8 GB is good for web browsing, reading emails, and attending online meetings. If that's your typical workload then you're good to go.
The HN crowd knows the developer toolsets very well, so you all know 16 GB is the _minimum_ you would allocate for a developer. The HN crowd may be less familiar with the toolsets of the so-called creatives, doing things such as music production, video production, or photography. Long story short, the advice in those communities is to not even think about running less than 16 GB and to seriously consider 32 GB, especially if you're a professional. The video producers recommend getting as much RAM as you possibly can.
The MacBook Air M1 wasn't designed for those heavy-duty use cases, which is okay. That's why Apple has brought out the MacBook Pro with the M1 Pro and M1 Max. If you don't need it then don't buy it. Otherwise it's nice to have those options available.
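If you're curious where your own machine stands, here is a rough sketch using the third-party psutil package to read overall memory use and the biggest consumers (the listing and GiB conversions are just illustrative):
```
# Print overall memory use and the top 5 processes by resident memory.
# Requires: pip install psutil
import psutil

vm = psutil.virtual_memory()
print(f"total: {vm.total / 2**30:.2f} GiB")
print(f"used:  {vm.used / 2**30:.2f} GiB ({vm.percent}%)")

# Sort processes by resident set size; inaccessible processes report None.
procs = sorted(
    psutil.process_iter(["name", "memory_info"]),
    key=lambda p: p.info["memory_info"].rss if p.info["memory_info"] else 0,
    reverse=True,
)
for p in procs[:5]:
    print(p.info["name"], f"{p.info['memory_info'].rss / 2**20:.0f} MiB")
```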
The new Air's right around the corner, so hopefully some of those concerns will be addressed. I'm excited that Apple can now release yearly refreshes without hesitation. No more vendor roadmaps to rely upon. This should mean faster iteration and refinement toward what customers want.
On a Mac you can click the trackpad anywhere, which is great. But all Windows trackpads have a hinge at the top, so you can’t click near the top. Why don’t any Windows laptops use a Mac-like design? Is there a patent?
Have all the compatibility issues for the M1 chip been solved? I've been holding out on buying one of these machines because I'm afraid I will have to waste tons of time hacking together work arounds to get my development setup working.
The hardware maybe, yes, but I really don't like macOS, so it's a pass for me.
It's unfortunate it can't drive an 8K display, but good that it can do 6K.
For me, I see:
- No OLED screen
- Limited RAM
- No ECC option
- Fixed NVMe of a small size
- Can't have more than 1 NVMe for raid1
- No amd64 core for legacy software
It has advantages for some people, but not for me.
You have a very niche set of requirements there; few others care about the things you listed.
> You have a very niche set of requirements there; few others care about the things you listed.
What can I say, I have a refined palate for exquisite computers!
(but I'm surprised that on Hacker News of all places, my tastes are considered exotic!)
At the moment I have a ThinkPad P1 Gen 3 as my main computer (Xeon, 64GB of ECC RAM, multiple 2TB NVMe drives in RAID, OLED...) but I'll ditch it in a second if I can get an AMD equivalent (preferably with 128GB of ECC).
Do you use the eraserhead on the ThinkPad? I ask because most users (including here) probably use the touchpad, and non-Mac touchpads are 10+ years behind and borderline unusable.
Actually, most of the time I don't, but the Lenovo trackpad is great.
I've tried a Mac (because I like to keep my options open), but I found the trackpad to be only slightly better (mostly bigger), while the lack of a trackpoint ("eraserhead") was painful.
The trackpoint is nice once you figure out an acceleration curve that works for you. I am a huge fan.
It would be pretty amazing if that was available on Macs, but honestly, the touchpad is good enough that it's probably not needed (unlike most other laptops).
> non-Mac touchpads are 10+ years behind and borderline unusable
People keep saying this but somehow the rest of the world manages to use their touchpads just fine.
Others are now great, but Apple's trackpad is still the best. Is there any other laptop trackpad that supports click-anywhere like Apple's Force Touch?
I'm not sure about Force Touch, but my Windows-running Lenovo X230's touchpad is very good.
Just a little different from the Apple ones I use, but really quite usable.
The trouble with PCs is you're comparing Apple kit with anything from the cheap consumer gear by Asus, Dell, and HP, with hilarious results.
Whilst they're all running similar chipsets, the materials used are all very different.
In fact, do any laptops have the glassy feel of Mac trackpads?
I know if I were going from my Lenovo and using a Mac, I'd have very negative feelings about the Mac, until I had three months continuous usage under my belt.
I suppose if I thought all touchpads were supposed to be scratchy, small, and imprecise, then I would manage just fine too—by plugging in a mouse.
The RAM is annoying, but it's a tradeoff for having it integrated into and shared by the CPU/GPU. macOS hasn't supported RAID boot for some years, but buying a larger internal storage configuration does get you more physical flash chips and increased bandwidth.
And Apple has Rosetta 2 translation, which is extremely good; many applications beat Intel performance even when running translated. Having an Intel core would have defeated the cost and performance benefits of the M1 and would have made the cooling, power supply, etc. more expensive and larger.
> Having an Intel core would have defeated the cost and performance benefits of the M1 and would have made the cooling, power supply, etc. more expensive and larger
Not as an optional core that could be powered on when necessary.
The physical footprint of PSU and cooling would still have to be larger to accommodate the increased max wattage/heat, whether the chip is on at any given time or not. The chip would also need its own memory or some interface to access the main memory, which is built into the M1 chip. And, there would be a ton of firmware and microcode updates and stuff to keep track of and secure which was one reason they wanted to have their own chips.
From what I’ve heard, translated apps are nearly as fast as native. Even if not, the number of apps that need to be translated will only decrease with time.
I guess I have a hard time seeing how a dedicated full x64 core could possibly be a good use of transistors compared to the alternatives (more arm cores, more cache, fewer transistors/lower cost, lower power usage, etc).
Is OLED really great for computer use? I would imagine all of the static UI would make burn-in an issue
An OLED screen is a dealbreaker for me on anything but a smartphone. I don't want the risk of burn-in anywhere near any large, expensive display. I have laptops that are over a decade old and have seen a ton of use, and their screens are still in great shape, which almost certainly wouldn't be the case had they been equipped with OLED panels.
The moment that microLED displays start hitting the market, however, I will be buying them. All of the advantages of OLED without the drawbacks. Until then it's IPS/VA for me.
I love working at night.
I'm flexible on every requirement but OLED for my non travel laptop.
No OLED = won't buy.
But the burn-in...? You haven't gotten it? Or do you swap out the device/screen when it happens?
IDGAF about burn-in. I care about working at night, and the $ benefits of my extra productivity are greater than the hardware replacement costs.
Also, I have worldwide next-day onsite warranty (the #1 reason I like Lenovo!), so if or when something becomes a problem, it will be solved quickly without even going to a repair shop, let alone leaving my laptop there!!
If that warranty really works as advertised.. sounds like a great situation, and you get the OLED.
OLED is great yeah.
How long have you had it? I want one but I am scared of burn-in. That's for a TV and a monitor.
About 6 months. I use Windows 11 with no "fixed element" (e.g. hidden taskbar, each app in fullscreen), which may mitigate such issues.
I've checked recently with a pixelcheck tool and I couldn't see any alteration, but we'll see in a year or two!
Be sure to check back with us in 6 months, year at the most
I admire your trust in me, but put it in your calendar or something, because I'm sure I'll forget
What does working at night have to do with OLED?
Why would you waste all that money on an amd64 core when Rosetta runs x86-64 code faster?
Faster than what?
Than any previous x86-64 core in a Mac.
Not much of an achievement, tbh
That is a ridiculous thing to say.
Not really. Pre-ARM MacBooks are kind of slow due to their CPUs throttling. Surely that shouldn't be news to you.
They are designed for a specific thermal and power profile. So is the M1. This is not something that is changing. It is not relevant to the current discussion in any way whatsoever.
You have good taste!
I work a lot with Java 8 and Docker. Does anyone have hands-on experience with these on the M1? I am thinking about moving from a 2015 MBP to a 16GB M1 MBA.
It depends on what you're doing with Docker...
Docker runs an arm64 Linux VM on an M1 Mac. My experience is that it works well for anything that has arm64 images available. It can also run x64 containers via QEMU emulation.
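If you want to sanity-check which architecture a given container actually runs under, you can ask the container itself. A rough sketch shelling out to the Docker CLI from Python (assumes a running Docker daemon with multi-platform support; alpine is just an example image):
```
# Run the same image natively (arm64) and under QEMU emulation (amd64)
# on an M1 Mac, printing the architecture each container reports.
import subprocess

for platform in ("linux/arm64", "linux/amd64"):
    out = subprocess.run(
        ["docker", "run", "--rm", "--platform", platform,
         "alpine", "uname", "-m"],
        capture_output=True, text=True, check=True,
    )
    print(platform, "->", out.stdout.strip())  # aarch64 vs x86_64
```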
I found this to be hit-or-miss. I need the full version of MS SQL Server for one of the projects I work on. MS only supports x64 and the image won't start because of what looks like some kind of memory mapping/translation problem with the QEMU emulation.
I have the 16GB M1 Air. I haven't done a lot of Java on it, but it seemed to work smoothly. Docker works okay, but IO is really slow. I'm not sure if that is due to the new architecture, or if it's just due to the Mac running Docker in a VM.
Is it possible to run Windows on these yet (in virtual)? That's the only thing stopping me from getting an M1.
You can use parallels to run Windows applications, but that's not really a "native" Windows environment. I love my M1 MacBook Pro that I was given for work but I absolutely _despise_ macOS.
Parallels on M1 can run only ARM64 Windows, not x86. So you can't use Visual Studio 2022, but VS Code should be OK.
And no more Bootcamp.
I actually learned this about Parallels today and was pretty bummed out. Bootcamp would be nice; I was reading that some agreement between Qualcomm and Microsoft is ending soon, which might open up the doors, but who knows.
Could it be that you're used to the increasing stupidity of Windows?
The Win -> Mac conversion is annoying, I agree. But once you've shrugged off the expectation of Windows behaviours you'll be fine.
Cognitive dissonance is a thing with user interfaces.
It's definitely "all in my head" how much I don't like macOS. The window management and the way it bunches together applications and windows in the taskbar drive me crazy. Keyboard shortcuts are another pain point (I have a pretty intense Karabiner-Elements setup to use the same keyboard in Windows and macOS).
I'm curious what you mean by the "increasing stupidity of Windows"... I've been using Windows for personal use since 1995 and macOS for work use for about 7 years now, and they honestly don't feel too different from how they've always been. The biggest change in Windows in the past few years has been WSL and being able to use Linux tools "natively", which has been awesome. macOS is almost exactly the same as when I first started using it...
You've not noticed that Windows has moved the system settings dialogs with each version since NT 3.1?
Yeah, things move around, so what? macOS moves things around a lot, too.
Both OSes have built-in search capabilities for finding things like settings. I don't think I've manually clicked through a settings menu at the OS level for a long time.
Want to change bluetooth settings? On macOS, hit Cmd+Space and type in "bluetooth" then hit enter. On Windows, hit the Windows key and type in "bluetooth" then hit enter.
Long battery life and a fanless, quiet design are killer features IMO, but the 13" screen is a no-go for my aging eyes.
But have the M1 MBAs been superseded by the M1 Pros and M1 Maxes?
The M1 Pro and Max just have more CPU and GPU cores and more RAM modules. The architecture and single-thread performance are the same. So, no, if you don’t need more cores or GPU performance, it’s just as good.
The larger form factor and the improved hardware design choices (no Touch Bar, et al) are just so sadly lacking from the M1, though.
I'd never get a device that's intentionally not designed to handle its own thermal output.
FFS, it's the same thing minus a fan.
Apple's M1 unified memory architecture (published on November 11, 2020) is my "warehouse/workshop model" (the hardware architecture section was published on February 6, 2019). It is an architecture supported by mathematical models, unlike the "von Neumann architecture".
References:
1. The Math-based Grand Unified Programming Theory: The Pure Function Pipeline Data Flow with the Principle-based Warehouse/Workshop Model.
Its mathematical prototype is the simple, classic, vivid elementary-school maths problem of the "water input/output of the pool", widely used in social production practice. My theory rebuilds the theoretical foundation of the IT industry; it relates the computer theory system fully and perfectly to mathematics in a simple and unified way: from hardware integrated circuits and computer architecture to software programming methodology, architecture, programming languages, and so on. It solves the most fundamental and core major problem in the IT industry: that the foundation and core of IT theory lack mathematical support.
https://github.com/linpengcheng/PurefunctionPipelineDataflow
2. Why my "warehouse/workshop model" can achieve high performance and low power consumption (taking the Apple M1 chip, Intel AVX-512, and Qualcomm as examples).
https://github.com/linpengcheng/PurefunctionPipelineDataflow...
Betteridge's Law strikes again
https://en.wikipedia.org/wiki/Betteridge%27s_law_of_headline...
"...You can install a blank disk in your Mac..."
Wat?
Lol, 16GB not enough for dev? Get real.