We use Macs all day at Postlight, because that’s the law. But I am my own person at home, so I got a Huawei MateBook X Pro laptop from the Microsoft Store and put Linux on it. My old Mac laptop was about seven years old, mostly fine but starting to show signs of age.
I kept expecting to switch back to the Mac because, well, Linux on the Desktop is one of the longest-running jokes in tech. But installation was very easy and after a couple hours I had things the way I like them. I use a lot of web apps, and those run exactly the same. Some things work better, like search.
If I didn’t like it, it would be one button push to switch back. I have a monitor switch. A month went by, and now another. I keep forgetting to go back to the Mac.

So here is my review of Ubuntu 18.10 Cosmic Cuttlefish: Sure, why not? It looks pretty good. Not as smooth as the Mac. You have to restart the desktop if you want to see certain changes, and you do that by hitting Alt-F2 and typing the letter “r.” That’s not the best thing ever, but it does work. I don’t make that many changes. If you want to share files on a network you almost can, but then you need to edit a configuration file. Even a person who does videos from inside the Linux community about why Linux sucks doesn’t have much to complain about.
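(For the curious: the configuration file in question is usually Samba’s. This is just a sketch of the kind of edit involved, assuming a stock Ubuntu Samba setup; the share name and path here are made-up examples, and your system may differ. Back up the file first.)

```
; /etc/samba/smb.conf (typical location) add a share block like:
[Public]
   path = /home/me/Public
   read only = no
   guest ok = yes
; then restart Samba, e.g.: sudo systemctl restart smbd
```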
Also: Whenever a Mac loses its marbles I end up having to Google “MDS process won’t stop” and then type some weird nonsense into the terminal. If you click your heels three times with Windows 10 you’ll find a bunch of system control panels that look like they belong to the last century. Every operating system is a batch-card processing retro-mess underneath. Linux makes this a virtue to be celebrated rather than a sin to be hidden. I appreciate that. It’s nice not to have to pretend that computers actually are good, or work.
Out of curiosity I recently booted up an emulator to run Mac OS 9. Yes, this means that I was running Linux to run an operating system from 1999. I hadn’t seen Mac OS 9 since the Clinton Presidency. It looks great! Runs fine. Has a snap to it. Apple design at its best. It made me wonder: What have we all been doing over the last 20 years? Blockchain? Drop shadows? In his recent annual wrap-up Bill Gates wrote about how differently he sees the world in his 60s than in his 20s:
Back then, an end-of-year assessment would amount to just one question: Is Microsoft software making the personal-computing dream come true?
Given the actual events of history that seems to be a bit of a simplification, but true at its essence: during the vast commercial wave when computers were becoming personal information utilities, Bill Gates could look back on any given year and ask himself, “Did we make PCs more P and less C?” And whatever your thoughts on Windows, the answer was usually “sure.” Imagine that: a whole global product strategy contained within a two-letter initialism. You didn’t even have to have the best product.
When the web and blogging came along, it seemed like computing wasn’t just personal but social. When mobile computing came along, it seemed like the final realization of that personal vision: power in your pocket! But it was a weird alchemy; we gave up control to get more power. I used to have my own blog, and in good months it had tens of thousands of readers. I’d write essays and get emails in reply. Now I have 40,000 followers on Twitter, which doesn’t require me to organize my thoughts, and is occasionally overrun by actual Nazis. It’s a trade lots of people made. It wasn’t an upgrade. It just…was. And even mobile won’t last forever, as we’re learning now.
In 2019 we’ve replaced that old word “personal” with a lot of different words. I wonder what the equivalent big end-of-year questions might be today:
- Ecological Computing. Did our work make it more plausible that the global temperature will not rise by 3 degrees? (Sure, but no one seems to want to buy that.)
- Decentralized Computing. Did our software create a decentralized marketplace that replaces the need for a centralized authority with an immutable ledger of transactions? (Sadly, yes.)
- Cloud Computing. I mean, this is where the action is, right? Huge progress every minute across thousands of companies.
- Social Graph Computing. Did we continue to create a global village where everyone is connected? (Yes, but at what cost?)
- GPU Computing. Can we make use of the vast parallel processing power of custom hardware, like video cards, to do new things faster? (Absolutely, if you’ve got enough data and money.)
- Ethical Computing. Did we create systems that protect people who are vulnerable first? (Not really where the money is.)
- Physical Computing. Did we teach the car to drive itself? (Hmm.) Are our thermostats watching us enough?
- Web Computing. Did we further the dream of the shared web platform available to all? (Some steps forward, some back.)
- Collaborative Computing. There’s not a lot of this.
The Xerox Alto computer first came into existence about 45 years ago (I just saw one at the Smithsonian last week; it had terrible screen burn, and you could almost make out what had been running). The Alto encoded many of the things we associate with personal computing: windows, mouse, bitmap display, and so on. It was a huge influence on the Mac and Windows. One of the people who worked on it was Butler Lampson. In a 2006 oral history (Alan Kay interviewed him), Lampson said:
Today’s PC is about 30,000 times better than the Alto in every dimension. And it does maybe, what, 30 times as much, to be generous?…That leaves a factor of roughly a thousand to be accounted for. Where did it go? Well, you know, some of it went into things like internationalization, and interoperability, and this and that. But there is absolutely no doubt that most of the instructions that are being executed are not doing anything useful.
You’ve got a super-machine and most of the time it’s just sitting there. It could be going out onto the Internet and…finding you deals? Gathering interesting quotes for your blog? Sending spam? Scheduling your dental appointments? Searching for aliens?
You know who figured it out? Facebook, Google, Apple, Microsoft, and Amazon, who connect hundreds of thousands of servers together to make cloud computing environments, then charge for them. You know who else? Netflix and Spotify. Netflix knows when to spin up servers to meet demand and when to wind them down. These giant operators work to wring the maximum utility out of every piece of hardware. They’re orchestra conductors over thousands of machines.
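The spin-up, wind-down logic is, at its core, simple arithmetic. Here is a toy sketch of the idea (not Netflix’s actual system; the function name, numbers, and 60% utilization target are all illustrative assumptions):

```python
import math

def desired_servers(requests_per_sec, capacity_per_server, target_utilization=0.6):
    """Toy autoscaling rule: run just enough servers to keep each one
    around 60% busy, leaving headroom for spikes. Always keep at least one."""
    needed = requests_per_sec / (capacity_per_server * target_utilization)
    return max(1, math.ceil(needed))

# Evening peak: demand triples, so the fleet grows to match, then shrinks again.
print(desired_servers(3_000, 100))  # 50 servers at peak
print(desired_servers(1_000, 100))  # 17 servers off-peak
```

That last step (winding back down when demand drops) is the part the giants do relentlessly and the rest of us mostly don’t.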
Running Linux on the Desktop in 2019, you realize that desktops don’t matter as much when you can run Spotify, Netflix, Dropbox, and Google Docs everywhere. And 1Password, of course. Of course it’s not personal any more. Amazon and Google and the like are making amazing tools (smarter machine learning, better servers and databases) and licensing them out. What used to be “software” (or apps) is now services. And what used to be “personal computing” is now something like “utility computing.” Or, more simply: when you ask where that “factor of 1000” (maybe now 10,000) went, we have an answer. It went into the cloud.
The entire industry has moved to cloud-based thinking and we’ve picked up cloud imperatives: We think in transactions between people instead of thinking about augmenting and empowering humans. This is fair because no one pays for software, and fortunes must be made somewhere. Cloud computers, as Free Software people remind us, are other people’s computers. But even though PCs are unfashionable the word “personal” is still possible. My goal as a software person is to figure out ways to put “personal” back into the systems we discuss and build. “Efficient” or “slick” or “easy to deploy to AWS” are great things, but “empowering” and “gave me a feeling of real control” are even better.