Extratone

hardware

by David Blue

Surface Laptop 2

Assuming Jesus Christ is in your thoughts this evening before yet another anniversary of his birth, I am infinitely astonished by the truth in what I’m about to suppose with you. If the Son of God was living today, most of us have agreed for a long time now that he’d use marijuana recreationally – big fuckin whoop. I think it’s far more interesting and appropriate (we all know his birthday was wholly reconfigured into a consumerist holiday long ago) to speculate on how he’d behave after finding himself inadvertently in the market for a new laptop within the ~$1000 range (following a stubbed toe whilst walking on water incident, perhaps.) Surely, it would not be entirely holy for him to opt in to the Foxconn-complicit world of Apple, Incorporated, nor the openly-blasphemous one created and exuberantly grown by Google LLC, and I’m afraid he’d be too much of an End User idiot to integrate any of the sparse Linux-dedicated hardware available. In May of 2017, however, Billy Gates’ old Microsoft finally released “the laptop we’ve always wanted them to make,” but could its recent update be truly worthy of our Lourde and Saviour? Or your newly-enrolled offspring? Should you sprint downstairs and swap out the new MacBook Air you just bought?

From an entirely valid perspective, an observer might declare my last two months of 2018 to be an outright shameful period defined by hypocrisy and traitorous betrayals. After finally taking the time to explore the full narrative surrounding Linux and the bloody tale of Microsoft’s cruel genocidal destruction of countless creative software projects throughout computing’s adolescence (see: “Embrace, Extend, Extinguish,”) I eventually declared myself “100% Open Source” and began outlining an essay designed primarily to express that Linux is finally ready to be the operating system of the people without succumbing immediately to the brand of cybercrackpot illegitimacy associated with the L-word in the minds of the general public so readily thanks to decades of misinformed, condescending neckbeards. Such a feat would require entire new planes of cultural awareness and dialectal delicacy, and certainly result in zero personal reward from even the best possible outcome, yet I proceeded to ponder the subject very deliberately all the way through October because I genuinely believed in a new democratized future of computing. 2018 had been my Grand Awakening to the idyllic possibilities of Free and Open Source Software (FOSS) across the whole applied spectrum from office suites to social networks, yet – as two thousand eighteen comes to an end – I’ve managed to find myself among the most jaded, soul-sapped tech community I have yet encountered: Microsoft Administrators.

Complementing this Linux-laden culture in which I was not so long ago deeply embedded was a confused and frustrated outlook regarding what I felt were excessive and completely idiotic sacrifices across the industry’s hardware design to the greedy, gluttonous god of Lightness. It seemed only reasonable to Myself As Consumer that the entire buying public should exclusively seek designs prioritizing greatest possible performance and battery life, even from portable computers and smartphones, so I assumed my perspective on this updated iteration of Microsoft's most laptopy Surface laptop – which exists in large part to compete directly with Apple's beloved (and just-updated) MacBook Air – wouldn't be at all useful. However, a few weeks ago, my employer prompted me with a sweet sweet ultimatum: for the sake of a tax break, I want to spend ~$1000 on a laptop for you as soon as possible. Yes, I know I should consider myself a very fortunate man – this wasn't even the first time I'd been surprised with the “hey, I want to buy you a laptop but it has to be today” experience, and may even be considered a sort of sequel to my Tales of Whirlwind Manic Consumerism, but it’s ultimately one of the most idiotic strategies to achieve a major purchase decision and completely inadvisable for anyone on a budget. Still, I was indeed thankful to be put in a nearly-identical situation of Consumer Electronic haste, and have come to be especially appreciative of the specific time I was approached as such: just one week after Microsoft launched the Surface Laptop 2.

Considering the vast majority of its users are trapped inside my television, there’s no harm in covering the Surface brand with our virtual palm for a moment. If you’ll indulge me so, you’ll notice that Microsoft has actually delivered unto us The Laptop II – as in, the sequel... the successor to every other laptop computer yet conceived... but does this one machine truly represent the second coming of the Notebook Christ? Naturally, it would be a bit zealous to stand behind this extreme statement with 100% sincerity, but there truly are certain elements of this Personal Computing product's execution which do indeed will its user to expect and/or desire the same from others in coming years. As I've stated before, I also simply cannot help but be jazzed by such bravado from the mouths of even a company with as crooked and hateful a history as Microsoft's. (Note: no other technology company has actually achieved what Microsoft historically has in this regard, and hopefully none ever will again.)

I must be honest: it hasn't yet been two months and I've already scuffed and perhaps even stained the beautiful maroon alcantara surrounding my machine’s touchable body, but it’s occurred to me that I might draw upon the vast library of automotive interior tutorials available on YouTube – and even purchase some of the alcantara care-specific products they recommend – in order to really maintain the exterior of the Laptop II. After all, alcantara was undeniably car culture’s material first. I should also confess that objectively, the Surface Laptop II is the best-suited computer for my personal uses that I’ve ever owned or used for any length of time. Subjectively, I don’t think all of the hardware design touches that make it so – like its keyboard layout, divine 3:2 aspect ratio, and particular I/O complement – have yet had the chance to seduce my emotional brain into truly loving it as much as I certainly should by any reasonable measure. For my own sake, I hope I’m able to fall into child-like infatuation with its magic, but in the interim, I believe the coldness of my heart should preserve any useful commentary I might have to add. Though this is undoubtedly the most timely review of a hardware product I’ve ever published, I’d still ask that you indulge my perspective suggesting the importance of considering it part of a package with its operating system, considering that the whole of tech media would’ve unanimously declared it the year’s “best laptop” were Apple’s aging, but still widely-adored MacOS absent from the frame.

I've tested a bunch of laptops this year, running the spectrum of 2-in-1s, Chromebooks, MacBooks, gaming laptops, etc. Everyone's needs are going to be different, which is why there's no such thing as a one-size-fits-all for laptops. But enthusiasts’ laptops aside, I strongly feel the Surface Laptop 2 is the best laptop of the year. And by that I mean the best laptop for most folks' needs.

With as much humility as possible, I must add that I myself am anything but “most folks,” yet my experience so far with the product has been one of astonishing compatibility and battery life. Using recommended power settings, the Surface Laptop 2 endured four hours and twenty-two minutes of a workload it wasn’t particularly designed for including heavy web browsing, image manipulation, brief audio/video capture with OBS, and moderate subsequent editing in Audacity and OpenShot. Dan Seifert – Vox Media’s “only Windows user” – reported “about seven hours” of Microsoft’s claimed 14.5, but frankly, I don’t know what any of y’all are doin – I’m just thankful this machine is a better marathoner than any other I can recall owning. While we’re on the subject, I consider Microsoft’s inclusion of a magnetically-attached power cable and unassuming auxiliary USB charging port on the attached power supply to be personal godsends – further evidence, even, that the Surface Laptop 2 was actually designed to be nice to use. For the sake of those readers actually in the market for a new laptop who’ve somehow found themselves here, though, Raymond Wong’s review for Mashable is the most thorough offering you’ll find – it’s quoted front and center on Microsoft’s web page for the Laptop II, even – but it’s important to mention that his critical comparative perspective predates the late launch of its ultimate competitor, the new MacBook Air. Rather pitifully, however, his colleague’s “good, but not great” resolution suggests that Apple failed to challenge Microsoft’s relatively moderate update enough to warrant any revision, and that Mashable as a publication stands by my new laptop’s Best of the Year title, for whatever it may or may not be worth to you.

If the new MacBook Air came in at the same price as the old one, it would be a steal. Sure, you pay for the privilege of being able to use macOS on the Apple ecosystem. But in years past that also meant access to cutting-edge features and design. As pretty as the MacBook Air is, there's nothing that innovative about it. In today's Apple, it seems, privilege amounts to just staying current.

You won’t find many others who regularly invest editorial merit in publishing 2500+ word laptop reviews anymore, which I’d concede is plenty reasonable in the Surface Laptop’s case, at least. Perhaps your first point of comparative entry should be a barely-dated conversation between Kara Swisher, Lauren Goode, and Dan Seifert on Too Embarrassed to Ask regarding the original’s odds of truly competing in the “premium laptop” segment (if you’d prefer to hear from those who struggle to take it seriously, that is.) Assuming the original product direction of the Surface line still stands, Microsoft doesn’t actually intend to sell at high volume, especially when it comes to this runt of the marque, which does not hesitate to omit itself from the popular discourse of the moment surrounding tablets as the future of all computing to which all of its siblings have contributed so much. Though I shall always remember my dearest Libel (the special edition Spectre x360 with which I built most of Extratone) with great respect and deep fondness – I think it’s even worth mounting on some sort of plinth – the significantly-cheaper Laptop II has already demonstrated true value in its “premium” segment bragging rights with far superior materials and build quality. If you’re looking for the prettiest possible slice of magnesium lightness but aren’t the sort to have followed the story of Microsoft’s first venture into personal computer production since it began in the last year of the Mayan calendar, it’s worth your while to read Joshua Topolsky’s projections of the project’s impact on the popular narrative surrounding Microsoft from history’s freshest possible perspective: the eve of the first Surface tablet’s launch.

The entire tablet was designed in-house by Microsoft's teams, and if you believe what was said in the presentation yesterday, design and functionality in hardware has suddenly become a big deal in Redmond. That's a big shift, and it's an important one. The announcement of the Surface shows that Microsoft is ready to make a break with its history — a history of hardware partnerships which relied on companies like Dell, HP, or Acer to actually bring its products to market. That may burn partners in the short term, but it could also give Microsoft something it desperately needs: a clear story.

A pungent stigma festered from Microsoft’s history of inadequate and inelegant public relations (especially compared to its greatest longtime rival) has remained in relentlessly obvious orbit around every “significant” Windows and Office update for so long that its status quo has grown into a truly inhibitive force for all parties involved. Topolsky is unquestionably a compromising favorite of mine, but it’s hard not to decry then-CEO Steve Ballmer’s failure to comprehend Josh’s day-after insight in the whole three months that passed before his Seattle Times interview in September, 2012. Ultimately, The Big M is either incapable of understanding any alternative utopic Visions of Computing to its own, or simply overwrought with the same counteraspirational carelessness its culture has always depended upon. In analytical terms regarding Ballmer’s utilization of the forum’s opportunity to finally tell the fucking story, at least, the timidity of a term like “pre-eminent software” as a viably bright new beacon in contrast with “people would say we were a software company” (emphasis mine) – as if Steve-O himself doesn’t even have the power to publicly describe his company’s function as its #1 man – combined to form the apex of what was almost impressively-negligent behavior.

I think when you look forward, our core capability will be software, (but) you'll probably think of us more as a devices-and-services company. Which is a little different. Software powers devices and software powers these cloud services, but it's a different form of delivery...

Don’t make the same mistake I did and wear yourself out trying to extract the meaning from these three sentences – there’s none to be found. Ultimately, whatever opportunity the Surface project could have provided for Microsoft’s identity has been vastly overshadowed by its success as last resort supercatalyst to restore any sense of dignity and pride within the hardware companies who produce the vehicles. In Fall 2017, The Register quoted industry gossip regarding the company’s new CEO Satya Nadella and his intent to “exit the product line” because “overall they are not making money [and] it doesn’t make sense for them to be in this business,” but newcomers to this conversation should know that no subsequent reporting has corroborated anything but a sustaining future of the line, though the measurable rate of innovation in Microsoft’s products continues to leave much to be desired. Now that you’ve heard from the experts, though, allow me to expand our lens a bit and examine what the Surface Laptop 2’s existence suggests as per The Present & Future of Computing.

The Clam Clan

In case I’ve yet to mention it, all of my tech writing is in substantial debt to my much-older and child-oriented siblings for providing 8 nieces and nephews over the course of 11 years – if not for any reason but the perspective offered by the slightest observation of their day-to-day lives. In this profoundly bizarre and historic technological sprint our species is experiencing, the differences in their respective relationships with consumer tech as they’ve grown up are fascinatingly… disturbingly significant. My eldest niece Abby was born four years after myself in 1998, and her younger sister Amber just three years later in 2001. All three of us are Aquarians who went to the same public schools (aside from 2 exceptions on my part,) and the two sisters have been close, significant influences on each other all their lives, yet the way Abby and I use and think about computers differs significantly from Amber’s. Our first real PCs introduced an important social and intellectual vehicle to our pre-teen lives, and both of us still “live on” our machines as young adults. For us and many others from this short-lived microgeneration of ours, budget laptops like the Dell Inspiron 2200 (which served as the first “real computer” for both of us) introduced the internet and Being Online as a State of Being with AIM groups, MySpace, and Yahoo! chain mails before smartphones and tablets were capable of doing so.

Amber prefers to use her iPhone for most everything and regards her computer as a tool for work – it’s booted up and down exclusively for that purpose, which is significantly healthier than the habit Abby, myself, and many of my Online friends developed: we left our computers running and Logged On all the time because we were otherwise unreachable. We learned from origin to depend on them for 100% of our computing tasks – from streaming Pandora to playing Flash games within six billion open browser tabs – which likely explains both our ADD and its resulting influence on the ease with which our personal computers can distract us. As a Journalism student and professional photographer, Abby uses the new 15-inch MacBook Pro, and [Insane Blogger] David Blue has spent years looking for an alternative, becoming the first and only iPhone user to make extensive use of its Bluetooth keyboard support in the process, but both of us are entirely uninterested in the rest of the industry’s insistence on convertibles, removable keyboards, or ‘professional’ tablets. I wish the Linux community was finally ready to drop the elitist pretenses plaguing its nerdy history; I wish I could finally tell someone like Abby that a machine like the System76 Galago Pro could slot itself into her workflow without losing her time or compatibility – that the reputation surrounding Linux People had finally lost most of its validity and her desire to learn more about computing as a young woman and Power User would be met with respectful and worthwhile conversation from their end. Unfortunately, I’ve still found some of the Old Guard to be elitist, socially behind, and juvenilely possessive, as if computing was still the niche interest from their 1980s and 90s childhoods. Though this conversation certainly warrants its own essay in the future, I’ll just express now that it’s a real shame some folks don’t realize the entire point of making great things is ultimately to give them to the world.

The opportunity I’ve had in the past year to finally get my Linux distro frenzy over with and out of my system managed to both radicalize and democratize my understanding of MacOS, Windows, and Linux as they are in the present. While I had nothing better to do, fiddling with Ubuntu Studio and Linux Mint to the extent I did throughout Spring and Summer led me to further appreciate the value of keyboard shortcuts, gave me my first real proficiency with a command line, helped globalize my comprehension of my own technological privilege, reacquainted me in a huge way with both the true history of software and my own personal past as an experimental test tube baby of Microsoft’s, and helped to answer a lot of questions I’d worried over for years about why software seemed like it simply couldn’t improve anymore. While it’s true that important open source projects like ElementaryOS continue to sprout from the Linus Extended Universe and the growing Open Source community on Mastodon is filled with brilliant, helpful, unpretentious, and remarkably curious enthusiasts (probably because many of those I’ve interacted with so far are non-cis and/or non-white,) little ole me was able to stumble upon some totally unnecessary and excruciatingly ignorant sociopolitical commentary by way of the white, middle-age host and his undoubtedly-white and staunchly libertarian caller on a live broadcast of the Ask Noah Show. (It’s not as if I haven’t said ignorant and very ugly things too, but I wasn’t a forty-something father on a semi-professional talk show representing an entire community.)

Essentially, I was quite frustrated and disappointed to find that Linux is still let down most by its own community, but the operating system itself is still much further along on its way to becoming a real alternative for the average user than mainstream tech journalism would have you believe. However, in my case, finally taking the time to really learn about Open Source computing also helped me understand (surprisingly) why Apple and its environment continue to be the best and most popular choice for professional applications. Linux Mint gave me tremendous power in enabling me to alter, specify, and redesign the most minute details of its interface, but I couldn’t have foreseen how all-consuming such power would be for someone like myself. In retrospect, I’ve realized that I ended up spending more time perfecting my custom LibreOffice Writer shortcuts than I did actually writing – I somehow found myself in a mind state which justified unironically creating a shortcut for the Shortcuts menu. Though I swore I’d never succumb to the bewildering hobby of collecting and exploring different Linux Distributions, it took no time at all for me to fill a folder with disc images of the installers for almost a dozen different interpretations of the operating system after I’d made the simple concession to myself that I’ll just try Ubuntu, that’s all. The most profound realization from all this (arguably otherwise wasted) time: for a user like me, a walled garden is actually the best place to be productive because apparently, I don’t have the self-control to keep myself from running away and/or fixating on completely unproductive tasks without its boundaries. I think this phenomenon is perhaps the worst culprit in the persistence of the aforementioned divide between “computer people” and everyone else who simply uses computers, as I’m sure any one of the latter could tell you after all of five minutes with a Linus type.

The most comprehensive and somewhat-urgent revision to illustrate the significance of this contrast from my perspective regards the exceptional iOS/MacOS markdown-based notetaking app Bear. Frankly, my own “Word Processing Methodology” essay from June has already become problematically out of date (and therefore embarrassing) in terms of my own knowledge of the segment and its history. Though I promised the conversation was “done,” I’ve continued to explore further into word processing’s history as well as its current state. “I had a go at Bear’s free iOS experience and saw little functional difference from DayOne,” the old, negligent, cursory David Blue noted, but if I’d simply been willing to cough up a bit more time and just $1.49 a month for Bear Pro, I’d have spared myself such shame and realized that the hype around this app really is 100% justified. Bear is the most beautiful iOS app I’ve ever seen, but I’m now also fully qualified to declare it the most effective execution of “distraction-free” writing software to come along in the past 25 years. Developer Shiny Frog’s secret is their perfect balance between capability and simplicity. It turns out, Daily Content Lord Casey Newton’s word on this matter really was worth more than mine, not to mention more succinct: “Bear may look simple, but there’s power underneath the surface.”

Those longtime Linux and Windows diehards who’ve tolerated me thus far, listen up: MacOS may be ancient, neglected, and full of incongruencies, but its single-minded methodology paired with Apple’s iCloud really does make it the most effective and elegant environment for most people to simply get shit done. It’s clear that many of you have realized the importance of simplicity for compact and/or educational distributions, but let me just add that the democratization of Linux provides a gargantuan development opportunity to make something that beats MacOS at its own game without starting from such a shitty premise and all of its resulting compromises – all without detracting from any other technically-minded distributions whatsoever. That is the magic of The Distro, remember?! If you’ve existed in a similar state of confusion to that of my entire adult life regarding the appeal of Apple products – despite having once been an extensive OSX user, myself – you’re very welcome for the insight. Instead of paying me for the profound self-improvement I’ve just provided, try prioritizing this newfound knowledge the next time you talk to your MacBook Pro-loving friend about their workflow. If you’re like myself, you’ll find their arguments have magically transformed from the bewildering bullshit they’ve always seemed to be into challenges for future competing operating systems to surpass Apple’s old bitch and excel in because MacOS and even its much-younger iOS counterpart – as well as the billions of people who depend on them – desperately need real competition in order to maintain their viability, much less become what products of the world’s wealthiest company should be.

Yes, the manner in which these operating systems are perceived really is an important discussion prompted by a product as insignificant as the Surface Laptop 2 because as you read, the industry is bracing for another paradigm shift in computing, which many believe (preposterously, I might add) could be as significant and disruptive as 2007’s introduction of the iPhone. This machine of Microsoft’s and its “new” MacBook Air counterpart could potentially be the last designs to carry us to a computing future where the tried-and-true clamshell design is forgone entirely by the mainstream, but Apple’s release of this year’s new iPad Pro prompted even the most Cupertino-loving tech commentators to respond with genuine discord along with a few long-overdue shouts of “are you crazy?!” I’m very proud of The Verge’s Nilay Patel, in particular, for so eloquently deconstructing its usability for all but the very wealthy. “It is impossible to look at a device this powerful and expensive and not expect it to replace a laptop for day-to-day work,” he reminds us in the introduction to his full review of the updated product, along with a beautifully transient sentiment which I think we all needed to hear again: “I don’t think people should adapt to their computers. Computers should adapt to people.” Even something as consumerist and bourgeois as the introduction of another pricepoint-burgeoning Apple hardware flagship can turn a simple tablet review into a much-needed manifesto for a user-centric way forward for the industry, which is itself worthy of celebratory encouragement.

I’ve favored The Verge and its cast long past the point of excess throughout the span of my work about technology, but Nilay’s review and its accompanying episode of The Vergecast are truly special and profound gems of content that shouldn’t be passed up. Apparently – as the Editor-in-Chief immediately insists as the episode begins – his “ongoing theory” that “the more important you are, the less actually important work you do, and the more likely you are to be an iPad user” roused anger from “that whole class of [billionaires,]” but the experiences behind his argument actually suggest that Apple’s own favorite child of late – into which it has begun investing and thereby implicitly sponsoring over its much older brother as the ultimate heir of the majority’s future computing – has unequivocally failed to do its part in growing the iPad Pro into the “laptop replacement” we’d all heard so much about. Of iOS 12’s performance as an operating system beneath true work-related tasks, he exasperates “you have to spend all of your time figuring out how to do stuff instead of doing stuff,” which I couldn’t help but hear as echoes of my own late Linux lamentations. As thankful as I am to have finally achieved enlightenment of the Planet Apple, I’m afraid I was pitifully late: its very natural laws underwent their most brutal tests of the 21st century this past year. Now that I’ve finally come to adore the elegant effectiveness of a new generation of iOS apps like Bear, I’m faced with yet another of the episode’s statements of weight: “I think it’s time to stop pretending that the future of computing looks like Apple’s restrictions.” On the opposing end of the line, the world’s first trillion-dollar company’s other major product release of 2018 managed to disappoint even the most fanatical fans of its original operating system’s best-selling platform with an insultingly mediocre update to the MacBook Air marque upon which it once so fondly doted.

My best friend’s parents bought her the original Surface tablet when she enrolled in art school, and her frustration with its lackluster keyboard (among others) leads MacOS alternative-seeking users like us to wish Microsoft had started with a traditional design like the Surface Laptop first. Perhaps Apple and Microsoft’s emphasis on their tablets is nothing but a bit premature for the most current crop of users, and the rest of my nieces and nephews will expand upon an entirely different methodology of usership when they receive their freshman computer. Those elders of us who still take the Clamshell form seriously and love printing our documents are apparently facing a future industry saturated with products we can’t believe in, but it’s up to you to decide if this issue is worth expending your energy in advocacy for either camp. With my 120+ word per minute proficiency with physical keyboards, I for one have been completely bewildered by the iPad as anything but an indulgence for reading text on the web, and I’m pleased as punch with my Surface Laptop 2. Even if it proves to be the last new computer I’ll ever own to come as optimized for my use, I’m just grateful and astonished it happens to be the best yet.

#hardware #microsoft #future

by David Blue

iPhone 8 Plus and Dave

A decade of iPhone has probably ruined my life, but will the 8 Plus finally end it?

Is my True Tone bullshit on?

“True Tone” is so forgettable, everybody had to mention it first. Quite simply, it uses an ambient light sensor to fiddle with white balance, warming the colors of the display as an immediately-obvious whole, yes, but an interesting contrast to show off is no longer inherently justified in being called a “feature” in Apple products, anymore. Essentially, no matter who you ask (aside from Jon Rettinger,) you should not buy an iPhone 8, though I did last Fall, not only because I had to suddenly decide on a handset in less than 24 hours, but — if anything — to say goodbye to the form, the operating system, and the tech company which I have depended upon and carried with me virtually every day for my entire adult life. I’d originally decided to abandon this review due to a variety of unexpected circumstances, but Apple and its iPhone have maintained their place in the news with their battery scandal, and a third of a year with the 8 Plus has included some experiences which warrant a send-off before iOS 12 is released, making it (and myself) totally irrelevant forever.

As the longstanding benchmark of the smartphone industry’s state at any given time, the iPhone can be easy to reflect upon as a product once occupying a state of universal exemption from criticism, but it has, in fact, never been so. As Nilay Patel noted, one might regard the 8 as the last compromise of “basically four years” of the same design. Since launch, it’s unsurprisingly stayed a wee bit too far behind on the spreadsheets for most Android-type folks — not that I’ve ever believed them truthfully incapable of comprehending what it means to package a product, given where their greasy startups all eventually ended up. (You cannot doubt me — I once took a year-long sabbatical from iOS with a Sony Xperia Play, and my authority is absolute.) The rest are trying to decide whether or not to pay $200 more for “the phone of the future,” which knows when you’re watching it, and is only good for playing half an hour of stupid video games before it needs a charge.

So far, I have maintained that my first generation iPhone was the best handset of all time — one hell of an Email Machine that lasted me close to five years — throughout the last two with actual motherboard exposed to the elements in the corner of its cracked screen. That said, who knows how it’d feel to be coerced into using “iPhone OS 2” as it was called, then, for an entire workday in 2018? Two years prior to bringing home an 8 Plus, I vowed that my 6S Plus would be my last ever Apple device, but this one actually feels like a last hurrah. Though the ability to Tweet directly from the swipe-down notification menu is still nowhere to be found (it’s been gone for 5 releases, now, and would seem to have been forgotten by literally everyone but myself,) one gets the sense that Apple’s efforts to add to the iPhone 8 and iOS 11 were to make amends with us by settling a few debts.

In part, they did. Native apps got a major overhaul — including Mail, which was startling, considering that I’d been looking at what was near as makes no difference the same UI my eldest phone shipped with. As a result, it alone constitutes my benchmark for an email service, and I have been left without a clue as to what a good one looks like. (Apparently it was really bad?) Since time began, there has always been at least one alternative email app of the moment that tech journos refer to as the must-have, end-all replacement. Edison Mail is currently the smoother, faster, most modular option — at least for another few minutes — but I’ll never know it as I know Mail, and I’ll never want to. Playing around with experimental email apps is too scary. What if I decide once again to kill that massive number in the red badge and need to immediately mark 40,000 emails as read? It took all of my iPhone 4’s 1.0GHz CPU and proprietary software over 18 hours — how am I supposed to trust a shabby little 6-month-old startup with such an important task? Anybody with a hundred bucks can make an app, you know.

Why is the App Store now the best-looking publishing software on iOS?

One might interpret the App Store’s redesign as an attempt by Apple to control this conversation — of both the trending new thing and the old “essentials” that you’ve probably had tucked away in an untouched folder for years. Technically, whoever the hell is writing those gorgeously-presented daily bits has made them a publishing company, though I’m not so sure I’m not the last remaining user who’s continued semi-regularly visiting their “Today” section. If I did want to actually read about apps (I don’t — who does?) it wouldn’t make much sense to seek critical reviews from the faceless boffins behind the platform itself, regardless of how much better it may look than all of the tech news sites, paywall or no.

Native screen recording could conceivably come in handy once or twice, but I see no reason why the red bar must remain at the top of the render. It has, though, which could explain the total lack of any such video in the wild. Front-facing 4K, 60fps capture is impressive, but useless — vloggers all have GoPros or DSLRs, these days, and sharing through Snapchat and Instagram will always be ultra-compressed. (Here are two sloppy test clips — at the zoo, and fishing.)

Perhaps some have figured out the new Files “app,” but it’s sat on my homescreen for months, untapped, and it will likely remain there for all time as a sort of soothing trophy — a thanks for my legacy iPhone loyalty. My reward for half a lifetime of syncing, scrolling, and tolling? I can now view some of the files on my Mobile Computing Device, and even scan documents in, which is mostly novel (though it is fun to digitize excerpts from physical text.) At some point, I must’ve mischecked a permanent option because all file types now open only in an app that does not recognize them. God bless.

Roof Photo

Somehow, I’ve managed to fill my social circle with precisely zero iOS-using folks. All of my friends and colleagues use Android devices (including Tim’s supercool Nextbit Robin,) which provide a few handy datapoints (like the camera in my fiancé’s Galaxy S8,) but deprive me of any significant experience with the ostensibly intoxicating cult of iMessage. I’m constantly listening to and reading tech writers claim that it’s one of the only reasons they’re still using iPhones, but my own food-OS loving biome has forced me to find others, and frankly, I can’t imagine looking at the gluttonous palate of available mobile, cross-platform messaging services (Telegram, now Telegram X, WhatsApp, Signal, Snapchat, Facebook, Instagram, Twitter, Discord, Slack, Tinder?, Google Hangouts, Google Allo, Google Chat, Viber, Skype, Line, Wire, etc.) and thinking… well, none of this will do!

Honestly — even if I’d actually been at all informed in my haste, the photographic capabilities of the 8 Plus, alone would’ve sold it. It’s not the new filters, gif functionality, or even “3D Photos” — it’s those mythical dual 12MP sensors (which it shares with something called the iPhone X.) They’re no less than infallible. After four months of astonishing captures in all manner of conditions, I don’t even care how exactly they do it anymore — it’s better to be left marveling. This first example was taken at Keystone, Colorado in the middle of a dark, cloudy Fall night — the amount of light they were able to find — “up to 80% more,” according to Apple — is just impossible.

The vast majority of the samples in my iPhone 8 Flickr Album were taken within the native Camera app as it ships and left unedited. (Especially before just a few weeks ago, when I discovered Halide.)

favorites from the past few days.

Here is an unquestionably sensible progression from which iPhone has never wavered far since its fourth generation set the standard, but it’s one of an unfortunate few. Siri is still useless and silly apart from its “disable all alarms” feature and its ability to sound itself off in response when you’re hysterically screaming and digging for it through the vast plush of a forty-year-old Lincoln. The customizable Control Center makes toggling low power mode, orientation lock, wifi, and bluetooth less frustrating (note the last two aren’t quite hard switches,) though it should’ve come years ago. Notifications are slightly more sensible — certainly better than they were on Android Gingerbread, but I’ve heard things’ve changed quite a bit since then.

I have been tripped up by the lack of a 3.5mm audio jack a few times, but it just wouldn’t make sense from a hardware perspective, and the new external stereo capability should refute those who can’t or won’t understand. Yes, it would be nice if Apple hadn’t led the industry to quite such a compromising obsession with thinness — we’d all trade a lot of substance for exponentially greater battery life, storage capacity, water resistance, etc. — but I don’t see much sense in expending your energy holding up signs in Silicon Valley.

I’ll be here long after you’ve died, and you know why? Because I took the time to sync my apps.

Two years ago, a new generation of social apps and the preposterous notion of a quad-core CPU in my iPhone 6S Plus seemed like the harbinger of a world I no longer understood. Now, most of those services have expanded to the far boundaries of my reach, and I’ve stopped counting chips. Refinement of the hardware design is reverent to the extreme. It’s pretentious, but Apple’s decision to pause on the 8 to consider details like stuffing the legal text in the software and adding a little bit of weight back in for ergonomics’ sake leads one to regard it as a monument to all the devices along the development timeline that have led to this… last triumph. Or, it would have perhaps, had they not sold so many.

One could argue that good execution of consumer electronic design means minimizing as much as possible the obstructions in the way of the user completing any given task, and the iPhone 8 Plus has surpassed the vast majority of these for myself — and I am, surely, a “power user.” iOS has changed a lot in the decade I’ve employed it — in far too many ways for the worse — but this pair of handset and software have reached my imagination’s limit for what I could possibly want to do. Augmented reality and wireless charging won’t ever have a place in my future, for better or worse. Face ID is much too peculiar. Surely, this iPhone is the ultimate expression of the first and fourth generation’s foundation.

If the 6S Plus was indeed the key to my immortality, I’m afraid the 8 Plus heralds my imminent demise. Whether or not it’s an early one is for you to decide. This really is my last iPhone.

#ios #software #hardware #handsets #photography