Extratone

future

by David Blue

Surface Laptop 2

Assuming Jesus Christ is in your thoughts this evening before yet another anniversary of his birth, I am infinitely astonished by the truth in what I’m about to suppose with you. If the Son of God were living today, most of us have agreed for a long time now that he’d use marijuana recreationally – big fuckin whoop. I think it’s far more interesting and appropriate (we all know his birthday was wholly reconfigured into a consumerist holiday long ago) to speculate on how he’d behave after finding himself inadvertently in the market for a new laptop within the ~$1000 range (following a stubbed-toe-whilst-walking-on-water incident, perhaps.) Surely, it would not be entirely holy for him to opt in to the Foxconn-complicit world of Apple, Incorporated, nor the openly-blasphemous one created and exuberantly grown by Google LLC, and I’m afraid he’d be too much of an End User idiot to integrate any of the sparse Linux-dedicated hardware available. In May of 2017, however, Billy Gates’ old Microsoft finally released “the laptop we’ve always wanted them to make,” but could its recent update be truly worthy of our Lourde and Saviour? Or your newly-enrolled offspring? Should you sprint downstairs and swap out the new MacBook Air you just bought?

From an entirely valid perspective, an observer might declare my last two months of 2018 to be an outright shameful period defined by hypocrisy and traitorous betrayals. After finally taking the time to explore the full narrative surrounding Linux and the bloody tale of Microsoft’s cruel genocidal destruction of countless creative software projects throughout computing’s adolescence (see: “Embrace, Extend, Extinguish,”) I eventually declared myself “100% Open Source” and began outlining an essay designed primarily to express that Linux is finally ready to be the operating system of the people without succumbing immediately to the brand of cybercrackpot illegitimacy associated with the L-word in the minds of the general public so readily thanks to decades of misinformed, condescending neckbeards. Such a feat would require entire new planes of cultural awareness and dialectal delicacy and would certainly result in zero personal reward from even the best possible outcome, yet I proceeded to ponder the subject very deliberately all the way through October because I genuinely believed in a new democratized future of computing. 2018 had been my Grand Awakening to the idyllic possibilities of Free and Open Source Software (FOSS) across the whole applied spectrum from office suites to social networks, yet – as two thousand eighteen comes to an end – I’ve managed to find myself among the most jaded, soul-sapped tech community I have yet encountered: Microsoft Administrators.

Complementing this Linux-laden culture in which I was not so long ago deeply embedded was a confused and frustrated outlook regarding what I felt were excessive and completely idiotic sacrifices across the industry’s hardware design to the greedy, gluttonous god of Lightness. It seemed only reasonable to Myself As Consumer that the entire buying public should exclusively seek designs prioritizing greatest possible performance and battery life, even from portable computers and smartphones, so I assumed my perspective on this updated iteration of Microsoft's most laptopy Surface laptop – which exists in large part to compete directly with Apple's beloved (and just-updated) MacBook Air – wouldn't be at all useful. However, a few weeks ago, my employer prompted me with a sweet sweet ultimatum: for the sake of a tax break, I want to spend ~$1000 on a laptop for you as soon as possible. Yes, I know I should consider myself a very fortunate man – this wasn't even the first time I'd been surprised with the “hey, I want to buy you a laptop but it has to be today” experience, and may even be considered a sort of sequel to my Tales of Whirlwind Manic Consumerism, but it’s ultimately one of the most idiotic strategies to achieve a major purchase decision and completely inadvisable for anyone on a budget. Still, I was indeed thankful to be put in a nearly-identical situation of Consumer Electronic haste, and have come to be especially appreciative of the specific time I was approached as such: just one week after Microsoft launched the Surface Laptop 2.

Considering the vast majority of its users are trapped inside my television, there’s no harm in covering the Surface brand with our virtual palm for a moment. If you’ll indulge me so, you’ll notice that Microsoft has actually delivered unto us The Laptop II – as in, the sequel... the successor to every other laptop computer yet conceived... but does this one machine truly represent the second coming of the Notebook Christ? Naturally, it would be a bit zealous to stand behind this extreme statement with 100% sincerity, but there truly are certain elements of this Personal Computing product's execution which do indeed will its user to expect and/or desire the same from others in coming years. As I've stated before, I also simply cannot help but be jazzed by such bravado from the mouths of even a company with as crooked and hateful a history as Microsoft's. (Note: no other technology company has actually achieved what Microsoft historically has in this regard, and hopefully none ever will again.)

I must be honest: it hasn't yet been two months and I've already scuffed and perhaps even stained the beautiful maroon alcantara surrounding my machine’s touchable body, but it’s occurred to me that I might draw upon the vast library of automotive interior tutorials available on YouTube – and even purchase some of the alcantara care-specific products they recommend – in order to really maintain the exterior of the Laptop II. After all, alcantara was undeniably car culture’s material first. I should also confess that objectively, the Surface Laptop II is the best-suited computer for my personal uses that I’ve ever owned or used for any length of time. Subjectively, I don’t think all of the hardware design touches that make it so – like its keyboard layout, divine 3:2 aspect ratio, and particular I/O complement – have yet had the chance to seduce my emotional brain into truly loving it as much as I certainly should by any reasonable measure. For my own sake, I hope I’m able to fall in child-like infatuation with its magic, but in the interim, I believe the coldness of my heart should hopefully preserve any useful commentary I might have to add. Though this is undoubtedly the most timely review of a hardware product I’ve ever published, I’d still ask that you indulge my perspective suggesting the importance of considering it part of a package with its operating system, given that the whole of tech media would’ve unanimously declared it the year’s “best laptop” were Apple’s aging, but still widely-adored MacOS absent from the frame.

I've tested a bunch of laptops this year, running the spectrum of 2-in-1s, Chromebooks, MacBooks, gaming laptops, etc. Everyone's needs are going to be different, which is why there's no such thing as a one-size-fits-all for laptops. But enthusiasts’ laptops aside, I strongly feel the Surface Laptop 2 is the best laptop of the year. And by that I mean the best laptop for most folks' needs.

With as much humility as possible, I must add that I myself am anything but “most folks,” yet my experience so far with the product has been one of astonishing compatibility and battery life. Using recommended power settings, the Surface Laptop 2 endured four hours and twenty-two minutes of a workload it wasn’t particularly designed for, including heavy web browsing, image manipulation, brief audio/video capture with OBS, and moderate subsequent editing in Audacity and OpenShot. Dan Seifert – Vox Media’s “only Windows user” – reported “about seven hours” of Microsoft’s claimed 14.5, but frankly, I don’t know what any of y’all are doin – I’m just thankful this machine is a better marathoner than any other I can recall owning. While we’re on the subject, I consider Microsoft’s inclusion of a magnetically-attached power cable and unassuming auxiliary USB charging port on the attached power supply to be personal godsends – further evidence, even, that the Surface Laptop 2 was actually designed to be nice to use. For the sake of those readers actually in the market for a new laptop who’ve somehow found themselves here, though, Raymond Wong’s review for Mashable is the most thorough offering you’ll find – it’s quoted front and center on Microsoft’s web page for the Laptop II, even – but it’s important to mention that his critical comparative perspective predates the late launch of its ultimate competitor, the new MacBook Air. Rather pitifully, however, his colleague’s “good, but not great” resolution suggests that Apple failed to challenge Microsoft’s relatively moderate update enough to warrant any revision, and that Mashable as a publication stands by my new laptop’s Best of the Year title, for whatever it may or may not be worth to you.

If the new MacBook Air came in at the same price as the old one, it would be a steal. Sure, you pay for the privilege of being able to use macOS on the Apple ecosystem. But in years past that also meant access to cutting-edge features and design. As pretty as the MacBook Air is, there's nothing that innovative about it. In today's Apple, it seems, privilege amounts to just staying current.

You won’t find many others who regularly invest editorial merit in publishing 2500+ word laptop reviews anymore, which I’d concede is plenty reasonable in the Surface Laptop’s case, at least. Perhaps your first point of comparative entry should be a barely-dated conversation between Kara Swisher, Lauren Goode, and Dan Seifert on Too Embarrassed to Ask regarding the original’s odds of truly competing in the “premium laptop” segment (if you’d prefer to hear from those who struggle to take it seriously, that is.) Assuming the original product direction of the Surface line still stands, Microsoft doesn’t actually intend to sell at high volume, especially when it comes to this runt of the marque, which does not hesitate to omit itself from the popular discourse of the moment surrounding tablets as the future of all computing to which all of its siblings have contributed so much. Though I shall always remember my dearest Libel (the special edition Spectre x360 with which I built most of Extratone) with great respect and deep fondness – I think it’s even worth mounting on some sort of plinth – the significantly-cheaper Laptop II has already demonstrated true value in its “premium” segment bragging rights with far superior materials and build quality. If you’re looking for the prettiest possible slice of magnesium lightness but aren’t the sort to have followed the story of Microsoft’s first venture into personal computer production since it began in the last year of the Mayan calendar, it’s worth your while to read Joshua Topolsky’s projections of the project’s impact on the popular narrative surrounding Microsoft from history’s freshest possible perspective: the eve of the first Surface tablet’s launch.

The entire tablet was designed in-house by Microsoft's teams, and if you believe what was said in the presentation yesterday, design and functionality in hardware has suddenly become a big deal in Redmond. That's a big shift, and it's an important one. The announcement of the Surface shows that Microsoft is ready to make a break with its history — a history of hardware partnerships which relied on companies like Dell, HP, or Acer to actually bring its products to market. That may burn partners in the short term, but it could also give Microsoft something it desperately needs: a clear story.

A pungent stigma, festering out of Microsoft’s history of inadequate and inelegant public relations (especially compared to its greatest longtime rival), has remained in relentlessly obvious orbit around every “significant” Windows and Office update for so long that its status quo has grown into a truly inhibitive force for all parties involved. Topolsky is unquestionably a compromising favorite of mine, but it’s hard not to decry then-CEO Steve Ballmer’s failure to comprehend Josh’s day-after insight in the whole three months that passed before his Seattle Times interview in September 2012. Ultimately, The Big M is either incapable of understanding any alternative utopic Visions of Computing to its own, or simply overwrought with the same counteraspirational carelessness its culture has always depended upon. In analytical terms regarding Ballmer’s utilization of the forum’s opportunity to finally tell the fucking story, at least, the timidity of a term like “pre-eminent software” as a viably bright new beacon in contrast with “people would say we were a software company” (emphasis mine) – as if Steve-O himself didn’t even have the power, as its #1 man, to publicly describe his company’s function – amounted to almost impressively negligent behavior.

I think when you look forward, our core capability will be software, (but) you'll probably think of us more as a devices-and-services company. Which is a little different. Software powers devices and software powers these cloud services, but it's a different form of delivery...

Don’t make the same mistake I did and wear yourself out trying to extract the meaning from these three sentences – there’s none to be found. Ultimately, whatever opportunity the Surface project could have provided for Microsoft’s identity has been vastly overshadowed by its success as last resort supercatalyst to restore any sense of dignity and pride within the hardware companies who produce the vehicles. In Fall 2017, The Register quoted industry gossip regarding the company’s new CEO Satya Nadella and his intent to “exit the product line” because “overall they are not making money [and] it doesn’t make sense for them to be in this business,” but newcomers to this conversation should know that no subsequent reporting has corroborated anything but a sustaining future of the line, though the measurable rate of innovation in Microsoft’s products continues to leave much to be desired. Now that you’ve heard from the experts, though, allow me to expand our lens a bit and examine what the Surface Laptop 2’s existence suggests as per The Present & Future of Computing.

The Clam Clan

In case I’ve yet to mention it, all of my tech writing is in substantial debt to my much-older and child-oriented siblings for providing 8 nieces and nephews over the course of 11 years – if for no other reason than the perspective offered by the slightest observation of their day-to-day lives. In this profoundly bizarre and historic technological sprint our species is experiencing, the differences in their respective relationships with consumer tech as they’ve grown up are fascinatingly… disturbingly significant. My eldest niece Abby was born four years after myself in 1998, and her younger sister Amber not quite three years later in 2001. All three of us are Aquarians who went to the same public schools (aside from 2 exceptions on my part,) and the two sisters have been close, significant influences on each other all their lives, yet the way Abby and I use and think about computers differs significantly from Amber’s. Our first real PCs introduced an important social and intellectual vehicle to our pre-teen lives, and both of us still “live on” our machines as young adults. For us and many others from this short-lived microgeneration of ours, budget laptops like the Dell Inspiron 2200 (which served as the first “real computer” for both of us) introduced the internet and Being Online as a State of Being with AIM groups, MySpace, and Yahoo! chain mails before smartphones and tablets were capable of doing so.

Amber prefers to use her iPhone for most everything and regards her computer as a tool for work – it’s booted up and down exclusively for that purpose, which is significantly healthier than the habit Abby, myself, and many of my Online friends developed: we left our computers running and Logged On all the time because we were otherwise unreachable. We learned from origin to depend on them for 100% of our computing tasks – from streaming Pandora to playing Flash games within six billion open browser tabs – which likely explains both our ADD and its resulting influence on the ease with which our personal computers can distract us. As a Journalism student and professional photographer, Abby uses the new 15-inch MacBook Pro, and [Insane Blogger] David Blue has spent years looking for an alternative, becoming the first and only iPhone user to make extensive use of its Bluetooth keyboard support in the process, but both of us are entirely uninterested in the rest of the industry’s insistence on convertibles, removable keyboards, or ‘professional’ tablets. I wish the Linux community were finally ready to drop the elitist pretenses plaguing its nerdy history; I wish I could finally tell someone like Abby that a machine like the System76 Galago Pro could slot itself into her workflow without costing her time or compatibility – that the reputation surrounding Linux People had finally lost most of its validity and her desire to learn more about computing as a young woman and Power User would be met with respectful and worthwhile conversation from their end. Unfortunately, I’ve still found some of the Old Guard to be elitist, socially behind, and juvenilely possessive, as if computing were still the niche interest from their 1980s and 90s childhoods. Though this conversation certainly warrants its own essay in the future, I’ll just express now that it’s a real shame some folks don’t realize the entire point of making great things is ultimately to give them to the world.

The opportunity I’ve had in the past year to finally get my Linux distro frenzy over with and out of my system managed to both radicalize and democratize my understanding of MacOS, Windows, and Linux as they are in the present. While I had nothing better to do, fiddling with Ubuntu Studio and Linux Mint to the extent I did throughout Spring and Summer led me to further appreciate the value of keyboard shortcuts, gave me my first real proficiency with a command line, helped globalize my comprehension of my own technological privilege, reacquainted me in a huge way with both the true history of software and my own personal past as an experimental test tube baby of Microsoft’s, and helped to answer a lot of questions I’d worried over for years about why software seemed like it simply couldn’t improve anymore. While it’s true that important open source projects like ElementaryOS continue to sprout from the Linus Extended Universe and the growing Open Source community on Mastodon is filled with brilliant, helpful, unpretentious, and remarkably curious enthusiasts (probably because many of those I’ve interacted with so far are non-cis and/or non-white,) little ole me was able to stumble upon some totally unnecessary and excruciatingly ignorant sociopolitical commentary by way of the white, middle-age host and his undoubtedly-white and staunchly libertarian caller on a live broadcast of the Ask Noah Show. (It’s not as if I haven’t said ignorant and very ugly things too, but I wasn’t a forty-something father on a semi-professional talk show representing an entire community.)

Essentially, I was quite frustrated and disappointed to find that Linux is still let down most by its own community, but the operating system itself is still much further along on its way to becoming a real alternative for the average user than mainstream tech journalism would have you believe. However, in my case, finally taking the time to really learn about Open Source computing also helped me understand (surprisingly) why Apple and its environment continue to be the best and most popular choice for professional applications. Linux Mint gave me tremendous power in enabling me to alter, specify, and redesign the most minute details of its interface, but I couldn’t have foreseen how all-consuming such power would be for someone like myself. In retrospect, I’ve realized that I ended up spending more time perfecting my custom LibreOffice Writer shortcuts than I did actually writing – I somehow found myself in a mind state which justified unironically creating a shortcut for the Shortcuts menu. Though I swore I’d never succumb to the bewildering hobby of collecting and exploring different Linux Distributions, it took no time at all for me to fill a folder with disc images of the installers for almost a dozen different interpretations of the operating system after I’d made the simple concession to myself that I’ll just try Ubuntu, that’s all. The most profound realization from all this (arguably otherwise wasted) time: for a user like me, a walled garden is actually the best place to be productive because apparently, I don’t have the self-control to keep myself from running away and/or fixating on completely unproductive tasks without its boundaries. I think this phenomenon is perhaps the worst culprit in the persistence of the aforementioned divide between “computer people” and everyone else who simply uses computers, as I’m sure any one of the latter could tell you after all of five minutes with a Linus type.

The most comprehensive and somewhat urgent revision illustrating the significance of this contrast, from my perspective, regards the exceptional iOS/MacOS markdown-based notetaking app Bear. Frankly, my own “Word Processing Methodology” essay from June has already become problematically out of date (and therefore embarrassing) in terms of my own knowledge of the segment and its history. Though I promised the conversation was “done,” I’ve continued to explore further into word processing’s history as well as its current state. “I had a go at Bear’s free iOS experience and saw little functional difference from DayOne,” the old, negligent, cursory David Blue noted, but if I’d simply been willing to cough up a bit more time and just $1.49 a month for Bear Pro, I’d have spared myself such shame and realized that the hype around this app really is 100% justified. Bear is the most beautiful iOS app I’ve ever seen, but I’m now also fully qualified to declare it the most effective execution of “distraction-free” writing software to come along in the past 25 years. Developer Shiny Frog’s secret is their perfect balance between capability and simplicity. It turns out, Daily Content Lord Casey Newton’s word on this matter really was worth more than mine, not to mention more succinct: “Bear may look simple, but there’s power underneath the surface.”

Those longtime Linux and Windows diehards who’ve tolerated me thus far, listen up: MacOS may be ancient, neglected, and full of incongruencies, but its single-minded methodology paired with Apple’s iCloud really does make it the most effective and elegant environment for most people to simply get shit done. It’s clear that many of you have realized the importance of simplicity for compact and/or educational distributions, but let me just add that the democratization of Linux provides a gargantuan development opportunity to make something that beats MacOS at its own game without starting from such a shitty premise and all of its resulting compromises – all without detracting from any other technically-minded distributions whatsoever. That is the magic of The Distro, remember?! If you’ve existed in a similar state of confusion to that of my entire adult life regarding the appeal of Apple products – despite having once been an extensive OSX user, myself – you’re very welcome for the insight. Instead of paying me for the profound self-improvement I’ve just provided, try prioritizing this newfound knowledge the next time you talk to your MacBook Pro-loving friend about their workflow. If you’re like myself, you’ll find their arguments have magically transformed from the bewildering bullshit they’ve always seemed to be into challenges for future competing operating systems: surpass Apple’s old bitch and excel where it cannot, because MacOS and even its much-younger iOS counterpart – as well as the billions of people who depend on them – desperately need real competition in order to maintain their viability, much less become what products of the world’s wealthiest company should be.

Yes, the manner in which these operating systems are perceived really is an important discussion prompted by a product as insignificant as the Surface Laptop 2 because as you read, the industry is bracing for another paradigm shift in computing, which many believe (preposterously, I might add) could be as significant and disruptive as 2007’s introduction of the iPhone. This machine of Microsoft’s and its “new” MacBook Air counterpart could potentially be the last designs to carry us to a computing future where the tried-and-true clamshell design is forgone entirely by the mainstream, but Apple’s release of this year’s new iPad Pro prompted even the most Cupertino-loving tech commentators to respond with genuine discord along with a few long-overdue shouts of “are you crazy?!” I’m very proud of The Verge’s Nilay Patel, in particular, for so eloquently deconstructing its usability for all but the very wealthy. “It is impossible to look at a device this powerful and expensive and not expect it to replace a laptop for day-to-day work,” he reminds us in the introduction to his full review of the updated product, along with a beautifully transient sentiment which I think we all needed to hear again: “I don’t think people should adapt to their computers. Computers should adapt to people.” Even something as consumerist and bourgeois as the introduction of another pricepoint-burgeoning Apple hardware flagship can turn a simple tablet review into a much-needed manifesto for a user-centric way forward for the industry, which is itself worthy of celebratory encouragement.

I’ve favored The Verge and its cast long past the point of excess throughout the span of my work about technology, but Nilay’s review and its accompanying episode of The Vergecast are truly special and profound gems of content that shouldn’t be passed up. Apparently – as the Editor-in-Chief immediately insists as the episode begins – his “ongoing theory” that “the more important you are, the less actually important work you do, and the more likely you are to be an iPad user” roused anger from “that whole class of [billionaires,]” but the experiences behind his argument actually suggest that Apple’s own favorite child of late – in which it has begun investing, thereby implicitly sponsoring it over its much older brother as the ultimate heir of the majority’s future computing – has unequivocally failed to do its part in growing the iPad Pro into the “laptop replacement” we’d all heard so much about. Of iOS 12’s performance as an operating system beneath true work-related tasks, he despairs, “you have to spend all of your time figuring out how to do stuff instead of doing stuff,” which I couldn’t help but hear as echoes of my own late Linux lamentations. As thankful as I am to have finally achieved enlightenment of the Planet Apple, I’m afraid I was pitifully late: its very natural laws underwent their most brutal tests of the 21st century this past year. Now that I’ve finally come to adore the elegant effectiveness of a new generation of iOS apps like Bear, I’m faced with yet another of the episode’s statements of weight: “I think it’s time to stop pretending that the future of computing looks like Apple’s restrictions.” On the opposing end of the line, the world’s first trillion-dollar company’s other major product release of 2018 managed to disappoint even the most fanatical fans of its original operating system’s best-selling platform with an insultingly mediocre update to the MacBook Air marque upon which it once so fondly doted.

My best friend’s parents bought her the original Surface tablet when she enrolled in art school, and her frustration with its lackluster keyboard (among others) leads MacOS alternative-seeking users like us to wish Microsoft had started with a traditional design like the Surface Laptop first. Perhaps Apple and Microsoft’s emphasis on their tablets is nothing but a bit premature for the most current crop of users, and the rest of my nieces and nephews will expand upon an entirely different methodology of usership when they receive their freshman computer. Those elders of us who still take the Clamshell form seriously and love printing our documents are apparently facing a future industry saturated with products we can’t believe in, but it’s up to you to decide if this issue is worth expending your energy in advocacy for either camp. With my 120+ word per minute proficiency with physical keyboards, I for one have been completely bewildered by the iPad as anything but an indulgence for reading text on the web, and I’m pleased as punch with my Surface Laptop 2. Even if it proves to be the last new computer I’ll ever own to come as optimized for my use, I’m just grateful and astonished it happens to be the best yet.

#hardware #microsoft #future

by David Blue

Siri Shortcuts

Apple's latest mobile OS update might've seemed mundane, but Siri Shortcuts gives users vastly more power than Apple customers have ever before experienced.

Back in 2016, Pokémon Go, overclocked Apple Watches, pissing wearables, and What You See is What You Get blogging services all claimed unprecedented casualties among consumers according to Futureland's iOS 10 episode, which we did our absolute best to dramatize in order to survive what was expected to be the dullest event on record. We'd only that day been first made aware of Boomerang photos and the mysterious nature of “Live Blogging” as an occupation. AirPods were introduced and subsequently shit on, and the comparatively archaic 3.5mm analog audio jack was confidently parted with, finally. At least I got over “forgetting” about Live Photos because it's rapidly becoming difficult to keep stuff on the phone now. I am coming sincerely close to believing none of this is real, anyway. Today, though, it’s a damned straight ballgame, isn’t it? Months have passed since Apple pushed out its major mobile OS release of the year to more little rectangular computers than any one person could speedcount in a lifetime and YouTube is already recommending me dozens of videos about the next one. At this point, you and I are already aware of the iOS development community, which has already been using Internet Operating System 12 on their personal devices for more than half a year by the time your irises are landing here. Hopefully, all but two or three stranded, dying explorers in the Arctic have updated their iPhones and iPads by now, and why wouldn’t they?

Our expectations from this ritual are completely alien compared to those we’d need to anticipate from the event 5 or 6 releases ago, when one’s phone had to be sent away (in a sense) to latch itself tight to the stability of a desktop-class product in order to undergo a lengthy, destined metamorphosis. Sometimes, backups via 30-pin to USB-2.0 cables took hours, after which the custodian may or may not find their companion’s replication had completed successfully. If it had, one had to be sure to close any applications apart from iTunes to provide a working environment of utter silence – restarting after finishing the download was my own preferred method – before entrusting the despicably unreliable software to whittle away in a sometimes frantically rebooting, feverish procedure with near life-threatening stigma: it wasn’t uncommon for an update to inexplicably fail, “bricking” the subject iPhone and requiring that one take two whole steps in the wrong direction and restore it from the entire backup they’d just created (hopefully) in order to… make another, precisely-identical attempt, for lack of variables or alternatives to the process. However, if the user planned sufficiently and made a point to begin the whole charade immediately upon arriving home for the evening, these potential frustrations could be compensated for, and odds would favor counting on their smartphone to emerge safe and sound from the procedure just before bed, when even those holding the second-newest product in the lineage would have just enough screen time to notice that text entry, web page loading, and window management had noticeably slowed before sighing and tossing their device toward the darkness.

[Embedded tweet: Cher]

These days, one would need to try very hard to be inconvenienced by iOS updates. My iPhone 8 Plus is two or three times more powerful than my laptop at the moment, and my new friends’ WiFi connection is better than what the State government uses internally, back home. I haven’t needed to physically back it up more than once or twice since I bought it — iCloud stores the lot for $4.99 a month anyway. I blinked once watching Riki-Oh with high school friends some time ago and all of the sudden, a 1.6GB download isn’t really a big deal. Siddown and watch your Instagram stories for twenty minutes, and hey! You’re ready to update! Somehow, I have abruptly found myself in a reality in which I am the obvious bottleneck and my 100 words per minute on a smartphone keyboard, even, is no longer fast enough: my fucking phone is now waiting on me when it updates. The keyholder is the whole goddamned holdup.

So, what possible purpose could there be in pounding out this “Review” of a free software update that’s in no way optional (waiting a month is no longer a rational minimization of risk — it’s just dumb,) not any more difficult to attain than the bills currently waiting in your mailbox, nor allowed by the nature of mobile operating systems to compete with any cross-platform alternatives? For myself, it’s proved a gratifying tradition of sorts and a good use of my apparently-abundant time if only for the record's sake (hello, future web archivists, neohuman and otherwise!) but this release – assuming I haven’t overlooked something – is the most globe-shucking of all because of one single featureset: Siri Shortcuts. However, the vast majority of the intra-Apple press' coverage of this release has come across nearly as unconcerned with them as I was originally. Take Macworld's iOS 12 Review, for instance: it was the first result in my Google search for “iOS 12 review,” yet Siri Shortcuts are only mentioned in the bottom quarter of its first page. When I recorded the “iOS 12 Review” episode of my “podcast,” I spoke as if I was somehow the only person on the planet who comprehends the profound implications of this software addition – which was, of course, more of an absorbent acquisition – but I have since discovered one gem, at least, which has continued the conversation in a most superb manner. It's a technology podcast called Supercomputer, and it's hosted by Alex Cox and Matthew Cassinelli – the latter of whom developed a significant amount of the iOS app Workflow (and wrote most or all of its documentation, apparently,) which Apple assimilated as Siri Shortcuts. Both are extremely knowledgeable and competent commentators on – as far as I can hear, at least – virtually the entire iOS *lifestyle*. (For those on the outside who've never stepped in: laugh if you must, but yes it is a lifestyle, still, and its new thought leader isn't exactly coming up short these days.) iOS is technically software, yes, but it leaves an intractable itch for some greater, transcendent term.

[Embedded media: WordPress Gutenberg / Siri Shortcuts in my pajamas]

In just forty minutes, without any prior knowledge about this feature, I was able to create a Shortcut which sends any given handset's IP address and precise GPS location (among other mundane metrics) in a text message to my phone number. I could share this shortcut among my other submissions to Sharecuts or ShortcutsGallery.com, where any iOS user could download and subsequently send this information back to my phone. (Don't believe me? Have a go at it yourself and I'll send back a screenshot if you'd like.) I accomplished this without any particular skills or education in software development or cybersecurity – without any real malice, even – I was just playing around. As far as my recollection goes, Apple has never included such a powerful, potentially-dangerous piece of software in a standard software update before. It's both absolutely brilliant and sortof a ripoff to be so entrusted for the first time. In many ways – like my Disable Bluetooth & WiFi shortcut – Siri Shortcuts represent an awfully half-assed solution to some of the most basic, longtime incongruencies within iOS. Sure, it's great that I can just make myself a shortcut to completely disable my phone's WiFi and Bluetooth activity with one press or Siri command (combining “type to Siri” with Siri Shortcuts basically enables a form of Command Line functionality in iOS,) but frankly, one should've expected the world's largest company to do it themselves in perhaps the second or third version of this operating system instead of saying okay, here are the tools – you do it! in its twelfth.
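
For readers outside iOS, here is a rough Python equivalent of the kind of data that shortcut gathers. This is a hypothetical sketch for illustration only: real Siri Shortcuts are assembled visually in the Shortcuts app rather than written in code, and desktop Python has no portable GPS API, so only the IP-address half is shown.

```python
# Hypothetical sketch: roughly the data my shortcut collects, expressed in
# Python for readers outside iOS. Siri Shortcuts are assembled visually in
# the Shortcuts app, not programmed; this only illustrates the concept.
import socket

def local_ip() -> str:
    """Discover this device's local IP address via a throwaway UDP socket."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        # connect() on a UDP socket sends no packets; it merely selects a
        # route, revealing which local address the OS would use to reach out.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]

if __name__ == "__main__":
    # In the shortcut proper, this value is dropped into a prefilled text
    # message addressed to my phone number (the GPS coordinates come from
    # the Shortcuts app's own location action, which has no analog here).
    print(f"This device's IP address: {local_ip()}")
```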

I've found it inevitable when speaking on iOS to avoid discussing the other literature available on the subject at any given time. The depth to which technology media has assimilated the habits and mannerisms of a single American company is absolutely mind-boggling, regardless of its history, its market share, or even its recent trillion-dollar valuation. Dozens of media companies – CultofMac, MacRumors, Macworld, 9to5 Mac, AppleInsider, iMore, and... more – exist solely to cover one single independent company: Apple, Incorporated. One wonders how the sum total of the individuals involved with these organizations compares with the total number of employees working for the company their careers are (for the moment, at least,) entirely centered around. (Further interesting questions: are there any comparable situations anywhere else in Western capitalism, and if not – doesn't this sort of attention constitute some kind of Monopoly, even if it was not necessarily an anti-competitive one?) For “reasonable people,” the image one conjures up of The All-The-Time Apple Beat does not lend to envy, but let's choose to limit ourselves to only the most casual forms of speculation. I do not wish to mock them, for I, too, remember the sensation of The Apple Drug from an unfortunate time in my childhood development when I was willing to wear a cheap sweatshirt branded with a stupid Mac vs. Windows Users joke unironically to a real live public Junior High school. There are few more embarrassing admissions, except perhaps admitting that a part of me genuinely yearns to return to this level of enthusiasm, as misplaced and cringey as it was. It's the addiction to the mystic; it's aspirational in its democratization. Billionaires are running the same operating system and much of the same software as I am every day – even the most followed person on any given platform is still accessing it through the same interface I might be. These are incredible truths, but they also reflect a dangerous lack of competition in a product category that has become more essential to day-to-day human life than any other in just three or four blinks of an eye.

[Embedded tweet: indie alternatives]

Fuck David Blue, though. Who are the real, hard-hitting minds who've kept this industry and this company in check? Well, it's funny you should ask that, because the people's quirky New York Times tech critic of late – the esteemed Farhad Manjoo – has just concluded a five-year-long technology column with some essential (if perhaps a bit unoriginal) advice: “just slow down.” If you're still following along, you shall surely enjoy clicking some of his links, and I would certainly encourage that you do until you're out of free articles, at least. When Manjoo speaks, Apple listens: his January decree for Apple to bend with the industry wind and build “a Less Addictive iPhone” is convincingly prophetic considering Screen Time – probably the most mulled-over iOS 12 addition. As someone who was diagnosed with Attention Deficit Disorder (however much or little that may mean to you) just one or two years after I began using my/the first iPhone, I've developed a history of what he might call Addiction to iPhones in variable oscillation touching both extremes. I carried my first-generation iPhone for almost 5 years – as you can imagine, it was far from a 100%-functional device toward the end of that bell curve. In contrast, I've also stood in line at dawn for two iPhone launches, jailbroken, listened to podcasts only about apps (far before they were good,) and been compelled to chronicle and reflect upon all of it for as long as I can remember.

There's no denying that the iPhone has had a profound effect on my life mostly thanks to my own choices, which is why it's worth telling the vast majority of you that features like Screen Time will never help you achieve whatever vague conception of reduced usage you may have. If you haven't yet quantified the figures you'll find within it in mental estimates, you aren't really concerned at all, and if you have, Screen Time will only confirm them. Using reminder notifications to optimize your appflow makes no attempt at all to actually escape the mentality of the behavior you seek to lessen from yourself. Another app is still another app; a notification reminding you to stop using an app does nothing but add still more stimuli. If you want to stop using the phone so much, *stop* using the fucking phone. If you are truly concerned about how your handset companion has changed your life, turn it off for a week/month/quarter – however long you possibly can. By that, I mean no more or less than what you can manage without getting fired/dumped/expelled/etc. If you have truly reached this point, anything less is probably worth it. There is simply no other way to get a clear picture of how it's changed you.

Google, Facebook, and the rest of the industry are well aware of this, but know they can't actually advocate against the fundamental mechanism that drives their businesses, so they express concern by doing what they know: building more software. Apple is in a slightly different situation: they still need you to buy their phones – and even to look at them – but not past the point of hurting yourself emotionally, mentally, or physically because those injuries tend to hurt one economically. Screen Time's purpose is to keep us thriving and buying, but the only effective solve can be communicated solely in garbage cinema language: you must find it within yourself. I am actually the worst person from which to model your life, except perhaps for my iPhone use: unless there's little else worthy of my attention, my phone is not out. Even if checking my emails, Mastodon, Twitter, etc are my default tasks, there are infinitely many besides that come first. Every once in a while, it's okay to finish an important message while walking down the street or waiting at a stoplight if things are urgent, but I can guarantee you that my attention is better consolidated on traveling in 95% of cases – moving with purpose and then focusing on my composition after I've arrived is almost always more efficient. I realize that I'm cowboying it here and sound like your Dad, but I'm better with iOS than he is, yet I've never publicly run into anything while looking down at my smartphone in 10 years of hardcore use. Find somebody whose company makes you forget about all of this for hours at a time and treasure them. Also: stop playing games on your phone. What the hell are you doing? Read a blog! Explore the wonders of the open web! Your peers, your battery, and your elderly future self will thank you for it. (One exception is playing word/trivia games with your partner. That's very cute and good for you.)

I was elated to see that even Apple supports my age-old cause for Twitter Lists. Also, the new function in Apple Music allowing the user to search by lyrics appears to work very well...

To get back to specifics, the new Photos application is now basically what it should have been all along, 3D Touch has been virtually eclipsed for those strange bastards among you who never liked it, and the release's most democratically-redeemable feature is optimization, which even on my iPhone 8 Plus was blatantly noticeable and very welcome. However, probably the best insight to come out of my long, rambly End User review was the revelation that basically any other human activity is a better use of time than applauding Apple for learning to hold new features off until they've been thoroughly tested and focusing instead on smoothing existing software. In fact, I'd argue there is absolutely no reason for someone like me to say anything even remotely positive about the world's wealthiest company ever again, though that doesn't apply to The Verge or Chaim Gartenberg, whose review – for the record – was much more useful, to 9999 times more people, than anything I'll ever write. However, isn't it sortof unreasonable to expect anything but absolute perfection from Apple at this trillion-dollar juncture? A handful of varying interpretations of absolute perfection per product category, even.

With gorgeous, iCloud-enabled premium apps like [Bear](https://twitter.com/NeoYokel/status/1063486573197561857) in the picture, integrating wholly into the Apple environment has maintained its relative rank above the alternatives thanks to its specific minimal-esque utilitarian niceness, which appeals so strongly to those people among both consumer and professional buyers. Readers from within this culture recognized a short time ago that iOS is in the process of replacing MacOS as the star component of this environment across the board, though there's at least a moderate journey ahead before it truly reaches this achievement for the median user. For myself, iOS 12 improved the experience of using my 8 Plus and certainly gave me something intriguing to play with in Siri Shortcuts. For the rest of the world's billions of daily iOS users, I say be as insatiable as possible – always expect more.

#ios #iphone #software #photography #future

by David Blue

Tump

Ten percent of the United States' adult population cannot functionally read or write (conservatively) despite the exponential increase of required reading in the average American's day-to-day life thus far in the 21st century. For written American media, especially, one would assume that a financial and social incentive for maximum literacy in the populace should present a straightforward justification for intense widespread coverage of this particular disparity, yet most related coverage in mainstream national magazines and newspapers is alarmingly sparse and often requires a less-than-socially-conscious context (e.g. a for-profit startup) to actually appear in news feeds. From the most wholesome assumption of the industry's general values — that it holds “newsworthiness” above all — we must assume that it does not generally consider American illiteracy “interesting enough to the general public to warrant reporting” as we examine the intermittent discourse surrounding the issue that does achieve publication.

In late October, the American business and technology magazine Fast Company covered the recent successes of the “for-profit social enterprise” Cell-Ed, noting that “a huge portion of the American labor force is illiterate,” which it described as “a hidden epidemic.” The article's author, Rick Wartzman, mentions foremost that Cell-Ed's userbase is largely “foreign-born” and expected to eclipse one million in number by the end of 2019. Demographically, the magazine's readership is predominantly middle to upper-class, the social groups least affected by a significant margin, given illiteracy's strong correlative relationship with poverty. These factors combine to limit any real social consequences from such an article.

In direct contrast with the professional, market-minded perspective of the modern business magazine, even niche independent publications from the opposite end of the media spectrum often trivialize, belittle, or generally mishandle the issue. In a 500-word “Editorial” for the Baptist Standard — a small evangelical news website describing itself as “Baptist voices speaking to the challenges of today's world” — editor Eric Black points to a global increase in “illiterate people,” as he so comfortably brands them. Such language is inevitably counter-productive and potentially insensitive: to the eyes and ears of activists, educators, and the general public, such a term unnecessarily lends toward a restricted perspective of those people who have been left behind by the institution of read and written language in one manner or another. It portrays them as a great vague collection of lingual lepers bearing their own distinct, inexorable, wordless ethnicity which inevitably bars them from the freedoms allowed by the Editor's learned capacity, including the ability to actually read his words of affliction. Simply put, he has dangerously oversimplified the issue.

To once again assume the best and infer that Black had a specific purpose in publishing his ill-supported opinion beyond maintaining the continuity of his weekly Editorials, it appears to be the promotion of a local Texan literacy “ministry” called Literacy Connexus, though no further specifics about the project are provided beyond “helping churches develop literacy programs for their communities, provide training and resources to overcome illiteracy,” which is virtually identical to the introductory copy on the organization's homepage.

So far, we've examined coverage only in special interest media, but what about legacy news organizations with the largest readerships in the United States? Despite oblivious use of the same ledes, a newspaper like The Washington Post can wield vast influence over the broadest possible readership and the public editorial trust. In November 2016, veteran reporter Valerie Strauss published “Hiding in plain sight: The adult literacy crisis” for Answer Sheet — her weekly newsletter designed to function as “a school survival guide for parents (and everyone else), from education policy to psychology” — which represents the most substantial discussion of American illiteracy in topical, widely-visible media (i.e. presence in a succinct search engine query.) She briefly introduces the issue with a bulleted list of illiteracy's consequences on modern society and the individual cited from a Canadian literacy foundation before turning the stage over to Lecester Johnson, CEO of the Academy of Hope Adult Public Charter School in Washington D.C.

Johnson presents a passionate and well-informed exploration of the state of the literacy battle from the perspective of a full-time, locally on-the-ground advocate. Her op-ed's introduction includes the most essential observations and statistics throughout, noting “the children of parents with low literacy skills are more likely to live in poverty as adults and are five times more likely to drop out of school,” before setting upon a detailed examination of current and relevant organizations working toward solutions. Of course, it's largely centered upon her own organization, which she claims has “helped more than 6000 adults rebuild their education and job opportunities since 1985.”

It's significant that an institution as deeply embedded across the American political spectrum as The Washington Post addresses the issue of American illiteracy, and both Johnson and Strauss are certainly qualified voices for the undertaking, but when we examine this particular article, it's important we consider the context of the Answer Sheet newsletter and its intended audience. Though it's no challenge to pitch the importance of reading and writing to parents and professional educators, the most alarming and destructive issue at hand is the educational disparity among their adult peers. “There's a literacy problem in the capitol, but I'm not talking about young people who can't read. Many adults — perhaps even parents sitting next to you at back to school night — don't possess academic skills,” notes Johnson with her very first paragraph. However, considering the nature of parenthood, the audience primarily consuming these words is undoubtedly preoccupied with juvenile issues, specifically, and we can assume their capacity to empathize with their fellow working adults who could benefit from literacy education is actually lessened, relative to that of childless readers of the same age, as a result. “Despite the magnitude of the adult literacy crisis, most of those needing to make up lost ground are pushed toward traditional classroom settings—even though many of these people can't possibly follow through because of cost or work schedules or other obstacles,” she attests.

Perhaps more than any other American city, Detroit has been struggling with a serious illiteracy problem. According to a profile of the Beyond Basics program (which was adapted from an embedded video broadcast) on their local ABC affiliate's website, forty-seven percent of adult Detroiters cannot read, but even companies like General Motors — which donated $250,000 to the Beyond Basics program in mid-October — are getting involved. The article quotes Elijah Craft, a young man who was “reading at a first-grade level as a senior at Detroit's Central High School.” “Craft would rarely venture from home for fear he would get lost because he could not read street signs,” reports WXYZ anchor Carolyn Clifford. She frames the narrative around a reference to the 2009 film The Blind Side starring Sandra Bullock: “here, you might call this story 'The Detroit Side.'” For local television news, this reference to popular culture likely strengthened the story's power to ensnare viewers' emotional attention when it was aired, and even in this written accompaniment, it proves an effective — if a bit crude — analogy. The broadcast of Mr. Craft's interview also depicts his own deep emotional investment in reading when he begins to shed tears, which is not entirely communicated in the written article.

When the American news media discusses American illiteracy, it's almost always in secondary or tertiary form: either by way of a short post for a weekly education newsletter, an ultra-low-distribution niche editorial column, or a personality profile of a local activist. Perhaps the fundamental obstacle in the face of increasing the discourse surrounding this issue is that its resolutions will require — perhaps more than any other social issue in this country — advocacy by those who can read on behalf of those who cannot because of how sensitive and isolated many of them feel. When voices of advocates like Lecester Johnson are uplifted by major organizations like The Washington Post, the sociological weight of the illiteracy issue can be very powerful. In quoting former United Nations chief Kofi Annan, she sums up for its extensive audience what the facts should ultimately mean to them: 32 million of Eric Black's so-called “illiterate people” in the United States of America have been and continue to be deprived of their “human right” to functional literacy.

#literacy #media #class #future

by David Blue

How Mono Playback Looks

As dual speakers become the norm in smartphone design, let us briefly examine and explain why one should always expect to hear their music in stereo.

Though I have many audiophilical sentiments and preferences, I cannot — by conscience — fully claim the title because I’ve never been able to justify the allotment of funds necessary for the obligatory equipment. (And my digital compressor usage in the production of Drycast and Futureland has been manifestly vulgar.) That said, audio engineering is one of the few topics which I can actually speak on with almost-academic authority, and my pretentiousness-capacitated preoccupation with quality-of-life compels me to bring up a ludicrously-rational standard that most of us have continued to undershoot for far too long.

Two weeks ago, the abundantly-rumored omission of the 3.5mm audio port in the iPhone 7 was finally settled. We played our own part in feeding the “controversy,” yeah, but I believe Apple was actually quite tardy in labeling smartphone-bound analog audio as archaic, though I’m not going to waste words in that discussion — it is definitely oversaturated, at this point — because I think mono audio is an even more prevalent topic.

Honestly, out of all the missing features we’ve lamented over in the past decade, stereo speakers should’ve been the most aggravating. The gigantic difference, of course, is that the industry (and — by muddled extension — the consumers) has been all but silent in that regard. I write you, now, because we should all be colossally disappointed with ourselves.

Two channels. Left and right. Read: Mono vs. Stereo. But why should you care? What if Google — somehow — failed to provide you with a significant difference?

One channel of sound is — in terms of locale — rigidly static in your perception. Doubling the data creates a spectrum, adding dimensionality, which is infinite, ya know. Playing back audio in stereo, through two or more diaphragms (the fundamental hardware unit of sound reproduction,) now enables the exhibition of audio pictures.
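
If a concrete demonstration would help, here is a minimal sketch (Python standard library only, hypothetical file names) that renders the same test tone twice: once as a static mono channel, and once panned across the left-right spectrum. Play both files through headphones and the notion of an “audio picture” stops being abstract.

```python
# A minimal sketch of the mono/stereo difference described above: the same
# test tone, once as a single static channel, once swept from left to right.
# Standard library only; "mono.wav" and "stereo.wav" are hypothetical names.
import math
import struct
import wave

RATE = 44100        # samples per second
SECONDS = 4
FREQ = 440.0        # A4 test tone
TOTAL = RATE * SECONDS

def tone(n: int) -> float:
    """Amplitude of the test tone at sample n, in [-1.0, 1.0]."""
    return math.sin(2 * math.pi * FREQ * n / RATE)

# Mono: one channel, so the sound's apparent position can never move.
with wave.open("mono.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)       # 16-bit samples
    f.setframerate(RATE)
    f.writeframes(b"".join(
        struct.pack("<h", int(tone(n) * 32767)) for n in range(TOTAL)))

# Stereo: a second channel opens up the left-right spectrum, so the tone
# can be panned from hard left to hard right over the clip's duration.
with wave.open("stereo.wav", "wb") as f:
    f.setnchannels(2)
    f.setsampwidth(2)
    f.setframerate(RATE)
    frames = bytearray()
    for n in range(TOTAL):
        pan = n / TOTAL     # 0.0 = full left, 1.0 = full right
        frames += struct.pack("<hh",
                              int(tone(n) * (1.0 - pan) * 32767),
                              int(tone(n) * pan * 32767))
    f.writeframes(bytes(frames))
```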

If this is entirely new to you, I want you to do something. Find yourself a pair of headphones or a set of two or more computer speakers. (If these aren’t available, consider your car’s sound system. If it’s at all current and healthy, it’s gonna do the trick.) Bother to discover “L” and “R.” That is, left and right. Orient accordingly. Download this 37-second clip I recorded at BikeFest with my Zoom H2n. (Be advised: the preview is formatted in 5.1 surround, so it’s a very large file for its playback length.) Listen, obviously, and then listen again on your singular smartphone “loudspeaker.”

That’s what I’m talking about.

Why am I being so unabashedly patronizing? Why am I transgressing against our particular assumptions about you — the informed, savvy millennial audience? Because the vast majority of playback I hear in day-to-day life is still from a singular diaphragm; a singular source.

A topical example: I am shown a YouTube video on an iPhone. (A pre-iPhone 7 device, that is.)

Walking downtown, I pass a small band of adolescent skateboarders listening to Cannibal Ox on a Samsung Galaxy Note.

Worst of all: I find myself watching a Netflix film on my iPhone, in bed, not having bothered to wear the $200 pair of QC15s sitting within arm’s length.

Informed or not, consumers are neglecting audio, and dimensionality alone is worth a change. Recent years have allowed the unlimited bandwidth assumption to become habit, so even the vast majority of today’s spoken word programs (like podcasts) — which, in general, stay in the “center” of their mix, making little to no use of the left-right spectrum — are produced in stereo, now. In many cases (including a few of ours,) this doublesizing is often for the sake of introductory themes, alone. If you care to imagine a more data-frugal society, the “waste” is ridiculous. It is not unrealistic to expect such a reality in the near future, but the same holds true for the reverse.

In the present’s abundance, though, the result is simply a decrease — as a whole — in playback’s “full experience,” if the hardware is not changed. Imagine how great it’d be if a digital audio formatting standard could be developed that’d enable a singular file to be mono or stereo, if needed/utilized, to trim off redundancy, sorta like variable bit rate. Get on that, would ya?
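
To make that wish slightly more concrete, here is a rough sketch (assuming 16-bit WAV input, with hypothetical file names) of the redundancy-trimming half of the idea: measure whether a “stereo” file's two channels actually differ, and fold the file down to mono when they don't.

```python
# A rough sketch of the redundancy-trimming idea above: if a "stereo" file's
# two channels are (nearly) identical, the second channel is pure redundancy,
# and the file can be halved by storing it as mono. 16-bit WAV input only.
import struct
import wave

THRESHOLD = 0.001   # mean |L - R|, as a fraction of full scale, we call "mono"

def downmix_if_redundant(src: str, dst: str) -> bool:
    """Write a mono copy of src to dst if its channels match; report success."""
    with wave.open(src, "rb") as f:
        if f.getnchannels() != 2 or f.getsampwidth() != 2:
            return False            # this sketch only handles 16-bit stereo
        framerate = f.getframerate()
        frames = f.readframes(f.getnframes())

    samples = struct.unpack(f"<{len(frames) // 2}h", frames)
    lefts, rights = samples[0::2], samples[1::2]
    if not lefts:
        return False                # empty file; nothing to decide
    diff = sum(abs(l - r) for l, r in zip(lefts, rights)) / len(lefts) / 32768

    if diff > THRESHOLD:
        return False                # genuinely stereo; leave it alone

    with wave.open(dst, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(framerate)
        f.writeframes(struct.pack(
            f"<{len(lefts)}h",
            *((l + r) // 2 for l, r in zip(lefts, rights))))
    return True

if __name__ == "__main__":
    # Hypothetical usage: halves the file if both channels carry one signal.
    print(downmix_if_redundant("podcast_stereo.wav", "podcast_mono.wav"))
```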

So, why haven’t we become more diligent about our sound experiences? 1 billion iPhones in circulation, all with mono playback by default, are suspect culprits, I think. Of course, there are other devices, but none as influential — even the iPod, funny enough — on fundamental digital functions like music playback. And honestly, when is it appropriate or suave to take the extra steps?

I want to show you this song. Let me untangle my headphones… Yes, okay. Put them in. I’m going to sit here in silence for four minutes, looking into your eyes as you listen to the entirety of this track.

It’s never going to be socially acceptable. But what about wireless alternatives? Apple’s new AirPods look absurd, but their computational optimization of the Bluetooth audio standard is revolutionary, in a small way, in propelling the “hearables” paradigm into the mainstream, if only for a moment. If — in a strangely-audiocentric future — we are always wearing multipurpose sound reproduction devices in our ears, perhaps the waste of the Mono Monstrosity will be finally resolved. Until then, I suppose all we can do is give it an extra thought, for our own quality-of-life’s sake.

#software #ios #audio #future