
dirtystylus : apple   98

Review: 100,000 miles and one week with an iPad Pro | TechCrunch
For the past eighteen months, the iPad Pro has been my only machine away from home, and until recently, I was away from home a lot, traveling domestically and internationally to event locations around the world or our offices in San Francisco, New York and London. Every moment of every day that I wasn’t at my home desk, the iPad Pro was my main portable machine.

I made the switch on a trip to Brazil for our conference and Startup Battlefield competition (which was rad, by the way, a computer vision cattle scale won the top prize) on somewhat of a whim. I thought I’d take this one-week trip to make sure I got a good handle on how the iPad Pro would perform as a work device and then move back to my trusty 13” MacBook Pro.

The trip changed my mind completely about whether I could run TechCrunch wholly from a tablet. It turns out that it was lighter, smoother and more willing than my MacBook at nearly every turn. I never went back.

iPad Pro, 2018, Brazil
The early days were absolutely full of growing pains for both the iPad and myself. Rebuilding workflows by patching together the share sheet, automation tools and the newly introduced Shortcuts was a big part of making it a viable working machine at that point. And the changes that came with iPadOS that boosted Slide Over, Split View and the home screen were welcome in that they made the whole device feel more flexible.

The past year and a half has taught me a lot about what the absolute killer features of the iPad Pro are, while also forcing me to learn about the harsher trade-offs I would have to make for carrying a lighter, faster machine than a laptop.

All of which is to set the context for my past week with the new version of that machine.

For the most part, this new 2020 iPad Pro still looks much the same as the one released in 2018. Aside from the square camera array, it’s a near twin. The good news on that front is that you can tell Apple nailed the industrial design the first time, because it still feels super crisp and futuristic almost two years later. The idealized expression of a computer: light, handheld, powerful and functional.

The 12.9” iPad Pro that I tested contains the new A12Z chip, which performs at a near-identical level to the model I’ve been using. At over 5,015 single-core and over 18,000 multi-core in Geekbench 4, it remains one of the more powerful portable computers you can own, regardless of class. The 1TB model appears to still have 6GB of RAM, though I don’t know whether the lower-capacity models are still stepped down to 4GB.

This version adds an additional GPU core and “enhanced thermal architecture” — presumably better heat distribution under load but that was not especially evident given that the iPad Pro has rarely run hot for me. I’m interested to see what teardowns turn up here. New venting, piping or component distribution perhaps. Or something on-die.

It’s interesting, of course, that this processor is so close in performance (at least at a CPU level) to the A12X Bionic chip. Even at a GPU level Apple says nothing more than that it is faster than the A12X with none of the normal multipliers it typically touts.

The clearest answer for this appears to be that this is a true “refresh” of the iPad Pro. There are new features, which I’ll talk about next, but on the whole this is “the new one” in a way that is rarely but sometimes true of Apple devices. Whatever they’ve learned and are able to execute currently on hardware without a massive overhaul of the design or implementation of hardware is what we see here.

I suppose my one note on this is that the A12X still feels fast as hell and I’ve never wanted for power so, fine? I’ve been arguing against speed bumps at the cost of usability forever, so now is the time I make good on those arguments and don’t really find a reason to complain about something that works so well.


The most evident physical difference on the new iPad Pro is, of course, the large camera array, which contains a 10MP ultra-wide and a 12MP wide camera. These work to spec, but it’s the new lidar scanner that is the most intriguing addition.

It is inevitable that we will eventually experience the world on several layers at once. The physical layer we know will be augmented by additional rings of data like the growth rings of a redwood.

In fact, that future has already come for most of us, whether we realize it or not. Right now, we experience these layers mostly in an asynchronous fashion by requesting their presence. Need a data overlay to tell you where to go? Call up a map with turn-by-turn. Want to know the definition of a word or the weather? Ask a voice assistant.

The next era beyond this one, though, is passive, contextually driven info layers that are presented to us proactively, both visually and audibly.

We’ve been calling this either augmented reality or mixed reality, though I think that neither one of those is ultimately very descriptive of what will eventually come. The augmented human experience has started with the smartphone, but will slowly work its way closer to our cerebellum as we progress down the chain from screens to transparent displays to lenses to ocular implants to brain-stem integration.

If you’re rolling your un-enhanced eyes right now, I don’t blame you. But that doesn’t mean I’m not right. Bookmark this and let’s discuss in 2030.

In the near term, though, the advancement of AR technology is being driven primarily by smartphone experiences. And those are being advanced most quickly by Google and Apple with the frameworks they are offering to developers to integrate AR into their apps and the hardware that they’re willing to fit onboard their devices.

One of the biggest hurdles to making AR experiences feel realistic has been occlusion. This is the effect that allows one object to intersect with another realistically — to obscure or hide it in a way that tells our brain that “this is behind that.” Occlusion enables a bunch of interesting things like shared experiences, interaction of physical and digital worlds and just general believability.
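The occlusion test described above amounts to a per-pixel depth comparison: a virtual pixel is drawn only where it is closer to the camera than the real scene at that pixel. Here is a minimal sketch of that idea; the function name and the flat-list representation are illustrative, not how ARKit actually composites frames.

```python
# Hypothetical sketch of depth-based occlusion: for each pixel, draw the
# virtual object only if it is closer to the camera than the real scene.

def composite(scene_depth, virtual_depth, virtual_color, background):
    """Per-pixel occlusion test over flat lists of equal length.

    Depths are distances from the camera in meters; smaller is closer.
    A virtual_depth of None means "no virtual content at this pixel".
    """
    out = []
    for s, v, color, bg in zip(scene_depth, virtual_depth,
                               virtual_color, background):
        if v is not None and v < s:
            out.append(color)  # virtual object in front: draw it
        else:
            out.append(bg)     # real world in front (or empty): keep camera pixel
    return out

# A virtual cube at 2 m, partially hidden behind a real wall at 1.5 m:
scene   = [3.0, 1.5, 3.0]
virtual = [2.0, 2.0, None]
print(composite(scene, virtual, ["cube"] * 3, ["cam"] * 3))
# -> ['cube', 'cam', 'cam']
```

The hard part in camera-only AR is producing an accurate `scene_depth` in the first place, which is exactly what the lidar scanner discussed next provides directly.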

This is where the iPad Pro’s lidar scanner comes in. With lidar, two major steps forward are possible for AR applications.

Initialization time is nearly instantaneous. Because lidar works at the speed of light, reading pulses of light it sends out and measuring their “flight” times to determine the shape of objects or environments, it is very fast. That typical “fire it up, wave it around and pray” AR awkwardness should theoretically be eliminated with lidar.
Occlusion becomes automatic. It no longer requires calculations done using the camera, small hand movements and computer vision to “guess” at the shape of objects and their relationship to one another. Developers essentially get all of this for “free” computationally, and at blazing speed.
There’s a reason lidar is used in many autonomous free roaming vehicle systems and semi-autonomous driving systems. It’s fast, relatively reliable and a powerful mapping tool.
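The time-of-flight measurement described above reduces to simple arithmetic: the pulse travels out and back, so the distance is half the round-trip time multiplied by the speed of light. A back-of-the-envelope sketch (the function name and example times are illustrative, not from the article):

```python
# Time-of-flight distance: a lidar pulse travels to the surface and back,
# so the one-way distance is (speed of light * round-trip time) / 2.

C = 299_792_458  # speed of light in m/s

def distance_m(round_trip_s):
    """Distance to a surface from a lidar pulse's round-trip time."""
    return C * round_trip_s / 2

# A surface ~1.5 m away returns the pulse in about 10 nanoseconds:
t = 2 * 1.5 / C
print(f"{distance_m(t):.3f} m, round trip {t * 1e9:.1f} ns")
# -> 1.500 m, round trip 10.0 ns
```

Those nanosecond round trips are why initialization is effectively instantaneous: the sensor gets real distances on the first frame instead of inferring them from parallax over many frames.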

ARKit 3.5 now supplies the ability to create a full topological 3D mesh of an environment with plane and surface detection. It also comes with greater precision than was possible with a simple camera-first approach.

Unfortunately, I was unable to test this system; applications that take advantage of it are not yet available, though Apple says many are on their way from games like Hot Lava to home furnishing apps like Ikea. I’m interested to see how effective this addition is to iPad as it is highly likely that it will also come to the iPhone this year or next at the latest.

One thing I am surprised but not totally shocked by is that the iPad Pro rear-facing camera does not do Portrait photos. Only the front-facing True Depth camera does Portrait mode here.

My guess is that there is a far more accurate Portrait mode coming to iPad Pro that utilizes the lidar array as well as the camera, and it is just not ready yet. There is no reason that Apple should not be able to execute a Portrait style image with an even better understanding of the relationships of subjects to backgrounds.

Lidar is a technology with a ton of promise and a slew of potential applications. Having this much more accurate way to bring the outside world into your device is going to open a lot of doors for Apple and developers over time, but my guess is that we’ll see those doors open over the next couple of years rather than all at once.

One disappointment for me is that the True Depth camera placement remains unchanged. In a sea of fantastic choices that Apple made about the iPad Pro’s design, the placement of the camera in a location most likely to be covered by your hand when it is in landscape mode is a standout poor one.

Over the time I’ve been using iPad Pro as my portable machine I have turned it to portrait mode a small handful of times, and most of those were likely because an app just purely did not support landscape.

This is a device that was born to be landscape, and the camera should reflect that. My one consideration here is that the new “floating” design of the Magic Keyboard that ships in May will raise the camera up and away from your hands and may, in fact, work a hell of a lot better because of it.

Keyboard and trackpad support

At this point, enough people have seen the mouse and trackpad support to have formed some opinions on it. In general, the response has been extremely positive, and I agree with that assessment. There are minor quibbles about how much snap Apple is applying to the cursor as it attaches itself to buttons or actions, but overall the effect is incredibly pleasant and useful.

Re-imagining the cursor as a malleable object rather than a hard-edged arrow or hand icon makes a ton of sense in a touch environment. We’re used to our finger becoming whatever tool we need it to be — a pencil or a scrubber or … [more]
review  ipad  ipadpro  apple 
12 days ago by dirtystylus
The 16-inch MacBook Pro –
The butterfly keyboard was an anomaly — it was a huge departure from everything else we’d ever used, mostly not in good ways.

I absolutely love it — not because it’s the most amazing keyboard in the world, but because it’s completely forgettable in the best possible way. It just feels normal again.
by:marcoarment  keyboard  apple  macbook  design 
november 2019 by dirtystylus
Google Docs works surprisingly well in Safari on iPadOS - The Verge
Answers first: Apple is setting the “user agent” (the thing browsers use to tell websites what they are) to the desktop version of Safari. That means websites won’t default to serving their mobile versions because they see an iOS-based browser. After that, though, Apple is optimizing that site to work with touch (and the iPad’s keyboard). So it was pretty easy to hit all of Google Docs’ menu buttons, and keyboard shortcuts were no problem.
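The mechanism described above can be sketched as a naive server-side user-agent check: a site serves its mobile variant when it spots a mobile token in the header, and iPadOS 13+ Safari sidesteps this by reporting a Mac desktop string. The token list and function below are illustrative, not Google's actual logic, and the UA strings are representative samples, not exact.

```python
# Sketch of naive server-side user-agent sniffing. iPadOS Safari's
# desktop-class UA contains no mobile token, so a check like this
# falls through to the desktop site.

MOBILE_TOKENS = ("iPhone", "iPad", "Android", "Mobile")

def serves_mobile_site(user_agent):
    """True if a naive server-side check would pick the mobile variant."""
    return any(token in user_agent for token in MOBILE_TOKENS)

ipados_safari = ("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15) "
                 "AppleWebKit/605.1.15 (KHTML, like Gecko) "
                 "Version/13.1 Safari/605.1.15")
old_ipad = ("Mozilla/5.0 (iPad; CPU OS 12_4 like Mac OS X) "
            "AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148")

print(serves_mobile_site(ipados_safari))  # False: looks like desktop Safari
print(serves_mobile_site(old_ipad))       # True: the iPad token gives it away
```

The touch and keyboard optimizations Safari layers on afterward are what make the desktop site actually usable once it loads.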
ipad  ipados  ios  apple  browser  safari  google  rwd 
june 2019 by dirtystylus
Big Tech Companies With Government Contracts

Providing law enforcement agencies with facial recognition software to aid racial profiling efforts.
humor  privacy  google  apple  amazon  unitedstates  government 
april 2019 by dirtystylus
Michael Tsai - Blog - Upgrading From an iPhone SE to an XR
I wasn’t sure whether I would like the size of the screen. With the iPhone SE, I could easily reach everything with one hand, and this wasn’t the case even with an iPhone 6s. The iPhone XR is quite a bit larger. In fact, I found that it’s so large that I hold and use it in a different—unapologetically two-handed—way, and the adjustment has been easy. Being able to see so much at once is an incredible advantage. I’ve long known this on the Mac, where I’ve always tried to get as much screen space as possible. But, in a way, it’s more true on the phone because it’s so cramped to begin with. Modern iOS and apps are less information dense than before, and they no longer seem to be optimized for 4-inch displays like when that was the flagship size. I miss those days, but at this point I don’t think even a new small phone would bring them back.
via:daringfireball  apple  ios  iphone  iphonese  ux  cameras 
march 2019 by dirtystylus
Steve Jobs on Prototypes -
“We haven’t built a prototype in engineering for two years.

“What that means is, manufacturing gets involved from day one.

“A lot of times, when you build prototypes, it’s not quite the same technology as you’re going to use in production. And so all the accumulated knowledge you get from building your prototypes, you throw away when you change technology to go into production. And you start over in that accumulation process.
stevejobs  apple  prototyping  by:jonathansnook  webdesign  design  sketchapp 
july 2018 by dirtystylus
Board Votes and Performance Reviews - Bloomberg View
Anyway also Uber violated Apple's app-store rules by "secretly identifying and tagging iPhones even after its app had been deleted," hid this from Apple by geofencing its headquarters, and got yelled at by Tim Cook when it was caught. Uber quickly backed down. The symbolism is obvious. Uber's culture of disruption goes hand in hand with a certain antagonism to outside rules. The rules of cities and states and nations -- about taxi licensing or safety or employee rights or whatever -- are meant to be broken, and broken with pride. Uber is a new way of doing things, a disruption to entrenched political systems, a new polity not constrained by the archaic geography of traditional legal systems. If you're breaking Apple's rules, on the other hand, you have to do it discreetly, and knock if off if you're caught. You can run over Bill de Blasio, but you have to be nice to Tim Cook.
uber  latecapitalism  ethics  apple  timcook  spying  privacy 
april 2017 by dirtystylus
Apple Music Problems: How To Fix Issues With Syncing, Playlists, iCloud Library, Offline Listening Not Working And More
Even if you are playing offline music, when away from a Wi-Fi connection you have to allow Apple Music to use cellular data for it to work properly. This can be done by going to Settings > Cellular > Music and enabling it (the toggle turns green). This will prevent it from pausing during playback as you travel through different network connections.
applemusic  apple  ios  pausebug  offline 
september 2015 by dirtystylus
Maybe it’s because I’m a dad now with income that’s hardly disposable. Maybe it’s because I own several mechanical watches that I never wear because they don’t quite match my personal style and not a single Apple watch is something I’d consider a complement. Maybe because I’ve become increasingly wary and weary of the surge of notifications and the drain on my own cognition and mindfulness and I’m skeptical that another device is going to help solve that.
apple  applewatch  cognitiveload  postpc 
march 2015 by dirtystylus
My Ultimate Developer and Power Users Tool List for Mac OS X (2012 Edition) — carpeaqua by Justin Williams
This is the fourth installment of my must-have list of tools and utilities as a Mac and iOS developer (2009, 2010, 2011). A lot can change in twelve months when you work in the technology space. The biggest change for Apple developers each year is the platform updates. This year saw the transition from iOS 5 to 6 as well as Lion turning into a more powerful Mountain Lion.
apple  osx  workflow  tools  software 
october 2012 by dirtystylus
Interesting new UNIX commands/binaries in OS X Mountain Lion « Ask Different Blog
In addition to those on its well-known list of 200+ new features, OS X Mountain Lion also brings along a handful of new UNIX commands and binaries.
apple  osx  unix 
august 2012 by dirtystylus
Delete OS X Software including prefs files
osx  apple  software  mac 
april 2008 by dirtystylus
Icon Grabber
Quicksilver Plugin to Grab Application Icon Art
apple  osx  quicksilver  icon 
november 2007 by dirtystylus