iOS 8's hidden revolution goes way beyond the iPad and iPhone
By Galen Gruman
While the world has obsessed over iPhone 6 details such as its size and its rumored sapphire screen, and over whether Apple will finally reveal the long-rumored iWatch later today, a more important development has been happening to the iPhone and iPad: iOS 8. Announced in June and likely to ship this month, iOS 8 heralds a major shift in how iOS apps work with each other and with the rest of the world.
Although I've been a beta tester of iOS 8 since June, the import of these changes wasn't obvious to me as a user. Seeing the bigger picture took a conversation with Ojas Rege, vice president of strategy at MobileIron, a mobile management company that sees management as a way to help people get the most out of their devices, not as a way to stymie them -- management, in his view, should build guardrails, not prisons. Rege was surprised that so many of the IT execs he's spoken to have paid little attention to the iOS 8 beta, perhaps because it introduces very few new IT management features.
That blithe reaction is a mistake, Rege believes -- and he's right. There are several key changes in iOS 8 that Apple has announced but provided little detail on, even to developers in the iOS 8 beta program. From what Apple has revealed publicly, they share a common notion: making apps more collaborative with each other and breaking iOS free of its highly restrictive containers. Those containers have made iOS incredibly secure, but they've also meant that the interapplication cooperation we take for granted on PCs and Macs couldn't happen in iOS.
Now much of it can, and not just within iOS but with other devices, including the oncoming slew of health and fitness devices running some form of iOS -- all those wearables people keep expecting to show up -- as well as the growing number of Bluetooth-enabled devices and iBeacons-connected systems.
Here are the revolutionary, related technologies Apple has announced but that have been largely ignored since -- and shouldn't be:
Extensions
These are sort of like browser plug-ins that run as systemwide resources for apps to use. So far, they've garnered attention for fairly mundane uses like alternative keyboards and social network integration into iOS's Share sheet. There's nothing wrong with those uses, but when developers finally get to explore these in Xcode and take advantage of each other's extensions, the nature of an iOS app could well change. When apps can use common services developed by more than Apple, watch out!
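To give a sense of what writing one involves, here's a rough Swift sketch of a Share extension's main view controller, built on the Social framework's SLComposeServiceViewController class; the 140-character validation rule and the posting behavior are purely illustrative, not anything Apple prescribes.

    import Social

    // A minimal sketch of a Share extension's principal view controller,
    // modeled on Xcode's Share Extension template. The validation rule and
    // posting behavior here are illustrative.
    class ShareViewController: SLComposeServiceViewController {

        // Called as the user edits the post; returning false disables the Post button.
        override func isContentValid() -> Bool {
            return !contentText.isEmpty && contentText.count <= 140
        }

        // Called when the user taps Post.
        override func didSelectPost() {
            // A real extension would upload contentText and any attachments here,
            // then tell the host app the request is finished.
            extensionContext?.completeRequest(returningItems: [], completionHandler: nil)
        }
    }

The key point is that this code runs inside another app's Share sheet, yet never gets to rummage through that app's data -- the container model stays intact even as the apps cooperate.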
Touch ID
Apple introduced its fingerprint scanner in last year's iPhone 5s as a way to unlock the device and validate iTunes purchases biometrically. One goal was to encourage use of device passcodes, which most iOS users don't enable. Touching a fingerprint reader is much easier, so it should bring lock-screen security to many more iOS devices. In iOS 8, Apple is making Touch ID available to third-party apps to use as a biometric password.
That should be a big deal for enabling mobile payments, as Apple is likely to announce today, but that's not all. Building access cards, medical sensors, and more would benefit hugely from access to Apple's hardware-secured biometric system, as would more granular control over corporate data access. Constantly having to enter a password to stay connected or to open certain files is a pain; tapping a finger on the Home button is not. Security made easy -- without requiring a keyboard -- is a key advantage in a world of devices.
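The hook for third-party developers is the LocalAuthentication framework. A hedged Swift sketch of the pattern, with an illustrative function name and prompt string, looks something like this:

    import LocalAuthentication

    // A minimal sketch of gating an action behind Touch ID. The function name,
    // prompt text, and fallback behavior are illustrative.
    func authorizeSensitiveAction(completion: @escaping (Bool) -> Void) {
        let context = LAContext()
        var error: NSError?

        // Only prompt if the device has Touch ID and an enrolled fingerprint.
        guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
            completion(false)   // Fall back to a password or PIN in a real app.
            return
        }

        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Confirm your identity to open this file") { success, _ in
            DispatchQueue.main.async { completion(success) }
        }
    }

Notably, the fingerprint data never leaves the device's secure hardware; the app only learns whether the check succeeded, which is what makes it safe to open up to third parties.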
Handoff
This cool technology uses Bluetooth and Wi-Fi Direct to let a Mac pick up where you left off on the iPad or iPhone. It's a clear example of what I call liquid computing, in which the focus turns from the device to what you're doing (the workflow, in technical parlance). Right now, Handoff works only on iOS 8 and OS X Yosemite, and only in certain Apple apps on recent hardware (which is probably why enterprises aren't paying it the attention it's due, Rege suspects).
Expect that to change: Handoff will no doubt become able to connect different apps across iOS devices and Macs -- maybe even Windows apps if Apple opens up more of iCloud to Microsoft's OS. Surely it will make its way to those predicted wearables and likely to Internet of things devices. When workflows become liquid across apps and devices, it's a new ballgame for apps and app developers.
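Under the hood, Handoff rides on a small API, NSUserActivity. A hedged Swift sketch, with a made-up activity type and userInfo key, looks roughly like this:

    import UIKit

    // A minimal sketch of advertising a Handoff activity from a document screen.
    // The activity type "com.example.myapp.editing" and the userInfo key are made up.
    class DocumentViewController: UIViewController {
        func startHandoff(documentID: String) {
            let activity = NSUserActivity(activityType: "com.example.myapp.editing")
            activity.title = "Editing a document"
            activity.userInfo = ["documentID": documentID]
            userActivity = activity     // The view controller keeps the activity alive.
            activity.becomeCurrent()    // Advertise it so nearby devices can offer to continue it.
        }
    }

The receiving device picks the activity up in its app delegate's application(_:continue:restorationHandler:) method and reopens the same document there -- the workflow, not the device, is what carries over.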
The Kits: CloudKit, HealthKit, and HomeKit
Apple announced its CloudKit, HealthKit, and HomeKit APIs at its Worldwide Developers Conference in June. All three are critical to growing areas of technology interoperation: cloud services, medical data and sensors, and home automation (door locks, thermostats, furnaces, alarms, even lighting).
Every vendor wants into the Internet of things, which is evolving into three distinct spheres. The Kits are part of Apple's IoT play, but they're also about creating a broad fabric of technologies that interact, using iOS as the nexus. Although they may seem to be about devices, these Kits are really additional components of the liquid computing notion that extensions, Touch ID, and Handoff all represent. No one else has so many pieces across the stack as Apple, and these three are core.
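All three Kits follow the same basic shape: an app declares what data it wants, the user grants or denies access, and the system brokers the exchange. A hedged Swift sketch of that pattern for HealthKit, with an illustrative function name, might look like this:

    import HealthKit

    // A minimal sketch of the HealthKit authorization pattern. The function name
    // is illustrative, and the app target needs the HealthKit capability enabled.
    func requestStepAccess(completion: @escaping (Bool) -> Void) {
        guard HKHealthStore.isHealthDataAvailable(),
              let stepType = HKObjectType.quantityType(forIdentifier: .stepCount) else {
            completion(false)
            return
        }

        let store = HKHealthStore()
        // The user sees a consent sheet listing exactly what the app wants to read.
        store.requestAuthorization(toShare: nil, read: [stepType]) { granted, _ in
            completion(granted)
        }
    }

The consent sheet is the point: the data lives in a system store that apps can only reach with the user's explicit, per-type permission.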
There's also a fundamental, nontechnical dimension: As Ryan Faas reported at CITEworld, Apple's new licensing rules for HealthKit and HomeKit forbid the use of the information they gather for advertising, marketing, and other such privacy-invading purposes. That's necessary for people to trust the apps and hardware that take advantage of the Kits -- and allow them to interoperate.
There's also CarPlay, iBeacons, and Siri
These liquid technologies didn't start with iOS 8 -- Apple plots its course years ahead and tends to execute its strategy in phases. Other Apple technologies already out there were earlier pieces in the liquid puzzle.
One is iBeacons, a protocol for determining your location and surfacing relevant information -- even triggering actions -- on your iPhone based on that location. It's been out for a year, with measured uptake as developers digest what it can really do. Still, Apple's iBeacons protocol is simpler than many other Bluetooth beacon APIs, so it has become all but universal in beacon hardware. Thus, iBeacons has the critical mass to get location-aware services up and running -- then interacting with the Kits, extensions, and so on.
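On the app side, listening for a beacon takes only a few lines of Core Location code. This is a rough Swift sketch; the UUID and region identifier are placeholders, not real beacon values:

    import CoreLocation

    // A minimal sketch of watching for an iBeacon region. The UUID and identifier
    // are placeholders; a real deployment uses the beacon vendor's values.
    class BeaconListener: NSObject, CLLocationManagerDelegate {
        let manager = CLLocationManager()
        let region = CLBeaconRegion(
            proximityUUID: UUID(uuidString: "12345678-90AB-CDEF-1234-567890ABCDEF")!,
            identifier: "example-store")

        func start() {
            manager.delegate = self
            manager.requestAlwaysAuthorization()     // Background monitoring needs Always permission;
                                                     // a real app would wait for the grant before ranging.
            manager.startMonitoring(for: region)     // Wake the app when it enters the region...
            manager.startRangingBeacons(in: region)  // ...and report roughly how close the beacon is.
        }

        func locationManager(_ manager: CLLocationManager,
                             didRangeBeacons beacons: [CLBeacon],
                             in region: CLBeaconRegion) {
            // Proximity is immediate, near, or far -- enough to trigger a coupon, a check-in, or a door unlock.
            if let nearest = beacons.first {
                print("Nearest beacon: major \(nearest.major), proximity \(nearest.proximity.rawValue)")
            }
        }
    }

That low barrier to entry is why beacons have spread faster than most Bluetooth accessory categories: the hard part is deciding what to do when the user walks in range, not talking to the hardware.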
Then there's CarPlay, Apple's now-two-year-old technology to make car infotainment systems hubs for your iPhone and iPad -- and their apps. The uptake has been slow, as carmakers take years to adopt new technology. (Remember how long it took to get Bluetooth and USB ports in car stereos? CarPlay requires much more engineering and safety compliance than either of those.) I view CarPlay as an extension of the HomeKit idea: a digital hub in your car that has special needs, given that you can't crash a house but you can crash a car. Again, the interoperability of Apple's other initiatives makes CarPlay more powerful.
Finally, there's Siri, Apple's voice-based assistant. Google and Microsoft have their own equivalents, but Siri is more fundamental to Apple's strategy than Google Now or Cortana are to theirs. Siri is a key interface not just to iPhones and iPads but -- you can bet -- to the IoT devices and services that use HomeKit and HealthKit. You already see that in CarPlay, whose iOS in the Car predecessor was essentially a Siri interface to your infotainment system. Siri makes it easier for people to interact with more objects, which means more devices interacting with one another in Apple's growing environment of liquid computing.
Whatever hardware Apple announces today, remember that the real advances are in iOS 8, we already know what they are, and developers can take advantage of them in truly revolutionary ways -- if they decide to.