Stripping away the paint, sanding out the imperfections

A beautiful piece of wooden furniture, depending on its importance and functionality, can find itself being handed from one family to another, leaving behind a past and an identity to form new ones.

The new owners will have their own style, taste and place for the object, but in most cases its current identity won't fit in. In many forms of this story the piece of furniture will go through an overhaul, a complete image makeover. The paint is stripped back revealing its multi-layered past, the underlying imperfections are sanded out and the process of giving the object a new identity begins.

What has happened to iOS, with the reveal of version 7, could well follow that story as an analogy. It has been given a new identity but with its wooden structure, its core, remaining the same. It might not fit in where it used to, with its previous family, but it no longer lives there. It has moved on and its identity reflects that of its new family and its place.

All the layers of green felt, Corinthian leather and linen were stripped back. The bumpy imperfections, referring to a somewhat knotted family history, were sanded out. Then its new family, under the guidance of Jony Ive, gave it a thin new undercoat and nothing more.

This is the beginning of a new story, one that will be as revolutionary as the original iPhone OS. Like the original, it will grow and mature over time into something just as refined. But all this takes time and care and can't be done overnight; a beautiful patina takes time.

Whilst I personally don't like some of the aesthetics and design of iOS 7, it's easy to see its potential. It doesn't align with mainstream trends or taste, but its basic principles of form and purpose are pure. For some it will be hard to ignore the urge to criticise its new identity, and that's OK. What's important is to see the purpose and reasoning upon which it has been based.

I can't wait to see where this goes.

Bluetooth love

I'm sitting at my desk at work, halfway through some feature to be deployed later in the day. I have an iPad and an iPhone on the desk with my MacBook Pro in a stand that I work on; Apple logos everywhere. With all these devices connected, syncing data and monitoring my location, none of them understands the context of where it is. They don't know the others are physically nearby. I'll get a mention on Twitter or a calendar event alert and three individual devices chime and grab my attention, all slightly staggered over a second.



It's not something I expect from these devices because it hasn't been offered before. If it "just worked", would I even notice? Why don't these devices understand their proximity and context relative to their user, me? They could determine which device I am using and then only show notifications on that device, cutting the chimes and calls to action from three devices down to one.


Understanding proximity

Right now, none of my devices understands that they are physically next to each other. They all have GPS and are connected to the Internet, so why not, with some iCloud magic, roughly work out that they are physically near each other? However, GPS is not well suited to working inside buildings, and it can't determine proximity at a human scale of a few metres from me.

Once proximity is established between the devices, a "context" can then be defined. This context is which device currently has my attention: what my eyes are looking at. From there, a decision can be made on how notifications are distributed and vocalised by a single device based on that context.


Attention to detail

Which device has my attention? That is the next problem to solve. It can be as simple as: which device did I last touch? But the last-touched device may not always have my attention.

Here's an example: with my MacBook Pro on my desk along with my various mobile devices, it's almost certain that I will be using the MacBook Pro and not a mobile device. Therefore notifications should be sent to the MacBook Pro rather than a mobile device nearby.

The type of device needs to be considered when determining which device is most likely to have my attention when in use and in proximity to others.

But there's a catch. In my day-to-day activities I often take a mobile device with me to a meeting or out to lunch. I haven't yet interacted with the device, but it's in my pocket and walking away from my desk. When a device leaves the proximity of the others, the group should reconsider which device now has the context. If I leave with just my iPhone then the context should switch to my iPhone. If I take a couple of mobile devices with me then the last one touched gets the context.

"Touch" does not necessarily refer to physical touch. If a portable device has accelerometers or other physical sensors, it can trigger a context reconsideration from significant activity. If I were to pick up my iPhone, registering significant force on the accelerometer, it could gain context without me even gesturing on the device itself.
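As a rough sketch of that idea, a "significant activity" check could compare each accelerometer sample's deviation from gravity against some threshold. The 0.3 g threshold below is a made-up illustration, not a value from any Apple API:

```python
import math

# Hypothetical sketch: decide whether an accelerometer sample is
# "significant" enough to trigger a context reconsideration.
# The 0.3 g threshold is an assumption for illustration only.

def is_significant_movement(x, y, z, threshold=0.3):
    """x, y, z are accelerometer readings in g's."""
    # At rest the magnitude is ~1 g (gravity alone), so we measure
    # how far this sample deviates from that baseline.
    magnitude = math.sqrt(x * x + y * y + z * z)
    return abs(magnitude - 1.0) > threshold
```

A device sitting flat on the desk reads roughly (0, 0, 1) and stays quiet; picking it up spikes the magnitude and would trigger the reconsideration.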

Attention = Device type & Last Touch
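To make that rule concrete, here's a minimal sketch of the attention decision. The device classes, priorities and field names are all hypothetical; the point is just the ordering: device type first, last touch as the tiebreak.

```python
# Hypothetical attention heuristic: a laptop in proximity is assumed to
# have my attention over mobile devices; among equals, whichever device
# was touched most recently wins. Priorities are illustrative guesses.

TYPE_PRIORITY = {"laptop": 2, "tablet": 1, "phone": 1}

def device_with_attention(devices):
    """devices: list of dicts with 'name', 'type', 'last_touch'
    (timestamp) and 'in_proximity' (bool). Returns the name of the
    device that should receive notifications, or None."""
    nearby = [d for d in devices if d["in_proximity"]]
    if not nearby:
        return None
    # Rank by type priority first, then by most recent touch.
    best = max(nearby, key=lambda d: (TYPE_PRIORITY.get(d["type"], 0),
                                      d["last_touch"]))
    return best["name"]
```

At the desk the laptop wins even if the phone was touched more recently; walk away with just the mobile devices and the last one touched takes over, matching the scenario above.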


As a developer it's hard not to just blurt out all the technicalities of how this would work, which is what previous revisions of this article degenerated into. I will try to stay as high level as I can and not go too deep. It's important to state that this is not theoretical; it can all be done, and if no one does it then I will.

Over a number of months I have been researching how this could be possible. Initially I based the idea around GPS and later WiFi with Bonjour, but that just added complexity and didn't offer proper proximity down to human scale. It was a hack. But after the iPhone 4S was released there was a new possibility, and it got me excited.

Something that didn't go completely unnoticed at the iPhone 4S launch was its updated connectivity. At the time it was pretty bleeding edge and today it is still quite new, a practice Apple isn't afraid of. What I am talking about is Bluetooth 4 with Low Energy. A select few, myself included, can talk about it for hours, but to the rest it may seem like a pipe dream to match NFC or optical barcodes. Indeed, it has that same feel of "it's cool tech, but why would normal people use it?" Moreover, why would my Mum use it?

Unlike NFC and optical barcodes, you don't actually have to use it to use it. You don't need to wave your phone at a tag or take a photo of a cubist mosaic of black pixels. It doesn't require user action to operate and it doesn't degrade the experience of the device either. Two very good characteristics. But the iPhone 4S needed more than just the new chip to utilise the power of this new connectivity.

Something that has annoyed iOS developers is that the Bluetooth stack on iOS has been private. You could only get access if you were certified under the "Made for iPod" programme, and being part of that is a feat on its own. Then along comes iOS 6 with a public API for interacting with the Bluetooth stack. But it wasn't what some were expecting.


Bluetooth on iOS 6, at last

Well, mostly Bluetooth. As Apple always does, they put a lot of consideration and effort into their APIs, and this was no exception. Bluetooth 4 with Low Energy is what some would call a cut-down version of the normal Bluetooth protocol. It only allows simple, non-streaming data to be transferred from one device to another.

Low Energy Bluetooth mostly takes the form of a simple broadcast/advertisement and listener model. One device advertises a service and its characteristics, and other devices listen for these advertisement broadcasts. All of this is short range and uses as little power as possible on the host device.

One of the things you can build with Low Energy Bluetooth is a proximity service. Examples have shown this acting as anything from simple locking mechanisms to more advanced awareness of nearby objects.
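To give a feel for how such a proximity service might classify nearby devices, here's a sketch based on received signal strength (RSSI). The dBm thresholds are made-up assumptions; real values vary wildly with hardware, antennas and the environment, so any real implementation would need calibration and smoothing.

```python
# Hypothetical sketch: map a Bluetooth RSSI reading (in dBm, where
# values closer to 0 mean a stronger signal) to a coarse proximity
# band. The thresholds are illustrative guesses, not calibrated values.

def proximity_band(rssi_dbm):
    if rssi_dbm >= -50:
        return "immediate"   # roughly within arm's reach
    elif rssi_dbm >= -75:
        return "near"        # same desk or room
    else:
        return "far"         # elsewhere, or signal obstructed
```

Coarse bands like these are enough for the notification problem: a device only needs to know "same desk" versus "left the room", not centimetre accuracy.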


Putting it all together

If you've been following along you will notice that all the ingredients needed, from a technical point of view, are there to make this happen:

  • Short range proximity service: Bluetooth 4
  • Shared communication resource: iCloud
  • Sensors: Touch, Axis

The end result being that I can sit at my desk, working away, and I get iMessage, Calendar and email events all going to one device — the one I am using.

What a waste of fucking time and effort? Possibly, to some. It's not as essential or as obvious as, let's say, copy and paste, but if it were there I wouldn't expect anybody to point to it as a feature. At a deeper level, I think you'd sense a calm and considered solution, one that speaks to how you're going to use your device and not the terrible struggles engineers faced in solving the problem.


Future of Bluetooth in iOS

I love how simple this new Bluetooth functionality is. It's a win for developers, but more importantly it's a win for users. No issues around battery consumption or degraded device usability in any way. These are the kinds of solutions that make it into iPhones and iOS. It's simple and it works.

This post was just one example of how the new Bluetooth functionality in iOS can be used. I am personally working on a few projects to leverage Bluetooth with Low Energy. I can't wait to show it off.

Some of the other cool ideas Bluetooth 4 can offer are things like:

  • Thermostats that announce the current temperature of the room, no pairing needed.
  • A whole range of healthcare devices from wireless heart and possibly ECG monitors, temperature sensors etc.
  • Home automation. Lights switch on when you enter the room, triggered by the presence of your iPhone/smartphone.
  • Any sport sensor imaginable.

We are just beginning to see the start of the accessory revolution and Bluetooth 4 is going to give it one mighty boost. No doubt, most of the ventures into new accessories using Bluetooth 4 will start on Kickstarter.

If you're a developer and are even mildly interested in using the Bluetooth APIs then I recommend checking out the 2012 WWDC sessions on the new Bluetooth API available to iOS developers in the ADC.


So after one thousand three hundred words I think I have adequately expressed my love of Bluetooth 4 with Low Energy. It could all turn out to be yet another pipe dream like NFC and optical barcodes, but I don't believe so; it's got too much going for it.

"Double down on secrecy"

Tim Cook was asking for it when he mentioned his plan to "double down" on secrecy. All the more motivation to go out and get the juicy details on the next iPhone™.

What I find these articles miss is that Apple didn't just release the iPhone last Wednesday/Thursday. The iPods were, at least, somewhat of a surprise. Yes, there were leaks of the case layout that gave away the shape, but ultimately not what they looked like. I guess you don't get much cash for leaking iPod designs these days…

All the focus has been on how "Tim failed to keep the iPhone 5 a secret". Yeah, so fucking what? Why is it so important to point out that Tim didn't do a good enough job? They are going to sell millions and millions of these regardless. Apple will amass more billions of dollars and the world will keep spinning.

There will more than likely be an iPad Mini, a 4th generation iPad, an iPhone 5S and a year after that an iPhone 6. They will all be thinner, faster and more glorious than their predecessors. And yet we all get sucked into the techno-brity rumour mill and all speculate and throw in our 2¢, myself included.

I'm beginning to think that Tim wasn't really referring to secrecy around the next iPhone, rather secrecy around new projects: the next big thing. Beyond the iPhone, iPad, iPod, Apple TV and notebooks, what is next? The "post-tablet era". The iPad and iPhone were in development for years prior to their release, and the public didn't really know whether Apple was developing a phone or a tablet. Just speculation and the odd whisper.

It's time to "zoom out" of this 12-month timeline and start thinking larger. What are Jony and the team working on late at night in the R&D dungeon at Apple?