mapping the iPhone 7

The rumors are swirling (as always) about the iPhone 7. Part of the angst has to do with the fact that iPhone sales only increased by about 1%, and they only made $18B in revenue last quarter (ho hum – not). The pundits are saying that Apple has to bring back the magic in the iPhone 7. While I still don’t quite understand the insatiable race for “growth” (at some point you run out of people) and would instead like to see steady progress on human-digital interfaces, the market is what it is.

OLED displays are being bandied about but probably won’t be in the 7 (maybe in the 7S in 2017). The latest rumor, though, is a dual-lens camera system. This one actually has legs, and to me is very interesting. Apple acquired LinX last year, and it would make sense to bring their tech to the party. A dual-lens system confers advantages for typical photography (better low light, stabilization, etc.), but it also allows for 3D mapping. Think Project Tango – but fully integrated. You could map rooms or objects (scan to 3D print?), or do gesture control. Perhaps more interesting, at least to me, is using it to provide context mapping. What I mean by that is looking at the motion in a scene and figuring out what is going on. You already have machine vision (object recognition), but now also depth cues and temporal data. Location-based services all of a sudden become a lot smarter.
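To see why two lenses enable 3D mapping: each lens sees the scene from a slightly shifted viewpoint, and the pixel offset (disparity) of an object between the two images gives its distance by simple triangulation. Here’s a minimal sketch of that relationship – the focal length, lens baseline, and disparity values are made-up illustrative numbers, not anything Apple has announced:

```python
def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate depth (meters) from the pixel disparity between two lenses.

    Triangulation: depth = focal_length * baseline / disparity.
    Nearby objects shift more between the two views (large disparity),
    distant objects shift less (small disparity).
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical numbers: 2800 px focal length, lenses 1 cm apart,
# an object shifted 20 px between the two images:
print(round(stereo_depth(2800, 0.01, 20), 2))  # 1.4 (meters)
```

The catch with a phone is the tiny baseline – the lenses sit only about a centimeter apart, so depth precision falls off quickly with distance. That’s fine for room-scale mapping and gesture control, less so for mapping across a parking lot.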

Time will tell…
