In a previous post, I talked about Google Glass and its implications for service designers. Now I'm turning my lens (get it?!) to Apple's latest iPhone lineup, which adds some significant new hardware: a fingerprint reader, a faster main processor, and a new motion co-processor. For tech nerds like me, this is awesome in itself, but what new service possibilities could these hardware changes open up? Everyone else is talking about the fingerprint reader, so let's talk about the motion co-processor.
This M7 co-processor "enables a new generation of mobile and fitness apps" by continuously collecting data from the iPhone's accelerometer, gyroscope, and compass without waking the main processor, so your movement can be tracked all day at very low power. If we can find a way to accurately characterize that movement data, this could be a game changer for service innovation. How? Here are some exciting possibilities top of mind for me:
- We could crowdsource the indoor mapping of an entire airport (or every building in the world) to enable an indoor location service for consumers and security personnel
- We could seamlessly transition data monitoring of an Ironman triathlete as they move from swim to bike to run for new fitness apps, services, and coaching programs
- We could accurately track an aging parent's movement at home to monitor sleep behavior, hygiene habits, and activity levels as part of preventive care
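To make "characterizing the movement data" concrete, here is a minimal sketch of the kind of logic these services would need: take windows of accelerometer magnitude readings and label each one with a coarse activity. The thresholds, labels, and function names are my own illustrative assumptions, not anything from Apple's or Nike's APIs; a real classifier would be far more sophisticated.

```python
# Hypothetical sketch: classify windows of accelerometer magnitude
# readings (in g) into coarse activities using variance thresholds.
# All thresholds and labels here are illustrative assumptions.
from statistics import stdev


def classify_window(magnitudes, still_max=0.05, walk_max=0.5):
    """Label one window of accelerometer magnitudes.

    A phone at rest reads roughly a constant 1 g; walking and running
    add progressively more variation. Thresholds are made up for the
    sake of the example.
    """
    spread = stdev(magnitudes)
    if spread < still_max:
        return "stationary"
    if spread < walk_max:
        return "walking"
    return "running"


def classify_stream(samples, window_size=50):
    """Split a stream of samples into fixed windows and label each."""
    return [
        classify_window(samples[i:i + window_size])
        for i in range(0, len(samples) - window_size + 1, window_size)
    ]
```

The point of the M7 is that this kind of windowed analysis can run over data collected all day at low power, rather than only while an app is open and draining the battery.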
Nike is already designing the Nike+ Move app that will take advantage of this technology. What's next? What other ideas do you have for leveraging more accurate motion data?
Message me: @davidlemus