We’re used to software that you push around with your mouse and tap-tap-typing fingers. With mobile devices, the long-awaited revolution in input modes has finally arrived, and now that it’s here, it’s hard to believe how fast it’s moving. For instance, with Shazam you can ask your iPhone to tell you the name of the song playing on the radio. It listens and dutifully tells you the answer. I’ve tried it, and I’m here to tell you it works, and it works well.
With GPS technology, it’s no longer surprising to have a device that can tell you where you are. But that’s just the starting point. Wikitude is an app for your Android-based phone that lets you take a picture of your vicinity, which it then annotates with information about what you’re looking at. “What place is that?” you might ask, taking a quick snapshot of a mysterious building with your phone. Just as quickly your pocket amanuensis would reply: “That, dear sir, is the Mitchell Corn Palace, the only one of its kind.” Here’s a video of how it works.
And just last night I was looking at a dramatic apparition of Venus, Jupiter, and the moon in the western sky. Suppose you didn’t know which bright objects those were. The Celestron SkyScout (which has actually been around for a while) can tell you.
Here’s what I want now. I want to take a picture of a leaf and ask “what tree is this from?” It knows where I am, so it should be able to narrow down the possibilities to one or two species pretty quickly. Or maybe we have to wait for the pocket DNA analyzer. I’m guessing that will be on the market in three months at the rate things are going.
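The location-narrowing idea is simple enough to sketch. Here’s a toy illustration in Python of the pruning step: before any image analysis, use the GPS fix to throw out species whose range doesn’t include your location. The species list, the bounding-box ranges, and the `candidates_near` function are all made up for the example, not from any real app or field-guide database.

```python
# Hypothetical sketch: prune a candidate species list by GPS location
# before doing any (much harder) image matching on the leaf photo.
from dataclasses import dataclass

@dataclass
class Species:
    name: str
    # crude geographic range as a lat/lon bounding box
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

# Toy "field guide" with rough, invented North American ranges.
GUIDE = [
    Species("sugar maple", 35.0, 50.0, -95.0, -60.0),
    Species("saguaro cactus", 27.0, 35.0, -115.0, -108.0),
    Species("coast redwood", 36.0, 42.5, -124.5, -121.0),
]

def candidates_near(lat, lon, guide=GUIDE):
    """Return names of species whose range box contains the GPS fix."""
    return [s.name for s in guide
            if s.lat_min <= lat <= s.lat_max
            and s.lon_min <= lon <= s.lon_max]

# A photo taken in San Francisco (37.77 N, 122.42 W) rules out the
# maple and the saguaro before the image is even analyzed.
print(candidates_near(37.77, -122.42))  # ['coast redwood']
```

Real ranges aren’t rectangles, of course, but even this crude filter shows why a location-aware guide could converge on one or two species so quickly.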
(Thanks, Roy, for the Wikitude link.)