
robertogreco : objectrecognition (3)

Deep Belief by Jetpac - teach your phone to recognize any object on the App Store on iTunes
"Teach your iPhone to see! Teach it to recognize any object using the Jetpac Deep Belief framework running on the phone.

See the future - this is the latest in Object Recognition technology, on a phone for the first time.

The app helps you teach the phone to recognize an object by taking a short video of that object, and then teach it what is not the object by taking a short video of everything around except that object. Then you can scan your surroundings with your phone camera, and it will detect when you are pointing at the object you taught it to recognize.

We trained our Deep Belief convolutional neural network on a million photos, and, like a brain, it learned concepts of textures, shapes, and patterns, and how to combine those to recognize objects. It includes an easily trainable top layer so you can teach it to recognize the objects you are interested in.

If you want to build custom object recognition into your own iOS app, you can download our Deep Belief SDK framework. It's an implementation of the Krizhevsky convolutional neural network architecture for object recognition in images, running in under 300ms on an iPhone 5S, and available under an open BSD License."
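
[For a rough sense of how the SDK is used, here is a minimal sketch against its C interface (libjpcnn.h): load the pre-trained network, classify a frame, then train the custom top layer on "object" frames versus "everything else" frames. The function names and signatures are recalled from the SDK's documentation and the file names are placeholders, so treat the details as assumptions to verify against the framework header.]

/* Rough sketch against the DeepBeliefSDK C API (libjpcnn.h).
 * Names and signatures follow the SDK's published docs from memory;
 * verify against the actual header before relying on them.
 * "jetpac.ntwk" and "kale.png" are placeholder file names. */
#include <stdio.h>
#include "libjpcnn.h"

int main(void) {
  /* Load the pre-trained convolutional network shipped with the SDK. */
  void* network = jpcnn_create_network("jetpac.ntwk");

  /* Run one image (or camera frame) through the network. */
  void* image = jpcnn_create_image_buffer_from_file("kale.png");
  float* predictions;
  int predictionsLength;
  char** labels;
  int labelsLength;
  jpcnn_classify_image(network, image, 0 /* flags */, 0 /* layerOffset */,
                       &predictions, &predictionsLength,
                       &labels, &labelsLength);
  jpcnn_destroy_image_buffer(image);

  /* Print the network's label scores for this frame. */
  for (int i = 0; i < predictionsLength; i += 1) {
    printf("%0.3f\t%s\n", predictions[i], labels[i % labelsLength]);
  }

  /* The "easily trainable top layer": feed activation vectors from
   * frames of the object with label 1.0 and frames of everything
   * else with label 0.0, then score new frames with the predictor.
   * (The app repeats this for every frame of the two short videos;
   * it may also take activations from an earlier layer via the
   * layerOffset argument -- an assumption, not confirmed here.) */
  void* trainer = jpcnn_create_trainer();
  jpcnn_train(trainer, 1.0f, predictions, predictionsLength); /* positive frame */
  /* ... many more positive and negative frames ... */
  void* predictor = jpcnn_create_predictor_from_trainer(trainer);
  float score = jpcnn_predict(predictor, predictions, predictionsLength);
  printf("custom object score: %0.3f\n", score);

  jpcnn_destroy_network(network);
  return 0;
}

[The point of the design is that the large pre-trained network stays fixed and only the small top layer is retrained on the phone, which is why a couple of short videos are enough to teach it a new object.]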

[via: https://medium.com/message/the-fire-phone-at-the-farmers-market-34f51c2ba885 petewarden ]

[See also: http://petewarden.com/2014/04/08/how-to-add-a-brain-to-your-smart-phone/ ]
applications  ios  ios7  iphone  ipad  objects  objectrecognition  identification  objectidentification  mobile  phones  2014  learning  deepbelief  petewarden  ai  artificialintelligence  cameras  computervision  commonplace  deeplearning 
june 2014 by robertogreco
The Fire Phone at the farmers market — The Message — Medium
"With the exception of a few paintings, all of Amazon’s demo “items” were commercial products: things with ISBNs, bar codes, and/or spectral signatures. Things with price tags.

We did not see the Fire Phone recognize a eucalyptus tree.

There is reason to suspect the Fire Phone cannot identify a goldfinch.

And I do not think the Fire Phone can tell me which of these “items” is kale.

This last one is the most troubling, because a system that greets a bag of frozen vegetables with a bar code like an old friend but draws a blank on a basket of fresh greens at the farmers market—that’s not just technical. That’s political.

But here’s the thing: The kale is coming.

There’s an iPhone app called Deep Belief, a tech demo from programmer Pete Warden. It’s free."

"If Amazon’s Fire Phone could tell kale from Swiss chard, if it could recognize trees and birds, I think its polarity would flip entirely, and it would become a powerful ally of humanistic values. As it stands, Firefly adds itself to the forces expanding the commercial sphere, encroaching on public space, insisting that anything interesting must have a price tag. But of course, that’s Amazon: They’re in The Goldfinch detection business, not the goldfinch detection business.

If we ever do get a Firefly for all the things without price tags, we’ll probably get it from Google, a company that’s already working hard on computer vision optimized for public space. It’s lovely to imagine one of Google’s self-driving cars roaming around, looking everywhere at once, diligently noting street signs and stop lights… and noting also the trees standing alongside those streets and the birds perched alongside those lights.

Lovely, but not likely.

Maybe the National Park Service needs to get good at this.

At this point, the really deeply humanistic critics are thinking: “Give me a break. You need an app for this? Buy a bird book. Learn the names of trees.” Okay, fine. But, you know what? I have passed so much flora and fauna in my journeys around this fecund neighborhood of mine and wondered: What is that? If I had a humanistic Firefly to tell me, I’d know their names by now."
amazon  technology  robinsloan  objects  objectrecognition  identification  objectidentification  firefly  mobile  phones  2014  jeffbezos  consumption  learning  deepbelief  petewarden  ai  artificialintelligence  cameras  computervision  commonplace  deeplearning 
june 2014 by robertogreco
TapTapSee - Blind & Visually Impaired Camera for iPhone 3GS, iPhone 4, iPhone 4S, iPhone 5, iPod touch (3rd generation), iPod touch (4th generation), iPod touch (5th generation) and iPad on the iTunes App Store
"TapTapSee is designed to help the blind and visually impaired identify objects they encounter in their daily lives.

Simply double tap the screen to take a photo of anything and hear the app speak the identification back to you."
via:straup  cameras  iphone  ios  blindness  objectrecognition  sensors  assistivetechnology  roboteyes 
june 2013 by robertogreco