Microsoft Research presented Project Adam yesterday at the Microsoft Research Faculty Summit. It is a state-of-the-art machine learning and artificial intelligence system that enables software to visually recognize any object. Built by Microsoft researcher Trishul Chilimbi and his team, Project Adam's object classifier is trained on a massive dataset of 14 million images from the Web and sites such as Flickr, spanning more than 22,000 categories drawn from user-generated tags. Microsoft claims the system is twice as accurate at object recognition and 50 times faster than other systems. To show it in action, Microsoft brought a live dog on stage, and a phone powered by Project Adam recognized the dog's breed. You can watch the whole keynote video above, which includes the Project Adam demo.

Harry Shum, executive vice president of Microsoft's Technology and Research group, opens the Faculty Summit by highlighting major efforts at Microsoft Research. Two of significance are the integration of Microsoft Academic Search into Bing with Cortana (Microsoft's...
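Project Adam's model and code have not been released, so purely as a point of reference, here is a minimal sketch of what this kind of object-classification inference looks like using an off-the-shelf pretrained ImageNet classifier (torchvision's ResNet-50). This is not Project Adam, and the input file "dog.jpg" is a hypothetical example image.

```python
# Minimal sketch of image-classification inference, for illustration only.
# Uses a generic pretrained ImageNet classifier (torchvision ResNet-50),
# NOT Project Adam, whose model and code are not public.
# "dog.jpg" is a hypothetical example input.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet50_Weights.IMAGENET1K_V2   # off-the-shelf pretrained weights
model = models.resnet50(weights=weights)
model.eval()

preprocess = weights.transforms()                  # resize, crop, normalize

image = Image.open("dog.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)             # shape: (1, 3, 224, 224)

with torch.no_grad():
    probs = model(batch).softmax(dim=1)[0]

# Print the five most likely categories (dog breeds are among ImageNet's classes).
top5 = probs.topk(5)
for p, idx in zip(top5.values, top5.indices):
    print(f"{weights.meta['categories'][idx]}: {p:.3f}")
```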
