Eyeing Machine Vision, Microsoft’s Gestural Interface Platform, and More – This Week in Artificial Intelligence 07-02-16

Daniel Faggella

Daniel Faggella is Head of Research at Emerj. Called upon by the United Nations, World Bank, INTERPOL, and leading enterprises, Daniel is a globally sought-after expert on the competitive strategy implications of AI for business and government leaders.

1 – Robot Eyes and Humans Fix on Different Things to Decode a Scene

A team from Facebook AI Research recently published a paper comparing the regions of an image that humans and machine learning systems attend to when decoding a visual scene. It turns out that the humans and the machines in the study do not focus on the same details or features in order to make their determinations, lending an air of mystery to two very different complex systems that scientists are constantly seeking to better understand. The decoding task was not always as straightforward as identifying a specific object; the humans and machines were both asked questions like “What is the man doing?” and “What number of cats are lying on the bed?” Humans still rule at visual question answering, but whether machine vision neural nets should be altered to attend more closely to the same regions humans do is still up for debate.

(Read the full article on New Scientist and the full paper at arXiv.org)

2 – Microsoft’s Plan To Build The Ultimate Gestural UI

Microsoft seems on the verge of incorporating gestural interface abilities into its technology. The effort is spearheaded by Xuedong Huang, the same Microsoft researcher who founded its speech recognition program 25 years ago. Called Handpose, Microsoft’s newly introduced gesture platform makes the leap from the broad image-matching techniques currently used by Kinect to ‘breaking up’ the hand into minute pieces and reasoning about how the hand moves, as in grasping small objects or touch-typing. Though Microsoft is hush-hush about which products will receive a gesture upgrade, the potential applications range from virtual reality systems to keyboard-less workbooks to a home assistant that operates on voice and gestures.

(Read the full article on Fast Company)

3 – Unicorn Instacart Hopes Its Data Scientists Can Calculate a Path to Profits

San Francisco-based Instacart is well funded (it was reportedly valued at $2 billion in 2015) but not yet profitable. Jeremy Stanley was recently brought in as vice president of data science to help increase margins through targeted algorithms. One of the primary goals of Stanley’s team is improving the efficiency of Instacart’s workers, from order assignment to delivery. Using collected data, the team improved the app that workers use to pick orders and choose routes, and the average time to fulfill an order has been reduced by 40 percent. A newer tactic for increasing profits is the integration of targeted brand ads. Instacart currently operates in 24 metro areas and competes with similar services from Google and Amazon, though Instacart has Whole Foods as a partner and investor.

(Read the full article on MIT Technology Review)

4 – White House Seeks Public Input on Artificial Intelligence

On Monday, the White House announced via the Federal Register that it is looking for public feedback on artificial intelligence. The request comes from the Office of Science and Technology Policy (OSTP), with U.S. Deputy CTO Ed Felten stating that the office would like to learn “how America can best prepare for the future of AI, including information about AI research and the tools, technologies, and scientific training that are needed.” In addition to upcoming OSTP workshops that are open to public registration, the new National Science and Technology Council Subcommittee on Machine Learning and Artificial Intelligence will keep track of AI advancements in the private and public sectors and coordinate them with government initiatives. Response forms are available on the Federal Register website, with responses due July 22, 2016.

(Read the full article on FedScoop and the original posting on the Federal Register)

5 – Self-Driving Cars: Tesla Fatal Crash, Intel-BMW-Mobileye Partnership, Zoox Funding

Just one day after news broke that a man had been killed in Florida while his Tesla was in self-driving mode, Intel announced a partnership with BMW and Mobileye to create self-driving cars by 2021. In a Friday morning webcast from Munich, BMW and Intel discussed the overwhelming safety improvements that advanced self-driving technology can provide, such as Automatic Emergency Braking (AEB) (current systems, like the one in the Tesla incident, are not able to act unless it is to avoid a rear-end collision). Though the Tesla accident is the first such fatality reported, it’s difficult to say how the incident will affect the industry in the immediate term, and there are no significant signs that development will slow down.

(Read the full article on SiliconBeat)

Image credit: New Scientist
