I was reading about Intel’s future goals for designing new user interfaces, and I feel it is really important to underline this. Here are some highlights from the talk:
“We want to be able to design platforms that can anticipate and respond to the ever changing needs of the users, whether they’re happening on the scale of minutes or hours, or at the fine grain, in terms of nanoseconds and microseconds.”
“A user-aware platform will be any device that can take care of itself, knows who we are, where we are, and tries to anticipate what we want done,” Rattner said. “They will need digital senses to be aware of their surroundings and what they are doing. They will also need new levels of intelligence to understand our needs and collaborate with other electronics to take action on our behalf while doing no harm in the process.”
“Imagine if all the pages of all the medical books in the Library of Congress fell out on the floor and you had to search through them for a specific image of a cancer cell. No file names, no folders, just piles of images; this is called searching nonindexed data,” Rattner said. “Running simultaneously across several computers, Diamond uses advances in computer vision and machine learning to search through data the way people do, first by studying what the desired image ‘looks like’ — its shape, color, contents — and then finding the closest matches. It is an initial attempt to do for complex data what spreadsheets have done for numbers.”
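To make the idea concrete, here is a toy sketch of searching nonindexed data by visual similarity. This is not Diamond’s actual code; the feature (a coarse brightness histogram), the synthetic “images,” and all function names are my own illustrative assumptions. The point is the pattern: extract features from the query, then brute-force compare against every item instead of looking anything up in an index.

```python
import math

def brightness_histogram(pixels, bins=4):
    # Coarse brightness histogram as a stand-in for a real visual
    # feature vector (shape, color, contents in Diamond's case).
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels)
    return [h / total for h in hist]

def distance(a, b):
    # Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def closest_match(query, collection):
    # Brute-force scan: no file names, no folders, no index --
    # just compare the query's features against every item.
    query_features = brightness_histogram(query)
    return min(collection,
               key=lambda item: distance(query_features,
                                         brightness_histogram(item)))

# Synthetic "images": flat pixel lists with different brightness profiles.
dark = [30] * 100
bright = [220] * 100
mixed = [30] * 50 + [220] * 50

query = [35] * 100          # looks most like the dark image
print(closest_match(query, [dark, bright, mixed]) == dark)
```

In a real system like Diamond, the interesting part is running this scan in parallel across several machines and discarding obvious non-matches early, but the core loop is the same brute-force comparison.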
Well, those ideas are not far from what I have been reading lately: genetic algorithms are similar systems, in that they can learn how to behave over generations. I am also hoping to see new network-based applications built on these kinds of ideas.
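For readers who haven’t met genetic algorithms before, here is a minimal sketch of the “learning through generations” loop I mean: selection of the fittest, crossover between parents, and random mutation. The fitness function here (counting 1-bits, the classic OneMax toy problem) and all parameters are just illustrative assumptions.

```python
import random

def fitness(bits):
    # Toy fitness: number of 1-bits ("OneMax"). A real application
    # would score how well a candidate solves the actual problem.
    return sum(bits)

def evolve(pop_size=20, genome_len=16, generations=50,
           mutation_rate=0.05, seed=1):
    rng = random.Random(seed)
    # Start from a random population of bit-string genomes.
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half as parents.
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        # Crossover + mutation to refill the population.
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)
            child = a[:cut] + b[cut:]
            child = [bit ^ 1 if rng.random() < mutation_rate else bit
                     for bit in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

Over the generations, the population drifts toward all-ones genomes without anyone ever telling it the answer directly, which is the sense in which these systems “learn how to behave.”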
Here is Intel’s and Carnegie Mellon’s research project, Diamond.