Wednesday 2 January 2019

A.I. hid data from its creators to cheat at appointed task

The blog post A.I. hid data from its creators to cheat at appointed task Find more on: http://totalsurvival.net

Depending on how paranoid you are, this research from Stanford and Google will be either terrifying or fascinating. A machine learning agent intended to transform aerial images into street maps and back was found to be cheating by hiding information it would need later in “a nearly imperceptible, high-frequency signal.” Clever girl!

This occurrence reveals a problem with computers that has existed since they were invented: they do exactly what you tell them to do.

The intention of the researchers was, as you might guess, to accelerate and improve the process of turning satellite imagery into Google’s famously accurate maps. To that end the team was working with what’s called a CycleGAN — a neural network that learns to transform images of type X and Y into one another, as efficiently yet accurately as possible, through a great deal of experimentation.

In some early results, the agent was doing well — suspiciously well. What tipped the team off was that, when the agent reconstructed aerial photographs from its street maps, there were lots of details that didn’t seem to be on the latter at all. For instance, skylights on a roof that were eliminated in the process of creating the street map would magically reappear when they asked the agent to do the reverse process.

Read More

The post A.I. hid data from its creators to cheat at appointed task appeared first on Off The Grid News.

This Article Was Originally Posted On offthegridnews.com Read the Original Article here

check out the full article Here: A.I. hid data from its creators to cheat at appointed task

No comments:

Post a Comment