Microsoft created a bot that learns. Then they showed it the internet as example behavior. It promptly learned to be a “genocidal, foul-mouthed, sex-crazed nazi.” Why wasn’t that the expected result?