ylai@lemmy.ml to AI@lemmy.ml · English · 9 months ago
AI chatbots tend to choose violence and nuclear strikes in wargames (www.newscientist.com)
FaceDeer@kbin.social · 9 months ago
I wouldn't be surprised if this actually factors into this outcome. AI is trying to do what humans expect it to do, and our fiction is full of AIs that turn violent.
averyminya@beehaw.org · 9 months ago
Not to mention humans' tendencies toward violence.