• Icalasari@kbin.social
    1 year ago

    One thing which actually scares me with AI is that we get one chance. And there are a bunch of groups who don’t think about repercussions, just profit/war/etc.

    A group can be as careful as possible, but it doesn’t mean shit if their smarter-than-human AI isn’t also the first one out, because as soon as it can improve itself, nothing is catching up.

    EDIT: This also assumes whichever group builds it is dumb enough to give it the capability to build its own body. Obviously, one that can’t jump to hardware capable of creating and assembling new parts is much less of a threat, as the thread points out.