• MeatPilot@sh.itjust.works · 4 points · 1 hour ago

    ChatGPT, how do I erase this evidence that Trump had Epstein murdered? Here are all the unredacted files for review.

  • TheObviousSolution@lemmy.ca · 10 points · 3 hours ago

    AI will reach true intelligence when it is able to tell its users that they probably shouldn’t be providing it that information.

    • BigPotato@lemmy.world · 6 points · 2 hours ago

      I can’t tell you how many times I emailed someone last week asking them not to send me any sensitive information over insecure channels.

  • minorkeys@lemmy.world · 9 points · 3 hours ago

    The government is run by incompetent but power-hungry cretins who have been convinced by the techbros that LLMs can finally make them competent at being in charge, making them grossly overconfident in their capabilities.

  • lando55@lemmy.zip · 58 points · 6 hours ago

    Yeah, he really should know better, but why were the necessary controls not in place to prevent the C-suite from doing stupid things? I know it’s not possible to eliminate all risk, but enterprise-level DLP should really have caught this.

    • scytale@piefed.zip · 2 points · 57 minutes ago

      triggering multiple automated security warnings that are meant to stop the theft or unintentional disclosure of government material from federal networks

      They were, or at least detected if not prevented. That’s how they knew it happened.
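      For anyone curious what "automated security warnings" of this sort look like under the hood, here is a toy sketch of a DLP-style content rule that flags outbound text containing classification banner markings. Everything here is illustrative — the marking list is incomplete and this is not any real product's API or the government's actual tooling:

      ```python
      import re

      # Toy DLP-style rule: flag text that appears to contain US
      # classification banner markings. The marking set below is
      # illustrative only, not an official or exhaustive list.
      MARKING_RE = re.compile(
          r"\b(TOP SECRET|SECRET|CONFIDENTIAL)(//(SI|NOFORN|ORCON))*\b"
      )

      def flag_outbound(text: str) -> bool:
          """Return True if the text contains an apparent classification marking."""
          return MARKING_RE.search(text) is not None

      print(flag_outbound("Notes: TOP SECRET//NOFORN draft attached"))  # True
      print(flag_outbound("Lunch plans for Friday"))                    # False
      ```

      Real systems layer many such rules (patterns, fingerprints, labels) and, as noted above, often only alert rather than block — which is exactly how this incident could be detected without being prevented.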

    • Wildmimic@anarchist.nexus · 6 points · 4 hours ago

      This is the same guy who failed a polygraph, then smeared the people who told him the polygraph was only required because he wanted to see a highly classified program that only a limited number of people are allowed to access (his predecessor never asked to see it, because it isn’t necessary for the job), accusing them of “giving him misleading information”.

      He also wanted to remove Costello, one of the people at CISA seen as “one of the agency’s top remaining technical talent” after around 1,000 employees were cut (he was blocked from doing so after others learned about it — Costello had already received a letter giving him the choice to move to DHS or resign). Sources say Costello pushes back on policy and contracting decisions, probably because he knows better.

      He is Noem’s pet IT guy, whom she brought with her from South Dakota. I think he’s out of his depth for sure, and probably compromised.

    • village604@adultswim.fan · 35 up / 1 down · 6 hours ago

      You’re assuming that it wasn’t caught. He could easily have been informed and done it anyway, because opsec is in opposition to their goals.

      They want to make us vulnerable.

      • NOT_RICK@lemmy.world · 12 points · 6 hours ago

        Definitely possible, and even likely for at least some of them, but I would bet money a good deal of it is just hubris. A ton of these people give off the vibe that they earnestly believe they can do no wrong and know better than the “so-called experts” because they’re so great and brilliant and strong. Anyone who tries to pierce that bubble is just a “jealous loser”.

  • tal@lemmy.today · 2 points · 6 hours ago

    If the United States government wants to use ChatGPT on sensitive information, I’m pretty sure it can negotiate some kind of contract with OpenAI to set up its own private, dedicated deployment for that.

    I get that maybe this guy just wanted some kind of one-off use, but then he should have arranged to have something set up for that.