
    ChatGPT Chats Could Be Used Against Users In Court

By Kryptonews · July 28, 2025

OpenAI could be legally required to produce sensitive information and documents shared with its artificial intelligence chatbot ChatGPT, the company's CEO Sam Altman has warned.

Altman highlighted the privacy gap as a “huge issue” during an interview with podcaster Theo Von last week, noting that, unlike conversations with therapists, lawyers, or doctors, which are covered by legal privilege, conversations with ChatGPT currently have no such protection.

    “And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s like legal privilege for it… And we haven’t figured that out yet for when you talk to ChatGPT.”

    He added that if you talk to ChatGPT about “your most sensitive stuff” and then there is a lawsuit, “we could be required to produce that.”

Altman’s comments come against a backdrop of growing use of AI for psychological support and for medical and financial advice.

    “I think that’s very screwed up,” Altman said, adding that “we should have like the same concept of privacy for your conversations with AI that we do with a therapist or whatever.”

    Sam Altman on This Past Weekend podcast. Source: YouTube

    Lack of a legal framework for AI

Altman also stressed the need for a legal and policy framework for AI, calling it a “huge issue.”

    “That’s one of the reasons I get scared sometimes to use certain AI stuff because I don’t know how much personal information I want to put in, because I don’t know who’s going to have it.”

    Related: OpenAI ignored experts when it released overly agreeable ChatGPT

He believes AI conversations should carry the same expectation of privacy as those with therapists or doctors, and said the policymakers he has spoken with agree the issue needs to be resolved quickly.

    Broader surveillance concerns 

Altman also expressed concern that the accelerating global adoption of AI will bring more surveillance.

“I am worried that the more AI in the world we have, the more surveillance the world is going to want,” he said, since governments will want to make sure people are not using the technology for terrorism or other nefarious purposes.

    He said that for this reason, privacy did not have to be absolute, and he was “totally willing to compromise some privacy for collective safety,” but there was a caveat. 

    “History is that the government takes that way too far, and I’m really nervous about that.”

    Magazine: Growing numbers of users are taking LSD with ChatGPT: AI Eye