- A federal judge rejected a ChatGPT user's petition against her order that OpenAI preserve all ChatGPT chats
- The order followed a request from The New York Times as part of its lawsuit against OpenAI and Microsoft
- OpenAI plans to keep arguing against the order
OpenAI will hold on to all your conversations with ChatGPT, and may share them with plenty of lawyers, even the ones you thought you deleted. That's the result of an order from the federal court overseeing The New York Times' copyright-infringement lawsuit against OpenAI. Judge Ona Wang upheld her earlier order to preserve all ChatGPT conversations as evidence, rejecting a motion by Aidan Hunt, one of several ChatGPT users who asked her to vacate the order over privacy and other concerns.
Judge Wang told OpenAI to retain ChatGPT's output "indefinitely" because the Times argued that would be a way to tell whether the chatbot has illegally reproduced articles without paying the original publishers. But finding those examples means hanging on to every intimate, awkward, or just plain private exchange anyone has had with the chatbot. Although what users type is not part of the order, it isn't hard to imagine who was talking to ChatGPT about which personal subject based on what the AI wrote back. The more personal the discussion, the easier it would be to identify the user.
Hunt pointed out that he had no warning this could happen until he saw a report about the order in an online forum, and he is now worried that his conversations with ChatGPT could be disclosed, including "highly sensitive personal and commercial information." He asked the court to vacate or modify the order to exclude private content, such as conversations conducted in private mode or those involving medical or legal matters.
According to Hunt, the judge overstepped her bounds with the order, because "this case involves important, novel constitutional questions about the privacy implications of artificial intelligence – a rapidly developing area of law – and the ability of a magistrate [judge] to institute a nationwide mass surveillance program by means of a discovery order in a civil case."
Judge Wang rejected his request because his concerns were unrelated to the copyright question at hand. She emphasized that the order is about preservation, not disclosure, and that it is hardly unique or unusual for courts to tell a private company to hold on to certain data for litigation. That's technically correct, but, understandably, an everyday ChatGPT user may not feel that way.
She also seemed to bristle at the accusation of mass surveillance, quoting that section of Hunt's petition and responding with the legal-language equivalent of a diss track. Judge Wang added a "[sic]" to the quote from Hunt's filing, along with a footnote pointing out that the petition "does not explain how a court's document retention order that directs the preservation, segregation, and retention of certain privately held data by a private company for the limited purposes of litigation is, or could be, a 'nationwide mass surveillance program.' It is not. The judiciary is not a law enforcement agency."
That "sic burn" aside, there is still a chance the order will be withdrawn or modified after OpenAI goes to court this week to push back on it as part of the lawsuit's larger paper war.
Deleted but not gone
Hunt's other concern is that, regardless of how this case goes, OpenAI will now have the ability to retain chats users believed were deleted and could use them in the future. That raises the question of whether OpenAI will put protecting user privacy above legal expediency. So far, OpenAI has argued for that privacy and has asked the court for oral arguments, set for this week, to challenge the retention order. The company has said it wants to push back hard on behalf of its users. But in the meantime, your chat logs are in limbo.
Many may have felt that writing to ChatGPT is like talking to a friend who can keep a secret. Perhaps more will now understand that it's still a computer program, and that the equivalent of your browsing history and your Google search terms is still there. Hopefully there will at least be more transparency going forward. Even if it's the courts requiring AI companies to retain sensitive data, users deserve to be informed by the companies. We shouldn't have to find out on a web forum.
And if OpenAI really wants to protect its users, it could start by offering more granular controls: clear toggles for an anonymous mode, stronger deletion guarantees, and warnings when conversations are being preserved for legal reasons. Until then, it might be wise to treat ChatGPT a little less like a therapist and a little more like a coworker who might be wearing a wire.