- A federal judge denied a ChatGPT user's petition against his order requiring OpenAI to preserve all ChatGPT chats
- The order followed a request from The New York Times as part of its lawsuit against OpenAI and Microsoft
- OpenAI plans to keep arguing against the ruling
OpenAI will hold on to all of your conversations with ChatGPT, and may share them with plenty of lawyers, even the ones you thought you deleted. That's the upshot of an order from the federal judge overseeing a copyright lawsuit brought against OpenAI by The New York Times. Judge Ona Wang upheld her earlier order to preserve all ChatGPT conversations for evidence after rejecting a motion from ChatGPT user Aidan Hunt, one of several ChatGPT users asking her to rescind the order over privacy and other concerns.
Judge Wang told OpenAI to preserve ChatGPT's output “indefinitely” at the request of the Times, which argued that this would be a way to tell whether the chatbot has illegally reproduced articles without paying the original publishers. But finding those examples means combing through every intimate, awkward, or simply private exchange someone has had with the chatbot. Although what users type isn't part of the order, it isn't hard to imagine working out who was talking to ChatGPT about which personal subject based on what the AI wrote back. In fact, the more personal the discussion, the easier it would be to identify the user.
Hunt said he had no warning that this could happen until he saw a report about the order in an online forum, and he is now concerned that his conversations with ChatGPT could be disseminated, including “highly sensitive personal and commercial information.” He asked the judge to vacate the order or modify it to exclude especially private content, such as conversations held in private mode or ones touching on medical or legal matters.
According to Hunt, the judge overstepped her bounds with the order because “this case involves important constitutional questions about the privacy rights incident to the use of artificial intelligence, a rapidly developing area of law, and the ability of a magistrate [judge] to institute a nationwide mass surveillance program by means of a discovery order in a civil case.”
Judge Wang rejected his request because his concerns aren't related to the copyright issues at hand. She emphasized that the order is about preservation, not disclosure, and that it's neither unique nor rare for courts to tell a private company to retain certain records for litigation. That's technically correct, but, understandably, an everyday person who uses ChatGPT may not feel that way.
The mass surveillance accusation seemed to rankle in particular: she cited that section of Hunt's petition and hit it with the legal-language equivalent of a diss track. Judge Wang added a “[sic]” to the quote from Hunt's filing, along with a footnote pointing out that the petition “does not explain how a court’s document retention order that directs the preservation, segregation, and retention of certain privately held data by a private company for the limited purposes of litigation is, or could be, a ‘nationwide mass surveillance program.’ It is not. The judiciary is not a law enforcement agency.”
That “[sic]” burn aside, there is still a chance the order will be lifted or modified after OpenAI goes to court this week to push back against it as part of the larger paperwork battle surrounding the lawsuit.
Deleted but not gone
Hunt's other concern is that, regardless of how this case plays out, OpenAI will now have the ability to retain chats that users believed were deleted and could use them in the future. There are concerns about whether OpenAI will lean toward protecting user privacy over legal expedience. So far, OpenAI has argued in favor of that privacy and is due in court this week to challenge the retention order. The company has said it wants to push back on behalf of its users. But in the meantime, their chat logs are in limbo.
Many people may have felt that typing into ChatGPT was like talking to a friend who can keep a secret. Perhaps more will now understand that it still acts like a computer program, and that the equivalent of your browser history and Google search terms is still in there. At the very least, hopefully there will be more transparency. Even if it's the courts requiring AI companies to retain sensitive data, users should be notified by the companies; we shouldn't have to find out about it by chance on a web forum.
And if OpenAI really wants to protect its users, it could start offering more granular controls: clearly labeled private modes, stronger deletion guarantees, and alerts when conversations are being preserved for legal reasons. Until then, it may be wise to treat ChatGPT a little less like a therapist and a little more like a coworker who might be wearing a wire.