ChatGPT is reportedly leaking private conversations. According to a recent report in Ars Technica, the leaked details include login credentials and other personal details of unrelated users. The report shared screenshots submitted by the user whose account was allegedly hacked. The seven screenshots contained multiple pairs of usernames and passwords that appeared to be connected to a support system used by employees of a pharmacy prescription drug portal. An employee using the AI chatbot seemed to be troubleshooting problems they encountered while using the portal.
What is the leak
“THIS is so f-ing insane, horrible, horrible, horrible, i cannot believe how poorly this was built in the first place, and the obstruction that is being put in front of me that prevents it from getting better,” the user wrote. “I would fire [redacted name of software] just for this absurdity if it was my choice. This is wrong.”
Besides the exact language and the credentials, the leaked conversation includes the name of the app the employee is troubleshooting and the store number where the problem is said to have occurred. The results are said to have appeared shortly after the user, who goes by the name Whiteside, had used ChatGPT for an unrelated query.
“I went to make a query (in this case, help coming up with clever names for colors in a palette) and when I returned to access moments later, I noticed the additional conversations,” Whiteside wrote in an email. “They weren’t there when I used ChatGPT just last night (I’m a pretty heavy user). No queries were made—they just appeared in my history, and most certainly aren’t from me (and I don’t think they’re from the same user either).”
Other conversations leaked to Whiteside reportedly include the name of a presentation someone was working on, details of an unpublished research proposal, and a script using the PHP programming language. As per the report, the users behind each leaked conversation appeared to be different and unrelated to one another. The conversation involving the prescription portal included the year 2020. Exact dates didn't appear in the other conversations.
What OpenAI said on ChatGPT leaks
Commenting on the alleged leaks, OpenAI told the publication that the ChatGPT histories the user reported resulted from his ChatGPT account being compromised. The unauthorized logins came from Sri Lanka, an OpenAI representative said. The user claimed that he logs into his account from Brooklyn, New York.
“From what we discovered, we consider it an account take over in that it’s consistent with activity we see where someone is contributing to a ‘pool’ of identities that an external community or proxy server uses to distribute free access,” the representative wrote. “The investigation observed that conversations were created recently from Sri Lanka. These conversations are in the same time frame as successful logins from Sri Lanka.”
What the ChatGPT maker's explanation means
ChatGPT maker OpenAI’s explanation likely means the original suspicion of ChatGPT leaking chat histories to unrelated users may not be correct. However, it shows that the website provides no mechanism for users such as Whiteside to protect their accounts with 2FA or to track details such as the IP locations of current and recent logins. Such protections have been standard on most popular social platforms for years.