Uh-oh — Microsoft might be storing information from your Bing chats.
This is probably totally fine, as long as you've never chatted about anything you wouldn't want anyone else reading. Or if you thought your Bing chats would be deleted. Or if you thought you had more privacy than you actually have.
Microsoft updated its terms of service with new AI policies. Introduced on July 30 and going into effect on Sept. 30, the policy states: "As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service."
SEE ALSO: Microsoft is testing Bing Chat on Chrome and Safari
According to the Register's reading of the new "AI Services" clause in Microsoft's terms of service, Microsoft can store your conversations with Bing if you're not an enterprise user, and we don't know for how long.
Microsoft did not immediately respond to a request for comment from Mashable, and a spokesperson from Microsoft declined to comment to the Register about how long it will store user inputs.
"We regularly update our terms of service to better reflect our products and services," a representative said in a statement to the Register. "Our most recent update to the Microsoft Services Agreement includes the addition of language to reflect artificial intelligence in our services and its appropriate use by customers."
Beyond storing data, there are four additional policies in the new AI Services clause. Users cannot use the AI services to "discover any underlying components of the models, algorithms, and systems." Users are not allowed to extract data from the AI services. Users cannot use the AI services to "create, train, or improve (directly or indirectly) any other AI service." And finally, users are "solely responsible for responding to any third-party claims regarding Your use of the AI services in compliance with applicable laws (including, but not limited to, copyright infringement or other claims relating to content output during Your use of the AI services)."
So maybe be a bit more careful while using Bing Chat, or switch to Bing Chat Enterprise, which Microsoft said in July doesn't save conversations.