Microsoft on Tuesday announced a more secure version of its AI-powered Bing built specifically for businesses, designed to assure professionals they can safely share potentially sensitive information with a chatbot.
With Bing Chat Enterprise, the user's chat data will not be saved or sent to Microsoft's servers.
"What this [update] means is your data doesn't leak outside the organization," Yusuf Mehdi, Microsoft's vice president and consumer chief marketing officer, told CNN in an interview. "We don't co-mingle your data with web data, and we don't save it without your permission. So no data gets saved on the servers, and we don't use any of your data chats to train the AI models."
Since ChatGPT launched late last year, a new crop of powerful AI tools has offered the promise of making workers more productive. But in recent months, some businesses have restricted the use of such tools among their employees, citing security and privacy concerns. Other large companies have taken similar steps over concerns around sharing confidential information with AI chatbots.
In April, regulators in Italy issued a temporary ban on ChatGPT in the country after OpenAI disclosed a bug that allowed some users to see the subject lines from other users' chat histories. The same bug, now fixed, also made it possible "for some users to see another active user's first and last name, email address, payment address, the last four digits (only) of a credit card number, and credit card expiration date," OpenAI said in a blog post at the time.
Like other tech companies, Microsoft is racing to develop and deploy a range of AI-powered tools for consumers and professionals amid widespread investor enthusiasm for the new technology. Microsoft also said Tuesday that it will add visual searches to its existing AI-powered Bing Chat tool. And the company said Microsoft 365 Copilot, its previously announced AI-powered tool that helps edit, summarize, create and compare documents across its various products, will cost $30 a month for each user.
Bing Chat Enterprise will be free for all of Microsoft's 160 million Microsoft 365 subscribers starting on Tuesday, if a company's IT department manually turns on the tool. After 30 days, however, Microsoft will roll out access to all users by default; subscribed businesses can disable the tool if they so choose.
RETHINKING AI CHATBOTS FOR THE WORKPLACE
Current conversational AI tools, such as the consumer version of Bing Chat, send data from personal chats to their servers to train and improve their AI models.
Microsoft's new enterprise option is identical to the consumer version of Bing, but it will not recall conversations with users, so they'll need to start from scratch each time. (Bing recently started to enable saved chats on its consumer chat model.)
With these changes, Microsoft, which uses OpenAI's technology to power its Bing chat tool, said workers can have "complete confidence" their data "won't be leaked outside of the organization."
To access the tool, a user will sign in to Bing with their work credentials, and the system will automatically detect the account and put it into a protected mode, according to Microsoft. Above the "ask me anything" bar, a message reads: "Your personal and company data are protected in this chat."
In a demo video shown to CNN ahead of its launch, Microsoft showed how a user could type confidential details into Bing Chat Enterprise, such as someone sharing financial information as part of preparing a bid to buy a building. With the new tool, the user could ask Bing Chat to create a table comparing the property to other neighboring buildings and write an analysis highlighting the strengths and weaknesses of their bid relative to other local bids.
In addition to trying to ease privacy and security concerns around AI in the workplace, Mehdi also addressed the problem of factual errors. To reduce the possibility of inaccuracies, or "hallucinations," as some in the industry call them, he suggested users write clearer, better prompts and check the included citations.