Microsoft wants to address the privacy concerns of enterprise customers with standalone ChatGPT installations, which will reportedly cost significantly more than standard ChatGPT.
Microsoft plans to offer ChatGPT on dedicated servers before the end of this quarter, according to tech news site The Information. Each company would get its own isolated servers, preventing any data exchange with other customers' AI deployments. Customer data would not be used for AI training, and leakage to outside parties would reportedly be ruled out.
10x the cost
That privacy comes at a price: according to The Information's sources, the private ChatGPT variants could cost ten times as much as the regular version of ChatGPT. This presumably refers not to OpenAI's $20-per-month ChatGPT Plus subscription but to commercial use within Microsoft's business ecosystem. The report does not cite any concrete figures, however.
OpenAI has already announced a similar product with the business version of ChatGPT, although the exact terms are still unknown. By default, data entered by employees in ChatGPT will no longer be used to optimize AI models. OpenAI plans to release details about ChatGPT for business in the coming weeks.
Interestingly, despite their extensive collaboration, Microsoft and OpenAI are once again competing for the same customers. At the end of January, Microsoft invested $10 billion in OpenAI, which is currently valued at around $29 billion.
The exact terms of the collaboration are not public, but OpenAI likely returns much of that capital by running its AI training on Microsoft's servers. Microsoft is also integrating OpenAI technology into several of its own products beyond Bing Chat.
Your own ChatGPT as an alternative to your own AI model
The new business offerings from Microsoft and OpenAI are a direct response to growing criticism of the opaque use of customer data to train and run large AI models. A few days ago, Samsung confirmed that it had banned the internal use of ChatGPT and similar AI services for fear of data loss and leaks, and that it plans to develop its own AI models instead.
However, a locally or independently operated ChatGPT might be a more attractive proposition than a completely in-house AI model. After all, the extensive pre-training that gives such a model its language capabilities is time-consuming, costly, and error-prone.