Listing numerous areas of concern, IBM Fellow, VP, and CTO of IBM Automation Jerry Cuomo lays out the potential pitfalls for businesses using ChatGPT.
Jerry Cuomo, IBM vice president and chief technology officer of IBM Automation, recently published a blog post laying out what he claims are several risks associated with using ChatGPT in the enterprise.
According to the blog post, there are several key risk areas that businesses should consider before adopting ChatGPT. Ultimately, Cuomo concludes that only non-sensitive data is safe to share with the chatbot:
“Once your data enters ChatGPT,” writes Cuomo, “you have no control or knowledge of how it is being used.”
Per the post, this type of unintentional data leakage could also expose businesses to legal liability if partner, customer or client information ends up in ChatGPT’s training data and is later surfaced to the general public.
Cuomo further cites risks to intellectual property and the possibility that leakage could put businesses in violation of open-source agreements.
According to the IBM blog post:
“If sensitive third-party or internal company information is entered into ChatGPT, it becomes part of the chatbot’s data model and may be shared with others who ask relevant questions.”
Cointelegraph reached out to OpenAI for comment on the above statement and received the following response via a public relations representative by email: “the data will not be shared with others who ask relevant questions.”
The representative also referred us to existing documentation on ChatGPT’s privacy features, including a blog post detailing the ability for web users to turn off their chat history.
The ChatGPT API has data sharing turned off by default, according to OpenAI.
The API policy is perfectly clear - what's confusing is the policy about conversations we have using the ChatGPT web interface and iOS/Android apps
— Simon Willison (@simonw) August 15, 2023
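For context, businesses typically reach ChatGPT programmatically through OpenAI’s API rather than the consumer web interface. The minimal sketch below, assuming the openai Python package (v1.x) with an API key set in the OPENAI_API_KEY environment variable, illustrates that path; the model name and prompt are placeholders.

```python
from openai import OpenAI

# Assumes OPENAI_API_KEY is set in the environment.
client = OpenAI()

# Per OpenAI's stated policy, prompts sent through the API are not
# used for model training by default, unlike the consumer web
# interface, where chat history and training use are on by default.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[
        {"role": "user", "content": "Summarize this quarter's sales notes."}
    ],
)

print(response.choices[0].message.content)
```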
Critics, however, have pointed out that conversations on the web version are saved by default. To opt out, users must disable chat history, a convenient feature for picking up where they left off, which is also what stops their data from being used to train the model. As of this writing, there is no option to retain conversations while declining to share data.
Related: IBM Watson developer raises $60M for AI startup Elemental Cognition