No “undo” button here… Samsung employees leaked secret company info to ChatGPT while submitting prompts to the AI chatbot, according to a report from The Economist Korea. After Samsung’s chip-making division gave engineers the green light to use OpenAI’s chatty bot, employees apparently shared sensitive source code and meeting recordings to streamline their work. Now Samsung is reported to be restricting employee use of ChatGPT and trying to build its own bot.
ChatGPT spills the tea… OpenAI urges users not to share private info with ChatGPT because it could use parts of your prompt input in its responses to others. But recent surveys suggest that ChatGPT is already broadly used by corporate workers. Meanwhile, OpenAI backer Microsoft is incorporating ChatGPT-style tech into its productivity software like Word and Excel. Samsung isn’t the only corporate giant sweating over employee chats with ChatGPT.
Corporate titans like JPMorgan, Verizon, and Goldman Sachs have blocked employee access to ChatGPT, while others like Amazon and Walmart have reportedly warned employees not to share sensitive info with the tool.
Italy has banned ChatGPT, at least for now, saying it improperly collected personal data, and other EU countries are also considering restricting it over data-privacy concerns.
New tools need new rules… Nearly half of companies are scrambling to draft guidelines for ChatGPT, since the haziness around employee use poses a security risk. Meanwhile, the Biden admin is exploring whether rules should be imposed on AI bots, including whether a certification process should be required before release.