OpenAI says it may store deleted Operator data for up to 90 days

OpenAI says that it might store chats and associated screenshots from customers who use Operator, the company’s AI “agent” tool, for up to 90 days — even after a user manually deletes them.

OpenAI has a similar deleted data retention policy for ChatGPT, its AI-powered chatbot platform. However, the retention period for ChatGPT is only 30 days, which is 60 days shorter than Operator’s.

OpenAI says its policies around data retention for Operator are designed to combat abuse. “As agents are a relatively new technology, we wanted to make sure our teams have the time to better understand and review potential abuse vectors,” an OpenAI spokesperson told TechCrunch. “This retention period allows us to enhance fraud monitoring and ensure the product remains safe from misuse, while still giving users control over their data.”

OpenAI announced Operator on Thursday and released it in a research preview for subscribers to the company’s $200-per-month ChatGPT Pro plan. Operator is a general-purpose AI agent with a built-in browser that can independently perform certain actions on websites.

OpenAI claims that Operator can automate tasks like booking travel accommodations, making restaurant reservations, and shopping online. There are several task categories users can choose from within the Operator interface, including shopping, delivery, dining, and travel.

Operator captures screenshots of its built-in browser to help it understand how and when to take actions on websites, such as which buttons to click and which forms to complete. To be clear, Operator doesn’t capture screenshots when it gets “stuck” and hands control back to the user, such as when the tool needs a password. OpenAI calls this “take over” mode.

Still, some users may be wary of volunteering screenshots of their online activities to a company that may keep them for upwards of three months. OpenAI notes that, as with ChatGPT, Operator data may be accessed by “a limited number of authorized OpenAI personnel” and “trusted service providers” for purposes like investigating abuse and handling legal matters.

Kyle Wiggers is a senior reporter at TechCrunch with a special interest in artificial intelligence. His writing has appeared in VentureBeat and Digital Trends, as well as a range of gadget blogs including Android Police, Android Authority, Droid-Life, and XDA-Developers. He lives in Brooklyn with his partner, a piano educator, and dabbles in piano himself occasionally, if mostly unsuccessfully.
