
How OpenAI’s ‘Sign in With ChatGPT’ Could Expose User Data Across Apps, Raising Compliance Issues
Senior Research Associate Dona Mathew was quoted in a Medianama article examining the compliance and privacy issues that arise as OpenAI explores ways for users to sign in to third-party apps using their ChatGPT accounts.
Dona shared that beyond explicit data sharing, ChatGPT’s inferences raise separate concerns about transparency and data flow across platforms.
There are risks around how data will be shared with third parties once such an integration happens. Some users have reported the system picking up personal information like location, even when it wasn’t explicitly shared in conversations. What happens to these covert data collection practices once data starts flowing between ChatGPT, e-commerce platforms, or messaging apps, especially given how data-hungry generative AI models are? It’s important to consider the cumulative impact of ChatGPT’s ubiquity on user privacy.
She also raised concerns about the infrastructure needed to support these integrations.
Data centres, their energy and water usage, and carbon emissions are already central to discussions around AI’s societal impacts.
Why This Matters
Platform design, data handling practices, and supporting infrastructure shape how users and developers engage with tools like “Sign in with ChatGPT,” and these factors directly influence privacy, consent, and accountability.
Framing this as a user decision ignores the broader power dynamics.
There’s definitely a lot of tension here when it comes to user agency in these matters. On the face of it, it just seems like it is ultimately up to the user to decide if they want to ‘sign in with ChatGPT’. But I think it’s more complex than that. It’s putting the weight of that decision on individuals, while offering incentives like ease and time-saving, even as large tech companies pursue environmentally destructive material infrastructures to keep these digital technologies running.
She added that consent mechanisms often fall short.
We’ve seen how cookies and consent checklists have evolved. Those mechanisms are not entirely effective. And given the pace of tech developments, legal frameworks end up having to play catch-up. Given the human and environmental costs of these systems taken cumulatively, we need more collective thinking, and to also lean on research that scholars across disciplines are already doing on frameworks for transcending the current power dynamics of our digital worlds.