ChatGPT Pulse: The Worst of Two Worlds
OpenAI's new feature points to a dystopian future where privacy has no value, and where the negative implications of AI chatbots and social media merge | Edition #236
Yesterday, OpenAI launched ChatGPT Pulse, which Sam Altman describes as pointing to what he believes is the future of ChatGPT:
“a shift from being all reactive to being significantly proactive, and extremely personalized.”
The feature is part of the company’s broader strategy of diversifying its GPT-powered AI applications, and it is currently available only to Pro users. Fidji Simo, OpenAI’s CEO of Applications, wrote that:
“AI should do more than just answer questions; it should anticipate your needs and help you reach your goals. That’s what we’re beginning to build, starting with ChatGPT Pulse.”
More specifically, Pulse is likely the beginning of Sam Altman’s plans to build a GPT-powered social network, as reported earlier this year. According to OpenAI's website, ChatGPT Pulse is:
“a new experience where ChatGPT proactively does research to deliver personalized updates based on your chats, feedback, and connected apps like your calendar.”
A more detailed description explains that:
“ChatGPT can now do asynchronous research on your behalf. Each night, it synthesizes information from your memory, chat history, and direct feedback to learn what’s most relevant to you, then delivers personalized, focused updates the next day.”
Below is a screenshot of what these topical visual cards look like, based on a user’s interests:
People have been sharing their experiences with ChatGPT Pulse on X. One user wrote that it focuses more on professional topics (perhaps so as not to spook users at this point); another said it makes him “want to dump even more information and context and app connections into ChatGPT so he can get an even better daily feed”; and a third said that “this is clearly the future of media.”
Beyond press releases and curated case studies, there is no magic here. To provide ‘extremely personalized’ outputs, AI needs… vast quantities of personal data.
As described above, that data will come from ChatGPT's memory and chat history features, as well as external apps you can connect to ChatGPT, such as Gmail and Google Calendar.
As I wrote earlier this year, to protect privacy, lower the risk of sensitive leaks, and reduce the negative effects of excessive personalization, I recommend turning memory and chat history off.
To make features like Pulse work in a “significantly proactive and extremely personalized” way, as Sam Altman plans, users will be prompted to grant ChatGPT access to more personal data.
Additionally, allowing ChatGPT to access external accounts, especially those containing deeply personal information such as Gmail and Google Calendar, dramatically increases privacy and security risks, exposing users to data leaks and adversarial attacks.
Yet, to make Pulse a more ‘helpful personal assistant,’ as OpenAI is likely to frame it, users will be incentivized to share more personal data and to grant ChatGPT access to more external sources containing personal data.
Subscribe to Luiza's Newsletter to keep reading this post and get 7 days of free access to the full post archives.