To me, it’s because that’s the day our relationship with technology changed forever. Before ChatGPT we had a one-way relationship with technology. We sat at a computer, asked Google a question and waited for it to return 100 million results in 0.07 seconds. Then it was up to us to sort through the links to determine which one best met our needs. Yes, we could do follow-up queries, but for the most part the technology’s job was done.
Since then, every new wave of AI has reshaped that relationship further — and the latest shift, AI browsers, might be the most consequential yet.
But with ChatGPT and all the LLMs that have followed, the relationship is fundamentally different. The interaction starts with a query that’s often similar to a traditional search, but then it changes. Rather than an endless list of links, the LLM engages with the user in a conversation about their needs and how it can help.
Used properly (and a lot of people still use LLMs like search engines on steroids), ChatGPT and other tools like it become partners to and extensions of the user. They engage you in conversation, push on your thinking, take you in directions you didn’t expect. Despite what you may have heard, LLMs are creative, thoughtful and, in many ways, human-seeming.
In short, ChatGPT created two-way relationships between humans and technology in a way that had never been done before.
I gave Atlas a spin last week and was pretty impressed by its capabilities. But I didn’t know what was happening under the hood of these browsers until I read a fantastic deep-dive post by John Munsell, which was spurred by an equally deep dive by Chris Penn.
As Chris Penn noted, using these browsers is like doing the machine’s work for it — feeding them data we don’t even realize we’re giving away. “Like the humans in the Matrix,” he wrote, “we are transmitting enormous quantities of data that AI companies themselves might not be able to get access to because of things like paywalls or logins.”
Had I not read John’s and Chris’s posts I would have remained in the dark about how these browsers work. I would have assumed that they were just browsers with AI – making my web experience more interesting and efficient but ultimately not all that different from a typical browsing experience.
Much the same way we saw ChatGPT as “just another search engine” on Nov. 30, 2022.
But we’re finding that AI is requiring us to think differently.
Once again, our relationship with technology isn’t what we think it is. We believe we’re the users — but we’re also the data.
If I’m using Atlas and I have open tabs with private personal, financial and professional information, the browser may be able to see that data and send it to OpenAI’s servers without my knowing it’s happening – even if those tabs aren’t visible on my screen.
If, as happened in Chris Penn’s experiment with Gemini, I’m interacting with a LinkedIn post, I may be unwittingly sharing with Google the contact information of everyone who interacted with that post.
By using Atlas or Comet or Gemini, you’re essentially consenting on behalf of people you may not know – giving these powerful AI browsers access to information those people have no idea they’re sharing.
Companies’ data privacy practices and policies can’t reasonably keep up with the pace of change that AI brings, which means responsibility for our data and others’ belongs squarely with us now.
The relationship that began as command and response has become something closer to cohabitation. And like any relationship, trust depends on awareness and respect.
I’m not trying to be the boogeyman; Halloween is over. But I am suggesting that the onus is on us, when using these new AI browsers, to know what’s happening with our data – and to take steps to protect other people’s data the way we would protect our own.
As ChatGPT turns three, it’s worth remembering what that birthday really represents. We once asked technology to serve us. Now, it’s learning from us — absorbing our thinking, our language, our values. The question isn’t just what role technology will play in our lives, but what role we’ll play in teaching it to reflect the best of who we are.
If we stay awake to that responsibility, this next chapter of AI might be the one where both humans and machines evolve — together — toward something better.