Bubbles
1. The days of human intellectual contribution are numbered due to AI.
2. AI labs claiming to have solved software engineering are still hiring software engineers.

Who is right here? Can both of these facts be true at the same time?

Pamela Samuelson put up this lovely graphic on slide three of her invited talk at ICML 2025 (a talk I didn't skip, despite my proclamation that I usually do). Like every human, she is many things, but for the sake of this post, let's reduce her to merely an internationally known copyright lawyer and influential legal scholar. She told ICML attendees that people out there don't actually all like generative AI: there are lawsuits, creatives are figuratively taking to the barricades, there is outrage. I liked the point she made, but I thought surely everyone knows this already, right? Turns out, no. Not everyone is aware of this! Bubbles are real and a lot harder to burst than their pathetic name suggests.
So what? Success is generally a recipe along the lines of: the agency to take actions that are effective at achieving your goals. Setting agency and action aside, how do you figure out which actions are effective? The answer lies in accurate world models. And if you are stuck in a bubble, your chances of accurately understanding the world are dramatically lower.
Don't shield yourself from information: I follow people I have tremendous disagreements with. What they say sometimes annoys me, but I value that they share their viewpoint, and after being exposed to their perspective, I am a more grounded and better informed person than before. That doesn't mean you should take what everyone on the internet says seriously, but I believe there is a benefit to being aware of voices across the spectrum.
Empathy: I don't easily disregard what other people say. Sure, I often conclude that they are misinformed, misguided, acting in bad faith, or whatever. But I try to take a moment to empathize with them: where are they coming from? What experiences might have shaped their feelings? What are they actually trying to say? That last question has some overlap with nonviolent communication, where you, as the recipient, try to translate in your head what the other person might actually mean (which is often not what they literally say).

TL;DR: Seek out contrasting information and keep thinking. I read a thread from Jascha Sohl-Dickstein (Anthropic) arguing that the days of human intellectual contribution are numbered due to exponentially increasing AI capabilities, and then I read from Jeffrey Funk that the AI labs claiming to have solved software engineering are still hiring tons of software engineers, and that hiring statistics don't show an impact of AI. Who is right here? Can both of these facts be true at the same time? Working with conflicting information is mentally challenging in the sense that it requires me to decide what I believe and why. It keeps me thinking.