Dispatch No. 9 - When the Machine Teaches Us to Talk
Studio notes
This week I moved the Dispatch newsletter and blog onto my main Mosaic UX domain for a more cohesive experience, and to better connect the consulting and editorial sides of my work. It’s been a while since I’ve messed with the nuts and bolts of digital marketing, and... it’s still as tedious as ever! Every platform has its own weird logic and incentives. But the tinkering can be fun.
As far as my writing... I'm still figuring out what belongs where, and what resonates. Sharing online can feel like speaking through static at first. So while I stay the course, I’m leaning on other forms of feedback: pitching to editors, attending panels, bending a friendly ear—sources that invite perspective and human input.
Field note
In a recent essay, Andrew Hogan at Figma wrote that AI’s real value may be proven in the app layer: shopping on Etsy through ChatGPT, planning trips with an AI copilot, or automating workflows inside tools like Notion.
Figma’s data supports that view: twice as many Figma users are building agentic products as were a year ago. When I flew into San Francisco a few weeks ago, every billboard I passed on the way into the city was selling a niche agent.

UX is necessarily moving closer to systems design, concerned with how intelligence flows through a product and where human judgment stays involved. Beyond styling interfaces, the focus is on setting the rules of engagement between people, products, and the logic that sits between them.
Human feedback is the anchor point that will keep the system honest.
Elsewhere
How much of what I’m seeing and saying is mine?
In 2016, MIT researchers built FlipFeed, a Chrome extension that swapped your Twitter feed with someone whose politics were completely different. For a few minutes, your timeline filled with their sources, tone, and version of reality, a small experiment in perspective that showed how easily a feed can become insulation. Around the same time, The Wall Street Journal’s “Blue Feed, Red Feed” made that polarization visible by contrasting how Facebook users across party lines saw entirely different worlds.

The patterns haven’t disappeared; they’ve just gone quiet. A recent analysis of YouTube lectures shows a rise in the use of words favored by ChatGPT in spoken language, even in academic settings.
Ask not how the tools echo us, but how we echo the tools.
Our patterns of understanding are being shaped by systems that reward sameness. The messages that matter rarely come from one voice or one format—they have to be said in different ways, by different people, before they really land. When every tool pushes us toward the same phrasing, tone, or take, we lose the variation that helps ideas connect. Reason enough to keep creating.
Thanks for reading. See you next week.
- MG