
Dispatch No. 9 - When the Machine Teaches Us to Talk

Algorithms now shape not just what we see, but how we speak and make sense of things. When every tool pushes us toward the same phrasing or tone, we lose the variation that helps ideas connect.
đź“§
Note: Dispatch now sends from moira@dispatch.mosaic-ux.com. If the newsletter landed outside your main inbox, move it there or add me to your contacts to keep getting it each Thursday.

Studio notes

This week I moved the Dispatch newsletter and blog onto my main Mosaic UX domain for a more cohesive experience, and to better connect the consulting and editorial sides of my work. It’s been a while since I’ve messed with the nuts and bolts of digital marketing, and... it’s as tedious as ever! Every platform has its own weird logic and incentives. But the tinkering can be fun.

As for my writing... I’m still figuring out what belongs where, and what resonates. Sharing online can feel like speaking through static at first. So while I stay the course, I’m leaning on other forms of feedback that invite perspective and human input: pitching to editors, attending panels, bending a friendly ear.


Field note

In a recent essay, Andrew Hogan at Figma wrote that AI’s real value may be proven in the app layer: shopping on Etsy through ChatGPT, planning trips with an AI copilot, or automating workflows inside tools like Notion.

Figma’s data supports that view: twice as many Figma users are building agentic products as last year. And when I flew into San Francisco a few weeks ago, every billboard I passed on the way into the city was selling a niche agent.

Text generation is still the most common AI use case, but agentic AI—tools that can complete multi-step processes—is the fastest-growing category.

UX is necessarily moving closer to systems design, concerned with how intelligence flows through a product and where human judgment stays involved. Beyond styling interfaces, the focus is on setting the rules of engagement between people, products, and the logic that sits between them.

Human feedback is the anchor point that will keep the system honest.


Elsewhere

How much of what I’m seeing and saying is mine?

In 2016, MIT researchers built FlipFeed, a Chrome extension that swapped your Twitter feed with that of someone whose politics were completely different. For a few minutes, your timeline filled with their sources, tone, and version of reality: a small experiment in perspective that showed how easily an information feed becomes insulation. Around the same time, The Wall Street Journal’s “Blue Feed, Red Feed” made that polarization visible, contrasting how Facebook users across party lines saw entirely different worlds.

The Wall Street Journal’s “Blue Feed, Red Feed” (2016) visualized how Facebook users with opposing political leanings experienced entirely different versions of the same reality.

The patterns haven’t disappeared; they’ve just gone quiet. A recent analysis of YouTube lectures shows a rise in the spoken use of words favored by ChatGPT, even in academic settings.

Ask not how the tools echo us, but how we echo the tools.

Our patterns of understanding are being shaped by systems that reward sameness. The messages that matter rarely come from one voice or one format—they have to be said in different ways, by different people, before they really land. When every tool pushes us toward the same phrasing, tone, or take, we lose the variation that helps ideas connect. Reason enough to keep creating.


Thanks for reading. See you next week.

- MG