After the Ecstasy, the Laundry

Design·April 2, 2026

I had a conversation the other night that stuck with me more than I expected. It reminded me of After the Ecstasy, the Laundry, a book I haven’t picked up in years and am re-reading this week.

I’m not against automation. Getting rid of empty mental calories—spreadsheets, data cleanup, calendar Tetris—is some of the best use of AI I can think of and I’m all for it.

But we’re also very good at convincing ourselves that anything inconvenient is unnecessary. And the line between “low-value task” and “part of being human” is getting blurry fast.

Automating Away Pasta

I was playing billiards with friends the other night, losing badly, when a guy sat down and started pitching the agentic future to the table. He told us he’d already set up agents to do most of his job; he just checks in while prompts run all day.

Then he forecast that this was the wave of the future: “Think about the things you hate most—laundry, trash, groceries. Gone. You’ll never have to think about them again.” He’d definitely had a few too many, and he went on about how great it would be to automate away the time it takes to make fresh pasta, which is where he lost me.

Making fresh pasta is the point. The repetition, the attention, the fact that it takes time. It’s not friction to eliminate—it’s one of the places life actually happens.

I told him I like doing laundry. I genuinely like the reset, and it’s cute when my dog jumps onto a pile of warm socks. I don’t hate taking out the trash—it marks another week going by. And groceries are what nourish us—I think it’s a gift, not a chore, to choose what you’re taking care of yourself with.

The conversation reminded me of the book. Jack Kornfield writes about the moment after a profound spiritual awakening when you return to ordinary life and…you still have to do the laundry. It’s a metaphor, of course, not literally about laundry, but his point is this: you don’t have to do all of this stuff. You get to. The routine isn’t an obstacle to a full life, or something to sweep out of our consciousness. It’s a big part of what a full life is made of.

This Stops Being Philosophical Pretty Quickly

I think in systems and long-term structures, and that habit makes me cautious—sometimes outright wary—of building AI-native products. When you deploy something at scale, you’re not just shipping a feature. You’re normalizing a behavior.

Here’s an example of where designing to avoid harm matters: imagine a small business owner using an AI assistant to handle customer outreach.

They have a voice, a product, a real relationship with their customers. They probably earned the business through a human relationship in the first place; word-of-mouth referrals drive something like 60% of business. But as their agent takes on more, they step back further. The replies sound like them but aren’t. The follow-ups happen without them.

At some point you have to ask: what is your business now? How much of it is differentiated when you’re one step removed, then two, then entirely? Have you just built a slightly more personal version of a dropshipping operation, collecting passive income while the robots handle the humans?

There’s still a product, to be sure. There’s still revenue coming in. But the relationship—the actual human connection that differentiated it in the first place—has been abstracted away. And in a landscape where agentic tooling becomes more democratized, your competitors adopt it, and the customer experience flattens, what helps you stand out? How do you keep highlighting the things that differentiate you and earn new business?

The Risk is Drift

I’ve worked for myself and scaled a sole-prop LLC before—I know the admin is tremendous and the burnout is real when you’re doing everything yourself. But there’s something that gets lost when you put distance between yourself and your customers—the sales conversations, the back-and-forth, the small signals you only pick up when you’re actually in it.

If an agent can represent your business across surfaces—answering questions, managing products, completing transactions—then the question isn’t just “can it functionally do it?” It becomes: how do you design for nuance, and what controls do you set?

Designers Need to Define the Paradigms

I’m cautiously hopeful. I’ve started to see more job descriptions that actually feel thoughtful—they’re asking designers to define interaction paradigms, design for trust and transparency, and shape how people delegate to and collaborate with AI. That’s the right framing!

What we define doesn’t stay contained to one product; it becomes the default behavior millions of people inherit. Responsible product design, at least in this space, means being intentional about that.