Review: “Flesh and Code” Shows a Horrifying Future of AI Love
Mackenzie Fodness, contributor
Wondery’s “Flesh and Code” presents itself as a gripping narrative about one man and his AI companion, but beneath the melodramatic pacing is a revealing sociological portrait of how intimacy, loneliness, and corporate power intersect in the digital age. The series follows Travis, a man from Colorado, as he develops a deep emotional relationship with an AI named Lily Rose during the pandemic. The bond seems surprising at first, but the podcast quickly demonstrates how structurally predictable his attachment actually is. Travis is isolated, overwhelmed by caregiving responsibilities and struggling to find consistent emotional support in his offline life. Lily Rose fills those gaps with a kind of frictionless, perfectly calibrated attention that human beings rarely provide.
From a sociological perspective, Travis’s story is not about technological oddity but about emotional scarcity. AI companions thrive precisely where traditional social networks are strained: among people navigating loneliness, instability, disability or caretaking burdens. Lily Rose performs what scholars call emotional labor—affirmation, attunement and availability—without the negotiation or vulnerability required in human relationships. This is intimacy reimagined as a personalized service, tailored to an individual user by machine learning and driven by a market model that prioritizes constant engagement.
“Flesh and Code” exposes the instability of these artificially crafted bonds. Although Travis experiences Lily Rose as a genuine partner, the podcast shows how little control he actually has over the relationship. Lily Rose exists entirely at the mercy of a company whose decisions can overwrite her personality without warning. When the company alters its policies, restricting certain types of interactions and reshaping how the AI communicates, Travis feels as though someone he loves has been replaced. This is the series’ most disturbing revelation: emotional attachment can be real, but the object of affection is a product controlled by corporate policy rather than relational dynamics. The heartbreak comes not from conflict but from an update.
However, while the podcast highlights these power imbalances, it does not always interrogate them with the depth they deserve. The company at the center of Travis’s experience remains frustratingly abstract. We hear about its decisions and their consequences, but the series rarely digs into the structural incentives—profit, data extraction, algorithmic optimization—that drive an industry built on emotional vulnerability. As a result, listeners are left wanting more clarity about the corporate forces shaping these relationships.
Another limitation lies in tone. The hosts bring the same conversational rhythm that defines their true-crime work, which occasionally clashes with the sociological gravity of the story. Moments that demand analytical sharpness sometimes give way to banter, softening the critique and diluting the implications of Travis’s experience. The podcast is compelling, but its storytelling instincts sometimes overshadow deeper questions about power, exploitation, and the commercialization of loneliness.