The Unsettling Rise of AI Real-Estate Slop
At first, the idea of using AI to create real-estate-listing pictures seemed like a decent proposition to Kati Spaniak, an Illinois-based agent. Like anyone who works on commission, real-estate agents are under tremendous pressure to reduce overhead costs, and a tool that produces images of a furnished home—without an agent having to actually furnish it—could save thousands of dollars. More and more brokers seem to have the same idea: A recent survey of Realtors found that nearly 70 percent of the participants had used AI.
Spaniak thought she had the ideal candidate for trying out the tech: a house in a suburb north of Chicago that had tremendous appeal on paper but looked terrible in photos when it was empty. “The house really needed quite a bit of work,” she told me. So she ordered some “virtually staged” photos that used AI to add furniture, wall hangings, and stacks of coffee-table books. But when potential buyers began showing up, Spaniak noticed a problem. Visitors seemed disappointed, even disoriented. “They don’t even really recognize why they’re upset,” she said. “They just feel let down.”
For homeseekers, the rise of the AI-assisted listing is not necessarily catastrophic. Fake imagery in home sales is like heavily edited photos on a dating profile—people are going to realize they’ve been fooled as soon as they walk in the door. And a level of manipulation has long been baked into real estate: wide-angle lenses to make spaces look bigger, aerosol sprays that smell like freshly baked cookies to suggest the presence of cozy homemakers, half-filled closets to imply a surplus of storage space.
[Read: The problem with using AI in your personal life]
But every successful broker knows that a sale isn’t made just on facts such as square footage and the number of bedrooms—it’s made on feelings, both explicit and latent, like FOMO (“This one will go fast”) or security (“You wouldn’t even need to lock your doors”). Vacant homes are carefully staged to help the unimaginative project themselves into a role: homeowner with a bungalow full of kids on a cul-de-sac, urban sophisticate in a downtown loft. Like a cherished item of clothing, the right home can allow a buyer to feel like the person they want to be. That’s why many tend not to object to an agent’s subtle manipulations; they get that they’re being nudged toward something they already desire. “Your whole goal when you’re selling a house is to get people in the door feeling emotional,” Spaniak said, “like they’re going to raise their families there.”
From this perspective, the unnameable distress caused by home-listing images fabricated in a data center is likely less about the superficial concern over misleading photos and more about something else: the psychological function of a home, and the dreams that buyers carry with them when they go out to find one.
The most flagrant AI real-estate imagery—hallucinated trees, staircases that don’t seem to go anywhere—can potentially run afoul of laws against false advertising, which forbid concealing substantial defects, such as, say, a crack in the house’s foundation. But AI photos fall along a spectrum of realism: Some are cartoonishly fake, whereas others can be nearly lifelike; some are labeled as using virtual enhancements, and others aren’t. In the listing that Spaniak showed me, furniture seems to hover over the floor without quite touching it, and cloth drapes as if it’s immune to gravity; still, a casual browser looking on their phone may not spot the nuances. Homebuyers tend to be more sensitive than the law, which remains largely unsettled on the issue. Even in cases where use of AI photos is currently legal, many people seem to detest the results.
Many of those who revile AI-generated images can’t quite put a finger on what it is they don’t like. When an AI-inflected mural went up in London last year, Brits from across the political spectrum protested, even as some admitted they didn’t know what, exactly, they were protesting. Psychologists have observed that AI images of humans fall into the “uncanny valley”—a term that describes how almost-but-not-quite-realistic images of humans are far more unsettling than, say, a drawing of Charlie Brown. A study by researchers at Indiana University and the University of Duisburg-Essen found that people are also creeped out by AI images of food. AI real-estate images seem to be just as unsettling. On Reddit, when a woman posted AI-created photos of a home for sale, one commenter said, “It looks like they built it in The Sims.”
[Read: How private equity is changing housing]
The term uncanny valley was coined by the Japanese roboticist Masahiro Mori in 1970, though the concept of the “uncanny” was popularized a century ago by Freud. Uncanny is a loose translation of Freud’s term unheimlich, which, incidentally, literally translates to “un-homely”: the opposite of evoking a feeling of comfort, security, and safety. In his 1919 essay on the topic, Freud includes examples such as the mysterious recurrence of a number in your life, and getting lost and somehow ending up back in the same place repeatedly—experiences that make one wonder whether the everyday might be subject to forces that elude apprehension or control.
As Spaniak learned, issues arise when you try to sell a living space by using technology that seems to have a knack for rattling people. A home really is a luxury good, in the sense that its value is based mostly on intangibles; the pride and coziness one feels in a home might have little to do with the shelter provided by the roof and four walls, any more than people wear Jordans for ankle support. For many people, home is, in the words of the architectural historian Paul Oliver, “the theatre of our lives”—and the marketing and imagery that agents use set a foundation for how people live once inside. As in any staged drama, there’s an element of pretend here, though it’s less make-believe than aspiration.
But aspirations are fragile. Psychologists who study the nature of ambition have consistently found that aspirations are motivating in proportion to their attainability. AI listing photos risk setting people up for disappointment by selling them on a dream of home that, by definition, they can never attain. A similar logic is illustrated in discussions about the danger of unrealistic beauty standards; once human desire has been calibrated to what doesn’t exist, what does exist can only disappoint. After all, although real-estate listings have always been somewhat aspirational, even the most heavily manipulated photos still reflect a grain of truth; the apartment really does look like that at the golden hour, if only for 20 minutes a day, and your home could be as sophisticated as the professional stager made it look, if you found a spare $180,000 lying around to blow on Danish furniture and some framed Twombly prints. You won’t, but you could.
What might be taboo to admit in America, where individual accomplishment has long been exalted, is that many lofty ambitions—striking it rich, snagging the ideal partner, living in the picture-perfect home—are more useful as possibility than actuality. People tend to embrace the idealized versions of themselves, their homes, or their lives as the bounds of their potential, not as its median. We don’t necessarily need to achieve these things; we only need to feel that we could. Your kitchen counter might be covered with half-full LaCroix cans, and you might Google “horizontal cracks basement wall bulging” every time you do laundry, but the thing that makes it all bearable, even heimlich, is that you know the house could look like you dreamed it would when you signed that lease or mortgage. Take that hope away and a home becomes just a house: a big box for storing one’s socks and USB cords.
[Read: How wanting leads to less satisfaction]
Many real-estate agents, to their credit, seem to instinctively understand this challenge. Though AI-generated listing photos have gained some traction, the agents I spoke with think the practice is unlikely to be adopted on a wider scale. After her experience with the tech, Spaniak now recommends that sellers use real-life staging and professional photography. Only amateurs, another agent told me, would try to cut corners by using AI photos. If some people in the real-estate industry insist on shoehorning the tech into listings, it will likely make the process of buying or selling less efficient and less profitable, Ayelet Fishbach, a professor of behavioral science and marketing at the University of Chicago, told me: “Both buyers and sellers lose.”
If AI listing photos ultimately flop, it might partly be a matter of bad timing. The philosopher Ernst Bloch once observed that the uncanny had its pleasures, but entertainment such as creepy stories was best enjoyed in “too cozy” conditions, when one was “personally secure.” People have a higher tolerance, even a craving, for the disconcerting when they are comfortable. If AI photos had hit the market during the socially placid and economically flush ’90s, they might have exuded a more utopian aura, more Jetsons than Terminator.
Instead, they’re spreading at a time when many Americans are financially strained, alienated, and pessimistic about the future. Ironically, a portion of these contemporary anxieties stem from the spread of AI itself, including the threats it may pose to the next generation’s capacity for critical thinking and wide swaths of the white-collar economy. Under these conditions, maybe it’s inevitable that AI real-estate photos would seem more like ominous transmissions from a post-human future, and less like a harmless way for an agent to save a buck.