Facial Recognition Scans Are Becoming the New Normal for Rental Tours
#Privacy



Landlords are increasingly requiring 3D facial scans for rental property tours, raising privacy concerns and potential discrimination issues.

Moving to a new home has always been stressful, but renters now face an additional hurdle: submitting to facial recognition scans just to tour a property. What started as a pandemic workaround has become standard practice for many landlords, with companies like Rently and Invitation Homes requiring "live selfies," wrap-around 3D facial recognition scans, before allowing self-guided tours.

The technology promises convenience for both sides. Landlords save time by eliminating phone calls, emails, and guided walkthroughs. Renters can tour properties on their own schedule without coordinating with property managers. But this shift comes with significant privacy implications that many renters may not fully understand.

How the Technology Works

When you apply to tour a rental property, you'll typically be asked to upload a photo of your government ID and then enable your phone's front camera. The software guides you through turning your head left and right to capture a 3D scan of your face. Some platforms require this verification upfront, while others only ask for it when you arrive at the property. Companies like Rently even require both - a pre-tour scan and another verification just before entry.
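To make that flow concrete, here is a rough sketch of what a browser-based check along these lines might look like. This is illustrative only: the endpoint URL, field names, and frame counts are invented for the example and are not the actual API of Rently, Onfido, or any other vendor named in this article.

```typescript
// Hypothetical sketch of a self-guided-tour identity check.
// Real vendors' APIs and parameters will differ.

// Capture a short burst of frames from the front camera while the user
// turns their head, so the server can build a wrap-around view of the face.
async function captureFaceFrames(frameCount = 10, intervalMs = 300): Promise<Blob[]> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "user" }, // front-facing camera
  });
  const video = document.createElement("video");
  video.srcObject = stream;
  await video.play();

  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  const ctx = canvas.getContext("2d")!;

  const frames: Blob[] = [];
  for (let i = 0; i < frameCount; i++) {
    ctx.drawImage(video, 0, 0);
    const frame = await new Promise<Blob>((resolve) =>
      canvas.toBlob((b) => resolve(b!), "image/jpeg", 0.8)
    );
    frames.push(frame);
    // Pause between captures while the user turns their head left and right.
    await new Promise((r) => setTimeout(r, intervalMs));
  }

  stream.getTracks().forEach((t) => t.stop()); // release the camera
  return frames;
}

// Submit the government-ID photo plus the face frames to a hypothetical
// verification endpoint, which compares the scan against the ID photo.
async function submitVerification(idPhoto: File, frames: Blob[]): Promise<boolean> {
  const form = new FormData();
  form.append("id_photo", idPhoto);
  frames.forEach((f, i) => form.append(`face_frame_${i}`, f, `frame_${i}.jpg`));

  const response = await fetch("https://verify.example.com/api/tours/identity-check", {
    method: "POST",
    body: form,
  });
  const result: { verified: boolean } = await response.json();
  return result.verified;
}
```

The point of the sketch is that everything happens client-side in seconds and the raw frames leave the renter's device; where they go from there depends entirely on the verification vendor's retention and sharing policies, discussed below.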

None of the major rental platforms allow renters to opt out of facial recognition if they want to take a self-guided tour. The process happens within seconds, or at least it's supposed to. But the technology isn't perfect.

The Discrimination Problem

Facial recognition systems have documented accuracy issues, particularly for people with darker skin tones. The ACLU's Jay Stanley notes that these systems are "notoriously poor" when scanning people of color, which could lead to false negatives: legitimate renters failing to match their own IDs and being denied access to properties.

There's also the risk of mistaken identity. Stanley points to cases where someone has been incorrectly matched with a sex offender record due to facial recognition errors. These mistakes can have serious consequences when they occur during tenant screening processes.

Where Your Data Goes

The companies collecting these scans have varying data retention policies. Onfido says it keeps facial scans for up to 360 days before deletion, though it typically deletes them immediately after verification. Plaid retains data "no longer than necessary" and may share it with business partners or law enforcement, or transfer it if the company is sold. Stripe keeps biometrics for one year.

Most of these companies don't specialize in rental services - they're identity verification startups or financial technology firms whose services happen to be integrated into rental websites. This means your facial scan might be stored by a company you've never heard of, with privacy policies that allow sharing with third parties or law enforcement.

Who Gets Left Behind?

About 10 percent of Americans don't own smartphones, and roughly 20 percent of U.S. households lack internet connectivity. The elderly, those in rural areas with poor connectivity, and people without technical know-how may find themselves unable to tour properties that require facial recognition. This creates a new barrier to housing access that disproportionately affects already vulnerable populations.

The Bigger Picture

As rental markets tighten and application fraud becomes more common, landlords are turning to technology for solutions. But the convenience for property managers comes at the cost of renter privacy and, potentially, fair housing access. The question remains: at what point does the burden shifted onto renters outweigh the marginal reduction in risk for landlords?

For now, renters who want to tour properties independently have little choice but to submit to these scans. The technology is becoming so normalized that many renters don't think twice about it - but privacy advocates warn that this normalization could have long-term consequences for housing access and personal privacy.


The Markup, which conducted this investigation, uses data analysis and software engineering to examine how technology impacts society. Their reporting reveals how quickly facial recognition has moved from a pandemic workaround to a standard requirement, often without renters fully understanding what they're agreeing to or where their biometric data will end up.
