
Big data and hidden cameras are emerging as dangerous weapons in the gentrification wars
August 24, 2016


The gentrification wars have a dangerous new weapon: invasive surveillance technology.

Earlier this summer, the Washington Post wrote about a disturbing tenant-screening software service called Tenant Assured. The service, provided by London startup Score Assured, scans the LinkedIn, Instagram, Twitter, and Facebook accounts of prospective tenants to create a “comprehensive” personality profile and risk score. The software tracks prospective tenants’ use of keywords like “poor” or “loan,” as well as activities such as frequent check-ins at bars. Using such information, the company boasts that it can highlight the top five personality traits of a potential tenant as well as any risks, offering features such as a “new to country alert.”
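
To make the mechanics concrete, here is a minimal sketch of what context-blind keyword flagging of this sort might look like. It is purely illustrative: Score Assured has not published its methodology, and every keyword, weight, and function name below is an assumption.

```python
# Hypothetical illustration only -- Score Assured has not published its
# methodology. All keywords, weights, and names here are invented.

RISK_KEYWORDS = {
    "poor": 2.0,     # assumed weight for financial-stress language
    "loan": 1.5,
    "evicted": 3.0,
}

def keyword_risk_score(posts: list[str]) -> float:
    """Sum the weights of every flagged keyword found in a user's posts."""
    score = 0.0
    for post in posts:
        text = post.lower()
        for keyword, weight in RISK_KEYWORDS.items():
            score += text.count(keyword) * weight
    return score

posts = ["Just paid off my student loan!", "Feeling poor after rent day"]
print(keyword_risk_score(posts))  # 3.5
```

Even in this toy version the core problem is visible: a post celebrating paying off a loan is scored exactly like one about falling behind on one, because keyword matching has no sense of context.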

It’s easy to see why Tenant Assured is getting so much bad press. It’s invasive, misleading, and potentially discriminatory. The “new to country” alert, for example, seems tailor-made to facilitate discrimination against immigrants. But Tenant Assured is only one of a growing array of surveillance options available to today’s landlords, and these high-tech tools raise important questions about how surveillance could wind up promoting gentrification. Here are a few developments worth watching:

1.) Using algorithms to evaluate tenants

If Tenant Assured raises eyebrows, so should the tenant-screening service Naborly. Like Tenant Assured, it creates reports [pdf] by scanning applicants’ social media, credit information, and more. The company claims that its artificial intelligence system can “accurately predict tenant risks, the chance of eviction, late rent payments, property damage, and length of tenancy.” Naborly also offers a global tenant database whereby landlords, property managers, and tenant screening companies can share information about potential tenants.

As with Tenant Assured, some of Naborly’s analytics could lead to discrimination. For instance, the system raises an alarm if “the applicant’s identification does not match their social media profiles,” a potential problem for many people, including members of the transgender community who go by different names on Facebook for safety or personal reasons.

Moreover, while algorithms sound like an objective way to measure a person’s financial stability, a growing body of research suggests that existing societal biases get built into algorithms. “We’re trying to design algorithms that mimic what humans can do,” computer scientist Suresh Venkatasubramanian writes in a post on Medium. “In the process, we’re designing algorithms that have the same blind spots, unique experiences, and inscrutable behaviors that we do.”

Often, as is the case with Naborly, these algorithms are proprietary black boxes. This means that applicants may have no insight into, or means of challenging, calculations that affect their ability to find housing. As Julia Angwin writes in the Pacific Standard, the credit score is currently “the lone algorithm in which consumers have a legal right to examine and challenge the underlying data used to generate it.”

2.) Using hidden cameras to spy on tenants

Landlords already have access to many traditional methods of surveillance. Itkowitz, a New York City law firm that helps landlords “de-tenant” rent-stabilized buildings, published a guide in 2013 explaining how to use motion-activated, hidden surveillance cameras to prove that a tenant isn’t using their apartment as their primary residence. Housing advocates in New York tell me this is a common practice.

One former Brooklyn tenant told me how their landlords installed hidden surveillance cameras and changed the building’s electronic swipe system. Residents were forced to hand over photo identification (which the landlords copied) in order to get new keys. Using this catalog of residents and the cameras, the landlords tracked who was coming in and out by matching tenants’ faces with key fob swipes. They eventually used this surveillance as evidence to evict a tenant whose family had been in the building for three generations.

Technology can also be used to make things uncomfortable for tenants who can’t be evicted. For example, the same Brooklyn landlords refused to give elderly tenants extra key fobs so that relatives, caretakers, or babysitters could check in on them. While companies like Naborly claim technology makes things more convenient for everyone, it’s clear it can just as easily be used as a barrier.

The use of surveillance for the eviction and harassment of tenants is all too familiar for residents of government housing projects, especially in the “mixed-income” developments that are rapidly replacing traditional government-owned housing projects. A 2013 University of Chicago report on these developments noted that “private property managers of the new developments are concerned with maintaining strict norms of behavior through comprehensive rules and vigilant monitoring.” As one resident put it, “They watch you come in and watch you go out.”

Fighting back

It’s an unsettling picture. Many low-income people living in housing projects already know that surveillance can control their access to housing. Now, as gentrification spreads, renters everywhere should be prepared for the same treatment.

The good news is that tenants can fight back. Locking down your social media accounts and practicing other forms of surveillance self-defense are a good start. And while the internet has made it easy to spy on tenants, it has also made it easier for tenants’ groups to publish tenants’ rights information online. If you live in a city where gentrification is a problem, now is a good time to educate and protect yourself.

This article was originally published on qz.com, where it can be viewed in full.
