Emotion, gender prediction tech should be banned from Brazil’s metro: NGO testifies

June 26, 2020



Access Now, a non-profit that defends the digital rights of citizens around the world, has presented an expert opinion in a lawsuit against São Paulo metro operator ViaQuatro over its use of facial categorization software to predict the age, gender, and emotional state of passengers for advertising purposes.

ViaQuatro is a public-private concessionaire responsible for the operation, maintenance, and investments in Line 4 of São Paulo’s metro, which runs through some of the city’s busiest stations.

In 2018, the company installed “smart billboards” in some metro stations, equipped with AdMobilize’s Digital Interactive Doors facial analysis technology, which monitors viewers’ reactions to adverts in order to run targeted advertising campaigns.

AdMobilize is a private advertising platform that uses facial categorization software to scan users’ faces on public transport.

Besides being able to predict a person’s age range and gender, the technology also claims to be able to detect whether a person is feeling happy, unsatisfied, surprised or neutral.

Violation of rights

The first organization to object to the technology was the Brazilian Institute of Consumer Defence (IDEC), which sued ViaQuatro in 2018 (the same year it released the technology) for violating consumer and personal data protection rights. The civil case is still ongoing.

Access Now is the first international third-party organization to join the case, over concerns that the technology could set an unnecessary precedent for the use of biometric technology not just in Brazil, but across Latin America.

Access Now makes the following three arguments:

  1. The defendant misrepresents the personal data processing activities of the Digital Interactive Doors system, making misleading claims about anonymization, personal data, identification, and data storage; the system’s true activity is facial analysis and classification. 
  2. The so-called emotion detection technology used by this system rests on flawed scientific premises. It cannot, in fact, “perceive” or “detect” emotion; at best, it detects facial expressions with mediocre accuracy.
  3. By classifying gender as a male-female binary, the system produces outcomes that are inherently discriminatory against trans and non-binary people.

Speaking to The Sociable, Access Now’s Latin America Policy Associate Veronica Arroyo expanded upon ViaQuatro’s allegedly flawed claims.

The company claims to save data in an anonymous way, Arroyo said. But “how can you have raw data and anonymized data at the same time?” she asked. “Those terms are contradictory.”

ViaQuatro also claims not to store the data obtained by the technology, but, as Arroyo points out, “they must have to store it at some point to add it to the database.”

Emotion and gender prediction technology

When it comes to predicting emotions, there’s no scientific evidence to prove that technology can actually do this accurately, Arroyo pointed out.

“They claim that they can predict people’s emotions…but during the case, they have confessed that they are aware the technology is not 100% accurate,” she claimed, highlighting a “lack of diligence” on ViaQuatro’s behalf.


“They claim that they can predict people’s emotions…but during the case, they have confessed that they are aware the technology is not 100% accurate” — Veronica Arroyo, Access Now.

Finally, for Arroyo, the technology’s perception of gender is problematic.

“Tech thinks gender is a binary concept,” she said. “If you keep with this idea that gender is just binary then you continue the narrative… that transgender people do not belong to society.”

Brazil is the most dangerous country in the world for the trans and transvestite population, according to a 2020 report by the National Association of Transexuals and Transvestites (ANTRA).

Data protection laws in Brazil

Brazil’s data protection law, passed in mid-2018, remains suspended by government decree, and the Brazilian Constitution does not explicitly recognize a right to data protection.

Although there are hopes that the law will finally take effect this August, the current situation leaves Brazilians with no authority to which they can complain about facial categorization technology.

“It’s a very invasive technology” — Veronica Arroyo, Access Now

This also means that, legally, there are no specific rules for ViaQuatro to follow when it comes to using the technology, Arroyo pointed out.

“No one can see what they are doing and inspect them,” she said.

There is no safeguard for the user, who is not asked whether they would like to opt out of being analyzed, she added, which explains why this case against ViaQuatro is being heard in a civil court as opposed to a human rights one.

And although Line 4 of São Paulo’s metro is run privately, the mode of transport itself is public, Arroyo highlighted.

“It’s a very invasive technology,” she added.

Tech optimism in Latin America

According to Arroyo, biometric technology in its many forms is generally well received in Latin America.

This mindset is due to the issue of public security, which is chronic across the region.

“We have experienced so many years of dictatorships and terrorist organizations that any surveillance system is welcome because it is going to protect you,” she said, explaining that the inherent desire for human safety overrides concerns over data protection.

Although countries in Latin America tend to use local companies to develop their biometric technology, Arroyo believes that if US-based big tech companies continue to advocate for more public debate around the regulation of these technologies, then this could affect Latin America.

Before this can happen, however, she stresses the need for data protection laws to be enforced in countries such as Brazil, in order to create a legal framework to hold companies like ViaQuatro accountable for the way in which they use this technology.
