A major stadium in Los Angeles has confirmed that it uses facial recognition technology during sporting and music events to spot known troublemakers.
The Rose Bowl told news site Gizmodo that it used the tech “inside and outside” the stadium gates.
It made the statement after an article in Rolling Stone alleged that Taylor Swift fans were scanned at a pop concert in an effort to spot stalkers.
However, the companies involved have been reluctant to talk.
What happened at the Taylor Swift concert?
According to Rolling Stone, visitors to a Taylor Swift concert at the Rose Bowl were encouraged to visit a pop-up booth playing behind-the-scenes footage of the pop star.
The booth allegedly recorded fans and compared their pictures to a database of Swift’s known stalkers.
The claims were made by the chief technology officer of a company called Oak View Group. He told Rolling Stone he had been invited to see the technology in action.
However, the article did not name the company responsible.
The BBC’s Dave Lee has tried to contact Oak View Group and Taylor Swift’s representatives, but neither has responded to his questions.
But the Rose Bowl has confirmed it does use facial recognition at some events to enhance security.
It says the use of the tech is clearly signposted at events. However, it has not offered any clarity about the alleged Taylor Swift pop-up booth.
Wembley Stadium in the UK, where Taylor Swift performed earlier this year, told the BBC it does not use facial recognition at events.
Has this happened before?
Yes. Police in China say they have apprehended several suspects at pop concerts with the help of facial recognition.
In April, a man who was wanted for “economic crimes” was caught at a Jacky Cheung gig in Nanchang city.
Local media have reported several cases in which people have been arrested in this way.
Olympic organisers say the technology will be used at the Tokyo 2020 games to stop people entering with forged identification.
However, there are privacy and accuracy concerns with large-scale facial recognition systems.
Campaign group Big Brother Watch argues that the use of automatic facial recognition systems contravenes the Human Rights Act.
It found that systems tested by UK police forces had produced a “staggering” number of false matches.
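False matches are an inherent risk in this kind of system: faces are typically reduced to numerical embeddings, and a “match” is simply a similarity score crossing a chosen threshold, so a lower threshold flags more innocent people while a higher one misses genuine targets. The sketch below illustrates the general idea only; the embedding size, threshold and data are hypothetical and do not describe any specific deployed system.

    # Illustrative sketch of a generic watchlist check, not any real deployment.
    # The embedding size, threshold and data below are all hypothetical.
    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Similarity between two face-embedding vectors (1.0 = identical direction)."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def matches_watchlist(probe: np.ndarray, watchlist: list, threshold: float = 0.6) -> bool:
        """Flag a match if the captured face is similar enough to any watchlist entry.

        Lowering the threshold catches more genuine targets but also flags
        more innocent people - the trade-off behind false-match concerns.
        """
        return any(cosine_similarity(probe, entry) >= threshold for entry in watchlist)

    # Stand-in data: a real system would obtain embeddings from a face model
    # run on camera frames, not from random vectors.
    rng = np.random.default_rng(0)
    watchlist = [rng.standard_normal(128) for _ in range(3)]
    probe = rng.standard_normal(128)
    print(matches_watchlist(probe, watchlist))

At the scale of a stadium crowd or a police watchlist, even a small false-match rate translates into many wrongly flagged faces.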
The UK’s Information Commissioner, Elizabeth Denham, says police forces must address concerns over the use of facial recognition systems or they may face legal action.