
New York’s new camera map shows more surveillance in Black and brown neighborhoods

Areas of New York City with higher rates of police stops also have denser coverage by closed-circuit television (CCTV) cameras, according to a new Amnesty International report from its NYC Decode Surveillance Project.

Beginning in April 2021, more than 7,000 volunteers surveyed the streets of New York through Google Street View to document camera locations; volunteers assessed 45,000 intersections three times and found more than 25,500 cameras. The report estimates that about 3,300 of these cameras are publicly owned and used by government and law enforcement. Working with BetaNYC, a community-based technology organization, and contracted data scientists, the project used this data to create a map showing the coordinates of all 25,500 cameras.

Analysis of this data showed that in the Bronx, Brooklyn, and Queens, there were more publicly owned cameras in census tracts with higher concentrations of people of color.

To find out how the camera network relates to police stops, Amnesty researchers and partner data scientists calculated the 2019 rate of stops per 1,000 residents in each census tract (a geographic area smaller than a zip code), using incident-level address data from the NYPD. The stop-and-frisk policy allows officers to stop and search people on the basis of “reasonable suspicion.” NYPD data show that there have been more than 5 million stops in New York since 2002, and most of the people stopped were people of color. Most of those subjected to these stops were innocent, according to the New York Civil Liberties Union.
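The rate calculation described above could be sketched along the following lines in Python; the file names and column names (“stops_2019.csv”, “tract_id”, “population”) are illustrative assumptions, not the report’s actual data or code.

```python
# A minimal sketch, assuming hypothetical input files, of a per-tract
# stop rate: recorded stops per 1,000 residents in each census tract.
import pandas as pd

# One row per recorded stop, already geocoded to a census tract (assumed schema).
stops = pd.read_csv("stops_2019.csv")          # columns: stop_id, tract_id
tracts = pd.read_csv("tract_population.csv")   # columns: tract_id, population

# Count stops in each tract, then normalize to a rate per 1,000 residents.
stop_counts = (
    stops.groupby("tract_id").size().rename("stop_count").reset_index()
)
rates = tracts.merge(stop_counts, on="tract_id", how="left").fillna({"stop_count": 0})
rates["stops_per_1000"] = 1000 * rates["stop_count"] / rates["population"]

print(rates.sort_values("stops_per_1000", ascending=False).head())
```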

Each census tract was assigned a “surveillance level” based on the number of publicly owned cameras per 1,000 residents within 200 meters of its boundaries. Tracts with higher rates of stop-and-frisk encounters had higher levels of surveillance. For example, one half-mile route in East Flatbush, Brooklyn, saw six such stops in 2019 and had 60 percent coverage by public cameras.
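One reasonable reading of that metric can be sketched with geopandas: buffer each tract polygon by 200 meters, count the publicly owned cameras that fall inside, and normalize by population. The input files, column names, and the buffering choice are assumptions made for illustration, not Amnesty’s actual methodology code.

```python
# A minimal sketch of a per-tract "surveillance level": publicly owned
# cameras within 200 m of a tract, per 1,000 residents. Inputs are assumed.
import geopandas as gpd

tracts = gpd.read_file("nyc_tracts.geojson")       # assumed: tract_id, population, geometry
cameras = gpd.read_file("public_cameras.geojson")  # assumed: camera_id, point geometry

# Reproject to a metric CRS (UTM zone 18N) so the 200 m buffer is in meters.
tracts = tracts.to_crs(epsg=32618)
cameras = cameras.to_crs(epsg=32618)

# Expand each tract outward by 200 m, then count the cameras inside the buffer.
buffered = tracts[["tract_id", "geometry"]].copy()
buffered["geometry"] = buffered.geometry.buffer(200)
joined = gpd.sjoin(cameras, buffered, predicate="within")
counts = joined.groupby("tract_id").size().rename("camera_count").reset_index()

tracts = tracts.merge(counts, on="tract_id", how="left").fillna({"camera_count": 0})
tracts["cameras_per_1000"] = 1000 * tracts["camera_count"] / tracts["population"]
```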

Experts fear that law enforcement will run facial recognition on footage from these cameras, disproportionately targeting people of color. According to documents obtained through public records requests filed by the Surveillance Technology Oversight Project (STOP), the New York Police Department used facial recognition, including the controversial Clearview AI system, in at least 22,000 cases between 2016 and 2019.

“Our analysis shows that the NYPD’s use of facial recognition technology helps to reinforce discriminatory policing against minority communities in New York,” said Matt Mahmoudi, a researcher at Amnesty International who worked on the report.

The report also examines the potential impact of facial recognition technology on last year’s Black Lives Matter protests by overlaying the surveillance map on march routes. Mahmoudi said he found “almost complete coverage.” Although it is not known exactly how facial recognition was used during the protests, the New York City police have already used it in an investigation of at least one protester.

On August 7, 2020, dozens of New York City police officers, some in riot gear, knocked on the door of Derrick Ingram, a 28-year-old Black Lives Matter activist. Ingram was suspected of assaulting a police officer by shouting into the officer’s ear with a bullhorn during a march. Officers at the scene were seen reviewing an informational lead report from the NYPD’s facial identification unit, which included what appeared to be a photo of Ingram taken from social media. The NYPD later confirmed that it had used facial recognition in its search for him.

The city’s new mayor, Eric Adams, is considering expanding the use of facial recognition technology, even though many US cities have banned it over concerns about accuracy and bias.

Jameson Singer of the Georgetown Law Center on Privacy and Technology says the Amnesty project “gives us an idea of how widespread surveillance is, especially in neighborhoods where most residents are not white, and how many public spaces are captured on footage that police can run through facial recognition.”
