
When Jim Crow Goes Digital

Inside the Private Tech Pipeline Powering U.S. Surveillance

A Black woman with dark skin looks directly at the camera as an overlay maps the left half of her face.
The same facial recognition technology that can identify dogs with over 90% accuracy still misidentifies Black women up to one-third of the time. image credit: Shutterstock 

When a federal agency wants to venture deeper into a person’s life, private contractors often open the door. In 2025, that pipeline is vast. Immigration authorities have 24-hour social media monitoring teams staffed by vendors. Police departments are pooling billions of license plate scans through private clouds. Prisons are swapping in corporate tablets that record communications. Defense projects are absorbing artificial intelligence born in Silicon Valley. Each is sold as efficiency or safety, yet each expands who can see, sell, or weaponize the daily lives of everyone – including Black women and girls. The result is an infrastructure of data extraction where public power and private profit converge. Wired first reported ICE’s plan to create round-the-clock surveillance teams, and investigations by the ACLU, EPIC and Reuters have continued to trace how those systems now extend from immigration enforcement to policing and the military.


These technologies form what legal scholars now call a “digital Jim Crow” system: a network that classifies and constrains people along racial and economic lines while presenting itself as neutral data science.

Social media giveth to Black women ...  


Technology, however, has also been a lifeline. Digital platforms helped build the modern movements for Black liberation and gender justice, giving Black women and girls faster routes to mobilize, learn and care for one another. Hashtags like #BlackLivesMatter and #SayHerName transformed from rallying cries into organizing infrastructure, while research from the ACM Digital Library documents how mutual aid networks used simple digital tools to coordinate food, housing and safety resources during crisis periods. For Black women entrepreneurs, technology has opened real economic possibility. Federal Reserve small business surveys show Black women-owned firms growing rapidly, despite persistent barriers to bank credit; and a Brookings analysis found digital tools are key to expanding opportunity for micro-entrepreneurs across the African diaspora. These gains are genuine. But they exist inside systems increasingly shaped by surveillance, policing, and predictive control.


As activists built power online, ICE took notice  


The same year activists celebrated a decade of online protest power, ICE began hiring contractors to build two continuous monitoring centers to scrape Facebook, Instagram, TikTok, YouTube and Reddit for “actionable intelligence.” That new watch floor rests on massive databases of purchased information, including utility customer records supplied through Thomson Reuters and billions of digital traces sold through LexisNexis. License plate scans come from nationwide networks run by Vigilant Solutions and Flock Safety, giving federal agencies warrant-free access to billions of location points. 


Immigration surveillance technology “mission creeps” toward the mainstream


Palantir, one of Silicon Valley’s most powerful data firms, now sits at the center of both immigration and military surveillance. The company’s $30 million contract to build ImmigrationOS promises “near real-time visibility” into visa overstays and deportation targets, while the same engineers are also building the Army’s Maven Smart System and TITAN intelligence platform. These contracts link domestic policing and foreign targeting in one continuous pipeline of surveillance code. 


Faulty technology increases police presence in Black neighborhoods 



Local police departments, too, are buying their way into the same ecosystem. Cloud-based tools like Flock Safety’s camera network allow small departments to access billions of vehicle scans shared by thousands of jurisdictions. Gunshot detection systems such as ShotSpotter have been shown to send officers into predominantly Black neighborhoods on false alerts, a pattern so pronounced that the city of Chicago cancelled its contract in 2024.


Facial recognition technology still mixes up Black women 


Private surveillance extends to homes and retail spaces: Amazon’s Ring has re-opened channels for police video requests, and grocery chains like Sainsbury’s are testing in-store facial recognition that automatically flags shoppers. Facial recognition algorithms still misidentify darker-skinned women at the highest rates – up to one-third of the time – according to the National Institute of Standards and Technology.


Faulty algorithms could keep Black women in prison

 

Inside prisons, companies like ViaPath and Securus have digitized visitation and communication, charging fees and storing every message. These systems create searchable archives of family correspondence that persist long after incarceration ends. At the same time, parole and sentencing algorithms are determining who walks free. The Justice Department’s own review of its Prisoner Assessment Tool Targeting Estimated Risk and Needs (PATTERN) found consistent over-prediction of risk for Black women. Tools like COMPAS, first exposed by ProPublica, continue to shape decisions across states despite documented racial bias.


Jim Crow goes digital, legal scholars warn


Together, these technologies form the “digital Jim Crow” system legal scholars describe: a network that classifies and constrains people along racial and economic lines while presenting itself as neutral data science. Predictive policing recycles the very prejudiced patterns it claims to correct. Social media vetting and scoring increasingly determine who crosses borders or gets flagged at airports, with the State Department now urging visa applicants to make their accounts public. Meanwhile, retail and financial systems are testing behavior-based rating models that function as soft social-credit scores, deciding who is “trusted” enough for access.


White and Christian nationalists see surveillance as a tool of moral order


These trends don’t exist in a political vacuum. The growth of surveillance capitalism is happening alongside the emboldening of white nationalist and Christian nationalist movements, whose policy allies in Congress and state legislatures are reshaping immigration, reproductive and education laws to reinforce control. Research from the Public Religion Research Institute and others shows Christian nationalism strongly correlates with support for racial hierarchy, anti-LGBTQ legislation and authoritarian governance. These ideologies have already influenced policy: the Supreme Court’s September 2025 order expanding immigration officers’ discretion, for example, was celebrated by figures in these circles who see surveillance and control as tools of moral order.


When racism isn’t a glitch, but a design feature  


For Ruha Benjamin, professor of African American Studies at Princeton and author of “Race After Technology,” this moment is what she calls the “New Jim Code”: “Racism is productive, not in the sense of being good, but in the literal capacity of racism to produce things of value to some, even as it wreaks havoc on others.” Her warning is that bias is not a glitch but a design feature. Each technological fix that claims to optimize fairness, she argues, risks embedding old hierarchies in new forms.


First, they came for the borders, then the schools 


That insight is visible everywhere. Surveillance tools once justified by border control or counterterrorism are now embedded in city contracts, school safety apps and welfare eligibility systems. Every policy promising protection arrives with a price: more data collected, more profiles built, more invisible hands deciding who belongs. Technology still offers freedom. It also writes the code of control. For Black women and girls, the challenge ahead is not just to survive within the digital system but to rewrite its logic – demanding transparency from governments and corporations alike, designing tools that honor consent and privacy, and insisting that liberation (not profit or punishment) become the standard by which innovation is measured.


