Water Machine Control
- 1. Algorithms are biased. Machine Bias at Propublica
- 2. Bearded hipster group says police mistook them for Islamic State terrorists
- 3. Minority Report-style AI learns to predict if people are criminals from their facial features
For a workshop about technology in public space at Etopia Art Center (Zaragoza), I proposed building a machine capable of identifying terrorists and shooting them with a water gun.
The AI algorithm is secret, and it shot many people when the prototype was installed at the Asalto Festival of urban art.
Tech description: An IP camera is connected to a Raspberry Pi running an openFrameworks program that detects faces. Each detected face image is then sent to a cloud API to identify "terrorist face features".
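The control loop described above (detect a face on the Pi, score it in the cloud, fire the water gun if the score is high enough) can be sketched as follows. This is a minimal illustrative sketch, not the actual implementation: the actual classifier is secret, so `detect_faces`, `classify_face`, `fire_water_gun`, and the threshold below are all hypothetical stand-ins.

```python
import random

FIRE_THRESHOLD = 0.5  # hypothetical score above which the water gun fires


def detect_faces(frame):
    """Stand-in for the openFrameworks face detector on the Raspberry Pi.
    Returns a list of cropped face images (here, opaque placeholders)."""
    return [f"face_{i}" for i in range(2)]


def classify_face(face_img):
    """Stand-in for the secret cloud API that scores 'terrorist face
    features'. Here it just returns a random score in [0, 1]."""
    return random.random()


def fire_water_gun(face_img):
    """Stand-in for the hardware trigger that fires the water gun."""
    print(f"SQUIRT at {face_img}")


def process_frame(frame):
    """One pass of the control loop: detect, classify, maybe fire."""
    fired = []
    for face in detect_faces(frame):
        if classify_face(face) >= FIRE_THRESHOLD:
            fire_water_gun(face)
            fired.append(face)
    return fired
```

The arbitrariness of the threshold and the opacity of `classify_face` are, of course, the point of the piece: the machine fires on whoever an unaccountable score says it should.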