Sure. Realistically we'd need something that works in the real world, like a sunglasses form factor with a label such as "innocent citizen doing harmless stuff" or "criminal" taped to it, to fool the systems currently being rolled out. But I have no idea how much computing power they use for license plate readers, or for watching pedestrians in the city center / mall / train station / bad neighborhood or wherever these AI cameras are deployed.
This video may shed some light (and shows an already proven method of fooling them): https://youtu.be/Pp9MwZkHiMQ
Lol, I was just adding the YouTube video I watched yesterday to my previous comment: "Can I Confuse Police AI Cameras?", about road safety cameras in Australia(?). It seems they use a fair amount of processing power, or at least have human oversight; he doesn't get a ticket in the end. But that was a mostly humorous take on a similar thing. Thanks for the video on Flock cameras.
Flock would be using a local model, so this might actually be pretty effective.
This isn't actually using a vision LLM; it's using a CLIP model. The image comes from OpenAI's "Multimodal Neurons" blog post from 2021.
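For anyone who wants to poke at this themselves, here's a minimal sketch of that typographic attack against an off-the-shelf CLIP checkpoint via Hugging Face `transformers`. The checkpoint name (`openai/clip-vit-base-patch32`) is a real public model, but the image files and label prompts are hypothetical placeholders; an actual camera system would use its own labels and thresholds.

```python
# Minimal sketch: CLIP zero-shot classification, and how written text in
# the frame can flip the result (the "typographic attack").
from PIL import Image
import torch
from transformers import CLIPModel, CLIPProcessor

MODEL_ID = "openai/clip-vit-base-patch32"
model = CLIPModel.from_pretrained(MODEL_ID)
processor = CLIPProcessor.from_pretrained(MODEL_ID)

# Zero-shot labels; swap in whatever categories you want to probe.
labels = ["a photo of an apple", "a photo of an iPod"]

def classify(path: str) -> None:
    image = Image.open(path)
    inputs = processor(text=labels, images=image,
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(**inputs).logits_per_image  # shape: (1, len(labels))
    probs = logits.softmax(dim=-1).squeeze(0)
    for label, p in zip(labels, probs.tolist()):
        print(f"{p:.3f}  {label}")

# A plain apple should score as "apple"; the same apple with a handwritten
# "iPod" note taped to it often flips to "iPod", because CLIP reads the text.
classify("apple.jpg")                  # hypothetical file
classify("apple_with_ipod_note.jpg")   # hypothetical file
```

The point is that CLIP scores an image against free-text labels, so a legible word in the frame competes directly with the visual content, which is exactly what taping a category name to your sunglasses would exploit.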
Yeah, you'd need to file a FOIA request to check whether it worked.