George got the telescreens and cameras and the stench of omnipresent surveillance right, but he was writing in the age of microfilm and 3x5 index cards. Data storage was prodigiously expensive and mass communication networks were centralized and costly to run — it wasn't practical for amateurs to set up a decentralized, end-to-end encrypted shadow network tunnelling over the public phone system, or to run private anonymous blogs in the classified columns of newspapers.
Instead, we see soldiers and machine-guns and refugees, and the prospect of inevitable border wars and genocides between the three giant power blocs. What we have today is Orwell's vision disrupted by a torrent of cheap data storage.
Circa 1972-73, total US manufacturing volume of online computer storage — hard drives and RAM and core memory, but not tape — amounted to some 100Gb/year.
He was also writing in the age of mass-mobilization of labour and intercontinental warfare.
Limned in the backdrop to Nineteen Eighty-Four is a world where atom bombs have been used in warfare and are no longer used by the great powers, by tacit agreement.
It turns out that facial recognition neural networks can be trained to accurately recognize pain!
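The underlying technique is ordinary supervised classification: extract facial features, label examples "pain"/"no pain", fit a model. As a minimal sketch — with entirely synthetic, hypothetical feature vectors standing in for the facial-action-unit intensities a real system would extract from images, and a toy logistic-regression classifier standing in for a deep network:

```python
# Toy sketch only: real pain-recognition systems train deep CNNs on face
# images; here synthetic 3-number "feature vectors" (hypothetical brow,
# eye, and lip intensities) and plain logistic regression stand in.
import math
import random

random.seed(0)

def make_sample(pain):
    # Toy assumption: pain faces score higher on all three features.
    base = 0.7 if pain else 0.2
    return ([min(1.0, max(0.0, base + random.gauss(0, 0.1)))
             for _ in range(3)], pain)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, b, x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def train(samples, epochs=200, lr=0.5):
    # Plain stochastic gradient descent on the logistic loss.
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = predict(w, b, x) - (1.0 if y else 0.0)
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

train_set = [make_sample(i % 2 == 0) for i in range(200)]
w, b = train(train_set)

test_set = [make_sample(i % 2 == 0) for i in range(100)]
correct = sum((predict(w, b, x) > 0.5) == y for x, y in test_set)
print(f"toy accuracy: {correct}/100")
```

On cleanly separated synthetic data like this the classifier is near-perfect; the point is only that the training loop is generic — swap the labels and the same machinery recognizes anything a regime cares to label.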
The researchers were doubtless thinking of clinical medical applications — doctors are bad at objectively evaluating patients' expressions of pain, and patients often don't self-evaluate effectively — but just think how much use this technology might be to a regime bent on using torture as a tool of social repression (like, oh, Egypt or Syria today).

A paper to be presented at the IEEE International Conference on Computer Vision Workshops (ICCVW) introduces a deep-learning algorithm that can identify an individual even when part of their face is obscured. The system was able to correctly identify a person concealed by a scarf 67 percent of the time against a "complex" background.

And some of the potential applications of neural-network-driven deep learning and machine vision are really hair-raising. We've all seen video of mass demonstrations over the past year. Here's a racist hand dryer — its proximity sensor simply doesn't work on dark skin!