Digital Age Violence
The shooter who took her own life after wounding four YouTube employees at the video hosting website’s San Bruno, California, offices on Tuesday has been identified as Nasim Aghdam… long history of animal rights activism… claims to be from Iran…
Terror groups were using popular social networking Web sites like Facebook to recruit, and possibly kidnap, Israeli citizens.
You can run, but you cannot hide ... and if you try, one push of a button will cause a lethal poison to immediately begin flowing through your body. That's the Orwellian future a Saudi inventor was seeking to bring to Germany until that nation's patent office announced last week it was rejecting his request to patent what has been dubbed the "Killer Chip."
Nimzay Aponte met Raymond Dennis via AOL, but ultimately refused to meet him face to face. He responded by stabbing her in broad daylight as she sat with a friend, William Sherief, on a park bench in the Bronx... died a short time after the Tuesday afternoon attack, but not before she gave police a dying declaration: "Mike did it." ...
"In startling revelations, convicted terrorist Ali Saleh Kahlah al-Marri admitted that Al Qaeda used public telephones, pre-paid calling cards, search engines and Hotmail. Al-Marri 'used a "10-code" to protect the [phone] numbers — subtracting the actual digits in the phone numbers from 10 to arrive at a coded number.'" The real story behind all this is that the terrorists weren't using sophisticated methods to avoid detection or monitoring — which tells us just how crappy SIGINT really is right now.
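The "10-code" described above is a trivial substitution cipher. A minimal sketch of it in Python, assuming a digit of 0 maps back to 0 (the source does not say how zeros were handled, so (10 - d) mod 10 is an assumption here):

```python
def ten_code(number: str) -> str:
    """Encode (or decode) a phone number by subtracting each digit from 10.

    Assumption: 0 maps to 0, i.e. each digit d becomes (10 - d) % 10,
    since the source does not specify the handling of zeros.
    Non-digit characters (dashes, spaces) pass through unchanged.
    """
    return "".join(
        str((10 - int(c)) % 10) if c.isdigit() else c
        for c in number
    )

# The transform is its own inverse: applying it twice restores the original.
coded = ten_code("555-0142")
print(coded)                  # 555-0968
print(ten_code(coded))        # 555-0142
```

Because encoding and decoding are the same operation, anyone who learns the rule can read every "protected" number instantly, which underscores the article's point about the unsophistication of the method.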
"Now programs with millions of lines of code are written by teams of programmers, none of whom knows the entire program; hence, no individual can predict the effect of a given command with absolute certainty, since portions of programs may interact in unexpected, untested ways." That's what might have happened during an exercise in South Africa in 2007, when a robotic anti-aircraft gun sprayed hundreds of cannon shells around its position, killing nine soldiers and injuring 14 others.
The government of South Korea is drawing up a code of ethics to prevent human abuse of robots—and vice versa.
Autonomous military robots that will fight future wars must be programmed to live by a strict warrior code, or the world risks untold atrocities at their steely hands. The stark warning — which includes discussion of a "Terminator"-style scenario in which robots turn on their human masters — is part of a hefty report funded by and prepared for the U.S. Navy's high-tech and secretive Office of Naval Research. The report, the first serious work of its kind on military robot ethics, envisages a fast-approaching era where robots are smart enough to make battlefield decisions that are at present the preserve of humans. Eventually, it notes, robots could come to display significant cognitive advantages over Homo sapiens soldiers.
The country's broadcasters were summoned Friday by the Ministry of Information and Broadcasting to deal with charges that the live saturation coverage had helped the terrorists. At the same time, however, traditional media were criticized as too slow and inaccurate by legions of "citizen journalists" using Internet services such as Twitter and photo site Flickr. The deputy commissioner of police argued that the terrorists, who were holed up in two major hotels and became involved in floor-by-floor firefights with police, were gaining tactical information from TV. Using powers under Section 19 of the country's Cable Television Networks Act, he ordered a blackout of TV news channels.
By 2010 the US will have invested $4 billion in a research programme into "autonomous systems", the military jargon for robots, on the basis that they would not succumb to fear or the desire for vengeance that afflicts frontline soldiers. A British robotics expert has been recruited by the US Navy to advise them on building robots that do not violate the Geneva Conventions. [Anything that can be programmed can be hacked and re-programmed. And no program can encompass every moral possibility.]