Posts Tagged ‘SKYNET’

British Court Blocks Lawsuit Over US Drone Killings in Pakistan

December 22, 2012

Pakistani Attempting to Sue British Spy Agency for Drone Attacks

By Jason Ditz
Dec 22, 2012

British Lord Justice Alan Moses has blocked Pakistani Noor Khan from suing the Government Communications Headquarters for providing spy data to the US that led to drone strikes in Pakistan’s tribal areas.

Khan’s father was killed in a US drone strike in North Waziristan last year, and Khan argued that faulty GCHQ intelligence was to blame for the attack. The Foreign Office argued that the court had to scrap the case because it could harm ties with the US.

Lord Justice Moses said that the “real aim” of the lawsuit was to get the British High Court to condemn the US drone strikes for their large civilian death toll, and that the claims about Khan’s father and the GCHQ were simply a vehicle for that broader challenge.

British officials have regularly argued that national security cases could not be heard in British courts because they might conceivably offend the US. Generally the courts have rejected this argument, but today’s ruling may suggest that is going to change, and points to more secrecy in the UK…

Read more at Antiwar.com


SKYNET IS COMING: Computers will taste, smell and hear within five years, IBM predicts

December 18, 2012

21st Century Wire says… This is one step away from SKYNET à la Terminator, as these advances in artificial intelligence will be extended to the current multi-billion dollar per year drone industry, where unmanned drones will not just be chasing phantom terrorists in the hills of Afghanistan, but more likely chasing citizens within North America, Europe and elsewhere.

Washington Post
Hayley Tsukayama

As 2012 winds down, lots of people are looking back at the year in tech. But at IBM, researchers have released a list of trends to expect not only in 2013, but in the next five years.

On Monday, the company released its annual “5 in 5” report, which offers up predictions about what technology innovations will catch on in the next half-decade. This year, the report focuses on how computers will process information in the future, and IBM’s researchers say that nature’s gift of five senses won’t be reserved for just the living: Machines may actually be able to process things as humans do — through touch, taste, sight, sound and smell.

That, said IBM vice president of innovation Bernie Meyerson, would be a major shift in the very architecture of computing.

“If you program a computer, it’s a gruesome undertaking,” said Meyerson, noting that — at its most basic level — the way humans load information, bit by bit, into computers, hasn’t changed since the abacus.

But advances in computer technology, Meyerson said, are already allowing computers to look at an object holistically, taking in information in a moment that would have taken years to input through code.

“Say you’re standing in a museum of modern art, surrounded by paintings and sculptures,” Meyerson said. “You would spend the rest of your adult life trying to put that into words and type it in [to a computer]. Now, imagine if you could teach it by just showing it something.”

The idea, Meyerson said, is to give humans and computers a common language. And it’s not as difficult — or as futuristic — as you may think.

Smell and taste, Meyerson said, are two senses that have a clear chemical basis. If computers can sense the types of molecules present — ammonia, explosive residue or gases that indicate decay — they could alert users to markers that flag security risks or food-borne illnesses. The same is true of taste, he said, if computers could be programmed to recognize the correct proportions of certain chemicals. Or the machines could be used in health planning, to find healthy combinations of foods that would appeal to the palate of the dieter.
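At its simplest, the kind of marker-flagging described above reduces to comparing measured concentrations against known alert thresholds. A minimal sketch (the molecule names and threshold values below are invented for illustration, not taken from any real IBM system):

```python
# Hypothetical illustration: flag risk markers from simulated chemical
# sensor readings. Thresholds (in parts per million) are made up.

RISK_THRESHOLDS_PPM = {
    "ammonia": 25.0,           # possible decay or contamination
    "explosive_residue": 0.5,  # security concern
    "ethylene": 100.0,         # fruit over-ripening / spoilage
}

def flag_markers(readings_ppm):
    """Return the molecules whose measured level exceeds its threshold."""
    return [
        molecule
        for molecule, level in readings_ppm.items()
        if level > RISK_THRESHOLDS_PPM.get(molecule, float("inf"))
    ]

sample = {"ammonia": 40.2, "explosive_residue": 0.1, "ethylene": 310.0}
print(flag_markers(sample))  # ['ammonia', 'ethylene']
```

A real chemical sensor would of course deal with noisy readings and overlapping molecular signatures; the point here is only that the alerting step itself is simple once the sensing problem is solved.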

When it comes to sight, Meyerson said, researchers have improved recognition software that can identify objects based on a database of images already loaded into the system. And in the future, computers could “hear,” by using detailed sound analyses that, for example, can tie a certain pattern of notes in a baby’s cry to anguish or joy.

Finally, computers could learn to tell the difference between cashmere and concrete by reading the appropriate signals of vibration and temperature, Meyerson said. Video game makers have already used a very basic version of this: controllers vibrate when there’s impact between objects on-screen. In the next five years, researchers could take that sort of program to a microscopic level, allowing machines to have some sense of touch, Meyerson said.
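The cashmere-versus-concrete idea amounts to classifying a surface from a few measured signals. As a rough sketch — the two features and all the reference numbers below are invented for illustration — a nearest-neighbor match against known materials would look like this:

```python
# Hypothetical sketch: classify a surface from two made-up signal
# features (vibration response in Hz, thermal conductivity in W/m·K)
# by nearest neighbor against reference materials. Values are invented.
import math

REFERENCE = {
    "cashmere": (12.0, 0.05),  # soft: low vibration, low conductivity
    "concrete": (85.0, 1.7),   # hard: high vibration, high conductivity
    "glass":    (60.0, 1.0),
}

def classify_surface(vibration_hz, conductivity):
    def distance(ref):
        v, c = ref
        # Scale each axis so neither feature dominates the distance.
        return math.hypot((vibration_hz - v) / 100.0, (conductivity - c) / 2.0)
    return min(REFERENCE, key=lambda name: distance(REFERENCE[name]))

print(classify_surface(80.0, 1.5))  # concrete
```

Nearest-neighbor matching is just one plausible approach; the article does not say what method IBM's researchers have in mind.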

While each idea has applications of its own across many industries, Meyerson said that they would have the greatest impact when combined.

“It’s not that you want to make computers smarter than humans,” he said. “But they have bandwidth to get it in… If you want to scale its memory, you can buy a box of disk drives.”