Finnish submarine cable avoids US internet spying

Finland is aiming to capitalise on Germany’s privacy worries by setting itself up as a haven where online data will be safe from the prying eyes of foreign governments.

Cinia Group, a newly formed state-owned telecoms company, is building a submarine cable to link the Nordic country to German businesses and digital consumers.

It comes amid growing concern that the EU’s ‘Safe Harbour’ agreement, which allows personal and financial data to be exported to the US, does not protect privacy.

The route of the 685-mile cable has also been carefully planned to avoid waters where it would be more likely to be secretly wiretapped, in an effort to give German companies the confidence to use Finnish data centres.

The country is bidding to become the “Switzerland of the North”, where sensitive personal and financial data can be safely stored. Cinia has signed up its first customer for the new cable, the data centre provider Hetzner Online, which will pay €10m to use the link from early next year.

German privacy laws are particularly stringent because of the country’s history of secret policing, and public concern about internet surveillance was heightened there by the exposure of US spying by the former US intelligence contractor Edward Snowden.

Chris Watson, head of TMT at the City law firm CMS, which advised on the deal, said: “Russia and China keep control of data in government hands – and we saw in the US how personal data was not secure from the spooks. This cable is the fundamental aorta of an entirely secure commercial alternative for keeping data safe.”

Amazon’s data centre business has sought to address German privacy concerns by building a data centre near Frankfurt, guaranteeing that sensitive information will not leave the country. However other internet companies including Facebook have sought to save money by locating data centres in Scandinavia, where cooling for the thousands of server computers they contain is readily available from icy water.

Source:  telegraph.co.uk

Analytics engine will read your mind

Computational knowledge engine will soon be able to read your mind:

Wolfram Alpha will soon be able to read your mind, its creator Stephen Wolfram said at the South By Southwest (SXSW) conference in Austin today.

Speaking at the US technology conference on Monday, Wolfram predicted that his analytics engine will soon work pre-emptively, meaning it will be able to predict what its users are looking for.

“Wolfram Alpha will be able to predict what users are looking for,” Wolfram said. “Imagine that combined with augmented reality.”

Speaking during a talk on the future of computation, Stephen Wolfram – the creator of Wolfram Alpha, whose knowledge engine helps answer factual queries for Apple’s Siri personal assistant – also showed off the engine’s new ability to analyse images.

Wolfram said, “We’re now able to bring in uploaded material, and use our algorithm to analyse it. For example, we can take a picture, ask Wolfram Alpha and it will try and tell us things about it.

“We can compute all sorts of things about this picture – and ask Wolfram Alpha to do a specific computation if need be.”

That’s not the only new feature of Wolfram Alpha, as it can also now analyse data from uploaded spreadsheet documents.

“We can also do things like uploading a spreadsheet and asking Wolfram [Alpha] to analyse specific data from it,” Wolfram said.

He added, “This is an exciting time for me, because a whole lot of things I’ve been working on for 30 years have begun converging in a nice way.”

This upload feature will be available as part of Wolfram Alpha Pro, the paid-for tier from which Wolfram hopes the analytics engine will make most of its money. Wolfram Alpha Pro costs $4.99 per month, or $2.99 for students.

Wolfram also showed off Wolfram Alpha’s ability to analyse data from Facebook, a feature that was announced last August.
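For readers who want to try the kind of programmatic querying Wolfram described, the short Python sketch below sends a question to Wolfram|Alpha over HTTP and prints the plain-text answer. It is only an illustration: the Short Answers endpoint, the "i" and "appid" parameters, and the placeholder app ID are assumptions based on Wolfram's public developer API, not anything announced at the talk.

    # Minimal sketch: ask Wolfram|Alpha a question via its public HTTP API.
    # "YOUR-APPID-HERE" is a hypothetical placeholder; a real key has to be
    # obtained from Wolfram's developer portal.
    import urllib.parse
    import urllib.request

    APPID = "YOUR-APPID-HERE"

    def ask_wolfram(question: str) -> str:
        # The Short Answers endpoint returns a single plain-text result.
        params = urllib.parse.urlencode({"appid": APPID, "i": question})
        url = f"https://api.wolframalpha.com/v1/result?{params}"
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode("utf-8")

    if __name__ == "__main__":
        print(ask_wolfram("boiling point of water at 2000 metres"))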

Computer that will never crash

Scientists invent a self-repairing computer that will never crash:

Scientists at University College London (UCL) have created a self-healing computer. The “systemic” machine, according to a report in the New Scientist, can instantly recover corrupted data. The invention is expected to have far-reaching consequences for physicians and the military: it could allow drones to recover from combat damage in a matter of seconds, or help create a more realistic model of the human brain.

The team behind the systemic computer built it to respond to random and unpredictable events. Conventional computers were designed to follow a linear set of instructions and can only consider one thing at a time. “Even when it feels like your computer is running all your software at the same time, it is just pretending to do that, flicking its attention very quickly between each program,” Peter Bentley, a computer scientist at UCL, said in an interview with the New Scientist. Together with his colleague Christos Sakellariou, Bentley designed a new kind of computer that works more like the human brain.

Anant Jhingran, regarded as the brains behind IBM’s supercomputer “Watson” and now vice president of products at the “big data” analytics company Apigee, said that new computing systems are designed to “mimic the real world better”, and that IBM has spent years building computers that “observe and then react” like humans do.

The trick is a safety-in-numbers approach: the new computer holds multiple copies of its instructions across its individual systems, so if one copy fails, it can access a clean copy and repair itself.

In the future, Bentley’s team will incorporate machine learning, so that if you are sitting outside working and the temperature gets too high, the computer will respond pre-emptively to prevent a crash. The next generation of school kids may need to come up with a more creative excuse for failing to turn in work on time!
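To make the safety-in-numbers idea concrete, here is a small Python sketch (not UCL's actual design, just an illustration of the principle) in which the same instruction block is kept in several independent copies; a corrupted copy is detected by checksum and quietly repaired from a healthy one.

    # Illustration of the "safety in numbers" idea: keep several copies of the
    # same data, detect corruption with a checksum, and repair bad copies from
    # a healthy one. This is a toy model, not the UCL systemic computer.
    import hashlib

    class RedundantStore:
        def __init__(self, data: bytes, copies: int = 3):
            self.checksum = hashlib.sha256(data).hexdigest()
            self.copies = [bytearray(data) for _ in range(copies)]

        def _healthy(self, copy: bytearray) -> bool:
            return hashlib.sha256(bytes(copy)).hexdigest() == self.checksum

        def read(self) -> bytes:
            # Find one intact copy, then use it to overwrite any corrupted ones.
            good = next((c for c in self.copies if self._healthy(c)), None)
            if good is None:
                raise RuntimeError("all copies corrupted; cannot self-repair")
            for i, c in enumerate(self.copies):
                if not self._healthy(c):
                    self.copies[i] = bytearray(good)   # self-repair
            return bytes(good)

    store = RedundantStore(b"ADD R1, R2 -> R3")
    store.copies[1][0] ^= 0xFF        # simulate a fault corrupting one copy
    print(store.read())               # still returns the original instruction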

Quartz data stored for millions of years

Quartz glass stores data for millions of years:

“The volume of data being created every day is exploding, but in terms of keeping it for later generations, we haven’t necessarily improved since the days we inscribed things on stones. The possibility of losing information may actually have increased,” says Hitachi researcher Kazuyoshi Torii.

To combat this problem, Hitachi is looking towards an unlikely source: slivers of quartz glass. Quartz glass is well suited to data storage thanks to its incredible strength and longevity: it is waterproof, chemical-resistant, unaffected by radio waves and can withstand temperatures of up to 1,000° Celsius. “We believe data will survive unless this hard glass is broken,” said Hitachi senior researcher Takao Watanabe.

Within the quartz glass, digital data is stored in binary form. A laser beam records dots within layers of incredibly thin sheets of glass, and an optical microscope, or any computer that understands binary code, can read the stored data back in the future.

A standard piece of Hitachi’s quartz glass measures less than one square inch and is just 0.08 inches thick. Within each tiny sliver, 40 MB of data can currently be stored. Storage capacity is obviously the main hindrance, but quartz glass has a couple of things working in its favor. First, a number of these quartz slivers can easily be stacked together without taking up much physical space. Second, the technology has a lot of room for growth: quartz glass storage compares nicely to solid-state drives (SSDs), which in their infancy could store only megabytes of data but can now hold terabytes.

As the news has only just been announced, it is uncertain when quartz glass storage will be commercially available. Dependable long-term data backup, as a complement to the cloud infrastructure beginning to dominate our lives, will certainly be a welcome addition.
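A quick back-of-the-envelope calculation, using only the figures quoted above (40 MB per sliver, 0.08 inches thick) and an illustrative one-terabyte target, shows why capacity rather than durability is the current bottleneck.

    # Rough arithmetic from the quoted figures; the 1 TB target is illustrative.
    MB_PER_SLIVER = 40
    THICKNESS_IN = 0.08

    target_mb = 1_000_000                      # 1 TB in decimal megabytes
    slivers = target_mb / MB_PER_SLIVER        # ~25,000 slivers
    stack_height_in = slivers * THICKNESS_IN   # ~2,000 inches

    print(f"slivers needed: {slivers:,.0f}")
    print(f"stack height:   {stack_height_in:,.0f} in "
          f"({stack_height_in * 0.0254:,.1f} m)")

On these numbers a single terabyte would need roughly 25,000 slivers stacked about 50 metres high, which is why per-sliver capacity has plenty of catching up to do before quartz can rival conventional media.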

Google servers in complete darkness

Just how far will Google go to hide its custom-built data center hardware from the rest of the world?

In one Silicon Valley data center, the company is apparently so paranoid about competitors catching a glimpse of its gear, it’s been known to keep its server cages in complete darkness, outfitting its technical staff like miners and sending them spelunking into the cages with lights on their heads.

“Many [companies] try to keep things covered up. There’s a lot of valuable intellectual property in here,” says Chris Sharp, general manager of content and cloud at Equinix, as he walks through the company’s data center. “But we were always amazed by Google and the helmets.”

Google is one of many big-name Web outfits that lease data center space from Equinix, a company whose massive computing facilities serve as hubs for the world’s biggest Internet providers. All the big Web names set up shop in these data centers, so that they too can plug into the hub. The irony is that they must also share space with their biggest rivals, and this may cause some unease with companies that see their hardware as a competitive advantage best hidden from others.

About two years ago, Chris Sharp says, Google unscrewed all the light bulbs inside the hardware cages it occupied at that Equinix data center. “They had us turn off all overhead lights too, and their guys put on those helmets with lights you see miners wear,” he tells Wired. “Presumably, they were bringing up custom-built gear they didn’t want anyone else to see.”

Google declined to comment on Sharp’s little anecdote. But the tale is not surprising. Google designs its own servers and its own networking gear, and though it still leases space in third-party data centers such as the Equinix facility, it’s now designing and building its own data centers as well. These designs are meant to improve the performance of the company’s Web services but also save power and money. More so than any other outfit, Google views its data center work as an important advantage over competitors.

That said, Google has actually loosened up in recent years. In 2009, the company opened a window into the first custom-built data center it had built five years before, and it has discussed parts of its new facilities. But many of its operations remain a mystery.

Some believe this should change. Facebook now designs its own data centers and servers, and as a direct response to Google’s approach, the social-networking outfit has “open sourced” its designs, hoping to encourage collaboration on designs across the industry. This, Facebook says, will allow the rest of the world to save power in much the same way Google has done and ultimately, well, save the planet.

Several companies have already embraced this effort, including Netflix, the Texas-based cloud provider Rackspace and Japanese tech giant NTT Data. But others still prefer to keep their secret hardware secret.

Amazon, for instance, takes a Google-like approach. The company says very little about the facilities it runs or the hardware in those facilities. Apparently, the company is working with server sellers such as ZT Technologies to customize its servers, and it has followed Google’s lead in constructing its data centers with modular shipping containers. But it’s unclear just how far the company has gone towards designing and building its own hardware. This week, the Internet is rife with speculation about just how many machines back the company’s Elastic Compute Cloud service.
At Google, employees sign strict non-disclosure agreements that bar them from discussing what goes on inside the company’s data centers, and apparently this agreement is open-ended. That alone puts a lid on Google’s deepest secrets. We’ve seen the NDA in action many times. But for Google, and others, there’s an added problem when they set up shop in a “co-location” facility like the data centers run by Equinix.

The nature of the beast is that you’re sharing space with competitors. Equinix trumpets its data centers as places where the giants of the Web can improve performance by plugging their gear straight into the world’s biggest Internet carriers, and into each other. The company began life offering a service, the Internet Core Exchange, that connected all the major Internet service providers, and it now lets other outfits plug into this carrier hub.

According to Sharp, over 70 carriers used the company’s main data center in San Jose, California. “We were a place for network operators to efficiently hand off traffic, and that’s the legacy that created Equinix,” Sharp says. “Not only are networks leveraging that to talk to each other, but [websites] are too.”

Security is high in the company’s facilities. Hand geometry readers, biometric scanners that check the shape of the whole hand rather than just a fingerprint, guard access to the data center floor. There’s a security camera looking at you every time you turn around. And each company can contain its gear in its own cages, protected by still more hand readers. But if you’re on the floor, you can peer into the cages: for cooling purposes, they’re not walled off.

While some companies proudly display their logos on the sides of their machines, the Googles of the world do their best to hide themselves. To keep competitors from eyeing their gear, Sharp says, many companies keep the lights off inside their cages when no one’s working on them. But others go even further.