Wednesday, March 21, 2012

Polygamist land: The NSA Is Building the Country's Biggest Spy Center

But new pioneers have quietly begun moving into the area, secretive outsiders who say little and keep to themselves. Like the pious polygamists, they are focused on deciphering cryptic messages that only they have the power to understand. Just off Beef Hollow Road, less than a mile from brethren headquarters, thousands of hard-hatted construction workers in sweat-soaked T-shirts are laying the groundwork for the newcomers’ own temple and archive, a massive complex so large that it necessitated expanding the town’s boundaries. Once built, it will be more than five times the size of the US Capitol.

It needs that capacity because, according to a recent report by Cisco, global Internet traffic will quadruple from 2010 to 2015, reaching 966 exabytes per year. (A million exabytes equal a yottabyte.) For a sense of scale, Eric Schmidt, Google's former CEO, once estimated that all human knowledge created from the dawn of man to 2003 amounted to about 5 exabytes. And the data flow shows no sign of slowing. In 2011 more than 2 billion of the world's 6.9 billion people were connected to the Internet; by 2015, market research firm IDC estimates, there will be 2.7 billion users. Thus the NSA's need for a 1-million-square-foot data storehouse, one the agency could someday fill with a yottabyte of information.
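Those figures are easy to sanity-check. Here is a minimal Python sketch, using only the numbers quoted above and the standard decimal prefixes, that confirms the exabyte/yottabyte relationship and shows how long 2015-level traffic would take to add up to a yottabyte:

    # Back-of-envelope check of the data-scale figures quoted above.
    EXABYTE = 10**18     # bytes
    YOTTABYTE = 10**24   # bytes

    traffic_2015 = 966 * EXABYTE    # Cisco's projected yearly Internet traffic
    knowledge_2003 = 5 * EXABYTE    # Eric Schmidt's estimate of all human knowledge to 2003

    print(YOTTABYTE // EXABYTE)          # 1000000 -- a yottabyte really is a million exabytes
    print(YOTTABYTE / traffic_2015)      # ~1035 -- years of 2015-level traffic in one yottabyte
    print(traffic_2015 / knowledge_2003) # ~193 -- 2015 traffic vs. Schmidt's 5-exabyte estimate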

So the agency had one major ingredient—a massive data storage facility—under way. Meanwhile, across the country in Tennessee, the government was working in utmost secrecy on the other vital element: the most powerful computer the world has ever known. Some 300 scientists and computer engineers with top security clearance toil away there, building the world’s fastest supercomputers and working on cryptanalytic applications and other secret projects.

But the real competition will take place in the classified realm. To secretly develop the new exaflop (or higher) machine by 2018, the NSA has proposed constructing two connecting buildings, totaling 260,000 square feet, near its current facility on the East Campus of Oak Ridge. Called the Multiprogram Computational Data Center, the buildings will be low and wide like giant warehouses, a design necessary for the dozens of computer cabinets that will compose an exaflop-scale machine, possibly arranged in a cluster to minimize the distance between circuits. According to a presentation delivered to DOE employees in 2009, it will be an “unassuming facility with limited view from roads,” in keeping with the NSA’s desire for secrecy. And it will have an extraordinary appetite for electricity, eventually using about 200 megawatts, enough to power 200,000 homes. The computer will also produce a gargantuan amount of heat, requiring 60,000 tons of cooling equipment, the same amount that was needed to serve both of the World Trade Center towers.
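Those power and cooling numbers hang together arithmetically. A minimal Python sketch, using the figures quoted above plus the standard conversion of one ton of refrigeration to about 3.517 kW (an assumption, not something stated in the article), shows that the cooling plant is sized to reject roughly as much heat as the machine draws:

    # Rough consistency check of the power and cooling figures quoted above.
    power_mw = 200          # projected electrical draw
    homes = 200_000         # homes it could reportedly power
    cooling_tons = 60_000   # size of the cooling plant
    kw_per_ton = 3.517      # assumed: 1 ton of refrigeration ~ 3.517 kW

    print(power_mw * 1000 / homes)           # 1.0 -- about 1 kW per home, a plausible average
    print(cooling_tons * kw_per_ton / 1000)  # ~211 MW of heat rejection, roughly the electrical input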

Such an exaflop-scale machine is projected to have roughly the following characteristics (a back-of-envelope cross-check follows the list):

* Feature size of 22 to 11 nanometers, CMOS in 2018
* Total average of 25 picojoules per floating point operation
* Approximately 10 billion-way concurrency for simultaneous operation and latency hiding
* 100 million to 1 billion cores
* Clock rates of 1 to 2 GHz
* Multithreaded, fine-grained concurrency of 10- to 100-way concurrency per core
* Hundreds of cores per die (varies dramatically depending on core type and other factors)
* Global address space without cache coherence; extensions to PGAS (e.g., AGAS)
* 128-petabyte capacity mix of DRAM and nonvolatile memory (most expensive subsystem)
* Explicitly managed high-speed buffer caches; part of deep memory hierarchy
* Optical communications for distances > 10 centimeters, possibly intersocket
* Optical bandwidth of 1 terabit per second
* Systemwide latencies on the order of tens of thousands of cycles
* Active power management to eliminate wasted energy by momentarily unused cores
* Fault tolerance by means of graceful degradation and dynamically reconfigurable structures
* Hardware-supported rapid thread context switching
* Hardware-supported efficient message-to-thread conversion for message-driven computation
* Hardware-supported, lightweight synchronization mechanisms
* 3-D packaging of dies for stacks of 4 to 10 dies each including DRAM, cores, and networking 
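As noted before the list, these targets can be cross-checked against one another. A minimal Python sketch, assuming (hypothetically) one floating-point operation per hardware thread per clock cycle, shows what the 25-picojoule energy budget and the 1-to-2 GHz clock imply for power draw and concurrency:

    # Back-of-envelope check of the exaflop targets listed above.
    exaflop = 10**18        # floating-point operations per second
    pj_per_flop = 25e-12    # 25 picojoules per operation (from the list)
    clock_hz = 1.5e9        # midpoint of the 1-2 GHz clock target

    # Power spent on arithmetic alone (ignores memory, network, and cooling):
    print(exaflop * pj_per_flop / 1e6, "MW")    # 25.0 MW

    # Concurrency needed to sustain an exaflop at one flop per thread per cycle:
    print(exaflop / clock_hz)                   # ~6.7e8 concurrent operations;
    # the list's 10-billion-way figure adds headroom for latency hiding.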

