HPC geeks ponder 100 petafloppers and quantum supercomputers
ISC 2013 The next big barrier for supercomputing is punching through 100 petaflops of peak performance, which frankly could be done in a heartbeat if someone had a few hundred million dollars lying around. And now that Google and NASA are monkeying around with a quantum computer, thoughts are turning to how a QC might be deployed to replace some of the work done by traditional supercomputer clusters.
Euro students cluster fest: Configurations LAID BARE
The configurations of the systems to be used by the young HPC warriors in the 2013 International Supercomputing Conference’s Student Cluster Challenge were released last week.
Edinburgh students’ heaving racks: UK’s only hope for cluster-wrestling glory
Teams from eight universities have traveled to the 2013 International Supercomputing Conference in Leipzig, Germany, to compete in the ISC Student Cluster Challenge. They’ve deployed their clusters and are busily working to turn in the best results on a series of HPC benchmarks and scientific apps.
SDSC GeoComputing Lab named winner of HPC Innovation Excellence award by IDC
The High Performance GeoComputing Laboratory (HPGeoC) at the San Diego Supercomputer Center (SDSC), an organized research unit at the University of California, San Diego, was named a winner of the HPC Innovation Excellence Award by International Data Corporation (IDC). The lab developed a highly scalable computer code that promises to dramatically cut both research times and energy costs in simulating seismic hazards throughout California and elsewhere.
How Virtualization Is Key to Unlocking Cloud HPC
Josh Simons from VMware discusses why the high performance computing community is starting to leverage virtualization technologies for Cloud HPC.