Archived IT - HPC Articles
13 Sep 2017
Lustre Enhances Flexibility for Big Data Era [66442]
HPCWire, September 13th, 2017

"Researchers at Oak Ridge National Laboratory (ORNL) and Intel Corporation have wrapped up a three year project aimed at giving users of Lustre, the Department of Energy's preferred parallel file system, more flexibility. And the results are impressive.

Lustre is the preferred file system for the leadership scientific computing community for a simple reason: it has an unprecedented ability to store and retrieve the large-scale data inherent in complex scientific simulations such as those run at ORNL's Leadership Computing Facility, home to Titan, the nation's most powerful system for open science..."
(Get More Information . .)

07 Sep 2017
NVIDIA Volta GPUs to Help CCDS Advance Medicine with AI [66149]
insideHPC, September 7th, 2017

"Over at the NVIDIA Blog, Abdul Hamid Halabi writes that the Center for Clinical Data Science (CCDS) today received the world's first purpose-built AI supercomputer from the all-new portfolio of NVIDIA DGX systems with Volta...

'With a mission to advance medicine with artificial intelligence, CCDS includes clinicians, researchers, data scientists, and product development and translational experts. Using the first-generation DGX-1 based on Pascal GPUs, CCDS data scientists have successfully trained machines to 'see' abnormalities and patterns in medical images...'

CCDS data scientists have created dozens of medical training algorithms to date. In addition to radiology, these cover other medical specialties such as cardiology, ophthalmology, dermatology and psychiatry..."
(Get More Information . .)

02 Sep 2017
What Lies Ahead for HPC Cooling? [65974]
insideHPC, September 2nd, 2017

"As the HPC industry collectively approaches exascale, the importance of energy efficiency - and maximizing the efficiency and performance of cooling technology - becomes paramount to ensuring that the cost of HPC resources does not become prohibitively expensive.

To meet the ambitious targets for exascale computing, many cooling companies are exploring optimizations and innovative methods that will redefine cooling architectures for the next generation of HPC systems. Here, some of the prominent cooling technology providers give their views on the current state and future prospects of cooling technology in HPC..."
(Get More Information . .)

28 Aug 2017
NERSC Scales Scientific Deep Learning to 15 Petaflops [65972]
HPCWire, August 28th, 2017

Rob Farber writes, "A collaborative effort between Intel, NERSC and Stanford has delivered the first 15-petaflops deep learning software running on HPC platforms, which is, according to the authors of the paper (and to the best of their knowledge), currently the most scalable deep-learning implementation in the world. The work described in the paper, 'Deep Learning at 15PF: Supervised and Semi-Supervised Classification for Scientific Data', ran on Lawrence Berkeley National Laboratory's (Berkeley Lab) NERSC (National Energy Research Scientific Computing Center) Cori Phase II supercomputer. Training on physics- and climate-based data sets, a Cray XC40 configuration of 9,600 self-hosted 1.4GHz Intel Xeon Phi 7250 processor-based nodes achieved a peak rate between 11.73 and 15.07 petaflops (single precision) and an average sustained performance of 11.41 to 13.47 petaflops..."
(Get More Information . .)
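
As a quick sanity check on the aggregate figures quoted above, the short Python sketch below converts the reported system-wide petaflop ranges into per-node throughput on Cori's 9,600 Xeon Phi 7250 nodes. The node count and petaflop ranges come straight from the excerpt; the derived per-node numbers are an illustration, not values reported in the paper itself.

# Back-of-the-envelope check of the NERSC/Cori figures quoted above.
# Aggregate petaflop/s ranges and the node count come from the excerpt;
# the per-node throughput derived here is illustrative only.

NODES = 9_600                      # Xeon Phi 7250 nodes on Cori Phase II
PEAK_PFLOPS = (11.73, 15.07)       # reported peak range, single precision
SUSTAINED_PFLOPS = (11.41, 13.47)  # reported average sustained range

def per_node_tflops(total_pflops: float, nodes: int = NODES) -> float:
    """Convert an aggregate petaflop/s figure into teraflop/s per node."""
    return total_pflops * 1_000 / nodes

for label, (low, high) in [("peak", PEAK_PFLOPS), ("sustained", SUSTAINED_PFLOPS)]:
    print(f"{label}: {per_node_tflops(low):.2f}-{per_node_tflops(high):.2f} TF/s per node")

Run as written, this prints roughly 1.22-1.57 TF/s per node at peak and 1.19-1.40 TF/s sustained, i.e. the application-level throughput each node contributed to the aggregate rates quoted in the excerpt.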

27 Aug 2017
Oak Ridge Turns to Deep Learning for Big Data Problems [65973]
insideHPC, August 27th, 2017

"A team of researchers from Oak Ridge National Laboratory has been awarded nearly $2 million over three years from the Department of Energy to explore the potential of machine learning in revolutionizing scientific data analysis...

While deep learning has long been used to classify relatively simple data such as photographs, today's scientific data presents a much greater challenge because of its size and complexity. Deep learning offers the potential to truly change the way in which researchers use massive datasets to solve challenges spanning the scientific spectrum..."
(Get More Information . .)

23 Aug 2017
Top 10 Supercomputers of 2017 [65806]
Networkworld, August 23rd, 2017

"These 10 supercomputers are the world's fastest....

Yes, your new gaming PC that supports VR headsets is impressively fast. But can it simulate the entire universe over millions of years? Shed light on the forces that cause destructive summer storms in Europe? Ensure the safety and reliability of nuclear weapons? We didn't think so; those are jobs for supercomputers..."
(Get More Information . .)
