The National Institute for Computational Sciences

What is HPC?

"High-Performance Computing," or HPC, is the application of "supercomputers" to computational problems that are either too large for standard computers or would take them too long to solve. A desktop computer generally has a single processing chip, commonly called a CPU. An HPC system, on the other hand, is essentially a network of nodes, each of which contains one or more processing chips as well as its own memory.
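The scale of that node-based design can be made concrete with a little arithmetic. The numbers below are purely illustrative (they do not describe any particular machine), but they show how per-node resources add up across a cluster:

```python
# Hypothetical cluster sizing sketch: aggregate resources of an HPC system.
# The node count and per-node specs are illustrative, not any real machine.
nodes = 1024            # compute nodes in the cluster
cores_per_node = 16     # cores across each node's processing chips
mem_per_node_gb = 64    # each node has its own local memory

total_cores = nodes * cores_per_node
total_mem_gb = nodes * mem_per_node_gb

print(f"{total_cores} cores, {total_mem_gb // 1024} TB of memory")
```

Even with these modest per-node figures, the cluster as a whole offers tens of thousands of cores, which is why problems must be decomposed to use it.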

Parallel Computing

Programs for HPC systems must be split up into many smaller pieces, often called threads or processes, with one assigned to each core. To piece the larger program back together, the cores must be able to communicate with each other efficiently, and the system as a whole must be well organized.

Programs on HPC systems create a vast amount of data, which can be very difficult for standard file systems and storage hardware to deal with. Standard file systems, designed for personal use, might have an upper limit on file size, number of files, or total storage. HPC file systems must be able to grow to contain, and quickly transfer, large amounts of data. In addition to data in use, researchers often keep previous data for comparison or as a starting point for future projects. Older data is kept in archival storage systems. Kraken, for example, uses a magnetic tape storage system, which can store several petabytes (millions of gigabytes) of data.
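A quick unit check confirms the scale quoted above: in decimal (SI) units, one petabyte really is a million gigabytes.

```python
# Sanity check of the storage scale mentioned above (decimal SI units).
GB = 10**9   # bytes in a gigabyte
PB = 10**15  # bytes in a petabyte

print(PB // GB)      # gigabytes per petabyte: 1,000,000
print(5 * PB // GB)  # a "several petabyte" archive, in gigabytes
```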


HPC Centers

XSEDE is a partnership among HPC centers around the United States, funded by the National Science Foundation. As a member of XSEDE, NICS devotes most of its resources to scientific research. Many other large HPC centers focus on classified government computations, while many smaller centers focus on industrial research. Regardless of its setup, a center is likely to deal with a wide range of problems, typified by the following:

  • Biochemistry
    • Protein folding research attempts to determine the overall shape of a protein based on its sequence of amino acids; the effort may lead to a better understanding of biological processes and to new medicines.
    • Protein-ligand interactions play a huge role in biological processes and are common targets for medicine. Refining computational models allows new medicines to be developed more efficiently.
  • Chemistry
    • Materials research helps to identify new materials, such as high-temperature superconductors, affordable electrodes for fuel cells, efficient catalysts, new methods of energy storage, etc.
  • Physics
    • Simulations of stars help astrophysicists interpret observations.
  • Environment Modeling
    • The simulation of earthquakes can be startlingly complex. These simulations can be used to predict which areas are likely to experience a large earthquake, and what the conditions will be. This work helps inform building codes, development planning, and emergency action plans. For instance, see the SCEC Web site.
    • Weather patterns are notoriously difficult to predict. Supercomputers model weather around the country, and weather models are particularly important during hurricane season. Additionally, new computational models may be used with old data to gauge the models' usefulness in predicting weather patterns.
    • Global climate change is a contentious topic these days, in large part due to the extreme complexity of the problem. As computational models improve, we may be better informed in our policy decisions, and able to act in a more unified manner.
  • Industry
    • Products: Prototyping is often an expensive and time-consuming process. Computer modeling can reduce the time and cost of physical prototyping and make optimization much easier. This process is essential not only for obvious industries such as the automotive, aerospace, and pharmaceutical industries, but also for a wide spectrum of consumer products, from dishwashers to potato chips.
    • Services: Many corporations handle large amounts of data and deal with complex routings or decisions. Insurers, for instance, use models to calculate risk, while the oil and gas industry uses seismographic data to determine where oil and gas fields are located underground. Wal-Mart optimizes supply chains, and FedEx monitors and routes packages. The financial sector uses models to spot trends and calculate risks.