Overview

February 23, 2014

Layered Co-Design

Since the 1960s, the general-purpose processor, also known as the central processing unit or CPU, has served as the brains of computing instruments. For example, each of the 1,100 compute nodes in Virginia Tech’s System X supercomputer has a pair of homogeneous brains, i.e., two 2.3-GHz PowerPC 970FX CPUs. Recent trends, however, have exposed the CPU as a “jack of all (computing) trades, master of none,” giving rise to heterogeneous computing instruments with multiple types of brains, e.g., CPUs and graphics processing units (GPUs). Building on our team’s expertise in this area, we acquired a versatile heterogeneous supercomputing instrument in which each compute node consists of CPUs and GPUs. This transformative instrument empowered faculty, students, and staff across disciplines to tackle problems previously viewed as intractable or as requiring heroic efforts and significant domain-specific expertise to solve. For example, in 2007, although conventional wisdom held that finding missing genes in 699 microbial genomes was computationally infeasible, PI Feng led a team of more than 20 interdisciplinary researchers from eight institutions around the world and developed software cybertool instruments that integrated a set of distributed supercomputers, totaling more than 12,000 CPUs, to complete the task in ten weeks. The HokieSpeed instrument, coupled with our existing cybertool instruments and those available from NVIDIA, can complete this task in a day while involving only two researchers rather than more than 20.

Furthermore, this instrument catalyzed new approaches to conducting research via the synergistic amalgamation of heterogeneous supercomputing and our cyber-enabled tools that enhance ease of use. In particular, it allowed end users to (1) make in-situ visualization commonplace for rapid visual information synthesis and analysis and (2) control their level of immersion in the discovery process: from being completely immersed, à la a “human in the loop” making real-time decisions intuitively via a large-scale gigapixel display, to observing the instrument automatically collect, organize, and analyze data in support of visual analytics.