Supercomputing
My experience in the 1970s and early 1980s using supercomputers at Department of Energy labs and European centers convinced me that the National Science Foundation (NSF) needed to make this capability widely available to university researchers. My unsolicited NSF proposal in 1983 helped define the need for what became the NSF Supercomputer Center program. In 1985 my proposal was funded and I became the founding director of the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign.

In 1997 the NSF expanded the program and I became the founding director of the National Computational Science Alliance, comprising more than 50 universities, government labs, and corporations linked with NCSA to prototype the information infrastructure of the 21st century. I oversaw bringing successive generations of high-performance computers to the national research community: first vector machines [Cray X-MP, Cray Y-MP, Convex], then massively parallel systems [Alliant, CM-2, CM-5], then shared-memory systems [SGI Challenge, Power Challenge, Origin], and finally large-scale superclusters.

Based on these experiences, I co-authored a book on the scientific underpinnings of the many disciplines of science and engineering that were being transformed by this computational methodology. In the late 1990s NIH director Varmus asked me to develop a report on the future impact of information technology and telecommunications on biomedical research. This report is still being used today to launch new NIH centers.