Article | Mathematics and Computer Science Division

The keys to success: good projects, good choices, good collaborators - and openness to change

In 1996, Dr. Ewing “Rusty” Lusk coauthored a paper that presented the first full implementation of the Message Passing Interface (MPI) standard – and with it initiated a revolution in parallel computing. Lusk, among other computer scientists, realized that a standard method of communication between parallel computers needed to be established; every computer had its own language, a situation that made computer communication extremely inefficient. At the time of publication, the MPI implementation, known as MPICH, was the first of its kind; it was received so well that now every parallel computer comes with an MPI implementation.

Lusk is currently an Argonne Distinguished Fellow Emeritus and a computer scientist in the MCS division at Argonne National Laboratory. Lusk joined MCS in 1982 as a computer scientist and has been a part of the division since. He is the author of five books and has published over 100 research articles with a focus on parallel computing, mathematics and automated deduction. Lusk also served as the director of MCS from 2005 to 2011.

Interestingly, Lusk didn’t begin his career in computer science until his late twenties. After graduating from the University of Notre Dame in 1965 with a B.A. in mathematics, he received his Ph.D. in the same field from the University of Maryland in 1970. Lusk was immediately offered a professorship at Northern Illinois University (NIU) in the mathematics division.

While at NIU, Lusk realized he had an affinity for computer science and would frequently sit in on the undergraduate computer science lectures. After observing many lectures, he was approached by a computer science professor and later offered a position in NIU’s Computer Science department.

In the following Q&A, Lusk discusses his research both before and after retirement and speaks about his professional accomplishments. He also sheds light on the one skill he developed throughout his career that he considers the most impactful.

What led you to join the MCS division at Argonne? How did the position fit into your overall career goals?

I wanted to be a computer scientist, and I actually spent my sabbatical (while teaching at NIU) at Argonne. I met Larry Wos, an expert in the field of automated reasoning, with whom I ended up working extensively during my time in the MCS division. The field served as an “in-between” for mathematics and computer science, and it was definitely a match with what I wanted to pursue. I enjoyed the work being done at Argonne, and I liked the research-focused environment of the division as well.

Can you tell us about your research experiences pre-retirement?

My biggest interest was definitely parallel computing. The MPI standard served as a good source of research topics. MPI is a message-passing library interface specification; it’s not a language or an implementation. The standard defines the syntax of the core library routines. The standard didn’t have any specifics about implementation, and features were being added or revised every few weeks. I worked on MPICH, which is a high-performance and widely portable implementation of the MPI standard. The implementation evolved with the standard; the advantage was that when the standard came out, MPICH provided an open source implementation — one that was freely available and that code developers and vendors could easily adopt or adapt to their own needs.
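To give a flavor of the interface specification Lusk describes, here is a minimal sketch (not from the interview) of a two-process exchange using the core routines the MPI standard defines. It assumes an MPI implementation such as MPICH is installed; compile with `mpicc` and launch with `mpiexec -n 2`.

```c
/* Minimal point-to-point message passing with the MPI standard's C API.
 * Rank 0 sends one integer to rank 1, which prints what it received. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, value;

    MPI_Init(&argc, &argv);               /* start the MPI runtime        */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* which process am I?          */

    if (rank == 0) {
        value = 42;
        /* send one int to rank 1 with message tag 0 */
        MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        /* receive one int from rank 0 with matching tag 0 */
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        printf("rank 1 received %d from rank 0\n", value);
    }

    MPI_Finalize();                       /* shut down cleanly            */
    return 0;
}
```

Because the standard specifies only the interface, this same source compiles unchanged against MPICH or any other conforming implementation — exactly the portability the interview credits to the standard.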

From 1997 to 2007, I also worked with the FLASH project, a joint effort with the University of Chicago to create code to simulate matter accretion onto the surfaces of compact stars, nuclear ignition of the accumulated material, and subsequent evolution of the star’s interior, surface, and exterior. I worked on performance visualization, which is the presentation of parallel program behavior in graphical form in order to understand how the program behaves.

I also coded in Prolog. It’s a language in which the statements are logical formulas; every computation is the proof of a theorem, and it was used in parallel computing. Logic is intrinsically parallel, and the combination of programming and theorem proving was a really nice fit. I wrote a lot about the implementation of a Prolog system at the beginning of parallel computing, before MPI.

Did your research focus or interests change after gaining Emeritus status?

Yes, my primary work since retirement has been with the U.S. Department of Energy SciDAC (Scientific Discovery through Advanced Computing) project through the Nuclear Computational Low-Energy Initiative. I’m currently working on a small library that focuses on load balancing of memory across computer nodes; it provides an interface that’s even simpler than MPI’s and still scales up to the largest machines.

What is your proudest professional accomplishment?

Definitely the success of the MPI standard! It was certainly a group effort, and the most important thing that happened that I played a role in. Before the MPI standard, every parallel computer had its own language, which changed between different computers and models. This standard dramatically improved the efficiency of communication between computers.

Tell us how you used technology in your day-to-day job, and how that changed over the years as technology advanced.

We used to submit a code job and wait for the output to print out. The advanced technology Argonne had back then was that the listings would come up in a dumbwaiter and the staff would put the prints in the mailbox. A little sensor at the bottom of the mailbox would turn a light on in your office so you knew your job was done! We usually got the jobs within a day of submitting them.

I’d say that the biggest technological change affecting my work is that we can run codes on our laptops now. The fastest computer back then ran at five megaflops, so everything is more efficient now. There is also more scientific collaboration on an international scale with the introduction of email, and traveling for science has become more common.

If you had to choose one skill that you developed throughout your career as the most impactful, which one would it be?

Certainly decision-making. I believe I’ve made good decisions about what to work on; while not all things pan out, some panned out really well. I ended up choosing projects that I really liked and had the capability to contribute to. Most projects seemed to have a publishable nugget coming along, and there was always something to work with.

And lastly, if you had one piece of advice to early career researchers, what would it be?

Be open to possibilities of change. If you find something more interesting than what you’re working on right now, go work on that! You’re not stuck now, and even 30 years from now you still won’t be stuck. Take advantage of the fact that you’re welcome at any seminar in any division. Go and see what else is going on at Argonne.