The High-Performance Computing[1] Laboratory (HPCL) is part of the Department of Computer and Information Science at the University of Oregon.

In HPCL, we believe that achieving high performance must not come at the cost of software quality, maintainability, and extensibility. Moreover, we believe that the productivity and happiness of HPC software developers are important to the overall success of HPC in enabling more and better science across many scientific domains.

HPCL is directed by Prof. Boyana Norris and conducts research in several areas, including optimizing compilers, performance modeling and optimization, parallel algorithms, and software engineering. Example projects include:

  • static and dynamic analysis of software for building application performance models, ensuring software quality, or detecting security vulnerabilities;
  • using machine learning and other approaches to model run-time characteristics of software;
  • developing data mining techniques to study and improve HPC software engineering processes;
  • applying natural language processing methods to study and improve HPC software developer productivity;
  • designing new algorithms or improving existing ones in several application areas, including large-scale dynamic graphs, computational physics, and computational biology.

[1] What is HPC?

Short-term research projects are available for advanced undergrads or MS students.

News

June 2022

Aliza Lisan has been awarded a General University Scholarship!

Dewi Yokelson successfully passed her area exam and has advanced to candidacy!

May 2022

Brian Gravelle successfully defended his dissertation!

April 2022

HPCL students have accepted summer internships at national labs, including Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Pacific Northwest National Laboratory.

March 2022

Paper accepted: D. Yokelson, N.V. Tkachenko, B. Robey, Y.W. Li, and P. Dub. Performance Analysis of CP2K Code for Ab Initio Molecular Dynamics on CPUs and GPUs. Journal of Chemical Information and Modeling, to appear April 22, 2022.

January 2022

Sudharshan Srinivasan begins an internship at Advanced Micro Devices (AMD), and Dewi Yokelson begins graduate research work at Los Alamos National Laboratory.

November 2021

Paper accepted: B. Gravelle, W.D. Nystrom, D. Yokelson, and B. Norris. Enabling Cache Aware Roofline Analysis with Portable Hardware Counter Metrics. 2021 International Workshop on Performance Modeling, Benchmarking and Simulation of High Performance Computer Systems (PMBS).

Poster accepted: S. Srinivasan, A. Pandey, A. Khanda, S. Srinivasan, S. Das. Parallel Framework for Updating Large Scale Dynamic Networks. Supercomputing, 2021.

October 2021

New NSF grant! NSF Collaborative Research: PPoSS: Planning: Extreme-scale Sparse Data Analytics.

September 2021

HPCL welcomes new graduate student Aliza Lisan to our lab.

More news can be found at our news archive.

Open projects

Undergraduate / short-term graduate: these are term-long projects achievable with 10-15 hours of effort per week. Experience or background that may be helpful is listed in square brackets. Interested students should contact Prof. Norris.

  • Software development practices analysis through revision control data mining and natural language processing
  • Performance analysis and optimization of scientific codes, typically parallel applications using MPI, OpenMP, or TBB. Some experience with performance analysis is helpful, but not required.
  • Extracting the class relationships (inheritance and containment) from C++ software [330, using/writing parsers]
  • Using binary analysis to identify computational patterns and anti-patterns (for performance or power efficiency) [314, 429]
  • Text analysis of selected portions of the scientific literature to discover and categorize use cases for scientific software [data mining]

The HPC Lab is generously supported by donations and grants from the Department of Energy (DOE) and the National Science Foundation (NSF).

Past sponsors include RNET Technologies, Inc. (Dayton, OH) and ParaTools, Inc.

Relevant conferences and workshops.