Recent Advances in Parallel Algorithm Design

Dr. Yan Gu

Assistant Professor in the Computer Science and Engineering (CSE) Department at UC Riverside

Abstract: This talk will focus on two main categories of our recent progress in parallel algorithms. The first part will delve into our decade-long research on parallelizing sequential iterative algorithms, such as greedy algorithms and dynamic programming. The core idea is to identify the inherent computational structures in these algorithms and develop general frameworks for executing them in parallel. I will give an overview of the key concepts and techniques proposed throughout this line of research, including dependence graphs, asynchronous execution, phase parallelism, and the cordon algorithm. Illustrative examples will include random permutation, maximal independent set (MIS), and dynamic programming applications.
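To make the dependence-graph idea concrete, here is a minimal round-based sketch (our own illustration, not code from the talk) using random permutation via the Knuth shuffle. Sequentially, iteration i swaps positions i and H[i] for a random H[i] ≤ i; in the parallel view, an iteration may commit in a round only if no earlier pending iteration touches either of its two positions, so independent iterations in the dependence graph can run in the same round while producing exactly the sequential result.

```python
import random

def parallel_knuth_shuffle(A, seed=0):
    """Round-based simulation of a parallelized Knuth shuffle.

    Iteration i (for i = n-1 down to 1) wants to swap A[i] and A[H[i]].
    In each round, every pending iteration "reserves" the two cells it
    touches; an iteration commits only if it is the earliest pending
    iteration (largest i) on both cells. This resolves the dependence
    graph round by round and matches the sequential output exactly.
    """
    n = len(A)
    rng = random.Random(seed)
    H = [rng.randint(0, i) for i in range(n)]  # swap targets, fixed up front
    remaining = list(range(n - 1, 0, -1))      # iterations in sequential order
    while remaining:
        # Reservation phase: each cell records the highest-priority
        # (earliest sequential, i.e. largest-index) pending iteration.
        reserve = {}
        for i in remaining:
            for cell in (i, H[i]):
                reserve[cell] = max(reserve.get(cell, -1), i)
        # Commit phase: an iteration runs if it won both reservations.
        done = set()
        for i in remaining:
            if reserve[i] == i and reserve[H[i]] == i:
                A[i], A[H[i]] = A[H[i]], A[i]
                done.add(i)
        remaining = [i for i in remaining if i not in done]
    return A
```

The two phases of each round are embarrassingly parallel; the sequential `for` loops here only simulate them. Because every committed iteration has no earlier conflicting iteration still pending, the final permutation is identical to the one the sequential shuffle would produce with the same random choices.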

The second part of this talk will discuss the design of efficient parallel data structures. Parallel data structures differ from their sequential and concurrent counterparts in various ways, including their interfaces and design objectives. This talk will highlight the design challenges in parallel data structures, showcase some of the data structures we have recently designed (such as search trees, spatial trees, and data structures for priority queues), and discuss interesting open problems.
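One way the interface difference shows up (a minimal sketch of our own, with a sorted list standing in for a search tree): a parallel data structure typically accepts a whole batch of keys at once, so the update can be organized as a single bulk operation that is easy to split across processors, rather than as many independent concurrent inserts that must be synchronized one at a time.

```python
def batch_insert(sorted_seq, batch):
    """Bulk insert a batch of keys into a sorted sequence.

    The batch is deduplicated and sorted, then merged with the existing
    sequence. This merge is the parallelism hook: in a real parallel
    structure it would be a divide-and-conquer merge (or a tree join),
    whereas a concurrent structure takes one insert at a time.
    """
    batch = sorted(set(batch) - set(sorted_seq))
    out, i, j = [], 0, 0
    while i < len(sorted_seq) and j < len(batch):
        if sorted_seq[i] <= batch[j]:
            out.append(sorted_seq[i]); i += 1
        else:
            out.append(batch[j]); j += 1
    return out + sorted_seq[i:] + batch[j:]
```

The design objective also shifts: instead of maximizing throughput of individual concurrent operations, a batch interface aims for low total work and low depth per bulk update, which is what the search trees and spatial trees mentioned above are designed around.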

Bio: Yan Gu is an Assistant Professor in the Computer Science and Engineering (CSE) Department at UC Riverside. He completed his PhD at Carnegie Mellon University in 2018 and his Bachelor’s degree at Tsinghua University in 2012. Before joining UCR, he spent one year as a postdoc at MIT. His research interest lies in designing simple and efficient algorithms with strong theoretical guarantees and good practical performance. He is a recipient of the NSF CAREER Award and an award from the Google Research Scholar Program in 2024. He has won the Best Paper Awards at PPoPP’23 and ESA’23, the Best Paper Runner-up at VLDB’23, and the Outstanding Paper Awards at SPAA’24 and SPAA’20.
