Embedded Systems

Hardware-aware Neural Architecture Search

Bachelor’s Thesis, Master’s Thesis, Student Job, Research Project

Abstract

Designing neural networks (NNs) manually is a complex and time-consuming process that demands expert knowledge, significant computational resources, and extensive experimentation. Neural Architecture Search (NAS) seeks to automate this process by defining a search space of possible network variants and applying optimization algorithms to discover high-performing architectures.

Hardware-aware NAS extends this goal by incorporating hardware considerations (such as latency, memory usage, and supported operations) into the optimization objective, enabling the design of models that are not only accurate but also efficient on the target hardware.
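
To make this concrete, the sketch below, written in PyTorch since that is the framework recommended under Requirements, shows the basic loop: candidates are sampled from a small search space, rejected if they exceed a latency budget, and ranked by a quality score. The search space, the helper names, and the parameter-count quality proxy are illustrative assumptions for this example only, not a specific NAS method from the literature.

    # Illustrative only: random search over a tiny CNN search space with a
    # hardware constraint. All names and values here are hypothetical.
    import random
    import time

    import torch
    import torch.nn as nn

    SEARCH_SPACE = {
        "depth": [2, 3, 4],     # number of conv blocks
        "width": [16, 32, 64],  # channels per block
        "kernel": [3, 5],       # convolution kernel size
    }

    def make_candidate(cfg: dict) -> nn.Module:
        """Build a small CNN from a sampled configuration."""
        layers, in_ch = [], 3
        for _ in range(cfg["depth"]):
            layers += [nn.Conv2d(in_ch, cfg["width"], cfg["kernel"], padding="same"),
                       nn.ReLU()]
            in_ch = cfg["width"]
        layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(in_ch, 10)]
        return nn.Sequential(*layers)

    def measure_latency_ms(model: nn.Module, runs: int = 20) -> float:
        """Rough CPU latency; a real flow would measure on the target device."""
        x = torch.randn(1, 3, 32, 32)
        model.eval()
        with torch.no_grad():
            model(x)  # warm-up run
            start = time.perf_counter()
            for _ in range(runs):
                model(x)
        return (time.perf_counter() - start) / runs * 1e3

    def quality_proxy(model: nn.Module) -> float:
        """Placeholder for validation accuracy or a trained predictor."""
        return sum(p.numel() for p in model.parameters()) ** 0.5

    LATENCY_BUDGET_MS = 5.0  # hypothetical hardware constraint
    best_cfg, best_score = None, float("-inf")
    for _ in range(20):
        cfg = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
        model = make_candidate(cfg)
        if measure_latency_ms(model) > LATENCY_BUDGET_MS:
            continue  # reject candidates that violate the budget
        score = quality_proxy(model)
        if score > best_score:
            best_cfg, best_score = cfg, score
    print("best configuration within budget:", best_cfg)

Random search is used here only to keep the sketch short; more sophisticated search algorithms replace the sampling step while keeping the same propose-evaluate-select structure and hardware feedback in the loop.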

NAS research is typically organized around three key components: the search space, the search algorithm, and the performance estimation strategy (1). Our group investigates all three aspects, with a particular focus on hardware-aware NAS. Possible thesis topics include but are not limited to:

  • Designing search spaces tailored to specific hardware platforms or tasks
  • Combining NAS with other model compression techniques such as pruning, quantization, or distillation
  • Developing constraint-solving approaches to automatically generate valid search spaces
  • Investigating surrogate models for fast and reliable architecture performance prediction (see the sketch after this list)
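
As an illustration of the surrogate-model topic, the sketch below trains a small regressor to map one-hot architecture encodings to latency labels, so that a search can query predictions instead of running costly measurements. The search space, the encoding, and the synthetic latency labels (stand-ins for real on-device benchmarks) are assumptions made for the example.

    # Illustrative only: a latency surrogate trained on synthetic labels.
    import random

    import torch
    import torch.nn as nn

    SEARCH_SPACE = {"depth": [2, 3, 4], "width": [16, 32, 64], "kernel": [3, 5]}

    def encode(cfg: dict) -> list:
        """One-hot encode a configuration so the surrogate can consume it."""
        feats = []
        for key, choices in SEARCH_SPACE.items():
            feats += [1.0 if cfg[key] == c else 0.0 for c in choices]
        return feats

    def synthetic_latency_ms(cfg: dict) -> float:
        """Stand-in for on-device measurement: cost grows with compute."""
        return 0.01 * cfg["depth"] * cfg["width"] * cfg["kernel"] ** 2

    configs = [{k: random.choice(v) for k, v in SEARCH_SPACE.items()}
               for _ in range(64)]
    X = torch.tensor([encode(c) for c in configs])
    y = torch.tensor([[synthetic_latency_ms(c)] for c in configs])

    surrogate = nn.Sequential(nn.Linear(X.shape[1], 16), nn.ReLU(), nn.Linear(16, 1))
    optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-2)
    loss_fn = nn.MSELoss()
    for _ in range(200):  # fit the regressor to the labeled pairs
        optimizer.zero_grad()
        loss_fn(surrogate(X), y).backward()
        optimizer.step()

    # During search, a prediction replaces an expensive measurement.
    query = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
    pred = surrogate(torch.tensor([encode(query)])).item()
    print(f"predicted latency for {query}: {pred:.3f} ms")

In a real flow, the labels would come from benchmarking sampled architectures on the target hardware, and the surrogate would be refined as new measurements arrive.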

Requirements

  • Basics: Python, Git, Linux
  • PyTorch (recommended)
  • Successful completion of the lecture “Efficient Machine Learning in Hardware” (recommended)

Contact

Reiber, Moritz

Bringmann, Oliver