Unlocking the Power of Content Addressable Memory for Memory-Intensive Applications
dataset
posted on 2024-05-13, 22:02, authored by Mengyuan Li
Because data movement is costly in the traditional von Neumann architecture, particularly for data-intensive workloads, in-memory computing (IMC), which integrates computation into memory, has emerged as a highly attractive alternative. Within the realm of IMC, Content-Addressable Memory (CAM) stands out as a specialized unit capable of executing in-memory search operations in a massively parallel fashion. CAM offers distinct advantages, particularly for applications that rely heavily on large-scale match or search operations. However, realizing the full potential of CAM-based accelerators necessitates cross-layer efforts spanning circuits, architecture, and applications.
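To make the parallel-search idea concrete, the following is a minimal software sketch of ternary CAM (TCAM) match semantics. In actual hardware every stored row is compared against the query simultaneously in a single cycle; the sequential loop, the function name `tcam_search`, and the example table below are illustrative assumptions, not part of the thesis.

```python
# Sketch of TCAM semantics: each stored word is compared against the
# query; 'x' marks a don't-care (wildcard) bit that matches anything.
# Hardware performs all row comparisons in parallel; we model them in a loop.
def tcam_search(entries, query):
    """Return the indices of all stored entries that match the query."""
    matches = []
    for i, word in enumerate(entries):
        if all(b == 'x' or b == q for b, q in zip(word, query)):
            matches.append(i)
    return matches

table = ["10x1", "1001", "0xx0"]
print(tcam_search(table, "1001"))  # "10x1" and "1001" both match -> [0, 1]
```

A real CAM returns all matching rows (or the highest-priority one) in constant time regardless of table size, which is what makes large-scale lookup workloads attractive targets.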
This thesis aims to advance the development of IMC hardware, with a focus on enhancing the capabilities of CAM-based hardware and its applicability to memory-intensive applications. To enhance CAM's capabilities, this thesis optimizes CAM-based hardware from both application-driven and circuit-driven perspectives. Specifically, to achieve end-to-end performance improvements for emerging ML workloads, such as recommendation systems and reinforcement learning, we exploit associative search and devise an innovative CAM-based architecture that moves computation into memory. At the circuit level, novel programming methods are developed to expand CAM's hardware capability by enabling more distance functions on CAM cells.
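The distance functions mentioned above generalize exact match to best-match (nearest-neighbor) search. The sketch below models that behavior in software under stated assumptions: the Hamming metric and the helper names `hamming` and `best_match` are illustrative choices, and the per-row scoring that hardware would perform in parallel is modeled sequentially.

```python
# Sketch of associative best-match search, as a CAM supporting a
# programmable distance function might behave: every stored row is
# scored against the query (in parallel in hardware) and the row with
# the smallest distance wins.
def hamming(a, b):
    """Number of bit positions where the two words differ."""
    return sum(x != y for x, y in zip(a, b))

def best_match(entries, query, distance=hamming):
    """Return (index, distance) of the stored entry nearest the query."""
    return min(enumerate(distance(e, query) for e in entries),
               key=lambda t: t[1])

rows = ["1000", "1111", "0000"]
print(best_match(rows, "1011"))  # "1111" is nearest at distance 1 -> (1, 1)
```

Swapping in a different `distance` callable mirrors what circuit-level programming methods enable on the CAM array itself: the same stored data can serve exact-, threshold-, or nearest-match queries.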
To broaden the applicability of CAM technology and facilitate collaboration between software and hardware experts, this thesis introduces an extensive CAM toolchain comprising CAMASim, a functional and performance simulator that supports both accuracy prediction and performance evaluation. These contributions tackle critical challenges in CAM-based hardware design for memory-intensive applications. Furthermore, they lay the foundation for broader exploration of CAM-based accelerators.