MaxLinear’s Augmented Neuron technology mathematically augments DNNs, reducing the amount of computation required to achieve a target accuracy. This leads to reduced memory usage, lower latency, and increased throughput. We provide optimized integrations with TensorFlow and with Intel’s OpenVINO and oneDNN libraries, enabling enhanced AI within industry-standard workflows.

MaxLinear strives to improve the world’s communication networks for everyone by connecting people through our highly integrated radio-frequency (RF), analog, digital, and mixed-signal semiconductor solutions and licensable machine learning IP for access and connectivity, wired and wireless infrastructure, and industrial and multi-market applications.


MaxLinear’s Augmented Neuron technology enables small DNN models to achieve high accuracy, leading to AI solutions with lower computational and memory cost. Augmented Neurons are provided via a new set of convolutional and dense layers that are fully compatible, drop-in replacements for standard DNN layers.
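To illustrate the drop-in-replacement pattern described above, here is a minimal sketch in TensorFlow. The `AugmentedConv2D` class below is a hypothetical stand-in, not MaxLinear’s actual layer: it simply wraps a standard `Conv2D` to show that a custom layer with the same call signature can replace `tf.keras.layers.Conv2D` with no other changes to the model definition.

```python
# Illustrative only: a stand-in layer showing the drop-in-replacement
# pattern. It wraps a standard Conv2D; a real augmented layer would
# substitute its own computation while keeping the same interface.
import tensorflow as tf

class AugmentedConv2D(tf.keras.layers.Layer):
    """Hypothetical stand-in for an augmented convolutional layer."""
    def __init__(self, filters, kernel_size, **kwargs):
        super().__init__()
        self.conv = tf.keras.layers.Conv2D(filters, kernel_size, **kwargs)

    def call(self, inputs):
        return self.conv(inputs)

# Swapping the layer requires no other changes to the model definition.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    AugmentedConv2D(16, 3, padding="same", activation="relu"),  # was Conv2D
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])
```

Because the replacement layer accepts the same constructor arguments and input/output shapes, existing training and export code continues to work unchanged.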

Augmented Neurons are delivered via TensorFlow and OpenVINO plugins, easing integration into customers’ existing workflows. These plugins include implementations optimized for Intel CPUs supporting the SSE, AVX2, and AVX-512 instruction sets.
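Before selecting a build optimized for a particular instruction set, one might first check what the host CPU advertises. The helper below is an illustrative sketch (not part of any MaxLinear plugin) that reads the CPU feature flags from `/proc/cpuinfo` on Linux; the function name and flag choices are assumptions for the example.

```python
# Hedged sketch: detect which x86 SIMD instruction sets the host CPU
# advertises, by parsing the "flags" line of /proc/cpuinfo (Linux).
def cpu_isa_support(cpuinfo_path="/proc/cpuinfo"):
    """Return a dict mapping instruction-set names to availability."""
    flags = set()
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    flags.update(line.split(":", 1)[1].split())
                    break
    except OSError:
        pass  # non-Linux or unreadable: report everything as unsupported
    return {
        "SSE4.2": "sse4_2" in flags,
        "AVX2": "avx2" in flags,
        "AVX-512": "avx512f" in flags,  # avx512f = AVX-512 Foundation
    }

print(cpu_isa_support())
```

A dispatch layer in an optimized library typically performs a check like this once at load time and then routes to the fastest kernel the CPU supports.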