What if your AI model has to run on a device with less RAM than a single smartphone photo? Edge AI on Embedded Devices answers that question with engineering discipline, not theory.

Why this matters now: Billions of microcontrollers power our world: pacemakers, industrial sensors, smart infrastructure. Cloud AI can't reach them. This book shows how to build machine learning systems that thrive under constraints where standard ML practices break down.

What makes this different:
- Concrete trade-offs between accuracy, latency, memory, and power consumption on real hardware
- Model optimization techniques that preserve performance when kilobytes matter
- Deployment pipelines designed for resource-limited targets, not GPU clusters
- Security and maintenance strategies for devices in the field for decades
- Hardware selection frameworks that match model complexity to silicon capabilities

Systems-level thinking: Connects model architecture to power management, real-time OS behavior, and long-term reliability. No abstraction comes without cost analysis.

For practitioners: Written for engineers building production systems, not running benchmarks. Embedded developers learn ML constraints. ML engineers learn embedded realities. Both learn to design AI that survives deployment.

Build AI that runs where cloud computing ends. Start designing systems engineered for silicon, not slides.