Transformers and Large Language Models: A Hands-On Guide to RAG and Agentic AI (Paperback)

ISBN: B0GTPRZ6MY

ISBN13: 9798868827846


This book is a hands-on guide to understanding the foundations, architectures, and real-world applications of transformers and large language models in modern AI.

The book begins by laying the foundations of generative AI: architectures, tokenization, encoding, and classical modeling techniques. Initial chapters trace the evolution from feed-forward networks and recurrent neural networks to long short-term memory (LSTM), setting the stage for the revolutionary transformer architecture. The core of the book focuses on transformers, introducing the encoder-decoder framework, attention mechanisms, positional encodings, and the internal workings of multi-head attention, normalization, and multi-layer perceptrons. Readers gain insight into advanced techniques such as rotary positional embeddings (RoPE), mixture of experts (MoE), and knowledge distillation, alongside practical training strategies like self-supervised learning, fine-tuning, and reinforcement learning from human feedback. Popular models from OpenAI, DeepSeek, and other vendors are examined to highlight the evolution of the LLM landscape.

Building on these foundations, the text explores methods for model customization, including parameter-efficient fine-tuning (LoRA, adapters), text generation strategies, prompt engineering, and quantization. Retrieval-Augmented Generation (RAG) is introduced as a critical innovation for grounding LLMs in external knowledge, with detailed evaluation techniques for both retrieval and generation. Finally, the book ventures into Agentic AI, demonstrating protocols like the Model Context Protocol (MCP) and Agent-to-Agent (A2A) interactions with practical coding examples.

In conclusion, this book serves as a practical guide, equipping readers with the technical depth and applied strategies needed to design, fine-tune, and deploy cutting-edge transformers and large language models for real-world applications.
What we will learn:
- Understand the foundations of AI, ML pipelines, tokenization, encoding, and early neural architectures.
- Explore transformers in depth: encoder-decoder design, attention mechanisms, and advanced embedding methods.
- Learn modern LLM advancements such as RoPE, MoE, SLMs, fine-tuning strategies, and evaluation techniques.
- Master practical customization through prompt engineering, PEFT methods, quantization, and text generation.

Who this book is for: Data scientists, ML engineers, AI researchers, and developers exploring transformers and large language models.

Recommended

Format: Paperback

$41.85
Save $18.14!
List Price $59.99
Releases 8/1/2026

Customer Reviews

0 ratings
Copyright © 2026 ThriftBooks.com