Can efficient AI be powerful without requiring massive compute resources or costly cloud subscriptions?
Engineering with Small Language Models answers this question by showing how Small Language Models (SLMs) deliver high-performance natural language processing in resource-constrained environments. While large language models dominate headlines, SLMs offer a compelling alternative: fast inference, low memory usage, and flexible deployment on CPUs, mobile devices, edge hardware, and affordable GPUs. With tools like Hugging Face, PyTorch, and advanced techniques such as quantization and federated learning, you can build production-ready AI systems that are lightweight, secure, and scalable.
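To give a sense of how lightweight this tooling is, here is a minimal sketch, assuming the Hugging Face Transformers library and the publicly available distilgpt2 checkpoint (chosen purely for illustration, not prescribed by the book), that loads a small causal language model and generates text entirely on CPU:

```python
# Minimal sketch: load a small language model with Hugging Face Transformers
# and run CPU inference. "distilgpt2" is an example small checkpoint; any
# compact causal LM could be substituted.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "distilgpt2"  # example model, not one mandated by the book
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)  # defaults to CPU

inputs = tokenizer("Small language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```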
This comprehensive guide takes you through the entire SLM lifecycle, from design and training to optimization and deployment. Written for developers, AI engineers, and data scientists, it provides clear, practical workflows backed by real-world code and case studies. You’ll learn how to fine-tune models with parameter-efficient methods like LoRA, compress them using 4-bit quantization and pruning, and deploy them on devices like Raspberry Pi or smartphones. The book also addresses critical topics like privacy, bias mitigation, and compliance, ensuring your AI systems are ethical and production-ready.
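The parameter-efficient fine-tuning workflow mentioned above can be outlined with Hugging Face PEFT and bitsandbytes. The snippet below is an illustrative sketch, not the book's own code; it assumes the facebook/opt-350m checkpoint and a GPU with bitsandbytes 4-bit support:

```python
# Illustrative sketch: attach a LoRA adapter to a 4-bit quantized base model
# using Hugging Face PEFT and bitsandbytes. Model choice and hyperparameters
# are assumptions for demonstration only.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
base = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-350m",            # example small model
    quantization_config=bnb_config,
    device_map="auto",
)
base = prepare_model_for_kbit_training(base)  # stabilize k-bit training

lora = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05, task_type="CAUSAL_LM")
model = get_peft_model(base, lora)
model.print_trainable_parameters()  # only the small LoRA weights are trainable
```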
What’s Inside:
- Setting up and running SLMs with Hugging Face and PyTorch
- Fine-tuning with LoRA, QLoRA, and adapters for domain-specific tasks
- Compression techniques: 4-bit/8-bit quantization, GPTQ, AWQ, and pruning
- Exporting models to ONNX, TensorFlow Lite, and Core ML for edge deployment
- On-device inference for Raspberry Pi, Android, iOS, and IoT devices
- Federated learning and differential privacy for secure, privacy-preserving AI
- Building scalable inference APIs with FastAPI and TorchServe (see the sketch after this list)
- Kubernetes, serverless, and autoscaling strategies for cloud deployment
- Ethical AI: bias mitigation, interpretability, and accessibility best practices
- Case studies in chatbots, healthcare, finance, and IoT
- CI/CD pipelines, monitoring, and performance optimization workflows
- Appendices with scripts, datasets, and troubleshooting guides
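As a taste of the serving topic listed above, here is a hypothetical minimal FastAPI endpoint wrapping a small text-generation pipeline; the model name and route are illustrative assumptions, not the book's implementation:

```python
# Hypothetical sketch of a FastAPI inference endpoint around a small model.
# Run locally with: uvicorn app:app
# Then POST {"text": "Hello"} to /generate.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()
generator = pipeline("text-generation", model="distilgpt2")  # example model

class Prompt(BaseModel):
    text: str
    max_new_tokens: int = 20

@app.post("/generate")
def generate(prompt: Prompt):
    result = generator(prompt.text, max_new_tokens=prompt.max_new_tokens)
    return {"completion": result[0]["generated_text"]}
```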
About the Reader: This book is for developers, AI engineers, data scientists, and advanced learners who want to build efficient, scalable NLP systems without relying on massive infrastructure. A working knowledge of Python and basic familiarity with machine learning concepts are all you need to get started. Whether you’re a startup founder integrating AI into a mobile app, a researcher optimizing models for edge devices, or an engineer deploying secure APIs, this book equips you with practical tools and insights.
SLMs are transforming AI by making it faster, lighter, and more accessible. From fine-tuning on a laptop to deploying on constrained IoT devices, Engineering with Small Language Models is your definitive resource for creating impactful AI solutions. Get your copy today and start building smarter, more efficient systems—one small model at a time.
The information provided in the "Synopsis" section may refer to another edition of this title.
Seller: California Books, Miami, FL, United States
Condition: New. Print on Demand. Seller reference no. I-9798298559843
Quantity available: More than 20 available
Seller: PBShop.store US, Wood Dale, IL, United States
PAP. Condition: New. New Book. Shipped from UK. Established seller since 2000. Seller reference no. L2-9798298559843
Quantity available: More than 20 available
Seller: PBShop.store UK, Fairford, GLOS, United Kingdom
PAP. Condition: New. New Book. Shipped from UK. Established seller since 2000. Seller reference no. L2-9798298559843
Quantity available: More than 20 available
Seller: CitiRetail, Stevenage, United Kingdom
Paperback. Condition: New. This item is printed on demand. Shipping may be from our UK warehouse or from our Australian or US warehouses, depending on stock availability. Seller reference no. 9798298559843
Quantity available: 1 available