In recent years, transformer-based AI language models have gained prominence due to their powerful capabilities across a variety of tasks, including the generation of images, video, text, and code. Large Language Models (LLMs) exist with parameter counts of a trillion and greater. Such models are proprietary and unavailable for organizations to deploy privately. Even if such deployments were possible, the tremendous resource requirements of LLMs preclude their deployment on infrastructure smaller than enterprise and hyperscale data centers. Small Language Models (SLMs), with far lower parameter counts of billions or fewer, are a viable alternative for use on small servers and edge devices, including PCs. While SLMs possess generative capabilities similar to those of LLMs, the reduction in model size is correlated with a decrease in accuracy when evaluated across a broad range of generative applications, including code generation in multiple languages. To mitigate this shortcoming, an SLM may be fine-tuned with a curated code dataset consisting of code examples in a target programming language. This praxis presents results illustrating how two fine-tuned SLM variants were created that improve average accuracy in C++ code generation by more than 9%, and in Rust code generation by more than 14%.
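The dataset-curation step mentioned above can be illustrated with a minimal sketch. The record format, function name, and filtering thresholds below are illustrative assumptions, not the book's actual pipeline; real curation would typically add license checks, near-duplicate detection, and compilation or test-based validation.

```python
# Hypothetical sketch: curating a code dataset for SLM fine-tuning.
# Assumption: each record is a dict with "language" and "code" keys.

def curate(records, target_language, min_chars=40, max_chars=4000):
    """Keep deduplicated examples in the target language within a size range."""
    seen = set()
    curated = []
    for rec in records:
        code = rec.get("code", "").strip()
        if rec.get("language") != target_language:
            continue  # keep only the target programming language
        if not (min_chars <= len(code) <= max_chars):
            continue  # drop trivially short or oversized examples
        if code in seen:
            continue  # exact-duplicate removal
        seen.add(code)
        curated.append({"language": target_language, "code": code})
    return curated

# Example: curate a tiny mixed-language corpus down to Rust examples.
corpus = [
    {"language": "rust", "code": 'fn main() { println!("Hello, world!"); }'},
    {"language": "cpp", "code": "int main() { return 0; }"},
    {"language": "rust", "code": 'fn main() { println!("Hello, world!"); }'},  # duplicate
    {"language": "rust", "code": "fn f(){}"},  # too short
]
rust_set = curate(corpus, "rust")
print(len(rust_set))  # → 1
```

The curated output would then serve as the fine-tuning corpus for the target language, which is the mechanism by which the per-language accuracy gains reported above are pursued.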
Methods to Improve AI Code Generation | Mohd Rashid | Paperback | English | 2026 | RASHID PUBLICATIONS | EAN 9785161063675