Basics of predictive best-estimate model calibration. Predictive best-estimate model validation, model calibration, and model verification for open and chaotic systems. Differences from traditional statistical evaluation methods. Examples.
The information provided in the "Synopsis" section may refer to another edition of this title.
Dan Gabriel Cacuci received his Master of Science, Master of Philosophy, and Doctor of Philosophy degrees in applied physics and nuclear engineering from Columbia University in New York City. His scientific expertise encompasses the following areas: predictive best-estimate analysis of large-scale physical and engineering systems, large-scale scientific computations, and nuclear engineering (reactor multi-physics, dynamics, and safety). He currently holds the South Carolina SmartState Endowed Chair and Directorship of the Center of Economic Excellence in Nuclear Science and Energy at the University of South Carolina in Columbia, USA.
The information provided in the "About the book" section may refer to another edition of this title.
Seller: Brook Bookstore On Demand, Napoli, NA, Italy
Condition: New. This is a print-on-demand item. Seller reference fcf9f0b754cf6a608abce100833fc556
Quantity available: more than 20
Seller: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. Seller reference ria9783662583937_new
Quantity available: more than 20
Seller: moluna, Greven, Germany
Hardcover. Condition: New. Seller reference 255916551
Quantity available: more than 20
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Book. Condition: New. This item is printed on demand (allow 3-4 extra days). New stock. This book addresses the experimental calibration of best-estimate numerical simulation models. The results of measurements and computations are never exact. Therefore, knowing only the nominal values of experimentally measured or computed quantities is insufficient for applications, particularly since the respective experimental and computed nominal values seldom coincide. In the author's view, the objective of predictive modeling is to extract 'best estimate' values for model parameters and predicted results, together with 'best estimate' uncertainties for these parameters and results. To achieve this goal, predictive modeling combines imprecisely known experimental and computational data, which calls for reasoning on the basis of incomplete, error-rich, and occasionally discrepant information. The customary methods used for data assimilation combine experimental and computational information by minimizing an a priori, user-chosen 'cost functional' (usually a quadratic functional that represents the weighted errors between measured and computed responses). In contrast to these user-influenced methods, the BERRU (Best Estimate Results with Reduced Uncertainties) predictive modeling methodology developed by the author relies on the thermodynamics-based maximum entropy principle to eliminate the need to minimize user-chosen functionals, thus generalizing the 'data adjustment' and '4D-VAR' data assimilation procedures used in the geophysical sciences. The BERRU predictive modeling methodology also provides a 'model validation metric' that quantifies the consistency (agreement/disagreement) between measurements and computations. This 'model validation metric' (or 'consistency indicator') is constructed from parameter covariance matrices, response covariance matrices (measured and computed), and response sensitivities to model parameters.
Traditional methods for computing response sensitivities are hampered by the 'curse of dimensionality,' which makes them impractical for large-scale systems that involve many imprecisely known parameters. Reducing the computational effort required to calculate the response sensitivities precisely is therefore paramount, and the comprehensive adjoint sensitivity analysis methodology developed by the author shows great promise in this regard, as demonstrated in this book. After discarding inconsistent data (if any) using the consistency indicator, the BERRU predictive modeling methodology provides best-estimate values for predicted parameters and responses, along with reduced best-estimate uncertainties (i.e., smaller predicted standard deviations) for the predicted quantities. Applying the BERRU methodology yields optimal, experimentally validated 'best estimate' predictive modeling tools for designing new technologies and facilities, while also improving existing ones. 468 pp. English. Seller reference 9783662583937
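The quadratic cost functional and consistency indicator mentioned in the description can be illustrated for a single scalar response. The sketch below shows the classical inverse-variance data-adjustment result, not the author's multivariate BERRU formulas; the function name and the sample numbers are invented for this illustration:

```python
# Illustrative sketch (hypothetical, scalar case): minimizing the quadratic
# cost functional
#   Q(r) = (r - r_meas)^2 / var_meas + (r - r_comp)^2 / var_comp
# yields an inverse-variance-weighted best estimate with a reduced
# predicted variance, plus a chi-square-like consistency indicator.

def best_estimate(r_meas, var_meas, r_comp, var_comp):
    w_m, w_c = 1.0 / var_meas, 1.0 / var_comp
    # Stationary point of Q(r): the weighted average of the two values.
    r_best = (w_m * r_meas + w_c * r_comp) / (w_m + w_c)
    # Predicted variance is smaller than either input variance.
    var_best = 1.0 / (w_m + w_c)
    # Consistency indicator: squared discrepancy scaled by combined variance.
    chi2 = (r_meas - r_comp) ** 2 / (var_meas + var_comp)
    return r_best, var_best, chi2

# Example: measured response 10.0 (variance 4.0), computed 12.0 (variance 1.0).
r_best, var_best, chi2 = best_estimate(10.0, 4.0, 12.0, 1.0)
# r_best = 11.6, var_best = 0.8, chi2 = 0.8
```

Setting dQ/dr = 0 gives w_m (r - r_meas) + w_c (r - r_comp) = 0, whose solution is the weighted average above; the predicted variance 1/(w_m + w_c) is always smaller than either input variance, which is the "reduced uncertainty" of the book's title. A chi2 value much larger than 1 would flag the measurement and computation as mutually inconsistent.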
Quantity available: 2
Seller: buchversandmimpf2000, Emtmannsberg, BAYE, Germany
Book. Condition: New. This item is printed on demand (Print on Demand title). New stock. Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg. 468 pp. English. Seller reference 9783662583937
Quantity available: 1
Seller: Books Puddle, New York, NY, United States
Condition: New. Seller reference 26376625625
Quantity available: 4
Seller: AHA-BUCH GmbH, Einbeck, Germany
Book. Condition: New. Printed on demand after ordering; new stock. Seller reference 9783662583937
Quantity available: 1
Seller: Majestic Books, Hounslow, United Kingdom
Condition: New. Print on demand. Seller reference 369452550
Quantity available: 4
Seller: Biblios, Frankfurt am Main, Hesse, Germany
Condition: New. Print on demand. Seller reference 18376625619
Quantity available: 4
Seller: Mispah books, Redhill, Surrey, United Kingdom
Hardcover. Condition: New. Ships from multiple locations. Seller reference ERICA77336625839336
Quantity available: 1