Optimal Control: Calculus of Variations, Optimization (mathematics), Control Theory, Continuous Signal, Discrete Time, Dynamic Programming, Bellman Equation, Trajectory Optimization - Softcover

ISBN 13: 9786130306885

Synopsis

Please note that the content of this book primarily consists of articles available from Wikipedia or other free sources online. Optimal control deals with the problem of finding a control law for a given system such that a certain optimality criterion is achieved. A control problem includes a cost functional that is a function of the state and control variables. An optimal control is characterized by a set of differential equations describing the paths of the control variables that minimize the cost functional. The optimal control can be derived using Pontryagin's maximum principle (a necessary condition), or by solving the Hamilton-Jacobi-Bellman equation (a sufficient condition).
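As a minimal illustration of these ideas (not drawn from the book itself), the linear-quadratic regulator is the standard special case in which the Hamilton-Jacobi-Bellman equation reduces to an algebraic Riccati equation. The sketch below, assuming a scalar integrator plant with unit state and control weights, solves that Riccati equation numerically with SciPy and recovers the optimal feedback gain:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative scalar plant: x_dot = A x + B u with A = 0, B = 1 (a pure integrator),
# and cost functional J = integral of (x' Q x + u' R u) dt with Q = R = 1.
A = np.array([[0.0]])
B = np.array([[1.0]])
Q = np.array([[1.0]])  # state cost weight
R = np.array([[1.0]])  # control cost weight

# Solve the continuous-time algebraic Riccati equation
#   A'P + P A - P B R^{-1} B' P + Q = 0,
# which is what the HJB equation becomes for a linear-quadratic problem.
P = solve_continuous_are(A, B, Q, R)

# Optimal state-feedback gain: u = -K x with K = R^{-1} B' P.
K = np.linalg.solve(R, B.T @ P)
print(P, K)
```

For this scalar example the Riccati equation is -p^2 + 1 = 0, so p = 1 and the optimal control law is u = -x; the numerical solution agrees with that closed form.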

The information provided in the "Synopsis" section may refer to another edition of this title.
