Adaptive Control: Stability, Convergence, and Robustness
by Shankar Sastry, Marc Bodson
Publisher: Prentice Hall 1994
Number of pages: 378
This book presents the major results, analysis techniques, and new directions of research in adaptive systems. The authors give a clear, conceptual presentation of adaptive methods, enabling a critical evaluation of these techniques and suggesting avenues for further development. The book covers the deterministic theory of identification and adaptive control, with a focus on linear, continuous-time, single-input, single-output systems.
Download or read it online for free (multiple PDF files).
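As a taste of the gradient-type adaptation laws analyzed in this kind of deterministic theory, here is a minimal sketch (all values assumed for illustration, not code from the book) of estimating an unknown plant gain from input/output data:

```python
import math

theta_star = 2.5      # unknown true plant gain (assumed for illustration)
theta_hat = 0.0       # initial parameter estimate
gamma = 2.0           # adaptation gain
dt = 0.01             # Euler integration step

for k in range(5000):
    t = k * dt
    u = math.sin(t)                   # persistently exciting input
    y = theta_star * u                # plant output
    e = theta_hat * u - y             # prediction error
    theta_hat -= gamma * e * u * dt   # gradient update law, Euler-discretized

print(round(theta_hat, 2))  # estimate converges toward the true gain 2.5
```

The sinusoidal input is persistently exciting, which is the condition under which parameter convergence is guaranteed; with a constant-zero input the estimate would never move.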
by Ivan Ganchev Ivanov (ed.) - InTech
The book provides a self-contained treatment of practical aspects of stochastic modeling and calculus, including applications in engineering, statistics, and computer science. Readers should be familiar with probability theory and stochastic calculus.
by S. Boyd, L. El Ghaoui, E. Feron, V. Balakrishnan
The authors reduce a wide variety of problems arising in system and control theory to a handful of optimization problems involving linear matrix inequalities (LMIs). These problems can be solved using recently developed numerical algorithms.
by Hugh Jack
Dynamic System Modeling and Control introduces the basic concepts of system modeling with differential equations. Supplemental materials at the end of this book include a writing guide, summary of math topics, and a table of useful engineering units.
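To illustrate the kind of differential-equation modeling such a book introduces, here is a minimal sketch (parameter values assumed, not taken from the book) of a mass-spring-damper system m·x'' + c·x' + k·x = F simulated with forward Euler integration:

```python
m, c, k = 1.0, 0.5, 4.0   # mass, damping, stiffness (assumed values)
F = 0.0                   # no external force: free response
x, v = 1.0, 0.0           # initial displacement and velocity
dt = 0.001                # time step

for _ in range(20000):    # simulate 20 seconds
    a = (F - c * v - k * x) / m   # acceleration from the ODE
    x += v * dt                   # Euler update of position
    v += a * dt                   # Euler update of velocity

print(abs(x) < 0.05)  # the damped oscillation has nearly died out
```

Forward Euler is the simplest scheme and slightly inflates the oscillation energy; textbooks typically move on to Runge-Kutta methods for better accuracy at larger step sizes.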
by Francesco Bullo, Jorge Cortes, Sonia Martinez - Princeton University Press
This introductory book offers a distinctive blend of computer science and control theory. The book presents a broad set of tools for understanding coordination algorithms, determining their correctness, and assessing their complexity.
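A representative coordination algorithm of the kind such tools analyze is distributed averaging consensus, sketched here in a minimal form (graph, step size, and initial states are assumed for illustration): each agent repeatedly moves toward its neighbors' values, and all agents converge to the common average.

```python
values = [0.0, 4.0, 8.0, 12.0]                       # initial agent states
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # path graph on 4 agents
eps = 0.3                                            # step size, small enough for stability

for _ in range(200):
    new = []
    for i, x in enumerate(values):
        # move toward each neighbor by a fraction eps of the disagreement
        new.append(x + eps * sum(values[j] - x for j in neighbors[i]))
    values = new

print([round(v, 2) for v in values])  # all agents near the average 6.0
```

Because the graph is undirected, the update preserves the sum of the states, so the agreement value is exactly the initial average; correctness and convergence-rate questions of this kind are what the complexity analysis in such a book addresses.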