This unified survey of the theory of adaptive filtering, prediction, and control focuses on linear discrete-time systems and explores their natural extensions to nonlinear systems. Reflecting the central role of digital computers in practical applications, the treatment emphasizes discrete-time formulations throughout, summarizing both the theoretical and practical aspects of a large class of adaptive algorithms. Ideal for advanced undergraduate and graduate classes, the book consists of two parts. The first part concerns deterministic systems, covering models, parameter estimation, and adaptive prediction and control. The second part examines stochastic systems, exploring optimal filtering and prediction, parameter estimation, adaptive filtering and prediction, and adaptive control. Extensive appendices summarize the relevant background material, making the volume largely self-contained. Readers will find the theory, formulas, and applications relevant to a variety of fields, including biotechnology, aerospace engineering, computer science, and electrical engineering.