This textbook offers a concise yet rigorous introduction to the calculus of variations and optimal control theory, and is a self-contained resource for graduate students in engineering, applied mathematics, and related subjects. Designed specifically for a one-semester course, the book begins with the calculus of variations, preparing the ground for optimal control. It then gives a complete proof of the maximum principle and covers key topics such as the Hamilton-Jacobi-Bellman theory of dynamic programming.