Book Description

An Application-Oriented Introduction to Essential Optimization Concepts and Best Practices

Optimization is an inherent human tendency that gained new life after the advent of calculus; now, as the world grows increasingly reliant on complex systems, optimization has become both more important and more challenging than ever before. Engineering Optimization provides a practically focused introduction to modern engineering optimization best practices, covering fundamental analytical and numerical techniques throughout each stage of the optimization process.

Although essential algorithms are explained in detail, the focus lies more on the human role: how to create an appropriate objective function, choose decision variables, identify and incorporate constraints, define convergence, and resolve other critical issues that determine the success or failure of an optimization project.

Examples, exercises, and homework throughout reinforce the author’s “do, not study” approach to learning, complementing an application-oriented discussion that builds a deep, transferable understanding of the optimization process applicable to any field.

Providing an excellent reference for students and professionals, Engineering Optimization:

  • Describes and develops a variety of algorithms, including gradient-based methods (such as Newton’s and Levenberg–Marquardt), direct search methods (such as Hooke–Jeeves, Leapfrogging, and Particle Swarm), along with surrogate functions for surface characterization
  • Provides guidance on optimizer choice by application, and explains how to determine appropriate optimizer parameter values
  • Details current best practices for critical stages of specifying an optimization procedure, including choosing decision variables, defining constraints, and modeling relationships
  • Provides access to software and Visual Basic macros for Excel on the companion website, along with solutions to examples presented in the book

Clear explanations, explicit equation derivations, and practical examples make this book ideal for use as part of a class or self-study, assuming a basic understanding of statistics, calculus, computer programming, and engineering models. Anyone seeking best practices for “making the best choices” will find value in this introductory resource.

Table of Contents

  1. Cover
  2. Title Page
  3. Preface
    1. Introduction
    2. Key Points
    3. Book Aspirations
    4. Organization
    5. Rationale for the Book
    6. Target Audience
    7. Presentation Style
  4. Acknowledgments
  5. Nomenclature
  6. About the Companion Website
  7. Section 1: Introductory Concepts
    1. 1 Optimization
      1. 1.1 Optimization and Terminology
      2. 1.2 Optimization Concepts and Definitions
      3. 1.3 Examples
      4. 1.4 Terminology Continued
      5. 1.5 Optimization Procedure
      6. 1.6 Issues That Shape Optimization Procedures
      7. 1.7 Opposing Trends
      8. 1.8 Uncertainty
      9. 1.9 Over‐ and Under‐specification in Linear Equations
      10. 1.10 Over‐ and Under‐specification in Optimization
      11. 1.11 Test Functions
      12. 1.12 Significant Dates in Optimization
      13. 1.13 Iterative Procedures
      14. 1.14 Takeaway
      15. 1.15 Exercises
    2. 2 Optimization Application Diversity and Complexity
      1. 2.1 Optimization
      2. 2.2 Nonlinearity
      3. 2.3 Min, Max, Min–Max, Max–Min, …
      4. 2.4 Integers and Other Discretization
      5. 2.5 Conditionals and Discontinuities: Cliffs, Ridges/Valleys
      6. 2.6 Procedures, Not Equations
      7. 2.7 Static and Dynamic Models
      8. 2.8 Path Integrals
      9. 2.9 Economic Optimization and Other Nonadditive Cost Functions
      10. 2.10 Reliability
      11. 2.11 Regression
      12. 2.12 Deterministic and Stochastic
      13. 2.13 Experimental w.r.t. Modeled OF
      14. 2.14 Single and Multiple Optima
      15. 2.15 Saddle Points
      16. 2.16 Inflections
      17. 2.17 Continuum and Discontinuous DVs
      18. 2.18 Continuum and Discontinuous Models
      19. 2.19 Constraints and Penalty Functions
      20. 2.20 Ranks and Categorization: Discontinuous OFs
      21. 2.21 Underspecified OFs
      22. 2.22 Takeaway
      23. 2.23 Exercises
    3. 3 Validation
      1. 3.1 Introduction
      2. 3.2 Validation
      3. 3.3 Advice on Becoming Proficient
      4. 3.4 Takeaway
      5. 3.5 Exercises
  8. Section 2: Univariate Search Techniques
    1. 4 Univariate (Single DV) Search Techniques
      1. 4.1 Univariate (Single DV)
      2. 4.2 Analytical Method of Optimization
      3. 4.3 Numerical Iterative Procedures
      4. 4.4 Direct Search Approaches
      5. 4.5 Perspectives on Univariate Search Methods
      6. 4.6 Evaluating Optimizers
      7. 4.7 Summary of Techniques
      8. 4.8 Takeaway
      9. 4.9 Exercises
    2. 5 Path Analysis
      1. 5.1 Introduction
      2. 5.2 Path Examples
      3. 5.3 Perspective About Variables
      4. 5.4 Path Distance Integral
      5. 5.5 Accumulation along a Path
      6. 5.6 Slope along a Path
      7. 5.7 Parametric Path Notation
      8. 5.8 Takeaway
      9. 5.9 Exercises
    3. 6 Stopping and Convergence Criteria: 1‐D Applications
      1. 6.1 Stopping versus Convergence Criteria
      2. 6.2 Determining Convergence
      3. 6.3 Combinations of Convergence Criteria
      4. 6.4 Choosing Convergence Threshold Values
      5. 6.5 Precision
      6. 6.6 Other Convergence Criteria
      7. 6.7 Stopping Criteria to End a Futile Search
      8. 6.8 Choices!
      9. 6.9 Takeaway
      10. 6.10 Exercises
  9. Section 3: Multivariate Search Techniques
    1. 7 Multidimension Application Introduction and the Gradient
      1. 7.1 Introduction
      2. 7.2 Illustration of Surface and Terms
      3. 7.3 Some Surface Analysis
      4. 7.4 Parametric Notation
      5. 7.5 Extension to Higher Dimension
      6. 7.6 Takeaway
      7. 7.7 Exercises
    2. 8 Elementary Gradient‐Based Optimizers
      1. 8.1 Introduction
      2. 8.2 Cauchy’s Sequential Line Search
      3. 8.3 Incremental Steepest Descent
      4. 8.4 Takeaway
      5. 8.5 Exercises
    3. 9 Second‐Order Model‐Based Optimizers
      1. 9.1 Introduction
      2. 9.2 Successive Quadratic
      3. 9.3 Newton–Raphson
      4. 9.4 Perspective on CSLS, ISD, SQ, and NR
      5. 9.5 Choosing Step Size for Numerical Estimate of Derivatives
      6. 9.6 Takeaway
      7. 9.7 Exercises
    4. 10 Gradient‐Based Optimizer Solutions
      1. 10.1 Introduction
      2. 10.2 Levenberg–Marquardt (LM)
      3. 10.3 Scaled Variables
      4. 10.4 Conjugate Gradient (CG)
      5. 10.5 Broyden–Fletcher–Goldfarb–Shanno (BFGS)
      6. 10.6 Generalized Reduced Gradient (GRG)
      7. 10.7 Takeaway
      8. 10.8 Exercises
    5. 11 Direct Search Techniques
      1. 11.1 Introduction
      2. 11.2 Cyclic Heuristic Direct (CHD) Search
      3. 11.3 Hooke–Jeeves (HJ)
      4. 11.4 Compare and Contrast CHD and HJ Features: A Summary
      5. 11.5 Nelder–Mead (NM) Simplex: Spendley, Hext, and Himsworth
      6. 11.6 Multiplayer Direct Search Algorithms
      7. 11.7 Leapfrogging
      8. 11.8 Particle Swarm Optimization
      9. 11.9 Complex Method (CM)
      10. 11.10 A Brief Comparison
      11. 11.11 Takeaway
      12. 11.12 Exercises
    6. 12 Linear Programming
      1. 12.1 Introduction
      2. 12.2 Visual Representation and Concepts
      3. 12.3 Basic LP Procedure
      4. 12.4 Canonical LP Statement
      5. 12.5 LP Algorithm
      6. 12.6 Simplex Tableau
      7. 12.7 Takeaway
      8. 12.8 Exercises
    7. 13 Dynamic Programming
      1. 13.1 Introduction
      2. 13.2 Conditions
      3. 13.3 DP Concept
      4. 13.4 Some Calculation Tips
      5. 13.5 Takeaway
      6. 13.6 Exercises
    8. 14 Genetic Algorithms and Evolutionary Computation
      1. 14.1 Introduction
      2. 14.2 GA Procedures
      3. 14.3 Fitness of Selection
      4. 14.4 Takeaway
      5. 14.5 Exercises
    9. 15 Intuitive Optimization
      1. 15.1 Introduction
      2. 15.2 Levels
      3. 15.3 Takeaway
      4. 15.4 Exercises
    10. 16 Surface Analysis II
      1. 16.1 Introduction
      2. 16.2 Maximize Is Equivalent to Minimize the Negative
      3. 16.3 Scaling by a Positive Number Does Not Change DV
      4. 16.4 Scaled and Translated OFs Do Not Change DV
      5. 16.5 Monotonic Function Transformation Does Not Change DV
      6. 16.6 Impact on Search Path or NOFE
      7. 16.7 Inequality Constraints
      8. 16.8 Transforming DVs
      9. 16.9 Takeaway
      10. 16.10 Exercises
    11. 17 Convergence Criteria 2
      1. 17.1 Introduction
      2. 17.2 Defining an Iteration
      3. 17.3 Criteria for Single TS Deterministic Procedures
      4. 17.4 Criteria for Multiplayer Deterministic Procedures
      5. 17.5 Stochastic Applications
      6. 17.6 Miscellaneous Observations
      7. 17.7 Takeaway
      8. 17.8 Exercises
    12. 18 Enhancements to Optimizers
      1. 18.1 Introduction
      2. 18.2 Criteria for Replicate Trials
      3. 18.3 Quasi‐Newton
      4. 18.4 Coarse–Fine Sequence
      5. 18.5 Number of Players
      6. 18.6 Search Range Adjustment
      7. 18.7 Adjustment of Optimizer Coefficient Values or Options in Process
      8. 18.8 Initialization Range
      9. 18.9 OF and DV Transformations
      10. 18.10 Takeaway
      11. 18.11 Exercises
  10. Section 4: Developing Your Application Statements
    1. 19 Scaled Variables and Dimensional Consistency
      1. 19.1 Introduction
      2. 19.2 A Scaled Variable Approach
      3. 19.3 Sampling of Issues with Primitive Variables
      4. 19.4 Linear Scaling Options
      5. 19.5 Nonlinear Scaling
      6. 19.6 Takeaway
      7. 19.7 Exercises
    2. 20 Economic Optimization
      1. 20.1 Introduction
      2. 20.2 Annual Cash Flow
      3. 20.3 Including Risk as an Annual Expense
      4. 20.4 Capital
      5. 20.5 Combining Capital and Nominal Annual Cash Flow
      6. 20.6 Combining Time Value and Schedule of Capital and Annual Cash Flow
      7. 20.7 Present Value
      8. 20.8 Including Uncertainty
      9. 20.9 Takeaway
      10. 20.10 Exercises
    3. 21 Multiple OF and Constraint Applications
      1. 21.1 Introduction
      2. 21.2 Solution 1: Additive Combinations of the Functions
      3. 21.3 Solution 2: Nonadditive OF Combinations
      4. 21.4 Solution 3: Pareto Optimal
      5. 21.5 Takeaway
      6. 21.6 Exercises
    4. 22 Constraints
      1. 22.1 Introduction
      2. 22.2 Equality Constraints
      3. 22.3 Inequality Constraints
      4. 22.4 Constraints: Pass/Fail Categories
      5. 22.5 Hard Constraints Can Block Progress
      6. 22.6 Advice
      7. 22.7 Constraint‐Equivalent Features
      8. 22.8 Takeaway
      9. 22.9 Exercises
    5. 23 Multiple Optima
      1. 23.1 Introduction
      2. 23.2 Solution: Multiple Starts
      3. 23.3 Other Options
      4. 23.4 Takeaway
      5. 23.5 Exercises
    6. 24 Stochastic Objective Functions
      1. 24.1 Introduction
      2. 24.2 Method Summary for Optimizing Stochastic Functions
      3. 24.3 What Value to Report?
      4. 24.4 Application Examples
      5. 24.5 Takeaway
      6. 24.6 Exercises
    7. 25 Effects of Uncertainty
      1. 25.1 Introduction
      2. 25.2 Sources of Error and Uncertainty
      3. 25.3 Significant Digits
      4. 25.4 Estimating Uncertainty on Values
      5. 25.5 Propagating Uncertainty on DV Values
      6. 25.6 Implicit Relations
      7. 25.7 Estimating Uncertainty in DV and OF
      8. 25.8 Takeaway
      9. 25.9 Exercises
    8. 26 Optimization of Probable Outcomes and Distribution Characteristics
      1. 26.1 Introduction
      2. 26.2 The Concept of Modeling Uncertainty
      3. 26.3 Stochastic Approach
      4. 26.4 Takeaway
      5. 26.5 Exercises
    9. 27 Discrete and Integer Variables
      1. 27.1 Introduction
      2. 27.2 Optimization Solutions
      3. 27.3 Convergence
      4. 27.4 Takeaway
      5. 27.5 Exercises
    10. 28 Class Variables
      1. 28.1 Introduction
      2. 28.2 The Random Keys Method: Sequence
      3. 28.3 The Random Keys Method: Dichotomous Variables
      4. 28.4 Comments
      5. 28.5 Takeaway
      6. 28.6 Exercises
    11. 29 Regression
      1. 29.1 Introduction
      2. 29.2 Perspective
      3. 29.3 Least Squares Regression: Traditional View on Linear Model Parameters
      4. 29.4 Models Nonlinear in DV
      5. 29.5 Maximum Likelihood
      6. 29.6 Convergence Criterion
      7. 29.7 Model Order or Complexity
      8. 29.8 Bootstrapping to Reveal Model Uncertainty
      9. 29.9 Perspective
      10. 29.10 Takeaway
      11. 29.11 Exercises
  11. Section 5: Perspective on Many Topics
    1. 30 Perspective
      1. 30.1 Introduction
      2. 30.2 Classifications
      3. 30.3 Elements Associated with Optimization
      4. 30.4 Root Finding Is Not Optimization
      5. 30.5 Desired Engineering Attributes
      6. 30.6 Overview of Optimizers and Attributes
      7. 30.7 Choices
      8. 30.8 Variable Classifications
      9. 30.9 Constraints
      10. 30.10 Takeaway
      11. 30.11 Exercises
    2. 31 Response Surface Aberrations
      1. 31.1 Introduction
      2. 31.2 Cliffs (Vertical Walls)
      3. 31.3 Sharp Valleys (or Ridges)
      4. 31.4 Striations
      5. 31.5 Level Spots (Functions 1, 27, 73, 84)
      6. 31.6 Hard‐to‐Find Optimum
      7. 31.7 Infeasible Calculations
      8. 31.8 Uniform Minimum
      9. 31.9 Noise: Stochastic Response
      10. 31.10 Multiple Optima
      11. 31.11 Takeaway
      12. 31.12 Exercises
    3. 32 Identifying the Models, OF, DV, Convergence Criteria, and Constraints
      1. 32.1 Introduction
      2. 32.2 Evaluate the Results
      3. 32.3 Takeaway
      4. 32.4 Exercises
    4. 33 Evaluating Optimizers
      1. 33.1 Introduction
      2. 33.2 Challenges to Optimizers
      3. 33.3 Stakeholders
      4. 33.4 Metrics of Optimizer Performance
      5. 33.5 Designing an Experimental Test
      6. 33.6 Takeaway
      7. 33.7 Exercises
    5. 34 Troubleshooting Optimizers
      1. 34.1 Introduction
      2. 34.2 DV Values Do Not Change
      3. 34.3 Multiple DV Values for the Same OF Value
      4. 34.4 EXE Error
      5. 34.5 Extreme Values
      6. 34.6 DV Is Dependent on Convergence Threshold
      7. 34.7 OF Is Irreproducible
      8. 34.8 Concern over Results
      9. 34.9 CDF Features
      10. 34.10 Parameter Correlation
      11. 34.11 Multiple Equivalent Solutions
      12. 34.12 Takeaway
      13. 34.13 Exercises
  12. Section 6: Analysis of Leapfrogging Optimization
    1. 35 Analysis of Leapfrogging
      1. 35.1 Introduction
      2. 35.2 Balance in an Optimizer
      3. 35.3 Number of Initializations to be Confident That the Best Will Draw All Others to the Global Optimum
      4. 35.4 Leap‐To Window Amplification Analysis
      5. 35.5 Analysis of α and M to Prevent Convergence on the Side of a Hill
      6. 35.6 Analysis of α and M to Minimize NOFE
      7. 35.7 Probability Distribution of Leap‐Overs
      8. 35.8 Takeaway
      9. 35.9 Exercises
  13. Section 7: Case Studies
    1. 36 Case Study 1
      1. 36.1 Process and Analysis
      2. 36.2 Exercises
    2. 37 Case Study 2
      1. 37.1 The Process and Analysis
      2. 37.2 Exercises
    3. 38 Case Study 3
      1. 38.1 The Process and Analysis
      2. 38.2 Exercises
    4. 39 Case Study 4
      1. 39.1 The Process and Analysis
      2. 39.2 Pre‐Assignment Note
      3. 39.3 Exercises
    5. 40 Case Study 5
      1. 40.1 The Process and Analysis
      2. 40.2 Exercises
    6. 41 Case Study 6
      1. 41.1 Description and Analysis
      2. 41.2 Exercises
    7. 42 Case Study 7
      1. 42.1 Concepts and Analysis
      2. 42.2 Exercises
    8. 43 Case Study 8
      1. 43.1 Description and Analysis
      2. 43.2 Exercises
    9. 44 Case Study 9
      1. 44.1 Description and Analysis
      2. 44.2 Exercises
    10. 45 Case Study 10
      1. 45.1 Description and Analysis
      2. 45.2 Exercises
  14. Section 8: Appendices
    1. Appendix A: Mathematical Concepts and Procedures
      1. A.1 Representation of Relations
      2. A.2 Taylor Series Expansion (Single Variable)
      3. A.3 Taylor Series Expansion (Multiple Variable)
      4. A.4 Evaluating First Derivatives at x0
      5. A.5 Partial Derivatives: First Order
      6. A.6 Partial Derivatives: Second Order
      7. A.7 Linear Algebra Notations
      8. A.8 Algebra and Assignment Statements
    2. Appendix B: Root Finding
      1. B.1 Introduction
      2. B.2 Interval Halving: Bisection Method
      3. B.3 Newton’s: Secant Version
      4. B.4 Which to Choose?
    3. Appendix C: Gaussian Elimination
      1. C.1 Linear Equation Sets
      2. C.2 Gaussian Elimination
      3. C.3 Pivoting
      4. C.4 Code in VBA
    4. Appendix D: Steady‐State Identification in Noisy Signals
      1. D.1 Introduction
      2. D.2 Conceptual Model
      3. D.3 Method Equations
      4. D.4 Coefficient, Threshold, and Sample Frequency Values
      5. D.5 Type‐I Error
      6. D.6 Type‐II Error
      7. D.7 Alternate Type‐I Error
      8. D.8 Alternate Array Method
      9. D.9 SSID and TSID VBA Code
    5. Appendix E: Optimization Challenge Problems (2‐D and Single OF)
      1. E.1 Introduction
      2. E.2 Challenges for Optimizers
      3. E.3 Test Functions
      4. E.4 Other Test Function Sources
    6. Appendix F: Brief on VBA Programming
      1. F.1 Introduction
      2. F.2 To Start
      3. F.3 General
      4. F.4 I/O to Excel Cells
      5. F.5 Variable Types and Declarations
      6. F.6 Operations
      7. F.7 Loops
      8. F.8 Conditionals
      9. F.9 Debugging
      10. F.10 Run Buttons (Commands)
      11. F.11 Objects and Properties
      12. F.12 Keystroke Macros
      13. F.13 External File I/O
      14. F.14 Solver Add‐In
      15. F.15 Calling Solver from VBA
  15. Section 9: References and Index
    1. References and Additional Resources
      1. Books on Optimization
      2. Books on Probability and Statistics
      3. Books on Simulation
      4. Specific Techniques
      5. Selected Landmark Papers
      6. Selected Website Resources
  16. Index
  17. End User License Agreement