
Nonlinear Regression Modeling for Engineering Applications



Modeling, Model Validation, and Enabling Design of Experiments
Wiley-ASME Press Series, 1st edition

By: R. Russell Rhinehart

109,99 €

Publisher: Wiley
Format: PDF
Published: August 1, 2016
ISBN/EAN: 9781118597934
Language: English
Number of pages: 400

DRM-protected eBook; a reader such as Adobe Digital Editions and an Adobe ID are required.

Description

<p>Since mathematical models express our understanding of how nature behaves, we use them to validate our understanding of the fundamentals about systems (which could be processes, equipment, procedures, devices, or products). Also, once validated, the model is useful for engineering applications related to diagnosis, design, and optimization.</p> <p>First, we postulate a mechanism, then derive a model grounded in that mechanistic understanding. If the model does not fit the data, our understanding of the mechanism was wrong or incomplete. Patterns in the residuals can guide model improvement. Alternatively, when the model fits the data, our understanding is sufficient and confidently functional for engineering applications.</p> <p>This book details methods of nonlinear regression, computational algorithms, model validation, interpretation of residuals, and useful experimental design. The focus is on practical applications, with relevant methods supported by fundamental analysis.</p> <p>This book will assist both the academic and the industrial practitioner to properly classify the system, choose between the various available modeling options and regression objectives, design experiments to obtain data capturing critical system behaviors, fit the model parameters based on that data, and statistically characterize the resulting model. The author has used the material in the undergraduate unit operations lab course and in advanced control applications.</p>
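<p>The workflow the description outlines (postulate a mechanistic model, fit its coefficients by nonlinear regression, then inspect the residuals for patterns) can be sketched in a few lines. The following is a minimal illustration using Python's SciPy rather than the VBA of the book's appendices; the first-order response model and the "true" coefficient values (a = 2.0, b = 0.5) behind the synthetic data are hypothetical, chosen only to make the example self-contained.</p>

```python
import numpy as np
from scipy.optimize import curve_fit

# Postulated mechanistic model: first-order response y = a * (1 - exp(-b * x))
def model(x, a, b):
    return a * (1.0 - np.exp(-b * x))

# Synthetic "experimental" data: true a = 2.0, b = 0.5, plus small measurement noise
rng = np.random.default_rng(0)
x = np.linspace(0.1, 10.0, 30)
y = model(x, 2.0, 0.5) + rng.normal(0.0, 0.02, x.size)

# Nonlinear least-squares fit (a vertical-SSD objective)
popt, pcov = curve_fit(model, x, y, p0=[1.0, 1.0])

# Residuals: patterns (trends, bias, bimodality) would signal a wrong or
# incomplete mechanism; structureless noise supports the postulated model
residuals = y - model(x, *popt)
print(popt)              # fitted [a, b], close to the true [2.0, 0.5]
print(residuals.mean())  # near zero when the model form is adequate
```

<p>A vertical sum-of-squared-deviations objective, as <code>curve_fit</code> uses here, assumes variability only in the response measurement; the book also treats maximum-likelihood and total-SSD objectives for cases where the inputs are uncertain too.</p>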
<p>Series Preface xiii</p> <p>Preface xv</p> <p>Acknowledgments xxiii</p> <p>Nomenclature xxv</p> <p>Symbols xxxvii</p> <p><b>Part I INTRODUCTION</b></p> <p><b>1 Introductory Concepts 3</b></p> <p>1.1 Illustrative Example – Traditional Linear Least-Squares Regression 3</p> <p>1.2 How Models Are Used 7</p> <p>1.3 Nonlinear Regression 7</p> <p>1.4 Variable Types 8</p> <p>1.5 Simulation 12</p> <p>1.6 Issues 13</p> <p>1.7 Takeaway 15</p> <p>Exercises 15</p> <p><b>2 Model Types 16</b></p> <p>2.1 Model Terminology 16</p> <p>2.2 A Classification of Mathematical Model Types 17</p> <p>2.3 Steady-State and Dynamic Models 21</p> <p>2.3.1 Steady-State Models 22</p> <p>2.3.2 Dynamic Models (Time-Dependent, Transient) 24</p> <p>2.4 Pseudo-First Principles – Appropriated First Principles 26</p> <p>2.5 Pseudo-First Principles – Pseudo-Components 28</p> <p>2.6 Empirical Models with Theoretical Grounding 28</p> <p>2.6.1 Empirical Steady State 28</p> <p>2.6.2 Empirical Time-Dependent 30</p> <p>2.7 Empirical Models with No Theoretical Grounding 31</p> <p>2.8 Partitioned Models 31</p> <p>2.9 Empirical or Phenomenological? 
32</p> <p>2.10 Ensemble Models 32</p> <p>2.11 Simulators 33</p> <p>2.12 Stochastic and Probabilistic Models 33</p> <p>2.13 Linearity 34</p> <p>2.14 Discrete or Continuous 36</p> <p>2.15 Constraints 36</p> <p>2.16 Model Design (Architecture, Functionality, Structure) 37</p> <p>2.17 Takeaway 37</p> <p>Exercises 37</p> <p><b>Part II PREPARATION FOR UNDERLYING SKILLS</b></p> <p><b>3 Propagation of Uncertainty 43</b></p> <p>3.1 Introduction 43</p> <p>3.2 Sources of Error and Uncertainty 44</p> <p>3.2.1 Estimation 45</p> <p>3.2.2 Discrimination 45</p> <p>3.2.3 Calibration Drift 45</p> <p>3.2.4 Accuracy 45</p> <p>3.2.5 Technique 46</p> <p>3.2.6 Constants and Data 46</p> <p>3.2.7 Noise 46</p> <p>3.2.8 Model and Equations 46</p> <p>3.2.9 Humans 47</p> <p>3.3 Significant Digits 47</p> <p>3.4 Rounding Off 48</p> <p>3.5 Estimating Uncertainty on Values 49</p> <p>3.5.1 Caution 50</p> <p>3.6 Propagation of Uncertainty – Overview – Two Types, Two Ways Each 51</p> <p>3.6.1 Maximum Uncertainty 51</p> <p>3.6.2 Probable Uncertainty 56</p> <p>3.6.3 Generality 58</p> <p>3.7 Which to Report? 
Maximum or Probable Uncertainty 59</p> <p>3.8 Bootstrapping 59</p> <p>3.9 Bias and Precision 61</p> <p>3.10 Takeaway 65</p> <p>Exercises 66</p> <p><b>4 Essential Probability and Statistics 67</b></p> <p>4.1 Variation and Its Role in Topics 67</p> <p>4.2 Histogram and Its PDF and CDF Views 67</p> <p>4.3 Constructing a Data-Based View of PDF and CDF 70</p> <p>4.4 Parameters that Characterize the Distribution 71</p> <p>4.5 Some Representative Distributions 72</p> <p>4.5.1 Gaussian Distribution 72</p> <p>4.5.2 Log-Normal Distribution 72</p> <p>4.5.3 Logistic Distribution 74</p> <p>4.5.4 Exponential Distribution 74</p> <p>4.5.5 Binomial Distribution 75</p> <p>4.6 Confidence Interval 76</p> <p>4.7 Central Limit Theorem 77</p> <p>4.8 Hypothesis and Testing 78</p> <p>4.9 Type I and Type II Errors, Alpha and Beta 80</p> <p>4.10 Essential Statistics for This Text 82</p> <p>4.10.1 t-Test for Bias 83</p> <p>4.10.2 Wilcoxon Signed Rank Test for Bias 83</p> <p>4.10.3 r-lag-1 Autocorrelation Test 84</p> <p>4.10.4 Runs Test 87</p> <p>4.10.5 Test for Steady State in a Noisy Signal 87</p> <p>4.10.6 Chi-Square Contingency Test 89</p> <p>4.10.7 Kolmogorov–Smirnov Distribution Test 89</p> <p>4.10.8 Test for Proportion 90</p> <p>4.10.9 F-Test for Equal Variance 90</p> <p>4.11 Takeaway 91</p> <p>Exercises 91</p> <p><b>5 Simulation 93</b></p> <p>5.1 Introduction 93</p> <p>5.2 Three Sources of Deviation: Measurement, Inputs, Coefficients 93</p> <p>5.3 Two Types of Perturbations: Noise (Independent) and Drifts (Persistence) 95</p> <p>5.4 Two Types of Influence: Additive and Scaled with Level 98</p> <p>5.5 Using the Inverse CDF to Generate n and u from UID(0, 1) 99</p> <p>5.6 Takeaway 100</p> <p>Exercises 100</p> <p><b>6 Steady and Transient State Detection 101</b></p> <p>6.1 Introduction 101</p> <p>6.1.1 General Applications 101</p> <p>6.1.2 Concepts and Issues in Detecting Steady State 104</p> <p>6.1.3 Approaches and Issues to SSID and TSID 104</p> <p>6.2 Method 106</p> <p>6.2.1 Conceptual 
Model 106</p> <p>6.2.2 Equations 107</p> <p>6.2.3 Coefficient, Threshold, and Sample Frequency Values 108</p> <p>6.2.4 Noiseless Data 111</p> <p>6.3 Applications 112</p> <p>6.3.1 Applications of the R-Statistic Approach for Process Monitoring 112</p> <p>6.3.2 Applications of the R-Statistic Approach for Determining Regression Convergence 112</p> <p>6.4 Takeaway 114</p> <p>Exercises 114</p> <p><b>Part III REGRESSION, VALIDATION, DESIGN</b></p> <p><b>7 Regression Target – Objective Function 119</b></p> <p>7.1 Introduction 119</p> <p>7.2 Experimental and Measurement Uncertainty – Static and Continuous Valued 119</p> <p>7.3 Likelihood 122</p> <p>7.4 Maximum Likelihood 124</p> <p>7.5 Estimating σx and σy Values 127</p> <p>7.6 Vertical SSD – A Limiting Consideration of Variability Only in the Response Measurement 127</p> <p>7.7 r-Square as a Measure of Fit 128</p> <p>7.8 Normal, Total, or Perpendicular SSD 130</p> <p>7.9 Akaho’s Method 132</p> <p>7.10 Using a Model Inverse for Regression 134</p> <p>7.11 Choosing the Dependent Variable 135</p> <p>7.12 Model Prediction with Dynamic Models 136</p> <p>7.13 Model Prediction with Classification Models 137</p> <p>7.14 Model Prediction with Rank Models 138</p> <p>7.15 Probabilistic Models 139</p> <p>7.16 Stochastic Models 139</p> <p>7.17 Takeaway 139</p> <p>Exercises 140</p> <p><b>8 Constraints 141</b></p> <p>8.1 Introduction 141</p> <p>8.2 Constraint Types 141</p> <p>8.3 Expressing Hard Constraints in the Optimization Statement 142</p> <p>8.4 Expressing Soft Constraints in the Optimization Statement 143</p> <p>8.5 Equality Constraints 147</p> <p>8.6 Takeaway 148</p> <p>Exercises 148</p> <p><b>9 The Distortion of Linearizing Transforms 149</b></p> <p>9.1 Linearizing Coefficient Expression in Nonlinear Functions 149</p> <p>9.2 The Associated Distortion 151</p> <p>9.3 Sequential Coefficient Evaluation 154</p> <p>9.4 Takeaway 155</p> <p>Exercises 155</p> <p><b>10 Optimization Algorithms 157</b></p> <p>10.1 Introduction 157</p> 
<p>10.2 Optimization Concepts 157</p> <p>10.3 Gradient-Based Optimization 159</p> <p>10.3.1 Numerical Derivative Evaluation 159</p> <p>10.3.2 Steepest Descent – The Gradient 161</p> <p>10.3.3 Cauchy’s Method 162</p> <p>10.3.4 Incremental Steepest Descent (ISD) 163</p> <p>10.3.5 Newton–Raphson (NR) 163</p> <p>10.3.6 Levenberg–Marquardt (LM) 165</p> <p>10.3.7 Modified LM 166</p> <p>10.3.8 Generalized Reduced Gradient (GRG) 167</p> <p>10.3.9 Work Assessment 167</p> <p>10.3.10 Successive Quadratic (SQ) 167</p> <p>10.3.11 Perspective 168</p> <p>10.4 Direct Search Optimizers 168</p> <p>10.4.1 Cyclic Heuristic Direct Search 169</p> <p>10.4.2 Multiplayer Direct Search Algorithms 170</p> <p>10.4.3 Leapfrogging 171</p> <p>10.5 Takeaway 173</p> <p><b>11 Multiple Optima 176</b></p> <p>11.1 Introduction 176</p> <p>11.2 Quantifying the Probability of Finding the Global Best 178</p> <p>11.3 Approaches to Find the Global Optimum 179</p> <p>11.4 Best-of-N Rule for Regression Starts 180</p> <p>11.5 Interpreting the CDF 182</p> <p>11.6 Takeaway 184</p> <p><b>12 Regression Convergence Criteria 185</b></p> <p>12.1 Introduction 185</p> <p>12.2 Convergence versus Stopping 185</p> <p>12.3 Traditional Criteria for Claiming Convergence 186</p> <p>12.4 Combining DV Influence on OF 188</p> <p>12.5 Use Relative Impact as Convergence Criterion 189</p> <p>12.6 Steady-State Convergence Criterion 190</p> <p>12.7 Neural Network Validation 197</p> <p>12.8 Takeaway 198</p> <p>Exercises 198</p> <p><b>13 Model Design – Desired and Undesired Model Characteristics and Effects 199</b></p> <p>13.1 Introduction 199</p> <p>13.2 Redundant Coefficients 199</p> <p>13.3 Coefficient Correlation 201</p> <p>13.4 Asymptotic and Uncertainty Effects When Model is Inverted 203</p> <p>13.5 Irrelevant Coefficients 205</p> <p>13.6 Poles and Sign Flips w.r.t. 
the DV 206</p> <p>13.7 Too Many Adjustable Coefficients or Too Many Regressors 206</p> <p>13.8 Irrelevant Model Coefficients 215</p> <p>13.8.1 Standard Error of the Estimate 216</p> <p>13.8.2 Backward Elimination 216</p> <p>13.8.3 Logical Tests 216</p> <p>13.8.4 Propagation of Uncertainty 216</p> <p>13.8.5 Bootstrapping 217</p> <p>13.9 Scale-Up or Scale-Down Transition to New Phenomena 217</p> <p>13.10 Takeaway 218</p> <p>Exercises 218</p> <p><b>14 Data Pre- and Post-processing 220</b></p> <p>14.1 Introduction 220</p> <p>14.2 Pre-processing Techniques 221</p> <p>14.2.1 Steady- and Transient-State Selection 221</p> <p>14.2.2 Internal Consistency 221</p> <p>14.2.3 Truncation 222</p> <p>14.2.4 Averaging and Voting 222</p> <p>14.2.5 Data Reconciliation 223</p> <p>14.2.6 Real-Time Noise Filtering for Noise Reduction (MA, FoF, STF) 224</p> <p>14.2.7 Real-Time Noise Filtering for Outlier Removal (Median Filter) 227</p> <p>14.2.8 Real-Time Noise Filtering, Statistical Process Control 228</p> <p>14.2.9 Imputation of Input Data 230</p> <p>14.3 Post-processing 231</p> <p>14.3.1 Outliers and Rejection Criterion 231</p> <p>14.3.2 Bimodal Residual Distributions 233</p> <p>14.3.3 Imputation of Response Data 235</p> <p>14.4 Takeaway 235</p> <p>Exercises 235</p> <p><b>15 Incremental Model Adjustment 237</b></p> <p>15.1 Introduction 237</p> <p>15.2 Choosing the Adjustable Coefficient in Phenomenological Models 238</p> <p>15.3 Simple Approach 238</p> <p>15.4 An Alternate Approach 240</p> <p>15.5 Other Approaches 241</p> <p>15.6 Takeaway 241</p> <p>Exercises 241</p> <p><b>16 Model and Experimental Validation 242</b></p> <p>16.1 Introduction 242</p> <p>16.1.1 Concepts 242</p> <p>16.1.2 Deterministic Models 244</p> <p>16.1.3 Stochastic Models 246</p> <p>16.1.4 Reality! 
249</p> <p>16.2 Logic-Based Validation Criteria 250</p> <p>16.3 Data-Based Validation Criteria and Statistical Tests 251</p> <p>16.3.1 Continuous-Valued, Deterministic, Steady State, or End-of-Batch 251</p> <p>16.3.2 Continuous-Valued, Deterministic, Transient 263</p> <p>16.3.3 Class/Discrete/Rank-Valued, Deterministic, Batch, or Steady State 264</p> <p>16.3.4 Continuous-Valued, Stochastic, Batch, or Steady State 265</p> <p>16.3.5 Test for Normally Distributed Residuals 266</p> <p>16.3.6 Experimental Procedure Validation 266</p> <p>16.4 Model Discrimination 267</p> <p>16.4.1 Mechanistic Models 267</p> <p>16.4.2 Purely Empirical Models 268</p> <p>16.5 Procedure Summary 268</p> <p>16.6 Alternate Validation Approaches 269</p> <p>16.7 Takeaway 270</p> <p>Exercises 270</p> <p><b>17 Model Prediction Uncertainty 272</b></p> <p>17.1 Introduction 272</p> <p>17.2 Bootstrapping 273</p> <p>17.3 Takeaway 276</p> <p><b>18 Design of Experiments for Model Development and Validation 277</b></p> <p>18.1 Concept – Plan and Data 277</p> <p>18.2 Sufficiently Small Experimental Uncertainty – Methodology 277</p> <p>18.3 Screening Designs – A Good Plan for an Alternate Purpose 281</p> <p>18.4 Experimental Design – A Plan for Validation and Discrimination 282</p> <p>18.4.1 Continually Redesign 282</p> <p>18.4.2 Experimental Plan 283</p> <p>18.5 EHS&LP 286</p> <p>18.6 Visual Examples of Undesired Designs 287</p> <p>18.7 Example for an Experimental Plan 289</p> <p>18.8 Takeaway 291</p> <p>Exercises 292</p> <p><b>19 Utility versus Perfection 293</b></p> <p>19.1 Competing and Conflicting Measures of Excellence 293</p> <p>19.2 Attributes for Model Utility Evaluation 294</p> <p>19.3 Takeaway 295</p> <p>Exercises 296</p> <p><b>20 Troubleshooting 297</b></p> <p>20.1 Introduction 297</p> <p>20.2 Bimodal and Multimodal Residuals 297</p> <p>20.3 Trends in the Residuals 298</p> <p>20.4 Parameter Correlation 298</p> <p>20.5 Convergence Criterion – Too Tight, Too Loose 299</p> <p>20.6 Overfitting 
(Memorization) 300</p> <p>20.7 Solution Procedure Encounters Execution Errors 300</p> <p>20.8 Not a Sharp CDF (OF) 300</p> <p>20.9 Outliers 301</p> <p>20.10 Average Residual Not Zero 302</p> <p>20.11 Irrelevant Model Coefficients 302</p> <p>20.12 Data Work-Up after the Trials 302</p> <p>20.13 Too Many rs! 303</p> <p>20.14 Propagation of Uncertainty Does Not Match Residuals 303</p> <p>20.15 Multiple Optima 304</p> <p>20.16 Very Slow Progress 304</p> <p>20.17 All Residuals are Zero 304</p> <p>20.18 Takeaway 305</p> <p>Exercises 305</p> <p><b>Part IV CASE STUDIES AND DATA</b></p> <p><b>21 Case Studies 309</b></p> <p>21.1 Valve Characterization 309</p> <p>21.2 CO2 Orifice Calibration 311</p> <p>21.3 Enrollment Trend 312</p> <p>21.4 Algae Response to Sunlight Intensity 314</p> <p>21.5 Batch Reaction Kinetics 316</p> <p>Appendix A: VBA Primer: Brief on VBA Programming – Excel in Office 2013 319</p> <p>Appendix B: Leapfrogging Optimizer Code for Steady-State Models 328</p> <p>Appendix C: Bootstrapping with Static Model 341</p> <p>References and Further Reading 350</p> <p>Index 355</p>
<p><b>R. Russell Rhinehart</b>, Oklahoma State University, USA.<br />Professor Rhinehart obtained his Ph.D. in Chemical Engineering in 1985 from North Carolina State University, USA. His research interests include process improvement (modeling, optimization, and control) and product improvement (modeling and design). In 2004 he was named one of InTECH's 50 most influential industry innovators of the past 50 years, and in 2005 he was inducted into the Automation Hall of Fame for the Process Industries. He has published extensively in refereed journal articles.</p>
