Chapter 1 Probability 1
1.1 Introduction 1
1.1.1 Outcomes 2
1.1.2 Events 3
1.1.3 Disjunctions and Conjunctions of Events 3
1.1.4 Axioms of Probability 4
1.1.5 Subjective and Objective Probability 5
1.2 Joint and Conditional Probability 7
1.2.1 Joint Probability 7
1.2.2 Conditional Probability 7
1.2.3 Bayes's Rule 11
1.3 Random Variables 14
1.3.1 Distribution 15
1.3.2 Cumulative Distribution Function 17
1.3.3 Generating Values from an Arbitrary Distribution 17
1.3.4 Expectation of a Random Variable 18
1.3.5 Variance of a Random Variable
1.4 Moments and Moment Generating Functions 21
1.4.1 Moments 21
1.4.2 Moment Generating Functions 22
1.4.3 Properties of Moment Generating Functions 24
1.5 Standard Discrete Distributions 25
1.5.1 Bernoulli Distribution 25
1.5.2 Binomial Distribution 25
1.5.3 Geometric Distribution 27
1.5.4 Poisson Distribution 27
1.6 Standard Continuous Distributions 29
1.6.1 Uniform Distribution 29
1.6.2 Gaussian, or Normal, Distribution 29
1.6.3 Exponential Distribution 32
1.6.4 Power-Law Distribution 34
1.7 Useful Theorems 35
1.7.1 Markov's Inequality 36
1.7.2 Chebyshev's Inequality 36
1.7.3 Chernoff Bound 37
1.7.4 Strong Law of Large Numbers 39
1.7.5 Central Limit Theorem 40
1.8 Jointly Distributed Random Variables 42
1.8.1 Bayesian Networks 44
1.10 Exercises 47

Chapter 2 Statistics 53
2.1 Sampling a Population 53
2.1.1 Types of Sampling 55
2.1.2 Scales 56
2.1.3 Outliers 57
2.2 Describing a Sample Parsimoniously 57
2.2.1 Tables 58
2.2.2 Bar Graphs, Histograms, and Cumulative Histograms 58
2.2.3 The Sample Mean 60
2.2.4 The Sample Median 63
2.2.5 Measures of Variability 64
2.3 Inferring Population Parameters from Sample Parameters 66
2.4 Testing Hypotheses about Outcomes of Experiments 70
2.4.1 Hypothesis Testing 71
2.4.2 Errors in Hypothesis Testing 72
2.4.3 Formulating a Hypothesis 73
2.4.4 Comparing an Outcome with a Fixed Quantity 74
2.4.5 Comparing Outcomes from Two Experiments 76
2.4.6 Testing Hypotheses about Quantities Measured on Ordinal Scales 79
2.4.7 Fitting a Distribution 82
2.4.8 Power 85
2.5 Independence and Dependence: Regression and Correlation 86
2.5.1 Independence 86
2.5.2 Regression 88
2.5.3 Correlation 90
2.6 Comparing Multiple Outcomes Simultaneously: Analysis of Variance 95
2.6.1 One-Way Layout 95
2.6.2 Multiway Layouts 98
2.7 Design of Experiments 99
2.8 Dealing with Large Data Sets 100
2.9 Common Mistakes in Statistical Analysis 103
2.9.1 Defining Population 103
2.9.2 Lack of Confidence Intervals in Comparing Results 103
2.9.3 Not Stating the Null Hypothesis 103
2.9.4 Too Small a Sample 104
2.9.5 Too Large a Sample 104
2.9.6 Not Controlling All Variables When Collecting Observations 104
2.9.7 Converting Ordinal to Interval Scales 104
2.9.8 Ignoring Outliers 105
2.11 Exercises

Chapter 3 Linear Algebra 109
3.1 Vectors and Matrices 109
3.2 Vector and Matrix Algebra 111
3.2.2 Transpose 111
3.2.3 Multiplication 112
3.2.4 Square Matrices 113
3.2.5 Exponentiation 113
3.2.6 Matrix Exponential 114
3.3 Linear Combinations, Independence, Basis, and Dimension 114
3.3.1 Linear Combinations 114
3.3.2 Linear Independence 115
3.3.3 Vector Spaces, Basis, and Dimension 116
3.4 Using Matrix Algebra to Solve Linear Equations 117
3.4.1 Representation 117
3.4.2 Elementary Row Operations and Gaussian Elimination 118
3.4.3 Rank 120
3.4.4 Determinants 121
3.4.5 Cramer's Theorem 123
3.4.6 The Inverse of a Matrix 124
3.5 Linear Transformations, Eigenvalues, and Eigenvectors 125
3.5.1 A Matrix as a Linear Transformation 125
3.5.2 The Eigenvalue of a Matrix 126
3.5.3 Computing the Eigenvalues of a Matrix 129
3.5.4 The Importance of Eigenvalues 132
3.5.5 The Role of the Principal Eigenvalue 133
3.5.6 Finding Eigenvalues and Eigenvectors 134
3.5.7 Similarity and Diagonalization 137
3.6 Stochastic Matrices 138
3.6.1 Computing State Transitions by Using a Stochastic Matrix 139
3.6.2 Eigenvalues of a Stochastic Matrix 140
3.7 Exercises 143

Chapter 4 Optimization 147
4.1 System Modeling and Optimization 147
4.2 Introduction to Optimization 149
4.3 Optimizing Linear Systems 152
4.3.1 Network Flow 156
4.4 Integer Linear Programming 157
4.4.1 Total Unimodularity 160
4.4.2 Weighted Bipartite Matching 161
4.5 Dynamic Programming 162
4.6 Nonlinear Constrained Optimization 164
4.6.1 Lagrangian Techniques 164
4.6.2 Karush-Kuhn-Tucker Conditions for Nonlinear Optimization 166
4.7 Heuristic Nonlinear Optimization 167
4.7.1 Hill Climbing 167
4.7.2 Genetic Algorithms 169
4.8 Exercises 170

Chapter 5 Signals, Systems, and Transforms 173
5.1 Background 173
5.1.1 Sinusoids 173
5.1.2 Complex Numbers 174
5.1.3 Euler's Formula 176
5.1.4 Discrete-Time Convolution and the Impulse Function 179
5.1.5 Continuous-Time Convolution and the Dirac Delta Function 182
5.2 Signals 185
5.2.1 The Complex Exponential Signal 186
5.3 Systems 188
5.4 Analysis of a Linear Time-Invariant System 189
5.4.1 The Effect of an LTI System on a Complex Exponential Input 189
5.4.2 The Output of an LTI System with a Zero Input 191
5.4.3 The Output of an LTI System for an Arbitrary Input 193
5.4.4 Stability of an LTI System
5.5 Transforms 195
5.6 The Fourier Series 196
5.7 The Fourier Transform and Its Properties 200
5.8 The Laplace Transform 209
5.8.1 Poles, Zeroes, and the Region of Convergence 210
5.8.2 Properties of the Laplace Transform 212
5.9 The Discrete Fourier Transform and Fast Fourier Transform 216
5.9.1 The Impulse Train 216
5.9.2 The Discrete-Time Fourier Transform 217
5.9.3 Aliasing 219
5.9.4 The Discrete-Time-and-Frequency Fourier Transform 222
5.9.5 The Fast Fourier Transform 224
5.10 The Z Transform 226
5.10.1 Relationship between Z and Laplace Transforms 229
5.10.2 Properties of the Z Transform 231
5.12 Exercises 234

Chapter 6 Stochastic Processes and Queueing Theory 237
6.1 Overview 237
6.1.1 A General Queueing System 238
6.1.2 Little's Theorem 238
6.2 Stochastic Processes 240
6.2.1 Discrete and Continuous Stochastic Processes 242
6.2.2 Markov Processes 243
6.2.3 Homogeneity, State-Transition Diagrams, and the Chapman-Kolmogorov Equations 244
6.2.4 Irreducibility 246
6.2.5 Recurrence 247
6.2.6 Periodicity 247
6.2.7 Ergodicity 248
6.2.8 A Fundamental Theorem 249
6.2.9 Stationary (Equilibrium) Probability of a Markov Chain 250
6.2.10 A Second Fundamental Theorem 251
6.2.11 Mean Residence Time in a State 252
6.3 Continuous-Time Markov Chains 252
6.3.1 Markov Property for Continuous-Time Stochastic Processes 253
6.3.2 Residence Time in a Continuous-Time Markov Chain 253
6.3.3 Stationary Probability Distribution for a Continuous-Time Markov Chain 253
6.4 Birth-Death Processes 255
6.4.1 Time-Evolution of a Birth-Death Process 255
6.4.2 Stationary Probability Distribution of a Birth-Death Process 256
6.4.3 Finding the Transition-Rate Matrix 257
6.4.4 A Pure-Birth (Poisson) Process 259
6.4.5 Stationary Probability Distribution for a Birth-Death Process 260
6.5 The M/M/1 Queue 262
6.6 Two Variations on the M/M/1 Queue 266
6.6.1 The M/M/∞ Queue: A Responsive Server 266
6.6.2 M/M/1/K: Bounded Buffers 268
6.7 Other Queueing Systems 270
6.7.1 M/D/1: Deterministic Service Times 270
6.7.2 G/G/1 270
6.7.3 Networks of Queues 271
6.9 Exercises 272

Chapter 7 Game Theory 277
7.1 Concepts and Terminology 278
7.1.1 Preferences and Preference Ordering 278
7.1.2 Terminology 281
7.1.3 Strategies 282
7.1.4 Game Representation 283
7.1.5 Response and Best Response 287
7.1.6 Dominant and Dominated Strategy
7.1.7 Bayesian Games 288
7.1.8 Repeated Games 289
7.2 Solving a Game 291
7.2.1 Solution Concept and Equilibrium 291
7.2.2 Dominant-Strategy Equilibria 291
7.2.3 Iterated Removal of Dominated Strategies 293
7.2.4 Maximin Equilibrium 294
7.2.5 Nash Equilibria 296
7.2.6 Correlated Equilibria 299
7.2.7 Other Solution Concepts 301
7.3 Mechanism Design 301
7.3.1 Practical Mechanisms 302
7.3.2 Three Negative Results 302
7.3.3 Examples of Mechanism Design 304
7.3.4 Formalization 307
7.3.5 Desirable Properties of a Mechanism 308
7.3.6 Revelation Principle 309
7.3.7 Vickrey-Clarke-Groves Mechanism 310
7.3.8 Problems with VCG Mechanisms 313
7.4 Limitations of Game Theory 314
7.6 Exercises 316

Chapter 8 Elements of Control Theory 319
8.1 Overview of a Controlled System 320
8.2 Modeling a System 323
8.2.1 Modeling Approach 323
8.2.2 Mathematical Representation 324
8.3 A First-Order System 329
8.4 A Second-Order System 331
8.4.1 Case 1 (Undamped System): a = 0
8.4.2 Case 2 (Underdamped System): 0 < a < 1
8.4.3 Case 3 (Critically Damped System): a = 1
8.4.4 Case 4 (Overdamped System): a > 1
8.5 Basics of Feedback Control 336
8.5.1 System Goal 338
8.5.2 Constraints 339
8.6 PID Control 341
8.6.1 Proportional-Mode Control 341
8.6.2 Integral-Mode Control 343
8.6.3 Derivative-Mode Control 344
8.6.4 Combining Modes 345
8.7.2 Control Delay 347
8.8 Stability 350
8.8.1 BIBO Stability Analysis of a Linear Time-Invariant System 353
8.8.2 Zero-Input Stability Analysis of a SISO Linear Time-Invariant System 356
8.8.3 Placing System Roots 357
8.8.4 Lyapunov Stability 358
8.9 State Space-Based Modeling and Control 360
8.9.1 State Space-Based Analysis 360
8.9.2 Observability and Controllability 361
8.9.3 Controller Design 362
8.10 Digital Control 364
8.11 Partial Fraction Expansion 367
8.11.1 Distinct Roots 367
8.11.2 Complex Conjugate Roots 368
8.11.3 Repeated Roots 369
8.13 Exercises 370

Chapter 9 Information Theory 373
9.1 Introduction 373
9.2 A Mathematical Model for Communication 374
9.3 From Messages to Symbols 378
9.4 Source Coding
9.5 The Capacity of a Communication Channel 386
9.5.1 Modeling a Message Source 387
9.5.2 The Capacity of a Noiseless Channel 389
9.5.3 A Noisy Channel 390
9.6 The Gaussian Channel 399
9.6.1 Modeling a Continuous Message Source 400
9.6.2 A Gaussian Channel 401
9.6.3 The Capacity of a Gaussian Channel 403