Contents
Preface v
Notation xii
I What This Book Is About 1
1 An Illustrative Example: The Single-Server Queue . . . 1
2 The Monte Carlo Method . . . . . . . . . . . . . . . . 5
3 Second Example: Option Pricing . . . . . . . . . . . . . 6
4 Issues Arising in the Monte Carlo Context . . . . . . . 9
5 Further Examples . . . . . . . . . . . . . . . . . . . . . 13
6 Introductory Exercises . . . . . . . . . . . . . . . . . . 25
Part A: General Methods and Algorithms 29
II Generating Random Objects 30
1 Uniform Random Variables . . . . . . . . . . . . . . . . 30
2 Nonuniform Random Variables . . . . . . . . . . . . . . 36
3 Multivariate Random Variables . . . . . . . . . . . . . 49
4 Simple Stochastic Processes . . . . . . . . . . . . . . . 59
5 Further Selected Random Objects . . . . . . . . . . . . 62
6 Discrete-Event Systems and GSMPs . . . . . . . . . . 65
III Output Analysis 68
1 Normal Confidence Intervals . . . . . . . . . . . . . . . 68
2 Two-Stage and Sequential Procedures . . . . . . . . . . 71
3 Computing Smooth Functions of Expectations . . . . . 73
4 Computing Roots of Equations Defined by Expectations 77
5 Sectioning, Jackknifing, and Bootstrapping . . . . . . . 80
6 Variance/Bias Trade-Off Issues . . . . . . . . . . . . . 86
7 Multivariate Output Analysis . . . . . . . . . . . . . . 88
8 Small-Sample Theory . . . . . . . . . . . . . . . . . . . 90
9 Simulations Driven by Empirical Distributions . . . . . 91
10 The Simulation Budget . . . . . . . . . . . . . . . . . . 93
IV Steady-State Simulation 96
1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . 96
2 Formulas for the Bias and Variance . . . . . . . . . . . 102
3 Variance Estimation for Stationary Processes . . . . . 104
4 The Regenerative Method . . . . . . . . . . . . . . . . 105
5 The Method of Batch Means . . . . . . . . . . . . . . . 109
6 Further Refinements . . . . . . . . . . . . . . . . . . . 110
7 Duality Representations . . . . . . . . . . . . . . . . . 118
8 Perfect Sampling . . . . . . . . . . . . . . . . . . . . . 120
V Variance-Reduction Methods 126
1 Importance Sampling . . . . . . . . . . . . . . . . . . . 127
2 Control Variates . . . . . . . . . . . . . . . . . . . . . . 138
3 Antithetic Sampling . . . . . . . . . . . . . . . . . . . . 144
4 Conditional Monte Carlo . . . . . . . . . . . . . . . . . 145
5 Splitting . . . . . . . . . . . . . . . . . . . . . . . . . . 147
6 Common Random Numbers . . . . . . . . . . . . . . . 149
7 Stratification . . . . . . . . . . . . . . . . . . . . . . . . 150
8 Indirect Estimation . . . . . . . . . . . . . . . . . . . . 155
VI Rare-Event Simulation 158
1 Efficiency Issues . . . . . . . . . . . . . . . . . . . . . . 158
2 Examples of Efficient Algorithms: Light Tails . . . . . 163
3 Examples of Efficient Algorithms: Heavy Tails . . . . . 173
4 Tail Estimation . . . . . . . . . . . . . . . . . . . . . . 178
5 Conditioned Limit Theorems . . . . . . . . . . . . . . . 183
6 Large-Deviations or Optimal-Path Approach . . . . . . 187
7 Markov Chains and the h-Transform . . . . . . . . . . 190
8 Adaptive Importance Sampling via the Cross-Entropy Method . . . 195
9 Multilevel Splitting . . . . . . . . . . . . . . . . . . . . 201
VII Derivative Estimation 206
1 Finite Differences . . . . . . . . . . . . . . . . . . . . . 209
2 Infinitesimal Perturbation Analysis . . . . . . . . . . . 214
3 The Likelihood Ratio Method: Basic Theory . . . . . . 220
4 The Likelihood Ratio Method: Stochastic Processes . . 224
5 Examples and Special Methods . . . . . . . . . . . . . 231
VIII Stochastic Optimization 242
1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . 242
2 Stochastic Approximation Algorithms . . . . . . . . . . 243
3 Convergence Analysis . . . . . . . . . . . . . . . . . . . 245
4 Polyak–Ruppert Averaging . . . . . . . . . . . . . . . . 250
5 Examples . . . . . . . . . . . . . . . . . . . . . . . . . . 253
Part B: Algorithms for Special Models 259
IX Numerical Integration 260
1 Numerical Integration in One Dimension . . . . . . . . 260
2 Numerical Integration in Higher Dimensions . . . . . . 263
3 Quasi-Monte Carlo Integration . . . . . . . . . . . . . . 265
X Stochastic Differential Equations 274
1 Generalities about Stochastic Process Simulation . . . 274
2 Brownian Motion . . . . . . . . . . . . . . . . . . . . . 276
3 The Euler Scheme for SDEs . . . . . . . . . . . . . . . 280
4 The Milstein and Other Higher-Order Schemes . . . . . 287
5 Convergence Orders for SDEs: Proofs . . . . . . . . . . 292
6 Approximate Error Distributions for SDEs . . . . . . . 298
7 Multidimensional SDEs . . . . . . . . . . . . . . . . . . 300
8 Reflected Diffusions . . . . . . . . . . . . . . . . . . . . 301
XI Gaussian Processes 306
1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . 306
2 Cholesky Factorization. Prediction . . . . . . . . . . . 311
3 Circulant Embeddings . . . . . . . . . . . . . . . . . . 314
4 Spectral Simulation. FFT . . . . . . . . . . . . . . . . 316
5 Further Algorithms . . . . . . . . . . . . . . . . . . . . 320
6 Fractional Brownian Motion . . . . . . . . . . . . . . . 321
XII Lévy Processes 325
1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . 325
2 First Remarks on Simulation . . . . . . . . . . . . . . . 331
3 Dealing with the Small Jumps . . . . . . . . . . . . . . 334
4 Series Representations . . . . . . . . . . . . . . . . . . 338
5 Subordination . . . . . . . . . . . . . . . . . . . . . . . 343
6 Variance Reduction . . . . . . . . . . . . . . . . . . . . 344
7 The Multidimensional Case . . . . . . . . . . . . . . . 346
8 Lévy-Driven SDEs . . . . . . . . . . . . . . . . . . . . . 348
XIII Markov Chain Monte Carlo Methods 350
1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . 350
2 Application Areas . . . . . . . . . . . . . . . . . . . . . 352
3 The Metropolis–Hastings Algorithm . . . . . . . . . . . 361
4 Special Samplers . . . . . . . . . . . . . . . . . . . . . 367
5 The Gibbs Sampler . . . . . . . . . . . . . . . . . . . . 375
XIV Selected Topics and Extended Examples 381
1 Randomized Algorithms for Deterministic Optimization 381
2 Resampling and Particle Filtering . . . . . . . . . . . . 385
3 Counting and Measuring . . . . . . . . . . . . . . . . . 391
4 MCMC for the Ising Model and Square Ice . . . . . . . 395
5 Exponential Change of Measure in Markov-Modulated Models . . . 403
6 Further Examples of Change of Measure . . . . . . . . 407
7 Black-Box Algorithms . . . . . . . . . . . . . . . . . . . 416
8 Perfect Sampling of Regenerative Processes . . . . . . . 420
9 Parallel Simulation . . . . . . . . . . . . . . . . . . . . 424
10 Branching Processes . . . . . . . . . . . . . . . . . . . 426
11 Importance Sampling for Portfolio VaR . . . . . . . . . 432
12 Importance Sampling for Dependability Models . . . . 435
13 Special Algorithms for the GI/G/1 Queue . . . . . . . 437
Appendix 442
A1 Standard Distributions . . . . . . . . . . . . . . . . . . 442
A2 Some Central Limit Theory . . . . . . . . . . . . . . . 444
A3 FFT . . . . . . . . . . . . . . . . . . . . . . . . . . . . 444
A4 The EM Algorithm . . . . . . . . . . . . . . . . . . . . 445
A5 Filtering . . . . . . . . . . . . . . . . . . . . . . . . . . 447
A6 Itô’s Formula . . . . . . . . . . . . . . . . . . . . . . . 448
A7 Inequalities . . . . . . . . . . . . . . . . . . . . . . . . . 450
A8 Integral Formulas . . . . . . . . . . . . . . . . . . . . . 450
Bibliography 452
Web Links 469
Index 471