Numerical experiments, Tips, Tricks and Gotchas

## Parametrized Monte Carlo method

### 1. Introduction

The Monte Carlo method [1], in its original formulation, converges very slowly, but its convergence rate is almost independent of the dimension of the problem. The latter makes Monte Carlo a valuable tool for modeling complex systems. The drawback is that, due to the stochastic nature of the method, the dependence of the result on parameters is not smooth.

### 2. Integration example

In particular, the Monte Carlo method is applied for the numerical calculation of definite integrals [2]. In the one-dimensional case it reduces to the following formula $$I(p)=\intop_{a}^{b}f(x;\, p)dx\approx I_{MC}(p)=\frac{b-a}{N}\sum_{i=1}^{N}f(x_{i};\, p)\label{eq:MC}$$ Here $x_{i}$ are independent random points uniformly distributed on $[a,b]$ and $p$ is a parameter. As an example let's consider $$I(p)=\intop_{0}^{\pi}\cos(px)\,\sin(x)\, dx=\frac{1+\cos(\pi p)}{1-p^{2}}\label{eq:exact}$$ The Monte Carlo estimate (\ref{eq:MC}) with $N=500$ uniformly distributed random numbers and the exact dependence (\ref{eq:exact}) are presented in Fig. 1.
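The estimate above can be sketched in a few lines of Python with NumPy; the function names `mc_integral` and `exact` are illustrative, not from the original experiment, and the sample size and seed are arbitrary choices:

```python
import numpy as np

def f(x, p):
    # Integrand f(x; p) = cos(p*x) * sin(x)
    return np.cos(p * x) * np.sin(x)

def exact(p):
    # Closed-form value of the integral, valid for p != 1
    return (1.0 + np.cos(np.pi * p)) / (1.0 - p**2)

def mc_integral(p, n=500, rng=None):
    # Monte Carlo estimate: (b - a)/N * sum_i f(x_i; p),
    # with x_i drawn uniformly from [a, b] = [0, pi]
    rng = np.random.default_rng() if rng is None else rng
    a, b = 0.0, np.pi
    x = rng.uniform(a, b, n)
    return (b - a) / n * np.sum(f(x, p))
```

With $N=500$ the statistical error of a single estimate is on the order of $(b-a)\,\sigma_f/\sqrt{N}$, a few percent here, which matches the scatter visible in Fig. 1.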

Fig. 1. Monte Carlo integration.

### 3. Smoothing trick

The errors are relatively small and oscillate around the exact function. The noise can be effectively eliminated using conventional smoothing techniques [3]. However, there is an old trick for making a parametrized Monte Carlo method smooth. In the above experiment, a new series of random numbers was generated for each new value of the parameter $p$. If the same random series is used for each $p$, the resulting dependence is smooth (see Fig. 2).

Fig. 2. Monte Carlo integration with seed reset.

This can be done by resetting the pseudo-random number generator to the same seed before each evaluation. Note that the error is now systematic: it is biased toward negative values for smaller $p$ and toward positive values for larger $p$.
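The trick can be demonstrated by computing the estimate over a grid of $p$ values, once with a single shared random sample and once with a fresh sample per $p$. This is a minimal sketch, assuming NumPy; the function name `mc_curve` and the seed are illustrative:

```python
import numpy as np

def mc_curve(ps, n=500, reuse_sample=True, seed=42):
    # Monte Carlo estimates of I(p) over a grid of p values.
    # reuse_sample=True keeps the same points x_i for every p, so
    # I_MC(p) = (b-a)/N * sum_i cos(p*x_i)*sin(x_i) is a finite sum
    # of smooth functions of p -- hence smooth in p.
    a, b = 0.0, np.pi
    rng = np.random.default_rng(seed)
    if reuse_sample:
        x = rng.uniform(a, b, n)      # one fixed sample, shared by all p
    estimates = []
    for p in ps:
        if not reuse_sample:
            x = rng.uniform(a, b, n)  # fresh sample per p -> noisy curve
        estimates.append((b - a) / n * np.sum(np.cos(p * x) * np.sin(x)))
    return np.array(estimates)
```

Comparing the second differences of the two curves shows that the shared-sample curve varies smoothly with $p$, while the fresh-sample curve jumps by the statistical error between neighboring grid points.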