NumericalPhysics2ndyrJupyter

Here are the Jupyter notebooks for the Numerical Physics part of the University of Surrey module Energy, Entropy and Numerical Physics (PHY2063). I will go through some of these notebooks in lectures; you will work through most of them in the computing lab sessions. There are about ten notebooks in total, so you should aim to do about one per week. See your first-year modules if you need to refresh your Python 3.

The Week 1 lecture is the introduction to the course, and introduces numerically solving ordinary differential equations (ODEs). Three notebooks for weeks 1 to 3: an intro to Python 3 and Jupyter, an intro to ODEs, and one on second-order ODEs.
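
For orientation, here is a minimal sketch (not taken from the notebooks) of the simplest approach the ODE notebooks build on, the Euler method, applied to the illustrative equation dy/dt = -y:

```python
import numpy as np

# Minimal Euler-method sketch for dy/dt = -y, y(0) = 1 (exact solution exp(-t)).
# The right-hand side f and the step size dt are illustrative choices, not
# taken from the notebooks themselves.
def f(t, y):
    return -y

dt = 0.01                                  # time step
t_values = np.arange(0.0, 5.0 + dt, dt)
y = np.empty_like(t_values)
y[0] = 1.0                                 # initial condition

for i in range(len(t_values) - 1):
    y[i + 1] = y[i] + dt * f(t_values[i], y[i])   # Euler update

print(y[-1], np.exp(-t_values[-1]))        # numerical vs exact value at the end
```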

The Week 2 lecture is on data analysis with Python 3. Three notebooks: an intro notebook plus two (you only need to do one of these two) on the bootstrap method of estimating uncertainties.

The Week 4 lecture is on partial differential equations (PDEs). Two notebooks: an intro notebook where we solve Laplace's equation, and the Assignment 1 notebook where you solve the diffusion PDE in one dimension.
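
As a flavour of the intro PDE notebook, here is a minimal Jacobi-relaxation sketch for Laplace's equation on a square grid; the grid size and boundary values are illustrative choices, not the ones used in the notebook:

```python
import numpy as np

# Jacobi relaxation for Laplace's equation on a square grid.
# Grid size, boundary condition and number of sweeps are illustrative only.
N = 50
phi = np.zeros((N, N))
phi[0, :] = 1.0                # fixed potential on the top edge, zero elsewhere

for _ in range(2000):          # fixed number of sweeps; could instead test convergence
    # each interior point becomes the average of its four neighbours
    phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1]
                              + phi[1:-1, :-2] + phi[1:-1, 2:])

print(phi[N // 2, N // 2])     # potential at the centre of the grid
```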

The Week 7 lecture is on data analysis, including Bayes' theorem, and on randomness. Four notebooks: one on random numbers, the Central Limit Theorem and an introduction to Monte Carlo methods (via a simple example); one on testing a hypothesis (for a linear fit); one introducing Bayes' theorem; and the Assignment 2 notebook, where you analyse data on radioactive decays.
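
As a taste of the Monte Carlo material, here is a minimal sketch (not from the notebooks) that estimates pi by random sampling of points in the unit square:

```python
import numpy as np

# Simple Monte Carlo sketch: sample random points in the unit square and count
# the fraction that land inside the quarter circle of radius 1, which tends to
# pi/4. The sample size and seed are arbitrary illustrative choices.
rng = np.random.default_rng(seed=1)
n = 100_000
x = rng.random(n)
y = rng.random(n)
inside = (x**2 + y**2) < 1.0
pi_estimate = 4.0 * inside.mean()
print(pi_estimate)
```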

Additional note on the Week 2 material: the Uncertainty estimation notebook takes you through a standard error analysis in a simple example, that of estimating the true value from a set of noisy measurements. It gives the required standard error formulas, and shows how these formulas rely on the noise being at least roughly Gaussian; if it is not, they can fail.
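
For reference, a minimal sketch of that kind of standard error analysis: the estimate of the true value is the sample mean, with uncertainty s/sqrt(N), where s is the sample standard deviation. The simulated Gaussian data below are illustrative only, not the notebook's data:

```python
import numpy as np

# Standard error analysis for a set of noisy measurements: the best estimate of
# the true value is the sample mean, and its uncertainty is s / sqrt(N).
# The simulated Gaussian measurements below are illustrative only.
rng = np.random.default_rng(seed=2)
data = rng.normal(loc=10.0, scale=0.5, size=100)   # 100 noisy measurements

mean = data.mean()
std_error = data.std(ddof=1) / np.sqrt(len(data))  # ddof=1 gives the sample std
print(f"estimate = {mean:.3f} +/- {std_error:.3f}")
```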

The two bootstrap notebooks introduce the bootstrap method. Bootstrap_UncertaintyEstimation.ipynb compares the standard error and bootstrap methods on the same problem as the Uncertainty estimation notebook. That has the advantage of letting you compare the two directly, but on talking to some of you I realise it does not show why so many people use the bootstrap method: for that example, the standard error method is easy to use and works fine.
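
As a rough illustration of that comparison (not the notebook's code or data), here is a bootstrap estimate of the uncertainty in the mean, set alongside the standard error, for simulated Gaussian measurements:

```python
import numpy as np

# Bootstrap sketch for the same problem as the standard error analysis above:
# estimate the uncertainty in the mean by resampling the data with replacement
# many times and looking at the spread of the resampled means.
# The simulated data and number of resamples are illustrative only.
rng = np.random.default_rng(seed=4)
data = rng.normal(loc=10.0, scale=0.5, size=100)

n_boot = 5000
boot_means = np.empty(n_boot)
for i in range(n_boot):
    resample = rng.choice(data, size=len(data), replace=True)
    boot_means[i] = resample.mean()

# For roughly Gaussian noise the two estimates should be close.
print(f"bootstrap uncertainty = {boot_means.std(ddof=1):.3f}")
print(f"standard error        = {data.std(ddof=1) / np.sqrt(len(data)):.3f}")
```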

To address that problem I have written another bootstrap notebook, which analyses a data set considered by the father of the bootstrap method, Bradley Efron. It shows that the bootstrap is easy to code and lets you calculate the uncertainty in a measure of correlation (Pearson's r), for which we don't have convenient standard error formulas. With that code, if you want to calculate the uncertainty in a property other than Pearson's correlation, you just change one function.
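
As a rough sketch of that idea (not Efron's data or the notebook's code), here is a bootstrap of Pearson's r on synthetic data, with the statistic isolated in a single function that you could swap for another property:

```python
import numpy as np

# Bootstrap sketch: estimate the uncertainty in Pearson's r by resampling the
# (x, y) pairs with replacement and recomputing r for each resample.
# statistic() is the part you would swap to bootstrap a different property;
# the synthetic data and number of resamples are illustrative only.
rng = np.random.default_rng(seed=3)

def statistic(x, y):
    return np.corrcoef(x, y)[0, 1]   # Pearson's r

# Illustrative correlated data set (not Efron's data).
x = rng.normal(size=30)
y = 0.8 * x + rng.normal(scale=0.5, size=30)

n_boot = 5000
r_samples = np.empty(n_boot)
for i in range(n_boot):
    idx = rng.integers(0, len(x), len(x))   # resample indices with replacement
    r_samples[i] = statistic(x[idx], y[idx])

print(f"r = {statistic(x, y):.3f} +/- {r_samples.std(ddof=1):.3f}")
```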

You don't need to do both bootstrap notebooks; either one should be enough to show you how the method works.
