All Surrey students can sign into Azure with their university account. Once logged in, you can run notebooks using the "Run on Free C..." button, but only for notebooks you own (i.e., not the ones I own), so to edit and run in the cloud you need to make copies. As far as I can tell, you can clone the entire BiologicalPhysicsCourse folder (i.e., make a copy and put it in a folder of yours on Azure), but you can't clone a single notebook. Once you have cloned it and pressed the "Run on Free C..." button, press the double-arrow button to run the whole notebook. If you have Python 3 on your machine, you can instead download the notebook and run it there. For an introduction to Jupyter notebooks see here, or simply Google "introduction jupyter python" or similar.
Once in the cloud, the advantage is that you can edit and run the notebook from anywhere on campus or at home, which may be more convenient than copying it from a uni computer to one at home.
It is a core skill in science and engineering (and in other fields such as data science) to be able to fit functions (straight lines, Gaussians, etc.) to noisy data, including being able to reliably estimate the uncertainty in the values of the fit parameters. You can download, edit and run (or run online) these notebooks to fit to data. You can also edit them to fit other functions, or to fit data in other forms. But please NOTE that all ways of estimating uncertainties/errors make assumptions, so please don't fit without bearing that in mind.
The INTRO notebook is a self-contained Python notebook that fits a straight line (slope and intercept) to data, including standard-error uncertainty estimates. Fitting is via least squares. It has links to webpages etc. that explain where the maths formulas needed to fit and to estimate the uncertainties come from.
The INTRO notebook reads in and fits to the data in the text file "lin_fit_input_data.txt". You need to save both the notebook (.ipynb) and the text file to the same directory, then run the notebook.
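The core of such a fit can be sketched in a few lines. This is my own minimal illustration using synthetic data in place of "lin_fit_input_data.txt", and NumPy's `polyfit` rather than necessarily the notebook's own code:

```python
import numpy as np

# Synthetic noisy straight-line data (a stand-in for the input file)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 20)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, x.size)

# Least-squares fit of y = m*x + c; cov=True also returns the
# covariance matrix of the fitted parameters
coef, cov = np.polyfit(x, y, deg=1, cov=True)
m, c = coef

# Standard errors are the square roots of the covariance diagonal
m_err, c_err = np.sqrt(np.diag(cov))
print(f"slope = {m:.3f} +/- {m_err:.3f}")
print(f"intercept = {c:.3f} +/- {c_err:.3f}")
```

The same pattern works for any data you can load into a pair of NumPy arrays.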
The FULL version also shows how to use both the bootstrap and the jackknife uncertainty estimates.
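The jackknife idea can be sketched as follows (my own minimal example on synthetic data, not the notebook's code): leave each data point out in turn, recompute the quantity of interest, and use the spread of these leave-one-out estimates to get the uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(5.0, 2.0, 100)
n = data.size

# Leave-one-out ("jackknife") estimates of the mean
jack_means = np.array([np.delete(data, i).mean() for i in range(n)])

# Jackknife standard-error formula
jack_err = np.sqrt((n - 1) / n * np.sum((jack_means - jack_means.mean()) ** 2))

# For the mean, this reproduces the usual standard error s/sqrt(n)
print(f"jackknife standard error = {jack_err:.4f}")
print(f"standard error formula   = {data.std(ddof=1) / np.sqrt(n):.4f}")
```

For the mean the two agree exactly; the jackknife earns its keep for statistics with no simple standard-error formula.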
You probably want to start with the INTRO version.
These notebooks fit power laws and Gaussian functions to data.
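For nonlinear functions like a Gaussian, SciPy's `curve_fit` is the standard tool. Here is a minimal sketch on synthetic data (the function name and starting values are my own illustration, not necessarily what the notebooks use); a power law works the same way with a different model function:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma):
    """Gaussian with amplitude a, centre mu and width sigma."""
    return a * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

# Synthetic noisy Gaussian data
rng = np.random.default_rng(2)
x = np.linspace(-5.0, 5.0, 50)
y = gaussian(x, 3.0, 0.5, 1.2) + rng.normal(0.0, 0.1, x.size)

# curve_fit returns best-fit parameters and their covariance matrix;
# nonlinear fits generally need a sensible initial guess p0
popt, pcov = curve_fit(gaussian, x, y, p0=[1.0, 0.0, 1.0])
perr = np.sqrt(np.diag(pcov))  # standard errors on a, mu, sigma
print("parameters:", popt)
print("standard errors:", perr)
```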
Here are the Jupyter notebooks for the Numerical Physics part of University of Surrey module Energy, Entropy and Numerical Physics (PHY2063). Some of these notebooks I will go through in lectures, most you will go through in the computing lab sessions. There are about ten notebooks in total, so you should aim to do about one per week. See your first year modules if you need to refresh your Python 3.
Week 1 lecture is the introduction to the course, and introduces numerically solving ordinary differential equations (ODEs). Three notebooks for weeks 1 to 3: an intro to Python 3 and Jupyter, an intro to ODEs, and one on second-order ODEs.
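As a taste of numerically solving ODEs, here is a minimal forward-Euler sketch (my own illustration, not taken from the notebooks) for dy/dt = -y with y(0) = 1, whose exact solution is exp(-t):

```python
import numpy as np

# Forward Euler: step the solution forward with y_new = y_old + dt * f(y_old)
dt = 0.001
t = np.arange(0.0, 5.0 + dt, dt)
y = np.empty_like(t)
y[0] = 1.0
for i in range(1, t.size):
    y[i] = y[i - 1] + dt * (-y[i - 1])  # here f(y) = -y

print(y[-1], np.exp(-5.0))  # numerical vs exact value at t = 5
```

The global error of this simplest method shrinks only linearly with dt, which motivates the better schemes the course covers.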
Week 2 lecture is on data analysis with Python 3. Three notebooks: an intro notebook plus two (you only need to do one of them) on the bootstrap method of estimating uncertainties.
Week 4 lecture is on PDEs. Two notebooks: an intro notebook where we solve Laplace's PDE, and the Assignment 1 notebook where you solve the diffusion PDE in one dimension.
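A common way to solve Laplace's PDE numerically is relaxation: repeatedly replace each interior grid point by the average of its four neighbours while holding the boundary fixed. A minimal sketch (my own illustration with an arbitrary boundary condition, not the notebook's code):

```python
import numpy as np

# Jacobi relaxation for Laplace's equation on a square grid
n = 31
phi = np.zeros((n, n))
phi[0, :] = 1.0  # top edge held at potential 1, other edges at 0

for _ in range(2000):
    # each interior point becomes the average of its four neighbours
    phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1] +
                              phi[1:-1, :-2] + phi[1:-1, 2:])

# By symmetry the converged value at the centre is exactly 1/4
print(phi[n // 2, n // 2])
```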
Week 7 lecture is on data analysis, including Bayes' theorem, and randomness. Four notebooks: one on random numbers, the Central Limit Theorem of statistics and an introduction to Monte Carlo methods (via a simple example); one on testing a hypothesis (for a linear fit); one introducing Bayes' theorem; and the Assignment 2 notebook, where you analyse data on radioactive decays.
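To give a flavour of the Monte Carlo material, here is a classic minimal example (my own illustration, not one of the course notebooks): estimating π from the fraction of random points in the unit square that land inside the quarter circle.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
x = rng.random(n)
y = rng.random(n)
inside = (x ** 2 + y ** 2) < 1.0  # True if inside the quarter circle

# Area of quarter circle / area of square = pi/4
pi_estimate = 4.0 * inside.mean()

# The statistical uncertainty shrinks as 1/sqrt(n)
pi_err = 4.0 * inside.std(ddof=1) / np.sqrt(n)
print(f"pi ~ {pi_estimate:.4f} +/- {pi_err:.4f}")
```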
Additional note on Week 2 material: the Uncertainty estimation notebook takes you through a standard error analysis in a simple example: estimating the true value from a set of noisy measurements. It gives the required standard error formulas, and shows how these formulas rely on the noise being at least roughly Gaussian; if it is not, they can fail.
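That standard error analysis boils down to a couple of lines. A minimal sketch on synthetic data (my own illustration, with a made-up true value and noise level):

```python
import numpy as np

# 25 noisy measurements of a true value of 10, with Gaussian noise
rng = np.random.default_rng(4)
measurements = 10.0 + rng.normal(0.0, 0.5, 25)

# Best estimate = sample mean;
# uncertainty = standard error = sample standard deviation / sqrt(n)
mean = measurements.mean()
std_err = measurements.std(ddof=1) / np.sqrt(measurements.size)
print(f"{mean:.3f} +/- {std_err:.3f}")
```

If the noise is strongly non-Gaussian (e.g. has rare huge outliers), this formula can badly misstate the uncertainty, which is the notebook's point.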
The two bootstrap notebooks introduce the bootstrap method. Bootstrap_UncertaintyEstimation.ipynb compares the standard error and bootstrap methods on the same problem as the Uncertainty estimation notebook. This has the advantage of comparing the two, but on talking to some of you I realise that it does not show why so many people use the bootstrap method, because for that example the standard error method is easy to use and works OK.
To address that, I have written another bootstrap notebook, which analyses a data set considered by the father of the bootstrap method, Bradley Efron. It shows that the bootstrap is pretty easy to code for calculating the uncertainty in a measure of correlation (Pearson's r), for which we don't have convenient standard error formulas. With that code, if you want the uncertainty in some property other than Pearson's measure of correlation, you just change one function.
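The bootstrap-for-Pearson's-r idea can be sketched as follows. This is my own minimal version on synthetic correlated data (not Efron's data set or the notebook's code); note how the statistic lives in one function you could swap out.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic correlated (x, y) pairs, standing in for a real data set
x = rng.normal(0.0, 1.0, 50)
y = 0.8 * x + rng.normal(0.0, 0.6, 50)

def pearson_r(x, y):
    """The property whose uncertainty we want; swap this function
    to bootstrap any other statistic."""
    return np.corrcoef(x, y)[0, 1]

# Bootstrap: resample the (x, y) pairs with replacement many times,
# recompute r each time, and take the spread as the uncertainty
n = x.size
r_samples = []
for _ in range(2000):
    idx = rng.integers(0, n, n)  # n indices drawn with replacement
    r_samples.append(pearson_r(x[idx], y[idx]))

r = pearson_r(x, y)
r_err = np.std(r_samples, ddof=1)
print(f"r = {r:.3f} +/- {r_err:.3f}")
```

No formula for the standard error of r is needed anywhere; the resampling does all the work.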
You don't need to do both bootstrap notebooks; either one should be enough to show how the method works.