
Commit

edited docs
Potatoasad committed Oct 28, 2024
1 parent 0a09ecc commit ce0f63d
Showing 3 changed files with 38 additions and 35 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -1,6 +1,6 @@
# gravpop

This library allows one to perform a gravitational wave population analysis, inspired by methods from [Thrane et al.](https://arxiv.org/abs/1809.02293), with an extension based on a technique from [Hussain et al.](...) that allows exploration of population features even in narrow regions near the edges of a bounded domain. It is similar to [`gwpopulation`](https://github.com/ColmTalbot/gwpopulation) (with model implementations as close as possible) but with explicitly a numpyro backend and with the ability to implement the TGMM population analysis method.
This library allows one to perform a gravitational wave population analysis ([Hussain et al.](...), [Thrane et al.](https://arxiv.org/abs/1809.02293)) that can explore population features even in narrow regions near the edges of a bounded domain.

> *Feel free to jump to the tutorial [here](https://potatoasad.github.io/gravpop/Examples/gravpop_tutorial.html)*
69 changes: 36 additions & 33 deletions docs/Examples/gravpop_tutorial.ipynb
@@ -2,7 +2,7 @@
"cells": [
{
"cell_type": "markdown",
"id": "77c6e57d",
"id": "b76f782c",
"metadata": {},
"source": [
"# Gravpop tutorial\n",
@@ -71,7 +71,7 @@
{
"cell_type": "code",
"execution_count": 3,
"id": "adde8068",
"id": "b1a23da0",
"metadata": {},
"outputs": [
{
@@ -275,7 +275,7 @@
},
{
"cell_type": "markdown",
"id": "9348e6b9",
"id": "c3b69488",
"metadata": {},
"source": [
"We can then fit the events we want to TGMMs. Note that the `.dataproduct()` method of the TGMM class provides the fitted data in the format that is required by `gravpop`."
@@ -284,7 +284,7 @@
{
"cell_type": "code",
"execution_count": 4,
"id": "cd7ae845",
"id": "8926b86d",
"metadata": {},
"outputs": [
{
@@ -337,7 +337,7 @@
},
{
"cell_type": "markdown",
"id": "6e1e8ef5",
"id": "15496b80",
"metadata": {},
"source": [
"We can now construct the data product we need:\n",
@@ -347,7 +347,7 @@
{
"cell_type": "code",
"execution_count": 5,
"id": "050c9825",
"id": "af61ce61",
"metadata": {},
"outputs": [
{
@@ -388,7 +388,7 @@
},
{
"cell_type": "markdown",
"id": "41b5f576",
"id": "71db1530",
"metadata": {},
"source": [
"we now have our data in the correct format. \n",
@@ -431,15 +431,15 @@
},
{
"cell_type": "markdown",
"id": "f2e75bea",
"id": "5c02d0b1",
"metadata": {},
"source": [
"# Models"
]
},
{
"cell_type": "markdown",
"id": "08441f24",
"id": "fcc567e4",
"metadata": {},
"source": [
"One can specify population models using a set of building block models. Each population model is defined as a distributions over some parameters $\\theta$, defined below by `var_names`, and some hyper-parameters $\\Lambda$, defined below by `hyper_var_names`. \n",
@@ -475,6 +475,7 @@
"\n",
"## Analytic Models\n",
"For analytic models, the building blocks are essentially\n",
"\n",
"- 1D truncated normals $N_{[0,1]}(x | \\mu, \\sigma)$\n",
"- 2D truncated normals $N_{[0,1]}(x, y | \\mu_x, \\sigma_x, \\mu_y, \\sigma_y, \\rho)$\n",
"- Uniform distributions $U(x | a, b)$\n",
@@ -485,7 +486,7 @@
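The 1D truncated normal listed above can be sketched outside the library. The following is a minimal stand-alone implementation of $N_{[0,1]}(x \mid \mu, \sigma)$ for illustration only, not gravpop's own class:

```python
# Minimal sketch of the 1D truncated-normal building block N_[0,1](x | mu, sigma).
# Illustrative only; gravpop's actual implementation is jax-based.
import math

def _std_normal_cdf(z):
    # CDF of the standard normal, via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def truncnorm_pdf(x, mu, sigma, low=0.0, high=1.0):
    """Density of a normal(mu, sigma) truncated to [low, high]."""
    if x < low or x > high:
        return 0.0
    z = (x - mu) / sigma
    # Renormalize by the mass the untruncated normal places in [low, high]
    mass = _std_normal_cdf((high - mu) / sigma) - _std_normal_cdf((low - mu) / sigma)
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi) * mass)
```

Because of the renormalization, the density integrates to one over $[0,1]$ even when $\mu$ sits at or beyond a domain edge, which is what makes these blocks usable for narrow features near boundaries.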
{
"cell_type": "code",
"execution_count": 1,
"id": "f1b45abb",
"id": "fa13cbe8",
"metadata": {},
"outputs": [],
"source": [
@@ -511,7 +512,7 @@
},
{
"cell_type": "markdown",
"id": "6aeb8d05",
"id": "c10fb445",
"metadata": {},
"source": [
"We can combine these building blocks however we like. Using the following operations:\n",
@@ -531,7 +532,7 @@
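As an illustration of combining building blocks (the function names here are my own, not gravpop's API), a two-component mixture of $[0,1]$-truncated normals for a spin magnitude could look like:

```python
# Illustrative sketch of combining building blocks: a two-component
# mixture of [0,1]-truncated normals. Names are assumptions, not gravpop API.
import math

def _cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def truncnorm01(x, mu, sigma):
    # N_[0,1](x | mu, sigma)
    if x < 0.0 or x > 1.0:
        return 0.0
    mass = _cdf((1.0 - mu) / sigma) - _cdf((0.0 - mu) / sigma)
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi) * mass)

def spin_mixture(x, eta, mu1, sigma1, mu2, sigma2):
    """eta * N_[0,1](x | mu1, sigma1) + (1 - eta) * N_[0,1](x | mu2, sigma2)"""
    return eta * truncnorm01(x, mu1, sigma1) + (1.0 - eta) * truncnorm01(x, mu2, sigma2)
```

A convex mixture of normalized densities is itself normalized, so the combined model is again a valid population density on $[0,1]$.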
{
"cell_type": "code",
"execution_count": 2,
"id": "2756340e",
"id": "3846c608",
"metadata": {},
"outputs": [],
"source": [
@@ -561,7 +562,7 @@
},
{
"cell_type": "markdown",
"id": "712c946e",
"id": "8706088c",
"metadata": {},
"source": [
"One can then evaluate this spin model on some set parameters"
@@ -570,7 +571,7 @@
{
"cell_type": "code",
"execution_count": 9,
"id": "4d61557c",
"id": "e880c5f9",
"metadata": {},
"outputs": [
{
@@ -599,7 +600,7 @@
},
{
"cell_type": "markdown",
"id": "f4709313",
"id": "941211a5",
"metadata": {},
"source": [
"## Sampled Models\n",
@@ -615,16 +616,18 @@
"Custom models can be made. We use duck-typing, so any class you create that has the following methods/properties should just work:\n",
"\n",
"__Required__:\n",
"\n",
"- `.__call__(data={var1 : E x K x N array, ...}, params) -> E x K x N array,` or an `E x K` array in the case of an analytic model\n",
"- `.limits` gives a dictionary of limits for each variable (e.g. `{'chi_1' : [0,1], chi_2' : [0,1]}`) \n",
"\n",
"__Recommended__:\n",
"\n",
"- `.sample(df_hyper_parameters, oversample=1)` will sample variables from the population model, given a dataframe of hyperparameters `df_hyper_parameters`. `oversample` will simply perform this operation `oversample` number of times and concatenate the result."
]
},
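Under those requirements, a minimal duck-typed custom model could look like the sketch below. The `(E, K, N)` shapes and the `params` dictionary follow the description above; the class itself is an assumption for illustration, not a gravpop class:

```python
# Sketch of a duck-typed custom model satisfying the interface described
# above: __call__ over a dict of E x K x N arrays, .limits, and .sample.
import numpy as np

class UniformSpinModel:
    """A flat (uniform) population over a single spin magnitude chi_1 on [0, 1]."""

    var_names = ["chi_1"]
    hyper_var_names = []  # this toy model has no hyper-parameters

    @property
    def limits(self):
        return {"chi_1": [0.0, 1.0]}

    def __call__(self, data, params):
        x = np.asarray(data["chi_1"])          # E x K x N array, as in the text
        inside = (x >= 0.0) & (x <= 1.0)
        return np.where(inside, 1.0, 0.0)      # U(x | 0, 1) density, same shape

    def sample(self, df_hyper_parameters, oversample=1):
        # One draw per hyper-parameter row, repeated `oversample` times
        n = len(df_hyper_parameters) * oversample
        return np.random.default_rng(0).uniform(0.0, 1.0, size=n)
```

Because only the method names and shapes matter, any class with this surface should plug into the likelihood machinery without inheriting from a library base class.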
{
"cell_type": "markdown",
"id": "2e617575",
"id": "b473e5ad",
"metadata": {},
"source": [
"# Population Likelihood\n",
@@ -645,7 +648,7 @@
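The likelihood assembled here has the standard hierarchical form from Thrane et al.: per-event Monte Carlo averages of population weights, penalized by a selection-function term. A stand-alone numpy sketch of that estimator (not gravpop's internals, and with overall constants dropped):

```python
# Sketch of the standard hierarchical population log-likelihood estimator:
# sum over events of log <w_ek>_k minus N_events * log(selection fraction).
# Illustrative only; constants independent of the hyper-parameters are dropped.
import numpy as np

def log_population_likelihood(event_weights, injection_weights):
    """
    event_weights     : (E, K) array, p_pop(theta_ek | Lambda) / p_prior(theta_ek)
    injection_weights : (M,)   array, p_pop(theta_m | Lambda) / p_draw(theta_m)
    """
    per_event = np.log(np.mean(event_weights, axis=1))   # log of each event's MC average
    n_events = event_weights.shape[0]
    log_selection = np.log(np.mean(injection_weights))   # selection-function MC estimate
    return np.sum(per_event) - n_events * log_selection
```

Raising all event weights (a population that matches the data better) raises the log-likelihood, while a population that is easier to detect pays a larger selection penalty.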
{
"cell_type": "code",
"execution_count": 15,
"id": "a1a149c4",
"id": "425accdc",
"metadata": {},
"outputs": [
{
@@ -672,7 +675,7 @@
},
{
"cell_type": "markdown",
"id": "4503e852",
"id": "b51cdc38",
"metadata": {},
"source": [
"We can compute the loglikelihood for some hyper-parameters, and also confirm by computing the derivative that there are no nan derivatives.\n",
@@ -683,7 +686,7 @@
{
"cell_type": "code",
"execution_count": 16,
"id": "7bc3d8d6",
"id": "b00b1586",
"metadata": {},
"outputs": [
{
@@ -707,7 +710,7 @@
},
{
"cell_type": "markdown",
"id": "8a4f6fd6",
"id": "1d61327c",
"metadata": {},
"source": [
"All our models are auto-diff-able, so we can compute the gradient of the logpdf as below:"
@@ -716,7 +719,7 @@
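The NaN-free-gradient check can be illustrated in miniature: compare an analytic derivative of a Gaussian log-density with a finite difference and confirm both are finite. This is a hand-rolled stand-in; the tutorial itself obtains gradients by autodiff:

```python
# Miniature stand-in for the gradient sanity check: derivative of a
# Gaussian logpdf with respect to its mean, verified against a finite
# difference. The tutorial computes such gradients by autodiff instead.
import math

def logpdf(x, mu, sigma):
    z = (x - mu) / sigma
    return -0.5 * z * z - math.log(sigma * math.sqrt(2.0 * math.pi))

def dlogpdf_dmu(x, mu, sigma):
    # d/dmu of the logpdf above
    return (x - mu) / sigma ** 2

x, mu, sigma = 0.3, 0.5, 0.2
grad = dlogpdf_dmu(x, mu, sigma)      # analytic: -5.0
h = 1e-6
fd = (logpdf(x, mu + h, sigma) - logpdf(x, mu - h, sigma)) / (2 * h)
```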
{
"cell_type": "code",
"execution_count": 18,
"id": "e9d231e8",
"id": "ba65cdd4",
"metadata": {},
"outputs": [
{
@@ -742,7 +745,7 @@
},
{
"cell_type": "markdown",
"id": "5bcf4565",
"id": "7e11c500",
"metadata": {},
"source": [
"One can also load up the event and selection function data from a file:"
@@ -751,7 +754,7 @@
{
"cell_type": "code",
"execution_count": 19,
"id": "d89b38ff",
"id": "96c86fd5",
"metadata": {},
"outputs": [],
"source": [
@@ -767,7 +770,7 @@
},
{
"cell_type": "markdown",
"id": "212b8029",
"id": "a1be31ef",
"metadata": {},
"source": [
"# Sampling\n",
@@ -780,7 +783,7 @@
{
"cell_type": "code",
"execution_count": 22,
"id": "6953774a",
"id": "83ce1f8d",
"metadata": {},
"outputs": [],
"source": [
@@ -798,7 +801,7 @@
},
{
"cell_type": "markdown",
"id": "2d6fa0ff",
"id": "3bd0c662",
"metadata": {},
"source": [
"Then, we can construct a `Sampler` object and put in our settings."
@@ -807,7 +810,7 @@
{
"cell_type": "code",
"execution_count": 24,
"id": "ca66b22f",
"id": "23f28bb0",
"metadata": {},
"outputs": [],
"source": [
@@ -821,7 +824,7 @@
},
{
"cell_type": "markdown",
"id": "11136505",
"id": "5e388f7b",
"metadata": {},
"source": [
"and we can begin sampling"
@@ -830,7 +833,7 @@
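As a conceptual stand-in for what the sampler does (gravpop's `Sampler` wraps a proper MCMC backend; the sketch below is a deliberately tiny random-walk Metropolis chain, not the library's method), here is hyper-posterior sampling for a single toy hyper-parameter `mu`:

```python
# Tiny random-walk Metropolis sketch of hyper-posterior sampling over one
# hyper-parameter mu, targeting the likelihood of a toy 1D "population".
# A stand-in for intuition only, not gravpop's sampler.
import math, random

random.seed(0)
data = [0.4, 0.5, 0.6]               # toy observations

def log_like(mu):
    # Gaussian population with fixed width 0.1 (flat prior on mu implied)
    return sum(-0.5 * ((x - mu) / 0.1) ** 2 for x in data)

samples, mu = [], 0.0                # deliberately bad starting point
for _ in range(2000):
    prop = mu + random.gauss(0.0, 0.1)            # random-walk proposal
    if math.log(random.random()) < log_like(prop) - log_like(mu):
        mu = prop                                  # accept
    samples.append(mu)
```

After burn-in the chain concentrates near the sample mean of the toy data (0.5), which is the analogue of the hyper-posterior dataframe the tutorial inspects next.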
{
"cell_type": "code",
"execution_count": 25,
"id": "8cfff7c4",
"id": "b7de385f",
"metadata": {},
"outputs": [
{
@@ -862,7 +865,7 @@
},
{
"cell_type": "markdown",
"id": "e029089d",
"id": "3c5a1549",
"metadata": {},
"source": [
"we can see the dataframe holding the hyper-posterior samples in:"
@@ -871,7 +874,7 @@
{
"cell_type": "code",
"execution_count": 28,
"id": "89f45a0d",
"id": "b26d5de3",
"metadata": {},
"outputs": [
{
@@ -1024,7 +1027,7 @@
},
{
"cell_type": "markdown",
"id": "afb359ca",
"id": "1f40581d",
"metadata": {},
"source": [
"and here is a corner plot of our result"
@@ -1033,7 +1036,7 @@
{
"cell_type": "code",
"execution_count": 31,
"id": "986611ca",
"id": "f11e5421",
"metadata": {},
"outputs": [
{
2 changes: 1 addition & 1 deletion docs/intro.md
@@ -1,4 +1,4 @@
This library allows one to perform a gravitational wave population analysis, (Hussain et al.](...), [Thrane et al.](https://arxiv.org/abs/1809.02293)) that allows exploration of population features even in narrow regions near the edges of a bounded domain.
This library allows one to perform a gravitational wave population analysis ([Hussain et al.](...), [Thrane et al.](https://arxiv.org/abs/1809.02293)) that can explore population features even in narrow regions near the edges of a bounded domain.

It is similar to [`gwpopulation`](https://github.com/ColmTalbot/gwpopulation) (with model implementations kept as close as possible), but explicitly has a numpyro backend and can implement the TGMM population analysis method to probe narrow features.

