Commit 267c303

merged !77, adapted PV_MUMC_triexp to new format

2 parents 206d920 + 563205f commit 267c303

39 files changed: +5224 −2264 lines

.github/workflows/documentation.yml renamed to .github/workflows/website.yml

Lines changed: 21 additions & 4 deletions

```diff
@@ -1,4 +1,4 @@
-name: Build & Deploy Documentation
+name: Build & Deploy Website
 
 on:
   workflow_run:
@@ -33,25 +33,42 @@ jobs:
         run: |
           pip install -r requirements.txt
 
-      # Action to download artifacts from a different workflow (analysis.yml)
+      # Action Figures artifact
       - name: 'Download artifact'
         if: ${{ github.event.workflow_run.conclusion == 'success' }}
         uses: ./.github/actions/download-artifact
         with:
           name: 'Figures'
+      # Action analysis data artifact
+      - name: 'Download analysis data'
+        if: ${{ github.event.workflow_run.conclusion == 'success' }}
+        uses: ./.github/actions/download-artifact
+        with:
+          name: 'Data'
+
+      - name: 'Filter and compress results file.'
+        run: python utilities/reduce_output_size.py test_output.csv test_output.csv.gz
 
-      - name: Build html
+      - name: move data to the dashboard folder
+        run: |
+          mv test_output.csv.gz website/dashboard
+
+      - name: Build documentation
         run: |
          mkdir docs/_static
          mv *.pdf docs/_static/
          sphinx-apidoc -o docs src
          cd docs/
          make html
 
+      - name: move data to the website folder
+        run: |
+          mv "docs/_build/html" "website/documentation"
+
       - name: Upload docs artifact
         uses: actions/upload-pages-artifact@v3
         with:
-          path: 'docs/_build/html'
+          path: 'website'
 
   deploy:
     needs: build
```

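The "Filter and compress results file." step above shells out to utilities/reduce_output_size.py, a script that is not shown in this diff. As a hedged illustration only, a minimal stdlib-only sketch of such a filter-and-compress step could look like the following (the function name `compress_csv` and the optional column filtering are assumptions, not the actual script):

```python
import csv
import gzip

def compress_csv(src_path, dst_path, keep_columns=None):
    """Copy a CSV into a gzip-compressed CSV, optionally keeping only a
    subset of columns to reduce the output size before publishing it."""
    with open(src_path, newline="") as src, \
            gzip.open(dst_path, "wt", newline="") as dst:
        reader = csv.DictReader(src)
        # keep every column unless an explicit subset was requested
        fields = keep_columns or reader.fieldnames
        writer = csv.DictWriter(dst, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        for row in reader:
            writer.writerow(row)
```

Invoked as e.g. `compress_csv("test_output.csv", "test_output.csv.gz")`, mirroring the two file arguments the workflow passes to the real utility.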
.gitignore

Lines changed: 5 additions & 1 deletion

```diff
@@ -10,6 +10,8 @@ __pycache__/
 *.raw
 bvals.txt
 download
+.Introduction_to_TF24_IVIM-MRI_CodeCollection_github_and_IVIM_Analysis_using_Python.ipynb
+.ipynb_checkpoints
 md5sums.txt
 *.gz
 *.zip
@@ -21,4 +23,6 @@ md5sums.txt
 .cache
 nosetests.xml
 coverage.xml
-*.pyc
+*.pyc
+phantoms/MR_XCAT_qMRI/*.json
+phantoms/MR_XCAT_qMRI/*.txt
```

README.md

Lines changed: 6 additions & 1 deletion

```diff
@@ -6,6 +6,8 @@ The ISMRM Open Science Initiative for Perfusion Imaging (OSIPI) is an initiative
 
 This **IVIM code collection** code library is maintained by OSIPI [Taskforce 2.4](https://www.osipi.org/task-force-2-4/) (*currently not available*) and aims to collect, test and share open-source code related to intravoxel incoherent motion (IVIM) analysis of diffusion-encoded MRI data for use in research and software development. Code contributions can include any code related to IVIM analysis (denoising, motion correction, model fitting, etc.), but in the initial phase, development of tests and other features of the repository will predominantly focus on fitting algorithms. A goal of the IVIM OSIPI task force is to develop a fully tested and harmonized code library, building upon the contributions obtained through this initiative. Documentation and analysis are available on the [OSIPI TF2.4 website](https://osipi.github.io/TF2.4_IVIM-MRI_CodeCollection/).
 
+We have some useful tools and further documentation on https://osipi.github.io/TF2.4_IVIM-MRI_CodeCollection/.
+
 ## How to contribute
 
 If you would like to get involved in OSIPI and work within the task force, please email the contacts listed on our website.
@@ -17,6 +19,9 @@ If you would like to contribute with code, please follow the instructions below:
 * [Guidelines for IVIM code contribution](doc/guidelines_for_contributions.md)
 * [Guidelines to creating a test file](doc/creating_test.md)
 
+If you would like to use code from the repository and/or are new to GitHub or IVIM, please see the Jupyter notebook below:
+* [Introduction to TF2.4_IVIM-MRI_CodeCollection github and IVIM Analysis using Python](doc/Introduction_to_TF24_IVIM-MRI_CodeCollection_github_and_IVIM_Analysis_using_Python.ipynb)
+
 ## Repository Organization
 
 The repository is organized in four main folders along with configuration files for automated testing.
@@ -32,4 +37,4 @@ The **utils** folder contains various helpful tools.
 ## View Testing Reports
 [![Unit tests](https://github.com/OSIPI/TF2.4_IVIM-MRI_CodeCollection/actions/workflows/unit_test.yml/badge.svg?branch=main)](https://github.com/OSIPI/TF2.4_IVIM-MRI_CodeCollection/actions/workflows/unit_test.yml)
 [![Algorithm Analysis](https://github.com/OSIPI/TF2.4_IVIM-MRI_CodeCollection/actions/workflows/analysis.yml/badge.svg?branch=main)](https://github.com/OSIPI/TF2.4_IVIM-MRI_CodeCollection/actions/workflows/analysis.yml)
-[![Build & Deploy Documentation](https://github.com/OSIPI/TF2.4_IVIM-MRI_CodeCollection/actions/workflows/documentation.yml/badge.svg?branch=main)](https://github.com/OSIPI/TF2.4_IVIM-MRI_CodeCollection/actions/workflows/documentation.yml)
+[![Build & Deploy Website](https://github.com/OSIPI/TF2.4_IVIM-MRI_CodeCollection/actions/workflows/website.yml/badge.svg?branch=main)](https://github.com/OSIPI/TF2.4_IVIM-MRI_CodeCollection/actions/workflows/website.yml)
```

doc/Introduction_to_TF24_IVIM-MRI_CodeCollection_github_and_IVIM_Analysis_using_Python.ipynb

Lines changed: 1099 additions & 0 deletions
Large diffs are not rendered by default.

doc/wrapper_usage_example.ipynb (new file)

Lines changed: 165 additions & 0 deletions

# Organisation of code submissions and standardisation to a common interface

## General structure
Code submissions are located in the src/original folder, where submissions are named `<initials>_<institution>`. Because code submissions have different authors, they are expected to vary in their usage, inputs, and outputs. To facilitate testing at a larger scale, a common interface has been created in the form of the `OsipiBase` class (src/wrappers). This class acts as a parent class for standardised versions of the different code submissions. Together, they create a common interface of function calls and function outputs that allows us to perform mass testing, and also makes the code easy to use.

The src/standardized folder contains the standardised version of each code submission. Here, a class is created following a naming convention (`<initials>_<institution>_<algorithm name>`), with `__init__()` and `ivim_fit()` methods that integrate with the `OsipiBase` class. The idea is that every submitted fitting algorithm should be initialised in the same way, and executed in the same way.
## The standardized versions
The standardised version of each submission is a class that contains two methods. These classes inherit the functionality of `OsipiBase`.

### `__init__()`
The `__init__()` method ensures that the algorithm is initiated correctly in accordance with `OsipiBase`. Custom code is to be inserted below the `super()` call. This method should contain all the steps necessary for the subsequent `ivim_fit()` method to require only signals and b-values as input.

Below is an example from src/standardized/IAR_LU_biexp.py:

```python
def __init__(self, bvalues=None, thresholds=None, bounds=None, initial_guess=None, weighting=None, stats=False):
    """
    Everything this algorithm requires should be implemented here.
    Number of segmentation thresholds, bounds, etc.

    Our OsipiBase object could contain functions that compare the inputs with
    the requirements.
    """
    super(IAR_LU_biexp, self).__init__(bvalues, thresholds, bounds, initial_guess)  # On this line, change "IAR_LU_biexp" to the name of the class

    # Your code below

    # Check the inputs

    # Initialize the algorithm
    if self.bvalues is not None:
        bvec = np.zeros((self.bvalues.size, 3))
        bvec[:, 2] = 1
        gtab = gradient_table(self.bvalues, bvec, b0_threshold=0)

        self.IAR_algorithm = IvimModelBiExp(gtab)
    else:
        self.IAR_algorithm = None
```
### `ivim_fit()`
The purpose of this method is to take a single-voxel signal and b-values as input, and return IVIM parameters as output. This is where most of the custom code related to each individual code submission goes. The idea is to call the submitted functions in the src/original folder, which ensures that the original code is not tampered with. If required, however, the original code could simply be pasted in here as well.

Below is an example from src/standardized/IAR_LU_biexp.py:

```python
def ivim_fit(self, signals, bvalues, **kwargs):
    """Perform the IVIM fit

    Args:
        signals (array-like)
        bvalues (array-like, optional): b-values for the signals. If None,
            self.bvalues will be used. Default is None.

    Returns:
        dict: fitted IVIM parameters "f", "D*" and "D"
    """
    if self.IAR_algorithm is None:
        if bvalues is None:
            bvalues = self.bvalues
        else:
            bvalues = np.asarray(bvalues)

        bvec = np.zeros((bvalues.size, 3))
        bvec[:, 2] = 1
        gtab = gradient_table(bvalues, bvec, b0_threshold=0)

        self.IAR_algorithm = IvimModelBiExp(gtab)

    fit_results = self.IAR_algorithm.fit(signals)

    results = {}
    results["f"] = fit_results.model_params[1]
    results["D*"] = fit_results.model_params[2]
    results["D"] = fit_results.model_params[3]

    return results
```
## The `OsipiBase` class
Using `OsipiBase` mainly consists of running the `osipi_fit()` method. In this method, the inputs from `__init__()` of the standardised version of a code submission, together with the signals and b-values passed to `osipi_fit()`, are processed and fed into the `ivim_fit()` function.

It is the `osipi_fit()` method that provides the common interface for model fitting. Note that `ivim_fit()` takes a single voxel as input, whereas `OsipiBase.osipi_fit()` supports multidimensional inputs, which are iteratively fed into `ivim_fit()` to build a corresponding output. Support for future types of input will be implemented here. This keeps the `ivim_fit()` method as simple as possible, which simplifies the inclusion of new code submissions into the standard interface.
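As an illustration of this dispatch pattern, here is a minimal self-contained sketch, not the actual `OsipiBase` implementation: the class names `ToyBase`/`ToyAlgorithm` are invented, and the real class carries much more logic. It only shows the idea that a multidimensional fit method flattens the data and feeds a single-voxel fit method one voxel at a time:

```python
import numpy as np

class ToyBase:
    """Stand-in for the OsipiBase dispatch idea: osipi_fit() accepts
    multidimensional data and calls ivim_fit() once per voxel."""

    def osipi_fit(self, data, bvalues):
        data = np.asarray(data, dtype=float)
        nb = data.shape[-1]               # last axis holds the b-value samples
        voxels = data.reshape(-1, nb)     # flatten all spatial dimensions
        keys = ("f", "D*", "D")
        out = {k: np.zeros(voxels.shape[0]) for k in keys}
        for i, signal in enumerate(voxels):
            res = self.ivim_fit(signal, bvalues)  # single-voxel fit
            for k in keys:
                out[k][i] = res[k]
        # restore the original spatial shape, one parameter map per key
        return {k: v.reshape(data.shape[:-1]) for k, v in out.items()}

class ToyAlgorithm(ToyBase):
    def ivim_fit(self, signal, bvalues):
        # dummy "fit": a real subclass would call the submitted code here
        return {"f": 0.1, "D*": 0.02, "D": 0.001}

fits = ToyAlgorithm().osipi_fit(np.ones((4, 4, 8)), bvalues=np.arange(8))
# fits["D"] now has shape (4, 4): one fitted value per voxel
```

The design choice this mirrors is that subclasses only ever implement the single-voxel case, while all shape handling lives in the parent class.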
## Example usage of a standardized version of an algorithm
### Using the standardized version directly
The standardised versions can be used directly by
1. Importing the class
2. Initialising the object with the required parameters, e.g. `IAR_LU_biexp(bounds=[(0, 1), (0.005, 0.1), (0, 0.004)])`
3. Calling `osipi_fit(signals, bvalues)` for model fitting

### Using the `OsipiBase` class with algorithm names
Standardised versions can also be initiated using the `OsipiBase.osipi_initiate_algorithm()` method:

1. Import `OsipiBase`
2. Initiate `OsipiBase` with the algorithm keyword set to the standardised name of the desired algorithm, e.g. `OsipiBase(algorithm=IAR_LU_biexp)`
3. Call `osipi_fit()` for model fitting
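To make the returned parameters concrete, here is a self-contained toy fit on a synthetic biexponential IVIM signal. This is a simple segmented fit written for illustration only (numpy only, no D* estimate); it is not one of the submitted algorithms and `segmented_ivim_fit` is an invented name:

```python
import numpy as np

def segmented_ivim_fit(signal, bvalues, b_threshold=200):
    """Toy segmented IVIM fit:
    1. log-linear fit over the high b-values gives D, and the
       extrapolated intercept corresponds to (1 - f) * S0
    2. f then follows from the measured signal at b = 0."""
    b = np.asarray(bvalues, dtype=float)
    s = np.asarray(signal, dtype=float)
    high = b >= b_threshold                      # perfusion-free regime
    slope, intercept = np.polyfit(b[high], np.log(s[high]), 1)
    D = -slope
    S0 = s[b == 0][0]
    f = 1 - np.exp(intercept) / S0
    return {"f": f, "D": D}

# synthetic noiseless biexponential signal:
# S(b) = (1 - f) * exp(-b * D) + f * exp(-b * D*)
b = np.array([0, 10, 20, 50, 100, 200, 400, 800], dtype=float)
f_true, Dstar_true, D_true = 0.1, 0.02, 0.001
signal = (1 - f_true) * np.exp(-b * D_true) + f_true * np.exp(-b * Dstar_true)
fit = segmented_ivim_fit(signal, b)
# fit["f"] ≈ 0.1 and fit["D"] ≈ 0.001 (in mm²/s units if b is in s/mm²)
```

A standardized class in this repository would return the same kind of parameter dictionary from `osipi_fit()`, with D* included.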
One further file was renamed without changes.
