We introduce some useful tools for working with image annotation and segmentation.
* **Quantization:** in case you have some smooth color labeling in your images, you can remove it with the following quantization script (a minimal Python sketch of the idea follows this list).
```bash
python handling_annotations/run_image_color_quantization.py \
    -imgs "./data_images/drosophila_ovary_slice/segm_rgb/*.png" \
    -m position -thr 0.01 --nb_jobs 2
```
* **Paint labels:** converting image labels into colour space and the other way around.
```bash
python handling_annotations/run_image_convert_label_color.py \
    -imgs "./data_images/drosophila_ovary_slice/segm/*.png" \
    -out ./data_images/drosophila_ovary_slice/segm_rgb
```
* **Visualisation:** having an input image and its segmentation, we can use a simple visualisation which overlays the segmentation over the input image.
```bash
python handling_annotations/run_overlap_images_segms.py \
    -imgs "./data_images/drosophila_ovary_slice/image/*.jpg" \
    -segs ./data_images/drosophila_ovary_slice/segm \
    -out ./results/overlap_ovary_segment
```
* **Inpainting** selected labels in the segmentation.
```bash
python handling_annotations/run_segm_annot_inpaint.py \
    -imgs "./data_images/drosophila_ovary_slice/segm/*.png" \
    --label 4
```
* **Replace labels:** change labels in the input segmentation into another set of labels in a 1:1 scheme.
```bash
python handling_annotations/run_segm_annot_relabel.py \
    -out ./results/relabel_center_levels \
    --label_old 2 3 --label_new 1 1
```

### Structure segmentation

We utilize (un)supervised segmentation according to given training examples or some expectations.
![visual debug](figures/visual_img_43_debug.jpg)
* Evaluate the quality of superpixels (with given SLIC parameters) against a reference segmentation. It helps to find the best SLIC configuration.
```bash
python experiments_segmentation/run_eval_superpixels.py \
    -imgs "./data_images/drosophila_ovary_slice/image/*.jpg" \
    -segm "./data_images/drosophila_ovary_slice/annot_eggs/*.png" \
    --img_type 2d_split \
    --slic_size 20 --slic_regul 0.25 --slico
```
* Perform **Unsupervised** segmentation on images given in a CSV file (a simplified Python sketch of this superpixel pipeline follows this list).
```bash
python experiments_segmentation/run_segm_slic_model_graphcut.py \
    -l ./data_images/langerhans_islets/list_lang-isl_imgs-annot.csv -i "" \
    -cfg experiments_segmentation/sample_config.json \
    -o ./results -n langIsl --nb_classes 3 --visual --nb_jobs 2
```
OR with images specified by a particular path:
```bash
python experiments_segmentation/run_segm_slic_model_graphcut.py \
    -l "" -i "./data_images/langerhans_islets/image/*.jpg" \
    -cfg ./experiments_segmentation/sample_config.json \
    -o ./results -n langIsl --nb_classes 3 --visual --nb_jobs 2
```
![unsupervised](figures/imag-disk-20_gmm.jpg)
* Perform **Supervised** segmentation with subsequent evaluation.
```bash
python experiments_segmentation/run_segm_slic_classif_graphcut.py \
    -l ./data_images/drosophila_ovary_slice/list_imgs-annot-struct.csv \
    -i "./data_images/drosophila_ovary_slice/image/*.jpg" \
    --path_config ./experiments_segmentation/sample_config.json \
    -o ./results -n Ovary --img_type 2d_split --visual --nb_jobs 2
```
![supervised](figures/imag-disk-20_train.jpg)
* For both experiments you can evaluate the segmentation results.
```bash
python experiments_segmentation/run_compute-stat_annot-segm.py \
    -annot "./data_images/drosophila_ovary_slice/annot_struct/*.png" \
    -segm "./results/experiment_segm-supervise_ovary/*.png" \
    -img "./data_images/drosophila_ovary_slice/image/*.jpg" \
    -out ./results/evaluation
```
![visual](figures/segm-visual_D03_sy04_100x.jpg)
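As referenced in the unsupervised bullet above, here is a deliberately simplified Python sketch of the superpixel pipeline: SLIC superpixels, per-superpixel colour features, and Gaussian-mixture clustering. It omits the graph-cut regularization and is not the implementation behind `run_segm_slic_model_graphcut.py`; the file names and parameter values are assumptions for illustration.
```python
import os

import numpy as np
from skimage import color, io, segmentation
from sklearn.mixture import GaussianMixture

# hypothetical input image, mirroring the dataset layout used above
img = io.imread('./data_images/langerhans_islets/image/sample.jpg')

# 1) SLIC superpixels (n_segments / compactness values are illustrative)
slic = segmentation.slic(img, n_segments=300, compactness=30, start_label=0)

# 2) one feature vector per superpixel: mean colour in HSV
hsv = color.rgb2hsv(img)
feats = np.array([hsv[slic == lb].mean(axis=0) for lb in range(slic.max() + 1)])

# 3) unsupervised model: Gaussian mixture over superpixel features
gmm = GaussianMixture(n_components=3, random_state=0).fit(feats)
spx_labels = gmm.predict(feats)

# 4) project superpixel labels back onto the pixel grid and save
segm = spx_labels[slic]
os.makedirs('./results', exist_ok=True)
io.imsave('./results/langIsl_gmm_sketch.png', (segm * 100).astype(np.uint8))
```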

Both of the previous (un)supervised segmentation scripts accept a JSON [configuration file](experiments_segmentation/sample_config.json) via the `-cfg` parameter, carrying some extra parameters which are not passed as command-line arguments, for instance:
```json
{
  "slic_size": 35,
  "slic_regul": 0.2,
  "features": {"color_hsv": ["mean", "std", "eng"]},
  "classif": "SVM",
  "nb_classif_search": 150,
  "gc_edge_type": "model",
  "gc_regul": 3.0,
  "run_LOO": false,
  "run_LPO": true,
  "cross_val": 0.1
}
```
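A small sketch of how such a JSON file can be read and merged with default parameters in Python; the real scripts may combine the configuration with the command-line arguments differently, and the default values below are purely illustrative.
```python
import json

# defaults that would normally come from the command-line arguments (illustrative)
params = {'slic_size': 30, 'slic_regul': 0.3, 'nb_classes': 3}

# in this sketch, values from the JSON configuration override the defaults
with open('experiments_segmentation/sample_config.json') as fp:
    params.update(json.load(fp))

print(params['slic_size'], params['classif'], params['features'])
```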

### Center detection and ellipse fitting

In general, the input is a formatted list (CSV file) of input images and annotations.
1. With zone annotation, we train a classifier for center candidate prediction. The annotation can be a CSV file with annotated centers as points, and the zone of positive examples is set uniformly as the circular neighborhood around these points. Another way (preferable) is to use an annotated image with marked zones for positive, negative and neutral examples (a minimal sketch of building such zones from center points follows this list).
```bash
python experiments_ovary_centres/run_center_candidate_training.py -list none \
    -segs "./data_images/drosophila_ovary_slice/segm/*.png" \
    -imgs "./data_images/drosophila_ovary_slice/image/*.jpg" \
    -centers "./data_images/drosophila_ovary_slice/center_levels/*.png" \
    -out ./results -n ovary
```
1. With the trained classifier we perform center prediction, composed of two steps: center candidate detection and candidate clustering.
```bash
python experiments_ovary_centres/run_center_prediction.py -list none \
    -segs "./data_images/drosophila_ovary_slice/segm/*.png" \
    -imgs "./data_images/drosophila_ovary_slice/image/*.jpg" \
    -centers ./results/detect-centers-train_ovary/classifier_RandForest.pkl \
    -out ./results -n ovary
```
1. Assuming you have an expert annotation, you can compute statistics such as missed eggs.
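As mentioned in the first step of this list, the positive training zone can be derived from annotated center points as a circular neighborhood. Below is a minimal sketch of that idea; the CSV path, column names, image size and radius are hypothetical and do not reflect the project's actual annotation format.
```python
import os

import numpy as np
import pandas as pd
from skimage import draw, io

# hypothetical CSV of annotated egg centers with columns 'X' and 'Y'
centers = pd.read_csv('./data_images/drosophila_ovary_slice/center_points.csv')

# 0 = negative (background), 1 = positive circular zone around each center
zones = np.zeros((647, 1024), dtype=np.uint8)  # image size is illustrative
for _, pt in centers.iterrows():
    rr, cc = draw.disk((pt['Y'], pt['X']), radius=25, shape=zones.shape)
    zones[rr, cc] = 1

# such a label image can then serve as a center-level annotation
os.makedirs('./results', exist_ok=True)
io.imsave('./results/center_zones_sketch.png', (zones * 255).astype(np.uint8))
```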
```bash
python experiments_ovary_detect/run_RG2Sp_estim_shape-models.py \
    -annot "~/Medical-drosophila/egg_segmentation/mask_2d_slice_complete_ind_egg/*.png" \
    -out ./data_images -nb 15
```
1. Run several segmentation techniques on each image (a toy example of robust ellipse fitting follows this list).
```bash
python experiments_ovary_detect/run_ovary_egg-segmentation.py \
    -list ./data_images/drosophila_ovary_slice/list_imgs-segm-center-points.csv \
    -out ./results -n ovary_image --nb_jobs 1 \
    -m ellipse_moments \
       ellipse_ransac_mmt \
       ellipse_ransac_crit \
       rg2sp_GC-mixture \
       watershed_morph
```
1. Evaluate your segmentation results against the expert annotation.
```bash
python experiments_ovary_detect/run_ovary_segm_evaluation.py --visual
```
1. In the end, cut out the individual segmented objects as minimal bounding boxes.
```bash
python experiments_ovary_detect/run_cut_segmented_objects.py \
    -annot "./data_images/drosophila_ovary_slice/annot_eggs/*.png" \
    -img "./data_images/drosophila_ovary_slice/segm/*.png" \
    -out ./results/cut_images --padding 50
```
1. Finally, visualise the segmentation results together with the expert annotation.
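As referenced in the segmentation-methods step above, several of the listed methods rely on robust ellipse fitting. Below is a small toy example of that idea using scikit-image's `EllipseModel` with RANSAC; it is not the project's implementation, and the synthetic mask merely stands in for a real egg segmentation.
```python
import numpy as np
from skimage import measure

# synthetic binary mask standing in for one segmented egg
mask = np.zeros((200, 300), dtype=float)
rr, cc = np.ogrid[:200, :300]
mask[((rr - 100) / 60.) ** 2 + ((cc - 150) / 90.) ** 2 <= 1.] = 1.

# boundary points of the mask are the candidate ellipse points
contour = max(measure.find_contours(mask, 0.5), key=len)  # (row, col) pairs
points = contour[:, ::-1]                                 # EllipseModel expects (x, y)

# robust fit: RANSAC ignores outlying boundary points (e.g. touching eggs)
model, inliers = measure.ransac(points, measure.EllipseModel,
                                min_samples=5, residual_threshold=2.,
                                max_trials=500)
xc, yc, a, b, theta = model.params
print('center=(%.1f, %.1f)  axes=(%.1f, %.1f)  angle=%.2f rad' % (xc, yc, a, b, theta))
```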