
Notebooks as experiments#

The .ipynb format can store tables and charts in a single standalone file, which makes it a great choice for model evaluation reports. NotebookCollection lets you retrieve the results of previously executed notebooks so you can compare them.

import urllib.request

from ploomber_engine import execute_notebook
import jupytext

from sklearn_evaluation import NotebookCollection

Let’s first generate a few notebooks. We have a train.py script that trains a single model; let’s convert it to a Jupyter notebook:

# download script
url = "https://raw.githubusercontent.com/ploomber/sklearn-evaluation/master/doc-assets/nb-collection/train.py"  # noqa
urllib.request.urlretrieve(url, filename="train.py")

# convert
nb = jupytext.read("train.py")
jupytext.write(nb, "train.ipynb")

We use ploomber-engine to execute the notebook with different parameters. We’ll train four models: two random forests, a linear regression, and a support vector regression:

# models with their corresponding parameters
params = [
    {"model": "sklearn.ensemble.RandomForestRegressor", "params": {"n_estimators": 50}},
    {
        "model": "sklearn.ensemble.RandomForestRegressor",
        "params": {"n_estimators": 100},
    },
    {"model": "sklearn.linear_model.LinearRegression", "params": {}},
    {"model": "sklearn.svm.LinearSVR", "params": {}},
]

# ids to identify each experiment
ids = [
    "random_forest_1",
    "random_forest_2",
    "linear_regression",
    "support_vector_regression",
]

# output files
files = [f"{i}.ipynb" for i in ids]

# execute notebooks using ploomber-engine
for f, p in zip(files, params):
    execute_notebook("train.ipynb", output_path=f, parameters=p, progress_bar=False)
/home/docs/checkouts/readthedocs.org/user_builds/sklearn-evaluation/conda/latest/lib/python3.10/site-packages/sklearn/metrics/_regression.py:918: UndefinedMetricWarning: R^2 score is not well-defined with less than two samples.
  warnings.warn(msg, UndefinedMetricWarning)
/home/docs/checkouts/readthedocs.org/user_builds/sklearn-evaluation/conda/latest/lib/python3.10/site-packages/sklearn/svm/_base.py:1244: ConvergenceWarning: Liblinear failed to converge, increase the number of iterations.
  warnings.warn(
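Under the hood, execute_notebook injects each parameters dictionary into the notebook following the papermill convention: values passed at execution time override the defaults declared in the notebook. A minimal pure-Python sketch of that override step (the variable names and default values here are illustrative, not ploomber-engine internals):

```python
# Defaults as they might be declared in train.ipynb's "parameters" cell
# (hypothetical values, not the actual contents of train.py)
defaults = {"model": "sklearn.linear_model.LinearRegression", "params": {}}

# Values passed via execute_notebook(..., parameters=...)
injected = {"model": "sklearn.svm.LinearSVR", "params": {}}

# Injection conceptually overrides the defaults key by key
resolved = {**defaults, **injected}
print(resolved["model"])  # sklearn.svm.LinearSVR
```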

To use NotebookCollection, we pass a list of paths and, optionally, ids for each notebook (paths are used as ids by default).

The only requirement is that the cells whose output we want to extract must be tagged; each tag then becomes a key in the notebook collection. For instructions on adding tags, see this.
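For example, if train.py is a jupytext percent-format script, tags can be declared directly in the cell markers. A sketch with hypothetical cell contents (the real train.py may tag its cells differently):

```python
import re

# A percent-format script with tagged cells; after execution,
# NotebookCollection uses each tag ("metrics", "plot") as a key
script = '''\
# %% tags=["metrics"]
metrics_df  # a single-row data frame with mae/mse/r2

# %% tags=["plot"]
ax  # a y_true vs y_pred chart
'''

# Extract the declared tags from the cell markers
tags = re.findall(r'tags=\["(\w+)"\]', script)
print(tags)  # ['metrics', 'plot']
```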

Extracted tables color certain cells to identify the best and worst metrics. By default, all metrics are assumed to be errors (smaller is better). If you are using scores (larger is better), pass scores=True; if you have both, pass a list with the names of the score metrics:
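That rule can be sketched in pure Python (the metric values below are made up for illustration): for each metric, the highlighted "best" value is the minimum unless the metric is listed in scores, in which case it is the maximum.

```python
# Hypothetical metric values for two experiments
results = {"mae": [0.335, 0.530], "r2": [0.804, 0.597]}
scores = ["r2"]  # larger is better; everything else is treated as an error

# The value highlighted as "best" for each metric
best = {m: max(v) if m in scores else min(v) for m, v in results.items()}
print(best)  # {'mae': 0.335, 'r2': 0.804}
```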

nbs = NotebookCollection(paths=files, ids=ids, scores=["r2"])

To get a list of tags available:

list(nbs)
['model_name', 'feature_names', 'model_params', 'plot', 'metrics', 'houseage']

model_params contains a dictionary with the model parameters. Let’s get them (click on the tabs to switch):

# pro-tip: when typing the tag, press the "Tab" key for autocompletion!
nbs["model_params"]
{
    'bootstrap': True,
    'ccp_alpha': 0.0,
    'criterion': 'squared_error',
    'max_depth': None,
    'max_features': 1.0,
    'max_leaf_nodes': None,
    'max_samples': None,
    'min_impurity_decrease': 0.0,
    'min_samples_leaf': 1,
    'min_samples_split': 2,
    'min_weight_fraction_leaf': 0.0,
    'n_estimators': 50,
    'n_jobs': None,
    'oob_score': False,
    'random_state': None,
    'verbose': 0,
    'warm_start': False,
}
{
    'bootstrap': True,
    'ccp_alpha': 0.0,
    'criterion': 'squared_error',
    'max_depth': None,
    'max_features': 1.0,
    'max_leaf_nodes': None,
    'max_samples': None,
    'min_impurity_decrease': 0.0,
    'min_samples_leaf': 1,
    'min_samples_split': 2,
    'min_weight_fraction_leaf': 0.0,
    'n_estimators': 100,
    'n_jobs': None,
    'oob_score': False,
    'random_state': None,
    'verbose': 0,
    'warm_start': False,
}
{
    'copy_X': True,
    'fit_intercept': True,
    'n_jobs': None,
    'positive': False,
}
{
    'C': 1.0,
    'dual': True,
    'epsilon': 0.0,
    'fit_intercept': True,
    'intercept_scaling': 1.0,
    'loss': 'epsilon_insensitive',
    'max_iter': 1000,
    'random_state': None,
    'tol': 0.0001,
    'verbose': 0,
}

plot has a y_true vs y_pred chart:

nbs["plot"]

On each notebook, metrics outputs a data frame with a single row with mean absolute error (mae) and mean squared error (mse) as columns.

For single-row tables, a “Compare” tab shows all results at once:

nbs["metrics"]
  random_forest_1 random_forest_2 linear_regression support_vector_regression
mae 0.335007 0.335184 0.529571 1.362412
mse 0.261070 0.260232 0.536969 3.397946
r2 0.804088 0.804717 0.597049 -1.549878
mae mse r2
0 0.335007 0.26107 0.804088
mae mse r2
0 0.335184 0.260232 0.804717
mae mse r2
0 0.529571 0.536969 0.597049
mae mse r2
0 1.362412 3.397946 -1.549878

We can see that the second random forest performs best overall: it has the lowest mse and the highest r2 (the first random forest’s mae is only marginally lower).

houseage contains a multi-row table with error metrics broken down by the HouseAge indicator feature. Multi-row tables do not display the “Compare” tab:

nbs["houseage"]
[Output: four tabs (one per experiment), each showing a table of mae, mse, and r2 broken down by HouseAge (1.0 through 52.0); the support vector regression has negative r2 for most HouseAge values.]

If we only compare two notebooks, the output is a bit different:

# only compare two notebooks
nbs_two = NotebookCollection(paths=files[:2], ids=ids[:2], scores=["r2"])

Comparing single-row tables includes a diff column with the error difference between experiments. Error reductions are shown in green, increases in red:

nbs_two["metrics"]
  random_forest_1 random_forest_2 diff diff_relative ratio
mae 0.335007 0.335184 0.000177 0.05% 1.000528
mse 0.261070 0.260232 -0.000838 -0.32% 0.996790
r2 0.804088 0.804717 0.000629 0.08% 1.000782
mae mse r2
0 0.335007 0.26107 0.804088
mae mse r2
0 0.335184 0.260232 0.804717
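The comparison columns can be reproduced by hand; they appear to follow diff = b - a, diff_relative = diff / a, and ratio = b / a. A quick check against the mae row, using the values from the table above:

```python
# mae for random_forest_1 (a) and random_forest_2 (b), from the table above
a, b = 0.335007, 0.335184

diff = b - a              # absolute difference
diff_relative = diff / a  # relative difference
ratio = b / a             # ratio between experiments

print(round(diff, 6), f"{diff_relative:.2%}", round(ratio, 6))
# 0.000177 0.05% 1.000528
```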

When comparing multi-row tables, the “Compare” tab appears, showing the difference between the tables:

nbs_two["houseage"]
[Output: a “Compare” tab showing the element-wise difference between the two experiments’ tables, followed by one tab per experiment with its mae, mse, and r2 broken down by HouseAge.]

When displaying dictionaries, a “Compare” tab shows a diff view:

nbs_two["model_params"]
[Output: a side-by-side diff of the two parameter dictionaries; the only changed line is 'n_estimators': 50 vs. 'n_estimators': 100.]
{
    'bootstrap': True,
    'ccp_alpha': 0.0,
    'criterion': 'squared_error',
    'max_depth': None,
    'max_features': 1.0,
    'max_leaf_nodes': None,
    'max_samples': None,
    'min_impurity_decrease': 0.0,
    'min_samples_leaf': 1,
    'min_samples_split': 2,
    'min_weight_fraction_leaf': 0.0,
    'n_estimators': 50,
    'n_jobs': None,
    'oob_score': False,
    'random_state': None,
    'verbose': 0,
    'warm_start': False,
}
{
    'bootstrap': True,
    'ccp_alpha': 0.0,
    'criterion': 'squared_error',
    'max_depth': None,
    'max_features': 1.0,
    'max_leaf_nodes': None,
    'max_samples': None,
    'min_impurity_decrease': 0.0,
    'min_samples_leaf': 1,
    'min_samples_split': 2,
    'min_weight_fraction_leaf': 0.0,
    'n_estimators': 100,
    'n_jobs': None,
    'oob_score': False,
    'random_state': None,
    'verbose': 0,
    'warm_start': False,
}

Lists (and sets) are compared based on element membership:

nbs_two["feature_names"]
Both Only in random_forest_1 Only in random_forest_2
AveBedrms
AveOccup
AveRooms
HouseAge
Latitude
Longitude
MedInc
Population
['MedInc', 'HouseAge', 'AveRooms', 'AveBedrms', 'Population', 'AveOccup', 'Latitude', 'Longitude']
['MedInc', 'HouseAge', 'AveRooms', 'AveBedrms', 'Population', 'AveOccup', 'Latitude', 'Longitude']
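The membership comparison shown above can be sketched with plain Python sets (the feature lists below are hypothetical and trimmed for brevity; in the actual example both experiments use the same features):

```python
a = ["MedInc", "HouseAge", "AveRooms"]  # features in experiment 1
b = ["MedInc", "HouseAge", "Latitude"]  # features in experiment 2

# Compare by membership: shared elements and elements unique to each side
both = sorted(set(a) & set(b))
only_a = sorted(set(a) - set(b))
only_b = sorted(set(b) - set(a))
print(both, only_a, only_b)
# ['HouseAge', 'MedInc'] ['AveRooms'] ['Latitude']
```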

Using the mapping interface#

NotebookCollection has a dict-like interface; you can retrieve data from individual notebooks:

nbs["model_params"]["random_forest_1"]
{'bootstrap': True,
 'ccp_alpha': 0.0,
 'criterion': 'squared_error',
 'max_depth': None,
 'max_features': 1.0,
 'max_leaf_nodes': None,
 'max_samples': None,
 'min_impurity_decrease': 0.0,
 'min_samples_leaf': 1,
 'min_samples_split': 2,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 50,
 'n_jobs': None,
 'oob_score': False,
 'random_state': None,
 'verbose': 0,
 'warm_start': False}
nbs["plot"]["random_forest_2"]
[Output: y_true vs. y_pred chart for random_forest_2]