
Commit 7cceed1

FIX Add missing "be" in cross-validation-curve notebook (#868)
Co-authored-by: Arturo Amor <[email protected]>
1 parent: 454f95b

File tree: 2 files changed (+20 −20 lines)


notebooks/cross_validation_validation_curve.ipynb (12 additions, 12 deletions)

@@ -255,18 +255,18 @@
 "errors made during the data collection process (besides not measuring the\n",
 "unobserved input feature).\n",
 "\n",
-"One extreme case could happen if there where samples in the dataset with\n",
-"exactly the same input feature values but different values for the target\n",
-"variable. That is very unlikely in real life settings, but could the case if\n",
-"all features are categorical or if the numerical features were discretized\n",
-"or rounded up naively. In our example, we can imagine two houses having\n",
-"the exact same features in our dataset, but having different prices because\n",
-"of the (unmeasured) seller's rush.\n",
-"\n",
-"Apart from these extreme case, it's hard to know for sure what should qualify\n",
-"or not as noise and which kind of \"noise\" as introduced above is dominating.\n",
-"But in practice, the best ways to make our predictive models robust to noise\n",
-"are to avoid overfitting models by:\n",
+"One extreme case could happen if there where samples in the dataset with exactly\n",
+"the same input feature values but different values for the target variable. That\n",
+"is very unlikely in real life settings, but could be the case if all features\n",
+"are categorical or if the numerical features were discretized or rounded up\n",
+"naively. In our example, we can imagine two houses having the exact same\n",
+"features in our dataset, but having different prices because of the (unmeasured)\n",
+"seller's rush.\n",
+"\n",
+"Apart from this extreme case, it's hard to know for sure what should qualify or\n",
+"not as noise and which kind of \"noise\" as introduced above is dominating. But in\n",
+"practice, the best way to make our predictive models robust to noise is to\n",
+"avoid overfitting models by:\n",
 "\n",
 "- selecting models that are simple enough or with tuned hyper-parameters as\n",
 " explained in this module;\n",

python_scripts/cross_validation_validation_curve.py (8 additions, 8 deletions)

@@ -198,16 +198,16 @@
 #
 # One extreme case could happen if there where samples in the dataset with
 # exactly the same input feature values but different values for the target
-# variable. That is very unlikely in real life settings, but could the case if
-# all features are categorical or if the numerical features were discretized
-# or rounded up naively. In our example, we can imagine two houses having
-# the exact same features in our dataset, but having different prices because
-# of the (unmeasured) seller's rush.
+# variable. That is very unlikely in real life settings, but could be the case
+# if all features are categorical or if the numerical features were discretized
+# or rounded up naively. In our example, we can imagine two houses having the
+# exact same features in our dataset, but having different prices because of the
+# (unmeasured) seller's rush.
 #
-# Apart from these extreme case, it's hard to know for sure what should qualify
+# Apart from this extreme case, it's hard to know for sure what should qualify
 # or not as noise and which kind of "noise" as introduced above is dominating.
-# But in practice, the best ways to make our predictive models robust to noise
-# are to avoid overfitting models by:
+# But in practice, the best way to make our predictive models robust to noise
+# is to avoid overfitting models by:
 #
 # - selecting models that are simple enough or with tuned hyper-parameters as
 # explained in this module;
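The extreme case discussed in the changed passage, identical recorded features paired with different target values, implies an error floor that no deterministic model can get under. A minimal sketch of that idea, using made-up house numbers (the features and prices below are illustrative assumptions, not data from the course):

```python
# Two "houses" with identical recorded features but different sale prices,
# e.g. because of an unmeasured seller's rush.
X = [(3, 120.0), (3, 120.0)]  # hypothetical features: (bedrooms, surface in m^2)
y = [250_000.0, 300_000.0]    # different target values for the same inputs

# Any deterministic model maps identical inputs to a single prediction.
# Under squared error, the optimal such prediction is the mean target.
best_pred = sum(y) / len(y)

# The resulting mean squared error is strictly positive: it is irreducible
# noise that no amount of model tuning can remove.
irreducible_mse = sum((t - best_pred) ** 2 for t in y) / len(y)
print(best_pred, irreducible_mse)  # 275000.0 625000000.0
```

This is why the corrected text recommends avoiding overfitting rather than chasing zero training error: past this floor, a model can only fit the noise itself.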

0 commit comments
