Fix #298 Handle removal of sklearn randomized l1 models #302
Conversation
Codecov Report
```diff
@@            Coverage Diff             @@
##           master     #302      +/-   ##
==========================================
- Coverage   97.26%   97.19%   -0.07%
==========================================
  Files          44       44
  Lines        2851     2854       +3
  Branches      541      541
==========================================
+ Hits         2773     2774       +1
- Misses         41       43       +2
  Partials       37       37
```
Looks great @teabolt, thank you! Left some comments, but I think the PR is good to be merged as-is.
scikit-learn 0.21 is not released yet, but it will be picked up on CI once it's on PyPI, so it's good that we'll have less breakage then.
eli5/sklearn/transform.py (Outdated)
```python
    _select_names = transform_feature_names.register(RandomizedLogisticRegression)(_select_names)
except ImportError:  # Removed in scikit-learn 0.21
    pass
_select_names = transform_feature_names.register(SelectorMixin)(_select_names)
```
`SelectorMixin` could be left as a decorator, right?
You are correct. I expanded it out and placed `SelectorMixin` last to keep the original order in which the decorators are applied (RandomizedLasso -> RandomizedLogisticRegression -> SelectorMixin). But in this case the order does not seem to matter.
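The point about decorator order can be illustrated with a minimal sketch. This assumes `transform_feature_names` behaves like `functools.singledispatch` (as in eli5), and uses empty stand-in classes for the sklearn estimators; stacked decorators are applied bottom-up, which is why `SelectorMixin` went last when the stack was expanded into explicit calls:

```python
from functools import singledispatch


@singledispatch
def transform_feature_names(est, in_names=None):
    # Stand-in for eli5's generic dispatch function.
    raise NotImplementedError()


# Stand-ins for the sklearn classes discussed in this PR.
class RandomizedLasso: pass
class RandomizedLogisticRegression: pass
class SelectorMixin: pass


# Decorators apply bottom-up: RandomizedLasso first, SelectorMixin last.
@transform_feature_names.register(SelectorMixin)
@transform_feature_names.register(RandomizedLogisticRegression)
@transform_feature_names.register(RandomizedLasso)
def _select_names(est, in_names=None):
    return ["selected"]

# The stack above is equivalent to these explicit registrations:
# _select_names = transform_feature_names.register(RandomizedLasso)(_select_names)
# _select_names = transform_feature_names.register(RandomizedLogisticRegression)(_select_names)
# _select_names = transform_feature_names.register(SelectorMixin)(_select_names)
```

Because each `register(cls)` call returns the function unchanged, the relative order has no observable effect here, matching the comment above.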
```diff
@@ -21,15 +18,23 @@

 # Feature selection:

 @transform_feature_names.register(SelectorMixin)
 @transform_feature_names.register(RandomizedLogisticRegression)
 @transform_feature_names.register(RandomizedLasso)
```
I had an idea for another approach here, but after some thought it looks worse than what is implemented here; still, let me put it here:

- put imports at the top, set `RandomizedLasso` etc. to `None` if they fail to import
- make it possible to write something equivalent to `@transform_feature_names.register(RandomizedLasso)` which would do nothing if `RandomizedLasso` is `None`. But it seems that we'd need a wrapper for that, something like `@register(transform_feature_names, RandomizedLasso)`

The advantage is that the flow would be more natural (imports at the top, decorators above the function). The disadvantage is that it might be a bit confusing that `RandomizedLasso` can be `None`, and also we would need to introduce the wrapper. So to me it seems that it would make sense only if we get a lot of similar conditional registers.
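The proposed `register(dispatcher, cls)` wrapper is hypothetical (it was never added to eli5), but it could be sketched roughly like this, again with `functools.singledispatch` standing in for eli5's dispatch mechanism:

```python
from functools import singledispatch

# Simulate the import that fails on scikit-learn >= 0.21 (or when
# scikit-learn is absent): the name falls back to None.
try:
    from sklearn.linear_model import RandomizedLasso
except ImportError:
    RandomizedLasso = None


@singledispatch
def transform_feature_names(est, in_names=None):
    # Stand-in for eli5's generic dispatch function.
    raise NotImplementedError()


def register(dispatcher, cls):
    """Hypothetical wrapper: register cls with dispatcher, or do
    nothing if cls failed to import and is None."""
    if cls is None:
        return lambda func: func  # no-op decorator
    return dispatcher.register(cls)


@register(transform_feature_names, RandomizedLasso)
def _select_names(est, in_names=None):
    return ["selected"]
```

This keeps imports at the top and decorators above the function, at the cost of `RandomizedLasso` sometimes being `None`, as the comment notes.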
Thanks for the suggestions @lopuhin. Looking at the recently deprecated estimators in sklearn, there don't seem to be too many removals of existing models. But I would certainly prefer your approach if we face more issues like this down the line.
Looks great, thanks @teabolt
Fixes #298
Wrap import and decorators in a try-except block.
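The approach the PR description summarizes can be sketched end-to-end. This is a minimal, self-contained approximation (using `functools.singledispatch` and a stand-in `SelectorMixin` class rather than eli5's actual module), not the exact eli5 source:

```python
from functools import singledispatch


@singledispatch
def transform_feature_names(est, in_names=None):
    # Stand-in for eli5's generic dispatch function.
    raise NotImplementedError()


class SelectorMixin:  # stand-in for sklearn.feature_selection's SelectorMixin
    pass


def _select_names(est, in_names=None):
    return ["selected"]


# Registrations for classes that may no longer exist are wrapped in
# try/except together with their import:
try:
    from sklearn.linear_model import RandomizedLasso, RandomizedLogisticRegression
    _select_names = transform_feature_names.register(RandomizedLasso)(_select_names)
    _select_names = transform_feature_names.register(
        RandomizedLogisticRegression)(_select_names)
except ImportError:  # Removed in scikit-learn 0.21
    pass

# SelectorMixin is still available, so it is registered unconditionally.
_select_names = transform_feature_names.register(SelectorMixin)(_select_names)
```

With this layout, eli5 keeps working whether or not the randomized l1 models exist in the installed scikit-learn version.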