
Fix #298 Handle removal of sklearn randomized l1 models #302

Merged · 2 commits · Mar 29, 2019

Conversation

@teabolt (Contributor) commented Mar 27, 2019

Fixes #298
Wrap the import and the decorator registrations in a try-except block.
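
A minimal sketch of the pattern (the eli5 and sklearn import paths shown are approximate assumptions, and the body of _select_names is elided; the real code lives in eli5/sklearn/transform.py):

    from sklearn.feature_selection.base import SelectorMixin  # path approximate for sklearn < 0.21

    from eli5.transform import transform_feature_names  # import path is an assumption

    def _select_names(est, in_names=None):
        # real implementation derives output feature names from est.get_support(); elided here
        ...

    try:
        # RandomizedLasso / RandomizedLogisticRegression were removed in scikit-learn 0.21
        from sklearn.linear_model import RandomizedLasso, RandomizedLogisticRegression
        _select_names = transform_feature_names.register(RandomizedLasso)(_select_names)
        _select_names = transform_feature_names.register(RandomizedLogisticRegression)(_select_names)
    except ImportError:
        pass
    # SelectorMixin is always importable, so it is registered unconditionally,
    # last to preserve the original decorator order.
    _select_names = transform_feature_names.register(SelectorMixin)(_select_names)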

@codecov-io
Codecov Report

Merging #302 into master will decrease coverage by 0.06%.
The diff coverage is 71.42%.

@@            Coverage Diff             @@
##           master     #302      +/-   ##
==========================================
- Coverage   97.26%   97.19%   -0.07%     
==========================================
  Files          44       44              
  Lines        2851     2854       +3     
  Branches      541      541              
==========================================
+ Hits         2773     2774       +1     
- Misses         41       43       +2     
  Partials       37       37
Impacted Files            | Coverage Δ
eli5/sklearn/transform.py | 94.28% <71.42%> (-5.72%) ⬇️

@lopuhin (Contributor) left a comment
Looks great @teabolt, thank you! Left some comments, but I think the PR is good to be merged as-is.

scikit-learn 0.21 is not released yet, but it would be picked up on CI once it's on PyPI, so it's good that we'll have less breakage then.

    _select_names = transform_feature_names.register(RandomizedLogisticRegression)(_select_names)
except ImportError:  # Removed in scikit-learn 0.21
    pass
_select_names = transform_feature_names.register(SelectorMixin)(_select_names)
Contributor (@lopuhin)
SelectorMixin could be left as a decorator, right?

Contributor Author (@teabolt)

You are correct. I expanded it out and placed SelectorMixin last to keep the original order in which the decorators are applied (RandomizedLasso -> RandomizedLogisticRegression -> SelectorMixin). But in this case the order does not seem to matter.
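
As a toy illustration of that order (the register below is a stand-in, not the real transform_feature_names.register): stacked decorators are applied bottom-up, so the decorator closest to the function runs first.

    applied = []

    def register(cls):
        def deco(func):
            applied.append(cls.__name__)
            return func
        return deco

    class RandomizedLasso: pass
    class RandomizedLogisticRegression: pass
    class SelectorMixin: pass

    @register(SelectorMixin)
    @register(RandomizedLogisticRegression)
    @register(RandomizedLasso)
    def _select_names(est, in_names=None):
        ...

    print(applied)  # ['RandomizedLasso', 'RandomizedLogisticRegression', 'SelectorMixin']

The expanded form, register(RandomizedLasso)(...), then register(RandomizedLogisticRegression)(...), then register(SelectorMixin)(...), reproduces the same order.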

@@ -21,15 +18,23 @@

# Feature selection:

@transform_feature_names.register(SelectorMixin)
@transform_feature_names.register(RandomizedLogisticRegression)
@transform_feature_names.register(RandomizedLasso)
Contributor (@lopuhin)

I had an idea for another approach here, but after some thought it looks worse than what is implemented here; still, let me put it here:

  • put imports at the top, set RandomizedLasso etc. to None if they fail to import
  • make it possible to write something equivalent to @transform_feature_names.register(RandomizedLasso) which would do nothing if RandomizedLasso is None. But it seems that we'd need a wrapper for that, something like @register(transform_feature_names, RandomizedLasso)

The advantage is that the flow would be more natural (imports at the top, decorators above the function). The disadvantage is that it might be a bit confusing that RandomizedLasso can be None, and we would also need to introduce the wrapper (a rough sketch follows below). So to me it seems it would make sense only if we get a lot of similar conditional registers.
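
A rough sketch of what such a wrapper could look like (a hypothetical helper, not part of eli5; it assumes transform_feature_names is already imported as the dispatcher):

    try:
        from sklearn.linear_model import RandomizedLasso
    except ImportError:  # removed in scikit-learn 0.21
        RandomizedLasso = None

    def register(dispatcher, cls):
        # Like dispatcher.register(cls), but a no-op when cls is None.
        def deco(func):
            if cls is None:
                return func
            return dispatcher.register(cls)(func)
        return deco

    @register(transform_feature_names, RandomizedLasso)
    def _select_names(est, in_names=None):
        ...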

Contributor Author (@teabolt)

Thanks for the suggestions @lopuhin. Looking at what was recently deprecated in sklearn, there don't seem to be too many removals of existing models. But I would certainly prefer your approach if we face more issues like this down the line.

@lopuhin (Contributor) left a comment

Looks great, thanks @teabolt
