
Converting clip with a min value of 0 as a ReLU #69

Merged · 1 commit · Mar 27, 2019

Conversation

ElteHupkes
Contributor

This is a minor optimization: in my TensorFlow graphs I can see that the clip_by_value lambda layer is converted into two layers (a clip with a maximum and a clip with a minimum). When the minimum value of the clip is 0, the clip is equivalent to a ReLU layer with the max argument set. The resulting output then has just a single node in the final graph, and if the max value is 6 it actually turns into a ReLU6 layer automatically. In my experience the clip nodes often come from ReLU6 in the original graph, so this makes for the best conversion.
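The equivalence behind the change can be sketched numerically with NumPy (the helper names here are illustrative, not from the patch; the actual conversion would emit a Keras ReLU layer with its max argument set):

```python
import numpy as np

def clip_as_two_layers(x, min_val, max_val):
    # The pre-patch conversion: two separate nodes,
    # a clip with a maximum followed by a clip with a minimum.
    return np.maximum(np.minimum(x, max_val), min_val)

def relu_with_max(x, max_val):
    # The post-patch conversion when min_val == 0: a single
    # ReLU node with max set (equivalent to ReLU6 when max_val == 6).
    return np.minimum(np.maximum(x, 0.0), max_val)

x = np.array([-3.0, 0.0, 2.5, 6.0, 9.0])
print(np.allclose(clip_as_two_layers(x, 0.0, 6.0), relu_with_max(x, 6.0)))  # True
```

Both paths saturate negative inputs at 0 and cap positives at max_val, so the single-node form loses nothing while halving the node count for this pattern.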

@gmalivenko gmalivenko merged commit af4b9da into gmalivenko:master Mar 27, 2019
@gmalivenko
Owner

Thanks!
