
Make sure sub-graph-specific code is encapsulated in Python submodules #2026

Merged

timmoon10 merged 1 commit into LLNL:develop from python-subgraph-modules on Jan 7, 2022

Conversation

@timmoon10 (Collaborator) commented on Jan 6, 2022

By importing sub-graph code directly into our base Python submodules (lbann.modules, lbann.models), we were overwriting the non-sub-graph implementations of multi-head attention and Transformer. Requiring the user to explicitly import lbann.modules.subgraph or lbann.models.subgraph seems appropriate, since this is an advanced, experimental feature.
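For illustration, here is a minimal sketch of the import pattern involved. The file and class names are assumptions for the example, not LBANN's exact layout; the point is only that re-importing a name into the base namespace silently replaces the original, while keeping the sub-graph code in its own submodule makes it an explicit opt-in:

```python
# Hypothetical sketch; module/file names are illustrative, not LBANN's exact layout.

# Before: the package __init__ pulled the sub-graph variants into the base
# namespace, so the second import silently clobbered the base classes:
#
#   from lbann.modules.transformer import MultiheadAttention, Transformer
#   from lbann.modules.subgraph.transformer import MultiheadAttention, Transformer  # overwrites the above

# After: the sub-graph code stays inside its own submodule, so users who want
# it must import it explicitly.
import lbann.modules            # base (non-sub-graph) implementations
import lbann.modules.subgraph   # experimental sub-graph implementations, opt-in

base_attn = lbann.modules.MultiheadAttention              # unchanged base version
subgraph_attn = lbann.modules.subgraph.MultiheadAttention  # sub-graph version, explicit
```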

No new Bamboo failures. Pinging @aj-prime and @tnat410.

Commit: Make sure sub-graph-specific code is encapsulated in Python submodules

We were overriding the original, non-sub-graph implementations.
@benson31 (Collaborator) left a comment


LGTM

timmoon10 merged commit 086f443 into LLNL:develop on Jan 7, 2022
timmoon10 deleted the python-subgraph-modules branch on January 7, 2022 02:03
graham63 pushed a commit to graham63/lbann that referenced this pull request on Jan 12, 2022:
Make sure sub-graph-specific code is encapsulated in Python submodules (LLNL#2026)

We were overriding the original, non-sub-graph implementations.