Throw on data dependent outputs #1163

Merged: 21 commits, Sep 13, 2022

Changes from 1 commit
address comments + revert accidental changes
eellison committed Sep 12, 2022
commit 31892929f3205b98906543c7798e4a2d1abc3f49
torchdynamo/variables/tensor.py: 25 changes (2 additions & 23 deletions)

@@ -157,38 +157,17 @@ def context():
                             size=(), dtype=args[0].dtype
                         ).item()
                     else:
-                        unimplemented("data dependent operator")
+                        unimplemented(f"data dependent operator: {e.func}")
                 elif use_fake_tensors and isinstance(
                     e, DynamicOutputShapeException
                 ):
-                    unimplemented("dynamic shape operator")
+                    unimplemented(f"dynamic shape operator: {e.func}")
                 else:
                     raise TorchRuntimeError() from e
         else:
             if use_fake_tensors:
                 example_value = fake_wrapper(example_value)
 
-        # Avoids a .item() call in the tensor slice that would attempt to get a value out
-        # fake tensors, and which would determine the output shape of the slice.
-        # It is a workaround until https://github.com/pytorch/pytorch/pull/83567
-        # is landed and there is more complete support for breaking on data dependent operators.
-
-        if (
-            proxy.node.target == operator.getitem
-            and use_fake_tensors
-            and args is not None
-            and not config.dynamic_shapes
-        ):
-            if (
-                isinstance(args[0], FakeTensor)
-                and isinstance(args[1], slice)
-                and any(
-                    isinstance(e, FakeTensor)
-                    for e in (args[1].start, args[1].stop, args[1].step)
-                )
-            ):
-                unimplemented("dynamic shape slicing")
-
         if isinstance(example_value, torch.Tensor):
             is_parameter = isinstance(example_value, torch.nn.Parameter)
             parameter_value = initial_example_value if is_parameter else None
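For context, a minimal sketch (not part of this PR, assuming stock PyTorch) of the kinds of calls that reach these branches: .item() has a data-dependent output value, nonzero() has a data-dependent output shape, and a tensor-valued slice bound is roughly the pattern the deleted getitem special case used to guard.

# Illustrative only, not code from this diff.
import torch

x = torch.tensor([3, 0, 5])

# Data-dependent *value*: .item() needs the number actually stored in x,
# which a fake tensor (metadata only, no storage) cannot supply.
n = x[0].item()            # 3 in eager mode

# Data-dependent *shape*: nonzero()'s output size depends on how many
# elements are nonzero, not just on x.shape.
idx = torch.nonzero(x)     # shape (2, 1) here, (k, 1) in general

# A tensor-valued slice bound implicitly calls .item() on the bound; this is
# approximately the case the removed "dynamic shape slicing" workaround caught.
bound = torch.tensor(2)
head = x[:bound]           # tensor([3, 0]) in eager mode

print(n, idx.shape, head)

Under fake-tensor tracing such calls raise DataDependentOutputException or DynamicOutputShapeException; with this commit the resulting graph break reports the offending op via e.func rather than a generic message, and the deleted slicing special case is expected to flow through that general path instead (per the removed comment, pending pytorch/pytorch#83567).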