
Modify reshapes #2099

Merged: causten merged 61 commits into develop from modify_reshapes_develop on Sep 27, 2023
Conversation

TedThemistokleous
Collaborator

Modify reshapes to use reshape_lazy for aliasing and reshape for a copying reshape operation, in order to eliminate contiguous.

Used to get past reshape errors seen when running TIMM models for TorchMIGraphX.
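For context, here is a minimal sketch of how the two operators divide the work after this change, assuming the usual MIGraphX C++ module API (program, add_parameter, add_instruction, make_op) and that reshape_lazy takes the same `dims` attribute as reshape. This is illustrative only, not code from the PR.

```cpp
// Illustrative sketch only (not code from this PR). Assumes the MIGraphX C++
// module API and that reshape_lazy shares reshape's "dims" attribute.
#include <migraphx/program.hpp>
#include <migraphx/make_op.hpp>
#include <migraphx/shape.hpp>

int main()
{
    migraphx::program p;
    auto* mm = p.get_main_module();

    migraphx::shape s{migraphx::shape::float_type, {2, 3, 4}};
    auto x = mm->add_parameter("x", s);

    // reshape_lazy aliases the input buffer: it is only valid when the new
    // dims can be expressed as a view of the existing memory layout.
    auto aliased = mm->add_instruction(
        migraphx::make_op("reshape_lazy", {{"dims", {6, 4}}}), x);

    // reshape now performs the copy itself, which is what the old
    // reshape + contiguous pair used to do for non-standard inputs.
    auto copied = mm->add_instruction(
        migraphx::make_op("reshape", {{"dims", {6, 4}}}), x);

    mm->add_return({aliased, copied});
    return 0;
}
```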

TedThemistokleous and others added 20 commits August 17, 2023 13:50
Just make the old reshape into reshape_lazy, since it will be performing aliasing after we do a proper pass over reshapes.
The idea here is to let reshape fall into a contiguous to do the copy required for a nonstandard-shape to standard-shape conversion.
This will get filtered out once the proper order and aliasing is determined.
Use this to find a reshape->contiguous pair and then determine whether aliasing can be done, and thus whether to use the reshape or reshape_lazy operator (sketched below).
These will get cleaned up later, but they result in us adding a contiguous after every reshape before we run a find_reshape_alias matcher.
Unable to get dims() to populate correctly; fails at replace_instruction in lowering.
Don't need this anymore.
Not needed; we don't want the extra contiguous, as we're trying to eliminate the need to use contiguous.
Make reshape work like contiguous to perform the copy, and then add proper aliasing in lowering if we're unable to perform a replace_instruction.
Not needed anymore, as reshape performs the copy by default, so contiguous is not needed.
Reuse the previous test for reshape_lazy since we've made reshape -> reshape_lazy.
Move reshape to reshape_lazy, which performs the aliasing, for op_shape_test.
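The find_reshape_alias idea described in these commit notes could look roughly like the following. This is a hypothetical sketch assuming the MIGraphX matcher API (match::name, match::args, matcher_result) and a simple standard-shape check; the pass as actually implemented in this PR may differ in structure, naming, and the alias test.

```cpp
// Hypothetical sketch of the find_reshape_alias matcher described above; the
// real pass in this PR may differ in structure, naming, and the alias check.
#include <cstdint>
#include <vector>
#include <migraphx/matcher.hpp>
#include <migraphx/make_op.hpp>
#include <migraphx/module.hpp>

namespace match = migraphx::match;

struct find_reshape_alias
{
    // Match the contiguous that was inserted after every reshape.
    auto matcher() const
    {
        return match::name("contiguous")(
            match::args(match::name("reshape").bind("reshape")));
    }

    void apply(migraphx::module& m, const match::matcher_result& mr) const
    {
        auto cont    = mr.result;
        auto reshape = mr.instructions.at("reshape");
        auto input   = reshape->inputs().front();

        auto lens = cont->get_shape().lens();
        std::vector<int64_t> dims(lens.begin(), lens.end());

        if(input->get_shape().standard())
        {
            // Aliasing is safe: the new dims can be a view of the input, so
            // use reshape_lazy and drop both the copy and the contiguous.
            m.replace_instruction(
                cont, migraphx::make_op("reshape_lazy", {{"dims", dims}}), input);
        }
        else
        {
            // Aliasing is not safe: keep the copying reshape, which makes the
            // trailing contiguous redundant.
            m.replace_instruction(
                cont, migraphx::make_op("reshape", {{"dims", dims}}), input);
        }
    }
};
```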
@TedThemistokleous added the enhancement (New feature or request), roadmap (Tasks to finish for a release), and TorchMIGraphX labels on Aug 18, 2023
@TedThemistokleous self-assigned this on Aug 18, 2023
This was linked to issues on Aug 18, 2023
migraphx-bot (Collaborator) commented on Aug 19, 2023

| Test | Batch | Rate new (0625bb) | Rate old (52eb36) | Diff | Compare |
|------|-------|-------------------|-------------------|------|---------|
| torchvision-resnet50 | 64 | 2,274.74 | 2,320.56 | -1.97% | |
| torchvision-resnet50_fp16 | 64 | 5,346.39 | 5,338.54 | 0.15% | |
| torchvision-densenet121 | 32 | 1,819.25 | 1,846.01 | -1.45% | |
| torchvision-densenet121_fp16 | 32 | 3,393.04 | 3,413.98 | -0.61% | |
| torchvision-inceptionv3 | 32 | 1,344.90 | 1,293.50 | 3.97% | 🔆 |
| torchvision-inceptionv3_fp16 | 32 | 2,581.21 | 2,529.64 | 2.04% | |
| cadene-inceptionv4 | 16 | 680.15 | 620.22 | 9.66% | 🔆 |
| cadene-resnext64x4 | 16 | 589.97 | 588.32 | 0.28% | |
| slim-mobilenet | 64 | 7,228.10 | 7,204.25 | 0.33% | |
| slim-nasnetalarge | 64 | 236.97 | 236.62 | 0.15% | |
| slim-resnet50v2 | 64 | 2,527.65 | 2,558.85 | -1.22% | |
| bert-mrpc-onnx | 8 | 720.94 | 839.95 | -14.17% | 🔴 |
| bert-mrpc-tf | 1 | 389.34 | 389.14 | 0.05% | |
| pytorch-examples-wlang-gru | 1 | 306.06 | nan | nan% | |
| pytorch-examples-wlang-lstm | 1 | 311.80 | nan | nan% | |
| torchvision-resnet50_1 | 1 | 558.65 | 548.12 | 1.92% | |
| torchvision-inceptionv3_1 | 1 | 306.66 | 306.42 | 0.08% | |
| cadene-dpn92_1 | 1 | 349.78 | 351.80 | -0.57% | |
| cadene-resnext101_1 | 1 | 220.55 | 219.58 | 0.44% | |
| slim-vgg16_1 | 1 | 224.40 | 224.38 | 0.01% | |
| slim-mobilenet_1 | 1 | 1,463.84 | 1,501.43 | -2.50% | |
| slim-inceptionv4_1 | 1 | 220.41 | 217.85 | 1.17% | |
| onnx-taau-downsample | 1 | 322.63 | 306.13 | 5.39% | 🔆 |
| dlrm-criteoterabyte | 1 | 21.70 | nan | nan% | |
| dlrm-criteoterabyte_fp16 | 1 | 40.62 | nan | nan% | |
| agentmodel | 1 | 5,863.21 | nan | nan% | |
| unet_fp16 | 2 | 55.03 | 55.17 | -0.26% | |

This build is not recommended to merge 🔴

migraphx-bot (Collaborator) commented on Aug 19, 2023


✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance
✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance
✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance
🔴 torchvision-inceptionv3_1: FAILED: MIGraphX is not within tolerance - check verbose output
🔴 cadene-dpn92_1: FAILED: MIGraphX is not within tolerance - check verbose output
✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance
✅ slim-vgg16_1: PASSED: MIGraphX meets tolerance
✅ slim-mobilenet_1: PASSED: MIGraphX meets tolerance
🔴 slim-inceptionv4_1: FAILED: MIGraphX is not within tolerance - check verbose output
✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance
✅ agentmodel: PASSED: MIGraphX meets tolerance
✅ unet: PASSED: MIGraphX meets tolerance
✅ resnet50v1: PASSED: MIGraphX meets tolerance
🔴 bert_base_cased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output
🔴 bert_large_uncased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output
✅ bert_large: PASSED: MIGraphX meets tolerance
🔴 distilgpt2_fp16: FAILED: MIGraphX is not within tolerance - check verbose output

- Split reshape_shape into reshape_shape_invalid and reshape_shape_minus1_reshapes
- Added back assertions to handle each possible case
Similar to the reshape_lazy_dims version of this call, but we don't need to care about stride merging/aliasing, since the copy operations around the reshape let us change the memory layout to perform the correct reshape operation (see the dims sketch below).
Since we aren't worried about aliasing anymore for the squeeze/unsqueeze cases, we can rename the test to handle memory layout changes. The test is renamed to reflect that.
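As a rough illustration of what "not caring about strides" buys the copying reshape, here is a standalone sketch of resolving a requested dims vector that may contain one -1 placeholder. The function name and structure are assumptions for illustration only and do not come from the PR's dims-computation code.

```cpp
// Illustration only: resolve reshape dims containing at most one -1.
// A copying reshape only needs element counts to match; it does not have to
// merge or preserve the input strides the way an aliasing reshape_lazy does.
// e.g. resolve_reshape_dims(24, {6, -1}) == {6, 4}
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <functional>
#include <numeric>
#include <vector>

std::vector<std::size_t> resolve_reshape_dims(std::size_t n_elements,
                                              const std::vector<int64_t>& dims)
{
    std::vector<std::size_t> out;
    std::size_t known   = 1;
    int64_t minus1_slot = -1;
    for(std::size_t i = 0; i < dims.size(); ++i)
    {
        if(dims[i] == -1)
        {
            assert(minus1_slot == -1 && "only one -1 placeholder is allowed");
            minus1_slot = static_cast<int64_t>(i);
            out.push_back(1); // filled in below
        }
        else
        {
            out.push_back(static_cast<std::size_t>(dims[i]));
            known *= static_cast<std::size_t>(dims[i]);
        }
    }
    if(minus1_slot >= 0)
    {
        assert(known != 0 && n_elements % known == 0);
        out[static_cast<std::size_t>(minus1_slot)] = n_elements / known;
    }
    // Final check: the output must describe exactly as many elements as the input.
    assert(std::accumulate(out.begin(), out.end(), std::size_t{1},
                           std::multiplies<>{}) == n_elements);
    return out;
}
```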

In the reshape_lazy case, which aliases, this situation is actually an error. In the case of reshape, we can modify the output memory layout via a copy, so the output size can be larger than that of the input.
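A small sketch of that size argument, assuming migraphx::shape's lens/strides constructor and its elements()/bytes() accessors (this example is not from the PR): a broadcast input stores fewer bytes than the standard output a reshape must produce, so an aliasing reshape_lazy cannot represent it, while the copying reshape simply materializes it.

```cpp
// Sketch: why a copying reshape can produce an output larger than its input
// buffer while an aliasing reshape_lazy cannot.
#include <iostream>
#include <migraphx/shape.hpp>

int main()
{
    // A {4, 3} view broadcast from 4 stored elements: stride 0 on the second
    // axis means only 4 floats actually exist in memory.
    migraphx::shape broadcast{migraphx::shape::float_type, {4, 3}, {1, 0}};
    // The standard {12} output of reshape needs 12 stored floats.
    migraphx::shape target{migraphx::shape::float_type, {12}};

    std::cout << broadcast.elements() << " logical elements, "
              << broadcast.bytes() << " bytes stored\n"; // 12 elements, 16 bytes
    std::cout << target.elements() << " logical elements, "
              << target.bytes() << " bytes stored\n";    // 12 elements, 48 bytes

    // reshape_lazy would have to alias 48 bytes of output onto 16 bytes of
    // input, which is an error; the copying reshape writes a new buffer.
    return 0;
}
```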
@TedThemistokleous
Collaborator Author

@causten @pfultz2 still failing on that ORT test we're seeing on other PRs: [2023-09-21T03:55:13.018Z] [ RUN ] AttentionTest.AttentionBatch1_Float16

Otherwise everything else seems to be fine

…e_test

Change to a different shape in this test
@causten merged commit 7e5ccd4 into develop on Sep 27, 2023
13 of 15 checks passed
@causten deleted the modify_reshapes_develop branch on September 27, 2023 at 20:33
@pfultz2 mentioned this pull request on Nov 17, 2023
Labels: enhancement (New feature or request), roadmap (Tasks to finish for a release), TorchMIGraphX
Projects: None yet
Development: Successfully merging this pull request may close these issues:
- investigate Benchmarks failures TIMM
- Add reshape_copy operator
6 participants