
Remove contiguous from passes for reshapes #2319

Merged
merged 10 commits into develop from remove_contiguous_from_passes
Nov 15, 2023

Conversation

TedThemistokleous
Collaborator

@TedThemistokleous TedThemistokleous commented Oct 11, 2023

@TedThemistokleous TedThemistokleous added the Cleanup Cleans up code from stale bits/warnings/previous changes for a previous feature PR label Oct 11, 2023
@TedThemistokleous TedThemistokleous self-assigned this Oct 11, 2023
@codecov

codecov bot commented Oct 11, 2023

Codecov Report

❗ No coverage uploaded for pull request base (develop@d8011ad). Click here to learn what that means.
The diff coverage is n/a.

❗ Current head ac8f353 differs from pull request most recent head af3e36c. Consider uploading reports for the commit af3e36c to get more accurate results

@@            Coverage Diff             @@
##             develop    #2319   +/-   ##
==========================================
  Coverage           ?   91.45%           
==========================================
  Files              ?      433           
  Lines              ?    16161           
  Branches           ?        0           
==========================================
  Hits               ?    14779           
  Misses             ?     1382           
  Partials           ?        0           
Files Coverage Δ
src/simplify_algebra.cpp 96.80% <ø> (ø)

@umangyadav
Member

Need more changes inside:
find_reshaper()
find_resize()
find_pointwise_reshape_pointwise
find_transpose_contiguous_reshaper_unary

In general, look for make_op("contiguous"), see if a reshape or "reshaper" op follows it, and check whether the contiguous can be removed.

@migraphx-bot
Collaborator

migraphx-bot commented Oct 12, 2023

Test | Batch | Rate new (commit fef5bd) | Rate old (commit fa0dc3) | Diff
torchvision-resnet50 64 2,827.68 2,830.58 -0.10%
torchvision-resnet50_fp16 64 6,489.84 6,496.69 -0.11%
torchvision-densenet121 32 2,087.56 2,088.66 -0.05%
torchvision-densenet121_fp16 32 3,660.78 3,666.42 -0.15%
torchvision-inceptionv3 32 1,578.91 1,585.70 -0.43%
torchvision-inceptionv3_fp16 32 2,571.29 2,570.38 0.04%
cadene-inceptionv4 16 702.81 702.77 0.01%
cadene-resnext64x4 16 691.75 691.92 -0.03%
slim-mobilenet 64 8,313.98 8,329.13 -0.18%
slim-nasnetalarge 64 225.28 225.46 -0.08%
slim-resnet50v2 64 2,663.59 2,665.48 -0.07%
bert-mrpc-onnx 8 821.25 820.68 0.07%
bert-mrpc-tf 1 385.28 389.19 -1.01%
pytorch-examples-wlang-gru 1 291.14 299.36 -2.74%
pytorch-examples-wlang-lstm 1 303.20 307.35 -1.35%
torchvision-resnet50_1 1 601.57 607.66 -1.00%
torchvision-inceptionv3_1 1 337.16 343.76 -1.92%
cadene-dpn92_1 1 392.62 398.97 -1.59%
cadene-resnext101_1 1 327.61 329.07 -0.44%
slim-vgg16_1 1 458.13 459.55 -0.31%
slim-mobilenet_1 1 2,101.62 2,081.53 0.96%
slim-inceptionv4_1 1 217.60 219.47 -0.85%
onnx-taau-downsample 1 304.15 304.59 -0.14%
dlrm-criteoterabyte 1 21.60 21.61 -0.07%
dlrm-criteoterabyte_fp16 1 40.64 40.64 0.00%
agentmodel 1 nan nan nan%
unet_fp16 2 54.71 54.95 -0.45%
resnet50v1_fp16 1 948.78 982.15 -3.40% 🔴
bert_base_cased_fp16 64 902.96 967.75 -6.70% 🔴
bert_large_uncased_fp16 32 285.56 304.02 -6.07% 🔴
bert_large_fp16 1 167.16 166.60 0.34%
distilgpt2_fp16 16 1,282.12 1,272.94 0.72%

This build is not recommended to merge 🔴

@migraphx-bot
Collaborator

migraphx-bot commented Oct 12, 2023


✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance

✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance

✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance

✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance

✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance

✅ torchvision-inceptionv3_1: PASSED: MIGraphX meets tolerance

✅ cadene-dpn92_1: PASSED: MIGraphX meets tolerance

✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance

✅ slim-vgg16_1: PASSED: MIGraphX meets tolerance

✅ slim-mobilenet_1: PASSED: MIGraphX meets tolerance

✅ slim-inceptionv4_1: PASSED: MIGraphX meets tolerance

✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance

❌ agentmodel: ERROR - check error output

Traceback (most recent call last):
  File "/src/AMDMIGraphX/tools/accuracy/accuracy_checker.py", line 336, in
    main()
  File "/src/AMDMIGraphX/tools/accuracy/accuracy_checker.py", line 254, in main
    pred_migx = np.array(model.run(params)[-1])
RuntimeError: /src/AMDMIGraphX/src/targets/gpu/device/include/migraphx/gpu/device/visit.hpp:140: hip_visit_views_impl: Ranks must be the same


✅ unet: PASSED: MIGraphX meets tolerance

✅ resnet50v1: PASSED: MIGraphX meets tolerance

🔴 bert_base_cased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output

🔴 bert_large_uncased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output

✅ bert_large: PASSED: MIGraphX meets tolerance

🔴distilgpt2_fp16: FAILED: MIGraphX is not within tolerance - check verbose output

@TedThemistokleous
Collaborator Author

@kahmed10 not sure if this undoes your branch with adding contiguous. What was that PR again?

@TedThemistokleous TedThemistokleous linked an issue Oct 20, 2023 that may be closed by this pull request
Collaborator

@CharlieL7 CharlieL7 left a comment


What's the status on this PR? Waiting on Khalique to respond or additional changes?

@TedThemistokleous
Collaborator Author

What's the status on this PR? Waiting on Khalique to respond or additional changes?

Not sure, I haven't seen much movement on it. I don't intend to add anything else to this.

TedThemistokleous and others added 7 commits November 10, 2023 19:37
remove the contiguous added when transpose input is used.
remove contiguous that we expect since we don't insert contiguous anymore as part of the simplification in our simplify algebra pass
Remove the extra contiguous put into reshapes
Need to adjust the output of this for comparison. Since we're missing the contiguous, we can't rely on find_split_reshape due to the lack of a contiguous to gate off of
…implify_reshapes.

Update tests to reflect change accordingly
@TedThemistokleous TedThemistokleous force-pushed the remove_contiguous_from_passes branch from dbc784e to af3e36c Compare November 11, 2023 17:41
@TedThemistokleous
Collaborator Author

Rebased off develop and added changes based on @umangyadav's feedback pertaining to reshape, since this is for the reshape_lazy changes done previously.

@CharlieL7 if you want to take a look/have comments this should be fine to review now

One question I do have for @umangyadav about pruning other reshapes: I suppose we'll try to prune based on this list and then handle each matcher case by case? I'll create another PR for those to segment the work, if need be.

const auto& reshaper_names()
{
    // clang-format off
    static const std::unordered_set<std::string> names = {
        "flatten",
        "reshape",
        "contiguous",
        "squeeze",
        "unsqueeze"
    };
    // clang-format on
    return names;
}

@TedThemistokleous TedThemistokleous changed the title Remove contiguous from passes Remove contiguous from passes for reshapes Nov 11, 2023
Collaborator

@CharlieL7 CharlieL7 left a comment


LGTM, CI fixes needed

@@ -1897,12 +1897,17 @@ TEST_CASE(simplify_split_add_relu_reshape)
auto concatb = m2.add_instruction(b, concat);
auto sum = m2.add_instruction(migraphx::make_op("add"), input, concatb);
auto relu = m2.add_instruction(migraphx::make_op("relu"), sum);
auto rsp = m2.add_instruction(migraphx::make_op("reshape", {{"dims", {3, 8}}}), relu);
Member


Wondering what change caused this reshape to split into two.

@causten
Collaborator

causten commented Nov 14, 2023

@TedThemistokleous Format has a problem; after that I can merge.

Ted Themistokleous added 2 commits November 14, 2023 15:22
@causten causten merged commit 45ccd75 into develop Nov 15, 2023
16 checks passed
@causten causten deleted the remove_contiguous_from_passes branch November 15, 2023 03:34
@causten causten restored the remove_contiguous_from_passes branch December 1, 2023 20:21
@causten causten deleted the remove_contiguous_from_passes branch January 5, 2024 19:56
Development

Successfully merging this pull request may close these issues.

Cleanup contiguous from reshape passes
6 participants