
Ignore order of arguments while doing layernorm + pointwise fusion #2189

Merged
3 commits merged into develop from layernorm_matcher_order on Sep 27, 2023

Conversation

umangyadav
Member

See the conversation here: #2185 (comment)
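For context beyond the linked thread: the title indicates the fusion should accept the layernorm result in either argument position of the following pointwise op. Below is a minimal, self-contained sketch of what order-insensitive argument matching means; the struct and function here are hypothetical illustrations, not MIGraphX's actual matcher API.

```cpp
// Hypothetical illustration (not MIGraphX's matcher API): a binary pointwise
// op should be fusable with a preceding layernorm whether the layernorm
// result is its first or its second argument.
#include <array>
#include <optional>
#include <string>

struct instruction
{
    std::string op;                          // e.g. "layernorm", "add", "mul"
    std::array<const instruction*, 2> args;  // inputs of a binary pointwise op
};

// Returns the non-layernorm operand when either argument of `pw` is produced
// by a layernorm, regardless of argument order; std::nullopt means no match.
std::optional<const instruction*> match_layernorm_pointwise(const instruction& pw)
{
    for(std::size_t i = 0; i < pw.args.size(); ++i)
    {
        if(pw.args[i] != nullptr and pw.args[i]->op == "layernorm")
            return pw.args[1 - i];
    }
    return std::nullopt;
}
```

Under this check both add(layernorm(x), y) and add(y, layernorm(x)) match, which is the order-insensitivity the PR title describes.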

@umangyadav umangyadav self-assigned this Sep 14, 2023
@TedThemistokleous TedThemistokleous added the roadmap (Tasks to finish for a release) and bugfix (Fixes a bug found in the code) labels Sep 14, 2023
@TedThemistokleous TedThemistokleous linked an issue Sep 14, 2023 that may be closed by this pull request
codecov bot commented Sep 14, 2023

Codecov Report

Merging #2189 (c9d52d6) into develop (434a06c) will not change coverage.
Report is 2 commits behind head on develop.
The diff coverage is n/a.

❗ Current head c9d52d6 differs from pull request most recent head d6fd07c. Consider uploading reports for the commit d6fd07c to get more accurate results.

@@           Coverage Diff            @@
##           develop    #2189   +/-   ##
========================================
  Coverage    91.53%   91.53%           
========================================
  Files          429      429           
  Lines        16011    16011           
========================================
  Hits         14655    14655           
  Misses        1356     1356           

@TedThemistokleous TedThemistokleous added the simple (small or simple changes) label Sep 14, 2023
@migraphx-bot
Collaborator

Test  Batch  Rate new (e4c468)  Rate old (7ed797)  Diff
torchvision-resnet50 64 2,283.20 2,280.48 0.12%
torchvision-resnet50_fp16 64 5,356.77 5,358.02 -0.02%
torchvision-densenet121 32 1,832.45 1,837.97 -0.30%
torchvision-densenet121_fp16 32 3,394.39 3,386.07 0.25%
torchvision-inceptionv3 32 1,344.15 1,341.03 0.23%
torchvision-inceptionv3_fp16 32 2,584.69 2,593.95 -0.36%
cadene-inceptionv4 16 680.67 679.08 0.23%
cadene-resnext64x4 16 590.88 590.64 0.04%
slim-mobilenet 64 7,219.55 7,223.57 -0.06%
slim-nasnetalarge 64 236.87 236.87 0.00%
slim-resnet50v2 64 2,525.03 2,528.03 -0.12%
bert-mrpc-onnx 8 710.28 720.86 -1.47%
bert-mrpc-tf 1 390.72 389.63 0.28%
pytorch-examples-wlang-gru 1 304.38 306.07 -0.55%
pytorch-examples-wlang-lstm 1 309.82 313.93 -1.31%
torchvision-resnet50_1 1 560.73 558.72 0.36%
torchvision-inceptionv3_1 1 308.33 309.43 -0.36%
cadene-dpn92_1 1 355.98 351.12 1.38%
cadene-resnext101_1 1 220.72 220.34 0.18%
slim-vgg16_1 1 224.23 224.35 -0.05%
slim-mobilenet_1 1 1,489.22 1,485.44 0.25%
slim-inceptionv4_1 1 221.17 223.95 -1.24%
onnx-taau-downsample 1 322.78 322.94 -0.05%
dlrm-criteoterabyte 1 21.70 21.69 0.05%
dlrm-criteoterabyte_fp16 1 40.62 40.63 -0.02%
agentmodel 1 5,839.89 5,743.33 1.68%
unet_fp16 2 55.09 55.13 -0.07%

This build is OK for merge ✅

@migraphx-bot
Collaborator


✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance
✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance
✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance
🔴 torchvision-inceptionv3_1: FAILED: MIGraphX is not within tolerance - check verbose output
🔴 cadene-dpn92_1: FAILED: MIGraphX is not within tolerance - check verbose output
✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance
✅ slim-vgg16_1: PASSED: MIGraphX meets tolerance
✅ slim-mobilenet_1: PASSED: MIGraphX meets tolerance
🔴 slim-inceptionv4_1: FAILED: MIGraphX is not within tolerance - check verbose output
✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance
✅ agentmodel: PASSED: MIGraphX meets tolerance
✅ unet: PASSED: MIGraphX meets tolerance

@kahmed10 kahmed10 removed their request for review September 27, 2023 00:54
@causten causten merged commit 03d8a25 into develop Sep 27, 2023
14 of 15 checks passed
@causten causten deleted the layernorm_matcher_order branch September 27, 2023 17:43
Labels
bugfix (Fixes a bug found in the code), roadmap (Tasks to finish for a release), simple (small or simple changes)
Development

Successfully merging this pull request may close these issues.

Investigate DETR model failure with latest develop branch
6 participants