
Preserve layout of fused kernel for layernorm+pointwise #2185

Merged: causten merged 6 commits into develop from layernorm_nonstd on Sep 15, 2023

Conversation

umangyadav (Member) commented:
Fixes #2137
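
For readers coming from the linked issue: the PR title refers to keeping the result of the fused layernorm + pointwise kernel in the same layout as its input instead of defaulting to a standard packed layout. The sketch below is a minimal NumPy illustration of that requirement, not code from this PR; the helper names and the transposed-layout scenario are assumptions made for the example.

```python
# Illustrative sketch only (not MIGraphX code): a fused layernorm + pointwise
# result must be layout-agnostic in its values, whether the input is packed
# (standard strides) or a non-standard (e.g. transposed) view.
import numpy as np

def layernorm(x, eps=1e-5):
    # Normalize over the last axis, like a layernorm without affine parameters.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def fused_layernorm_pointwise(x, bias):
    # Hypothetical fused op: layernorm -> add -> relu.
    return np.maximum(layernorm(x) + bias, 0.0)

rng = np.random.default_rng(0)
base = rng.standard_normal((4, 8, 16)).astype(np.float32)

x_std = np.ascontiguousarray(base.transpose(0, 2, 1))  # packed, standard layout
x_nonstd = base.transpose(0, 2, 1)                      # same values, transposed strides
bias = rng.standard_normal((4, 16, 8)).astype(np.float32)

out_std = fused_layernorm_pointwise(x_std, bias)
out_nonstd = fused_layernorm_pointwise(x_nonstd, bias)

# A fused kernel that ignores the input layout when writing its output
# would fail this kind of check.
assert np.allclose(out_std, out_nonstd, atol=1e-6)
print("fused layernorm+pointwise matches across layouts")
```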

@umangyadav requested a review from pfultz2 on September 14, 2023 00:58
@umangyadav self-assigned this on Sep 14, 2023
codecov bot commented Sep 14, 2023

Codecov Report

Merging #2185 (3b04815) into develop (74ba964) will not change coverage.
The diff coverage is n/a.

❗ Current head 3b04815 differs from pull request most recent head a4249e4. Consider uploading reports for the commit a4249e4 to get more accurate results

@@           Coverage Diff            @@
##           develop    #2185   +/-   ##
========================================
  Coverage    91.48%   91.48%           
========================================
  Files          426      426           
  Lines        15927    15927           
========================================
  Hits         14571    14571           
  Misses        1356     1356           

migraphx-bot (Collaborator) commented Sep 14, 2023

| Test | Batch | Rate new (d23fba) | Rate old (7ed797) | Diff |
| --- | --- | --- | --- | --- |
| torchvision-resnet50 | 64 | 2,280.44 | 2,280.48 | -0.00% |
| torchvision-resnet50_fp16 | 64 | 5,364.49 | 5,358.02 | 0.12% |
| torchvision-densenet121 | 32 | 1,828.68 | 1,837.97 | -0.51% |
| torchvision-densenet121_fp16 | 32 | 3,383.38 | 3,386.07 | -0.08% |
| torchvision-inceptionv3 | 32 | 1,341.25 | 1,341.03 | 0.02% |
| torchvision-inceptionv3_fp16 | 32 | 2,587.26 | 2,593.95 | -0.26% |
| cadene-inceptionv4 | 16 | 678.04 | 679.08 | -0.15% |
| cadene-resnext64x4 | 16 | 589.22 | 590.64 | -0.24% |
| slim-mobilenet | 64 | 7,215.10 | 7,223.57 | -0.12% |
| slim-nasnetalarge | 64 | 237.12 | 236.87 | 0.11% |
| slim-resnet50v2 | 64 | 2,528.95 | 2,528.03 | 0.04% |
| bert-mrpc-onnx | 8 | 720.92 | 720.86 | 0.01% |
| bert-mrpc-tf | 1 | 391.31 | 389.63 | 0.43% |
| pytorch-examples-wlang-gru | 1 | 300.72 | 306.07 | -1.75% |
| pytorch-examples-wlang-lstm | 1 | 313.75 | 313.93 | -0.06% |
| torchvision-resnet50_1 | 1 | 555.87 | 558.72 | -0.51% |
| torchvision-inceptionv3_1 | 1 | 302.61 | 309.43 | -2.20% |
| cadene-dpn92_1 | 1 | 358.31 | 351.12 | 2.05% |
| cadene-resnext101_1 | 1 | 220.80 | 220.34 | 0.21% |
| slim-vgg16_1 | 1 | 224.29 | 224.35 | -0.03% |
| slim-mobilenet_1 | 1 | 1,507.94 | 1,485.44 | 1.51% |
| slim-inceptionv4_1 | 1 | 222.30 | 223.95 | -0.74% |
| onnx-taau-downsample | 1 | 320.72 | 322.94 | -0.69% |
| dlrm-criteoterabyte | 1 | 21.68 | 21.69 | -0.07% |
| dlrm-criteoterabyte_fp16 | 1 | 40.61 | 40.63 | -0.04% |
| agentmodel | 1 | 5,803.56 | 5,743.33 | 1.05% |
| unet_fp16 | 2 | 55.03 | 55.13 | -0.17% |

This build is OK for merge ✅

migraphx-bot (Collaborator) commented:
✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance
✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance
✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance
🔴 torchvision-inceptionv3_1: FAILED: MIGraphX is not within tolerance - check verbose output
🔴 cadene-dpn92_1: FAILED: MIGraphX is not within tolerance - check verbose output
✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance
✅ slim-vgg16_1: PASSED: MIGraphX meets tolerance
✅ slim-mobilenet_1: PASSED: MIGraphX meets tolerance
🔴 slim-inceptionv4_1: FAILED: MIGraphX is not within tolerance - check verbose output
✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance
✅ agentmodel: PASSED: MIGraphX meets tolerance
✅ unet: PASSED: MIGraphX meets tolerance

@TedThemistokleous added the enhancement (New feature or request) and bugfix (Fixes a bug found in the code.) labels on Sep 14, 2023
@causten merged commit 15acaee into develop on Sep 15, 2023
11 of 12 checks passed
@causten deleted the layernorm_nonstd branch on September 15, 2023 17:40
Successfully merging this pull request may close these issues: Investigate DETR model failure with latest develop branch.
5 participants