In the description of Section 3.1.1: "Differently, we remove the last batch normalization [12] and ReLU layers [18] in the encoder, as the ReLU cuts off negative values, restricting diverse feature representations. We instead add an L2 normalization layer to make the features have a common scale." However, the code only removes the BN and ReLU layers; it does not add an L2 normalization layer. How should this addition be implemented?
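For reference, here is a minimal sketch of what adding such a layer might look like in PyTorch, assuming the encoder output is a (B, C, H, W) feature map and the normalization is applied along the channel dimension; the class name `L2Norm` and the choice of dimension are assumptions, not taken from the repo.

```python
import torch
import torch.nn.functional as F


class L2Norm(torch.nn.Module):
    """L2-normalize features along one dimension (a possible stand-in for the removed BN+ReLU)."""

    def __init__(self, dim: int = 1, eps: float = 1e-12):
        super().__init__()
        self.dim = dim
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Scale each feature vector to unit L2 norm along `dim`,
        # so all query features share a common scale.
        return F.normalize(x, p=2, dim=self.dim, eps=self.eps)


# Hypothetical usage after the encoder (encoder(x) is a placeholder for the repo's encoder):
# features = L2Norm(dim=1)(encoder(x))
```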
Not only that, the memory module does not align with the description in the paper. The ROC score is also significantly influenced by the batch size: if the batch size is not 1, the results deteriorate.
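One way to check the batch-size sensitivity is to score the test set one sample at a time. Below is a rough sketch, assuming a PyTorch model and a per-sample scoring function; `model`, `test_dataset`, and `anomaly_score` are placeholders for the repo's own model, dataset, and scoring code.

```python
import torch
from torch.utils.data import DataLoader
from sklearn.metrics import roc_auc_score

# Evaluate with batch_size=1 so the memory module sees one query at a time.
loader = DataLoader(test_dataset, batch_size=1, shuffle=False)

scores, labels = [], []
model.eval()
with torch.no_grad():
    for frame, label in loader:
        recon = model(frame)
        scores.append(anomaly_score(frame, recon).item())  # placeholder scoring function
        labels.append(int(label))

print("ROC AUC (batch_size=1):", roc_auc_score(labels, scores))
```

Comparing this against the same loop with a larger batch size would make the reported discrepancy concrete.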