
[REQUEST] Softmax activation function for encoders #416

Open
Tengoles opened this issue Sep 8, 2024 · 1 comment
Labels: enhancement (New feature or request)

Comments

@Tengoles commented Sep 8, 2024

Is there a particular reason why softmax activation functions are not supported? I understand they can be useful in a resource-allocation scenario. As far as I can tell, adding one only requires a slight change in this file:

def create_activation(activation_type: str) -> nn.Module:
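For reference, here is a minimal sketch of the kind of change being suggested, assuming the factory simply dispatches on the activation name. The specific branches below are illustrative, not d3rlpy's actual source:

```python
import torch.nn as nn

def create_activation(activation_type: str) -> nn.Module:
    # Existing activations (names assumed for illustration).
    if activation_type == "relu":
        return nn.ReLU()
    elif activation_type == "tanh":
        return nn.Tanh()
    elif activation_type == "swish":
        return nn.SiLU()
    # Proposed addition: softmax over the feature dimension,
    # so encoder outputs form a probability simplex, e.g. for
    # resource-allocation settings.
    elif activation_type == "softmax":
        return nn.Softmax(dim=-1)
    raise ValueError(f"invalid activation_type: {activation_type}")
```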

@Tengoles added the enhancement label Sep 8, 2024
@takuseno (Owner) commented Sep 9, 2024

@Tengoles Hi, thanks for the issue. I feel that softmax is not a common choice for an activation function. Do you have any papers where a softmax activation is actually used?
