
Add Multilayer Perceptron (MLP) primitive for binary classification #140

Closed · Hector-hedb12 opened this issue Mar 26, 2019 · 3 comments · Fixed by #144
Labels: approved (The issue is approved and someone can start working on it) · new primitives (A new primitive is being requested)

Comments

@Hector-hedb12 (Contributor):

Related to #121

@Hector-hedb12 (Contributor, Author):

I would like to work on this.

@csala (Contributor) commented Mar 26, 2019:

@Hector-hedb12 Would you mind adding here the link to the corresponding section in the keras docs and summarize what the architecture will be?

Just to keep track of what each issue covers.

@csala added the "new primitives" and "approved" labels on Mar 26, 2019
@Hector-hedb12 (Contributor, Author) commented Mar 26, 2019:

The architecture would be:

Dense (ReLU) layer --> Dropout layer --> Dense (ReLU) layer --> Dropout layer --> Dense (sigmoid) layer

You can find an example of this in the "MLP for binary classification" section of the Keras documentation:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Dropout

# Generate dummy data: 20 features per sample, binary labels
x_train = np.random.random((1000, 20))
y_train = np.random.randint(2, size=(1000, 1))
x_test = np.random.random((100, 20))
y_test = np.random.randint(2, size=(100, 1))

# Two Dense(ReLU) hidden layers, each followed by Dropout,
# and a single sigmoid output unit for binary classification
model = Sequential()
model.add(Dense(64, input_dim=20, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])

model.fit(x_train, y_train,
          epochs=20,
          batch_size=128)
score = model.evaluate(x_test, y_test, batch_size=128)
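For intuition, the forward pass of this architecture can be sketched in plain NumPy (a minimal sketch, not part of the proposed primitive: weights are random and illustrative, and the Dropout layers are omitted because they act as the identity at inference time):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative random weights matching Dense(64) -> Dense(64) -> Dense(1)
# for 20 input features; small scale keeps activations numerically tame.
W1 = rng.normal(scale=0.1, size=(20, 64)); b1 = np.zeros(64)
W2 = rng.normal(scale=0.1, size=(64, 64)); b2 = np.zeros(64)
W3 = rng.normal(scale=0.1, size=(64, 1));  b3 = np.zeros(1)

def forward(x):
    h1 = relu(x @ W1 + b1)          # Dense(64, activation='relu')
    h2 = relu(h1 @ W2 + b2)         # Dense(64, activation='relu')
    return sigmoid(h2 @ W3 + b3)    # Dense(1, activation='sigmoid')

x = rng.random((5, 20))
probs = forward(x)  # one probability of the positive class per sample
```

The sigmoid output lies in (0, 1), so hard 0/1 labels can be obtained by thresholding at 0.5, which is what Keras's accuracy metric does for this architecture.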

@csala csala added this to the 0.1.8 milestone Apr 1, 2019
@csala csala modified the milestones: 0.1.8, 0.1.9 Apr 25, 2019