I think I may have answered my own question, assuming I'm right that changing the num_epoch variable is what controls the number of training epochs.
# Number of training epochs (the tutorial's quick demo uses 1; in practice it can be set to 100 or more)
num_epoch = 75
# learning rate
learning_rate = 0.01
...
model = mx.model.FeedForward(
    ctx=mx.cpu(0),
    symbol=symbol,
    num_epoch=num_epoch,
    learning_rate=learning_rate,
    momentum=0,
    wd=0.0001,
    initializer=mx.init.Xavier(factor_type="in", magnitude=2.34))
Could anyone explain how the epoch setting is used in mx.model.FeedForward?
I've also had a look at the char_lstm tutorial and noticed a file named obama-0075.params included in the zip file download.
Could you explain how this file is generated, or point me in the right direction?
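For what it's worth, my guess is that files like obama-0075.params come from an epoch-end checkpoint callback that saves the parameters after every pass over the training data. A minimal sketch of what I think the old mx.model API does, assuming symbol and a training iterator train_iter are defined elsewhere (the prefix "obama" and the value 75 are just the tutorial's choices; this is only my understanding, not confirmed):

import mxnet as mx

# symbol and train_iter are assumed to be defined elsewhere
prefix = "obama"   # checkpoint file prefix, as used by the char_lstm tutorial
num_epoch = 75     # number of full passes over the training data

model = mx.model.FeedForward(
    ctx=mx.cpu(0),
    symbol=symbol,
    num_epoch=num_epoch,
    learning_rate=0.01,
    momentum=0,
    wd=0.0001,
    initializer=mx.init.Xavier(factor_type="in", magnitude=2.34))

# do_checkpoint(prefix) writes "obama-symbol.json" plus
# "obama-%04d.params" at the end of each epoch, so after the
# 75th pass you would get obama-0075.params.
model.fit(X=train_iter,
          epoch_end_callback=mx.callback.do_checkpoint(prefix))

If that's right, the saved checkpoint could then be reloaded later with mx.model.FeedForward.load(prefix, 75), but I'd appreciate confirmation.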