
Conversation

@iblislin (Member)

Make its optimizer configuration the same as the Python example's

fix #369


@rickhg12hs Could you check it out?

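For context, here is a minimal sketch of how an optimizer is configured and handed to mx.fit in MXNet.jl. The toy network, the random data, and the SGD hyperparameters are illustrative assumptions, not the values this PR (or the Python lenet-stn example) actually uses, and keyword names may differ between MXNet.jl versions:

using MXNet

# Toy network and random data, only to make the sketch self-contained.
data = mx.Variable(:data)
fc   = mx.FullyConnected(data, num_hidden=10)
net  = mx.SoftmaxOutput(fc, name=:softmax)

x = rand(Float32, 20, 100)      # 20 features, 100 samples
y = Float32.(rand(0:9, 100))    # 10 classes
train_provider = mx.ArrayDataProvider(:data => x, :softmax_label => y, batch_size=10)

model     = mx.FeedForward(net, context=mx.cpu())
# Hypothetical hyperparameters; the point is only where the optimizer is built.
optimizer = mx.SGD(lr=0.1, momentum=0.9, weight_decay=0.00001)
mx.fit(model, optimizer, train_provider, n_epoch=2)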
@codecov-io commented Dec 11, 2017

Codecov Report

Merging #371 into master will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##           master     #371   +/-   ##
=======================================
  Coverage   69.74%   69.74%           
=======================================
  Files          25       25           
  Lines        1963     1963           
=======================================
  Hits         1369     1369           
  Misses        594      594

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 27c66ec...8e99fa9.

Before:
mx.fit(model, optimizer, train_provider, n_epoch=20, eval_data=eval_provider)

After:
mx.fit(model, optimizer, train_provider,
       n_epoch=20,
       eval_data=eval_provider


This line needs a comma (",") at the end, yes?
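Presumably meaning something like the following; this is a sketch only, since whatever follows eval_data in the actual file falls outside the quoted hunk:

mx.fit(model, optimizer, train_provider,
       n_epoch=20,
       eval_data=eval_provider,
       )   # the added comma lets further keywords, or just the closing paren, follow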

@iblislin (Member, Author)

ha, 👍

Got this on my machine:
INFO: == Epoch 020/020 ==========
INFO: ## Training summary
INFO:           accuracy = 0.9965
INFO:               time = 5.2912 seconds
INFO: ## Validation summary
INFO:           accuracy = 0.9917
INFO: Finish training on MXNet.mx.Context[GPU0]
@iblislin (Member, Author)

Do you have a GPU to try it out?
Maybe we can add a note to the example first.
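For instance, the note could spell out the context choice explicitly. A rough sketch, where lenet stands for the network symbol the example already builds and the keyword usage is assumed to match the example's FeedForward call:

# Hypothetical note for the example: mx.gpu() requires a CUDA-enabled
# build of libmxnet; fall back to the CPU context otherwise.
model = mx.FeedForward(lenet, context=mx.gpu())    # GPU
# model = mx.FeedForward(lenet, context=mx.cpu())  # CPU fallback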

@rickhg12hs

I have no GPU, sorry.



Development

Successfully merging this pull request may close these issues.

Segfault running MNIST example lenet-stn.jl

4 participants