
CapsNet-Tensorflow


A Tensorflow implementation of CapsNet based on Geoffrey Hinton's paper Dynamic Routing Between Capsules

[Figure: comparison between a capsule and a traditional neuron]

Status:

  1. The MNIST version of the capsule network is finished. There are now two versions: a) tag v0.1, which does not work well due to Issue #8; b) the current version, with a test accuracy of 99.57%; see details in the Results section

Daily tasks

  1. multi-GPU support
  2. Improving the reusability of capsLayer.py, so that all you need is import capsLayer.fully_connected or import capsLayer.conv2d in your code (see the sketch below)
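
For reference, here is a hypothetical sketch of how that API is intended to look once the refactoring is done; the function names follow the plan above, but the module layout and signatures are my assumptions, not the current code:

# Hypothetical usage of a refactored capsLayer.py; names and
# signatures are assumptions based on the stated goal above.
import tensorflow as tf
from capsLayer import conv2d, fully_connected  # assumed module layout

# A [batch, 20, 20, 256] feature map from an ordinary conv layer.
inputs = tf.placeholder(tf.float32, [None, 20, 20, 256])
primary_caps = conv2d(inputs, filters=32, vec_len=8,
                      kernel_size=9, strides=2)    # PrimaryCaps layer
digit_caps = fully_connected(primary_caps, num_outputs=10,
                             vec_len=16)           # DigitCaps layer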

Others

  1. Here (on Zhihu, in Chinese) is an answer explaining my understanding of Section 4 of the paper (the core part of CapsNet). It may be helpful for understanding the code.
  2. If you find any problems, please let me know. I will try my best to 'kill' them ASAP.

Requirements

  • Python
  • NumPy
  • Tensorflow (I'm using 1.3.0; not yet tested with older versions)
  • tqdm (for displaying training progress info)
  • scipy (for saving images)

Usage

Step 1. Clone this repository with git.

$ git clone https://github.com/naturomics/CapsNet-Tensorflow.git
$ cd CapsNet-Tensorflow

Step 2. Download the MNIST dataset and extract it into the data/mnist directory. (Be careful: if backslashes appear around the curly braces when you copy the wget command into your terminal, remove them.)

$ mkdir -p data/mnist
$ wget -c -P data/mnist http://yann.lecun.com/exdb/mnist/{train-images-idx3-ubyte.gz,train-labels-idx1-ubyte.gz,t10k-images-idx3-ubyte.gz,t10k-labels-idx1-ubyte.gz}
$ gunzip data/mnist/*.gz
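
To verify the download, you can parse one of the extracted idx files directly; this is a minimal sanity check, assuming the default file names above:

# Sanity check: the idx3 format is a 16-byte big-endian header
# followed by raw uint8 pixels.
import numpy as np

with open('data/mnist/train-images-idx3-ubyte', 'rb') as f:
    data = np.frombuffer(f.read(), dtype=np.uint8)
images = data[16:].reshape(-1, 28, 28)  # skip the 16-byte header
print(images.shape)                     # expected: (60000, 28, 28)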

Step 3. Start the training:

$ pip install tqdm  # install it if you haven't already
$ python main.py

The default batch size is 128 and the number of epochs is 50. You may need to modify config.py or use command-line parameters to suit your case. In my case, I run python main.py --test_sum_freq=200 --batch_size=48 on my 4 GB GPU (~10 min/epoch).
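
For reference, a minimal sketch of how such flags are typically defined with tf.app.flags in TensorFlow 1.x; the flag names mirror the command above, but the actual config.py may differ:

# Assumed shape of config.py; flag names match the command above,
# default values are illustrative.
import tensorflow as tf

flags = tf.app.flags
flags.DEFINE_integer('batch_size', 128, 'batch size')
flags.DEFINE_integer('epoch', 50, 'number of training epochs')
flags.DEFINE_integer('test_sum_freq', 500, 'frequency of test summaries')
cfg = flags.FLAGS  # main.py reads hyperparameters from cfg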

Results

  • training loss

[Plots: total_loss, margin_loss, reconstruction_loss]
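
For context, the total loss plotted here is, as in the paper, the margin loss plus a down-weighted reconstruction loss. The margin loss for digit class k is

L_k = T_k * max(0, m+ - ||v_k||)^2 + lambda * (1 - T_k) * max(0, ||v_k|| - m-)^2

with m+ = 0.9, m- = 0.1, lambda = 0.5, and T_k = 1 iff digit k is present; the reconstruction loss is a sum of squared pixel differences, scaled by 0.0005 in the paper.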

  • test accuracy (the best result is 99.57%)

[Plot: test accuracy]

My simple comments on capsules

  1. A new kind of neural unit (vector in, vector out, rather than scalar in, scalar out)
  2. The routing algorithm is similar to an attention mechanism (see the sketch after this list)
  3. In any case, work with great potential, and a lot to be built upon
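
As a concrete illustration of points 1 and 2, here is a condensed NumPy sketch of the paper's squash nonlinearity and routing-by-agreement (Procedure 1 in the paper); the repo's capsLayer.py implements the TensorFlow version, so shapes and names here are simplified:

import numpy as np

def squash(s, eps=1e-9):
    # v = (|s|^2 / (1 + |s|^2)) * (s / |s|): short vectors shrink toward 0,
    # long vectors saturate just below unit length.
    sq_norm = np.sum(s ** 2, axis=-1, keepdims=True)
    return (sq_norm / (1. + sq_norm)) * s / np.sqrt(sq_norm + eps)

def routing(u_hat, num_iters=3):
    # u_hat: [num_in, num_out, vec_len] prediction vectors u_hat_{j|i}.
    b = np.zeros(u_hat.shape[:2])                  # routing logits b_ij
    for _ in range(num_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # softmax over j
        s = (c[..., None] * u_hat).sum(axis=0)     # weighted sum s_j
        v = squash(s)                              # parent capsule outputs v_j
        b = b + (u_hat * v[None]).sum(axis=-1)     # agreement u_hat . v
    return v

Like attention, the coupling coefficients c_ij are a softmax over dot-product agreements, but here they are refined iteratively instead of being computed in one shot.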

TODO:

  • Finish the MNIST version of CapsNet (progress: 90%)

  • Do some different experiments with CapsNet:

    • Try using other datasets
    • Adjust the model structure
  • There is another new paper about capsules (submitted to ICLR 2018), a follow-up to the CapsNet paper.

My WeChat:

[Image: WeChat QR code]

Reference

  • Sara Sabour, Nicholas Frosst, Geoffrey E. Hinton. Dynamic Routing Between Capsules. NIPS 2017.
