Commit
use https instead of http for web links
Hossein Pourbozorg committed Apr 25, 2019
1 parent 01ffa21 commit 7f06b15
Showing 6 changed files with 10 additions and 10 deletions.
README.md (4 changes: 2 additions & 2 deletions)

@@ -2,15 +2,15 @@
<img width="400px" src="https://raw.githubusercontent.com/FluxML/fluxml.github.io/master/logo.png"/>
</p>

- [![Build Status](https://travis-ci.org/FluxML/Flux.jl.svg?branch=master)](https://travis-ci.org/FluxML/Flux.jl) [![](https://img.shields.io/badge/docs-stable-blue.svg)](https://fluxml.github.io/Flux.jl/stable/) [![](https://img.shields.io/badge/chat-on%20slack-yellow.svg)](https://slackinvite.julialang.org/) [![DOI](http://joss.theoj.org/papers/10.21105/joss.00602/status.svg)](https://doi.org/10.21105/joss.00602)
+ [![Build Status](https://travis-ci.org/FluxML/Flux.jl.svg?branch=master)](https://travis-ci.org/FluxML/Flux.jl) [![](https://img.shields.io/badge/docs-stable-blue.svg)](https://fluxml.github.io/Flux.jl/stable/) [![](https://img.shields.io/badge/chat-on%20slack-yellow.svg)](https://slackinvite.julialang.org/) [![DOI](https://joss.theoj.org/papers/10.21105/joss.00602/status.svg)](https://doi.org/10.21105/joss.00602)

Flux is an elegant approach to machine learning. It's a 100% pure-Julia stack, and provides lightweight abstractions on top of Julia's native GPU and AD support. Flux makes the easy things easy while remaining fully hackable.

```julia
julia> Pkg.add("Flux")
```

- See the [documentation](http://fluxml.github.io/Flux.jl/) or the [model zoo](https://github.com/FluxML/model-zoo/) for examples.
+ See the [documentation](https://fluxml.github.io/Flux.jl/) or the [model zoo](https://github.com/FluxML/model-zoo/) for examples.

If you use Flux in research, please cite the following paper:

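A side note on the unchanged `Pkg.add("Flux")` snippet above: on Julia 1.0 and later the `Pkg` standard library must be loaded explicitly, so installation looks like this (a minimal sketch, assuming a stock Julia ≥ 1.0 session):

```julia
# Pkg is a standard library on Julia >= 1.0 and is not loaded by default.
using Pkg
Pkg.add("Flux")  # equivalently, type `] add Flux` in the REPL
```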
paper/paper.bib (4 changes: 2 additions & 2 deletions)

@@ -14,7 +14,7 @@ @article{besard:2017
journal = {arXiv},
volume = {abs/11712.03112},
year = {2017},
- url = {http://arxiv.org/abs/1712.03112},
+ url = {https://arxiv.org/abs/1712.03112},
}

@online{MLPL,
@@ -29,7 +29,7 @@ @online{CuArrays
author = {Mike Innes and others},
title = {Generic GPU Kernels},
year = 2017,
- url = {http://mikeinnes.github.io/2017/08/24/cudanative.html},
+ url = {https://mikeinnes.github.io/2017/08/24/cudanative.html},
urldate = {2018-02-16}
}

src/data/cmudict.jl (2 changes: 1 addition & 1 deletion)

@@ -19,7 +19,7 @@ function load()
@info "Downloading CMUDict dataset"
mkpath(deps("cmudict"))
for (x, hash) in suffixes_and_hashes
-    download_and_verify("$cache_prefix/http://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict/cmudict-$version$x",
+    download_and_verify("$cache_prefix/https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict/cmudict-$version$x",
deps("cmudict", "cmudict$x"), hash)
end
end
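For context, `download_and_verify` is Flux's own download helper and is untouched by this commit; the pattern it implements is a download followed by a checksum check, so only the URL scheme changes here. A minimal sketch of that pattern, assuming SHA.jl and a hex-encoded SHA-256 string (illustrative only, not Flux's actual implementation):

```julia
using SHA  # provides sha256

# Download `url` to `path`, then compare the file's SHA-256 digest
# against the expected hex string; remove the file on a mismatch.
function fetch_and_check(url, path, expected)
    download(url, path)
    actual = open(io -> bytes2hex(sha256(io)), path)
    if actual != lowercase(expected)
        rm(path)
        error("checksum mismatch for $url")
    end
end
```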
src/data/iris.jl (2 changes: 1 addition & 1 deletion)

@@ -26,7 +26,7 @@ function load()
isfile(deps("iris.data")) && return

@info "Downloading iris dataset."
-  download_and_verify("$(cache_prefix)http://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data",
+  download_and_verify("$(cache_prefix)https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data",
deps("iris.data"),
"6f608b71a7317216319b4d27b4d9bc84e6abd734eda7872b71a458569e2656c0")
end
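Callers of this file are unaffected; the dataset still loads through the same accessors, now fetched over https on first use. A hedged usage sketch, assuming the `features`/`labels` functions this module exposes in this version of Flux:

```julia
using Flux

# First call triggers load(), which now downloads iris.data over https.
X = Flux.Data.Iris.features()  # feature matrix, one column per sample
y = Flux.Data.Iris.labels()    # class label strings
```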
src/layers/recurrent.jl (4 changes: 2 additions & 2 deletions)

@@ -153,7 +153,7 @@ Base.show(io::IO, l::LSTMCell) =
Long Short Term Memory recurrent layer. Behaves like an RNN but generally
exhibits a longer memory span over sequences.
- See [this article](http://colah.github.io/posts/2015-08-Understanding-LSTMs/)
+ See [this article](https://colah.github.io/posts/2015-08-Understanding-LSTMs/)
for a good overview of the internals.
"""
LSTM(a...; ka...) = Recur(LSTMCell(a...; ka...))
@@ -194,7 +194,7 @@ Base.show(io::IO, l::GRUCell) =
Gated Recurrent Unit layer. Behaves like an RNN but generally
exhibits a longer memory span over sequences.
- See [this article](http://colah.github.io/posts/2015-08-Understanding-LSTMs/)
+ See [this article](https://colah.github.io/posts/2015-08-Understanding-LSTMs/)
for a good overview of the internals.
"""
GRU(a...; ka...) = Recur(GRUCell(a...; ka...))
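These docstrings belong to Flux's stateful recurrent wrappers, where `Recur` threads the hidden state between calls. A minimal usage sketch for this era of Flux (layer sizes chosen arbitrarily):

```julia
using Flux

m = LSTM(10, 5)  # maps 10-dimensional inputs to 5-dimensional outputs

# Feed a sequence one timestep at a time; hidden state carries over.
seq = [rand(10) for _ in 1:3]
outs = [m(x) for x in seq]

Flux.reset!(m)  # clear the hidden state before the next sequence
```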
src/optimise/optimisers.jl (4 changes: 2 additions & 2 deletions)

@@ -66,7 +66,7 @@ end
"""
RMSProp(η = 0.001, ρ = 0.9)
- [RMSProp](http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf)
+ [RMSProp](https://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf)
optimiser. Parameters other than learning rate don't need tuning. Often a good
choice for recurrent networks.
"""
@@ -155,7 +155,7 @@ end
"""
ADADelta(ρ = 0.9, ϵ = 1e-8)
- [ADADelta](http://arxiv.org/abs/1212.5701) optimiser. Parameters don't need
+ [ADADelta](https://arxiv.org/abs/1212.5701) optimiser. Parameters don't need
tuning.
"""
mutable struct ADADelta
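Both docstrings describe optimiser structs that plug into `Flux.train!`. A hedged end-to-end sketch under this version's API (model, loss, and data are illustrative placeholders):

```julia
using Flux

model = Dense(4, 1)
loss(x, y) = Flux.mse(model(x), y)
data = [(rand(4, 16), rand(1, 16))]  # one batch of 16 samples

opt = RMSProp(0.001, 0.9)  # or ADADelta() with its defaults
Flux.train!(loss, params(model), data, opt)
```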
