This work applies the gradient descent techniques studied in the Neural Networks course of the master's degree in Informatics. It consists of minimizing the value of a function using two possible approaches. At the end, plots and GIF animations are exported to help visualize the method. A detailed description of the work is available in this repository, in "Trabalho de Redes Neurais.pdf".
To clone and run this application, you will need Git installed; alternatively, you can download the repository as a zip file. From your command line:
# Clone this repository
$ git clone https://github.com/VitordsAmorim/Redes-Neurais.git
# Go into the repository
$ cd Redes-Neurais
# Install dependencies
$ pip install -r requirements.txt
# Run the code
$ python3 main.py
The project was created with:
- Python version: 3.10.6
The gradient descent method is applied from each starting point, using the analytically calculated derivative. The analyzed function and its gradient are, respectively:
# Given the set of starting points:
initial_points = [[1, 1], [-0.5, -0.5], [0, 0], [0.3, -0.2], [0.7, 1], [1, -0.5]]
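The iteration described above can be sketched as follows. The actual function analyzed in the work (and its analytic gradient) is defined in "Trabalho de Redes Neurais.pdf" and is not reproduced here, so a simple two-variable function stands in for it; the descent loop itself is the same.

```python
import numpy as np

# Hypothetical stand-in for the function studied in the work;
# replace f and grad_f with the actual function and its analytic gradient.
def f(p):
    x, y = p
    return x**2 + y**2 + 0.3 * np.sin(3 * x)

def grad_f(p):
    # Gradient of the stand-in function, derived analytically.
    x, y = p
    return np.array([2 * x + 0.9 * np.cos(3 * x), 2 * y])

def gradient_descent(p0, lr=0.1, n_iters=200):
    """Plain gradient descent with a fixed learning rate."""
    p = np.array(p0, dtype=float)
    history = [f(p)]
    for _ in range(n_iters):
        p = p - lr * grad_f(p)   # step in the direction of steepest descent
        history.append(f(p))
    return p, history

initial_points = [[1, 1], [-0.5, -0.5], [0, 0], [0.3, -0.2], [0.7, 1], [1, -0.5]]
for p0 in initial_points:
    p_min, hist = gradient_descent(p0)
    print(p0, "->", np.round(p_min, 4), " f =", round(hist[-1], 4))
```

Depending on the starting point, the iterates may settle into different local minima, which is why the method offers no guarantee of finding the global minimum.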
The value of the function decreases at each iteration and converges to a minimum point. Thus the goal of finding the parameters that minimize the function was achieved, although there is no guarantee that the minimum found is global. As shown in the image below, the darker the region, the lower the value of the function.
Vitor Amorim | Wesley Pimentel |