A Machine Learning project to predict user interactions with social network ads using demographic data to optimize ad targeting
Machine learning algorithms in Dart programming language
This repository contains a project that demonstrates how to perform sentiment analysis on Twitter data using Apache Spark, including data preprocessing, feature engineering, model training, and evaluation.
Gradient Descent is an iterative technique for fitting machine learning models with differentiable loss functions: it repeatedly computes the first-order derivative (gradient) of the loss and adjusts the parameters a small step in the direction that reduces it.
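As a minimal sketch of that loop (illustrative only, not taken from any repository listed here), gradient descent on a one-dimensional quadratic might look like:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)  # step opposite the first-order derivative
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=[0.0])
```

The learning rate `lr` trades off speed against stability: too large and the iterates diverge, too small and convergence is slow.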
Following and implementing (some of) the machine learning algorithms from scratch based on the Stanford CS229 course.
Recreated Hogwarts' Sorting Hat by implementing logistic regression from scratch.
Implementation and in-depth comparative analysis of two foundational machine learning optimization algorithms, Stochastic Gradient Descent (SGD) and Batch Gradient Descent (BGD).
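The SGD-vs-BGD contrast can be illustrated with a small sketch (assumed toy data and hyperparameters, not code from the repository above): BGD averages the gradient over the whole dataset per update, while SGD updates on one sample at a time.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 4.0 * X[:, 0] + 1.0 + rng.normal(scale=0.1, size=200)
Xb = np.c_[np.ones(len(X)), X]  # prepend a bias column

def batch_gd(Xb, y, lr=0.1, epochs=200):
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        grad = Xb.T @ (Xb @ w - y) / len(y)  # gradient over the full dataset
        w -= lr * grad
    return w

def sgd(Xb, y, lr=0.05, epochs=20, seed=1):
    rng = np.random.default_rng(seed)
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):    # one shuffled sample per update
            grad = (Xb[i] @ w - y[i]) * Xb[i]
            w -= lr * grad
    return w

w_bgd = batch_gd(Xb, y)  # smooth convergence, one update per epoch
w_sgd = sgd(Xb, y)       # noisy but cheap updates, many per epoch
```

Both should recover weights near the true `[1.0, 4.0]`; SGD hovers around the optimum with noise proportional to its (constant) learning rate.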
Linear Regression - Batch Gradient Descent
Gradient Descent (from scratch and with TensorFlow)
Gradient Descent with multiple methods: univariate and multivariate, momentum, batch gradient descent, ...
Developed a model that predicts air temperature from atmospheric pressure.
Python machine learning applications in image processing, recommender system, matrix completion, netflix problem and algorithm implementations including Co-clustering, Funk SVD, SVD++, Non-negative Matrix Factorization, Koren Neighborhood Model, Koren Integrated Model, Dawid-Skene, Platt-Burges, Expectation Maximization, Factor Analysis, ISTA, F…
This repository includes implementations of the basic optimization algorithms: batch, mini-batch, and stochastic gradient descent, plus NAG, Adagrad, RMSProp, and Adam.
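Of the adaptive methods named above, Adam is the most widely used; a single update step can be sketched as follows (a generic illustration of the standard update rule, not code from that repository):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient and its square."""
    m = b1 * m + (1 - b1) * grad       # first moment (momentum-like term)
    v = b2 * v + (1 - b2) * grad**2    # second moment (per-parameter scale)
    m_hat = m / (1 - b1**t)            # bias correction for zero initialization
    v_hat = v / (1 - b2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

The bias correction matters early on: at step `t = 1` with gradient 2.0, the raw averages are `m = 0.2` and `v = 0.004`, but the corrected estimates recover `m_hat = 2.0` and `v_hat = 4.0`.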
Numerical Optimization for Machine Learning & Data Science
Assignments from the AI course.
Softmax Regression from scratch. MNIST dataset
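A from-scratch softmax regression trained with batch gradient descent fits in a few lines; this sketch uses assumed toy 2-D blobs in place of MNIST and is not code from the repository above:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_softmax(X, y, n_classes, lr=0.5, epochs=300):
    """Softmax regression trained with batch gradient descent on cross-entropy."""
    n, d = X.shape
    W = np.zeros((d, n_classes))
    Y = np.eye(n_classes)[y]               # one-hot encode the labels
    for _ in range(epochs):
        P = softmax(X @ W)
        W -= lr * X.T @ (P - Y) / n        # gradient of mean cross-entropy loss
    return W

# Toy data: two well-separated 2-D blobs standing in for MNIST.
rng = np.random.default_rng(0)
X = np.r_[rng.normal(-2, 0.5, size=(50, 2)), rng.normal(2, 0.5, size=(50, 2))]
y = np.r_[np.zeros(50, int), np.ones(50, int)]
W = train_softmax(X, y, n_classes=2)
acc = (softmax(X @ W).argmax(axis=1) == y).mean()
```

The same gradient, `X.T @ (P - Y) / n`, extends unchanged to 10 classes for MNIST once the images are flattened into feature rows.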
An easy implementation of stochastic and batch gradient descent, with a comparison against the standard gradient descent method.
Analyzing and overcoming the curse of dimensionality and exploring various gradient descent techniques with implementations in R