Gradient Descent

Gradient Descent in Linear Regression

In any machine learning project, the main concern is how accurate the model is, that is, how far its predictions deviate from the actual data points. Based on this difference between predictions and actual values, we look for the model parameters that give the best accuracy on our dataset. To find these parameters, we apply gradient descent to the cost function of the machine learning model.

What is Gradient Descent

Gradient Descent is an iterative optimization algorithm that finds an optimum (minimum or maximum) of an objective function. It is one of the most widely used optimization techniques in machine learning for updating a model's parameters in order to minimize a cost function.

The main aim of gradient descent is to find the parameters of a model that give the best accuracy on both the training and testing datasets. The gradient is a vector that points in the direction of the steepest increase of the function at a specific point. Moving in the opposite direction of the gradient lets the algorithm descend towards lower values of the function until it eventually reaches a minimum.
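
Written compactly, one gradient descent iteration applies the following update to the parameters θ, where α is the learning rate and J(θ) the cost function (the linear-regression-specific version of this rule is derived later on this page):

$$\theta \leftarrow \theta - \alpha \, \nabla_\theta J(\theta)$$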

Steps Required in Gradient Descent Algorithm

  • Step 1: Initialize the parameters of the model randomly.

  • Step 2: Compute the gradient of the cost function with respect to each parameter. This means taking the partial derivative of the cost function with respect to each parameter.

  • Step 3: Update the parameters by stepping in the opposite direction of the gradient. The step size is controlled by a hyperparameter called the learning rate, denoted by alpha (a single numeric update is worked out just after this list).

  • Step 4: Repeat steps 2 and 3 iteratively until the best parameters for the defined model are found.
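
As a quick illustration of a single update (with arbitrarily chosen numbers): if a parameter currently has the value $w = 2$, the gradient of the cost with respect to $w$ is $0.5$, and the learning rate is $\alpha = 0.1$, then the updated value is $w \leftarrow 2 - 0.1 \times 0.5 = 1.95$.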

Pseudocode for Gradient Descent

t ← 0
max_iterations ← 1000
w, b ← initialize randomly

while t < max_iterations do
    t ← t + 1
    w ← w − η ∇_w J(w, b)
    b ← b − η ∇_b J(w, b)
end

Here max_iterations is the number of iterations we want to run while updating the parameters.

w and b are the weight and bias parameters.

η is the learning rate, also denoted by alpha.

To apply gradient descent to data in any programming language, we need four functions with which we can update the parameters and use them to make predictions. We will look at each function one by one (a PHP version of all four is given in the PHP Implementation section at the end of this page):

  1. gradient_descent – makes predictions on the dataset, computes the difference between the predicted and the actual target values, updates the parameters accordingly, and returns the updated parameters.

  2. compute_predictions – computes the predictions using the current parameters at each iteration.

  3. compute_gradient – computes the error, which is the difference between the actual and predicted target values, and then computes the gradient from this error and the training data.

  4. update_parameters – updates the parameters using the learning rate and the gradient obtained from compute_gradient.

function gradient_descent(X, y, learning_rate, num_iterations):
    initialize parameters θ
    for iter in range(num_iterations):
        predictions = compute_predictions(X, θ)
        gradient = compute_gradient(X, y, predictions)
        θ = update_parameters(θ, gradient, learning_rate)
    return θ

function compute_predictions(X, θ):
    return X * θ

function compute_gradient(X, y, predictions):
    error = predictions - y
    gradient = Xᵀ * error / m        # m = number of training examples
    return gradient

function update_parameters(θ, gradient, learning_rate):
    return θ - learning_rate * gradient

Mathematics Behind Gradient Descent

In a machine learning regression problem, the model aims to find the best-fit regression line that predicts the value y for a given input value x. During training, the model calculates a cost function, here the (halved) mean squared error between the predicted values and the true values y, and tries to minimize it. To minimize this cost function (for a univariate linear regression problem), the model needs the best values of θ0 and θ1. Initially the model selects θ0 and θ1 randomly and then iteratively updates these values to reduce the cost function until it reaches the minimum. Once the minimum cost is reached, the model has the best θ0 and θ1 values, and substituting them into the hypothesis equation of the line lets the model predict the output value y.
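
For the univariate case discussed here, the hypothesis is simply the equation of a straight line with intercept $\theta_0$ and slope $\theta_1$:

$$h_\theta(x) = \theta_0 + \theta_1 x$$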

How do the θ0 and θ1 values get updated?

Linear Regression Cost Function:

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

So our model's aim is to minimize $\frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$ and store the parameters that make it minimal.

Gradient Descent Algorithm For Linear Regression

Cost Function

$$J(\Theta_0, \Theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left[ h_\Theta(x_i) - y_i \right]^2$$

• True value: $y_i$

• Predicted value: $h_\Theta(x_i)$

Gradient Descent

$$\Theta_j = \Theta_j - \alpha \frac{\partial}{\partial \Theta_j} J(\Theta_0, \Theta_1)$$

• Learning rate: $\alpha$

Now:

$$\frac{\partial}{\partial \Theta_j} J(\Theta_0, \Theta_1) = \frac{\partial}{\partial \Theta_j} \frac{1}{2m} \sum_{i=1}^{m} \left[ h_\Theta(x_i) - y_i \right]^2$$

$$= \frac{1}{m} \sum_{i=1}^{m} \left( h_\Theta(x_i) - y_i \right) \frac{\partial}{\partial \Theta_j} \left( \Theta x_i - y_i \right)$$

$$= \frac{1}{m} \sum_{i=1}^{m} \left( h_\Theta(x_i) - y_i \right) x_i$$

Therefore:

$$\Theta_j := \Theta_j - \frac{\alpha}{m} \sum_{i=1}^{m} \left( h_\Theta(x_i) - y_i \right) x_i$$
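
Writing this out for the two parameters of the univariate hypothesis $h_\Theta(x_i) = \Theta_0 + \Theta_1 x_i$ (for the intercept $\Theta_0$ the partial derivative of the hypothesis is 1, so the $x_i$ factor disappears):

$$\Theta_0 := \Theta_0 - \frac{\alpha}{m} \sum_{i=1}^{m} \left( h_\Theta(x_i) - y_i \right)$$

$$\Theta_1 := \Theta_1 - \frac{\alpha}{m} \sum_{i=1}^{m} \left( h_\Theta(x_i) - y_i \right) x_i$$

Both parameters are updated simultaneously at every iteration.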

Gradient descent algorithm for linear regression

-> θj      : weight (parameter) j of the hypothesis
-> hθ(xi)  : predicted y value for the i-th input
-> j       : feature index (can be 0, 1, 2, ..., n)
-> α       : learning rate of gradient descent

How Does Gradient Descent Work

Gradient descent works by moving downward toward the pits, or valleys, of the cost function's graph until it finds the minimum value. It achieves this by taking the derivative of the cost function: during each iteration it steps in the direction of steepest descent, and by adjusting the parameters in this direction it approaches the minimum of the cost function and the best-fit values for the parameters. The size of each step is determined by the parameter α, known as the learning rate. From the gradient descent update rule, one can infer two points:

  • If the slope is positive: θj = θj − (positive value). Hence the value of θj decreases.

If slope is +ve in Gradient Descent

  • If the slope is negative: θj = θj − (negative value). Hence the value of θj increases.


If slope is -ve in Gradient Descent

How To Choose Learning Rate

Choosing the right learning rate is very important, as it ensures that gradient descent converges in a reasonable amount of time (a simple convergence check is sketched after the two cases below):

  • If we choose α to be very large, gradient descent can overshoot the minimum. It may fail to converge, or even diverge.

Effect of large alpha value on Gradient Descent

  • If we choose α to be very small, gradient descent takes very small steps towards the minimum and therefore takes much longer to reach it.

Effect of small alpha value on Gradient Descent
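
A simple practical check, sketched below in plain PHP under the assumption that the cost J(θ) is recorded after every iteration (the function name and signature are illustrative, not part of any library), is to treat a cost that rises between iterations as a sign that α is too large. The implementation in the next section returns exactly such a cost history.

<?php

// Illustrative sketch: given the cost J(θ) recorded after each gradient
// descent iteration, a cost that rises between iterations usually means
// the learning rate α is too large and the algorithm is overshooting.
function learningRateLooksTooLarge(array $costHistory): bool
{
    for ($i = 1, $n = count($costHistory); $i < $n; $i++) {
        if ($costHistory[$i] > $costHistory[$i - 1]) {
            return true; // cost increased: likely overshooting the minimum
        }
    }
    return false;
}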

PHP Implementation of Gradient Descent
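
Below is a minimal, self-contained PHP sketch of gradient descent for univariate linear regression. It mirrors the four pseudocode functions described earlier (here in camelCase), uses plain PHP arrays, and assumes that each row of $X carries a leading 1 so that $theta[0] acts as the intercept θ0 and $theta[1] as the slope θ1. It is meant as an illustration of the update rule derived above, not a production-ready implementation.

<?php

// Minimal gradient descent for univariate linear regression using plain PHP
// arrays. Each row of $X is [1, x] so that $theta[0] is the intercept (θ0)
// and $theta[1] is the slope (θ1).

// Predictions: h_θ(x) = X · θ for every training example.
function computePredictions(array $X, array $theta): array
{
    $predictions = [];
    foreach ($X as $row) {
        $sum = 0.0;
        foreach ($row as $j => $feature) {
            $sum += $feature * $theta[$j];
        }
        $predictions[] = $sum;
    }
    return $predictions;
}

// Gradient of the cost: (1/m) · Xᵀ · (predictions − y).
function computeGradient(array $X, array $y, array $predictions): array
{
    $m = count($X);
    $n = count($X[0]);
    $gradient = array_fill(0, $n, 0.0);
    for ($i = 0; $i < $m; $i++) {
        $error = $predictions[$i] - $y[$i];
        for ($j = 0; $j < $n; $j++) {
            $gradient[$j] += $error * $X[$i][$j] / $m;
        }
    }
    return $gradient;
}

// One parameter update: θ = θ − α · gradient.
function updateParameters(array $theta, array $gradient, float $learningRate): array
{
    foreach ($theta as $j => $value) {
        $theta[$j] = $value - $learningRate * $gradient[$j];
    }
    return $theta;
}

// Half mean squared error cost J(θ), useful for monitoring convergence.
function computeCost(array $X, array $y, array $theta): float
{
    $m = count($X);
    $sum = 0.0;
    foreach (computePredictions($X, $theta) as $i => $prediction) {
        $sum += ($prediction - $y[$i]) ** 2;
    }
    return $sum / (2 * $m);
}

// Main loop: repeat predict → gradient → update for a fixed number of
// iterations. Returns the learned parameters and the cost history.
function gradientDescent(array $X, array $y, float $learningRate, int $numIterations): array
{
    $theta = array_fill(0, count($X[0]), 0.0); // start from zeros for reproducibility
    $costHistory = [];
    for ($iter = 0; $iter < $numIterations; $iter++) {
        $predictions = computePredictions($X, $theta);
        $gradient = computeGradient($X, $y, $predictions);
        $theta = updateParameters($theta, $gradient, $learningRate);
        $costHistory[] = computeCost($X, $y, $theta);
    }
    return [$theta, $costHistory];
}

// Usage example with a tiny synthetic dataset that follows y = 1 + 2x.
$X = [[1, 1], [1, 2], [1, 3], [1, 4], [1, 5]];
$y = [3, 5, 7, 9, 11];

[$theta, $costHistory] = gradientDescent($X, $y, 0.05, 2000);

printf("θ0 = %.4f, θ1 = %.4f\n", $theta[0], $theta[1]); // expected to approach 1 and 2
printf("final cost = %.6f\n", end($costHistory));

With this sample data the learned parameters approach θ0 ≈ 1 and θ1 ≈ 2, and the cost shrinks towards zero. Re-running the script with a much larger learning rate, such as 0.5, makes the cost grow from iteration to iteration instead of shrinking, which is the overshooting behaviour described in the previous section.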

