Linear Transformations with PHP
In PHP, linear transformations can be implemented as a class, LinearTransformation, that provides a set of linear transformation operations.
Below are some examples of linear transformations implemented in PHP.
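The examples in this section all use a LinearTransformation class whose implementation is not shown here. A minimal sketch of such a class, assuming only the methods the examples call (transform, linearLayer, relu), might look like this:

```php
<?php
// Hypothetical sketch of the LinearTransformation class assumed by the
// examples in this section; the original implementation is not shown here.
class LinearTransformation
{
    private array $matrix;

    public function __construct(array $matrix)
    {
        $this->matrix = $matrix;
    }

    // Matrix-vector product: returns W * x.
    public function transform(array $vector): array
    {
        $result = [];
        foreach ($this->matrix as $row) {
            $sum = 0;
            foreach ($row as $j => $value) {
                $sum += $value * $vector[$j];
            }
            $result[] = $sum;
        }
        return $result;
    }

    // Linear layer: returns W * x + b.
    public function linearLayer(array $vector, array $bias): array
    {
        $product = $this->transform($vector);
        return array_map(fn($p, $b) => $p + $b, $product, $bias);
    }

    // Element-wise ReLU: replaces each negative entry with 0.
    public function relu(array $vector): array
    {
        return array_map(fn($x) => max(0, $x), $vector);
    }
}
```

Each method returns a plain PHP array, so the results can be printed directly with implode() as the examples below do.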
1. Application of Matrices in Transforming Data
Example: Scale Transformation
Here, the matrix scales the x-coordinate by 2 and the y-coordinate by 3.
// Example 1: 2x2 matrix (scale transformation)
$transformMatrix = [
[2, 0],
[0, 3]
];
$vector2D = [1, 2];
$transformation2D = new LinearTransformation($transformMatrix);
$result2D = $transformation2D->transform($vector2D);
echo "2D Transformation Result: [" . implode(", ", $result2D) . "]\n";
Output:
2D Transformation Result: [2, 6]
2. Linear Transformations in Neural Networks
In neural networks, linear transformations are represented as y = Wx + b. Here, W is the weight matrix, x is the input vector, and b is the bias vector.
Example: Simple Linear Layer
// Example usage:
$weightMatrix = [[2, -1], [1, 3]]; // Weight matrix W
$inputVector = [1, 2]; // Input vector x
$bias = [1, 0]; // Bias vector b
$linearTransform = new LinearTransformation($weightMatrix);
$result = $linearTransform->linearLayer($inputVector, $bias);
echo "Output after Linear Layer: [" . implode(", ", $result) . "]";
Output:
Output after Linear Layer: [1, 7]
3. Fully Connected Layers in Neural Networks
In a neural network, each layer applies a linear transformation followed by an activation function: a = f(Wx + b).
Here:
W is the weight matrix, representing the linear transformation.
x is the input vector.
b is the bias vector.
Example: Transformation in a Fully Connected Layer
// Example usage:
$weightMatrix = [[3, 2], [-1, 4]]; // Weight matrix W
$inputVector = [1, 2]; // Input vector x
$bias = [1, -2]; // Bias vector b
$linearTransform = new LinearTransformation($weightMatrix);
$result = $linearTransform->linearLayer($inputVector, $bias);
echo "Output after Fully Connected Layer: [" . implode(", ", $result) . "]";
Mathematical Representation:
y = Wx + b = [[3, 2], [-1, 4]] * [1, 2] + [1, -2]
Performing the calculation:
y = [3*1 + 2*2, -1*1 + 4*2] + [1, -2] = [7, 7] + [1, -2] = [8, 5]
Output:
Output after Fully Connected Layer: [8, 5]
4. Activation Functions and the Importance of Nonlinearities
Linear transformations alone cannot solve complex, nonlinear problems. Activation functions like ReLU or Sigmoid introduce nonlinearity to the network.
Example: ReLU Activation
The ReLU function is defined as: ReLU(x) = max(0, x).
PHP Code:
// Example usage:
$weightMatrix = [[-1, 2], [1, -2]]; // Weight matrix W
$inputVector = [5, 3]; // Input vector x
$bias = [-10, 2]; // Bias vector b
// Example usage with values that will produce both positive and negative results
$transform = new LinearTransformation($weightMatrix);
// Apply linear transformation with bias
$linearResult = $transform->linearLayer($inputVector, $bias);
// Apply ReLU activation
$activated = $transform->relu($linearResult);
echo "Original values: [" . implode(", ", $linearResult) . "]\n";
echo "ReLU Output: [" . implode(", ", $activated) . "]";
Output:
Original values: [-9, 1]
ReLU Output: [0, 1]
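The section above also mentions Sigmoid as a common activation. A sigmoid helper is not part of the LinearTransformation class used in these examples, but a standalone sketch in the same element-wise style as relu could look like this:

```php
<?php
// Hypothetical standalone sigmoid activation; not part of the
// LinearTransformation class used elsewhere in this section.
function sigmoid(array $vector): array
{
    // Element-wise logistic function: 1 / (1 + e^(-x)),
    // squashing each value into the range (0, 1).
    return array_map(fn($x) => 1 / (1 + exp(-$x)), $vector);
}

// Example usage on the linear-layer output from the ReLU example:
$activated = sigmoid([-9, 1]);
echo "Sigmoid Output: [" . implode(", ", $activated) . "]";
```

Unlike ReLU, sigmoid never outputs exactly zero; large negative inputs approach 0 and large positive inputs approach 1.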