Tensor Rank

1. Tensor Rank (Order)

Formal Definition:

The rank (or order) of a tensor is the number of independent directional components required to specify the tensor completely. Alternatively, it's the number of indices needed in the component representation.

2. Classification by Rank

Rank 0 (Scalars)

  • Definition: Single numbers, no directional components

  • Mathematical Form: $T \in \mathbb{R}$

  • Examples:

    • Temperature: T = 298K

    • Mass: m = 5kg

    • Pressure: P = 101.3 kPa

    • Energy: E = 10J

Rank 1 (Vectors)

  • Definition: One directional component

  • Mathematical Form: $v \in \mathbb{R}^n$

  • Examples:

    • Position vector: $r = [x, y, z]$

    • Velocity: $v = [v_x, v_y, v_z]$

    • Force: $F = [F_x, F_y, F_z]$

    • Electric field: $E = [E_x, E_y, E_z]$

Rank 2 (Matrices)

  • Definition: Two directional components

  • Mathematical Form: $M \in \mathbb{R}^{n \times m}$

  • Examples:

    • Stress tensor: $\sigma = \begin{bmatrix} \sigma_{xx} & \sigma_{xy} & \sigma_{xz} \\ \sigma_{yx} & \sigma_{yy} & \sigma_{yz} \\ \sigma_{zx} & \sigma_{zy} & \sigma_{zz} \end{bmatrix}$

    • Inertia tensor: $I = \begin{bmatrix} I_{xx} & I_{xy} & I_{xz} \\ I_{yx} & I_{yy} & I_{yz} \\ I_{zx} & I_{zy} & I_{zz} \end{bmatrix}$

    • Metric tensor: $g = \begin{bmatrix} g_{tt} & g_{tx} & g_{ty} & g_{tz} \\ g_{xt} & g_{xx} & g_{xy} & g_{xz} \\ g_{yt} & g_{yx} & g_{yy} & g_{yz} \\ g_{zt} & g_{zx} & g_{zy} & g_{zz} \end{bmatrix}$

Rank 3

  • Definition: Three directional components

  • Mathematical Form: $T \in \mathbb{R}^{n \times m \times k}$

  • Examples:

    • Piezoelectric tensor (the $i = 1$ slice shown): $d_{ijk} = \begin{bmatrix} d_{111} & d_{112} & d_{113} \\ d_{121} & d_{122} & d_{123} \\ d_{131} & d_{132} & d_{133} \end{bmatrix}$

    • Levi-Civita symbol: $\epsilon_{ijk} = \begin{cases} +1, & \text{if } (i,j,k) \text{ is } (1,2,3), (2,3,1), \text{ or } (3,1,2) \\ -1, & \text{if } (i,j,k) \text{ is } (3,2,1), (1,3,2), \text{ or } (2,1,3) \\ 0, & \text{if any index is repeated} \end{cases}$
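
To make this classification concrete in PHP, here is a minimal sketch using plain nested arrays (no tensor library assumed; `tensorRank` is an illustrative helper name). It treats the rank of a tensor as the nesting depth of the array that stores it.

```php
<?php

// Infer the rank of a tensor stored as a (non-empty) nested PHP array:
// the rank equals the nesting depth of the array.
function tensorRank($tensor): int
{
    $rank = 0;
    while (is_array($tensor)) {
        $rank++;
        $tensor = $tensor[array_key_first($tensor)];
    }
    return $rank;
}

$scalar = 298.0;                                  // rank 0
$vector = [1.0, 2.0, 3.0];                        // rank 1
$matrix = [[1, 2], [3, 4]];                       // rank 2
$rank3  = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]];   // rank 3

echo tensorRank($scalar); // 0
echo tensorRank($vector); // 1
echo tensorRank($matrix); // 2
echo tensorRank($rank3);  // 3
```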

3. Mathematical Operations by Rank

3.1 Rank Addition

Rank addition for tensors refers to additive operations that combine tensors of different ranks, or that introduce a new axis; depending on the technique, the result keeps the higher of the two ranks or gains one. In this context, rank (or order) refers to the number of dimensions (axes) of the tensor.

When adding tensors of different ranks, one of two main techniques is typically used:

1. Broadcasting:

  • Broadcasting is a technique where lower-rank tensors are expanded to match the dimensions of higher-rank tensors during an operation like addition.

  • For example, if we have a rank-2 tensor (a matrix) and add it to a rank-1 tensor (a vector), the rank-1 tensor can be “broadcast” to match the rank-2 shape.

  • Suppose a rank-2 tensor has a shape of 3x3, and a rank-1 tensor has a shape of 3. Broadcasting replicates the rank-1 tensor across each row or column, allowing element-wise addition to occur.

2. Tensor Stacking:

  • Stacking is a method where tensors of the same shape are combined along a new axis, increasing the rank by one.

  • For instance, stacking two matrices (rank-2 tensors) of shape 3x3 results in a rank-3 tensor of shape 2x3x3. This is useful in cases where we want to preserve each tensor independently within a higher-rank structure, rather than perform element-wise addition.
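
As a quick PHP illustration of stacking (a minimal sketch with plain nested arrays; `stackTensors` is a hypothetical helper, not a library function), wrapping two 3x3 matrices in a new outer array adds the new leading axis:

```php
<?php

// Stack rank-2 tensors of identical shape along a new leading axis,
// producing a rank-3 tensor. Two 3x3 inputs give shape 2x3x3.
function stackTensors(array ...$tensors): array
{
    // Each input becomes one slice along the new leading axis.
    return $tensors;
}

$a = [[1, 2, 3], [4, 5, 6], [7, 8, 9]];
$b = [[10, 11, 12], [13, 14, 15], [16, 17, 18]];

$stacked = stackTensors($a, $b);   // shape 2x3x3
echo count($stacked);              // 2  (new leading dimension)
echo count($stacked[0]);           // 3
echo count($stacked[0][0]);        // 3
```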

Key Points

  • Broadcasting efficiently aligns shapes, enabling element-wise operations between tensors of different ranks without needing explicit expansion of the lower-rank tensor.

  • Stacking maintains individual tensor elements but increases rank, as it adds a new dimension.

Tensors of the same rank (and shape) can be added element-wise:

  • Vectors (Rank 1): $\begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix} + \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix} = \begin{bmatrix} a_1 + b_1 \\ a_2 + b_2 \\ a_3 + b_3 \end{bmatrix}$

  • Matrices (Rank 2): $\begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix} + \begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix} = \begin{bmatrix} A_{11} + B_{11} & A_{12} + B_{12} \\ A_{21} + B_{21} & A_{22} + B_{22} \end{bmatrix}$

Rank Addition with Broadcasting

Consider a rank-1 tensor (vector) $v = [1, 2, 3]$ and a rank-2 tensor (matrix) $M = \begin{bmatrix} 4 & 5 & 6 \\ 7 & 8 & 9 \\ 10 & 11 & 12 \end{bmatrix}$.

Using broadcasting, $v$ can be added to each row of $M$ as follows:

$M + v = \begin{bmatrix} 4+1 & 5+2 & 6+3 \\ 7+1 & 8+2 & 9+3 \\ 10+1 & 11+2 & 12+3 \end{bmatrix} = \begin{bmatrix} 5 & 7 & 9 \\ 8 & 10 & 12 \\ 11 & 13 & 15 \end{bmatrix}$
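
The same broadcast addition can be sketched in plain PHP (no tensor library assumed; `broadcastAdd` is an illustrative helper):

```php
<?php

// Broadcast a rank-1 tensor over the rows of a rank-2 tensor and
// add element-wise. The vector length must match the row length.
function broadcastAdd(array $matrix, array $vector): array
{
    return array_map(
        fn(array $row): array => array_map(
            fn($m, $v) => $m + $v,   // element-wise addition within each row
            $row,
            $vector
        ),
        $matrix
    );
}

$M = [[4, 5, 6], [7, 8, 9], [10, 11, 12]];
$v = [1, 2, 3];

print_r(broadcastAdd($M, $v));
// [[5, 7, 9], [8, 10, 12], [11, 13, 15]]
```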

3.2 Rank Multiplication

  1. Vector × Vector → Scalar (rank reduction): $a \cdot b = \sum_i a_i b_i$

  2. Vector × Vector → Matrix (rank increase): $(a \otimes b)_{ij} = a_i b_j$

  3. Matrix × Vector → Vector: $(Av)_i = \sum_j A_{ij} v_j$
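
A minimal plain-PHP sketch of these three cases (illustrative helper names, no library assumed):

```php
<?php

// Vector · Vector -> Scalar (rank reduction): a·b = Σᵢ aᵢbᵢ
function dotProduct(array $a, array $b): float
{
    return array_sum(array_map(fn($x, $y) => $x * $y, $a, $b));
}

// Vector ⊗ Vector -> Matrix (rank increase): (a⊗b)ᵢⱼ = aᵢbⱼ
function outerProduct(array $a, array $b): array
{
    return array_map(
        fn($ai) => array_map(fn($bj) => $ai * $bj, $b),
        $a
    );
}

// Matrix × Vector -> Vector: (Av)ᵢ = Σⱼ Aᵢⱼvⱼ
function matrixVector(array $A, array $v): array
{
    return array_map(fn(array $row) => dotProduct($row, $v), $A);
}

echo dotProduct([1, 2, 3], [4, 5, 6]);           // 32
print_r(outerProduct([1, 2, 3], [4, 5]));        // [[4, 5], [8, 10], [12, 15]]
print_r(matrixVector([[1, 2], [3, 4]], [5, 6])); // [17, 39]
```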

4. Rank Change Operations

Rank multiplication for tensors involves performing multiplication operations between tensors of varying ranks, often resulting in a tensor of a different rank. The type of multiplication depends on the application and includes element-wise multiplication, outer products, tensor contractions, and matrix multiplication.

1. Element-wise Multiplication:

  • Also known as the Hadamard product, element-wise multiplication occurs when tensors of the same shape (and thus the same rank) are multiplied. Each element in one tensor is multiplied by the corresponding element in the other tensor.

  • Example: For two rank-2 tensors (matrices) $A$ and $B$ of shape 2x2, $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \quad B = \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}$, the element-wise multiplication gives $A \circ B = \begin{bmatrix} 1 \cdot 5 & 2 \cdot 6 \\ 3 \cdot 7 & 4 \cdot 8 \end{bmatrix} = \begin{bmatrix} 5 & 12 \\ 21 & 32 \end{bmatrix}$ (see the PHP sketch after this list).

2. Outer Product:

  • The outer product of two tensors of ranks $m$ and $n$ produces a tensor of rank $m + n$. This is often used to create higher-dimensional tensors by combining lower-dimensional ones.

  • Example: For a rank-1 tensor $a = [1, 2, 3]$ and another rank-1 tensor $b = [4, 5]$, $a \otimes b = \begin{bmatrix} 1 \cdot 4 & 1 \cdot 5 \\ 2 \cdot 4 & 2 \cdot 5 \\ 3 \cdot 4 & 3 \cdot 5 \end{bmatrix} = \begin{bmatrix} 4 & 5 \\ 8 & 10 \\ 12 & 15 \end{bmatrix}$

  • Here, the result is a rank-2 tensor (matrix) with shape 3x2.

3. Tensor Contraction (Generalized Inner Product):

  • Tensor contraction is similar to matrix multiplication but extends to tensors of higher ranks. It involves summing over specific indices in a pair of tensors, effectively reducing the total rank.

  • For example, contracting a rank-3 tensor A of shape 2x3x4 with a rank-2 tensor B of shape 4x5 over the last index of A and the first index of B results in a rank-3 tensor of shape 2x3x5.

4. Matrix Multiplication:

  • Matrix multiplication is a specific type of rank multiplication between two rank-2 tensors (matrices). In this operation, the number of columns in the first matrix must equal the number of rows in the second. The result is a new rank-2 tensor (matrix).

  • Example: For matrices $C$ and $D$ where $C$ is 2x3 and $D$ is 3x2, $C \cdot D$ is a 2x2 matrix.
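
The element-wise and matrix products above can be sketched in plain PHP as follows (`hadamard` and `matMul` are illustrative helpers, no library assumed); matrix multiplication here also serves as the simplest example of a contraction, since it sums over the shared index:

```php
<?php

// Hadamard product: tensors of identical shape, multiplied element by element.
function hadamard(array $A, array $B): array
{
    return array_map(
        fn(array $rowA, array $rowB): array =>
            array_map(fn($a, $b) => $a * $b, $rowA, $rowB),
        $A,
        $B
    );
}

// Matrix multiplication: columns of $A must match rows of $B;
// each output entry sums over the shared index k.
function matMul(array $A, array $B): array
{
    $result = [];
    foreach ($A as $i => $rowA) {
        foreach (array_keys($B[0]) as $j) {
            $sum = 0;
            foreach ($rowA as $k => $aik) {
                $sum += $aik * $B[$k][$j];
            }
            $result[$i][$j] = $sum;
        }
    }
    return $result;
}

print_r(hadamard([[1, 2], [3, 4]], [[5, 6], [7, 8]]));
// [[5, 12], [21, 32]]

print_r(matMul([[1, 2, 3], [4, 5, 6]], [[1, 2], [3, 4], [5, 6]]));
// 2x3 · 3x2 => 2x2: [[22, 28], [49, 64]]
```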

Key Points

  • Rank addition and rank multiplication affect dimensionality differently: rank addition typically preserves the rank, while rank multiplication can increase or decrease it.

  • These operations are integral in fields like deep learning, where tensors of varying ranks are manipulated in complex models like neural networks.

5. Properties Based on Rank

5.1 Component Count

In 3D space:

  • Rank 0: 1 component

  • Rank 1: 3 components

  • Rank 2: 9 components

  • Rank 3: 27 components

  • Rank 4: 81 components
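
These counts follow the pattern $3^{\text{rank}}$ in three-dimensional space, which this short PHP check illustrates (a minimal sketch):

```php
<?php

// In 3D space, a tensor of a given rank has 3^rank components.
foreach ([0, 1, 2, 3, 4] as $rank) {
    printf("Rank %d: %d components\n", $rank, 3 ** $rank);
}
// Rank 0: 1, Rank 1: 3, Rank 2: 9, Rank 3: 27, Rank 4: 81
```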

5.2 Symmetry Properties

  • Symmetric Rank 2: A tensor $A_{ij}$ is symmetric if it satisfies $A_{ij} = A_{ji}$. This property means that swapping the indices does not change the value.

  • Antisymmetric Rank 2: A tensor $A_{ij}$ is antisymmetric if it satisfies $A_{ij} = -A_{ji}$. In this case, swapping the indices negates the value.

  • Cyclic Rank 3: A tensor $T_{ijk}$ is cyclic if it satisfies $T_{ijk} = T_{jki} = T_{kij}$. This means that rotating the indices in a cyclic manner yields the same value.
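
In PHP, the two rank-2 properties can be checked directly on a square matrix stored as a nested array (a minimal sketch; `isSymmetric` and `isAntisymmetric` are illustrative helpers):

```php
<?php

// Check A[i][j] == A[j][i] for every pair of indices.
function isSymmetric(array $A): bool
{
    foreach ($A as $i => $row) {
        foreach ($row as $j => $value) {
            if ($value !== $A[$j][$i]) {
                return false;
            }
        }
    }
    return true;
}

// Check A[i][j] == -A[j][i]; this also forces the diagonal to be zero.
function isAntisymmetric(array $A): bool
{
    foreach ($A as $i => $row) {
        foreach ($row as $j => $value) {
            if ($value !== -$A[$j][$i]) {
                return false;
            }
        }
    }
    return true;
}

var_dump(isSymmetric([[1, 2], [2, 3]]));        // true
var_dump(isAntisymmetric([[0, 2], [-2, 0]]));   // true
```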
