Neurons
The Fundamental Processing Units
In artificial neural networks, neurons (also known as nodes or units) are computational units that mimic the biological neurons of the brain. They take input, process it, and produce an output that is passed to the next layer of the network.
Biological neurons and artificial neurons share some conceptual similarities but are fundamentally different in structure, function, and operation. Here's a comparison:
Biological Neurons
Major Components: Axons, dendrites, and synapses.
Functioning: Electrical impulses from other neurons enter the dendrites at synapses and are integrated in the cell body. The processed signal then travels along the axon to synapses that connect to other neurons.
Information Storage: Synapses can strengthen or weaken their connections, enabling dynamic information storage.
Scale: The human brain contains approximately 86 billion neurons.
Artificial Neurons
Structure: Modeled after biological neurons, with layers playing the roles of axons, dendrites, and synapses.
Layered Network:
Input Layer: Receives external signals.
Hidden Layer: Processes signals, extracting relevant patterns or features.
Output Layer: Generates the final output of the network.
Information Processing: Signal strength is modulated by weights, similar to synaptic adjustments in biological neurons (see the sketch after this list).
Scale: Current artificial neural networks (ANNs) typically contain far fewer neurons than the brain, on the order of thousands to millions of units depending on the architecture.
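To illustrate the input → hidden → output flow described above, here is a minimal sketch (Python with NumPy) of a forward pass through a small fully connected network. The layer sizes, random weights, and sigmoid activation are illustrative assumptions, not details from the original text.

```python
import numpy as np

def sigmoid(z):
    # Squashes values into (0, 1); one common choice of non-linearity
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Assumed toy dimensions: 3 inputs -> 4 hidden units -> 1 output
W_hidden = rng.normal(size=(4, 3))   # hidden-layer weights
b_hidden = np.zeros(4)               # hidden-layer biases
W_out = rng.normal(size=(1, 4))      # output-layer weights
b_out = np.zeros(1)                  # output-layer bias

x = np.array([0.5, -1.2, 3.0])       # external signals entering the input layer

# Hidden layer: weighted sums (signal strength modulated by weights), then non-linearity
h = sigmoid(W_hidden @ x + b_hidden)

# Output layer: combines the hidden features into the network's final output
y = sigmoid(W_out @ h + b_out)
print(y)
```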
| Aspect | Biological Neurons | Artificial Neurons |
| --- | --- | --- |
| Processing Style | Asynchronous | Synchronous |
| Speed of Computation | Slow (~milliseconds per computation) | Fast (~nanoseconds per computation) |
| Reliability | Distributed representation; tolerates the failure of individual neurons | Sensitive to errors; every bit must function |
| Adaptability | Connectivity changes over time based on new information and demands | Fixed connectivity unless components are replaced |
| Topology | Complex and highly interconnected | Typically organized in simpler, tree-like structures |
| Learning Mechanism | Still under research | Relies on gradient descent |
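To make the gradient descent entry in the table concrete, here is a minimal sketch of the idea applied to a single weight with a squared-error loss. The model, data point, and learning rate are invented for illustration and are not from the original text.

```python
# Minimal gradient descent on one weight w for the model y_hat = w * x,
# minimizing the squared error (y_hat - y)**2 on a single example.
x, y = 2.0, 6.0          # illustrative training pair (the ideal w is 3.0)
w = 0.0                  # initial weight
learning_rate = 0.05     # assumed step size

for step in range(100):
    y_hat = w * x                    # forward pass
    grad = 2 * (y_hat - y) * x       # derivative of (w*x - y)**2 with respect to w
    w -= learning_rate * grad        # gradient descent update

print(w)  # converges toward 3.0
```

The same update rule, applied to every weight and bias in the network via backpropagation, is what drives learning in practice; the learning rate controls how large each corrective step is.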
While artificial neurons emulate some aspects of biological neurons, the human brain's complexity, adaptability, and distributed resilience far surpass current artificial networks.
Each neuron performs the following operations:
1. A neuron receives one or more inputs, each with an associated weight.
2. The inputs are multiplied by their respective weights, summed, and shifted by a bias:
$z = \sum_{i=1}^{n} w_i x_i + b$
Here:
$x_i$ is the $i$-th input.
$w_i$ is the weight for the $i$-th input.
$b$ is the bias term, which allows the neuron to shift its activation.
3. The result is passed through an activation function to introduce non-linearity, enabling the network to solve complex problems. Common activation functions include:
Sigmoid: $\sigma(z) = \frac{1}{1 + e^{-z}}$, which maps $z$ into the range $(0, 1)$.
ReLU (Rectified Linear Unit): $f(z) = \max(0, z)$.
Tanh: $\tanh(z) = \frac{e^{z} - e^{-z}}{e^{z} + e^{-z}}$, which maps $z$ into the range $(-1, 1)$.
Neurons act as decision-makers. By adjusting the weights and biases, they learn patterns from data, enabling tasks like image recognition, language translation, and more. The combination of neurons in layers allows neural networks to model increasingly complex relationships.
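Putting these pieces together, the sketch below implements a single artificial neuron: the weighted sum $z = \sum_i w_i x_i + b$ followed by one of the activation functions listed above. The specific input values, weights, and bias are made-up examples.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # maps z into (0, 1)

def relu(z):
    return np.maximum(0.0, z)         # max(0, z)

def tanh(z):
    return np.tanh(z)                 # maps z into (-1, 1)

def neuron(x, w, b, activation=sigmoid):
    # Weighted sum of inputs plus bias: z = sum_i w_i * x_i + b
    z = np.dot(w, x) + b
    # The activation function turns the sum into the neuron's output
    return activation(z)

x = np.array([0.2, -0.5, 1.0])   # example inputs
w = np.array([0.4, 0.3, -0.8])   # example weights
b = 0.1                          # bias shifts the activation threshold

print(neuron(x, w, b, sigmoid))
print(neuron(x, w, b, relu))
print(neuron(x, w, b, tanh))
```

Swapping the activation function changes the range and shape of the neuron's output but not the underlying weighted-sum computation.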