As we discussed in the introduction, one of the ways to achieve intelligence is to train a computer model or an artificial brain. Since the middle of the 20th century, researchers have tried different mathematical models, and in recent years this direction has proven to be very successful. Such mathematical models of the brain are called neural networks.
Sometimes neural networks are called Artificial Neural Networks (ANNs) to emphasize that we are talking about models, not real networks of neurons.
Neural networks are part of a larger discipline called Machine Learning, whose goal is to use data to train computer models that are able to solve problems. Machine Learning constitutes a big part of Artificial intelligence, but we do not cover classical ML in this curriculum.
Visit our separate Machine Learning for Beginners curriculum to learn more about classic Machine Learning.
In Machine Learning, we assume that we have a dataset of examples X and corresponding output values Y. Examples are often N-dimensional vectors that consist of features, and outputs are called labels.
We will consider the two most common machine learning problems:
- Classification, where we need to classify an input object into two or more classes.
- Regression, where we need to predict a numerical value for each of the input samples.
When representing inputs and outputs as tensors, the input dataset is a matrix of size M×N, where M is the number of samples and N is the number of features. Output labels Y form a vector of size M.
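As a minimal sketch of these shapes (using NumPy, with made-up feature and label values), here is what a tiny dataset with M = 4 samples and N = 3 features could look like for both problem types:

```python
import numpy as np

# Hypothetical dataset: M = 4 samples, each an N = 3 dimensional feature vector
X = np.array([
    [5.1, 3.5, 1.4],
    [4.9, 3.0, 1.4],
    [6.2, 2.9, 4.3],
    [5.9, 3.0, 5.1],
])

# Classification: Y holds a class label (here 0 or 1) for each sample
Y_class = np.array([0, 0, 1, 1])

# Regression: Y holds a real number for each sample
Y_reg = np.array([0.24, 0.31, 1.42, 1.78])

print(X.shape)        # (M, N) -> (4, 3)
print(Y_class.shape)  # (M,)   -> (4,)
```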
In this curriculum, we will only focus on neural network models.
From biology, we know that our brain consists of neural cells (neurons), each of which has multiple "inputs" (dendrites) and an "output" (axon). Both dendrites and axons can conduct electrical signals, and the connections between them, called synapses, can exhibit different levels of conductivity, regulated by neurotransmitters.
*Left: Real Neuron (Image from Wikipedia). Right: Artificial Neuron (Image by Author).*
Thus, the simplest mathematical model of a neuron contains several inputs X1, ..., XN and an output Y, as well as a series of weights W1, ..., WN. The output is computed as:

Y = f(X1·W1 + X2·W2 + ... + XN·WN)

where f is some non-linear activation function.
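This model can be sketched in a few lines of Python, using the sigmoid function as one common choice for f (the input and weight values below are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    """A common choice of non-linear activation function f."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron_output(X, W):
    """Y = f(X1*W1 + ... + XN*WN): weighted sum of inputs, then activation."""
    return sigmoid(np.dot(X, W))

# Example inputs and weights (arbitrary values)
X = np.array([0.5, -1.2, 3.0])
W = np.array([0.4, 0.1, -0.6])

print(neuron_output(X, W))  # a value between 0 and 1
```

Training such a neuron amounts to finding weights W that make the output Y match the desired labels, which is exactly what the models in the following lessons do.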
Early models of the neuron were described in the classical 1943 paper A logical calculus of the ideas immanent in nervous activity by Warren McCulloch and Walter Pitts. Donald Hebb, in his book "The Organization of Behavior: A Neuropsychological Theory", proposed a way in which these networks can be trained.
In this section we will learn about:
- Perceptron, one of the earliest neural network models for two-class classification
- Multi-layered networks, with a paired notebook on how to build our own framework
- Neural Network Frameworks, with these notebooks: PyTorch and Keras/TensorFlow
- Overfitting
Disclaimer:
This document has been translated using the AI translation service Co-op Translator. While we strive for accuracy, please be aware that automated translations may contain errors or inaccuracies. The original document in its native language should be considered the authoritative source. For critical information, professional human translation is recommended. We are not liable for any misunderstandings or misinterpretations arising from the use of this translation.



