Neural Networks: What a Brain Looks Like on AI

Issue 7

The Lightwave 

Practical Insights for Skeptics & Users Alike…in (Roughly) Two Minutes or Less

"Technology is just a tool. In terms of getting the kids working together and motivating them, the teacher is the most important" - Bill Gates

In yesterday’s post, I said that Deep Learning, Neural Networks, and Large Language Models are kind of like a chaotic kitchen staffed by millions of tiny, forgetful chefs.

Deep Learning is the head chef, barking orders and setting the menu.

The neural networks are the line cooks, frantically passing ingredients back and forth and adjusting recipes on the fly.

And the LLM is the final dish - a linguistic soufflé that somehow emerges from the chaos, rising impressively despite being made from digital word-scraps and statistical seasonings.

Or sometimes it doesn’t rise from the chaos and falls flat. Or thinks it’s a chandelier…

Anyway…today is Neural Networks.

SO…NEURAL NETWORKS

Neural networks are the backbone of modern artificial intelligence, inspired by the structure and function of the human brain.

They're composed of interconnected nodes (neurons) that process and transmit information, forming the basis for deep learning algorithms.

Here's a simple breakdown:

  • Structure: Neural networks consist of layers of interconnected nodes. There's typically an input layer (receiving initial data), one or more hidden layers (processing the data), and an output layer (producing the final result).

  • Neurons and Connections: Each node in the network is like a neuron in the brain. It receives input, processes it, and passes the result to connected nodes. The connections between nodes have weights that determine the strength of the signal passed along.

  • Learning Process: Neural networks learn by adjusting the weights of the connections between nodes. As the network processes training data, it compares its output to the desired result and tweaks the weights to reduce the error (a process called backpropagation).

  • Activation Functions: Each neuron uses an activation function to determine whether and how strongly to fire, introducing non-linearity that allows the network to learn complex patterns.
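If you're curious what that breakdown actually looks like in code, here's a minimal sketch in Python using NumPy. It's purely illustrative, not how production models are built: a tiny network with one hidden layer that learns a toy pattern (XOR) by repeatedly comparing its output to the target and nudging its weights. The layer sizes, the sigmoid activation, and the learning rate are arbitrary choices for the demo.

```python
import numpy as np

# A tiny, illustrative neural network: 2 inputs -> 4 hidden neurons -> 1 output.
rng = np.random.default_rng(0)

# Weights (connection strengths) and biases for each layer, randomly initialized.
W1 = rng.normal(size=(2, 4))   # input layer  -> hidden layer
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))   # hidden layer -> output layer
b2 = np.zeros(1)

def sigmoid(z):
    """Activation function: squashes any number into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Toy training data: XOR inputs and their target outputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

learning_rate = 1.0
for step in range(10000):
    # --- Forward pass: each layer processes its input and passes it on ---
    hidden = sigmoid(X @ W1 + b1)        # hidden layer works on the "ingredients"
    output = sigmoid(hidden @ W2 + b2)   # output layer "plates the dish"

    # --- Compare the output to the desired result ---
    error = output - y

    # --- Backward pass: nudge every weight to shrink the error ---
    d_output = error * output * (1 - output)              # gradient at the output layer
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)  # gradient at the hidden layer

    W2 -= learning_rate * hidden.T @ d_output
    b2 -= learning_rate * d_output.sum(axis=0)
    W1 -= learning_rate * X.T @ d_hidden
    b1 -= learning_rate * d_hidden.sum(axis=0)

# After training, outputs should drift toward [0, 1, 1, 0]
# (exactly how close depends on the random starting weights).
print(np.round(output, 2))
```

Real deep-learning models follow the same basic recipe, just with many more layers and billions of weights.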

Or, to get back to my unwieldy and strained metaphor:

Each Line Cook (neuron) receives ingredients (input), processes them using their own techniques, and passes the results to other cooks.

The kitchen is organized into stations (layers), from prep to cooking to plating.

As they work, the cooks learn and adjust their methods, strengthening or weakening their connections to create better dishes.

This setup allows the kitchen to handle complex recipes (tasks) and produce a wide variety of dishes (outputs), improving over time as they learn from successes and mistakes.


Tomorrow we’ll focus on Large Language Models (LLMs). GPT-4o, the model behind ChatGPT, is an LLM. There are many, many (many) others, too. I like Perplexity and Claude.

For more, visit www.NorthLightAI.com