Revolutionary Algorithm for AI Learning: GPT-4 Toaster Possible?
This article is a summary of a YouTube video "This Algorithm Could Make a GPT-4 Toaster Possible" by Edan Meyer
TLDR Geoffrey Hinton proposes an alternative to backprop, the Forward-Forward algorithm, which could enable AI to learn without today's demanding hardware requirements and which learns a representation of the data in an unsupervised way, while ClearML (the video's sponsor) offers an end-to-end platform for MLOps.
In the Forward-Forward algorithm, gradient descent optimizes a separate, local objective in each layer of the network rather than a single end-to-end objective propagated backward, and the hidden vector is normalized before being passed on so that information about its magnitude is not forwarded to the next layer.
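A minimal sketch of that per-layer setup, assuming a sum-of-squares "goodness" objective and unit-length normalization (the names `goodness` and `layer_norm` and the toy sizes are illustrative, not taken from the video):

```python
import numpy as np

def layer_norm(h):
    # Normalize the hidden vector to unit length so only its
    # direction (not its magnitude) reaches the next layer.
    return h / (np.linalg.norm(h) + 1e-8)

def goodness(h):
    # A local per-layer objective: sum of squared activities.
    return float(np.sum(h ** 2))

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 8))   # one layer's weights (toy sizes)
x = rng.normal(size=8)

h = np.maximum(W @ x, 0.0)     # ReLU activations
g = goodness(h)                # scalar this layer optimizes locally
x_next = layer_norm(h)         # magnitude stripped before the next layer
```

Because `x_next` has (near) unit length, the next layer cannot read off how "good" this layer found the input from the vector's magnitude alone.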
Using the Forward-Forward algorithm with local receptive fields, and images paired with incorrect labels as negative data, a network can learn to tell images with the right label apart from images with the wrong label, reaching a test error of 0.64% on MNIST.
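The positive/negative training scheme can be sketched for a single fully connected layer as follows (a toy illustration under stated assumptions: labels embedded as one-hot pixels, a sum-of-squares goodness, and a logistic probability against a threshold; sizes, learning rate, and threshold are made up, and no local receptive fields are used here):

```python
import numpy as np

rng = np.random.default_rng(0)

def embed_label(image, label, n_classes=10):
    # Replace the first n_classes pixels with a one-hot label,
    # so the same image can serve as positive or negative data.
    x = image.copy()
    x[:n_classes] = 0.0
    x[label] = 1.0
    return x

def goodness(h):
    # Goodness of a layer: sum of squared activities.
    return float(np.sum(h ** 2))

W = rng.normal(scale=0.01, size=(32, 784))   # one toy layer
image = rng.random(784)
pos = embed_label(image, label=3)            # right label -> positive data
neg = embed_label(image, label=7)            # wrong label -> negative data

lr, theta = 0.001, 0.5
for _ in range(100):
    for x, is_pos in ((pos, True), (neg, False)):
        h = np.maximum(W @ x, 0.0)
        p = 1.0 / (1.0 + np.exp(theta - goodness(h)))  # P(data is positive)
        grad = 2.0 * np.outer(h, x)                    # d(goodness)/dW
        # Raise goodness on positive data, lower it on negative data.
        W += lr * ((1.0 - p) if is_pos else -p) * grad
```

After training, the layer assigns higher goodness to the image with the right label than to the same image with the wrong one, which is exactly the discrimination the summary describes.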
Using a multi-layer recurrent neural network, Hinton proposes treating a static image as a video, with the activity vector at each layer determined by the normalized activity vectors of both the layer above and the layer below at the previous time step.
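That update rule can be sketched as follows (toy dimensions and random weights; the bottom layer takes the static input at every "frame" and the top layer receives zeros from above, an assumption made for this illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
layers, dim, T = 3, 16, 10

# Bottom-up and top-down weights between adjacent layers (toy sizes).
W_up = [rng.normal(scale=0.1, size=(dim, dim)) for _ in range(layers)]
W_down = [rng.normal(scale=0.1, size=(dim, dim)) for _ in range(layers)]

def norm(v):
    return v / (np.linalg.norm(v) + 1e-8)

x = rng.random(dim)                     # static "image", repeated each frame
h = [np.zeros(dim) for _ in range(layers)]

for t in range(T):
    prev = [norm(v) for v in h]         # normalized activities at time t-1
    below = [norm(x)] + prev[:-1]       # layer below (the input at the bottom)
    above = prev[1:] + [np.zeros(dim)]  # layer above (zeros at the very top)
    h = [np.maximum(W_up[i] @ below[i] + W_down[i] @ above[i], 0.0)
         for i in range(layers)]
```

Each layer's new activity thus depends only on its immediate neighbors at the previous time step, which is what lets a static image be processed as if it were a video.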
Neurons predict their inputs and learn from the difference between the prediction and the actual input, with the network receiving data from the pixels and the labels, creating an upward and downward flow of information over time.
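The prediction-error learning idea can be sketched with a simple delta-rule update (a hypothetical toy, not the video's code: a top-down weight matrix predicts an input vector from a fixed higher-layer activity):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(8, 4))   # top-down prediction weights
top = rng.random(4)                      # higher-layer activity (held fixed)
x = rng.random(8)                        # actual input to be predicted

lr = 0.1
for _ in range(200):
    pred = W @ top                       # the neurons' prediction of input
    err = x - pred                       # local prediction error
    W += lr * np.outer(err, top)         # delta rule: shrink the error
```

The update uses only locally available quantities (the error and the presynaptic activity), which is why this style of learning is attractive for hardware without backprop.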
Implementing matrix multiplication in analog hardware could be an energy-efficient way to run neural networks.
By combining the Forward-Forward algorithm with distillation, AI hardware chips could be built that run neural networks far more efficiently than today's digital simulations, enabling much faster and cheaper AI computation.
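Distillation here means training a small "student" network to match the softened output distribution of a large "teacher". A minimal sketch of the distillation loss (the logits and temperature are made-up illustrative values):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature T > 1 softens the distribution, exposing the
    # teacher's relative preferences among wrong answers.
    z = z / T
    z = z - z.max()          # for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical logits from a large teacher and a small student.
teacher_logits = np.array([4.0, 1.0, 0.2])
student_logits = np.array([1.0, 0.5, 0.4])

T = 3.0
p_teacher = softmax(teacher_logits, T)   # soft targets
p_student = softmax(student_logits, T)

# Distillation loss: cross-entropy of the student against soft targets.
kd_loss = -np.sum(p_teacher * np.log(p_student + 1e-12))
```

Minimizing this loss transfers the teacher's learned behavior into the smaller model, which could then be baked into efficient, possibly analog, hardware.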