Discover 9 Strange & Magical Techniques to Master GPT-3 and AI – Dive into Class Central!

Scorm.biz Team | Published July 12, 2024 | Last updated July 12, 2024 at 6:24 PM

Source: Dall-E for “A humorous illustration of a person juggling multiple devices and books, symbolizing the overwhelming amount of resources available for learning about AI.”

Large Language Models, GenAI, Transformers, Embeddings, Vectors, Inferences, Fine-tuning, RAG, Neural Networks, Megaloboxing… Do I have your Attention?

Ever wanted to understand how Generative AI works, not to build it, but at least get the gist?

As someone with premium subscriptions to three different LLM services, so did I. For months, I’ve been safely keeping track of these resources by leaving them open in tabs across various devices and browsers. One day, I hope to actually finish at least one of them…

At Class Central, we generally recommend courses. But often, learning isn’t just about pre-recorded videos with an occasional quiz here and there, followed by a paid certificate for clicking the right things.

TOC - Jump to:

  • Why Learn About LLMs or Large Language Models?
  • GenAI vs GPT vs LLM
  • What Is ChatGPT Doing … and Why Does It Work? By Stephen Wolfram
  • 3Blue1Brown’s Visual Intro to Transformers and Attention
  • LLM University And Serrano.Academy
  • Jay Alammar’s Visual Journey Through Language Models
  • Let’s build GPT: from scratch, in code, spelled out by Andrej Karpathy

Why Learn About LLMs or Large Language Models?

Honestly, IDK. I’m just adding this question to please the algorithm gods at Google. Feel free to skip the rest of this section. The next paragraph is entirely generated by an LLM, which will remain anonymous to protect its privacy.

Learning about Large Language Models is important because they’re changing how we communicate and access information. By understanding how LLMs work, you can better use them to improve your writing, communication, and even your job skills!

GenAI vs GPT vs LLM

You know the drill. Feel free to skip the rest of the section or just paste the title into your friendly neighborhood LLM.

  • GenAI (Generative AI): A general term for AI systems that can generate new content, such as text, images, or code.
  • GPT (Generative Pre-trained Transformer): A specific type of AI model that generates human-like text and powers applications like chatbots and language translation.
  • LLM (Large Language Model): A type of AI model that’s trained on a massive amount of text data to understand and generate human language, like writing and conversation.

What Is ChatGPT Doing … and Why Does It Work? By Stephen Wolfram

Source: What Is ChatGPT Doing … and Why Does It Work?

Yes, THE Stephen Wolfram. Don’t know who he is? Here is what ChatGPT (not Wikipedia) has to say about him:

Stephen Wolfram is a British-American computer scientist, physicist, and entrepreneur, best known for his work in developing Mathematica, an advanced computational software, and for his development of the Wolfram Alpha computational knowledge engine.

This nearly 20,000-word article (book?) by Wolfram goes into detail about how ChatGPT works, with illustrations and code examples in the Wolfram Language (mostly using the smaller GPT-2 model for the demonstrations). It’s in-depth but accessible (given the complexity of the topic).

tldr: It’s just adding one word at a time.

This is art. It would be great if we had more educators like Wolfram who can break down complex topics into something that a majority of people can understand, building it up bit by bit.

Of course, we’re not going to do that because it doesn’t “scale.” Instead, we’re going to flood the internet with random garbage generated by LLMs.

If you have to read just one article, this would be it.
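
To make that tl;dr concrete, here is a tiny, purely illustrative Python sketch of generating text one word at a time. The probability table is invented for this example; a real model like ChatGPT computes these probabilities with a neural network over its entire context instead of looking them up.

import random

next_word_probs = {                      # hypothetical probabilities, invented for illustration
    "the": {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"sat": 0.4, "ran": 0.6},
    "sat": {"down.": 1.0},
    "ran": {"away.": 1.0},
}

def generate(prompt, max_words=10):
    words = prompt.split()
    for _ in range(max_words):
        options = next_word_probs.get(words[-1])
        if not options:                  # no known continuation: stop, like an end-of-text token
            break
        choices, weights = zip(*options.items())
        words.append(random.choices(choices, weights=weights)[0])   # sample the next word
    return " ".join(words)

print(generate("the"))                   # e.g. "the cat sat down."

That loop, scaled up to a vocabulary of tens of thousands of tokens and billions of parameters, is essentially what Wolfram spends 20,000 words unpacking.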

3Blue1Brown’s Visual Intro to Transformers and Attention

3Blue1Brown (3b1b) by Grant Sanderson is a popular YouTube channel with over 6 million subscribers. He creates stunning animated videos of complex mathematical concepts, making them accessible and visually engaging for viewers of all levels.

Sanderson created his own mathematical animation engine and open-sourced it on GitHub. Similar to Wolfram, I feel his videos are a work of art. I don’t think he’s worried about GenAI taking his job.

So far, he has published two videos on this topic: “But what is a GPT? Visual intro to transformers” and “Attention in transformers, visually explained.” I even spied a third video on his Patreon.

He explains visually what goes on in a transformer step-by-step. And by step-by-step, I mean he uses a real-world example and shows us the actual matrices in those steps as data flows through them. We’re talking tokens, vectors, attention blocks, and feed-forward layers – all brought to life through Sanderson’s magical animations.

It’s mind-boggling that this even exists. I haven’t been this impressed with a video since I watched Jurassic Park in theaters for the first time (I know I’m dating myself).
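
If you want to poke at the step those videos animate, below is a bare-bones numpy sketch of scaled dot-product attention. The token count, dimensions, and random weights are made up for illustration, and a real transformer adds multiple heads, masking, positional information, and feed-forward layers on top of this.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # how strongly each token attends to the others
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V                                # each token becomes a weighted mix of value vectors

T, d = 4, 8                                           # 4 tokens, 8-dimensional vectors
rng = np.random.default_rng(0)
x = rng.normal(size=(T, d))                           # stand-in token embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
print(out.shape)                                      # (4, 8): one updated vector per token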

LLM University And Serrano.Academy

I am combining LLM University by Cohere and Serrano.Academy (YouTube Channel) because they have a common instructor: Luis Serrano.

If this name sounds familiar, you might be a Udacitian from its heyday. Luis was a popular instructor teaching the Machine Learning Nanodegree. Long ago, Udacity launched something called Udacity Connect Intensive. Basically, you’d meet in-person once a week in a physical classroom while taking the Nanodegree.

I was part of the first cohort/test in San Jose, and Luis Serrano once dropped in to give a lecture. His greatest strength is breaking down complicated concepts into simple analogies and examples.

For me, Luis provides the intuition behind the concepts. His passion for teaching is obvious and infectious.

The previous two resources start with real-world examples and are quite information-dense. If you’re finding them hard to follow, watch these Serrano.Academy videos first:

  1. The Attention Mechanism in Large Language Models
  2. The math behind Attention: Keys, Queries, and Values matrices
  3. What are Transformer Models and how do they work?

LLM University consists of 7 modules in total and contains a mix of text and videos. The first module, taught by Luis, covers Large Language Models and some of the theory behind them like Attention, Transformers, and Embeddings. I believe this would have significant overlap with the Serrano.Academy videos I mentioned above.

The next 6 modules are very practical in nature and deal with the real-world applications of LLMs: Text Generation, Semantic Search, and Retrieval-Augmented Generation (RAG). The code examples are in Python and use the Cohere SDK.

There is also a section on Prompt Engineering.
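
To give a rough idea of what the Semantic Search and RAG modules build toward, here is a toy retrieval step in Python. The embed() helper below is a fake bag-of-words stand-in, invented so the sketch runs without an API key; the course itself produces real embeddings through the Cohere SDK.

import numpy as np

docs = [
    "Transformers process tokens in parallel using attention.",
    "Moodle is an open-source learning management system.",
    "Retrieval-augmented generation (RAG) retrieves relevant documents before answering.",
]

def embed(texts):
    # Fake embedding: bag-of-words counts. A real pipeline would call an embedding
    # model (for example via the Cohere SDK) instead of this stand-in.
    vocab = sorted({w for t in texts for w in t.lower().split()})
    return np.array([[t.lower().split().count(w) for w in vocab] for t in texts], dtype=float)

def top_match(query, documents):
    vecs = embed(documents + [query])
    doc_vecs, q_vec = vecs[:-1], vecs[-1]
    sims = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
    return documents[int(np.argmax(sims))]            # the chunk you would hand to the LLM as context

print(top_match("How does retrieval-augmented generation work?", docs))

Swap the fake embeddings for a real embedding model and append the retrieved chunk to your prompt, and you have the core of a RAG pipeline.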

Jay Alammar’s Visual Journey Through Language Models

Source: The Illustrated GPT-2 (Visualizing Transformer Language Models)

Jay Alammar, another former Udacity instructor, now works at Cohere alongside Luis Serrano. He is also an instructor for certain modules at Cohere’s LLM University.

Alammar has created a series of tutorials, such as The Illustrated GPT-2 (Visualizing Transformer Language Models), where he explains the workings of large language models using illustrations, animations, and visualizations.

These tutorials offer a visual approach to understanding complex AI concepts, making them more accessible to a wider audience.

Let’s build GPT: from scratch, in code, spelled out by Andrej Karpathy

In this two-hour video, you build a simplified version of ChatGPT from scratch with Andrej Karpathy, one of the co-founders of OpenAI. Karpathy was previously the Director of AI at Tesla, where he led the development of the company’s Autopilot system.

Cool cool cool cool.

Honestly, the title of the video is pretty self-explanatory. You will build a transformer model from scratch in Python, training a character-level language model on a Shakespeare dataset to generate text that resembles Shakespeare’s writing.
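
For flavor, here is a tiny character-level bigram model in PyTorch, loosely in the spirit of the simple baseline the video starts from before building up to the full transformer. This is not Karpathy’s code; the sample text, sizes, and training settings are made up, and the video trains on the full Shakespeare file instead.

import torch
import torch.nn.functional as F

text = "to be or not to be that is the question "     # stand-in for the Shakespeare file
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}
data = torch.tensor([stoi[c] for c in text])

vocab_size = len(chars)
# One table of logits: row i scores every character that could follow character i.
logits_table = torch.randn(vocab_size, vocab_size, requires_grad=True)
opt = torch.optim.Adam([logits_table], lr=0.1)

for step in range(300):
    x, y = data[:-1], data[1:]               # predict each next character from the current one
    loss = F.cross_entropy(logits_table[x], y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Generation: sample one character at a time, feeding each choice back in.
with torch.no_grad():
    idx = data[0].item()
    out = [itos[idx]]
    for _ in range(80):
        probs = F.softmax(logits_table[idx], dim=-1)
        idx = torch.multinomial(probs, num_samples=1).item()
        out.append(itos[idx])
print("".join(out))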

The video is part of his “Neural Networks: Zero to Hero” series. Since then, he has published a couple more: “Let’s build the GPT Tokenizer” and most recently a 4-hour “Let’s reproduce GPT-2 (124M)”. Apparently it takes 90 minutes and $
