What You May Have Missed #25
LLMs (evolutionary tree, LLaMA) / Generative agents (BabyAGI, Auto-GPT) / AI art / Worthwhile essays / R&D and products (Elon Musk's X.AI) / Miscellanea and curiosities (Drake & The Weeknd viral song)
Large language models
The Practical Guide for Large Language Models: “We build an evolutionary tree of modern Large Language Models (LLMs) to trace the development of language models in recent years and highlight some of the most well-known models” (based on this paper). Yann LeCun noted that the nomenclature is confusing.
“There are over 50 one billion+ parameter LLMs to choose from (open-source or proprietary API). A list of all of them.” (Matt Rickard).
A brief history of LLaMA models: LLaMA, Alpaca, Vicuna, Koala, GPT4-x-Alpaca, and WizardLM.
LLM optimization progress: “Running 13B LLMs like LLaMA on edge devices (e.g. MacBook Pro with an M1 chip) is now almost a breeze.” (Aleksa Gordic).
Generative AI and education: “Based on what I have seen, I think we can assume three things about AI & education: 1) AI tutors are going to be very effective. 2) AI writing will not be caught by anti-cheating software. 3) Human instructors will be freed to focus on making learning better.” (Ethan Mollick).
Margaret Mitchell goes through the “sparks of AGI” paper on GPT-4, demystifying the wild claims that have been circulating. I find her approach a refreshing departure from both the hype and the harsher attitude of some of her colleagues.
The APA has published a guide to citing ChatGPT: “Quoting ChatGPT’s text from a chat session is therefore more like sharing an algorithm’s output; thus, credit the author of the algorithm with a reference list entry and the corresponding in-text citation.” The basic format would be something like “OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model].”
Generative AI agents