Product Launch 2023 Keynote

Modular is introducing two technological breakthroughs that could revolutionize the AI and computing industry. The first is an AI execution engine that provides performance and usability benefits, inc...
Modular Docs - Why Mojo - A backstory and rationale for why we created the Mojo language
Mojo is an innovative and scalable programming model designed to target accelerators and heterogeneous systems. Its mission includes innovations in compiler internals and support for current and emerg...
The Amazing AI Super Tutor for Students and Teachers | Sal Khan | TED

AI chatbots like ChatGPT have been feared for the possibility of students using them to cheat on assignments. However, Sal Khan, founder of Khan Academy, argues that AI has the potential to positively...
The Inside Story of ChatGPT’s Astonishing Potential | Greg Brockman | TED

OpenAI has showcased a live demo of its advanced AI tool, ChatGPT. The tool, which combines language processing with powerful AI-generated imagery, is designed to allow seamless communication between ...
Semantic reconstruction of continuous language from non-invasive brain recordings
A new non-invasive decoder has been introduced to reconstruct continuous language from cortical semantic representations, recorded through functional magnetic resonance imaging. The decoder is capable...
Tool Juggler - Alpha Version
Tool Juggler is a customizable AI assistant tool that allows users to create tools on-the-fly. In this alpha version, users can only work with OpenAI models and are required to have an OpenAI API Key....
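Tool Juggler's internals aren't described in the summary; the sketch below only illustrates the general pattern of registering callable tools at runtime. All names here are hypothetical, not Tool Juggler's actual API.

```python
# Minimal sketch of on-the-fly tool registration (all names hypothetical;
# this is NOT Tool Juggler's actual API).
from typing import Callable, Dict

class ToolRegistry:
    def __init__(self) -> None:
        self._tools: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, description: str):
        """Decorator that adds a function to the registry at runtime."""
        def wrap(fn: Callable[[str], str]) -> Callable[[str], str]:
            self._tools[name] = fn
            return fn
        return wrap

    def run(self, name: str, query: str) -> str:
        return self._tools[name](query)

registry = ToolRegistry()

@registry.register("shout", description="Uppercase the input text")
def shout(text: str) -> str:
    return text.upper()

print(registry.run("shout", "hello"))  # HELLO
```

An assistant built on this pattern can gain new capabilities mid-session simply by executing another `@registry.register(...)` definition.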
Boosting Theory-of-Mind Performance in Large Language Models via Prompting
Large language models have shown remarkable success in many tasks, but they still face challenges in complex reasoning. One area of specific interest is theory-of-mind (ToM) reasoning, which involves ...
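The gist of such prompting interventions can be shown with a classic false-belief task: the model is nudged to reason step by step about what each agent believes before answering. The prompt wording below is illustrative, not taken from the paper.

```python
# Composing a step-by-step (chain-of-thought style) prompt for a
# theory-of-mind false-belief question. Wording is illustrative only.
SCENARIO = (
    "Sally puts her marble in the basket and leaves the room. "
    "While she is away, Anne moves the marble to the box."
)
QUESTION = "Where will Sally look for her marble when she returns?"

def tom_prompt(scenario: str, question: str) -> str:
    return (
        f"{scenario}\n"
        f"Question: {question}\n"
        "Let's reason about what each person believes, step by step, "
        "before answering."
    )

print(tom_prompt(SCENARIO, QUESTION))
```

The same scenario asked as a bare question probes raw ToM ability; appending the step-by-step instruction is the kind of intervention the paper evaluates.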
Getting Started with LangChain: A Beginner's Guide to Building LLM-Powered Applications
LangChain is a framework that helps in building LLM-powered applications easily. It provides a generic interface to a variety of different foundation models, a framework to manage prompts, and a centr...
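To illustrate the prompt-management idea without depending on the library itself, here is a pure-Python stand-in that mimics (but is not) LangChain's `PromptTemplate`: a template declares its input variables, and callers fill them in at run time.

```python
# Pure-Python stand-in for the prompt-template pattern; this mimics the
# shape of LangChain's PromptTemplate but is not its actual implementation.
from string import Formatter

class PromptTemplate:
    def __init__(self, template: str) -> None:
        self.template = template
        # Extract {placeholders} so callers can see the required inputs.
        self.input_variables = [
            name for _, name, _, _ in Formatter().parse(template) if name
        ]

    def format(self, **kwargs: str) -> str:
        missing = set(self.input_variables) - set(kwargs)
        if missing:
            raise KeyError(f"missing variables: {sorted(missing)}")
        return self.template.format(**kwargs)

prompt = PromptTemplate(
    "Summarize the following {doc_type} in {n} bullet points:\n{text}"
)
print(prompt.input_variables)  # ['doc_type', 'n', 'text']
print(prompt.format(doc_type="article", n="3", text="LangChain basics"))
```

Centralizing prompts this way is what lets a framework swap foundation models behind a single interface: the template stays fixed while the model endpoint changes.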
Low-code LLM: Visual Programming over LLMs
This paper introduces a novel human-LLM interaction framework, Low-code LLM, that incorporates six types of low-code visual programming interactions to achieve more controllable and stable responses. ...
Extending the Context Length of BERT using Recurrent Memory Transformer
This technical report introduces a new architecture, Recurrent Memory Transformer (RMT), that extends the context length of BERT, one of the most effective Transformer-based models in natural language...
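The core mechanism can be sketched in a few lines: the long input is split into segments, and a small set of memory vectors is carried from one segment to the next. In this toy NumPy version, a single random linear layer stands in for the transformer block purely for illustration; it is not the RMT architecture itself.

```python
import numpy as np

# Toy sketch of the segment-level recurrence behind RMT: memory tokens are
# prepended to each segment, processed, and read back out for the next
# segment. A random linear layer stands in for the transformer block.
rng = np.random.default_rng(0)
d, mem_slots, seg_len = 8, 2, 4

W = rng.standard_normal((d, d)) / np.sqrt(d)  # stand-in for a transformer block

def process_segment(memory: np.ndarray, segment: np.ndarray) -> np.ndarray:
    x = np.concatenate([memory, segment], axis=0)  # (mem_slots + seg_len, d)
    y = np.tanh(x @ W)                             # "transformer" layer
    return y[:mem_slots]                           # updated memory tokens

tokens = rng.standard_normal((3 * seg_len, d))     # long input: 3 segments
memory = np.zeros((mem_slots, d))
for start in range(0, len(tokens), seg_len):
    memory = process_segment(memory, tokens[start:start + seg_len])

print(memory.shape)  # (2, 8)
```

Because each step attends only within one segment plus a fixed memory, cost grows linearly with input length rather than quadratically.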
Hyena Hierarchy: Towards Larger Convolutional Language Models
In this paper, we propose Hyena, a subquadratic drop-in replacement for attention, constructed by interleaving implicitly parametrized long convolutions and data-controlled gating. Our proposed method...
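The efficiency claim rests on a standard fact: a length-n convolution can be computed in O(n log n) with the FFT instead of O(n^2) directly, which is what makes long implicit filters practical. A minimal NumPy illustration of that building block (not Hyena's full operator, which also interleaves data-controlled gating):

```python
import numpy as np

def fft_long_conv(u: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Causal convolution of signal u with filter k, in O(n log n) via FFT."""
    n = len(u)
    fft_len = 2 * n  # zero-pad so circular convolution equals linear convolution
    out = np.fft.irfft(np.fft.rfft(u, fft_len) * np.fft.rfft(k, fft_len), fft_len)
    return out[:n]

rng = np.random.default_rng(0)
u, k = rng.standard_normal(16), rng.standard_normal(16)
direct = np.convolve(u, k)[:16]  # O(n^2) reference
assert np.allclose(fft_long_conv(u, k), direct)
```

Hyena layers wrap this primitive with elementwise (data-controlled) gating, e.g. multiplying the convolution output by a projection of the input.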
ChatGPT's Astonishing Potential Unveiled in Inside Story | TED Talk by Greg Brockman
In this TED talk, Greg Brockman, the CTO of OpenAI, explores the advanced capabilities of ChatGPT, an AI-powered chatbot that can generate human-like responses. He delves into how ChatGPT is revolutio...
Jupyter AI - A User-friendly and Powerful Way to Explore Generative AI Models in Notebooks
Welcome to Jupyter AI, which brings generative AI to Jupyter. Jupyter AI provides a user-friendly and powerful way to explore generative AI models in notebooks and improve your productivity in Jupyter...
Mass-Editing Memory in a Transformer
In this preprint, the authors propose MEMIT, a method for updating language models with many memories, scaling up to thousands of associations for large models like GPT-J (6B) and GPT-NeoX (20B), exce...
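MEMIT itself batches thousands of edits across several MLP layers; the single-association sketch below shows only the core idea, in the spirit of the earlier ROME method and further simplified (no covariance weighting): modify a linear layer W with a rank-one update so that a chosen key vector k maps to a new value v*.

```python
import numpy as np

def rank_one_edit(W: np.ndarray, k: np.ndarray, v_star: np.ndarray) -> np.ndarray:
    """Return W' with W' @ k == v_star via a minimal rank-one change to W.

    Simplified illustration of key-value weight editing; the actual MEMIT
    objective weights the update by key covariance and spreads it over layers.
    """
    residual = v_star - W @ k
    return W + np.outer(residual, k) / (k @ k)

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # toy "MLP" weight matrix
k = rng.standard_normal(3)        # key encoding the subject
v_star = rng.standard_normal(4)   # desired new value (the edited memory)

W_new = rank_one_edit(W, k, v_star)
assert np.allclose(W_new @ k, v_star)  # the edited layer now recalls v*
```

Scaling this from one association to thousands without the edits interfering with each other is precisely the problem MEMIT addresses.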
The A.I. Dilemma - March 9, 2023

The co-founders of the Center for Humane Technology, Tristan Harris and Aza Raskin, discuss the potential dangers and irresponsible deployment of artificial intelligence (AI) in their presentation tit...
Announcing Google DeepMind: Accelerating Progress in AI Safely and Responsibly
Earlier today, Google announced the formation of Google DeepMind, which combines the talents and efforts of the DeepMind and Brain teams from Google Research as a single, focused unit. The goal is to ...
griptape: A modular Python framework for LLM workflows, tools, memory, and data
You, too, can LLM: A Quick Guide to Semantic Kernel and Language Models
This article provides insights on the Semantic Kernel, a Software Development Kit that simplifies integration of language models into your own applications. It discusses the capabilities of the Semant...
Elon Musk Launches New AI Company, X.AI, to Take on OpenAI
Elon Musk, the CEO of Tesla and SpaceX, has quietly started a new AI company called X.AI to challenge OpenAI. The company will focus on developing advanced AI technologies across various domains, ...
Emergent Autonomous Scientific Research Capabilities of Large Language Models
Transformer-based large language models are gaining a foothold in the field of machine learning research, with a plethora of applications in natural language processing, biology, chemistry, and comput...