April 30, 2024 · NEWS

The Future of Large Language Models by Łukasz Kaiser and Jan Chorowski

In April 2024, Pathway hosted an incredible meetup in San Francisco, bringing together some of the brightest minds in AI and data science.

We welcomed Łukasz Kaiser, co-author of "Attention Is All You Need," and Jan Chorowski, Pathway’s CTO, who shared their vision for the future of Large Language Models (LLMs) and the roadmap towards more intelligent foundational models. Joined by many senior developers, architects, and founders working on generative AI projects, they discussed the evolution of deep learning, the role of Reinforcement Learning from Human Feedback (RLHF), the future of LLMs, and how infinite LLM context windows can be achieved through innovative engineering and efficient retrieval mechanisms.

Key Topics Covered by Łukasz Kaiser and Jan Chorowski:

  1. Role of Retrievers in Reinforcement Learning for Intelligent LLMs

Łukasz Kaiser, a renowned researcher at OpenAI, co-author of TensorFlow and the Transformer architecture, and a core contributor to OpenAI’s GPT-4 and ChatGPT, explored the evolution of deep learning technologies and where they are headed.

He emphasized that more data and compute lead to better results, but highlighted the impending scarcity of training data. Łukasz argued that, in the future, training on fewer, higher-quality retrieved data points will be key to enhancing LLM performance. He also explained the importance of powerful retrieval mechanisms, of integrating personal and organizational knowledge graphs, and of efficient context provisioning for effective Reinforcement Learning from Human Feedback. Additionally, Łukasz revisited an overlooked observation about parsing from his seminal paper "Attention Is All You Need" and shared his vision for future LLMs.
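To make the "fewer, higher-quality data points" idea concrete, here is a minimal sketch of quality-based data selection. The `quality_score` callable is a hypothetical stand-in (for example, a reward model or a quality classifier); the sketch illustrates the general technique, not anything specific to OpenAI's training pipelines.

```python
# Minimal sketch: train on less data, but better data.
# `quality_score` is a hypothetical stand-in for whatever scores an
# example's usefulness (e.g. a reward model or a quality classifier).

def select_training_data(examples, quality_score, keep_fraction=0.1):
    """Keep only the highest-quality slice of a candidate training set."""
    scored = sorted(examples, key=quality_score, reverse=True)
    cutoff = max(1, int(len(scored) * keep_fraction))
    return scored[:cutoff]

# Usage: feed the selected slice to a fine-tuning job instead of the
# full corpus, e.g.:
# curated = select_training_data(candidates, my_quality_model, 0.05)
```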

  2. How Retrievers and LLMs Help Each Other, and How to Achieve Infinite LLM Context Windows

Jan Chorowski, CTO of Pathway and a prominent figure in AI and NLP, extended the discussion by focusing on the essential role of context and retrieval in AI systems.

Building on Łukasz Kaiser's insights, he highlighted the “yin and yang” relationship between Large Language Models (LLMs) and retrieval systems: effective LLM performance and reinforcement learning require robust retrieval mechanisms, while efficient retrieval relies on the processing power of LLMs. Jan shared an example of Adaptive Retrieval-Augmented Generation (RAG) in which his team achieved high accuracy at a quarter of the cost by leveraging LLM preprocessing. He emphasized the need for tighter integration to achieve infinite LLM context windows and cost-effective AI solutions, and invited the audience to explore these concepts further through the resources and examples on Pathway's developer site.
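As an illustration of the adaptive approach Jan described, here is a minimal sketch of adaptive RAG: start with a small, cheap context and enlarge it only when the model cannot answer. The `retrieve_top_k` and `ask_llm` callables are assumed stand-ins for your retriever and LLM client, and the doubling strategy is one common escalation policy, not necessarily Pathway's exact implementation.

```python
def adaptive_rag(question, retrieve_top_k, ask_llm, start_k=2, max_k=16):
    """Answer with the smallest retrieved context that suffices.

    retrieve_top_k(question, k) -> list of the k most relevant chunks.
    ask_llm(question, context)  -> answer string, or "NO ANSWER" when
                                   the supplied context is insufficient.
    """
    k = start_k
    while k <= max_k:
        context = retrieve_top_k(question, k)  # fetch top-k chunks
        answer = ask_llm(question, context)    # answer from context only
        if answer.strip() != "NO ANSWER":
            return answer  # a small context sufficed: a cheaper call
        k *= 2             # escalate: retrieve more chunks and retry
    return "NO ANSWER"     # even the largest context was not enough
```

Because most questions are resolved on the first, cheapest pass, only the hard ones pay for a large context, which is how accuracy can be maintained at a fraction of the cost.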

Watch the recording:

  • Talk 1: Deep Learning Past and Future: What Comes After GPT? by Łukasz Kaiser (“Attention Is All You Need” co-author, Senior Researcher at OpenAI)
  • Talk 2: Taming Unstructured Data: Which Indexing Strategy Wins? by Jan Chorowski (CTO, Pathway)

Pathway Team
