MLOps for LLMs - From Development to Production

November 1, 2024 · 136 words · One minute · events

MLOps for LLMs: From Prototyping to Production

Ever wondered how to quickly prototype AI applications, scale them efficiently, and monitor their performance? This workshop covers the complete MLOps lifecycle, from rapid UI development to production deployment and monitoring.

Prerequisites
- Basic understanding of LLMs and transformers
- Experience with Python and basic DevOps concepts
- Familiarity with REST APIs
- Previous workshops in the series recommended

What You’ll Learn
- Efficient LLM Serving with vLLM:

AI Agents - Building Autonomous Systems

October 25, 2024 · 147 words · One minute · events

AI Agents: From Simple Tools to Autonomous Systems

Ever wondered how AI can automatically execute complex tasks? Or how chatbots can interact with real-world applications? This workshop explores the world of AI agents - systems that can observe, decide, and act autonomously.

Prerequisites
- Understanding of LLMs and RAG
- Basic Python programming knowledge
- Previous workshops in the series recommended

What You’ll Learn
Prompting Strategies:
- Chain of Thought
- Tree of Thoughts
- Implicit Chain of Thought
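Chain-of-thought prompting, one of the strategies the workshop covers, needs no agent framework to illustrate: the prompt simply asks the model to write out its reasoning before answering. A minimal sketch (the model call itself is left abstract, since the workshop's exact stack isn't specified here; `build_cot_prompt` and `extract_answer` are illustrative helpers, not names from the workshop):

```python
def build_cot_prompt(question: str) -> str:
    """Wrap a question in a chain-of-thought style instruction.

    The model is nudged to emit intermediate reasoning steps
    before committing to a final answer.
    """
    return (
        "Answer the following question. Think step by step, "
        "writing out your reasoning, then give the final answer "
        "on a line starting with 'Answer:'.\n\n"
        f"Question: {question}"
    )

def extract_answer(model_output: str) -> str:
    """Pull the final answer line out of a chain-of-thought response."""
    for line in model_output.splitlines():
        if line.startswith("Answer:"):
            return line[len("Answer:"):].strip()
    return model_output.strip()  # fall back to the raw output
```

Tree of Thoughts generalises this by branching over several candidate reasoning paths and scoring them, but the prompt-side idea is the same.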

Retrieval Augmented Generation - Beyond Basic Prompting

October 18, 2024 · 130 words · One minute · events

Retrieval Augmented Generation: Making LLMs Context-Aware

Ever wondered how ChatGPT plugins work? Or how companies use LLMs with their private data? This workshop dives into Retrieval Augmented Generation (RAG), the technique powering context-aware AI applications.

Prerequisites
- Basic understanding of LLMs and transformers
- Familiarity with embeddings and vector databases
- Basic Python programming knowledge
- Previous workshops in the series recommended but not required

What You’ll Learn
Core RAG Components:
- Document processing and chunking strategies
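As a taste of the document-processing step mentioned above, here is a fixed-size character chunker with overlap, a common baseline strategy; the sizes are arbitrary illustrative defaults, not values from the workshop:

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows.

    Overlap keeps sentences that straddle a chunk boundary
    retrievable from at least one chunk.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks
```

Real pipelines usually chunk on token or sentence boundaries rather than raw characters, but the size/overlap trade-off is the same.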

Advanced Transformer Architectures - From Text to Multimodal

September 13, 2024 · 185 words · One minute · events

Advanced Transformer Architectures: Masked Attention, Encoder-Decoder, and Beyond

Ever wondered how DALL-E understands both images and text? Or how GPT models can predict the next word while BERT understands context in both directions? This workshop dives deep into the variants of transformer architectures that power today’s most advanced AI systems.

Prerequisites
- Understanding of basic transformer architecture and self-attention
- Previous workshops on embeddings and basic transformers (recommended)
- Basic Python knowledge and familiarity with deep learning concepts

What You’ll Learn
- Different Attention Patterns & Their Uses:
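The masked (causal) attention pattern in the title is what separates GPT-style decoders from BERT-style encoders: a decoder position may only attend to itself and earlier positions, while an encoder attends in both directions. A minimal sketch of the causal mask itself:

```python
def causal_mask(seq_len: int) -> list[list[bool]]:
    """Lower-triangular attention mask: True where attention is allowed.

    Row i is the query position; column j is the key position.
    GPT-style decoders allow j <= i, so the future stays hidden;
    a bidirectional (BERT-style) encoder would use all-True.
    """
    return [[j <= i for j in range(seq_len)] for i in range(seq_len)]
```

In practice the mask is applied by setting disallowed attention scores to a large negative value before the softmax.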

Transformer Part 1

September 6, 2024 · 163 words · One minute · events

Understanding Transformers: From Self-Attention to Complete Architecture

Ever wondered how ChatGPT can understand context across paragraphs? Or how language models can maintain coherence in long conversations? The secret lies in the transformer architecture and its groundbreaking self-attention mechanism - and this workshop will demystify it all.

Prerequisites
- Basic understanding of neural networks
- Familiarity with Python and basic matrix operations
- Previous workshop on embeddings and tokenization (recommended but not required)

What You’ll Learn
- The core ideas behind self-attention and why it revolutionized NLP
- How transformers process sequences in parallel, unlike traditional RNNs
- The complete transformer architecture, from embeddings to output
- Practical intuition behind key components:
  - Multi-head attention
  - Positional encodings
  - Feed-forward networks
  - Layer normalization

By Workshop’s End
You’ll gain the ability to:
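The self-attention mechanism at the heart of this workshop is compact enough to write out directly: scores are scaled dot products of queries against keys, softmax-normalised into weights, then used to mix the value vectors. Because each query is processed independently, the whole sequence can be handled in parallel, unlike an RNN. A dependency-free sketch (real implementations use learned Q/K/V projections and batched matrix libraries):

```python
import math

def softmax(xs: list[float]) -> list[float]:
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(Q: list[list[float]],
                   K: list[list[float]],
                   V: list[list[float]]) -> list[list[float]]:
    """Scaled dot-product attention over lists of vectors (one per token).

    Each output is a softmax-weighted average of the value vectors,
    weighted by how similar the query is to every key.
    """
    d_k = len(K[0])
    outputs = []
    for q in Q:
        # Similarity of this query with every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Weighted mix of the value vectors.
        out = [sum(w * v[i] for w, v in zip(weights, V))
               for i in range(len(V[0]))]
        outputs.append(out)
    return outputs
```

Multi-head attention simply runs several of these in parallel on lower-dimensional projections and concatenates the results.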

Embeddings and Tokenisation

August 30, 2024 · 155 words · One minute · events

Demystify AI’s Text Understanding: A Hands-on Journey into Embeddings & Tokenization

Ever wondered how ChatGPT turns your words into meaningful responses? Or how Spotify knows which songs you’ll love? The secret lies in embeddings and tokenization - and this workshop will show you exactly how they work.

Prerequisites
- Basic Python programming knowledge
- Familiarity with simple data structures (lists, dictionaries)
- No advanced math required - we’ll build intuition first!

What You’ll Learn
- Turn text into numbers that AI models can understand
- Visualize word embeddings and understand semantic relationships
- Learn how tokenization is done, and get a brief primer on positional embeddings
- Understand the math behind tokenization and embeddings
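To preview the "text into numbers" idea: a toy whitespace tokenizer with a vocabulary lookup, mapping each token id to a row of a small embedding table. Everything here is illustrative; real systems use subword tokenizers (e.g. BPE) and embeddings learned during training:

```python
def build_vocab(corpus: list[str]) -> dict[str, int]:
    """Assign an integer id to every distinct lowercase token."""
    vocab: dict[str, int] = {}
    for sentence in corpus:
        for token in sentence.lower().split():
            if token not in vocab:
                vocab[token] = len(vocab)
    return vocab

def tokenize(text: str, vocab: dict[str, int]) -> list[int]:
    """Map words to ids; unknown words fall back to id -1."""
    return [vocab.get(tok, -1) for tok in text.lower().split()]

def embed(ids: list[int], table: list[list[float]]) -> list[list[float]]:
    """Look up each token id in a (toy, fixed) embedding table."""
    return [table[i] for i in ids]
```

Once words become vectors, semantic similarity reduces to geometry: nearby vectors mean related words, which is what the visualisation part of the workshop explores.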

Introduction to Neural Networks

August 16, 2024 · 116 words · One minute · events

Hello 💻

AI Coding Session: Hands-On AI Development
📆 When?: 22 August, 6pm to 8pm
📍 Where?: COM4 SR32

Time to get your hands dirty with some real AI coding! Join us for a self-paced coding session where you’ll:

🚀 Dive into Practical AI – Write code to train and optimise your own AI models.
🧠 Self-Exploration – Learn at your own pace, with guidance available when you need it.

Whether you’re a coding novice or a seasoned pro, this session is designed to help you explore the practical side of AI in a supportive, hands-on environment.

Welcome Tea

August 10, 2024 · 55 words · One minute · events

Hello World

We are officially starting NUS AI Society as a SoC-recognized club this semester. We are hosting our welcome tea on Wednesday, where we will announce our events and plans for this semester! You can sign up over here! We will post the venue and time on our Telegram channel, so stay tuned :)

Training Large Language Models - From Pretraining to Efficient Finetuning

January 1, 1 · 189 words · One minute · events

Training Large Language Models: Efficient Methods from Pretraining to Deployment

Ever wondered how ChatGPT was trained? Or how companies can adapt massive language models on consumer hardware? This workshop dives into the cutting-edge techniques that make modern AI training efficient and accessible.

Prerequisites
- Understanding of transformer architecture
- Basic knowledge of deep learning training concepts
- Familiarity with PyTorch is helpful but not required

What You’ll Learn
The Complete Training Pipeline:
- Pretraining objectives and strategies
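The core pretraining objective previewed above is next-token prediction: the model assigns a probability to every vocabulary item, and the loss is the negative log probability of the token that actually came next. A hand-rolled sketch with toy logits (frameworks like PyTorch compute this as batched cross-entropy):

```python
import math

def softmax(logits: list[float]) -> list[float]:
    """Turn unnormalised scores into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def next_token_loss(logits: list[float], target: int) -> float:
    """Cross-entropy for one prediction step.

    logits: unnormalised scores over the vocabulary.
    target: index of the token that actually came next.
    """
    probs = softmax(logits)
    return -math.log(probs[target])
```

Minimising this over billions of tokens is pretraining; the efficient finetuning methods in the title adapt the resulting weights with far fewer trainable parameters.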