Open Source Hong Kong is pleased to be a co-organizer of the vLLM Hong Kong Meetup!
The global vLLM Meetup is coming to Hong Kong! We're bringing together vLLM core contributors and users from Hong Kong, Greater China, and around the world to share what's next for LLM inference with vLLM, an open-source LLM inference and serving engine with over 60,000 stars on GitHub!
Join us to dive into the fundamentals of vLLM, get hands-on experience, learn proven techniques for optimizing LLM performance, deployment cost, and reliability, and connect in person with a vibrant community of vLLM contributors, developers, and users.
Event objectives:
- Discover vLLM and the current landscape of LLM inference
- Hear directly from vLLM core contributors about the latest vLLM developments and updates
- Learn how vLLM integrates with AI hardware accelerators and state-of-the-art AI models
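New to vLLM? Below is a minimal sketch of offline batch inference with vLLM's Python API, just to give a flavor of what the hands-on sessions build on. The model ID is a placeholder; any model supported by vLLM can be substituted.

    from vllm import LLM, SamplingParams

    # Placeholder model ID; substitute any model supported by vLLM.
    llm = LLM(model="Qwen/Qwen2.5-0.5B-Instruct")

    # Sampling settings for generation.
    sampling_params = SamplingParams(temperature=0.7, max_tokens=64)

    # Generate completions for a batch of prompts.
    outputs = llm.generate(["What is vLLM?"], sampling_params)

    for output in outputs:
        print(output.outputs[0].text)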
Date: 07-Mar-2026 (Sat)
Time: 10:00am – 5:30pm (HKT)
Venue: The Hong Kong Polytechnic University
Language: English (Primary) / Mandarin
Event Initiator: Red Hat
Event Co-organizers: Red Hat, vLLM, EmbeddedLLM, PolyU Department of Computing (COMP), AMD, MetaX, Python User Group/OSHK, MiniMax
How to Join?
Please register at https://www.vantagemind.com/events/vLLM/260307/vLLM-HK-Meetup_vLLM.html
Agenda
https://www.vantagemind.com/events/vLLM/260307/vLLM-HK-Meetup_Agenda.html