
August 21

Title: Keeping Pace: Micro-Models, Large Language Models, and the Co-Evolution of Chip and Data Center Design in a Rapidly Shifting AI Landscape


The field of Artificial Intelligence is experiencing a period of exceptional dynamism, with significant advancements arriving on a cycle of roughly four to six months. This keynote explores the interplay between two distinct yet increasingly intertwined aspects of this progress: micro-models and large language models (LLMs), and their corresponding influence on chip and data center design.

Micro-models, with their focus on efficiency and a reduced computational footprint, offer exciting possibilities for on-device applications and wider accessibility of AI. At the other end of the scale, LLMs continue to push the boundaries of capability, demanding ever-increasing processing power and optimized data center infrastructure. This talk delves into the challenges and opportunities presented by this co-evolution, analyzing how chip design and data center architecture must adapt to accommodate the diverse needs of both micro-models and LLMs in a rapidly shifting AI landscape.

Speaker Biography:

Mark Baciak has held various strategic roles across many industries, including Financial Services, Life Sciences, Insurance, Software, Travel, Defense, and Federal. He pioneered many technologies still present in these industries today, primarily in distributed computing, cloud computing, and AI/ML. Mark has a proven track record of delivering complex projects on time and on budget globally. He is also recognized as the primary inventor or co-inventor of many technologies, including Service Oriented Architecture (SOA), Claims Based Authentication and Authorization, Service Oriented Infrastructure (SOI), Service Level Expectations (SLEs), Capability Dynamics, Internet Scale Computing, Cloud Computing, Cloud Data Center Design, Data Center In-A-Box, CloudCRM, and Decentralized Manufacturing Domains (DEMAND), among other inventions.

Date: TBD