Mobile devices are driving a revolution in how computing is accessible to all. Users can now carry devices in their pockets that grant unfettered access to and facile manipulation of their data, redefining how people work and relate to each other. The blunt instrument of Moore's law performance scaling has certainly contributed to this trend. However, the demands of graceful industrial design (thermals, PCB area), friction-free user interfaces (smooth performance), and long-term wire-free usage (battery life and unconstrained wireless connectivity) have forced architects of "System-on-Chip" (SoC) designs to take more nuanced approaches to utilizing increasing transistor density. This includes the integration of many purpose-optimized logic blocks, programmable and non-programmable, which, when properly orchestrated with other (traditional) SoC components like CPUs and GPUs, deliver low-power implementations of common use-cases for these devices.
The challenge for Operating System vendors and developers is how to enable this orchestration efficiently for the burgeoning market of mobile and tablet-optimized applications (which is nearing millions of applications across all platforms). For optimal power at best quality, it is not unusual for the app developer to either implicitly or explicitly program upwards of a dozen pieces of chip IP, each with its own programming model and performance/power characteristics. How are the OS vendors enabling this vibrant and growing applications ecosystem in the presence of such daunting programming problems?
This tutorial will review the current landscape of mobile SoCs and their future direction (a hint: we’re integrating more functionality than ever on a single die). We will describe how the complexity of developing for these SoCs is tamed by today’s APIs and what challenges remain. We will also discuss how future use-cases will demand additional capabilities be enabled, impacting everything from chip design to OS to high-level APIs.
- 9:30am – 10:40am: Mobile SoCs: Connecting Hardware to Apps by Neil Trevett from Khronos
- 10:40am – 11:10am: Break
- 11:10am – 11:25am: Camera and Video by Sean Mao of ArcSoft
- 11:25am – 11:40am: Vision and Gesture Processing by Itay Katz of Eyesight
- 11:40am – 11:55am: Augmented Reality by Ben Blachnitzky of Metaio
- 11:55am – 12:10pm: Sensor Fusion by Jim Steele of Sensor Platforms
- 12:10pm – 12:25pm: 3D Gaming by Daniel Wexler of the11ers
- 12:25pm – 1:00pm: Panel moderated by Neil featuring all the speakers