What actually happens when AI stops being an exclusively cloud experiment and starts running on desks, in labs, and in real teams trying to do real work?
In this episode, I sit down with Logan Lawler, Senior Director at Dell Technologies, to explore how AI workloads are actually being built and supported on-premises today. Logan leads Dell's Precision and Pro Max AI Solutions business and hosts Dell's own Reshaping Workflows podcast, giving him rare insight into how engineers, developers, creatives, and data teams actually work, not how marketing slides suggest they should.
We start by cutting through the noise around AI PCs. Setting aside the conference-stage hype, Logan explains what really matters when choosing hardware for AI work. CPUs, GPUs, NPUs, memory, and software stacks all play different roles, and misunderstanding those roles often leads teams to spend too much or too little. Logan explains why all AI workstations qualify as AI PCs, but not all AI PCs are suitable for serious AI work, and why GPUs remain central for anyone doing real model development, refinement, or inference at scale.
From there, the conversation shifts to a broader architectural rethink. As AI workloads grow heavier and data sensitivity increases, many organizations are reconsidering where compute should reside. Logan shares how GPU-powered Dell workstations, storage-rich environments, and hybrid cloud setups give teams more control over performance, costs, and data. We explore why local computing is becoming attractive again, how modern GPUs now compete with small server setups, and why hybrid workflows (local for development, cloud for deployment) are becoming the standard rather than the exception.
One of the most compelling parts of the discussion comes when Logan connects hardware choices to business realities. Using real-world examples, he explains how teams are using local AI environments to work faster, reduce cloud costs, and avoid getting stuck in architectures that are difficult to dismantle later. This isn’t about abandoning the cloud, but about being intentional from the start, especially as the use of AI spreads beyond developers into marketing, operations, and day-to-day business roles.
We also take a step back to think about a deeper challenge. What happens to critical thinking, curiosity and learning as AI becomes easier to use? Logan shares a candid perspective shaped by his experiences as a parent, technologist, and podcast host, raising questions about how tools should support rather than replace thinking.
If you’re trying to understand AI PCs, on-premises versus cloud computing, or how teams are truly reshaping workflows with AI hardware today, this talk provides informed insight from someone at the center of it all. Are we designing systems that truly enable people to think better and build faster, or are we sleepwalking toward decisions we will later regret? How do you want your own AI workflow to evolve?
Useful links
Subscribe to the Tech Talks Daily podcast
#AIPCs #DellTechnologies #LoganLawler


