What if the real breakthrough in AI is not the model itself, but the data that provides it with knowledge? In this episode of Tech Talks Daily, I sit down with Edo Liberty, founder and chief scientist of Pinecone, to explore how vector databases have quietly become the backbone of modern AI infrastructure.
We explore why Retrieval-Augmented Generation (RAG) works so effectively out of the box, and why fine-tuning large models often adds complexity without real value. Edo explains how Pinecone’s research has shown that different models – from OpenAI to Anthropic – require differently structured contexts to perform well, a discovery that changes how companies think about AI implementation.
As a former director of research at Yahoo and AWS, Edo offers an informed perspective on where the real innovation is happening. He explains how the shift from traditional data structures to vector representations is redefining the way machines know and retrieve information, creating smarter, context-aware systems.
We also discuss his recent move into the Chief Scientist role, his enthusiasm for returning to hands-on research, and why he believes the convergence of AI and data represents the defining technological shift of our lifetime.
What does it mean for developers, business leaders, and anyone building with AI when knowledge becomes an accessible infrastructure layer? Can we build systems that truly ‘know’ the way humans do?
Join the conversation, and after listening I’d love to hear your thoughts: do you think the future of AI lies in the models, or in the data that feeds them?
Useful links
Subscribe to the Tech Talks Daily podcast


