Hillison was joined by Jeanne Jennings, CEO of Email Optimization Shop; Neil Jennings, attorney and founder of GLF Strategic Compliance; and Chris Black, a leader in fractional marketing operations. Together they unpacked privacy-first personalization, dynamic consent management, and how to balance performance with responsibility as the rules continue to evolve.
The conversation started with a live poll that set the tone for the session. When attendees were asked which challenge kept them up at night – AI consent compliance, changing regulations, or performance versus privacy – the majority chose performance versus privacy. The tension between delivering hyper-relevant, AI-driven experiences and maintaining ethical, compliant marketing practices is clearly top of mind.
Keeping humans in the loop
The panel’s first challenge was how to preserve human judgment when algorithms make more of the calls. Jeanne Jennings warned that AI is a tool, not a value system. “It’s not a substitute for good marketing acumen,” she said. Her example: an email program where executives received irrelevant content, even though the vendor insisted the algorithm had to be correct. “If someone says the AI is right and the humans are wrong, that breaks trust,” she explained. The solution, she said, lies in regular human supervision: quality control, spot checks and old-fashioned gut checks.
Neil Jennings approached it from a governance lens and agreed that this is as much about culture as it is about code. “The algorithm is not always right,” he said. “We need QA processes and governance to identify biases and ensure accountability.” He noted that manipulation is not one neat legal concept; it includes user interfaces, algorithmic bias, and systemic transparency.
Trust: feeling or metric?
When asked whether trust should be treated as an emotion or a measurable outcome, panelists agreed that it should be both. Neil Jennings described it as “a consumer expectation and a legal outcome,” and pointed to the rising costs of things going wrong – from the fallout from Facebook’s Cambridge Analytica to the FTC’s fines against brands like Experian and Honda. Compliance is measurable in fines and lawsuits; trust, he argued, is measurable in brand value and customer retention. Jeanne Jennings described compliance as “the floor,” and trust as “the ceiling.” Her experience in email marketing has shown that exceeding legal minimum standards is worth it in terms of deliverability and loyalty.
“You know when you’ve lost confidence,” she said, “because the numbers tell you.”
Black expanded on this view, pointing out that trust now lives throughout the organization, not just in marketing.
“Prospects don’t just see our website and emails anymore,” he said. “They see Glassdoor reviews, G2 ratings, screenshots and social sentiment – all made visible in real time by AI.” That transparency, he argued, requires a company-wide commitment to consistency between what a brand says and how it behaves.
Making consent a living system
The discussion focused on operationalizing consent as a dynamic, real-time process.
“We’ve had the checkbox for years,” Black said. “Now consent is a live signal.”
He explained that marketing stacks often interpret consent differently: one system will stop all emails, another will stop only marketing messages, while ads continue to follow the user online. The outcome, he said, is a compliance paradox: technically legal, but trust-negative.
To solve this, Black pushed for a cross-functional alignment of what each consent state actually means and how it triggers or suppresses actions across systems. He described new dashboards that help visualize trust by unifying data from CRM, automation and customer success platforms. With current APIs and server integrations, he said, “we can finally see where consent fails.”
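The cross-functional alignment Black describes can be made concrete as a single shared mapping from each consent state to what every channel is allowed to do. A minimal sketch, assuming hypothetical state and channel names (none of these identifiers come from the panel):

```python
from enum import Enum


class Consent(Enum):
    """Illustrative unified consent states shared by all systems."""
    OPTED_IN = "opted_in"
    MARKETING_OPT_OUT = "marketing_opt_out"  # stop promotions, keep transactional mail
    FULL_OPT_OUT = "full_opt_out"            # suppress every channel

# One agreed-upon policy table, so the ESP, CRM, and ad platform
# cannot each interpret the same consent state differently.
CHANNEL_POLICY = {
    Consent.OPTED_IN: {
        "email_marketing": True, "email_transactional": True, "retargeting_ads": True,
    },
    Consent.MARKETING_OPT_OUT: {
        "email_marketing": False, "email_transactional": True, "retargeting_ads": False,
    },
    Consent.FULL_OPT_OUT: {
        "email_marketing": False, "email_transactional": False, "retargeting_ads": False,
    },
}


def allowed(state: Consent, channel: str) -> bool:
    """Return whether a channel may contact the user under this consent state.

    Unknown channels default to suppressed, so a newly added system
    fails closed rather than leaking messages.
    """
    return CHANNEL_POLICY[state].get(channel, False)
```

With a table like this, `allowed(Consent.MARKETING_OPT_OUT, "retargeting_ads")` returns `False`, closing the gap Black calls out where a user opts out of email yet ads keep following them around the web.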
Neil Jennings added a governance perspective and identified four essentials for real-time compliance: technical reliability, transparency, autonomy and necessity. “If you can answer ‘yes’ to all four questions,” he said, “you’re building trust the right way.” Jeanne Jennings echoed this sentiment, emphasizing that companies should use AI to scale, not replace, governance.
Balancing speed and responsibility
A recurring theme was that governance does not slow down progress, but rather makes it possible. Neil Jennings argued that leaders need to understand their organization’s risk appetite and avoid “AI FOMO.” Many companies, he said, treat fines as a cost of doing business, but that approach is unsustainable as fines escalate. The goal is to distinguish what AI can do from what it should do. Jeanne Jennings added that over-delegating authority to AI erodes accountability: “You can’t outsource responsibility.”
The conversation revealed that compliance is no longer a checkbox, but a competitive advantage. “That’s how you build trust,” Hillison summarized. “If teams can act quickly without cutting corners, you build a bond of trust.”
Beyond compliance: trust as brand value
When asked how brands can go beyond compliance, Black offered an apt metaphor: “Trust is not the ceiling – it is the moat.”
He cited Apple as an example of how responsible data practices build loyalty even as innovation slows. “That trust buys time, forgiveness and brand value,” he said. Jeanne Jennings agreed, adding that marketers should ask themselves, “Is it smart to do this?” rather than simply, “Is it legal?” Her example: a B2B campaign that openly asked for industry information, rather than obscuring the request, illustrated how transparency can strengthen relationships rather than jeopardize them.
Measuring trust in practice
In the audience Q&A, participants asked how they could quantify something as intangible as trust. Neil Jennings simplified it: “Compare what you told people you would do with what you actually did. Narrow that gap.”
Jeanne Jennings added a more personal guideline: “Treat your customers as you would want your family to be treated.”
Another question raised whether AI note-taking tools were safe for board meetings. Both Black and Neil Jennings warned of the discoverability risks and urged companies to establish clear governance policies before deploying them in sensitive contexts.
The way forward
Finally, the panel predicted that trust will soon be both more measurable and more human.
Black foresaw it being tracked in CRMs as a performance measure, while Jeanne Jennings foresaw a widening divide between brands using AI as a channel for human connection and brands that delegate responsibility to it. Neil Jennings predicted that there will be more scrutiny of AI vendors and backend algorithms as regulators evolve beyond superficial UI issues.
Ultimately, the group agreed that the future of marketing is not about eliminating risk, but about earning trust at the speed of change. The brands that win won’t be the ones that merely move fastest, but the ones that move fast with integrity.
Contributing authors are invited to create content for MarTech and are chosen for their expertise and contribution to the martech community. Our contributors work under the supervision of the editors and contributions are checked for quality and relevance to our readers. MarTech is owned by Semrush. The contributor was not asked to make any direct or indirect mention of Semrush. The opinions they express are their own.


