DataTune 2026: Three Conversations I Kept Having With Data Leaders
Last weekend I had the pleasure of speaking at the DataTune Conference at Belmont University in Nashville. As much as I enjoy presenting, my favorite parts of events like this are the conversations that happen outside the presentation room. Throughout the conference we had dozens of discussions with engineers, analytics leaders, and executives, all wrestling with the same challenge: how to deploy data and AI systems that actually work in production. While the specifics varied from company to company, three themes kept coming up again and again.

Don’t Call It a Comeback: SQL Is Still the Backbone of Modern Data Platforms
One of the most interesting trends I saw at DataTune was how strongly SQL-based tools have reasserted themselves as the standard bearer for semantic layers in modern data lakehouses.
Open-source tools are leading the charge across the stack. Engines like Trino (formerly PrestoSQL), which underpins services such as Amazon Athena, are becoming the backbone of many analytics workloads. DuckDB is gaining traction for fast in-process analytics, and transformation frameworks like dbt are making structured data pipelines easier to manage and collaborate on.
For a lot of teams, what used to feel like “boring old SQL” suddenly feels new again. The ecosystem around it has matured to the point where organizations can build incredibly powerful analytics platforms while still leaning on a language that most data professionals already understand.
The takeaway for many organizations is simple: modern data architecture doesn’t always require reinventing everything. Often the smartest move is building on open, proven foundations and layering modern tooling on top.
Nobody Loves Their BI
Another conversation that came up repeatedly at the conference was about the BI layer, and more specifically, how difficult it can be to satisfy everyone who depends on it.
Power BI was probably the most discussed tool at the conference, but not always in glowing terms. What we heard over and over is that the reporting layer is where tension shows up inside modern organizations. Data teams want strong governance and clean semantic layers. Product teams want flexibility and speed. Executives want immediate answers to business questions. Balancing all three is hard.
The next generation of BI tools is starting to recognize that challenge. The best platforms are combining strong traditional analytics tooling with safe natural language interfaces that allow business leaders to explore data without constantly relying on engineering teams.
This is exactly the balance we’ve been trying to strike with platforms like Quick Suite, where structured data models and AI-powered interfaces work together. When done well, tools like this allow analysts, engineers, and executives to all interact with the same system in ways that fit their needs.

The AI Debate Is Over. It’s About Speed.
A few years ago, conferences like DataTune were dominated by debates about whether AI would actually change the way data teams work. That debate seems largely over. What I heard from leaders across the space this year is that AI is already reshaping how systems are built. The real question now is how quickly organizations can adapt their development processes to take advantage of it.
Spec-driven development tools and agentic coding systems are dramatically accelerating how new features are built. Tools like Claude Code, which was mentioned frequently throughout the conference, are enabling teams to generate and iterate on code faster than ever.
In my presentation we demonstrated how Amazon Kiro, backed by modern models like Claude Opus, can design and deploy full-stack data workloads in a fraction of the time traditional workflows require. In many cases, what used to take weeks of engineering work can now be prototyped in hours.
The organizations that benefit most from this shift aren’t necessarily the ones experimenting the most with AI. They’re the ones that build systems and processes that allow engineers to move quickly while still maintaining strong architecture and security practices.
Turning Ideas Into Systems
Many organizations have great ideas about what they want to do with AI and data, but they struggle to get from idea to something tangible. That’s why we’ve been focusing on ways to help teams move from concept to working system quickly. Programs like our Red Oak Strategic 5x5 Quick Suite AI Insights Accelerator are designed to give organizations a low-risk entry point to connect real data sources, deploy an AI-enabled analytics environment, and start testing real workflows in just a few days.
Final Thoughts
Events like DataTune are a great reminder that while the tools and technologies in the data world evolve quickly, the core challenges remain consistent. Teams are still trying to build reliable data platforms, deliver insights to decision-makers, and adopt new technologies in ways that actually improve the business. The difference now is that the tools available to do that are more powerful than ever.
If there was one overarching takeaway from the weekend, it's that the future of data and AI isn't about choosing the newest technology; it's about building systems that let teams move faster, experiment responsibly, and deliver value sooner at a reasonable cost. And based on the conversations we had in Nashville, a lot of organizations are ready to start doing exactly that.
Contact Red Oak Strategic
From cloud migrations to machine learning & AI - maximize your data and analytics capabilities with support from an AWS Advanced Tier consulting partner.