Juan Daza
Data & AI Architect · Father · Cloud Native Platforms
Data Architect with 10+ years of experience leading enterprise data and AI strategy. I design Lakehouse architectures, govern data platforms at scale, and build the ML/AI infrastructure that turns raw data into competitive advantage.
I work at the intersection of strategy and hands-on delivery: engaging C-suite stakeholders to align data programs with business goals, then leading the technical teams that build them. I care about systems that are well-governed, reproducible, and built to last.
More about me
Now
At AWS ProServe, I own the architecture and delivery of enterprise data platforms across the ANZ region, working backwards from customer outcomes to design Lakehouse solutions, governance frameworks, and end-to-end analytics pipelines that scale. Currently diving deep into data contracts and semantic discoverability to accelerate time-to-insight for large organisations.
Updated February 2026
Experience
- Designed and delivered an enterprise federated Lakehouse with end-to-end data governance using cloud-native compute (Spark, Trino, Redshift).
- Architected event-driven data ingestion pipelines on Azure using Databricks and Delta Lake, enabling reliable high-throughput data flow across enterprise domains.
- Built production ML deployment and validation tooling, standardising how models were released and tested across the organisation.
- Developed a large-scale unsupervised fraud detection model using a custom PySpark framework, surfacing patterns not detectable with standard tooling.
- Led multi-year data transformation programs across financial services and mining, engaging C-suite stakeholders to align data strategy with measurable business outcomes.
Explore
Find me
- GitHub — code, projects, contributions
- LinkedIn — professional background
- dazajuandaniel@gmail.com — say hello