Folding Data #5

The best way to fix data quality is to not break data in the first place

[Image: Gleb on the Data Engineering Podcast]

I got a chance to chat with Tobias Macey from the Data Engineering Podcast about proactive data quality management and the hard lessons I learned as a Data Engineer at Autodesk, Lyft, and Phantom Auto.

We spoke about proactive vs. reactive approaches to data quality and why improving the change management process is the easiest way to make data more reliable.

🔊 Listen to the Data Engineering Podcast 🔊

An Interesting Read

There’s nothing quite like reading an article that resonates on way too many levels! This Reforge article from Crystal Widjaja is relatable, helpful, and gets to the core of why so many companies fail to properly implement analytics. My favorite part was the signals-of-success sections - seeing what bad, good, and great signals indicate makes the post feel actionable and makes it easy to spot where improvements could be made.

Why most analytics efforts fail - A step by step process to fix the root causes of most event analytics mistakes

Tool of the Week: Lightdash

Around 2003, Tableau revolutionized BI when it brought its famous drag-and-drop interface that let you create beautiful charts in a few clicks. Looker stormed in 10 years later, enabling self-serve data exploration on top of massive, messy datasets by adding a modeling layer – LookML.

Now we see new disruption coming, this time in line with the open-core trend. Lightdash is an impressive open-source alternative to Looker for a range of reasons. While the tool is still in its early days, Lightdash's LookML-like modeling layer and tight integration with dbt make me believe that they are onto something big! (More on that in an upcoming blog post)

⚡Check out Lightdash on GitHub 👀

Data Quality Best Practices for Data ROI

You only achieve ROI on your data when the whole team believes what the data is saying, and that requires trusting its quality. But data quality isn’t just a project or a destination, it’s a journey. It often requires a range of processes and tools, but fundamentally it’s about building a culture around data and data quality. Here are four best practices to serve as a blueprint for getting to data ROI.

Tell Me the 4 Best Practices ✅

Before You Go

Keep up with the latest from Datafold - follow us on Twitter and LinkedIn!

As always, here is your meme reward for making it to this point in the newsletter. We were fooling around with Ryan's tweet.

[Image: meme]

Meanwhile, Julia Computing (the company behind the Julia language) raised a $24M Series A. Well, big data – big money! 💰

[Image: Wall Street meme]

Love this newsletter? Forward it to a friend! 🚀
Received this newsletter from a friend? Subscribe for yourself! ✨

Get Started

To integrate Datafold seamlessly with your data stack, we need a quick onboarding call to get everything configured properly.

Want to see Datafold in action?

Find out how Datafold can help your team deliver better data products, faster.