Feb 17

Putting Data To Work | Matthew Brandt, Data Practitioner and Creator

The Data Beats show brings together data and GTM practitioners, industry experts, and vendors to demystify the data landscape, in the hope of beating the gap between data and non-data teams.

What good is data if go-to-market (GTM) teams are unable to put it to work, that is, to operationalize data in their day-to-day workflows?

Operationalizing data means going beyond deriving insights: it means leveraging data to improve everyday operations, from delivering better content and communication to identifying customers likely to churn in the near future and taking action to prevent it.

In this super insightful episode, Matthew Brandt, an expert data practitioner and ace content creator, shares his experience and insights on putting data to work inside modern organizations.

In his past roles, Matthew has implemented several internal data products (and given them cool names) that are only now being productized and made available as plug-and-play solutions.

Don't miss the conversation, especially if you wish to operationalize data without a fancy 4-layer data stack!

Listen now on Apple, Spotify, Google, or YouTube.

Key takeaways from this conversation:

Q. What are the prerequisites to operationalize data?

Matthew (02:08):

If you don't understand the problems you aim to solve with data, you're just going to come in guns slinging, and you'll just be shooting wild, and you won't actually hit your target.

Q. Is it non-negotiable to have a data warehouse if you want to operationalize your data?

Matthew (03:26):

In the long term, yes. But for companies just starting out, it's certainly feasible to say, oh, I'm going to take my Salesforce data and I'm going to push it into Intercom. There's nothing wrong with that because you're helping solve a day-to-day issue that the business has. No fancy analytics, no fancy machine learning, no four-layer data stack involved, just one sync between two different tools.

There’s certainly nothing wrong with a point-to-point integration like the one Matthew describes above.

When I led growth at Integromat (Make), we were growing very fast (adding 1k users every day) but we didn’t have a stack of data tools — we didn’t even have a data team. So I built many point-to-point integrations and it got the job done just fine.
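A point-to-point sync like the ones described above can be a very small script. Here's a minimal sketch: the endpoints, field names, and auth are all hypothetical stand-ins, not the actual Salesforce or Intercom APIs, which you'd look up in the vendors' docs.

```python
import json
from urllib import request

# Hypothetical endpoints; a real sync would use the vendors' documented APIs
# and proper authentication.
DEST_URL = "https://dest.example.com/api/contacts"

def map_contact(record: dict) -> dict:
    """Map a source CRM record to the shape the destination tool expects.

    The source field names (Email, Name, Plan__c) are illustrative only.
    """
    return {
        "email": record["Email"],
        "name": record.get("Name", ""),
        "plan": record.get("Plan__c", "unknown"),
    }

def sync(records: list) -> None:
    """Push each mapped record to the destination, one HTTP call per record."""
    for record in records:
        payload = json.dumps(map_contact(record)).encode()
        req = request.Request(
            DEST_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        request.urlopen(req)
```

That's the whole "stack": one mapping function and one loop. No warehouse, no transformation layer, just a sync between two tools.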

In the early stages, you can either optimize for efficient growth or slow things down in favor of a future state where data is pristine but might never get used. Here's a fascinating read on the current state of big data, which notes that a big chunk of the data that actually gets used (queried) is less than 24 hours old.

Q. How can operationalizing data help sales teams in their day-to-day?

Matthew (05:37):

We had a process that fired off an alert into a Slack channel when a customer was actively on the billing page and looking at trying to change their subscription. And sales would proactively reach out to them and ask them how they're doing and if they wanted to adjust their subscription. And customers basically just felt, oh, that's crazy timing.

Matthew is referring to a project titled Slow Turtle that he implemented four years ago using Google Tag Manager to capture events and fire Slack alerts using webhooks.

This has since been productized by Product-led Sales tools, which suggests that many SaaS tools on the market were first built as internal tools inside organizations. I find it fascinating when I hear about internal tools like Slow Turtle.
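The Slow Turtle pattern (capture an event, fire a Slack alert via a webhook) fits in a few lines. Here's a hedged sketch, assuming a page-view event arrives as a dict and a Slack incoming-webhook URL is configured; the event field names are assumptions, not Matthew's actual schema.

```python
import json
from typing import Optional
from urllib import request

# Placeholder: a real setup would store the Slack incoming-webhook URL securely.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/PLACEHOLDER"

def build_billing_alert(event: dict) -> Optional[dict]:
    """Turn a page-view event into a Slack alert payload.

    Only billing-page views trigger an alert; other events are ignored.
    Field names ("page", "user_email") are hypothetical.
    """
    if event.get("page") != "/billing":
        return None
    user = event.get("user_email", "an unknown user")
    return {"text": f":turtle: {user} is on the billing page right now. "
                    "Might be a good moment to reach out."}

def send_alert(payload: dict, webhook_url: str = SLACK_WEBHOOK_URL) -> None:
    """POST the payload to the Slack incoming webhook."""
    req = request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)
```

In Matthew's version the events came from Google Tag Manager; here the capture side is left abstract, since any event source that can call a function (or hit an HTTP endpoint) would do.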

Q. How can growth teams go beyond consuming insights and use data to engage and activate users and customers?

Matthew (08:19):

You need to obviously activate the lead that you're bringing in and part of that activation means understanding what the needs of the specific users are.

One of the things that I've worked on in the past is using demographic data that we get from other sources, and tying that into the product usage data. So we could see, for example, this lead comes in, they're probably between 30 and 40, they're from this location, they're probably working in this type of industry, if we can pull any kind of social data for them, then we understand their needs a little bit more, and then we can funnel them into different types of messaging.

So we can show them in-app messaging in the product that's slightly different for their use case. If they come from the pharmaceutical industry, they're going to use the product in a different way than when they're coming from the finance industry, right?

100%. Understanding the needs and priorities of different sets of users is the only way to deliver truly personalized experiences — not only in-app but across every audience touchpoint.
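The routing Matthew describes, enriching a lead and then picking a messaging track, boils down to a simple lookup once the data is joined. A minimal sketch, where the field names and track labels are all hypothetical:

```python
# Hypothetical industry-to-messaging mapping; in practice this would be
# maintained by the growth/marketing team, not hardcoded.
MESSAGING_TRACKS = {
    "pharmaceutical": "compliance-first onboarding tips",
    "finance": "audit-trail and reporting walkthrough",
}
DEFAULT_TRACK = "generic getting-started guide"

def pick_messaging_track(lead: dict) -> str:
    """Route an enriched lead to an in-app messaging track.

    `lead` is assumed to carry demographic attributes (e.g. "industry")
    already joined with product usage data upstream.
    """
    industry = (lead.get("industry") or "").lower()
    return MESSAGING_TRACKS.get(industry, DEFAULT_TRACK)
```

The hard part isn't this function; it's the upstream join of demographic and usage data that makes the `industry` field trustworthy in the first place.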

Matthew mentions combining demographic data with product usage data, two subtypes of first-party data. Brands can go a step further by throwing zero-party data into the mix: data that brands collect explicitly from their audiences by asking, rather than inferring it from behavior.

Here’s a post that covers the key differences between first-party and zero-party data.

Q. What's the one piece of advice you have for companies that are early on their data journey?

Matthew (09:37):

Please, if you're at a company that doesn't have an established data function yet, or is just getting started, please focus on the people and the processes. Do not focus on the tool stack. There are tools available for everything. So it isn't a question of, what tool can do the job or if a tool can do the job. It's a question of, what tool can do the job for you. But you can only do that once you've established what the job actually is.

You need to really solidify the processes that you have, and you need to really hire the right people who have the mindset to understand the problem before moving forward on huge contracts with vendors. Because once you've gone down that route, it's not so easy to just flip vendors. That's unfortunately a reality — the lock-in is real.

Connect with Matthew:

Thanks for reading/listening — let’s beat the gap! 🤝
