Episode Overview
AI is not just accelerating paid media—it’s redefining it. From automating campaign creation to predicting budget allocations and dynamically personalizing creative, AI is forcing media teams to rethink how they build, launch, and measure success. But with so many tools promising “optimization,” what actually delivers? And where does human strategy still matter most?
In this episode we'll tackle the biggest questions marketing teams are asking:
How should we be using AI in paid media today? What metrics actually matter in an AI-augmented landscape? And how do we maintain control, clarity, and creativity when machines are making more decisions?
- What's really changing for paid media teams?
- How does AI optimize campaigns—and what control remains for marketers?
- How does AI influence budget allocation?
- What powers predictive budget allocation?
- How do you scale creative with AI while keeping brand voice unique?
- What tools best support creative testing and personalization while protecting brand identity?
- How should marketers measure success as AI changes campaign KPIs and attribution?
- What do paid media teams look like in the AI era?
- How do we ensure governance and accountability as AI speeds up decision-making?
- Final thoughts & key takeaways
Episode Transcript
Reid Carr: Welcome back to the Marketing Remix and another edition of The AI Edit, where we explore how artificial intelligence is reshaping marketing one discipline at a time. Today we delve into paid media, a space where AI has arguably advanced the fastest and most significantly. Platforms promise smarter marketing, seamless optimization, and predictive budget recommendations. But underneath those promises are questions every marketer should be asking: What's really changing? Who's making the decisions? And are we still in control of the strategy? To unpack all of this, I'm joined by two people who know both sides of the story: Emily Engberg, our Director of Paid Media, and Ron Hadler, our VP of Data and MarTech. Together, we'll explore what AI is doing behind the scenes, how it's changing campaign execution, and how smart teams are using it to move faster without losing clarity, creativity, or control. Emily, Ron, great to have you today.
WHAT'S REALLY CHANGING FOR PAID MEDIA TEAMS?
Reid Carr: When we talk about AI and paid media, it's easy to get lost in the buzzwords. Platforms promise smarter marketing, better performance, and fewer manual tasks. But what's changed for the teams doing the work? Today we'll start by cutting through the noise. So Emily, what's real, what's helpful, and what still feels like smoke and mirrors?
Emily Engberg: Fantastic question, Reid. Starting with what's real: first and foremost, time savings. AI has absolutely expedited processes for the team and reduced the amount of manual work the team has to do on a day-to-day basis. Think about the ad platforms we work in regularly, Google Ads and Meta Ads, which are part of nearly every brand's media mix these days. The process of setting up a campaign has changed drastically, and the process of managing a campaign day to day has changed drastically.
Previously, everything was highly manual: identifying audiences, establishing bids, ongoing optimizations. Now there are AI features in virtually every platform that, to a degree, automate some of that, and do so quite effectively. For example, in Google Ads paid search campaigns, your ads appear at the top of a search engine results page, and you as a brand pay when somebody clicks on your ad. That cost per click was previously managed with manual CPC (cost-per-click) bidding. Now we have smart bidding strategies, where AI identifies the auctions it deems most likely to generate a result for you as a brand. So instead of just a cost per click, it's become much more sophisticated. That is something our team could not accomplish manually, and we've seen significant improvements in results there. We are also using AI to identify patterns in data that our team couldn't easily spot at a glance: shifts in audience behaviors, keyword clusters for our search campaigns. All of those are things humans may have missed, but AI is allowing us to tap into them, and that informs optimization opportunities for us going forward.
And then creative iteration is definitely something we've seen AI contribute to. There's still work to be done on that front, but it is very real that we're able to generate far more creative and ad copy at a much faster and more scalable rate than we could previously, which allows us to test at a much faster pace.
What is helpful? I would say the forecasting tools come to mind. Every platform has different AI-driven features that will predict what a given change will mean for your brand. We take that directionally. Every platform, of course, favors itself: Google is going to tell you to spend more; Meta is going to tell you to spend more. That also applies to their attribution modeling. If you look at data in Meta about how many conversions your Facebook ad drove, and then compare that to data we see in other tools like GA4, you'll see wide discrepancies. So take it with a grain of salt, but it's fantastic to have that information to compare. And then what is smoke and mirrors? In my opinion, it's the idea platforms like Facebook and Google are touting that everything will simply be automated: you can set it and forget it, and let AI algorithms control your campaigns and generate the best results. We are absolutely not there yet. There still needs to be a human touch applied to every campaign, to ensure brand safety and to apply that larger business context to campaign performance. Google is great at optimizing toward a singular conversion action, but it's not considering your larger business objective and how Google fits into that broader mix. So it's really imperative that we still apply that human touch.
Reid Carr: Yeah, yeah, it's interesting. I think the motivations of every platform are going to be to have you spend more on that platform, and so you need to have a little governance I would imagine, and that's where the human touch certainly provides that beyond some of the optimization.
Ron Hadler: We're saying the advertising platforms are greedy?
Emily Engberg: Shocking!
Reid Carr: Yeah, right. I think we've learned that lesson over the last many years.
HOW DOES AI OPTIMIZE CAMPAIGNS—AND WHAT CONTROL REMAINS FOR MARKETERS?
Reid Carr: Ron, going to you on the data side of things, a lot of clients have this idea that AI is optimizing everything. Can you unpack what that optimization really looks like under the hood?
Ron Hadler: Sure. I think the one thing to really understand is that AI did not replace strategy, but it did consolidate the knobs we can adjust. There are fewer knobs. So what we're doing now for optimization under the hood is: we set our objectives, and then AI creates the rules, or policies, that are enacted. We set our CPA, our ROAS, our conversions, and then, as Emily mentioned, smart bidding takes over and basically learns a policy to maximize the expected value against those CPA and ROAS targets. In a Performance Max campaign, that's going to span everything Google has: Search, YouTube, Display, et cetera. It takes your objective, your budgets, and your audience signals, and then it automatically handles the bidding, the placement, and the creative mixing. It's doing all of that for you.
Fewer knobs, less micromanagement. It's also going to take signals and make predictions. This is especially important in today's consent-and-privacy era, because it can take things like: what's the query, when's the search, what device are they on, where are they located in the world, what's the context? Then it takes your first-party conversions, and this is where the modeling takes place when consent isn't granted. It's making assumptions, making those predictions, in order to place those ads for you. The conversion modeling uses Google AI to observe and analyze historical data and trends; it's basically filling in the blanks where we don't have consent and making those choices for you. It's doing the same thing for your budget: when you have goal-based campaigns, it dynamically routes the spend to hit your objective. Now, there are some tweaks there, and you have to be very careful how you set those, because if you just say, "I want to optimize for downloads," it'll go whole hog and you won't necessarily get the quality you want. So the controls didn't vanish with optimization; they moved. You have control over data quality, conversion definitions (that's especially important), creative assets, audience signals, and guardrails, but we're not micro-tweaking all the time.
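To make the bidding policy Ron describes a bit more concrete, here is a minimal sketch of the expected-value arithmetic behind a target-CPA style strategy. It is illustrative only, not Google's actual implementation, which models many more signals and learns the conversion probabilities itself rather than taking them as inputs.

```python
# Minimal sketch of the expected-value math behind target-CPA style
# smart bidding. Illustrative only: real systems model many more
# signals (query, device, location, modeled conversions) and learn
# p_conversion themselves rather than taking it as an input.

def auction_bid(p_conversion: float, target_cpa: float) -> float:
    """Bid up to the expected value of this click.

    Paying at most p_conversion * target_cpa per click means paying
    target_cpa per conversion on average.
    """
    return p_conversion * target_cpa

# A high-intent auction earns a higher bid than a low-intent one.
print(auction_bid(p_conversion=0.08, target_cpa=50.0))  # 4.0
print(auction_bid(p_conversion=0.01, target_cpa=50.0))  # 0.5
```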
Reid Carr: Right. Micro tweaking, that's very on brand these days, hahaha.
Ron Hadler: I did not say micro-twerking, just to set that clear!
Reid Carr: Hahaha yes.
HOW DOES AI INFLUENCE BUDGET ALLOCATION?
Reid Carr: Okay, so budgeting. It's one of the highest-stakes decisions in paid media. There's a lot of money that goes into paid media, and clients want us to be very responsible with that, as we are. One of the first places AI claims to optimize your outcomes is paid media--or the budgeting process, I should say. Here's the catch, though: smarter spend doesn't always mean better results. Emily, how should marketers think about AI's role in budget allocation today, and where do we still need that human hand on the wheel?
Emily Engberg: Great question. AI has certainly improved our budget pacing and allocation capabilities, but it is one of those areas that is still very much not "set it and forget it." We are leveraging AI tools to support pacing; some of that is in-platform, some of it is external tools we leverage in partnership with the platforms. Optmyzr is one our team uses, through scripts applied in the ad platforms, to analyze past performance data and make micro-adjustments to our bids every single day based on that day's propensity to drive conversions. Very cool that we can implement something like that: humans would not be able to analyze eight weeks of historical performance data every single day and determine an appropriate bid adjustment. So that type of capability has certainly enhanced what we do from a paid media perspective. Like I touched on earlier, forecasting is another area where AI can help us make informed decisions. But for all of this, we really want to make sure we are considering the larger strategic context: What is going on in the competitive landscape? What about that particular vertical's seasonality? Those are considerations an AI algorithm isn't going to weigh the way your human team would.
And we also want to think about more than just paid media when we're thinking about budgets: what other cross-channel efforts are underway? That's something human strategists really need to work on in partnership with their other team members, understanding the SEO strategy that's in place and the organic social strategy being rolled out. Those considerations need that human touch, really ensuring that what we put out there is considerate of the bigger picture.
Reid Carr: Yeah, it's interesting. Then you get into the competitive side and see all the different variables happening out in the world. What the data is looking at is just more data. If you're looking at a brand, you're looking at its competitors and the changing ecosystem, and what they're throttling on and off is certainly different from what we're throttling on and off. So performance is ultimately just measured on historical yield, not context per se.
Emily Engberg: Right. It's fantastic for helping us react faster and reduce wasted spend, but for thinking bigger picture, that's where the team really needs to come in.
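As a rough illustration of the daily bid-adjustment scripts Emily mentions, here is a hypothetical sketch: estimate each weekday's conversion propensity from eight weeks of history and derive a capped bid modifier. The data shape, function names, and guardrail are assumptions, not any vendor's actual logic.

```python
# Hypothetical sketch of a daily bid-adjustment script: estimate each
# weekday's conversion propensity from eight weeks of history and nudge
# today's bid modifier, with a guardrail. Not any vendor's actual logic.
from datetime import date, timedelta
from statistics import mean

def weekday_modifier(daily_conv_rates: dict[date, float], today: date) -> float:
    """Bid modifier: today's weekday propensity vs. the overall average."""
    overall = mean(daily_conv_rates.values())
    same_weekday = [r for d, r in daily_conv_rates.items()
                    if d.weekday() == today.weekday()]
    if overall == 0 or not same_weekday:
        return 1.0
    # Guardrail: cap the daily adjustment to +/-20%.
    return max(0.8, min(1.2, mean(same_weekday) / overall))

# Eight weeks where Mondays convert at 2.5% vs. a 2.0% baseline.
start = date(2025, 1, 6)  # a Monday
history = {start + timedelta(days=i):
           0.025 if (start + timedelta(days=i)).weekday() == 0 else 0.020
           for i in range(56)}
print(weekday_modifier(history, date(2025, 3, 3)))  # Monday -> 1.2 (capped)
```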
WHAT POWERS PREDICTIVE BUDGET ALLOCATION?
Reid Carr: So Ron, when platforms talk about predictive budget allocation, what's actually powering those predictions?
Ron Hadler: Yeah, predictive allocation is really only as good as the objective and the evidence we feed these platforms. They use auction-time predictions of lift (conversion probability times value) to push dollars to the next best opportunity. When we talk about data-driven attribution, it assigns credit for conversions based on how people engage with your ads, and then determines which keywords, ads, and campaigns have the greatest impact. The other thing, and I talked about this previously, is that in our privacy era, when we're missing click data, they use modeled conversions to fill in the gaps. Otherwise we'd be starving the models of data, and we know cookie data is limited and nobody likes to be tracked. Where this breaks, and I mentioned this before, is when your objective is misspecified. The other case is when you have a noisy conversion taxonomy, so let's take a deeper look at that. A lot of times you tell these systems, "Get the most out of X," like "I want more downloads." When it tries to maximize those downloads, your CPM falls, your downloads soar, and your qualified leads take a hit, because students and bots love free PDFs. Switching to something like "maximize qualified lead value" is going to get you a better optimization there.
What do I mean by a noisy conversion taxonomy? It's kind of a garbage-in, gospel-out thing. When you get duplicates, mislabeled goals, or mixed-quality actions, that's really human error, and having all of those marked as optimization-eligible suddenly confuses the model: it's not singularly focused on anything. You need to be specific about what you tell it to do, and not give it too many things to optimize at once.
Reid Carr: That's interesting. Yeah, so I guess the more you muddy and fragment that data, the more confused the model would be?
Ron Hadler: It's too much context. Take a chatbot, and let's just pick on Copilot for instance: we have lots and lots of data in our SharePoint. If you just say, "Hey, give me all of the things that I need" and don't put any parameters around that, the context is too large to bring back any sort of meaningful answer.
Reid Carr: Right.
Ron Hadler: But if you are specific, like "I need the texts and emails between this date and this date," you're going to get a much better response. Same thing when you're placing ads and bids on these platforms: be very specific, be very clear.
Reid Carr: Or telling a creative, "just be creative."
Emily Engberg: I will say that is one of the most common issues that we find if we're auditing somebody's ad accounts, lack of prioritization in conversion actions, so lumping in conversion actions of very different values. So, a form fill submission is a much higher value action than somebody visiting an about us page or something to that effect. But if all of those are marked as a primary conversion action in ad platforms, algorithms don't know what to optimize towards, and your outcomes will not be what you are hoping for.
Reid Carr: Yeah, well, I'm sure it's like any employee-employer relationship, too. If people don't have priorities, or have too many of them, anybody's confused about what they're supposed to do on any given day. So I think that makes pretty rational sense if we think about it in those terms.
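Numerically, the prioritization problem Emily and Reid describe looks something like this. The conversion actions and values below are hypothetical, but they show why value-based optimization needs one clearly valued primary action rather than a pile of equally weighted ones.

```python
# Hypothetical conversion-action setup illustrating Emily's audit point:
# one clear, valued primary action instead of equally weighted lumping.
conversion_actions = {
    "form_fill":      {"primary": True,  "value": 150.0},  # qualified lead
    "pdf_download":   {"primary": False, "value": 5.0},    # soft signal
    "about_us_visit": {"primary": False, "value": 0.0},    # not a conversion
}

# Under value-based bidding, 10 form fills beat 200 PDF downloads:
print(10 * 150.0)  # 1500.0 of modeled value
print(200 * 5.0)   # 1000.0 of modeled value
```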
HOW DO YOU SCALE CREATIVE WITH AI WHILE KEEPING BRAND VOICE UNIQUE?
Reid Carr: So switching gears a little bit toward creative optimization. AI makes it easy to scale creative; it produces dozens of ad variations in seconds. That kind of speed is a game changer, especially for testing and optimization, but volume alone doesn't guarantee resonance, and it definitely doesn't guarantee brand distinction. So Emily, I'm going to start with you again: how do you leverage that scale while making sure our clients' brands don't lose their unique voice or feel overly generic? I think it can easily lead toward very generic creative.
Emily Engberg: Absolutely, and I would say we approach it with caution. It's certainly an advantage to have rapid creative iteration at our disposal. The creative team here at Red Door uses some very cool tools; Creatopy is one we've leveraged for rapid versioning. It lets you implement an array of brand controls to ensure the versioning isn't altering the creative elements that would detract from brand voice. So maybe we're changing out colors and imagery, but we're really controlling the copy. Ad platforms themselves have also been introducing creative generation capabilities; Meta certainly has them, Google has them. Those tools have been improving pretty significantly, but we are still very hesitant about tapping into them. We've had horror stories in the past of accidentally having that feature turned on. Meta is notorious for rolling out new features set to auto opt-in, so you have to know to go in and opt out, and we had a university client whose campaign had that feature accidentally toggled on, and Meta created an ad that had a completely different university's name in it.
Reid Carr: Oh my goodness.
Emily Engberg: Yeah. So approach with caution. However, these platforms are getting feedback from their customers and are continually working to evolve the capabilities. Many now have additional controls being implemented and rolled out. Google currently has--in beta--additional brand safety controls that can be applied to keep things in check if you do have some of those AI-generated features active for your campaigns. Where we are using it, and where we're testing, is with ad copy generation. A very new feature that has rolled out on Google is called AI Max. AI Max is the latest iteration of search campaigns, where it--in theory--can respond to the specific search query and develop ad copy catered to it. So for example, you're a retailer and somebody has searched for "red sundress." Perhaps you have a red sundress in your inventory but didn't have ad copy specific to it; AI Max will generate ad copy so your headline says "red sundresses." Something very relevant that is likely to drive a click for you. But with that, if you are in a regulated space--say healthcare or banking--you'd want to be very cautious about it generating copy that does not abide by the regulations in place. Or if you're a brand with a very specific image and voice, you'd want to make sure it isn't running wild and making claims about your capabilities and your offer that are not true. So tapping into all of those brand safety features is something our team always implements, and if there is a scenario where it's riskier than it is rewarding, we very much err on the side of caution and still deploy more manual tactics.
Reid Carr: Yeah, it's interesting. I think this is a bit of a plug on the creative side, too, for creating really well-formed brand guidelines. If you want to use AI tools in creative, you have to be very clear on what your brand guidelines are. There are a number of brands that do put that time and effort in, but often because they've got partners all over the world, or whatever that may be. In this case, you're trying to direct an AI that isn't going to look at something and make a judgment call the way a human would, and a lot of brands operate by judgment calls, day in and day out, about what is or isn't brand safe. Having really strict guidelines, I think, empowers AI to do more within the constraints of the brand guidelines, I would imagine.
Emily Engberg: Absolutely. And humans are still going to need to be involved for larger creative concepting; that storytelling piece, AI isn't going to nail.
Reid Carr: Right.
Emily Engberg: So humans will not be removed from the equation, but if we develop an approved concept and want to quickly knock out a variety of different sizes and versions featuring different colors, that's where AI comes into play.
WHAT TOOLS BEST SUPPORT CREATIVE TESTING AND PERSONALIZATION WHILE PROTECTING BRAND IDENTITY?
Reid Carr: So Ron, going down to the tool level, what tools are most promising when it comes to creative testing and personalization, and how do you make sure they reflect the brand? A lot of what we were just talking about, not just the algorithm.
Ron Hadler: There is quite a bit going on, and this is an area with a lot of exploration happening. I would say: use the machines for exploration, but use the brand for selection. Think about asset-mixing platforms. Essentially they're what Emily described: swapping creative elements, not quite randomly, but AI-driven. Then platform experiments, where you've got a structured A/B testing sort of situation. That's important not just for creative, but for just about anything we're doing in media: make sure we're testing, testing, testing. And then there's the governance layer. This is where you want to keep a log of the decisions you make, because a lot of that gets lost; in the old days we used to do a lot of micro experiments, and you want a record of those decisions. So understand which assets are allowed, what the minimum brand cues are that must appear, et cetera, when these tools are creating that creative for you. Google, Meta, TikTok, and Amazon all have these A/B testing abilities, and that's where I would start as a minimum, because you have a lot more control there.
Now, you also have on-page testing for your major platforms--VWO, Optimizely, AB Tasty--and those can test everything from ad copy and imagery to user journeys. When we get deeper, and I think this is where folks are most curious, there's creative intelligence, or even pre-flight scoring of your creative. You've got platforms such as Superads, Marpipe, or Zappi that use artificial intelligence to analyze your creative elements at scale, so they can give you a lot of feedback very quickly. Then you have dynamic creative optimization, or DCO; platforms like Creatopy and Innovid automate creating thousands of variations, and that's where you can really feel overwhelmed, because they can do all these things very quickly and make incredible mistakes very quickly. And then you have tools like VidMob, which actually analyzes videos and can assess those and give you feedback.
When you're talking about brand--taking care of your tone of voice, your visual identity, and your messaging pillars--these brand safety features are really where the most progress needs to happen with the platforms. Like Emily mentioned, many are in beta, but they allow you to create a master template: these are things that should not change. I want to make sure my logos, my fonts, my legal disclaimers do not change. So give the tools a very tight sandbox in which to make changes; that's where we'll be able to lean into these things more once those controls are really done. A lot of those features in beta are not quite there, but that's where you want to do your investigation when you're evaluating these platforms: what brand safety measures are offered? The more control you have over that, the more success you're going to have, because the output will adhere to your brand guidelines.
Reid Carr: Right. Yeah. Having those guardrails creates a narrower lane in which the brand can be safe and yield the results we want. Building out those mechanisms is an ongoing challenge that brands are going to have to face.
Ron Hadler: And build your trust in the platform, right? And then you can lean into it. Nothing is ‘set and forget’, especially in this area, we've already illustrated an example of something going wrong, but those things are coming.
Reid Carr: Yeah, letting it take over the wheel entirely can get out of control pretty fast, it seems.
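For the structured A/B tests Ron suggests starting with, the underlying statistics can be as simple as a two-proportion z-test. Here is a minimal sketch, with invented numbers:

```python
# Two-proportion z-test: the statistics behind a structured A/B creative
# test. The conversion counts below are invented for illustration.
from math import erf, sqrt

def ab_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-approximation tail probability via the error function.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Variant B: 260 conversions on 10k impressions vs. A's 200 on 10k.
print(ab_test_p_value(200, 10_000, 260, 10_000))  # ~0.005 -> significant
```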
HOW SHOULD MARKETERS MEASURE SUCCESS AS AI CHANGES CAMPAIGN KPIS AND ATTRIBUTION?
Reid Carr: Well, that's actually where it comes down to next: the KPIs. What are the new KPIs in this automated world? You want to get the results, and that's ultimately why we're doing all of this. As AI takes over more campaign decisions, it also redefines how we measure performance. We see fewer hard numbers and more modeled outcomes, and most marketers never see the algorithms that estimate the clicks, conversions, and attribution paths. So Emily, how do we evaluate success, and are there metrics or signals you prioritize differently in this new landscape?
Emily Engberg: Yes. With the shift to much more privacy and less clear attribution as a whole, there has been a major shift away from last-click attribution--that used to be the holy grail. It was the easiest to measure, and now it is much less clear than it previously was. That measurement is also flawed because it doesn't consider the entire customer journey, simply the last touchpoint before a conversion. So there are reasons to shift away from it beyond privacy impacting our ability to measure. But that has been a major trend as of late. We have seen a rise in modeled data, like Ron was touching on; that is partially due to privacy controls giving us incomplete data sets that need to be modeled out so we have enough to work with. But with that, you need to take those modeled results with a grain of salt.
Every platform has its own modeling methodologies that we don't always have insight into. So Meta's going to tell you it drove a ton of conversions, but GA4 may disagree, and it's important for us to evaluate the data we have from all of these different sources and really triangulate them to understand the impact of our campaigns. Let's look at media platform data. Let's look at GA4 data. Let's look at first-party data: do we have CRM data from our clients showing the leads that were generated and the quality of those leads? So validating results across multiple sources is something we're turning to. We are shifting away from last-touch, last-click attribution and turning to other measurement methodologies. MMM--media mix modeling, or marketing mix modeling--is certainly an area many brands and agencies are leaning into these days to understand the impact of the media they have in place, beyond last-touch attribution. You'll also see multi-touch attribution methodologies. So it's really about shifting that mentality as a whole and understanding that quantity isn't necessarily our measure of success; it's quality. We're exploring the data sources that allow us to grade the quality of the work we're doing, versus solely a volume of leads generated, volume of sales, volume of conversions.
Reid Carr: Right. Okay, yeah.
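As a toy illustration of the MMM approach Emily mentions: at its simplest, a marketing mix model is a regression of revenue on channel spend. Real MMMs add adstock, saturation curves, seasonality, and priors; the numbers below are invented.

```python
# Toy marketing-mix-model regression: revenue on weekly channel spend.
# Real MMMs add adstock, saturation, and seasonality; data is invented.
import numpy as np

# Columns: paid search spend, paid social spend. One row per week.
X = np.array([[10_000, 5_000],
              [12_000, 4_000],
              [ 8_000, 6_000],
              [15_000, 7_000],
              [ 9_000, 3_000]], dtype=float)
y = np.array([62_000, 66_000, 58_000, 84_000, 50_000], dtype=float)

# Intercept captures baseline (non-media) revenue.
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
baseline, roi_search, roi_social = coef
print(f"baseline={baseline:.0f}, search ROI={roi_search:.2f}, "
      f"social ROI={roi_social:.2f}")
```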
Reid Carr: Ron, can you walk us through what's really happening when platforms report on those conversions and optimize that performance? What's happening under the hood on this one?
Ron Hadler: Yeah, I think it's important to know that modeled does not equal made up. I know we've cast a lot of doubt on it, but it's not just made up. The platforms are using their own internal signals; they're not always the most accurate, but they're not random. You do need to know which parts are modeled and how they're rolled up. We've talked about two things: we've got consent gaps, because people opt out for privacy, and we're filling in that data with other signals and historical data; and we're doing the same thing for customer journeys and how we optimize. Now, I think it's really important to separate your optimization KPIs--qualified primary conversions, modeled ratios--from your business KPIs, which are incremental revenue and marketing efficiency ratio. Incremental revenue is when we want to understand the difference with ads and without ads; that difference is your incremental revenue. That's super important, and I think that's how you abstract away from relying on what one specific model is telling you. It sounds scary to have no ads, but that's how you're going to understand incrementality. And the marketing efficiency ratio is just total revenue divided by total marketing spend over the same period. These are better ways to set KPIs and drive your marketing spend.
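Ron's two business KPIs reduce to simple arithmetic. A minimal sketch, with invented numbers, assuming incrementality is measured with a geo or audience holdout:

```python
# Ron's two business KPIs as arithmetic. Numbers are invented; in
# practice incrementality comes from a geo or audience holdout test,
# not a simple before/after comparison.
total_revenue = 1_200_000.0
total_marketing_spend = 300_000.0
mer = total_revenue / total_marketing_spend  # marketing efficiency ratio
print(f"MER: {mer:.1f}")  # 4.0

revenue_with_ads = 800_000.0     # exposed (test) regions
revenue_without_ads = 650_000.0  # holdout (control) regions
incremental = revenue_with_ads - revenue_without_ads
print(f"Incremental revenue: {incremental:,.0f}")  # 150,000
```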
Reid Carr: One of the things that's always been curious to me in this whole modeling situation is the two ways you can arrive at this information. One is all the stuff we were just talking about: the media and the dollars and the efficiencies and the ratios. The other is some of the old-school metrics: how many people could possibly buy your product? Total addressable market, serviceable obtainable market, the competitive set, how often people can buy. And sometimes I don't know if those numbers always jibe. Between those two things, I think a lot of companies don't spend the time just considering the humans: how many people can do it, how many people will buy. So I wonder, and maybe you don't know the answer to this: are these models taking into consideration literally how many people match what you're looking for?
Ron Hadler: They do within their own platforms, and that's a limitation, because they're not looking across the landscape. They're platform-bound, if you will, so they're limited in their viewpoint. Now, Google, having its hands in almost everything online, has a pretty wide purview, but it, again, is biased. So taking a step back and looking at data, like you said, at a top level, I think is important. Otherwise we're getting bound by what a singular platform is telling us to do.
Reid Carr: Yeah, because conceptually, if you're running on Meta and running on Google, that same person, who's probably on both Meta's platforms and Google, is effectively counted as two people, because Meta doesn't see what Google has done and Google doesn't see what Meta has done. And that's where I think some of the math starts to get a little funky: there just aren't actually that many people; you've effectively double counted them. I would argue that probably Meta has most of the people, and Google has all of the people.
Ron Hadler: Well, that's where Emily said something very smart before: when we get reported conversions from these platforms, we double-check them in a third party. I think that's important: check somebody else's math. Even if it disagrees, that doesn't mean it's necessarily wrong, but you're getting an understanding of what's reality.
Reid Carr: Yeah.
Emily Engberg: Correct. It's not necessarily inaccurate. Meta may be taking credit for a conversion, and it's not inaccurate to say it played a role in that conversion; it just may not have been the last step. So a conversion counted in Google Ads may also be counted in Meta Ads: both platforms played a role, and both want credit for that conversion.
Reid Carr: And that's where I think we come back to the humans on both sides of it: the humans that are buying, and the humans doing the work. AI can automate tasks, but it can't replace the strategic thinking, or the light bulb that comes on for the consumer when they choose to buy. The media teams of tomorrow won't just press buttons; they'll interpret data, guide creative, and make the calls that drive business outcomes, and they'll obviously need that overall business awareness to have an accurate outlook.
WHAT DO PAID MEDIA TEAMS LOOK LIKE IN THE AI ERA?
Reid Carr: So Emily, to bring us toward the close here: what does the future role, or I guess today's role if we're living in that future, really look like, and how are you preparing our teams for it?
Emily Engberg: It's going to be a shift from paid media teams focusing primarily on execution to focusing on strategy as more of those in-platform optimizations, bid adjustments, et cetera, get taken over by AI tools that are actually rather impactful. The team will have to spend a lot less time on that type of work, and instead really needs to level up on strategic thinking, critical thinking, cross disciplinary thinking, so connecting the dots across programs and initiatives that are in play that a platform itself is not going to be able to do. So what are we doing to prepare the team for that? We have been doing an array of critical thinking oriented exercises to get the team thinking in a different way beyond just what are the levers I can pull in this individual platform? We have been upskilling them on all of the AI tools at our disposal so we can become more efficient with some of those day-to-day tasks.
Our team has now developed custom GPTs that they train on an ongoing basis to support nearly every function the media team undertakes. For ad copy development, we've got GPTs trained for every client on brand tone and voice, which can really expedite our process. Those GPTs know each platform's character count limitations--the headlines, descriptions, et cetera needed to execute every campaign. So those manual tasks are slowly--well, not slowly, rapidly--becoming much more efficient and supported by tools like that. And our team is instead elevating the strategic thinking we bring to the table, really deepening their knowledge of the client, the industry, the vertical, so they can bring that expertise to the table when making paid media recommendations.
Reid Carr: I think threaded through critical thinking, and threaded through strategy, is also this idea of curiosity, and I just don't see AI being curious--driven and motivated by, "Oh, I wonder what that's about." It can give you information if you push it toward that. And that's one of the things we want to make sure of: that everyone is motivated and driven, at least at the root level, by that human curiosity. So I think that's going to be an interesting thing that hopefully preserves the human in the loop here.
Ron Hadler: Yeah, AI makes comments all the time, and you ask, "Hey, what did you think about that?"--to your point--and it says, "Oh, that's a really good observation." It always makes me feel smart when that happens. Or it's just being a sycophant, I'm not sure.
HOW DO WE ENSURE GOVERNANCE AND ACCOUNTABILITY AS AI SPEEDS UP DECISION-MAKING?
Reid Carr: So Ron, let's wrap with you here on this: AI is making these decisions faster. How do we maintain governance and accountability in how these decisions get made?
Ron Hadler: Sure. Speed without stewardship is risk; you've got to put some rails on that speedway. And here again, this might sound a little old school, but I still think logging decisions is highly important. When we make goal changes, we document them so we can refer back to a point in time, or even annotate reporting later based on those decisions. When we're going to promote an AI-suggested budget shift, again, make sure we have a promotion checklist to go through, right? Understanding which conversions are modeled versus which are observed. And understanding when we're making a human decision versus when we're handing the data a structured decision.
And then audits. Quarterly controlled audits are an important part of this, and even throwing in a postmortem for any major budget swing. Those things bring the human back in; they're auditing. Now, there are ways to use AI to help us audit or help us create those logs, automating them so it's less onerous than manually typing things in. But just the same: documenting those things is how we keep control, because we have referenceable things to go back to and understand why something happened.
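As one possible shape for the decision log Ron describes, here is a minimal sketch. The field names are assumptions, not any platform's schema:

```python
# Minimal shape for a decision log: who changed what, when, why, and
# whether the driving metric was modeled or observed. Field names are
# assumptions, not any platform's schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DecisionLogEntry:
    campaign: str
    change: str            # e.g. "tCPA $45 -> $40"
    rationale: str         # the "why," for later postmortems
    metric_basis: str      # "observed" or "modeled"
    ai_suggested: bool     # did a platform recommendation drive this?
    approved_by: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

entry = DecisionLogEntry(
    campaign="brand-search-us",
    change="budget +15% per platform recommendation",
    rationale="seasonal demand spike; CRM lead quality holding steady",
    metric_basis="modeled",
    ai_suggested=True,
    approved_by="media lead",
)
```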
Reid Carr: Exactly. Well, and to figure out why something happened. There's an aspect of documenting what happened, but also why we chose to do what we did--the decision making, what the premise was. I think that also starts to build institutional knowledge, for a brand and for an agency operationally, because a lot of times the real meat is in the why.
Ron Hadler: And the beauty of documentation like that is you can then feed it into the custom GPTs Emily's team has created, expanding on our decisions and the understanding of what worked well and what to stay away from.
Reid Carr: Yeah. Well, I think about the rate of change, certainly at an agency, where we work on so many different brands, make all the different kinds of decisions we do, and build out all these tools. I just want to defend the role of an agency in this case: we get so many reps, seeing the same things in so many different ways, and then building out these tools. I think we're keeping up with that pace of change, maybe even more than a brand-side team can, because they've got one set of hands on one brand, one set of keyboards.
FINAL THOUGHTS & KEY TAKEAWAYS
Reid Carr: So, as we've heard today, AI is doing more than just automating paid campaigns. It's reshaping the way we build, measure, and manage them, from scaled creative and predictive budgets to black-box optimization and modeled performance.
AI is changing the game, but it hasn't replaced the need for smart human strategy. It's just made that strategy more important. The best results won't come from letting AI run wild, but from using it as a tool to move faster, test smarter, and act on insights more confidently. Thanks again to Emily and Ron for pulling back the curtain, and thanks to you for listening. Be sure to subscribe and join us next time on The AI Edit where we'll look at how AI is transforming data and analytics and what that means for how we define success in marketing today. Thanks for joining.
Ron Hadler: Pleasure.
Emily Engberg: Thanks so much!
Reid Carr: If you found this helpful, subscribe, leave a review and share it with your team. You can also visit reddoor.biz for more insights on AI, strategy, and the future of marketing. You'll see our show notes there, and we'll look forward to seeing you next time.