A matrix to help prioritize your projects
Organization, Measuring & Experimenting, and Technology – so many different changes and priorities to juggle. But where do you start, and how do you identify what your top priorities should be? Analyzing this maturity matrix is a good way to start answering these questions:
Maturity matrix (© FABERNOVEL)
Reading and understanding the matrix
Organization and Strategy
Read back over this report’s article on organizational issues if you need more information.
Four key focal points emerge from this pillar:
- The way in which you manage your campaign brief, whether siloed (the creative agency brief is kept separate from the media agency brief and may even be handled by separate advertiser teams) or integrated (the creative and media agencies are briefed at the same time with a shared, clear campaign objective).
- Is your strategy offline-oriented, with your media and creative plan focusing on offline action? Or is it multi-channel, with your media and creative plan spanning all the online and offline levers most relevant to reaching the set goals?
- Orchestrating levers: is this process developed at the end of the plan by the media agency (control, formats, audiences, etc.), with input from the creative agency but no impact on the creative assets, or is it developed at the start of the plan, involving both the media and creative agencies?
- Campaign management: by channel (performance for each lever managed separately with its own specific KPIs) or by audience (media plan performance managed by audience clusters representing each stage of the funnel, e.g. the See-Think-Do-Care model).
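The by-channel versus by-audience distinction above can be made concrete with a small aggregation sketch. All the rows, channel names, and figures below are illustrative assumptions, not data from the report; the point is only that the same media plan yields different KPIs depending on whether you group by lever or by See-Think-Do-Care stage.

```python
from collections import defaultdict

# Hypothetical media-plan rows; every value here is invented for illustration.
rows = [
    {"channel": "display", "audience": "See",   "spend": 1000, "conversions": 2},
    {"channel": "search",  "audience": "Do",    "spend": 800,  "conversions": 40},
    {"channel": "social",  "audience": "Think", "spend": 600,  "conversions": 6},
    {"channel": "email",   "audience": "Care",  "spend": 200,  "conversions": 12},
    {"channel": "display", "audience": "Do",    "spend": 400,  "conversions": 10},
]

def cost_per_conversion(rows, key):
    """Aggregate spend and conversions by `key`, then compute CPA per group."""
    spend = defaultdict(float)
    conv = defaultdict(int)
    for r in rows:
        spend[r[key]] += r["spend"]
        conv[r[key]] += r["conversions"]
    return {k: round(spend[k] / conv[k], 2) for k in spend}

by_channel = cost_per_conversion(rows, "channel")    # siloed view: one KPI per lever
by_audience = cost_per_conversion(rows, "audience")  # funnel view: See-Think-Do-Care
```

Grouping by audience here surfaces, for instance, that the "Do" cluster converts cheaply across two different levers, which a purely channel-by-channel report would split in half.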
Measuring and Experimenting
The goal here is to assess your ‘test and learn’ approach and the way in which it’s integrated across the board among your different areas of expertise:
- Is testing relevant only to media teams (e.g. only media buying parameters such as CPA mode vs. target ROAS, geolocalized targeting, or landing pages are A/B tested), or does testing incorporate creative issues to generate new areas of learning (e.g. simple creative A/B testing on targets, multivariate testing to identify the best user/creative combination, etc.)?
- Is the process of sharing campaign insights reduced to basic exchanges between the media agency and advertiser (reports and campaign management), or does it also involve the creative agency to ensure the latter is given feedback on how their work is performing (allowing them to optimize as a consequence)?
- Is reporting short-term oriented, with campaigns rolled out only against conversion and CPA objectives and restricted to digital levers, or long-term oriented, with assessments of how campaigns contribute over longer periods of time using econometric Media Mix Modeling that takes into account both offline and online factors?
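The creative A/B testing mentioned above boils down to a standard statistical comparison. As a minimal sketch, assuming made-up impression and conversion counts for two creative variants, a two-proportion z-test tells you whether the observed difference in conversion rate is likely to be real rather than noise:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-score for the difference in conversion rate between creatives A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)              # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error of the difference
    return (p_b - p_a) / se

# Illustrative numbers: 10,000 impressions served per creative variant.
z = two_proportion_z(conv_a=180, n_a=10_000, conv_b=240, n_b=10_000)
significant = abs(z) > 1.96  # ~95% two-sided significance threshold
```

With these invented numbers the lift from variant B clears the threshold; in a multivariate setting the same comparison is run per user-segment/creative cell, which is why sample-size planning matters before launching the test.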
For more information on these topics, feel free to refer back to our in-depth articles.
Technology
The third and final pillar to assess is how you use and manage technology between the different parties:
- Low creative personalization (a generalized advertising message for all target audiences, with low levels of content personalized to the format used) versus advanced personalization (the advertising message is personalized depending on the format used and the parameters of Dynamic Creative Optimization tools, e.g. product stock images, geolocation, product packshots, etc.).
- Non-historical, non-centralized creative asset management (no creative classification system (e.g. by control, format, campaign) and no centralized tool bringing together the learning points from past campaigns) versus centralized, historical management (a clearly defined creative classification system that helps identify previous campaigns' tops and flops on an ad server, allowing you to re-use them in later campaigns).
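The centralized, historical asset management described above is essentially a tagged creative library you can query. The sketch below is a hypothetical data model, not any particular ad server's schema: each asset carries a classification (channel, format, campaign) plus an observed performance metric, so past "tops" can be retrieved for re-use.

```python
from dataclasses import dataclass

@dataclass
class CreativeAsset:
    """One record in a centralized creative library; all fields are illustrative."""
    asset_id: str
    channel: str     # classification axis 1, e.g. "social", "display"
    fmt: str         # classification axis 2, e.g. "video_9x16", "banner_300x250"
    campaign: str    # classification axis 3
    ctr: float       # click-through rate observed on the ad server

library = [
    CreativeAsset("cr-001", "social", "video_9x16", "spring_2023", 0.031),
    CreativeAsset("cr-002", "social", "video_9x16", "fall_2023", 0.012),
    CreativeAsset("cr-003", "display", "banner_300x250", "spring_2023", 0.004),
]

def past_tops(library, channel, fmt, min_ctr=0.02):
    """Return prior assets for a channel/format whose CTR cleared a threshold."""
    return [a.asset_id for a in library
            if a.channel == channel and a.fmt == fmt and a.ctr >= min_ctr]

reusable = past_tops(library, "social", "video_9x16")  # candidates to re-use
```

The design choice worth noting is that the classification axes are fixed fields rather than free-text names: that is what makes tops and flops comparable across campaigns instead of being locked inside one campaign's folder.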
Interested in these subjects and want to discuss them? Contact us.