On-demand webinar: Feed-based ad placement - what to do if the feed is missing?

Feed structure as the basis for creative excellence and user-centered ad placement in SEA

An overview of all other diva-e webinars

Here's what you'll learn in the webinar

Google sets clear requirements regarding creative excellence (excellent ad design) in SEA. In addition, users are increasingly demanding what they want to see in an ad so that they feel addressed, are motivated to click and come to your site in the first place. Our diva-e experts Dominik Langner and Christian Rainer will show you what you can achieve in addition to shopping ads by using suitable feeds in your ad placement and will present in depth how you can create a suitable data feed through crawling and data structuring.

Watch online now (German only):

The speakers

Christian Rainer

Head of CC Paid Advertising, diva-e

Christian Rainer is Head of CC Paid Advertising at diva-e and is responsible for the areas of digital marketing strategy and PPC consulting. He has been involved in performance marketing as well as business and marketing consulting for over eight years. He is also a certified LUXXprofile Master. Before that, Christian worked for Treugast and blueSummit.

Dominik Langner

former Senior Online Marketing Manager, diva-e

Dominik Langner is Senior Online Marketing Manager and was responsible for marketing automation at diva-e for over four years. His interests include analytics, tracking and product feeds. Dominik previously gained experience as a study manager at Infratel.

Transcript of the webinar: Feed-based ad placement - what to do when the feed is missing?

Angela Meyer: Welcome to today's diva-e webinar Feed-based ad placement. What to do when the feed is missing? Today, you will learn tips and tricks from our experts Christian Rainer and Dominik Langner about the feed structure as the basis for excellent ad placement in SEA. My name is Angela Meyer and I am part of the diva-e marketing team. Among other things, I am in charge of our events and webinars and I am your moderator today. At this point I would like to hand over to our speakers Christian Rainer and Dominik Langner and hope the participants enjoy the webinar. So. Christian, I will now hand over the broadcasting rights to you.

Christian Rainer: That looks good. Wonderful. Yes, hello. Welcome from my side as well. Feed-based ad placement, what to do if the feed is missing? First of all, thank you to everyone who is interested, but to start I would like to give you the opportunity to learn a little about who you are getting your input from today. So, first of all, I will hand over to Dominik.

Dominik Langner: Hello everyone. I'm Dominik Langner, Senior Online Marketing Manager at diva-e. I've been with the company for five years now. I take care of everything that has to do with marketing automation and tracking. In my former life I was also a study manager at Infratel. But in the meantime I've moved on to online marketing. I'm mainly interested in analytics, tracking and product data feeds. And when I'm not working, you can find me in the mountains, in the bouldering hall or brewing beer.

Christian Rainer: I'm Christian, Christian Rainer. I've been responsible as Head of PPC at diva-e for a good year now, taking care of everything that concerns click-price-based traffic buying for our clients' sites. I've been doing this fun stuff in agencies since 2012, originally coming from SEA. As a team, however, we rely heavily on consolidated know-how from SEA, paid social and classic e-commerce, extending towards price search engines and other e-commerce platforms. In my past I worked in gastronomy for many years, then found my way into online consulting via a detour through hotel consulting. I completed a degree in business psychology and, in the course of this year, also took the LUXXprofile Master certification in the direction of motive diagnostics. This is just to express my interest in fundamentally working with people.

And the whole area of online marketing is of course all the more exciting because here, in connection with and directly dependent on data, the integration of people into systems and their reflected behaviour can be mapped. Yes, when I'm not working, I either run or swim, do a bit of sport. I love to read, and my favourite thing to do is either the swing or the slide, which, especially as the father of a two-and-a-half-year-old daughter, can push an athlete to the limit. So much for us.

We are now more or less through with the introductions. We will continue with the agenda. On the one hand, I want to emphasise the aspect of creative excellence in Google's definition. But I am particularly interested in highlighting the importance and relevance of creativity and ad design, especially in SEA and search. Then I'll hand over to Dominik, who will discuss the aspect of feed-based ads and share one of our use cases with you in detail, where we used crawlers to ensure that we had a feed in the first place, and on that basis could provide the corresponding adaptations, extensions and adjustments of ads towards a user-centred approach. At the end, there is the Q&A. If you have any questions in between, feel free to post them in the chat; Angela will bundle all these questions and we will answer them at the end. However, we would now like to briefly interject a question for the start, in order to get a feeling for who is with us today and where your interests lie. Angela, could you please share the short survey. Please just click and tell us what goes through your mind when you think about the creation and design of a new text ad in search. Is it "nothing at all", "why again? Nobody reads that anyway", "what would I click on, if anything?" or "who would I like to reach with this ad?"

Angela Meyer: Exactly. So. We'll wait for about a minute, the first votes are coming in, and it's going to be exciting, I'd say. In any case, our participants can all relate to the question. We'll wait another ten seconds and then I'll be happy to share the poll results with you. (12 sec) Good, I will now close the survey. And I can now show you the results. Christian, you are welcome to take over.

Christian Rainer: Thank you. Well, at least there are still people who answered "nothing at all". I'm surprised by "why again? Nobody reads it anyway" - that's exactly the point I hope to pick up on here, to show that this is not actually the case. "What would I click on" is certainly a first step in the right direction, because at least you are thinking about how such an ad should ideally be designed. However, we on our side - or I in particular - attach great importance to the last, fourth point, hopefully in agreement with this afternoon's participants. And that is the question: who do I want to reach?

Angela Meyer: Very good. Yes, then we can start with your presentation now. I'll close the survey and yes, go ahead.

Christian Rainer: So to start with, as I said, some insights into search and the importance of the ad, especially with a view to the creative excellence touted by Google. Every day, 3.5 billion search queries take place on Google, and about 15 per cent of them have never been made in that form before. Why is that important? At the end of the day, it is about sensitising yourself - or in this case, you - to the importance of an ad in the context of this growth in search queries and the complexity that goes along with it. Ultimately, whether on Google, Yahoo, Bing or any other search engine, a search query is not simply a throw-in that hopefully matches a term booked in the ads account; it is the expression of a human need to interact with the platform out of one or more motives, together with an inner activation and the confidence to gain a positive result from this search query. And as individual as the interplay of all human motives, needs and the resulting confidence is, just as individual and complex is each individual search query, which as such has its own context. This context does not necessarily lie in the semantic or literal composition of the query; it is about mapping the contextual situation, the life circumstances, the emotional circumstances that stand behind the user.

As a small example of this, I'd like to pose a question, without an active poll, which everyone can answer for themselves: what do you think of when you search for Carrera Grand Prix? Is it, in a first step, the Porsche Carrera Cup, i.e. the German one-make cup, a Porsche racing series that has been running since 1990 with standardised racing cars based on the Porsche 911? Is it the Porsche Grand Prix, the WTA's indoor clay-court tournament held in Stuttgart's Porsche Arena? Is it the Carrera Grand Prix 2, sunglasses from the Carrera brand, which was founded in 1956 and became legendary in the 1980s with its Porsche collection? Or is the Carrera Grand Prix actually the slot car racetrack, which we would like to highlight at this point because it is also part of our case later on? If we had run this as a survey, the answers would probably have been just as individual as, or even more individual than, those to our initial question.

Ultimately, our goal as advertisers, or as an agency in this case, is to hit exactly the offer that is relevant to the searcher with every response to a search query, and at the same time to express our own trustworthiness beyond the ad itself, in order to eliminate doubts, fears or uncertainties that would deter a purchase intention or cause hesitation. And, ideally, to convince users emotionally by providing them with a great experience, and to support their receptiveness to the message we want to convey. What we need for this is, basically, to know the life situation of the person who made the search query. We also need to know the language in which the person would ideally like to be addressed, including by the ad, and otherwise prefers to communicate. And in the best case, we really understand the explicit motives that make him or her receptive to our message, so that he or she is activated and an actual purchase intention develops from the need. Nowadays, this has become very complex: on the one hand, very few users communicate directly via their search query what their actual need is or which contextual and emotional motive area they are currently in; on the other hand, the GDPR makes it more difficult to further use or validate user data, or to activate it directly for adapting the approach.

Ultimately, this leads to the fact that, on our side, we have to anticipate all of this - in times of uncertainty or insecurity, whatever you want to call it, or simply with all these unknowns around the contextual elements of the users - when designing ads and addressing users. How do we do that, or what is probably common practice across the industry by now? By means of technology and connectivity. Connectivity in the sense of, on the one hand, the interplay of the different channels through which users interact with a brand or a company. But also connectivity in the sense of data collection, when it comes to making user signals that can be technologically recorded at any point available for optimisations in other areas. The best example is data that I receive about the user via the search query in paid search, which I then also make available for the organic optimisation of the website, the app or the respective platform in the direction of UX, SEO or content optimisation.

Focusing on the SEA ad as such: why do ads in this form deserve special attention? As we have just discussed, the complex behaviour in the search engine poses certain challenges, which we, on the other hand, can increasingly address through new developments and especially technological possibilities. And if you look at how many workflows and work processes within an SEA account are already automated - keyword Smart Bidding or bidding algorithms in general, or the topic of applying and controlling audiences - it is definitely time to devote more attention to the aspect of creativity as the human USP over any machine, and thus to address users better through creativity.

At this point, I would really like to deliberately take up the cudgels for the creatives. It is basically the case that 80 per cent of all users have a higher willingness to buy after interacting with an ad that has really appealed to them. For ads in the digital space in general, 47 per cent of campaign success can really be traced back to the ad itself. 51 per cent of all online users expect digital experiences that really suit them and reflect what they want. And for all the love of, or caution about, data sharing, 41 per cent of all online users nevertheless explicitly want to be recognised when they repeatedly interact digitally with a brand or their trusted retailer. It's always a bit like that moment of walking into a shop twice a week, being approached by the same salesperson, and each time they greet me as if they had never seen me before. In an offline environment, I would quickly become suspicious of what might be wrong with this person. Online, unfortunately, that is exactly what we still see again and again.

At the same time, the second challenge that plays into this is the diversity of motives and forms of address, where I really do ask myself whether everyone looking for life insurance in Germany is really only guided by test results, as the three examples on the left suggest. Or do retailers only sell products online when they are on sale? Here I feel more led by my personal curiosity to click on the probably most inconspicuous of the three ads in the middle, if only because I ask myself: what is twisted terry?

So how is it that ads have such a significant influence on the success of campaigns? Here we look at two elementary points. The first is the auction process: the ad is in fact a clear key element for participating in auctions. If we look at the auction process, in step one the user enters his search query and the Google search engine compiles the corresponding results page based on it, at this point also giving the specification to Google's ad server: how many text ads are included on this results page? The second step is the selection of the auction participants, where the system checks whether semantic or contextual matches between the search query and the actively booked keywords - provided they are not excluded via negative keywords - qualify the corresponding ad content for auction participation. The third step is then, keyword Ad Rank, the position of the ad. Here the ad itself is a criterion influencing the Quality Score. Let me remind you of the quasi-basics of Google Ads: the higher the user relevance, the better the Quality Score and the lower the CPC you end up paying.
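To make the relationship between relevance and price concrete, here is a minimal sketch in Python of the widely cited second-price logic (Ad Rank = bid times Quality Score; actual CPC = the Ad Rank of the next-lower ad divided by your own Quality Score, plus one cent). Google's real auction uses many more signals, so treat this purely as an illustration, not as Google's actual implementation.

# Simplified illustration of the often-cited Google Ads auction logic.
# The real auction includes extensions, context and further signals;
# this only shows why a better Quality Score lowers the CPC you pay.

def ad_rank(bid, quality_score):
    return bid * quality_score

def actual_cpc(next_ad_rank, own_quality_score):
    # Pay just enough to beat the ad ranked directly below you.
    return next_ad_rank / own_quality_score + 0.01

rank_a = ad_rank(bid=1.00, quality_score=9)  # relevant ad, QS 9 -> Ad Rank 9.0
rank_b = ad_rank(bid=2.00, quality_score=4)  # higher bid, QS 4 -> Ad Rank 8.0

# A wins the top position despite the lower bid, and pays well under its bid:
print(round(actual_cpc(next_ad_rank=rank_b, own_quality_score=9), 2))  # 0.9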

The second point why ads are so important is the element of the famous first impression on every potential customer. The ad as such is a decisive component in influencing click behaviour. An appealing ad that explicitly addresses the corresponding motives or areas in which the user is receptive, and that best reflects his contextual and emotional situation, leads to the corresponding success of the overall campaign. At the same time, when the ad is clicked, we have a huge advantage in the paid sector, especially compared to organic search results: all these mapped signals - device, browser, search partner site, the language set in the browser, remarketing lists, demographics, times of day, website behaviour and so on - are actually provided as additional contextual factors and data sets, which we can then of course also use for optimisation in the broader area, keyword, as mentioned earlier: SEO, content, UX and the optimisation of our own further assets, from the website to the app.

As a hopefully reflective audience, you are surely now asking yourselves how all the points presented can or should be taken into account in a standardised SEA text ad. At this point, a brief reminder: the Expanded Text Ad, today's standard format for Central European text ads in search engines, provides 300 characters for communicating product details, the brand, the price, the handling of returns, reviews and all other possible arguments that support the purchase decision. Mind you, this does not even include ad extensions. What does that mean? The clear recommendation is to really work actively on doing justice to both aspects - the ad as the key to the auction, but also the best possible user approach - by using all available ad formats in a campaign. Here we have listed, from common best practice and from our own experience, the intention or goal each individual ad format should go hand in hand with.

The DSA, Dynamic Search Ads, ultimately serves as the foundation of a campaign: it identifies relevant topics and search queries and covers the correspondingly isolated long tail of individual search queries that could, should or do fit, but lack the volume and the serious, significant contribution to a campaign's value creation that would make it worthwhile to address them explicitly via their own keywords or a dedicated ad. It is the double bottom or the safety net, so to speak.

The RSA, Responsive Search Ads, has its great strength precisely as an auction key, with up to 15 headlines that can be individually designed and four descriptions that can be adapted accordingly. If I have really ensured the corresponding diversity in the formulations, then even with just one keyword I can manage to tap into the most diverse search queries and connect to the most diverse motive worlds, and draw out insights - especially towards identifying best practices and how well individual assets or asset combinations within my ads work. Ideally, I then use the standard format, Expanded Text Ads (ETA), to actually implement these findings: the aim is to pick up as much of the specific query as possible within the 300 available characters and to explicitly develop the most varied ETAs for different target groups in the direction of a personalised user experience.
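As a small aside, the 300 characters mentioned above break down into three 30-character headlines, two 90-character descriptions and two 15-character path fields. A minimal sketch of a pre-upload length check in Python, with invented example assets:

# Quick sanity check of ETA asset lengths before a bulk upload.
# Limits: 3 headlines x 30 chars, 2 descriptions x 90 chars,
# 2 path fields x 15 chars -- roughly the 300 characters mentioned above.

ETA_LIMITS = {"headline": 30, "description": 90, "path": 15}

def check_eta(headlines, descriptions, paths):
    for kind, texts in (("headline", headlines),
                        ("description", descriptions),
                        ("path", paths)):
        for text in texts:
            if len(text) > ETA_LIMITS[kind]:
                raise ValueError(f"{kind} too long ({len(text)}): {text!r}")

check_eta(
    headlines=["Carrera Racetracks", "Free Shipping from 50 EUR", "Buy Online Now"],
    descriptions=["Large selection of Carrera racetracks and spare parts.",
                  "Fast delivery, easy returns, secure payment."],
    paths=["racetracks", "carrera"],
)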

In order to maintain a certain scalability at this point, the extended area of feed-based ad adjustments, which Dominik will go into in detail later, is particularly effective for Expanded Text Ads. In summary, all these ad formats, plus the consideration of extensions and the type of ad playout, form what Google calls creative excellence: the door opener in the customer relationship. It is perhaps important to take into account that the growth figures given here, which result from combining or considering the individual aspects and were published by Google, should rather be seen and understood as a cautious minimum.

But, for example, the fact alone that running an RSA in addition to ETAs - a format for which Google has to date held back a certain proportion of auctions exclusively, in order to develop and improve its own product - leads to an uplift of 15 per cent in clicks and, ideally, also in conversions, because more auctions can be participated in. The consideration of corresponding ad extensions alone leads to a 10-15 per cent higher click-through rate, because the ad space is normally enlarged when extensions are played out. Here I would remind you of the earlier example with the twisted terry: if, despite everything, the approach does not reach the user, then simply bolting on extensions is of no help. It is important to think them through so that they are relevant for the user. And the topic of optimised ad rotation is always a point of contention that can be discussed. At the end of the day, the settings for optimised ad rotation can be used to highlight best practice, especially in the area of ETA, and lead to more frequent playout, especially in the area of RSA. The important point is the aspect of evaluation, which must be taken into account despite everything; for the respective ads it must be considered how far, or at what point, further optimisation should take place.

If we summarise the whole thing: the relevance of the ad for the desired user is the absolute maxim on the way to the best possible and value-creating user approach. It is about identifying relevant user signals in order to adapt the approach accordingly. And in order to remain capable of acting amid the complexity of all these approaches, and not lose sight of the wood for the trees, it is also about mapping the whole thing in a scalable way via automation of the workflow. The easiest way to do this is with DSAs, where the headline is automatically created by Google to go with standardised descriptions.

The next step is the RSA, with automated consideration of individual ad assets during playout, up to ETAs - and Dominik will explain how that works. Despite all this automation, the evaluation and conceptual optimisation of the ad elements - report, test, derive best practices, then report and evaluate again - cannot be dismissed out of hand. And that is the point we have to focus on in order to develop the corresponding creativity from such a conceptual context and really address the user in an innovative, better form. One example is getting a user to think of sunny holiday impressions from a tour operator in answer to a search query about the weather on a grey autumn day.

Angela Meyer: (5 sec) So, I now hand over the broadcasting rights to Dominik. (7 sec) Yes, exactly. And before we go further into the topic of product data feed, I have a short survey for the participants. We would like to know how important data feed management is in your current workflow. I'll show you the survey again and you have one minute to give your opinion. Is it extremely important to you or not important at all? You have a few seconds again. (8 sec) Let's wait another ten seconds. (5 sec) Then Dominik will start with his part and now I will close the poll. Thank you very much!

Dominik Langner: Thank you very much. You must have asked yourself: what are product data feeds used for, and what does everything we just heard have to do with product data feeds at all? Well, product data feeds are becoming increasingly important. We have noticed that there are now all kinds of ad formats that are based on product data feeds. For example, we have price search engines that use product data to find the best price for the user. We have marketplaces such as Amazon, which everyone knows, or social media sites, which now also make use of product-data-based advertising formats. In addition, there are of course the various search engines, all of which have developed their own advertising formats, which we will look at in detail in a moment. Firstly, there is Google Shopping, which I guarantee everyone knows; it is one of Google's oldest feed-based ad formats.

In addition, there are now, somewhat more recently, Showcase ads, which also offer the possibility of presenting your brand and your product range in the appropriate lifestyle context. They pick users up at the beginning of the customer journey, i.e. when they are still in the inspiration and research phase, and are very helpful especially for new customers. For stationary trade there are also feed-based advertising formats, for example Local Inventory Ads, with which you can build the bridge from online to offline. These advertising formats have various advantages for stationary trade. Firstly, you can advertise the offline inventory online: you show local users in Google search that the items you offer are also in stock in the retail shop. Secondly, you can use such ads to keep the online presence of the retail shop up to date and manage the expectations of potential customers. The third advantage is that they give us the opportunity to capture the concrete impact of online measures on stationary retail.

Another use case for product data feeds is remarketing campaigns with product-based ad formats, as are often used in online retail. There are different providers and also different billing methods: CPO, CPC, whatever. For example, via affiliate partners, via direct advertising networks such as Criteo, or, as in this example, Google's dynamic remarketing.

The idea behind it is always the same: we tag users, we retarget users who have already been in the shop and show them ads for products they have either already looked at, or, for the purpose of cross-selling and upselling, similar products that they might also be interested in. Ad formats based on product data now exist in almost all social media channels - here, for example, on Facebook. In addition to ad formats explicitly based on product data, you can use product data feeds for ad creation, as my colleague just teased. The question may be: why do that at all? Well, the alternative looks like this: you create ads manually. That is fine for a small number of products, but not if you want to scale or promote regularly changing products. Because manual ad creation is manual labour: resource-intensive, poorly scalable, effective but not efficient. It's great if you want to feel productive - "hey, another 200 ads written by hand today" - but in our opinion it is no longer up to date, at least if we're talking about product ads. Not brand ads, mind you; nothing beats a well-written brand ad, and we don't want to automate those either. But we are interested in using automated ad creation, especially for shops with many products, so that all categories and products can be advertised with individual text ads and not just the bestsellers. Because if we have a shop with 20,000 products or more, probably no one is going to write individual text ads for each of these 20,000 products, let alone adapt those text ads for different target groups.

The alternative is that we use data feeds, product data feeds, for these ads. There are different ways of doing that. The first option would be to use business data in Google Ads. With that, you can customise ads: you still have to create the ad manually initially, but certain elements in the ad - such as the price or other attributes like "free shipping" or "many models" - can be filled in automatically from a stored feed. The keyword for this, if you want to try it out yourself, is a business data feed. You can find it in Google Ads, and it allows you to dynamically adjust the content of ads.
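As a hedged illustration of what such a business data feed and the matching placeholders could look like: the feed name "Inventory" and all attribute values below are our own invented examples, while the {=Feed.Attribute} notation follows Google Ads' ad customizer convention.

import csv

# Illustrative business data feed for Google Ads ad customizers.
# The feed name "Inventory" and all attribute values are invented;
# the {=Feed.Attribute} placeholders follow the ad customizer syntax.

rows = [
    {"Target keyword": "carrera evolution", "Price": "89.99 EUR", "USP": "free shipping"},
    {"Target keyword": "carrera go", "Price": "49.99 EUR", "USP": "many models"},
]

with open("inventory_feed.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)

# The ad itself stays a manually created text ad; only the placeholders
# are filled dynamically at serving time, e.g.:
headline = "Carrera Sets from {=Inventory.Price} - now with {=Inventory.USP}"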

A second point that is important here is the ability to address different users differently via ad templates. That is also possible. And last but not least, the advantage of the whole story is the up-to-dateness of the ad: we can adapt ads to changes at product level, for example the price or the availability of products. As soon as this is in a data feed, the ad adjusts automatically. Updates on the Google Ads side happen once, twice, four times a day, which is enough for most use cases. If you want a higher degree of up-to-dateness, you can always fall back on the Google Ads API or use scripts that make this possible.

Another option for advanced users would be to use SearchAds360 and the inventory management there. If you have a feed, it is possible to create ads completely automatically using ad templates. This works by creating an ad template and working with placeholders. These placeholders are then automatically filled with the data stored in the feed, and the ads generated this way are automatically created and synchronised with Google Ads. You proceed in a similar way if you want to use the Ad Builder in SearchAds360: first you create a template with placeholders, then have the ads created automatically, feed-based. Here, compared to the previous option, you have more possibilities to influence where exactly the ads are created in your account, so you can choose which ads should be integrated into already existing campaigns. In summary, product data feeds are becoming increasingly important because more and more ad formats use them. Furthermore, they create additional added value if you can use them for automated ad creation. The prerequisite for all of this is the product data feed itself.
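Conceptually, what such template-plus-placeholder systems do can be sketched in a few lines of Python; the field names and feed contents here are our own stand-ins, not SearchAds360's actual template syntax.

# Conceptual sketch of feed-based ad generation from a template with
# placeholders, in the spirit of SearchAds360 inventory management or
# the Ad Builder. Field names and feed contents are invented stand-ins.

template = {
    "headline_1": "{brand} {product}",
    "headline_2": "From only {price}",
    "description": "Order the {product} online now. {usp}.",
}

feed = [
    {"brand": "Carrera", "product": "Evolution Set", "price": "89.99 EUR",
     "usp": "Free shipping from 50 EUR"},
    {"brand": "Carrera", "product": "GO!!! Set", "price": "49.99 EUR",
     "usp": "Ideal for beginners"},
]

# One finished ad per feed row, placeholders filled from the row.
ads = [{field: text.format(**row) for field, text in template.items()}
       for row in feed]

for ad in ads:
    print(ad["headline_1"], "|", ad["headline_2"])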

But what do you do when this data is simply missing, even though you would like to use it for the ad design? We have brought an example from one of our customers, Carrera, where exactly that was the case, and we would now like to show you how we proceeded. The initial situation was that we did have a feed, but it was not reliable for various technical reasons, and it was not possible to regularly and reliably provide all the required product information in this feed. In other words, colloquially speaking, there was still some room for improvement in the feed quality. The solution we then resorted to was to simply crawl the feed ourselves. There are various ways of doing this. You either have programmers at hand, or know your programming language well enough, and simply programme your own crawler - which presupposes technical know-how, plus a server on which scripts can be executed automatically at certain times to manage the crawling process. There are also various tools with crawler functionality. For example, we use Productsup as a data feed tool that also has an integrated crawler.

The advantage we see in using a data feed tool with a crawler function: you crawl the data and have it available in the same tool in structured form, and can then also process the data there and create feeds for the individual channels. If you want to choose this option, there is one more prerequisite: we need a list of all the URLs we want to crawl. In Productsup you cannot simply let a crawler loose on a complete domain; you have to provide an input file with the pages that are to be crawled. Now the question arises: where do you get that from? This is where sitemaps come in. You can find the sitemap linked in the robots.txt file - it is usually referenced there. You can then take this file, which contains all the pages of the domain, and use it as the input file for a data crawler project. So we use this file as a source and then run our crawler on it.
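Outside of Productsup, the same preparation step can be sketched in Python: read robots.txt, pick up the Sitemap: lines, and flatten the sitemap into the URL list the crawler needs. The domain is a placeholder, and real sitemaps may also be nested sitemap indexes, which this sketch does not handle.

import urllib.request
from xml.etree import ElementTree

# Sketch: find the sitemap via robots.txt and turn it into the URL list
# that the data feed crawler needs as its input file. The domain is a
# placeholder; nested sitemap indexes are not handled here.

DOMAIN = "https://www.example.com"

robots = urllib.request.urlopen(f"{DOMAIN}/robots.txt").read().decode("utf-8")
sitemap_urls = [line.split(":", 1)[1].strip()
                for line in robots.splitlines()
                if line.lower().startswith("sitemap:")]

page_urls = []
namespace = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
for sitemap_url in sitemap_urls:
    xml = urllib.request.urlopen(sitemap_url).read()
    # Sitemap entries live in <url><loc>...</loc></url> elements.
    for loc in ElementTree.fromstring(xml).iter(namespace + "loc"):
        page_urls.append(loc.text.strip())

# page_urls can now be written out as the crawler's input file.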

The setup is as follows: insert this file into your data feed tool, in our case Productsup, create a data feed crawler and apply it to this URL list. There are a few things to keep in mind. In Productsup we have the special feature that data crawlers, once started, cannot be cancelled. This is important to keep in mind, especially because we have the option of running several crawlers on the site at the same time. You can imagine what happens if we run ten crawlers simultaneously on a site that is perhaps not particularly stable: the whole thing behaves similarly to a DDoS attack, i.e. the server goes down and at some point the website is no longer accessible because too many requests come in. So please always keep this in mind if you want to try something like this. It's better to start carefully with one, two or three crawlers running at the same time, and only later check whether the site stays stable when you put several crawlers on it.

The second possibility would be to schedule the whole thing at night, so that it runs at one in the morning when there are few people in the shop. What does this crawler do then? The crawler rattles through all the pages, i.e. the URLs that we gave it in the input file, visits them and pulls the HTML code of these pages into our data feed tool. That's great, because then we can work with this HTML code. On the one hand, we have the HTML code itself; on the other hand, we get a very important piece of information, namely the HTTP response code. This is automatically output by the crawler, and it simply tells us whether the page from our sitemap was accessible at all. Of course, it makes sense to exclude all pages that are not accessible, i.e. 404 error pages or the like, based on this HTTP response code. We only want valid products with valid pages in our finished feed.
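As a minimal, deliberately throttled sketch of this crawl-and-filter step in Python (the one-second delay stands in for the "be careful with parallel crawlers" advice above):

import time
import urllib.error
import urllib.request

# Minimal, deliberately slow crawler sketch: fetch each URL, record the
# HTTP response code, and keep only pages that answered 200. The delay
# is the "be careful" from above -- do not hammer the shop's server.

def crawl(urls, delay_seconds=1.0):
    results = []
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                results.append((url, response.status, response.read()))
        except urllib.error.HTTPError as error:
            results.append((url, error.code, None))  # e.g. 404 error pages
        time.sleep(delay_seconds)
    # Only valid pages go on into the feed project.
    return [(url, html) for url, status, html in results if status == 200]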

Extract data

The next step would be to process the data that the crawler has pulled into our data feed tool and extract the information relevant to us from this HTML code. Depending on the tool, there are many ways to do this. In Productsup, for example, we have the option of working with HTML selectors, with replace functions, with various map functions and with split functions, all of which allow us to split this HTML code or select certain parts of it.
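Done by hand instead of in Productsup, this selector step could look like the following Python sketch; the CSS selectors are placeholders, since the real ones depend entirely on the shop's markup.

from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

# Sketch of the extraction step: pull the relevant product attributes
# out of the crawled HTML with CSS selectors. The selectors below are
# placeholders -- the real ones depend on the shop's markup.

def extract_product(html):
    soup = BeautifulSoup(html, "html.parser")
    return {
        "title": soup.select_one("h1.product-title").get_text(strip=True),
        "price": soup.select_one("span.price").get_text(strip=True),
        "image": soup.select_one("img.product-image")["src"],
    }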

Purge data

In addition, it is useful to clean the extracted information of any remaining HTML markup, so that you get reasonably readable text that does not contain special characters or HTML entities. (5 sec) Like this. Here is a comparison of what the page looks like and what we extracted from it, from the HTML code. To do this, I first selected the relevant attributes on the page and used them to create an object containing the product information that is relevant for us - no one needs headers and footers in a data feed. The result of this work is a JSON object. A JSON object is a JavaScript object; JSON stands for JavaScript Object Notation. It always consists of keys, for example name, age, car, and associated values, for example John, 30, null. If you have that, you are already a step further. It really makes our work easier, because we have one object containing all the information about a product on the page. Productsup now offers the possibility to select a single value from such an object very easily with the Extract Value JSON option. Afterwards, we cleaned it up a bit, i.e. we removed HTML special characters so that everything is easy to read and can simply be used in our data feed. The rest of the procedure is relatively standard if you have ever edited data feeds before: you reshape certain attributes, adapt them for the respective channel, and add fixed values. At Carrera, for example, this was the product condition. Carrera only sells new products, there are no used products; this information was not on the page, so we simply stored it as a fixed value.
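Reproduced outside of Productsup, the "Extract Value JSON" plus clean-up steps amount to something like this in Python; the example object and the fixed "new" condition mirror the Carrera case, while the field names are invented.

import html
import json

# Sketch of the "Extract Value JSON" and clean-up steps: parse the JSON
# object built from the page, unescape HTML entities so the text reads
# cleanly, and add the fixed condition value as in the Carrera case.

raw = '{"name": "Carrera Evolution Set &amp; Accessories", "price": "89.99", "condition": null}'
product = json.loads(raw)

name = html.unescape(product["name"])      # "&amp;" -> "&"
condition = product["condition"] or "new"  # fixed value: Carrera sells new only

print(name, "|", product["price"], "|", condition)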

Categorise data

Another possibility here is to categorise the products. If the categorisation is not already apparent in the individual data on the page, you can either fall back on the URL, if the URL has a certain structure by which you can cluster products, or you can crawl the breadcrumbs and use them to categorise the products. The goal is always to create a uniform, systematic basis for feed-based dynamisation. We had another problem: on the Carrera site itself, only one image per product was crawlable; the remaining, additional images were integrated via JavaScript code and could not be read directly from the source code. We found a solution for this too. We created an export file from our crawler project containing the product ID and the one crawlable image URL. In a second step, we created a new project by importing this export, with ID and image URL, as its starting point, and systematically generated further image URLs in this new project. Most of the time, image URLs are systematic, as we can see here: we have a URL of the form 'URL/media/image1', and the other images, if they exist, can be found under 'image2', 'image3' and so on. These can be generated relatively easily by means of 'Replace', i.e. replacement rules.
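A minimal Python sketch of such a replacement rule, deriving candidate URLs from the one crawlable image; the URL pattern is illustrative.

# Sketch of the replace rule that derives additional image URL
# candidates from the single crawlable image URL, exploiting the
# systematic naming. The URL pattern is illustrative.

def candidate_image_urls(first_image_url, max_images=6):
    # ".../media/image1.jpg" -> ".../media/image2.jpg", "image3.jpg", ...
    return [first_image_url.replace("image1", f"image{n}")
            for n in range(2, max_images + 1)]

print(candidate_image_urls("https://www.example.com/media/image1.jpg"))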

Image URLs

However, we now have the problem that products have different numbers of images: some products had two product images, some had six. If we always generate six image URLs in this way, as described here, it is possible that four of them point to invalid URLs, i.e. non-existent images. Therefore, in a fourth step, we ran image crawlers - again data feed crawlers - on these newly generated image URLs, received an HTTP response code back, and simply dropped those images that were not present, i.e. where we received a 404 response code. Finally, we play the result generated this way, namely the additional image URLs, back into our actual data feed, and have thus managed to integrate the image URLs belonging to each product into our feed.
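Done by hand, this validation step amounts to a lightweight HEAD request per candidate URL; a hedged sketch:

import urllib.error
import urllib.request

# Sketch of the image-validation step: send a lightweight HEAD request
# per candidate URL and keep only the images that actually exist.

def existing_images(candidate_urls):
    valid = []
    for url in candidate_urls:
        request = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(request, timeout=10) as response:
                if response.status == 200:
                    valid.append(url)
        except urllib.error.HTTPError:
            pass  # 404 and friends -> the candidate image does not exist
    return valid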

In my opinion, the result of all these efforts speaks for itself. Here we see how it looked before and after. Before we switched to the crawled feed, we had the problem that the number of products fluctuated greatly every day; we didn't have all the data in the feed and therefore no good history in Google. After we switched to the feed - the crawled feed, mind you - we had an average of 412 more products in our data feed, and as you can see, the number of products remains relatively constant, as was to be expected. Once we saw that it works like this, we also made a few additional adjustments to reduce the error messages that kept being displayed. If you look closely, we didn't get rid of them completely; you can still see a very thin red line even after the adjustments. That's because Google, for example, does something like this: here we have an individual product, a spare part for a mirror on a Carrera car, which Google for some reason classifies as offensive and rejects. Well, there's nothing we can do about that, but we can live with it quite well as long as it stays within bounds.

I also don't want to deprive you of further results. We see here the development of our Shopping ads before and after our adjustment. As you can see, we roughly maintained our clicks, but the ROAS rose sharply; afterwards we had a ROAS of 227 per cent. So we are actually very happy with the result. If you have problems with your data feed, I can only recommend resorting to this kind of approach and simply crawling the feed yourself; it usually works quite well. It is a bit time-consuming, but you can actually do everything you want to do, and the success speaks for itself. With that, we have reached the end of our presentation. Thank you very much for your interest, and I will hand over to my colleague Angela.

Angela Meyer: Yes, thank you Christian and Dominik for your insights into ad placement. Exactly, we have now arrived at the Q&A round, and the participants are welcome to type their questions into the question box. In the meantime, I have prepared a small final survey: we would like to know whether you are already using automated ad placement. I will now show this last survey, and the participants have one more minute to respond; you can also type in your questions in the meantime - Christian and Dominik are looking forward to answering them. (10 sec) So. At the moment it looks like many of our participants are not yet using automated ad placement at all, or only in a rudimentary way. It's almost a fifty-fifty split. So. I will now close the survey. There you go. Right. I can also share the survey results again. And yes, Christian, Dominik, what do you think?

What are the elements that can be variably addressed in feed-based ad optimisation?

Christian Rainer: We tried to touch on that briefly earlier. Ultimately, it is all the variables that can be made available as data on the one hand, and that have a corresponding influence on purchasing decisions on the other. For retailers in particular, these are aspects such as price and availability. For products that may have a longer delivery time, it really helps to simply state an expected delivery date. In short: all the information that helps communicate trustworthiness as a brand, retailer, manufacturer or provider to a user, in order to persuade them to click.

Angela Meyer: Okay, so. It's now almost exactly 4 p.m. If you have any further questions, please feel free to contact Dominik and Christian directly; they will be happy to take the time to discuss the topic with you in more detail. As already mentioned, you will receive the recording and the presentation afterwards - we will make everything available so that you can watch it all again at your leisure. And here is a short note about our other webinars, which we organise weekly: the next one will take place with our client Ravensburger on the subject of data-driven marketing, and you are welcome to register and browse through our newsroom. We look forward to your participation. Now thank you two, Dominik and Christian, for your insights and your time. I would also like to thank the viewers and listeners for their time, and see you next time. Take care.

Christian Rainer: Thank you. We say thank you too, and look forward to any feedback or a subsequent exchange of ideas.

Angela Meyer: Exactly. Bye.

Christian Rainer: Bye.

Dominik Langner: Bye.

Download the webinar slides now