Webinar: How to handle product data in the optimal way?

Or how you can sustainably solve challenges around product data in practice

Here's what you'll learn in the webinar:

That product data is the key to success holds true not only for meeting the requirements and goals our customers set from a business perspective. Our practical experience also shows that the supply of product data can influence the implementation of a digitization project both positively and negatively. Together, we put this success factor on a sustainable, high-quality footing, so that in times of constantly changing demands from your target groups on front ends and channels, you can turn your product data into gold.

In this webinar, we will use real-world problem cases to show how you can create added value by using Product Information Management (PIM) as early as the implementation project, thus saving high costs on your way to a digital future.

When: Tuesday, January 25, 2022, starting at 2:00 p.m.

Watch online now (German only):

The speakers

Jacqueline has been working in e-commerce for more than 20 years. In her many years of experience in the conception, management and further development of online store projects, she has always had a strong focus on the product data to be mapped. How important this is for successful transactions, and how PIM systems can simplify business processes, is something she and her team pass on in numerous projects.
Jacqueline Els
Teamlead PIM/MDM
As a Senior Consultant, Tobias supports our customers in the design and implementation of optimal processes in international PIM/MDM projects. Thanks to his many years of experience in the PIM/MDM environment, he has practical solution approaches for data management and process optimization and immediately adds value to our TXP projects through their application.
Tobias Watermann
PIM Consultant
With his overarching knowledge and experience in various PIM/MDM systems, Robert strengthens our team not only in the design of complex data models, but also in the formulation of efficient workflows. With several years of experience in this area, his focus is additionally on the structured collection, refinement and distribution of product information.
Robert Saul
PIM Consultant

Transcript of the webinar "How to handle product data in the optimal way?"

Introduction & welcome

Hanni Gummel: Welcome to today's diva-e webinar. Today, we're talking about product data, how it can be managed efficiently and how challenges can also be solved sustainably in practice. My name is Hanni Gummel. I'm part of the marketing team, and I'm pleased to be accompanying the webinar today. And at this point, I would like to warmly introduce our speakers: Jacqueline Els, who is an Expert Consultant and Team Lead at diva-e, and her two colleagues, Robert Saul and Tobias Watermann, who are both PIM Consultants. Great to have you here to share your expertise and experience in the field with us today. And now, I'll pass it on to you directly and wish all participants exciting insights into the webinar.

Jacqueline Els: So, thank you very much, Hanni, for the preface and the introduction. I hope you can see the correct screen? Wonderful. Then welcome from our side as well, hello to the world. What do we have with us today? We brought some examples of the challenges around product data. Product data, product information management, is, as Hanni already said, our main domain, our daily bread. And it's not unheard of that we run into challenges from time to time in the many projects that we manage. So we picked out a few examples, brought them with us, and prepared a few solutions for responding to such product data challenges. First, a few words about diva-e. Who are we, what do we do? Some of you surely know us already. Maybe you already know us from joint projects. We are one of the largest digital agencies in Germany. We employ 800 experts in various areas of digitalization and can proudly claim that we have been in business since 1995. The focus of our work is the complete digital value chain in B2B and B2C. We advise from the initial idea, from the strategy to the system selection, to the implementation, through to the complete maintenance tasks that arise in an ongoing e-commerce project. Yes, we are the number one digital partner, as I said, in content, commerce, performance marketing, et cetera. We have partnerships with all the well-known and relevant technologies you know in the market. And my colleagues and I are here today representing the PIM/MDM area. The team now has over 20 experts, among them certified consultants and developers. And we currently have partnerships with Akeneo and Stibo Systems. But we also know all the other well-known PIM/MDM technologies. That's part of our job. Exactly. Customers in our references include Klöckner, SCHOTT, ZEISS, and many more. And we can be proud that we ranked seventh in last year's employee ranking by the Great Place to Work survey.
This means that the colleagues and experts who work for us are also quite happy to work for us. We are a very pleasant employer.

Yes, what is our main task in the PIM/MDM area?

Jacqueline Els: What is the mission that drives us every day? We ensure a successful business for our customers, and ultimately also in our projects, by providing channel-specific and well-structured product data that can also be processed efficiently in your daily business. Data is simply a must for digital projects of any kind, yes. You can't do it without it. And, of course, it makes the whole thing more rounded if this data can also be added to and processed in a structured way. Let's take a quick look at the schedule. What have we brought with us today? We'll take a very brief look at the topic of PIM. What is it, and what can I imagine by it? As a summary again at the beginning. Then we'll take a look at the challenges we face in our day-to-day project work. We picked out two examples and worked out a proposal for what a solution to each challenge could look like. What benefits does this bring?

Not only for our customers but also directly for the project team, and then, of course, we are still happy to answer questions. Exactly. Yes. To get an idea: What is a PIM? What are its tasks? And why do I need an additional system in my system landscape that holds and structures product data? To explain this, we like to start by looking at the digital customer lifecycle. The whole thing has changed over the last few years. I don't think I'm telling anyone anything new here. Traditionally, it was like this: The customer researched in media such as online, the catalogue business, the point of sale, and the like. They found a supplier and a product. They looked at it and bought it. And if you did quite well, the service was exemplary, and delivery time and price/performance were right, then the customer often became quite loyal. Today, it is no longer like that. Today, the channels in which I obtain information about products, in which I search for the product I need, are very diverse. And this is true not only in B2C but also in B2B. Be it that one searches for a product on the couch in the evening. Perhaps one also wants other users' experience reports, ratings, or even operating instructions. Videos that are streamed via YouTube. These are the most diverse channels, and the more media we have, the higher the requirements for channel-specific product data. We always like to illustrate this with an example. We have a catalogue where a product is displayed, compared with the online store where the product is sold. There is, of course, an apparent difference in the requirements. In the catalogue, I always have only small sections and little space available, so I concentrate on the essentials and describe the product there. In the online store, I then have the requirement that I must, of course, also provide search-engine-optimized text.
So that ultimately, organic traffic also comes to the page, yes. It is challenging to map all these requirements when all I have available is an ERP. Or maybe just an Excel spreadsheet that contains product data. Because, of course, I have to make sure that the channel-specific content and all the information is tracked somewhere in a system. Yes.

This is where product information management or master data management comes into play.

And here, very briefly explained, is how a PIM works in three steps. In the first step, on the left-hand side, we naturally have all the data sources that we collect and analyze in our projects. And which we also connect to the PIM system in the first step. As I said earlier, this is not just an ERP or the data from the supplier. In some cases, it is also the product catalogue on the shelf behind colleague XY, the Excel spreadsheet, or an additional Word document. Perhaps also media that are stored in a digital asset management system. The possibilities are very extensive. So the first step is to connect all this to the PIM system and see what data we need for the PIM system, and in the second step, see how we can map this in the PIM system. It is then a matter of enriching the product data here, enriching it channel-specifically, and achieving a product data quality that simply ensures that the customer ultimately buys our product and that no questions remain unanswered. The whole thing is accompanied by workflow management to ensure that product data maintenance runs very effectively for your company. Yes. Here's an example. When you work with translation agencies, the communication with them is often very extensive. And you can easily ensure, via a workflow in the PIM system, that the translators have access to the products, only see the areas they really need to translate, and can then do their work without much consultation by mail, phone, et cetera. The third step is to pass the data on to the output channels. And here, too, there is a lot of diversity. We don't just have the typical channels like e-commerce, mobile apps, and print. But also the voice assistants that have now been added. Marketplaces, but also product data that is sent to the customer in newsletters. These output channels are also researched, analyzed and defined.
And ultimately, it's a matter of regularly supplying the product data there via an interface. How does this benefit us? Ultimately, the higher the quality of your product data, the higher the chance of increasing your success, yes. We thus increase conversion rates. No questions remain open for the customer. The customer knows what he is buying. And because the customer has all the information about the product, we also reduce return rates, for example. Which in turn leads to lower costs in your company. In addition, all PIM and MDM systems, of course, have the ability to assign up-sell products, cross-sell products or accessories. This, in turn, increases your shopping cart value and the likelihood of repeat purchases. We have all the product data in one place and can process it in a very structured and efficient way. As a result, we naturally can also increase our reach. We can connect to new sales markets much more quickly. If you think: Okay, I want to play out my data on marketplace XY, then you look at the requirements and the adjustments you still have to make. But once I have the data in one place, I can play it out quite quickly after making those adjustments and make this new channel available. Internally, structured workflows also optimize process costs. You can set up this entire product maintenance process much more effectively, and by cleverly implementing these workflows in the PIM, every colleague also knows when to do what. Coordination becomes much easier. Ultimately, we increase the quality of our product data, as I said at the beginning, and respond to the requirements of the individual channels. Yes. In our projects, it's already the case that even with comprehensive product data, there are one or two challenges that we have to master. That we have to master together with our customers.
If you think of your own digitization or store project, you know that product data can be quite untidy if it is not well structured. This starts with the fact that we sometimes have to link several data sources with different output formats, yes. So we have to react in the PIM system because data has been made available in different formats. We have data from suppliers, where it can happen that for the same product, if we get it delivered or buy it from different suppliers, we also get different data. Here we have to look at: How do we standardize it and map it in the PIM system? At the same time, however, we have to keep in mind that suppliers naturally also deliver their content to several retailers. Yes. And you have to make sure that you create unique content here and stand out from the rest. And that you are also able to stand out from the rest with the PIM system. Furthermore, we know that standardized data often has complex logic or dependencies between items, and that comes into play as well, yes. That again has to be mapped in the system. And if you want to set up internationally, there are additional requirements. Different units of measurement, different languages. And sometimes, there is also different content that must be mapped here. A translation process that has to sit behind it. This also means that we may not receive product data as quickly as we need it for digitization projects. On the next page, a few more examples. The standardization of data. Depending on the situation, there may be ten or five people in your product management team. Of course, they all have a different view of what a shade of blue is. For one person, it's a dark blue. For the next, it's an azure blue.
And if I don't set any rules here in the maintenance process, then I have the problem in the online store, for example, that the customer sees various shades of blue in the filter, and the store system has to ensure that there is only one filter, and that is called blue. A major issue is often the creation of a master-variant concept. We'll go into that later as well. This is a crucial thing that should be clarified at the beginning. Because, of course, the complete back end and front end of store systems are based on such concepts. If you subsequently make pervasive changes to the product and data concepts, this simply involves a lot of work. I have already mentioned maintenance processes.
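The "shades of blue" problem described above can be addressed with a simple normalization rule in the maintenance process. Here is a minimal sketch in Python, assuming a hand-maintained synonym table; the mapping, values, and function name are illustrative, not part of any specific PIM product:

```python
# Hypothetical sketch: map free-text colour values to a controlled
# vocabulary so the store front end shows a single "blue" filter.
COLOUR_SYNONYMS = {
    "dark blue": "blue",
    "azure": "blue",
    "azure blue": "blue",
    "navy": "blue",
}

def normalize_colour(raw: str) -> str:
    """Return the canonical filter value for a maintained colour value."""
    value = raw.strip().lower()
    return COLOUR_SYNONYMS.get(value, value)

print(normalize_colour("Azure"))      # -> blue
print(normalize_colour("dark blue"))  # -> blue
print(normalize_colour("Red"))        # -> red
```

In a real project, such rules would typically live in the PIM's validation or enrichment workflow rather than in standalone code, so that every editor is forced onto the same vocabulary at maintenance time.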

Why are maintenance processes a challenge for you internally in subsequent implementation projects?

Because we simply have to be provided with product data promptly and on time, yes. These are the challenges we face in practice. There are certainly a few more. I'm sure you know one or two of these challenges in your company. But I would now like to hand it over to Tobias, who will take a closer look at the first example of data sources and target systems.

Tobias Watermann: Exactly. Thank you very much, Jacqueline. As already mentioned, I would now like to deal with data sources and target systems. Or rather, I would like to give you an outlook on what we can always expect in the projects or what challenges we have with these two topics.

Data sources

Typical for us in a project like this is that product information always comes from very different sources. And that this information is also always available in very different formats. This means, for example, that we have various departments that work independently of each other on product information. Whether it's the marketing department or product management, each handles product information somehow, but does so in its own way. And we often see that the employees themselves do not know where some of this data comes from. So, they just have it available and take it. But where this information comes from, or where it is maintained, that knowledge is not available in the individual departments. And one of the reasons for this is that we don't have what we call a single point of truth for product information. This means that the first challenge for us regarding data sources is: Who are the people responsible for this? And how do they get hold of this information? As a result, we are usually faced with data chaos. So, as I have just mentioned, the individual departments may even work in systems that are not suitable for maintaining product information. Of course, these systems don't talk to each other, for example, an ERP system and a store system. Then someone fills in product information in the store or in the ERP system. And yet somehow, care must be taken to ensure that they are synchronized with each other. And then we have tables and quite a few other tools. I always call this island knowledge. There is knowledge in one department about where information is located, but the other department knows nothing about it. In this situation, the challenge for us is again: How do we get this information and the formats? And then, of course, how do we bring all these formats together? And in the following graphic, we can also see our goal, namely, the so-called golden record.
For us, the golden record is the state that knows the truth about the product. Meaning we bring together all these sources that we find. We carry all this information together, and then at the end, we have this yellow field here, namely our golden record. And then we have a system that speaks the truth, that knows the truth, and where that truth continues to be maintained. So, that is our goal in the whole data source topic.
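The consolidation behind such a golden record can be sketched as a field-level merge with a precedence order per source: for every attribute, the most trusted source that actually provides a value wins, and lower-priority sources only fill gaps. This is a minimal illustration under assumed source names and priorities, not the behaviour of any particular PIM system:

```python
# Hypothetical precedence order, most trusted source first.
PRECEDENCE = ["erp", "pim_manual", "supplier", "marketing_excel"]

def build_golden_record(sources: dict) -> dict:
    """Merge per-source product records field by field.

    Iterates from the lowest-priority source to the highest, so that
    higher-priority values overwrite lower-priority ones, while any
    source can contribute fields the others do not have.
    """
    golden = {}
    for source in reversed(PRECEDENCE):
        record = sources.get(source, {})
        for field, value in record.items():
            if value not in (None, ""):
                golden[field] = value
    return golden

sources = {
    "erp": {"sku": "TS-01", "weight_g": 180},
    "supplier": {"weight_g": 175, "material": "cotton"},
}
# The ERP wins on weight_g; the supplier still contributes material.
print(build_golden_record(sources))
```

In practice, the precedence would often be defined per attribute rather than per source, but the principle of a deterministic merge into one record is the same.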

And to get there, we have to ask ourselves questions in a project, especially at the beginning.

And these are the following questions: who and what, where, and how. Quite simple, but they are essential questions for us. And let's start with who and what. As we said earlier: Who are the responsible people, and how do we get this information? For us, an essential part of getting this information is workshops. We take all the people responsible and all the people working on product information and look at everything first. What processes are in place? What product information is maintained where? And, yes, together, we then work that out at that point. And as soon as we have an overview of all these sources and information, we analyze them, collect them, and then begin to consolidate them. And we also look at all the different data structures and exports from other systems. And, yes, then we already have an overview of what is there. The next question is then where and how. So, how is all this information brought together? And that is one of the most important aspects for us: the right choice of PIM system. Because if, for example, we have a lot of different formats from our sources, we have to ask ourselves: How should the PIM system process this information, and how should these products be mapped in the end? That, in turn, touches on the data model. We'll come back to that later. For example, does my PIM system need a function like mapping? So, for example, do I need to take formats and bring them into a new format? Do I need to be able to map certain fields to do that? And once we have clarified all these questions and we have agreed on a PIM system, or have found one that fits, then we come to the so-called initial load, namely the first import, where we take all this information that we have now worked out in the workshops. And, yes, import everything into one place, namely our PIM system. And then, in principle, the PIM system knows our truth.
Namely, the central place where all the information is held. That's the first of the three steps that we just saw in Jacqueline's graphic. And I will now jump ahead to the last step. So, not the middle one but the distribution, namely the target systems. And here we have the challenge that target systems, i.e. all our channels, actually always have to be supplied with information simultaneously. Preferably in different languages and synchronously. And here, we also often have the challenge that, until a PIM system is used, the departments have to send out this data independently. In other words, there are different BUs, business units, departments that then take care of the web store, the homepage, whatever. And of course, they work for themselves and do it in their own way. That means there is a table that is filled in. Sometimes manually. Some things happen automatically. And as with the data sources, it is often the case that there is no general knowledge about the target systems. What kind of target systems are there, and how are they filled? This then leads us to the next question. When we look at the target systems, we also have to look at data distribution. So, how do we get the data from the PIM to the respective target systems in the first place? In the best case, a channel can be supplied with product information directly from the PIM, or can retrieve it itself. For example, via an API or a hot folder, where data is stored and the system then pulls the product information that the PIM makes available. That is, of course, the best case for us. But when things get a bit more complex, and we have to respond to a large number of requests for many thousands of products, a PIM system can quickly reach its limits. And then we overload the system with many live queries, i.e. real-time queries. You have to weigh that up as well.
And the more output channels I have, the more we have to see how we can solve the whole problem. And then, for example, a kind of enterprise service bus can provide a remedy by mapping something like that in the architecture. And that's what we see here. For example, we have the PIM system at the very bottom. And the blue fields up there are our output channels, or our channels in general. And in the middle, we see this enterprise service bus. And this now makes it very convenient for me to send out and receive data. So I can make something available to the enterprise service bus, and it knows exactly: Ah, okay, I have to send this to the website, to the catalogues, to the web stores, to the ERP system. And this enterprise service bus also knows exactly: Ah, okay, I'm getting something from the ERP system, for example. And I know how the PIM system needs it. And, of course, a lot of queries can occur in this area. Live queries about product information, because this is exactly what it is designed for. But if you decide to introduce this type of architecture, we also have challenges in data distribution. I'll come to that on the next page. Exactly. Namely, if you now decide to introduce such a layer, or integration layer, as it is also often called, this can also cause effort in development. Because data may have to be re-converted for the respective target systems if that is not possible in an automated way. And if, for example, a PIM export is not provided optimally, i.e. I cannot provide channel-specific exports, as some PIM systems can, I naturally also have more effort because I have to solve this in the integration layer again. And this data then has to be handled again, converted, filtered. And that, of course, is an unnecessary effort for us in development. So, what challenges do I have here again in data distribution?
How does the PIM system have to deliver the product information so that there is the least possible effort for an integration layer, an enterprise service bus, for example? What data does the PIM need to deliver for which system? And which section of the product data, for example, does which system need? You can see that quite well in the following graphic. Here I have an example where, from the golden record, which speaks our truth, simply one complete data dump goes to the layer, and the layer distributes it. And of course, as we have just said, we want the data to be distributed with the least possible effort. And to do this, we also take another look at the processes in workshops. How is that solved? Which channels are there? How do these channels need the information? So that we don't end up with the result in this graphic but with the next one. Namely, the golden record already provides individual parts of my overall assortment, specifically for each output channel. The integration layer only takes these parts and passes them on to the known output channels. And in this way, of course, we get an excellent result for passing on the data. And that sums up quite well the goals we have in terms of data sources and target systems. Namely, the first goal, quite clearly: we want the right PIM for the right purpose. There are many different PIM systems, and we have to see which one fits best for which project. And which one fulfils its purpose at that point. So, from all these workshops and all this working out and analyzing of the data, we need our goal: a single source of truth, or a golden record, as we also call it, so that one system knows the whole truth. We also want all this time-consuming searching and mapping of data from the various systems to be taken over and carried out by the PIM.
And that this is then passed on to these output channels in appropriate slices, so that we no longer have any redundant configurations or efforts as a result. And in the end, this simply leads to greater efficiency in forwarding product information. And these are our main goals in the area of data sources and target systems. As you can see, we already have many challenges here, but we have a clear path. And one component of all this is the creation of variants and data models. And I would now like to pass this on to Robert, who will tell you more about it.
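The channel-specific slicing described here can be sketched very simply: each output channel declares which attributes it needs, and the PIM cuts exactly that slice from the golden record, so the integration layer only forwards and never re-converts. The channel names and field lists below are illustrative assumptions:

```python
# Hypothetical per-channel field lists; a real PIM would hold these as
# channel/export configurations rather than code.
CHANNEL_FIELDS = {
    "web_store": ["sku", "name", "seo_text", "price", "images"],
    "print_catalogue": ["sku", "name", "short_description"],
    "marketplace": ["sku", "name", "ean", "price"],
}

def export_for_channel(golden_record: dict, channel: str) -> dict:
    """Cut the channel-specific slice out of the golden record."""
    fields = CHANNEL_FIELDS[channel]
    return {f: golden_record[f] for f in fields if f in golden_record}

golden = {
    "sku": "TS-01",
    "name": "Basic Tee",
    "seo_text": "A soft everyday tee ...",
    "short_description": "Soft everyday tee",
    "price": 19.90,
    "ean": "4001234567890",
}
# The print catalogue gets only its three fields, nothing more.
print(export_for_channel(golden, "print_catalogue"))
```

The design point is that the filtering happens once, in the PIM, instead of being re-implemented in every downstream system.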

Robert Saul: Thank you very much for the super transition, Tobi. Exactly. So, we have heard these two terms, variant creation and data model, more often now in the last two parts. And that's why it's crucial for us to get into a particular PIM topic and to take a closer look at this one very special component of a PIM. For this purpose, we have brought two graphics with us on the first slide. On the left-hand side, the structure of a data model is shown as an example. At first glance, it is very, very complex and probably not very usable, at least if it is only about product data. That's why we brought another solution on the right side, which we worked out last year together with a customer in the fashion sector. I hope you can't hear my dog. But? It will be quiet in a moment. And there, you can see that it's subdivided into several hierarchical levels. And that the product, which one finally buys, which can be put into the shopping cart, still has several other product levels above it. So there are several product hierarchies above it, which are also somehow a kind of product, but in our sense they are just virtual products that group all the underlying products. And then, you can already see in this data model what is shown on the right-hand side. At the bottom right, it says Product Variants: Size.

What exactly is a product variant?

And I would like to use my favourite example, which fits in quite well with the data model from the fashion sector. Namely, it's about T-shirts, right? So, let's imagine that we have two different T-shirts. One is red. The other is blue. Other than that, these two T-shirts are no different. They were made from the same yarn. They're the same size, the exact same fit, same quality. They have the same collar. The only thing is, they differ in colour. Now, imagine there were not only red and blue T-shirts but, let's say, twelve different colours, and for each of those twelve colours again six different sizes, and again a V-neck and a round neck. If you multiply that out, you come up with 144 different variants. If we now imagine that these variants all appear, multiplied out, on the product listing page, but all of them represent only one single product, where the end customer should then perhaps be able to put it together himself on the product detail page: Okay, I'm only wearing an L now, and blue is my favourite colour, yes.

Why do we overload the product listing page with these many different variants? That's one reason. That's why we combine them into one master. We decide, okay, on the product listing page, I would like to play out the red T-shirt of this model. So, that was the advantage on the landing page, i.e. on the page where the product information is played out, on the sales channels. But there are also advantages during product data maintenance when you combine similar products into variants. And that simply means that you can share the same kind of information, i.e., if I want to write a continuous text about the fit and comfort of the blue T-shirt, it's also the same for the red T-shirt.

I would then have to copy this one text over there. It could also be that I don't just copy it over, but that perhaps an additional red T-shirt is added half a year later, when the blue one already existed before. Then I have to write the new text again and invest working time. And possibly also introduce data that contradicts what is already maintained. Which would simply not be good, if the data contradict each other and do not give a consistent picture. The solution here is that these different variants have a common parent. So, if we recall the image from before, you can see the product hierarchies here, such as Product Variants: Size at the bottom right. This is the red one and this one. Oh, no. That's the shirt in L and XL. Those two probably have the same comfort level, which can even be maintained two levels up, on Style. This means that all T-shirts, regardless of colour and size, inherit the comfort text maintained on Style, and you only have to enter it in one place, and it is then automatically applied to all products below.
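The inheritance mechanism just described, a value maintained once on the Style level and visible on every variant below, can be sketched as an attribute lookup that walks up the hierarchy. The class, level names, and sample text below are illustrative, not taken from any specific PIM:

```python
# Hypothetical sketch of attribute inheritance in a product hierarchy:
# Style -> colour variant -> size variant. A value set on a higher level
# is returned for every node below it unless a lower level overrides it.
class ProductNode:
    def __init__(self, name, parent=None, **attributes):
        self.name = name
        self.parent = parent
        self.attributes = attributes

    def get(self, attribute):
        """Return the attribute value, walking up the hierarchy if unset here."""
        node = self
        while node is not None:
            if attribute in node.attributes:
                return node.attributes[attribute]
            node = node.parent
        return None

# Maintain the comfort text once, on the Style level.
style = ProductNode("Basic Tee", comfort_text="Soft single jersey, relaxed fit.")
blue = ProductNode("Basic Tee blue", parent=style, colour="blue")
blue_l = ProductNode("Basic Tee blue L", parent=blue, size="L")

print(blue_l.get("comfort_text"))  # inherited from Style
print(blue_l.get("colour"))        # inherited from the colour level
```

The maintenance benefit follows directly: editing the one value on `style` changes what every variant below reports, with no copying.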

Why is it a challenge now?

It's a challenge because it's not always just T-shirts. Even in the fashion industry, individual product groups are not so easy to keep track of. And there are also quite a few other industries. I'm thinking of the chemical industry, for example. It's tough for an outsider who hasn't worked in the company for several years to understand the data model and make an ad hoc proposal. In other words, it's a very, very abstract topic that you have to approach strategically. Otherwise, you will end up with the disadvantages that I highlighted initially. Then I would go one step further and ask: What is a data model in the first place? On the next slide, I will also detail this: What can happen if the data model is wrong or bad or not optimal for me? And how can I recognize that at all? So, data model: on the right side, this little picture with the hierarchical structures. That means I try to find a data model for the product range that I offer, into which all products fit optimally. That means that as a manufacturer or as a retailer, I have different products that can be sorted into different product categories. And they don't all have to be T-shirts, to stay with the fashion industry example. It can also be skirts or underwear or jackets. And different kinds of information can flow in there, and the hierarchy can be built up in different ways. Then it is a matter of consideration: How is the data model built now? Because the goal should be that all products from the assortment fit into this one data model and that there are no individual products that don't really fit in anywhere. For example, this relates to inheritance, i.e. to the fact that I have this one value, which is shared by all variants below it, and that this inheritance may function differently within individual product categories than it does for others. All of this must be considered during the initial implementation of the data model.
Because otherwise, errors are practically unavoidable. We have already discussed the disadvantages of offering incorrect product information. Other consequences: if I don't map the master-variant logic correctly, search engines may penalize my product listings by treating the red and the blue shirt, which now only differ in one small word and one picture on my product detail page, as duplicate content, so the pages are not ranked as highly as they should be. Or the data can simply be contradictory: the red T-shirt says one thing, the blue T-shirt says another. In the end, the customer is left undecided and prefers to cancel the purchase. And of course the entire product data maintenance becomes much more efficient if I only have to enter a value that is the same for many products once, and only have to work it out once. Okay, then I would like to summarize the whole thing and say how expert knowledge puts you on the right path.
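The "contradictory variant data" problem mentioned above can even be checked mechanically once the data is in one place. The following is a small illustrative sketch (record layout and names are assumptions, not from a real PIM): it groups attribute values per master and flags attributes whose variants disagree.

```python
# Illustrative consistency check: flag attributes whose values contradict
# each other across variants of the same master. Field names are assumed.
from collections import defaultdict

def find_contradictions(variants):
    """Return {(master, attribute): sorted conflicting values}."""
    values = defaultdict(lambda: defaultdict(set))
    for v in variants:
        for key, val in v["attributes"].items():
            values[v["master"]][key].add(val)
    return {
        (master, key): sorted(vals)
        for master, attrs in values.items()
        for key, vals in attrs.items()
        if len(vals) > 1
    }

variants = [
    {"sku": "TS-RED",  "master": "TS", "attributes": {"material": "100% cotton"}},
    {"sku": "TS-BLUE", "master": "TS", "attributes": {"material": "95% cotton, 5% elastane"}},
]
print(find_contradictions(variants))
# {('TS', 'material'): ['100% cotton', '95% cotton, 5% elastane']}
```

With inheritance, the material would be maintained once on the master, and this class of contradiction could not arise at all.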

There is no one-size-fits-all formula

So, we can't simply be diligent, work out the fashion data model at one customer, and then go to ten other customers and present that data model as the right one, because each customer has a different product data structure. Each has different products, handles them differently, and has an entirely different underlying IT system architecture, which also influences the data model. This means we have to look at which output channels exist and in what form those output channels want the data to be provided. There may be a shop system that only accepts flat products, i.e. really only the variants, and does not want any information about which master the respective variants belong to. We have to be prepared for this too, and if the data model does not allow it, we have to prepare mappings on the way to the output so that the data can be delivered to the output channels in digestible pieces. And at the very end, I would like to talk about this: we as PIM consultants see ourselves as mediators between the data model, which is quite a dry topic, as the last slides showed, and those who know their way around product data, i.e. the product data managers; the ones you can throw 50 article numbers at and who know precisely: okay, I know these products, I know exactly what they are.
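The mapping mentioned above, for a channel that accepts only flat products, can be sketched as follows. This is a hypothetical example: the record shapes and field names are assumptions, and the point is only that inherited master attributes are merged into each variant while the master reference itself is dropped.

```python
# Hypothetical mapping step: flatten a master/variant structure for an
# output channel that only accepts standalone (flat) products.

def flatten(master, variants):
    """Merge inherited master attributes into each variant record."""
    flat = []
    for v in variants:
        # Variant attributes win over inherited master attributes.
        record = {**master["attributes"], **v["attributes"], "sku": v["sku"]}
        flat.append(record)  # no master reference survives the mapping
    return flat

master = {"attributes": {"comfort_text": "Soft single jersey"}}
variants = [
    {"sku": "TS-RED-L",   "attributes": {"color": "red",  "size": "L"}},
    {"sku": "TS-BLUE-XL", "attributes": {"color": "blue", "size": "XL"}},
]
for product in flatten(master, variants):
    print(product)
```

The channel receives self-contained variant records, while the PIM keeps the hierarchy internally.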

We do frequently experience that people can work just from the article number. But it is sometimes difficult for them to build up a data model, because we also want to proceed in a media-neutral way: in the PIM, we have a central, media-neutral data model. We try to make this data model so detailed and efficient that all the output channels, and all the supplying channels such as the ERP, can still store and receive their data correctly. So in a way, we try to learn the language of product data management, and, the other way round, the product data managers learn to understand our dry data-model language. With that, at the very end, we want to emphasize the general benefits for the project team and for the e-commerce project as a whole.

Because at first, it seems as if the PIM is just another component of my system architecture: another system that has to be implemented, then maintained, and for which users have to be trained, when all I want is a new store system with a friendly new UX to convince my customers. But that's just not true. It's really about this: we work hand in hand with the e-commerce team. That team sits one aisle over from us in the office, and we know each other very well. If there are requirements that go both in the direction of the store and in the direction of the PIM, we have very short walking distances and can coordinate accordingly. This also means we can take over tasks that would otherwise be the responsibility of the e-commerce team. For example, the transformation of the data from the ERP, which, as we pointed out earlier, is not an optimal system for maintaining product information, to the store: we take care of that. We prepare the product information in precisely the form the output channel needs. The output channel tells us: okay, I would like to have an XML in this schema. So we prepare the data, put it into that schema, and pass it on piece by piece. That means the e-commerce developers don't have to worry about the data mappings. Or, even more broadly, in the implementation phase the e-commerce developers don't need to conduct workshops around product data migration; we do that. We have built up specific knowledge in this area, and it's what we do every day, not something an e-commerce developer should have to do. With that, I would like to hand over the remaining two points to Tobi.

Tobias Watermann: Yes, thank you. In a nutshell, by introducing a PIM system we can shorten a project like this, or make the timeline a bit shorter, because we can provide the other teams with product information upstream. In other words, all the product data consulting happens, or can happen, in advance. And what is also essential for us across the whole project: with a PIM system we are also flexible in terms of architecture. This means we can continually expand our system architecture with new systems, replace or remove existing ones, and still ensure that the product information is maintained uniformly, even when moving to a new system, for example.

Key Takeaways

Let's summarize this briefly. We have outlined it here in three points, which sum up quite well what we have just told you. A PIM system is not just another system: it helps us, or rather it simplifies and accelerates, the entire process of e-commerce transformation. In complex IT system landscapes, the use of a PIM system in combination with a middleware, that integration layer or enterprise service bus we talked about, can complement the whole thing and simplify synchronization and syndication, i.e. product data distribution. And finally, we said that a media-neutral data model is essential for all other processes that handle product information; it is the necessary prerequisite for them. Exactly. Those would be our three key takeaways from today's presentation. And with that, in conclusion, I would hand over to Hanni again, right?


Hanni Gummel: Yes, very good. Thank you very much for telling us about Product Information Management. Now we'll start the Q&A session: you currently have the opportunity to ask our experts your questions. Please use the red webinar control panel on the right-hand side of the screen and write your questions into the question box there, so we can clarify them right away. The first questions are already coming in, so I'll start.

What order would you suggest if we have a PIM implementation and an e-commerce implementation?

Jacqueline Els: Yes, Hanni, I'll take that one. In a perfect world, it is advantageous to think about my product data as the very first step of an e-commerce project. In the store project, I make the product data visible to the end customer; in the upstream PIM project, I can already respond to requirements from the store project, from the respective channel, and can thus provide the data structure in the PIM coordinated with it. But since we are not always in a perfect world, the more important question is what to do if a store project is already running or already exists; many of our customers already have online stores and platforms. Then we simply handle it through a good exchange in the development team: what are the requirements now, and how can we respond to them? It works both ways. In a perfect world, of course, the PIM project should come first.

Hanni Gummel: Thank you very much. One question that follows on here now is:

How then do the PIM and e-commerce teams synchronize during such an implementation?

Jacqueline Els: If we assume that both parts are on our side, on diva-e's side, then we really are one project team, and we organize ourselves so that the PIM consultants exchange information very closely with the developers, the experts on the implementation side. If two different companies are involved and we are only responsible for the PIM environment, then a good, partnership-based exchange is necessary here too, to get it on the road together.

Hanni Gummel: Wonderful. And one more question.

You talked about ESB. Can you explain that again very briefly?

Tobias Watermann: Yes, that was the example of the Enterprise Service Bus, which is a kind of middleware. This middleware is only there to connect the other systems, because the middleware is the one that has the direct contact to the connected systems. We in the PIM simply supply this middleware with the necessary data. This means we don't have to deal with how to connect system X or system Y; instead, we know precisely: okay, this is what system X and system Y want, and we simply pass this product data to the middleware. The middleware deals with nothing else except handing the data to the appropriate targets and distributing it. That's what we mean by an ESB, or by middleware in combination with a PIM system.
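The division of labour described above (the PIM publishes once, the bus knows the targets) can be sketched in a few lines. This is a minimal illustration of the idea, not a real ESB product; all names are assumptions.

```python
# Minimal sketch of the middleware/ESB idea: the PIM publishes a product
# once, and the bus distributes it to every connected target system.

class ServiceBus:
    def __init__(self):
        self.subscribers = []

    def connect(self, handler):
        """A target system registers the handler that knows its own needs."""
        self.subscribers.append(handler)

    def publish(self, product):
        """The PIM hands data to the bus; the bus distributes it."""
        for handler in self.subscribers:
            handler(product)

received = []
bus = ServiceBus()
bus.connect(lambda p: received.append(("shop", p["sku"])))
bus.connect(lambda p: received.append(("marketplace", p["sku"])))
bus.publish({"sku": "TS-RED-L"})
print(received)  # [('shop', 'TS-RED-L'), ('marketplace', 'TS-RED-L')]
```

Adding or removing a target system touches only the bus registrations, never the PIM side.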

Hanni Gummel: All right, thank you very much. So the next question. You have mentioned a wide variety of data sources.

How do you manage the migration of the data?

Tobias Watermann: I'll answer that one as well. Here, too, there is no one-size-fits-all formula: you always have to look at how to bring all the data together. In the workshops, we work out where the sources are and how we can best tap into them. In the best case, we consolidate all of these data sources and use the initial load, the initial import, to create an overall picture in the PIM system. And in the best case, we can also connect all the sources, i.e. the systems, directly, so that they deliver the individual pieces of the puzzle to us automatically.
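The consolidation step of such an initial load can be sketched as a merge keyed by article number. This is a hypothetical illustration (source names, record shapes, and the "first source wins" rule are assumptions), showing how several sources each contribute their piece of the puzzle to one PIM record.

```python
# Hypothetical initial-load step: consolidate records from several
# sources into one PIM record per article number (SKU).

def initial_load(*sources):
    """Merge per-SKU records; later sources fill gaps, never overwrite."""
    consolidated = {}
    for source in sources:
        for record in source:
            merged = consolidated.setdefault(record["sku"], {})
            for key, value in record.items():
                merged.setdefault(key, value)
    return consolidated

erp = [{"sku": "TS-RED-L", "price": "19.90"}]
dam = [{"sku": "TS-RED-L", "image": "ts-red.jpg"}]
print(initial_load(erp, dam))
# {'TS-RED-L': {'sku': 'TS-RED-L', 'price': '19.90', 'image': 'ts-red.jpg'}}
```

In practice the precedence rules per attribute would come out of exactly the workshops mentioned above.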

Hanni Gummel: Okay. Good. Then one last question about the data model.

I already have a well-functioning data model in my Magento store system. Can't I just transfer that to the PIM?

Robert Saul: That's a good question. It's tempting, of course, to say: "Okay, Magento is currently my only output channel, everything is already there, and I've already put a lot of brainpower into building this data model." But Magento, as far as I know it, is hierarchically very flat: there can only be two hierarchy levels, namely master and variant. Everything above that is no longer mapped directly in the product hierarchy but would have to be modelled in a roundabout way. This means we might be giving away potential if we simply adopted that model. It would be possible, of course, depending on how the product data structure looks and which products are offered. But I couldn't say across the board that it is a good idea. The good idea is to model media-neutrally first: to take a close look in the PIM at how I can ideally arrange the products relative to each other, so that as much shared product information as possible is moved up to the respective parent and then passed down again, for example via inheritance. With that goal alone, via the data model, I come very close to maximizing product data quality. So: transferring the store's model directly is possible, but not necessarily advisable.
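Going the other way round, a deeper media-neutral hierarchy can always be collapsed into the two levels a flat channel supports. The sketch below is illustrative only (the style > colour > size structure and all field names are assumptions): the style level is folded into the master, and the colour information moves down onto the variants.

```python
# Illustrative sketch: collapsing a three-level PIM hierarchy
# (style > colour > size) into the two levels (master > variant)
# that a flat channel such as a store system supports.

def to_two_levels(style):
    """Fold the style level into the master; keep only master + variants."""
    master = {"sku": style["sku"], **style["attributes"]}
    variants = []
    for color in style["children"]:
        for size in color["children"]:
            variants.append({
                "sku": size["sku"],
                "parent": master["sku"],
                **color["attributes"],  # colour moves down onto the variant
                **size["attributes"],
            })
    return master, variants

style = {
    "sku": "TS",
    "attributes": {"comfort_text": "Soft single jersey"},
    "children": [
        {"attributes": {"color": "red"}, "children": [
            {"sku": "TS-RED-L", "attributes": {"size": "L"}},
        ]},
    ],
}
master, variants = to_two_levels(style)
print(variants[0])  # {'sku': 'TS-RED-L', 'parent': 'TS', 'color': 'red', 'size': 'L'}
```

The deep model in the PIM loses nothing; only the export to the flat channel is reduced, which is why modelling media-neutrally first gives away no potential.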

Hanni Gummel: Good, very good. That seems to be all the questions for now, so thank you very much. If you have any further questions, please feel free to contact Jacqueline or Guido, the responsible sales manager; they look forward to your questions and to the exchange with you. And of course, as I said, you will receive the presentation and the recording afterwards. With that, we're at the end. Thank you very much for your participation, and a huge thank you to Jacqueline, Robert and Tobias for your time and your exciting input today. We would also be happy if you left us a short evaluation in the survey that will appear shortly, so that we know how we can improve. With that, we wish you a great day. Until next time, all the best, and stay healthy. Bye.

Jacqueline Els: Thank you very much, bye.

Tobias Watermann: Bye.