Why Dynamic Publishing Projects Fail

Part I: Product data management - success factor and obstacle at the same time

This text is part of a four-part series by Werk II and Laudert on the topic of "Why dynamic publishing projects fail".

Product data. One of those buzzwords that everyone interprets differently. In terms of content, though, the definition is quite clear: it covers product information, from master data (item number, product name, price...) to technical data (height, length, weight...) to staging data (product descriptions, promotional texts, images, videos...).

The importance of this data in modern marketing processes, especially in dynamic publishing, is enormous, and it is still growing. Without product data, product communication is simply impossible. It not only presents the goods to the customer as individual products with meaning and characteristics, but also enables comparability between similar products within a range. Provided the data quality is right.

However, our experience as the media and IT service provider Laudert also shows that there is a gap between the quality companies believe their data has and its actual quality. Ironically, it is usually precisely the companies that doubted their data quality that turn out to have the best product data.

Acknowledge the effort of data management

The effort required for product data management is one of the most underestimated time and cost factors, and this holds in almost all areas of product and marketing communication: in the implementation of new IT systems as well as in setting up a company's own online shop, connecting to marketplaces, or dynamic publishing. Schedules for providing product data are often wildly unrealistic because the manpower required is underestimated.

Sample calculation of data management costs for a fictitious company, moderate assumptions:

  • 400,000 articles/year
  • 50% of the data can be transferred directly from the suppliers or is already available; the other 50% must be captured manually
  • Time required per attribute: 60s including quality assurance
  • 5 attributes per article

Time spent in minutes: 400,000 x 50% x 5 attributes x 1 minute = 1,000,000
In hours: 16,666.67
In eight-hour working days: 2,083.33
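The sample calculation above can be reproduced in a few lines; the figures and assumptions are exactly those stated in the text:

```python
# Sample calculation of data-management effort (figures from the text above)
articles_per_year = 400_000
share_needing_work = 0.5       # 50% must be captured or enriched manually
attributes_per_article = 5
minutes_per_attribute = 1      # 60 s per attribute, incl. quality assurance

total_minutes = (articles_per_year * share_needing_work
                 * attributes_per_article * minutes_per_attribute)
total_hours = total_minutes / 60
working_days = total_hours / 8  # eight-hour working days

print(f"Minutes: {total_minutes:,.0f}")      # 1,000,000
print(f"Hours: {total_hours:,.2f}")          # 16,666.67
print(f"Working days: {working_days:,.2f}")  # 2,083.33
```

Roughly 2,000 working days per year, i.e. the equivalent of around ten full-time employees, for a single, moderately sized assortment.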

The scale of the effort in data management is immediately apparent. Data quality can only be ensured if the data is given the corresponding significance within the company in the first place and, as a consequence, the necessary manpower. To put it bluntly: "We do it on the side" is the first step into the abyss.

High-quality data requires skilled personnel

While we are on the subject of casual phrases: even the quality of individual attributes attracts one or two classic lines that you stumble across as a service provider. One of them: "Temporary staff and student workers can take over data management." Really? Let's look at that in detail:

Master data

Master data usually accumulates "automatically" over the product life cycle, i.e. from the ERP or PLM system. In practice it is usually unproblematic; issues or extra effort occur only very rarely.

Technical data

Technical data often comes from manufacturing or purchasing. If the data is incomplete or needs to be enriched, considerable expertise is required.

Without specialist knowledge, correct data collection is difficult for many products, and the rabbit hole can be considerably deep, depending on the product. With staging data (product texts and descriptions, photography for the various output channels), many other factors come into play.

Texts are particularly relevant for dynamic publishing: they are needed on promotional materials such as handouts, flyers and the classic catalogue. A "single source of truth" is therefore also necessary for the product texts in their variants (promotional, factual, channel-specific). Central data storage, uniform communication and efficient playout are the key factors here. Hand-tailoring the text length for each individual publication contradicts the goal of gaining speed through clean processes.
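A minimal sketch of what such a "single source of truth" can look like in practice: one central record holds all text variants, and the output channel only selects and constrains. All field names, channels and length limits here are illustrative assumptions, not a real PIM schema:

```python
# One central record per article: all text variants live in one place.
# Field names, channels and limits are hypothetical examples.
product_texts = {
    "article_no": "12345",
    "descriptions": {
        "promotional": "The all-rounder for every workshop, grips where others slip.",
        "factual": "Hex-head screw, galvanised steel, DIN 933.",
    },
    "channel_limits": {  # max text length per output channel
        "catalogue": 120,
        "flyer": 60,
    },
}

def text_for_channel(record: dict, channel: str, variant: str = "factual") -> str:
    """Pick the stored variant and trim it to the channel's length limit."""
    text = record["descriptions"][variant]
    limit = record["channel_limits"].get(channel)
    if limit is not None and len(text) > limit:
        return text[: limit - 1] + "…"
    return text

print(text_for_channel(product_texts, "flyer"))  # the factual variant fits the flyer limit
```

The point of the design is that no text is ever rewritten per publication: every channel draws from the same record, so a correction made once is correct everywhere.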

The stumbling block of image keywording

Image keywording is somewhat simpler. Nevertheless, it is not a by-product; simple mistakes creep in quickly. Take a screw, for example: where is its front? Clean image keywording requires trained people (for example, Laudert's specialised content services team).

Automation: "The AI can do all that"

Especially in keywording, the buzzword artificial intelligence comes up again and again: "It could take care of that." In principle, AI has enormous potential, but the benefit depends heavily on the task at hand, because AI is pattern recognition, not a panacea. An AI can, for example, recognise very well whether a picture shows people, houses, trees or sunsets, or whether the people are happy or sad, bearded or clean-shaven, with or without a cap. Embedded in an appropriate taxonomy, AI can be very useful here.

In the field of product images, however, the areas of application are still limited. Clipping (cut-outs) and masks work well, but of course only as well as the AI's training material. And an AI is specialised for its single purpose.

AI is a decision for which you have to understand very precisely what you want to achieve yourself and what the AI can contribute. You also have to consider for which tasks you want to use a "public" AI (e.g. Google Vision), whose training you then feed with your data, and for which you would rather develop and train an AI in-house.

Efficiency must meet quality in data management

As the basis for dynamic publishing and the entire product and brand communication, high-quality product data is a key factor. Even with a high degree of automation and efficient processes and workflows, a considerable amount of work remains for companies. To master this effort, an awareness of the complexity is indispensable.

Whether a company ultimately opts for internal processing or relies on professional support, such as Laudert's Content Services: good product data is a trump card, and bad product data is one reason why dynamic publishing projects fail.