Customer Story

Clever market monitoring

Companies have to be able to react to market changes in time if they want to be successful. But how can this be achieved when the quantity and complexity of information sources keep increasing? Knowledge graphs could be the answer.

In Short

CHALLENGE

The quantity and complexity of information sources for market monitoring are increasing.

SOLUTION

Harmonization of data using knowledge graphs as a basis for market and technology monitoring.

BENEFIT

Early identification of market changes and trends; competitive advantage.

Our Solution

The success of innovative pioneers like SpaceX and Uber illustrates where the journey towards digitalisation is heading. New technology has the potential to turn entire industries on their head and supplant traditional companies and their business models. This is why businesses that want to survive long-term competition have to incorporate technology and innovation management into their operations so that they can respond to market changes in time and tap into new markets and/or products. The main components of technology management are, firstly, identifying relevant trends at an early stage and, secondly, examining them in terms of potential changes. Without automated solutions, this process takes up a considerable amount of time and analytical work. At the same time, the quantity and complexity of the information sources that need to be taken into account are increasing. Other barriers to using this data include the variety of languages, the ambiguity of data, the cost of obtaining licences, and also technical hurdles such as the five big data Vs: Volume, Variety, Velocity, Validity and Value.

Knowledge graphs bring order to an increasingly complex world of data

Lots of businesses tackle these challenges with the preventive approach of creating data-intensive silos or through the use of classic machine learning processes, which in many cases have proven to be expensive and not particularly sustainable dead ends. One of the main reasons why these approaches fail is that data is often detached from its original source and then collected and processed without a structured, uniform understanding of its content. As such, not even the best text analysis procedures are able to reliably determine whether the word “Apple” in a text is referring to the company or the fruit. Knowledge graphs enable conceptual properties to be learnt by putting the focus on the relationships between terms. Companies, technologies, points of interest and even people are depicted as nodes within a harmonised graph, which represents the real-world links between any such things. Refining data in this way aids the understanding of the meaning – in other words, the semantics – of any constructs within our world and provides us with a basis of data for recognising technology and market changes.
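
The “Apple” example above can be sketched in a few lines of plain Python. The snippet below is an illustrative toy, not the actual Trivadis implementation: the node names, relations and scoring are invented, and a real system would use a triple store and trained entity-linking models. It shows the core idea, though: an ambiguous mention is resolved by comparing the graph neighbourhoods of the candidate nodes with the terms found in the surrounding text.

```python
# Toy knowledge graph as a list of (subject, predicate, object) triples.
# All names and relations are invented for this example.
TRIPLES = [
    ("Apple_Inc", "is_a", "Company"),
    ("Apple_Inc", "produces", "iPhone"),
    ("Apple_fruit", "is_a", "Fruit"),
    ("Apple_fruit", "grows_on", "Apple_tree"),
]

def neighbours(node):
    """Return the (predicate, object) pairs attached to a node."""
    return {(p, o) for s, p, o in TRIPLES if s == node}

def disambiguate(mention, context_terms):
    """Pick the candidate node whose graph neighbourhood best matches the context."""
    candidates = {s for s, p, o in TRIPLES if s.startswith(mention)}
    def score(node):
        return sum(1 for _, obj in neighbours(node) if obj in context_terms)
    return max(candidates, key=score)

print(disambiguate("Apple", {"iPhone", "Company"}))    # -> Apple_Inc
print(disambiguate("Apple", {"Fruit", "Apple_tree"}))  # -> Apple_fruit
```

The relationships attached to each node, not the surface string “Apple”, carry the disambiguating signal. This is the property the article refers to when it says knowledge graphs put the focus on relationships between terms.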

Intelligent monitoring of markets and technology as a key competitive factor

Initially launched as a joint research project, an innovative pilot application has been developed as part of the long-standing working relationship between Nuremberg Institute of Technology, the Fraunhofer Center for Applied Research on Supply Chain Services SCS (in Nuremberg) and Trivadis. This application enables the development of markets, technology and industries all over the world to be analysed regardless of their language or domain. The application was able to provide helpful information on the following typical issues:

  • What technology is up-and-coming in a certain sector of industry, for instance a new material for manufacturing energy storage systems?
  • Based on their product-portfolio development, which up-and-coming companies could be suitable as new suppliers, acquisition targets or potential customers?
  • Which companies are our competitors in certain markets and how are they developing?

To answer these questions, unstructured information sources, such as news feeds and other publicly available text documents like publications, patents or articles, are automatically recorded and mapped onto a knowledge graph as the central core of data using modern AI-supported procedures. This central core of data represents the knowledge from the collected data sources presented in a structured, fact-based manner. The success of the pilot project was able to demonstrate that using knowledge graphs for the intelligent consolidation of “data” as a raw material can be a key competitive factor.
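
The ingestion step described above can be illustrated with a deliberately simplified sketch. The pattern names, company names and regular expressions below are placeholders invented for this example; the actual pilot uses AI-supported NLP and entity-linking procedures rather than regexes. The shape of the pipeline is the same, however: unstructured text goes in, (subject, relation, object) triples for the central graph come out.

```python
import re

# Hypothetical extraction rules: each relation is recognised by a simple
# subject-verb-object pattern. Real systems replace this with trained models.
PATTERNS = {
    "acquires": re.compile(r"(\w+) acquires (\w+)"),
    "supplies": re.compile(r"(\w+) supplies (\w+)"),
}

def extract_triples(text):
    """Scan free text and record every pattern match as a graph triple."""
    triples = []
    for relation, pattern in PATTERNS.items():
        for subj, obj in pattern.findall(text):
            triples.append((subj, relation, obj))
    return triples

news = "AlphaCorp acquires BetaWorks. GammaLtd supplies AlphaCorp."
graph = extract_triples(news)
print(graph)
# [('AlphaCorp', 'acquires', 'BetaWorks'), ('GammaLtd', 'supplies', 'AlphaCorp')]
```

Once triples from many sources accumulate in one graph, the questions listed above become simple graph queries, e.g. “which nodes have an `acquires` edge pointing at a company in my market?”.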

Following our guiding principle of “Turning Data into Business”, Trivadis has worked with its partners to develop a Cloud-based platform and several trend analysis tools. Thanks to this solution, multiple SMEs are already equipped with a systematic trend radar for their specific areas of work and are able to integrate the results into business-relevant decision-making processes. The ongoing research-oriented cooperation with the Fraunhofer Institute and Nuremberg Tech combines Trivadis’s more than 25 years of experience as a data expert and not only enables the team to “trial” innovative cutting-edge PoCs for trend monitoring but also allows them to implement sustainable solutions and operate them over the long term. The diagram on the previous page uses a simplified model to illustrate the path from a data source to fact-based knowledge, and ultimately to valuable analysis and reporting tools for day-to-day work.

Knowledge Graphs help to understand the meaning of any constructs in our world and serve as a data basis for identifying relevant technologies and market changes.

Dr. Roland Zimmermann, Professor for Information Systems and Statistics, Nuremberg Institute of Technology

The exciting challenges presented by a new upcoming key technology motivated Trivadis and its partners to set aside traditional ways of thinking about data management and apply their shared expertise in new areas. The core points include:

  • The heterogeneity of data sources and formats, the speed with which information should be received and processed, and the granularity with which it can be consumed (batch, API, event-based) call for a high degree of basic functionality from data management. These requirements relate primarily to technical criteria for the functionality of a data platform. At this point, Trivadis’s reference “architecture for analytical data platforms” could be applied, which had already been implemented for a large number of customers in a variety of industries. Using this architecture, manufacturer-independent modules could be sourced for the consumption and pre-processing of data.
  • A further aim of the partnership was to use mainly open-source or licence-free software components. For the development of a pilot application, an extensive comparison matrix by Trivadis revealed several freely available services that could be tested. When it comes to the application’s productive, commercial operation, the solution’s entire lifecycle, e.g. including support, has to be clarified. In many cases, there are specialist companies for each software component on the market. Alternatively or on a complementary basis, Trivadis and its Managed Services products can cover operation, support and maintenance throughout the implemented application’s entire lifecycle and the infrastructure.
  • The most challenging requirement from a technical perspective is presented by the cross-industry use of market and technology monitoring. Public knowledge graphs like Wikidata.org or DBpedia.org stand out with impressively broad bases of knowledge, which is why they are often used in pilot applications. However, these graphs often lack depth if your aim is to process special, domain-specific vocabulary within texts. To find a solution for this, particular attention was paid to Active Learning procedures: experts are given the job of checking information extracted from data sources wherever the AI-supported procedures are uncertain. This valuable expert knowledge is not only used to correct the erroneous information displayed; it is also fed back into the original AI process, improving the quality of all future extractions.
  • The creation of a harmonised and highly informative knowledge graph has laid the foundation for market and technology monitoring; however, a business does not get to enjoy the added value until it can use this information to incorporate technology and innovation management into its operations, for instance. To achieve this, the knowledge has to be made accessible to the respective consumers in a suitable form. Interfaces to established BI tools, like Tableau or PowerBI, are essential for the acceptance and active use of the refined data. Depending on the application, something known as graph views (marts) are defined, which can easily be made available in table form. From this abstraction layer and above, technical understanding of knowledge graphs should no longer be required.
  • Knowledge graphs are new territory for many customers, and companies often have little in-house expertise in this area. For all topics related to data, Trivadis offers a wide range of training courses and workshops with experienced experts, which universities and research institutes are also welcome to attend. Practical workshops are also held on a regular basis as part of this project. To begin with, these workshops focus on building an initial foundation of understanding of knowledge graphs and the underlying technology and concepts. As the project continues, additional training courses and knowledge transfer events are organised on a needs basis, so that new knowledge can be trialled in practice without any major delays.

Data is the “oil of the 21st century”, which makes it the fuel of the digital transformation. However, handling data in a way that generates value poses major challenges for many companies. An accurate understanding of the development of markets and technology within the context of your own company is a central component of success. The smart market and technology monitoring from Trivadis uses knowledge graphs as a core element of this approach, enabling bespoke data sources to be intelligently linked and harmonised for your company.
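
The “graph views (marts)” mentioned in the list above can be sketched as follows. This is an illustrative toy with invented node and relation names, not the project's actual interface to Tableau or Power BI: a chosen slice of the graph is pivoted into one row per subject, so that downstream BI tools see an ordinary table and no knowledge of graph technology is required above this layer.

```python
# Toy graph slice: per-company facts stored as triples.
# Company names and predicates are invented for this example.
TRIPLES = [
    ("AlphaCorp", "active_in", "Battery_storage"),
    ("AlphaCorp", "located_in", "Germany"),
    ("BetaWorks", "active_in", "Battery_storage"),
    ("BetaWorks", "located_in", "France"),
]

def graph_view(triples, columns):
    """Pivot per-subject triples into one table row per subject."""
    rows = {}
    for subj, pred, obj in triples:
        if pred in columns:
            rows.setdefault(subj, {"company": subj})[pred] = obj
    return list(rows.values())

table = graph_view(TRIPLES, {"active_in", "located_in"})
for row in table:
    print(row)
# {'company': 'AlphaCorp', 'active_in': 'Battery_storage', 'located_in': 'Germany'}
# {'company': 'BetaWorks', 'active_in': 'Battery_storage', 'located_in': 'France'}
```

In production such a view would typically be materialised by a query against the triple store and exposed via a standard tabular interface (CSV, SQL view or a BI connector), but the pivot shown here is the essential transformation.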

ABOUT FRAUNHOFER SCS

In Nuremberg and Bamberg, the Fraunhofer Working Group for Supply Chain Services SCS designs data rooms for networked overall systems and rapidly deployable IoT prototypes, develops state-of-the-art data analytics methods in concrete applications and provides support in the realisation of digital transformation. It combines economic methods and technological solutions with mathematical procedures and models. The Future Engineering research group, which belongs to the SCS, combines methods of machine processing of natural language with methods of market, trend and scenario research.

Technologies used

  • Semantic Web concepts and technologies
  • (Linked) Open Data
  • Graph databases (RDF TripleStores)
  • Microsoft Azure (Cloud infrastructure)
  • Standardised data formats (RDF)
  • Vocabularies, taxonomies and ontologies (RDFS, OWL, PROV-O, SKOS)
  • Protégé, StreamSets, Talend, Dataiku
  • Natural Language Processing, Understanding and Linking
