Data Engineering for Industry 4.0

Within Industry 4.0, data plays a pivotal role. In this field we:

  • Develop and apply black-box machine (deep) learning techniques, combined with semantics to incorporate expert knowledge and obtain interpretable outputs;
  • Develop innovative data-driven and hybrid modelling algorithms to increase the accuracy of machinery and factory models, discover new knowledge and allow tuning of physical models;
  • Develop cross-context models to allow machine learning models to adapt to new contexts wherein machines and factories need to operate;
  • Introduce virtual, augmented and mixed reality into the manufacturing industry based on/inspired by the gaming industry.

Topics

Sensor Fusion & Dynamic Dashboards

By designing multi-modal and multi-sensor architectures we enable collaboration among sensors (classical, video-based, time-series, mobile sensing and virtual/data mining sensors) in order to feed the available information and intelligence of all sensors back to optimize their functionality and enhance the detection and interpretation of advanced events. A wide range of applications can benefit from these multi-modal and multi-sensor architectures that fuse, amongst others, visual, audio, thermal, vibration and/or data mining information. Examples are industrial process control and condition monitoring.

The heterogeneity and vast number of sensors, as well as the difficulty of creating interesting sensor data combinations, also hinder the deployment of fixed-structure dashboards, as they are unable to cope with the correspondingly vast number of required mappings. Therefore, we additionally develop dynamic dashboards that precisely visualize the data of interest to the end-user, produced by sensors in multi-sensor environments, by dynamically generating meaningful service compositions, allowing the detection of complex events that used to remain undetected.
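
As a minimal illustration of the fusion principle, the sketch below combines two hypothetical sensor streams (vibration and temperature) by standardizing each signal and weighting their evidence, so that a simultaneous deviation too weak to trigger either sensor on its own is still detected. The sensor names, weights and threshold are assumptions made for this example only, not part of the actual dashboard implementation.

```python
# Minimal sensor-fusion sketch: hypothetical vibration and temperature
# streams are standardized and their evidence combined before thresholding.
import numpy as np

def zscore(signal):
    """Standardize a 1-D signal so sensors with different units are comparable."""
    return (signal - signal.mean()) / (signal.std() + 1e-9)

def fuse_and_detect(vibration, temperature, threshold=3.0):
    """Flag time steps where the fused evidence of both sensors exceeds a
    threshold, i.e. events that neither sensor would reliably flag alone."""
    fused = 0.5 * np.abs(zscore(vibration)) + 0.5 * np.abs(zscore(temperature))
    return np.where(fused > threshold)[0]

# Toy example: a simultaneous spike in both modalities triggers a detection.
rng = np.random.default_rng(0)
vib = rng.normal(0.0, 1.0, 500)
temp = rng.normal(20.0, 0.5, 500)
vib[250] += 5.0
temp[250] += 3.0
print(fuse_and_detect(vib, temp))  # expected: index 250
```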

More information

More in this video

Series Distance Matrix for Anomaly Detection

We developed an unsupervised learning toolbox (Generalized Matrix Profile) that processes sequences of discrete-time data. The toolbox offers a variety of techniques implemented in Python, such as the Z-normalized Euclidean Distance, Matrix Profile (MP), Multidimensional MP (MMP), Contextual MP (CMP), Radius Profile, Series Distance Matrix, etc., each suited to address specific challenges of discrete-time data. Thanks to this variety of implemented techniques, the toolbox has the potential to become a near-universal data mining solution. One specific tool is the Series Distance Matrix (SDM), a data-driven, black-box framework capable of efficiently detecting anomalies and trends in time series. At its core, the method searches for pairwise 'mathematical' distances (i.e. correspondences) between selected subsequences within the time series. The next step involves processing the resulting distance matrix; our framework supports fragmented processing of this matrix, making it much more efficient and suitable for (i) processing long time series as well as (ii) online processing. Our toolbox is currently deployed for anomaly detection on the NewYorkDataset and the VentilationDataset.
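
To make the idea concrete, the brute-force sketch below computes a naive matrix profile: for every subsequence it stores the z-normalized Euclidean distance to its best non-overlapping match, so subsequences without a good match stand out as anomalies. This quadratic-time version is for illustration only and is not the efficient fragmented/online processing the toolbox provides.

```python
# Naive matrix profile sketch for anomaly detection (quadratic time,
# illustration only; the toolbox uses far more efficient algorithms).
import numpy as np

def znorm(x):
    """Z-normalize a subsequence so only its shape matters."""
    return (x - x.mean()) / (x.std() + 1e-9)

def matrix_profile(ts, m):
    """For each length-m subsequence, the distance to its best
    non-overlapping match elsewhere in the series; high values = anomalies."""
    n = len(ts) - m + 1
    mp = np.full(n, np.inf)
    for i in range(n):
        for j in range(n):
            if abs(i - j) < m:  # skip trivial (overlapping) matches
                continue
            d = np.linalg.norm(znorm(ts[i:i + m]) - znorm(ts[j:j + m]))
            mp[i] = min(mp[i], d)
    return mp

# Toy example: a sine wave with one distorted stretch; the profile peaks
# on the subsequences covering the distortion, since they have no good match.
t = np.linspace(0, 8 * np.pi, 400)
ts = np.sin(t)
ts[200:230] += 0.8
mp = matrix_profile(ts, m=50)
print("most anomalous subsequence starts near index", int(np.argmax(mp)))
```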

Link to Github

Surrogate models - AI for small data sets

In a manufacturing context, models are often unavailable (e.g. for very complex processes) or too challenging to run in the loop (e.g. high-fidelity models). In such cases, surrogate models provide fast-running approximations of complex, time-consuming computer simulations. Surrogate models are also known as response surface models (RSM), metamodels, proxy models or emulators. They mimic the complex behavior of the underlying simulation model and bridge the gap between the numerical or experimental on the one hand, and the analytical on the other. Surrogate models are used for parametric studies, optimization, design-space exploration, visualization, prototyping, uncertainty quantification and sensitivity analysis.

Further, building such models requires data. However, obtaining that data can be expensive (costly experiments, numerically challenging simulations, ...). One way to cope with this is to apply adaptive sampling within a design-of-experiments strategy. We have developed a whole suite of tools to build surrogate models.
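
A minimal sketch of how a surrogate model and adaptive sampling work together is given below: a Gaussian process emulates a stand-in for an expensive simulation, and each iteration adds the candidate point where the surrogate is most uncertain. The Gaussian process, the toy simulation and the variance-based criterion are illustrative assumptions, not a description of the group's own toolbox.

```python
# Surrogate modelling with variance-based adaptive sampling (illustrative
# sketch using scikit-learn; the toy "simulation" stands in for a costly one).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulation(x):
    """Stand-in for a costly experiment or high-fidelity simulation."""
    return np.sin(3 * x) + 0.5 * x

# A handful of initial expensive evaluations and a pool of candidate inputs.
X = np.array([[0.0], [1.0], [2.0]])
y = expensive_simulation(X).ravel()
candidates = np.linspace(0.0, 3.0, 200).reshape(-1, 1)

for _ in range(5):  # adaptive sampling loop
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]           # most uncertain candidate
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_simulation(x_new))

# Refit on all samples; the surrogate is now a cheap stand-in for the simulator.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X, y)
mean, std = gp.predict(candidates, return_std=True)
print("maximum remaining predictive std:", float(std.max()))
```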

More information

Blue-collar training based on AR/VR

Using the Virtual Reality training application made by Howest DAE-Research, Flemish Minister Crevits learned how to manually assemble a machine part step by step. Embedding the graduation work of student Giuliano De Luca made it possible to blend the virtual and real world by using the green-key studio inside the Level, thus creating a mixed reality experience. The tools are being developed, among others, within the VLAIO-TETRA project 'Sector Innovating Virtual & Augmented Reality'. In this project, Flemish organisations are guided in defining the added value of Virtual & Augmented Reality technology in their current workflow. One part of the project focuses on developing proof-of-concept applications for Flemish industrial settings (e.g. remote support, virtual reality training, ...) with the aid of Howest application and game developers.

More information

Learn more here

Research & test infrastructure

See our list of research and test infrastructure here