3D Computer Vision-based Production and Simulation Modeling Technologies, Geolocation Data Mining and Tracking, and Cognitive Mapping and Navigation Tools in the Industrial Metaverse

Author: Belas, Jaroslav
  1. Introduction

    Cognitive digital twins, 3D machine and multi-modal deep neural networks, and artificial intelligence-based Internet of Manufacturing Things and blockchain-enabled metaverse systems advance photorealistic virtual reality environments. The purpose of our systematic review is to examine the recently published literature on the industrial metaverse and integrate the insights it provides on 3D computer vision-based production and simulation modeling technologies, geolocation data mining and tracking, and cognitive mapping and navigation tools. By analyzing the most recent (2022-2023) and most significant (Web of Science, Scopus, and ProQuest) sources, our paper has sought to show that digital twin manufacturing and synthetic data-based autonomous production systems, infrastructure virtualization and 3D visual technologies, and collaborative intelligence and deep learning-based ambient sound processing tools configure realistic 3D simulation environments. The timeliness and novelty of this study lie in its treatment of spatial cognition and situational awareness algorithms (Kliestik et al., 2022), hyper-realistic artificial intelligence-generated faces, and semantic network representations in the industrial metaverse, an emerging topic attracting considerable interest. Our research problem is whether synthetic digitally mediated environments and interactive metaverse environments necessitate digital twin simulation and deep learning artificial intelligence tools (Zvarikova et al., 2023), Internet of Things sensing infrastructures, and virtual immersive and remote sensing technologies.

    In this review, prior findings have been synthesized indicating that ambient sound recognition software, biometrics data fusion, and blockchain token-based digital assets are pivotal in the interconnected metaverse. The identified gaps point to the need for further research on neuromorphic computing and wearable haptic augmented reality systems, image processing and geospatial mapping tools (Grupac and Lazaroiu, 2022), and ambient sound recognition software (Machova et al., 2022) in the industrial metaverse. Our main objective is to indicate that machine learning prediction and production scheduling algorithms, digital twin modeling and simulation tools (Valaskova et al., 2022), and extended reality and behavioral biometric technologies assist Internet of Things-based industrial environments.

  2. Theoretical Overview of the Main Concepts

    Mobile location analytics, industrial Artificial Intelligence of Things, and explainable artificial intelligence-based decision support and metaverse decentralized governance systems shape immersive virtual environments. 3D digital environments require spatial audio and cognitive modeling technologies, Internet of Things-based decision support and visual perceptive systems, and 3D image modeling and machine vision tools. The manuscript is organized as follows: theoretical overview (section 2); methodology (section 3); geospatial big data visualization and neuromorphic image processing systems, context awareness and brain-inspired artificial intelligence algorithms, and event modeling and forecasting tools in the industrial metaverse (section 4); digital twin manufacturing and synthetic data-based autonomous production systems, extended reality and behavioral biometric technologies, and 3D modeling and virtual simulation tools in the industrial metaverse (section 5); machine learning-based production planning and scheduling systems, image processing and geospatial mapping tools, and virtual object behavior modeling in the industrial metaverse (section 6); discussion (section 7); synopsis of the main research outcomes (section 8); conclusions (section 9); and limitations, implications, and further directions of research (section 10).

  3. Methodology

    Throughout January 2023, a quantitative literature review of the Web of Science, Scopus, and ProQuest databases was performed, with search terms including "the industrial metaverse" + "3D computer vision-based production and simulation modeling technologies," "geolocation data mining and tracking," and "cognitive mapping and navigation tools." Because only research published between 2022 and 2023 was inspected, just 143 articles satisfied the eligibility criteria. By removing controversial or ambiguous findings (insufficient or irrelevant data), outcomes unsubstantiated by replication, overly general material, and studies with nearly identical titles, we selected 26 mainly empirical sources (Tables 1 and 2). Data visualization tools: Dimensions (bibliometric mapping) and VOSviewer (layout algorithms). Reporting quality assessment tool: PRISMA. Methodological quality assessment tools: AMSTAR, Dedoose, DistillerSR, and SRDR (Figures 1-6).
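
    For transparency, the eligibility screening described above can be illustrated programmatically. The short Python sketch below is a hypothetical illustration, not part of the review protocol: the input file name (records.csv), the record fields (year, title, abstract), and the 0.9 title-similarity threshold are assumptions introduced only for this example. It keeps 2022-2023 records on the industrial metaverse that match at least one search term and then drops studies with nearly identical titles.

# Hypothetical sketch of the eligibility screening step; field names, the input
# file, and the similarity threshold are illustrative assumptions.
import csv
from difflib import SequenceMatcher

SEARCH_TERMS = (
    "3d computer vision-based production and simulation modeling technologies",
    "geolocation data mining and tracking",
    "cognitive mapping and navigation tools",
)

def is_eligible(record: dict) -> bool:
    # Keep 2022-2023 industrial-metaverse records matching at least one search term.
    try:
        year = int(record.get("year", 0))
    except ValueError:
        return False
    text = (record.get("title", "") + " " + record.get("abstract", "")).lower()
    return (
        2022 <= year <= 2023
        and "industrial metaverse" in text
        and any(term in text for term in SEARCH_TERMS)
    )

def drop_near_duplicate_titles(records: list[dict], threshold: float = 0.9) -> list[dict]:
    # Exclude studies whose titles are nearly identical to an already kept record.
    kept: list[dict] = []
    for rec in records:
        title = rec.get("title", "").lower()
        if all(
            SequenceMatcher(None, title, other.get("title", "").lower()).ratio() < threshold
            for other in kept
        ):
            kept.append(rec)
    return kept

if __name__ == "__main__":
    with open("records.csv", newline="", encoding="utf-8") as handle:  # assumed database export
        candidates = [row for row in csv.DictReader(handle) if is_eligible(row)]
    selected = drop_near_duplicate_titles(candidates)
    print(f"{len(candidates)} eligible records, {len(selected)} retained after duplicate-title removal")

    In practice, the screening was performed manually against the criteria above; the script only mirrors the logic of the year, topic, and duplicate-title filters.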

  4. Geospatial Big Data Visualization and Neuromorphic Image Processing Systems, Context Awareness and Brain-inspired Artificial Intelligence Algorithms, and Event Modeling and Forecasting Tools in the Industrial Metaverse

    Extended reality environments and the industrial metaverse (Calandra et al., 2023; Durana et al., 2022; Leng et al., 2022) require deep learning-based sensing and cognitive enhancement technologies, decision-making process automation and simulation modeling tools, and geospatial big data visualization and neuromorphic image processing systems. Synthetic digitally mediated environments and interactive metaverse environments necessitate digital twin simulation and deep learning artificial intelligence tools, Internet of Things sensing infrastructures, and virtual immersive and remote sensing technologies.

    Photorealistic synthetic environments and digitally-networked mediated spaces (Chen, 2023; Liu et al., 2023; Yao et al., 2022) require event modeling and forecasting tools, metaverse assets and services, and context awareness and brain-inspired artificial intelligence algorithms. Smart sensor and biometric self-authentication devices, wearable haptic garments, and 3D holographic avatars are pivotal in extended reality environments and immersive hyper-connected virtual spaces.

    Multisensory immersive extended reality experiences (Ganchev et al., 2023; Ma et al., 2023; Yang and Wu, 2023) can be achieved through spatial cognition and situational awareness algorithms, hyper-realistic artificial intelligence-generated faces, and semantic network representations in the industrial metaverse...
