Artificial Intelligence-driven Big Data Analytics, Real-Time Sensor Networks, and Product Decision-Making Information Systems in Sustainable Manufacturing Internet of Things.

Author: Adams, Donald
  1. Introduction

    Digitally networked, artificial intelligence-driven technologies can assist smart shop floors in optimizing personnel collaboration efficiency, manufacturing processes and operational transparency (Nica et al., 2018), and product quality and performance, while reacting adequately to inconstancies in market demand, decreasing the time to market for customized items, and streamlining product innovation (Qi et al., 2021). Mismanagement in data recording and labeling significantly limits prediction accuracy (Bachinger et al., 2021).

  2. Conceptual Framework and Literature Review

    Deep learning-enabled diagnosis necessitates connecting condition-related characteristics (Ginevicius et al., 2020; Lazaroiu, 2017; Mihaila et al., 2016; Wallin and Sandlin, 2020) derived from sensing data (Poliak et al., 2021a, b; Valaskova et al., 2021) to the related deficiency or anomaly root source in equipment and process monitoring (Zhang and Gao, 2021). The advancement of artificial intelligence-based self-governing assembly systems is assisted by deep learning-based instantaneous allocation, modeling, and comprehension (Campbell, 2021; Lazaroiu et al., 2017; Nemteanu and Dabija, 2021; Popescu et al., 2017a, b, c; Vatamanescu et al., 2021) of inactive and fluid scenes, and by intensification in the capacity of reinforcement learning for integrating robot skills (Ji et al., 2021). The prolonged life cycle of manufacturing plants and subsequent demands for prediction models (Ionescu, 2020; Lazaroiu et al., 2020; Noack, 2019; Svabova et al., 2020) require constant model adjustment and administration (Bachinger et al., 2021). By use of Internet of Things smart devices (Kliestik et al., 2021; Lyons and Lazaroiu, 2020; Paskaleva and Stoykova, 2021; ?alnar-Naghi, 2021), production scheduling is exemplarily carried out (Qi et al., 2021). Smart networked factories advance on deep learning for production operations, big data-driven assembly station design and administration for reconfigurable cyber-physical systems, self-optimization patterns for shop floor balancing and sequencing, predictive maintenance and condition supervision of equipment, and self-diagnosis computerized tools developed on machine learning technologies (Cohen et al., 2019).

  3. Methodology and Empirical Analysis

    Building our argument on data collected from Management Events and McKinsey, we performed analyses and made estimates regarding how reliable and resilient smart factories draw on deep learning-based autonomous assembly systems. The data for this research were gathered via an online survey questionnaire. Descriptive statistics of the compiled data from the completed surveys were calculated where appropriate.

  4. Study Design, Survey Methods, and Materials

    The interviews were conducted online, and the data were weighted by five variables (age, race/ethnicity, gender, education, and geographic region) using the Census Bureau's American Community Survey to reflect reliably and accurately the demographic composition of the United States. The precision of the online polls was measured using a Bayesian credibility interval. Confirmatory factor analysis was employed to test the reliability and validity of the measurement instruments. Addressing a significant knowledge gap in the literature, the research complied with stringent methodology, reporting, and data analysis requirements. This survey employs statistical weighting procedures to correct deviations of the survey sample from known population features, which is instrumental in adjusting for differential survey participation and random variation in samples. Results are estimates and commonly differ within a narrow range around the actual value. Descriptive and inferential statistics provide a summary of the responses and comparisons among subgroups.

    If a participant began a survey without completing it, this was treated as withdrawal of consent and the data were not used. To prevent missing data, all fields in the survey were required. Any survey that did not reach greater than 50% completion was removed from subsequent analysis to ensure quality. The data were weighted in a multistep process that accounts for the multiple stages of sampling and nonresponse that occur at different points in the survey process. Test data were populated and analyzed in SPSS to ensure the logic and randomizations were working as intended before launching the survey. To ensure high-quality data, quality checks were performed to identify any respondents showing clear patterns of satisficing (e.g., checking for high rates of leaving questions blank). The cumulative response rate, accounting for non-response to the recruitment surveys and attrition, is 2.5%. The break-off rate among individuals who logged onto the survey and completed at least one item is 0.2%.

    Sampling errors and tests of statistical significance take into account the effect of weighting. Throughout the research process, the total survey quality approach, designed to minimize error at each stage so that the validity of the survey research would not be diminished, was followed. At each step in the survey research process, best practices and quality controls were followed to minimize the impact of additional sources of error regarding specification, frame, non-response, measurement, and processing. Sample weighting was accomplished using an iterative proportional fitting process that simultaneously balances the distributions of all variables. Stratified sampling methods were used, and weights were trimmed not to exceed 3. Average margins of error, at the 95% confidence level, are +/-2%. The design effect for the survey was 1.3. For tabulation purposes, percentage points are rounded to the nearest whole number.
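The weighting procedure described above, iterative proportional fitting (raking) with weights trimmed at 3 and a design effect of 1.3, can be sketched in Python. The variable names, target margins, and sample sizes below are illustrative assumptions for exposition only, not the study's actual data or instrument.

```python
from collections import defaultdict
from statistics import NormalDist

def rake_weights(rows, targets, n_iter=50, cap=3.0):
    """Iterative proportional fitting (raking): repeatedly rescale unit
    weights so the weighted margin of each variable matches its population
    target; then normalize to mean 1 and trim so no weight exceeds `cap`."""
    w = [1.0] * len(rows)
    for _ in range(n_iter):
        for var, shares in targets.items():
            # current weighted total per category of this variable
            margin = defaultdict(float)
            for row, wi in zip(rows, w):
                margin[row[var]] += wi
            total = sum(margin.values())
            # rescale each unit by (target share) / (current weighted share)
            w = [wi * shares[row[var]] / (margin[row[var]] / total)
                 for row, wi in zip(rows, w)]
        mean_w = sum(w) / len(w)
        w = [min(wi / mean_w, cap) for wi in w]  # normalize, trim at cap
    return w

def margin_of_error(p, n, deff=1.3, conf=0.95):
    """Design-effect-adjusted margin of error for a weighted proportion."""
    z = NormalDist().inv_cdf(0.5 + conf / 2)
    return z * (deff * p * (1 - p) / n) ** 0.5

# Illustrative use: two weighting variables with hypothetical targets.
rows = [{"gender": "F", "region": "N"}, {"gender": "F", "region": "N"},
        {"gender": "F", "region": "S"}, {"gender": "M", "region": "S"}]
targets = {"gender": {"F": 0.5, "M": 0.5}, "region": {"N": 0.4, "S": 0.6}}
weights = rake_weights(rows, targets)
```

With the survey's reported design effect of 1.3, `margin_of_error(0.5, 1000)` gives roughly 0.035, i.e. about +/-3.5 points for a 50% proportion from a hypothetical 1,000 weighted interviews.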

  5. Statistical Analysis

    An Internet-based survey software program was utilized for the delivery and collection of responses. Panel research represents a swift method for gathering data recurrently, drawing a sample from a pre-recruited set of respondents. To ensure the reliability and accuracy of the data, participants undergo a rigorous verification process, and incoming data pass through a sequence of steps and multiple quality checks. Only participants with non-missing and non-duplicated responses were included in the analyses. Individuals who completed the survey in too short a period of time, thus answering rapidly with little thought, were removed from the analytical sample. Informed e-consent was obtained from individual participants. Study participants were clearly informed of their freedom to opt out of the study at any point without providing justification for doing so. Question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls. All data were interrogated by employing graphical and numeric exploratory data analysis methods. Descriptive analyses (means and standard deviations for continuous variables, and counts and percentages for categorical variables) were used. Descriptive statistical analysis and multivariate inferential tests were...
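The screening and descriptive steps above can be sketched as follows; the record fields, the speeder threshold, and the sample values are illustrative assumptions, not the study's actual instrument or data.

```python
from statistics import mean, stdev
from collections import Counter

def clean_responses(responses, min_seconds=60):
    """Drop duplicate respondent IDs, records with missing answers, and
    'speeders' who finished faster than a plausible minimum duration
    (the 60-second threshold here is purely illustrative)."""
    seen, kept = set(), []
    for r in responses:
        if r["id"] in seen or None in r.values() or r["duration_s"] < min_seconds:
            continue
        seen.add(r["id"])
        kept.append(r)
    return kept

def describe(values):
    """Mean and SD for continuous data; counts and whole-number
    percentages for categorical data (needs at least two values)."""
    if all(isinstance(v, (int, float)) for v in values):
        return {"mean": mean(values), "sd": stdev(values)}
    n = len(values)
    return {k: (c, round(100 * c / n)) for k, c in Counter(values).items()}

# Illustrative use: one duplicate, one speeder, one incomplete record.
responses = [
    {"id": 1, "duration_s": 300, "q1": 4},
    {"id": 1, "duration_s": 310, "q1": 5},    # duplicate ID -> dropped
    {"id": 2, "duration_s": 20,  "q1": 3},    # speeder -> dropped
    {"id": 3, "duration_s": 250, "q1": None}, # missing answer -> dropped
    {"id": 4, "duration_s": 400, "q1": 2},
]
kept = clean_responses(responses)
```

Only the first record per respondent ID survives, matching the non-duplicated-responses rule stated above.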
