
PLM & Predictive Engineering Analytics



Disclaimer: This post is heavily biased in favor of Products, not Services.


I keep receiving lots of questions about Engineering and Data Science, so I strongly recommend reading this post before sending me a question.


I'm going to base my arguments on Siemens' definition of a PLM system:

"Product lifecycle management (PLM) is an information management system that can integrate data, processes, busines systems and, ultimately, people in an extended enterprise. PLM software allows you to manage this information throughout the entire lifecycle of a product efficiently and cost-effectively, from ideation, design and manufacture, through service and disposal."

In short, a successful PLM implementation needs to be strongly correlated with faster time-to-profit, reduced production costs, and continuous innovation throughout the product lifecycle (from development to phase-out).


I'm going to focus more on Ideation and Design in this post. Predictive Engineering Analytics (PEA) is the closest that Data Science, Machine Learning, Optimization and Simulation can get to delivering astonishing results to a company. This can be achieved through:

  • Successful introduction/maintenance of new/current software (CAx) tools;

  • Digital prototyping, rapid prototyping and engineering prototyping (not to be confused with one another); more at [4];

  • Integration between current processes and new tools/updates/upgrades;

  • Improvement of simulation processes (see the sketch after this list);

  • Improvement of testing processes;

  • Facilitate collaboration between teams;
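To make this less abstract, here is a minimal, hypothetical sketch of the data-science-plus-simulation idea behind PEA: a cheap surrogate model is fitted to a handful of expensive simulation runs and then used to screen many design candidates. The solver stand-in, the target value and every number here are invented purely for illustration.

```python
# Hypothetical sketch: fit a cheap surrogate to a few "expensive" simulation
# runs, then use it to screen many design candidates without rerunning CAE.
import numpy as np

def expensive_simulation(thickness_mm):
    """Stand-in for a CAE run: returns a deflection-like response (toy model)."""
    return 120.0 / thickness_mm**3 + 0.05 * thickness_mm

# A few sampled design points (in practice these would come from real CAE jobs).
samples = np.linspace(2.0, 10.0, 6)
responses = np.array([expensive_simulation(t) for t in samples])

# Quadratic fit in log-log space -- a toy stand-in for ML-based metamodels.
coeffs = np.polyfit(np.log(samples), np.log(responses), deg=2)

def surrogate(t):
    return np.exp(np.polyval(coeffs, np.log(t)))

# Screen 1000 candidate thicknesses against an assumed deflection target of 1.0.
candidates = np.linspace(2.0, 10.0, 1000)
best = candidates[np.argmin(np.abs(surrogate(candidates) - 1.0))]
print(f"Candidate closest to the deflection target: {best:.2f} mm")
```

In a real workflow, the expensive_simulation call would be a CAE job managed through the PLM system, and the surrogate would feed the design-exploration tools discussed below.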

Most companies rely on independent workflows for Computer-Aided Engineering (CAE), Validation, Verification and Testing. These independent workflows are responsible for a significant number of disconnections in the development process, resulting in delays, rework and inaccuracy.


A couple of years ago there was a lot of buzz around MBE (Model-Based Enterprise), and IT infrastructure was centered on that goal. Since then, Original Equipment Manufacturers (OEMs) and larger suppliers have become "Model Centric": 3D models are developed once and "consumed" downstream, giving every department the same source.


The problem is that now, more than ever, we have far more than just models. We have metadata, (un)structured data, legacy data, ERP (Enterprise Resource Planning), MES (Manufacturing Execution Systems), SCADA (Supervisory Control and Data Acquisition), CRM (Customer Relationship Management), and the really good/bad news is that they all generate/consume tons of gigabytes. Looking at this increasingly complex scenario, Siemens coined a better term to describe such companies: "Digital Enterprise".

Finally, a leading-class PLM will have to connect all the mentioned systems and facilitate a flow between them (where needed) through PEA. This can be visualized with the following points:


Processes that need to be tightly integrated:

  • 3D CAD model (the source for every downstream process, considering iterations and loops)

  • 1D simulation

  • 3D simulation:

* Computational Solid Mechanics (CSM)

* Finite Element Analysis (FEA) w/o Multibody Dynamics

* Computational Fluid Dynamics (CFD)

  • Multibody dynamics / multiphysics simulation

  • Multidisciplinary design exploration (see the sketch below this list);

  • Physical testing (with a clear trend towards minimizing costs: roughly one physical test for every 100 virtual tests); more at [5];

  • Visualization (rendering), real-time visualization, VR (Virtual Reality) and immersive VR [2];

  • Minimization of the costs related to physical mockups;

  • Data Analytics:

* Descriptive Analytics (What?)

* Diagnostic Analytics (Why?)

* Predictive Analytics (What and When?)

* Prescriptive Analytics (How?)

Side note: remember that all the mentioned points can happen BEFORE a full engineering/production-ready prototype exists.
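As a concrete (and deliberately tiny) example of what "multidisciplinary design exploration" driven by 1D simulation can look like, the sketch below minimizes the mass of a cantilever beam cross-section subject to a bending-stress limit. The analytic beam formulas stand in for a real 1D solver, and the loads, material and bounds are assumptions chosen only for illustration.

```python
# Illustrative only: a 1D "simulation" (analytic cantilever beam) driven by a
# design-exploration loop that trades structural mass against a stress limit.
# Real PEA workflows would call managed CAE solvers instead of formulas.
import numpy as np
from scipy.optimize import minimize

L, F, RHO, SIGMA_MAX = 1.0, 2000.0, 7850.0, 250e6  # m, N, kg/m^3, Pa (steel, assumed)

def mass(x):
    """Objective: mass of a rectangular section, design vars b (width), h (height) in m."""
    b, h = x
    return RHO * b * h * L

def bending_stress(x):
    """1D beam 'solver': sigma = M*c/I at the fixed end of the cantilever."""
    b, h = x
    M = F * L
    I = b * h**3 / 12.0
    return M * (h / 2.0) / I

result = minimize(
    mass,
    x0=[0.05, 0.10],
    method="SLSQP",
    bounds=[(0.01, 0.2), (0.01, 0.3)],
    constraints=[{"type": "ineq", "fun": lambda x: SIGMA_MAX - bending_stress(x)}],
)
b, h = result.x
print(f"Lightest section within the stress limit: b={b*1000:.1f} mm, h={h*1000:.1f} mm")
```

In an integrated PEA environment, the same loop would typically dispatch real 1D/3D simulation jobs and log every iteration back into the PLM system so the design history stays traceable.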


We can expect the following positive outcomes from a successful PEA implementation:

  • Predict the behavior of complex systems (mechanics, electronics, software and control systems) with high fidelity models;

  • Integrate multi-disciplinary knowledge across all engineering departments and stakeholders;

  • Provide Models and data for every decision and stage of the design cycle (even from the very beginning);

  • Emulate the Darwinian evolutionary theory: only the best concepts and system architectures will "survive";

  • Faster simulation/optimization processes with fewer disconnections can find problems before modification costs are too high;

  • Balance multi-disciplinary requirements and reinforce IPD (Integrated Product Development);

  • Keep simulation models in-sync with the actual product (i.e. Digital Twins);

  • Simulation and Test/Experiments iterate in a closed loop with the common goal of meeting requirements;

  • Data Science combined with Simulation, Experiments and Testing to gain insights and drive innovation;

  • Development continues after delivery with IoT (Internet of Things). The product itself needs to share usage information (cloud connectivity), and PEA will process the incoming data and turn it into inputs for new features or even live updates (like Tesla's "over the air" software updates). More at [3]. A small sketch of this feedback loop follows below.
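To illustrate that last point, here is a toy sketch of the feedback loop between fielded products and their simulation models (the digital-twin idea above): field telemetry is compared against the twin's prediction, and large deviations are flagged as candidate inputs for recalibration or a design review. The wear model, threshold and telemetry values are all invented for the example.

```python
# Toy sketch of the product-to-twin feedback loop: compare simulated "usage
# telemetry" from fielded units against the digital twin's prediction and flag
# deviations that should feed back into the next design iteration.
import numpy as np

rng = np.random.default_rng(42)

def twin_prediction(load_cycles):
    """Assumed digital-twin model: predicted wear as a function of usage."""
    return 1e-4 * load_cycles

# Pretend telemetry from three cloud-connected units in the field.
load_cycles = np.array([10_000, 25_000, 40_000])
measured_wear = twin_prediction(load_cycles) * rng.normal(1.0, 0.15, size=3)

deviation = (measured_wear - twin_prediction(load_cycles)) / twin_prediction(load_cycles)
for unit, dev in enumerate(deviation, start=1):
    flag = "re-calibrate twin / review design" if abs(dev) > 0.10 else "within tolerance"
    print(f"unit {unit}: deviation {dev:+.1%} -> {flag}")
```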


That's all XD


References:

[1] http://www.zdnet.com/article/gms-new-crash-test-dummies-transmit-data-10000-times-a-second/
[2] https://www.3ds.com/products-services/3dexcite/
[3] https://www.wired.com/insights/2014/02/teslas-air-fix-best-example-yet-internet-things/
[4] https://inertiaengineering.com/three-phases-of-prototyping


