IFAC Conference on Manufacturing Modelling, Management and Control
June 28-30, 2016. Troyes, France


IFAC-PapersOnLine 49-12 (2016) 249–254

A database-centric approach for the modeling, simulation and control of cyber-physical systems in the factory of the future

Andrea Bonci*, Massimiliano Pirani*, Sauro Longhi*

*Dipartimento di Ingegneria dell'Informazione (DII), Università Politecnica delle Marche, 60131, Ancona, Italy (e-mail: [email protected], [email protected])

Abstract: The path towards Industrie 4.0 requires that factory automation problems cope with the cyber-physical system complexity and its challenges. Some practical experiences and literature in the field testify that the role of the database management systems is becoming central for control and automation technology in the new industrial scenario. This article proposes database-centric technology and architectures that seamlessly integrate networking, artificial intelligence and real-time control issues into a unified model of computing. The proposed methodology is also viable for the development of simulation and rapid prototyping tools for smart and advanced industrial automation.

© 2016, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.
Peer review under responsibility of International Federation of Automatic Control.
10.1016/j.ifacol.2016.07.608
Keywords: Computational methods, Database management systems, Distributed computer control systems, Embedded systems, Industry automation, Intelligent manufacturing systems, Simulators.

1. INTRODUCTION

The factory of the future is in the objectives of the Industrie 4.0 strategy. Cyber-physical systems (CPSs) are the new research framework that makes the complexity of the Industrie 4.0 goals treatable and where physical and software components are deeply intertwined, interacting with each other in a myriad of ways that change with context (Lee 2015). The efficiency measurements and assessment of industrial production processes in the future scenario have to take into account the increasing role of information flow across processes, from the enterprise system level down to the shop-floor. Information is the main vehicle that allows humans to be in control of sustainability and productivity, and it allows the introduction of artificial intelligence as a decision support tool. The appearance of unforeseen behaviours is a typical phenomenon of complexity: new features and prospects might emerge from the deliberate application of data mining techniques coupled with artificial reasoning and inference on already well known and established data. Indeed, new opportunities would originate from the full exploitation of information acquired, stored and communicated in industrial processes. Most of it, unfortunately, due to the high volume of data, is usually doomed to be neglected as noise rather than exploited as useful added-value information. Though expensive smart metering methods are common on current manufacturing plants, they deserve a deeper exploitation.

This paper proposes an "information-conservative" approach, suggesting a key enabling technology and a methodology for the modeling, control, simulation, planning, optimization and scheduling of industrial processes, through the dynamical assessment of some common key performance indicators (KPIs). The key approach is the pervasive use of relational database systems that actively support transmission, storage and elaboration of information across the 5 levels defined in the ISA-95 standard, from sensing and actuation to the management of a network of enterprises. The traditional 5 ISA-95 levels seem designated to be blurred and surpassed soon by the new smart factory technologies and concepts. A move is expected from the existing hierarchical control structures, based on the ISA-95 automation pyramid, towards more decentralized and reconfigurable structures based on the CPS principles (Leitao 2015). Indeed, cyber-physical production systems (CPPSs) will show cyber capabilities within every physical component, as distributed computing along with distributed intelligence and 'self-*' methods, namely self-adaptation and self-configuration, along with self-diagnosis, self-organization and self-reconfiguration dynamics, as required by the Industrie 4.0 strategy.

The introduction of an active distributed database mechanism at the shop-floor, through the best embedded database technologies available today, renders the data mining and the optimization of processes viable for the CPS challenge. By using the qualities of a declarative language, such as the database languages of the relational model, most of the techniques for planning and optimization (Jeon 2016) can be enabled dynamically. Decision support systems based on time-aware relational model inference can lead towards results potentially unforeseen at the beginning of the information gathering (Yang 2016, Nickel 2016, Date 2014). The full relational model will require more scientific effort in the future, but a restricted database-centric technology based on the established SQL database language standard can create a first technological step towards the challenges posed by the smart manufacturing scenario.

We propose guidelines and technology hints that can be effectively used in KPI-based control for the energy efficiency of industrial processes within the sustainable factory of the future research framework (Stiel 2016).

In section 2 we introduce related work on the basic ideas of the database-centric technology. In section 3 a problem



description along with a technology description and its unifying role for industry is provided. In section 4 a simulation and modelling methodology is introduced. In section 5 the results expected from the methodology are discussed. Section 6 is for conclusions.

2. PERVASIVE DATABASE TECHNOLOGY AND RELATED WORK

Information is gold, and nothing of it should be lost. On top of it we can decide on different views with their different resolutions and granularity. As the level of aggregation of information increases, the influence of changing structural effects and other factors is lost, as some emergence over the data might remain statically frozen by the aggregation itself. Going down towards the lower and more detailed levels increases the understanding of the multitude of factors that affect energy efficiency, in order to make smart production decisions. However, as we go into detail the quantity of data and sensing becomes complex, and this complexity is unavoidable in the cyber-physical systems scenario. We have to cope with that complexity in a clever and lean way. The appropriate level of detail for the construction of energy efficiency indicators can be controlled through the application of a methodology based on pervasive database management systems (DBMSs).

Industrial production processes are complex, and the heterogeneity of the components and protocols across the factory floor, and its organizational and geographical boundaries, creates strong demands for system integration and interoperability (He 2014). Valuable unifying efforts towards standards, like OPC UA (Mahnke 2009), have been well established. In any case, this computing model might not be completely able to scale down and keep up with the incoming CPS storm, mainly because those issues were not present at the time of its design. While in OPC UA the database is used mainly for historical data storage, in our proposed approach each component or agent in the process is provided with active database capabilities that track data and events to trigger control on the plant. Information is made constantly available for its use at the highest strategic levels, where decisions cope with the long-term issues. Decisions made at the strategic level can influence the procedures implemented at lower levels through dynamical programming based on planning and reasoning with context awareness. By using distributed computing techniques and tools, we can keep the costs of complex solutions low and viable. The technique of adding a database everywhere does not necessarily require a change in the lower level technology or acquisition system; it can be applied on top of existing facilities.

Distributed databases for CPSs imply networking, as we have to include the Internet of things (IoT). In (Jia 2013) it is shown how the IoT promise of integrating the digital world of the Internet with the physical world needs to be implemented through a systematic approach for integrating sensors and actuators, where data is the central entity for the realization of context-aware services. In (Rajhans 2014) it is shown how abstraction of models is necessary in the current complexity of CPSs. For this kind of abstract modelling we put forth the expressiveness of the relational model or, where not fully available, at least the ordinary relational database technology with the plethora of extensions provided by SQL database system vendors.

Nowadays, some industrial producers like Inductive Automation™ already use this kind of database-centric technique (white paper 2012). They use a SQL database as the centre of every software and business logic component of their industrial automation applications. In (De Morais 2014) the author shows how data processing can be integrated and performed within the DBMS. Both of the formerly cited solutions are going in the right direction, but the necessary scalability foreseen in cyber-physical systems is still lacking. In CPSs the database must scale down to use a minimum of resources (potentially below 1 MB of memory). Therefore, the SQL DBMS implementation needed must fit the requirements of special purpose embedded solutions. A viable DBMS implementation could be represented by the very portable and mature SQLite, which is released in the public domain and is apt and open to innovation and research.

Nevertheless, a major drawback in the SQL approach is SQL itself. Notably, SQL falls short when it has to cooperate with artificial intelligence. Although the SQL database language is a declarative and 4th generation programming language, it does not supply the expressiveness needed, for example, in first order logic inference problems. The real model that should be adopted and implemented soon is the relational model in the original Codd's sense (Codd 1990), reviewed by some followers (Darwen 2012). The relational model in general can act as a higher-order logic representation language, fit to cope with all the problems and inherent limitations of SQL (e.g. SQL has no standard query for asking for a list of the database tables, or to treat a table itself as a variable; this is usually overcome by producers with proprietary extensions). Unfortunately, no lightweight implementation of the relational model is available yet. Until research efforts in the relational model implementation sense become available, we are left with SQL and its shortcomings. Besides, SQL based automation can still represent a great step forward in the paradigm for industrial process control right now. Means and techniques to adapt SQL to artificial intelligence (AI) have already been developed (Schuldt 2008, AlAmri 2012, Bhatia 2013). The use of SQL as a base technology for AI and automation is promising and is the straightest path towards managing the complexity of the CPS challenges.
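As an aside illustrating the limitation just mentioned, the following minimal Python/SQLite sketch (not taken from the paper; the table names are arbitrary) shows the kind of vendor-specific workaround that stands in for the missing standard query: SQLite exposes its schema through the sqlite_master catalog table, while PostgreSQL or MySQL would need information_schema.tables or SHOW TABLES respectively.

import sqlite3

# Standard SQL defines no portable query for listing a database's tables,
# so every engine exposes its own catalog. SQLite keeps the schema itself
# in an ordinary table, sqlite_master, queryable with plain SQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE v1(value REAL, timestamp TEXT);
    CREATE TABLE v2(value REAL, timestamp TEXT);
""")

# Engine-specific catalog query (PostgreSQL: information_schema.tables,
# MySQL: SHOW TABLES), i.e. exactly the proprietary divergence noted above.
tables = [row[0] for row in
          conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'")]
print(tables)  # ['v1', 'v2']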

3. FACTORY AND PROCESS AUTOMATION PROBLEM

A DBMS-centric approach for complex process automation was already looked into, through the federation of MySQL and PostgreSQL technologies, in 2008 in the developments of a European FP7 research project (CAFE 2008). MySQL was used for embedded units as a moderately lightweight DBMS solution for embedded Remote Terminal Unit (RTU) boards, while PostgreSQL was used as the remote global data centre. The major concept put forth there was the unifying role of the DBMS to host heterogeneous technologies for acquisition, actuation and data processing. With suitable and simple adapters, any sensor and actuator, along with special purpose computing units and machines, were connected in a whole unified DBMS informational centre. In Figure 1 we show the unifying role of DBMS-centric technology in an architecture


that copes with the problem of the heterogeneity of the automation components.

Figure 1: the unifying role of the database-centric approach.

Note that the DBMS must be a distributed system. The distribution of the information and processing is carried out by a lightweight synchronization mechanism obtained by well calibrated data replication. A distributed infrastructure is obtained when suitable mechanisms are used to propagate the updates on database tables across the distributed DBMS units. Therefore, an UPDATE or INSERT on a table, due to a new acquisition, is propagated to a networked DBMS that hosts a logic that needs the information update as input.

4. DISTRIBUTED DBMS-CENTRIC METHODS

4.1 Lightweight Database Synchronization through Distributed Replication

Through the replication mechanism, a selective and optimized synchronization among the computing actors is obtained. Through appropriate configuration programming (in database language, of course), the set of subscriber components asynchronously receives updates through the networking infrastructure. Information is propagated and communicated across the whole infrastructure through appropriate triggers registered on the database tables. This allows every device and controller to be in contact through the data, simply by querying the database. Each embedded database can synchronize centrally or with a neighbor by means of a lightweight replication mechanism. In SQLite, for example, these kinds of triggers are easily implemented. A trivial example of trigger code is provided, with the assumption that table v1 has value and timestamp columns (attributes):

CREATE TRIGGER vardisplay AFTER INSERT ON v1
BEGIN
  SELECT publish_value(NEW.value, NEW.timestamp);
END;

The publish_value() procedure can be written in C, or in any other language, and might transmit the value to a distant component's database that requires it for its own device control purposes.

Some assumptions can be made that allow synchronization efficiency. It can be assumed that a variable xxx is unique and has an absolute identifier (at present, the IPv6 addressing scheme appears to be the best identifier choice). The variable xxx might be associated with a physical object in the universe. The xxx identifier must in any case be unique and reserved forever (like a Web URI). It has to be assumed that the process that affects the values of variable xxx is unique, whether it is a mere sensor acquisition task or complex processing. It can further be assumed that xxx is a primary variable, as it is the only source of the information related to xxx in the whole universe, and so it is called a Master variable. A Master variable is a variable that is physically acquired by a device (e.g., embedded board, general purpose computer, software) or is generated through calculation from one or more acquired variables. Slave variables are then defined as those variables that need to be synchronized to a certain Master through replication. Of course, a process that affects a Master variable can also concurrently be fed as a Slave from other Master or Slave variables. A topological structure is then obtained that allows all of the sources of information to feed all of the distributed components that subscribe to the Master variables. Note that the replication configurations are themselves database variables, as everything should be a relational variable in the relational model. They can be shared and moved across the network as the processes that use them are also distributed, but unique (similar to a Domain Name System).
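To make the trigger above concrete, here is a minimal sketch of how publish_value() could be registered as a user-defined SQL function with SQLite's Python binding; the print() call is only a stub standing in for the replication transport towards the subscribing (Slave) databases and is not part of the paper.

import datetime
import json
import sqlite3

def publish_value(value, timestamp):
    # Stub for the replication transport: a real deployment would push the
    # new record to the subscribing (Slave) databases over the network.
    print("publish:", json.dumps({"value": value, "timestamp": timestamp}))

conn = sqlite3.connect(":memory:")
# Expose publish_value() to SQL as a two-argument user-defined function,
# so the AFTER INSERT trigger can invoke it from pure database language.
conn.create_function("publish_value", 2, publish_value)

conn.executescript("""
    CREATE TABLE v1(value REAL, timestamp TEXT);
    CREATE TRIGGER vardisplay AFTER INSERT ON v1
    BEGIN
        SELECT publish_value(NEW.value, NEW.timestamp);
    END;
""")

# A new acquisition on the Master variable v1 fires the trigger automatically.
conn.execute("INSERT INTO v1(value, timestamp) VALUES (?, ?)",
             (21.5, datetime.datetime.now().isoformat()))
conn.commit()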

4.2 Publish-subscribe paradigm and the Internet of Things

The formerly explained replication technique addresses the major requirement for communication in CPSs. In this context communication is the Internet of things (IoT). It is well known that a challenge posed by the IoT is the search for a unified and lightweight means of communication that allows machine-to-machine (M2M) intelligent connection (Al-Fuqaha 2015). Breaking new ground in pervasive and distributed computing is the swarmlet concept. Swarmlets are applications and services that leverage networked sensors and actuators with cloud services and mobile devices. Their architecture is conceived to embrace heterogeneity instead of attempting yet another unreasonable standardization (Latronico 2015). The proposed DBMS replication technique is a candidate to be easily compliant with the major architectures in the IoT. As described in section 4.1, our proposal follows a publish/subscribe scheme where each RTU or piece of software acts like a human being that only subscribes to interesting events/information and publishes relevant events or information. Prominent examples of publish/subscribe schemes are MQTT (Locke 2010) and the robot operating system (ROS: http://www.ros.org/). A promising framework is DDS (Data Distribution Service, at http://portals.omg.org/dds/). DDS addresses data in a manner similar to relational databases, and its optimization for the IoT has recently been considered (Beckman 2015). Furthermore, the Allseen Alliance (https://allseenalliance.org/) puts forth the AllJoyn® Framework. The common concept for all of them is the topic, to which each node subscribes and some other publishes


information. In our DBMS-centric proposal the topic is simply the database content itself, without intermediate language adapters: the unique language is the database language and its queries. All the information in the process is published to the actors, and any controller or optimizer can be built over the DBMS infrastructure and its semantic abstraction as a plug-in. However, in a different context, (De Morais 2014) gives a quite detailed explanation of the advantages of such an active database-centric approach. In the distributed system the DBMS must play the role of the inter-process communication (IPC) means. In (De Morais 2014) the IPC is mentioned but not used in a distributed sense (because there was only one DBMS computing unit). They use the term resource adapter to define a computing processing unit that is connected with physical actuators and sensors and which communicates with a centralized DBMS through the IPC extensions available in PostgreSQL: the NOTIFY, LISTEN and UNLISTEN commands. In the industrial distributed control system (DCS) context the same resource adapter role is performed by the remote terminal units (RTUs). IPC through the DBMS connects the RTUs and the higher level central units of the DCS system.
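For the PostgreSQL side, the NOTIFY/LISTEN mechanism mentioned above can be sketched as follows with the psycopg2 driver; the connection string and the channel name v1_updates are illustrative only, not prescribed by the paper or by (De Morais 2014).

import select
import psycopg2

# Illustrative connection parameters and channel name.
conn = psycopg2.connect("dbname=plant user=rtu")
conn.autocommit = True
cur = conn.cursor()

# The resource adapter (RTU side) subscribes to a channel ...
cur.execute("LISTEN v1_updates;")

# ... while the producing side, typically from a trigger fired by the INSERT
# of a new acquisition, would execute something like:
#   NOTIFY v1_updates, '{"value": 21.5, "timestamp": "2016-06-28T10:00:00"}';

while True:
    # Wait until the server socket becomes readable (5 s timeout).
    if select.select([conn], [], [], 5) == ([], [], []):
        continue
    conn.poll()
    while conn.notifies:
        note = conn.notifies.pop(0)
        print("IPC event on", note.channel, "payload:", note.payload)
        # Here the trigger-registered procedure or plug-in would be invoked.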

4.3 Remote Terminal Units and plug-ins

Typically, an RTU in industrial control systems is a holonic unit which features some local procedural capabilities along with concurrent communication tasks. It is responsible both for real-time autonomous control actions and decisions and for communications with higher intelligence layers, up to the enterprise management level. Holonic systems are relevant in enabling the multi-agent system technology for intelligent distributed control of industrial plants (Leitao 2013). For our purposes, an RTU device features an embedded DBMS and replication. We need to identify and separate four parts in such an RTU structure: the input, the database tables, the logical and procedural part, and the output. In Figure 2 we identify the four holonic RTU parts. The input part contains the hardware and software that converts input information into an INSERT/UPDATE database query. DBMS tables and related triggers constitute the second part. The triggers launch the logic and algorithms relevant to the former event (with a local publish/subscribe scheme as well) on the third part. That might produce a new INSERT/UPDATE query as a result, or a physical output to be handed off to the output part. Concurrently, information is published or received with the replication mechanism across the whole DCS. When a sensor acquires new data, it is put in the DBMS through an INSERT command. The new record is transmitted to other holons or to a central DBMS by simply transmitting the same database query through the network. If some distributed business logic depends on that information, a database trigger calls the associated procedure. The output can in turn be propagated to other units. The same mechanism, in reverse order, is used for actuators. In this computing infrastructure the database tables are the globally shared storage part, with their triggers for IPC.

The components see the database tables as the unified and unique information interface available. With such an infrastructure, there is fertile land for intelligence and logic to grow and develop.
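A compact sketch of the four parts on top of SQLite is given below; the table names, the toy on/off control law and the timestamps are invented for illustration and are not taken from the paper.

import sqlite3

conn = sqlite3.connect(":memory:")

def control_logic(value):
    # Part 3: procedural logic launched by the trigger (a toy on/off law;
    # a real plug-in would host the actual control algorithm).
    setpoint = 20.0
    return 1.0 if value > setpoint else 0.0

# Make the control law callable from SQL, so the trigger can invoke it.
conn.create_function("control_logic", 1, control_logic)

conn.executescript("""
    -- Part 2: database tables, the globally shared storage of the holon.
    CREATE TABLE sensor_temp(value REAL, timestamp TEXT);    -- input variable
    CREATE TABLE actuator_valve(value REAL, timestamp TEXT); -- output variable

    -- Trigger coupling parts 2 and 3 (the local publish/subscribe step).
    CREATE TRIGGER on_temp AFTER INSERT ON sensor_temp
    BEGIN
        INSERT INTO actuator_valve VALUES (control_logic(NEW.value), NEW.timestamp);
    END;
""")

# Part 1: the input adapter converts a new acquisition into an INSERT query.
conn.execute("INSERT INTO sensor_temp VALUES (?, ?)", (23.4, "2016-06-28T10:00:00"))

# Part 4: the output adapter reads the latest command and drives the actuator.
row = conn.execute(
    "SELECT value FROM actuator_valve ORDER BY rowid DESC LIMIT 1").fetchone()
print("valve command:", row[0])  # 1.0, since 23.4 > 20.0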

Figure 2: significant functional parts of the RTU.

With the infrastructure available, algorithms can easily be plugged into the physical world as a back-end, while they see a unified database front-end that enables controllers and other software to be seamlessly interconnected. Every algorithm developed is deployed as a plug-in of the system. Every plug-in can be immediately put into communication with the others. This enables a vast class of control topologies and hierarchies, and any controller software is a plug-in. With the plug-in concept, an abstraction layer has been created that enables scientists and engineers to operate the control framework without being concerned about the low-level communication details. A plug-in can be uploaded, enrolled and started through a web-based graphical interface.

Figure 3: the plug-in design and deployment workflow.

Figure 3 illustrates the workflow of plug-in development and deployment. What scientists or practitioners need are the knowledge and the semantics of the database variables, along with the controller source code. The controller source code can come from a simulated and calibrated Simulink® or Stateflow® model, or from any Matlab® or high level programming language. Having in mind the input and output variables, the designer can edit the Integration File with the assignment of the inputs/outputs of the control algorithm to the corresponding database variables; the Integration File, as a set of relations, must be encoded as a database table as well. The source code is then transformed into the procedures that are triggered by the DBMS to perform the computations.
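The paper does not prescribe a schema for the Integration File; purely as a guess at what such a relation could look like, the sketch below stores the binding between a plug-in's formal ports and the database variables, and shows how a deployment step might read it back.

import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical layout: one row per formal input/output port of a plug-in,
# bound to the table that holds the corresponding Master/Slave variable.
conn.executescript("""
    CREATE TABLE integration_file(
        plugin      TEXT,  -- plug-in (controller) name
        port        TEXT,  -- formal input/output name in the controller code
        direction   TEXT,  -- 'in' or 'out'
        db_variable TEXT   -- database variable (table) it is bound to
    );
    INSERT INTO integration_file VALUES
        ('pi_controller', 'measured_temp', 'in',  'sensor_temp'),
        ('pi_controller', 'valve_cmd',     'out', 'actuator_valve');
""")

def io_binding(plugin):
    """Return the {port: db_variable} maps used when deploying the plug-in."""
    rows = conn.execute(
        "SELECT port, direction, db_variable FROM integration_file "
        "WHERE plugin = ?", (plugin,))
    bind = {"in": {}, "out": {}}
    for port, direction, variable in rows:
        bind[direction][port] = variable
    return bind

print(io_binding("pi_controller"))
# {'in': {'measured_temp': 'sensor_temp'}, 'out': {'valve_cmd': 'actuator_valve'}}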


5. EXPECTED EXPERIMENTAL RESULTS

5.1 Rapid process control prototyping through simulation and modeling

In (Bonci 2014) the authors show how the former plug-in concept promotes openness in the rapid prototyping of scientific software, and how it breaks the entry barriers for small and medium enterprises in industrial applications. Through the database abstraction we can model the industrial process automation in two classes of components. The first class is constituted by the database relations (e.g. tables) and communication means (e.g. IPC and the signaling by triggers). The second class is represented by the procedures that implement the controlling algorithms relying on data inputs and outputs, formerly defined as plug-ins. Within the second class, we can also design all the components that constitute a complete model of the physical processes involved in the production. In turn, the plug-ins class is to be divided into two subclasses, one for the control and one for the models suitable for simulations and model-based control (e.g. in model predictive control techniques).

Figure 4: DBMS abstraction and development framework.

In Figure 4, we try to represent how the DBMS-centric infrastructure and related plug-ins are a useful scheme for the creation of tools for the rapid prototyping and simulation of a production plant. Indeed, by switching between simulation and physical acquisition we can create models of the plant or calibrate the controllers as well. Furthermore, by abstracting the business logic blocks and the physical plant blocks over the ubiquitous database infrastructure, we can functionally replace them with the plug-ins that model the control and business logic and the physical plant models in a virtual environment. These blocks can be simulated, tested and calibrated in a suitable development framework and then deployed back into the physical RTUs. Figure 4 expresses the concept of the database as the fundamental and invariant ground enabling the architectural link between the physical and the virtual industrial plant. The dotted lines in the modelling framework are not real connections but associations with the suitable variable tables. They are transformed back into physical communications through the deployment process. The implementation of algorithms through stored procedures or triggers can be challenging. A best practice suggested here is to clearly separate the data infrastructure from the business logic. Once the data inputs and outputs are available globally through the DBMS relations (tables), we can abstract and simulate the business logic that constitutes the algorithms that produce outputs from the inputs, develop them with the preferred language, test and calibrate them, and then compile them back into the DBMS-centric implementation through the plug-ins approach.

5.2 The Virtual framework development

The choice of the virtual framework is quite immaterial as far as the components of the simulation can be connected to a DBMS. Suitable APIs exist for the majority of programming languages. Matlab Simulink® and Stateflow® are quite a de facto standard for researchers in the control community. Future work will focus on these tools for a validation of the proposed methodology. The tools will allow these three phases to be performed easily:
1. Creating models of the production lines, environment and energy consuming components.
2. Gathering the real data and calibrating the models through real incoming data.
3. Using the output of the calibrated models to assess the KPI effectiveness and to optimize through data mining on the raw, non-aggregated data. The optimized blockset is implemented back as a plug-in on the real plant.
The actual novelty is in paying due attention to the capabilities of embedded DBMSs for the dynamical modeling and operation of hybrid systems like the production processes. A good starting point for the next experimental work is the set of already available blocksets for the Simulink environment that support the publish/subscribe scheme, already developed for: ROS (Matlab Robotics System Toolbox®); DDS (MathWorks provides Simulink® blocks and MATLAB® classes for RTI Connext DDS); and the RT-LAB Orchestra® product (http://www.opal-rt.com). The design of the tools starts with the connection of every blockset to a publisher (outputs) and a subscriber (inputs) blockset.

Figure 5: desired functionality of the new Simulink signal/bus element (central box).

The publish/subscribe scheme can be achieved by forging a tailored new signal/bus object in Simulink that transforms the blockset outputs into database INSERT/UPDATE queries.


On query execution, an IPC event triggers a SELECT query for retrieving the blockset input values. This way, the simulator tool is controlled by the DBMS. Note that this also allows hybrid virtual configurations like hardware-in-the-loop techniques (see for example the VxWorks™ Async Interrupt and Async Task blocksets within the Real Time Workshop Toolbox®). Figure 5 shows the functional scheme of the signal object component under development.
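The Simulink signal/bus object itself is presented here as work under development; only to make the intended data flow concrete, the fragment below mimics its behaviour in Python over SQLite, with outputs written as INSERT queries and inputs retrieved by SELECT. Class, table and column names are illustrative.

import sqlite3

class DBSignalBus:
    """Toy stand-in for the proposed signal/bus object: blockset outputs
    become INSERT queries, blockset inputs are fetched with SELECT."""

    def __init__(self, conn, table):
        # Table names are assumed to come from a trusted configuration.
        self.conn, self.table = conn, table
        conn.execute(
            "CREATE TABLE IF NOT EXISTS %s(value REAL, timestamp TEXT)" % table)

    def publish(self, value, timestamp):
        # Called at the end of a simulation step with the block's output.
        self.conn.execute(
            "INSERT INTO %s VALUES (?, ?)" % self.table, (value, timestamp))

    def latest(self):
        # Called at the start of a step to retrieve the block's input.
        row = self.conn.execute(
            "SELECT value FROM %s ORDER BY rowid DESC LIMIT 1"
            % self.table).fetchone()
        return None if row is None else row[0]

conn = sqlite3.connect(":memory:")
plant_out = DBSignalBus(conn, "sensor_temp")      # simulated plant -> DBMS
controller_in = DBSignalBus(conn, "sensor_temp")  # DBMS -> controller block

plant_out.publish(23.4, "2016-06-28T10:00:00")
print(controller_in.latest())  # 23.4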

6. CONCLUSIONS

This paper proposes a viable enabling technology for the harnessing of complex issues in the incoming scenario of industrial automation, where the smart features of cyber-physical systems are going to play a leading role. Based on lessons learned, the state of the art in recent literature and commercial trends, a database-centric architecture is proposed as a suitable solution for the problems in networking and control of the factory and process automation of the future. Moreover, the proposed technique allows the development of simulation, modelling and rapid prototyping tools by leveraging the best available standard approaches and technologies. Future work suggested is the development of scientific research for the enforcement of the full relational model with efficient practical implementations. In the short term, a set of new Simulink tools is going to be developed for the expected development framework, along with tests of advanced control techniques for KPI-based optimization of production and energy efficiency of real processes.

REFERENCES

AlAmri, A. (2012). The relational database layout to store ontology knowledge base. Information Retrieval & Knowledge Management (CAMP), 2012 Int. Conf. on, pp. 74-81.
Al-Fuqaha, A., Guizani, M., Mohammadi, M., Aledhari, M., Ayyash, M. (2015). Internet of Things: A Survey on Enabling Technologies, Protocols and Applications. Communications Surveys & Tutorials, IEEE.
Beckman, K., Dedi, O. (2015). sDDS: a portable data distribution service implementation for WSN and IoT platforms. IEEE WISES 2015, Proc. of 12th Int. Workshop on Intelligent Solutions in Embedded Systems, Oct. 29-30, 2015, pp. 115-120.
Bhatia, P., Khurana, N., Sharma, N. (2013). Intuitive Approach to Use Intelligent Database for Prediction. Int. J. of Computer Applications, vol. 83, no. 15, pp. 36-40.
Bonci, A., Imbrescia, S., Pirani, M., Ratini, P. (2014). Rapid prototyping of open source ordinary differential equations solver in distributed embedded control application. Mechatronic and Embedded Systems and Applications, IEEE/ASME 10th Int. Conf. on, pp. 1-6.
CAFE project, EU FP7-KBBE, g.a. 212754 (2008). Deliverable D.8.6, "The integrated system and the software main frame".
Codd, E. F. (1990). The Relational Model for Database Management: Version 2. Addison-Wesley Longman Publishing Co., Boston, MA, USA.
Darwen, H. (2012). An Introduction to Relational Database Theory. Ventus. Available: http://bookboon.com
Date, C. J., Darwen, H., Lorentzos, N. (2014). Time and Relational Theory: Temporal Databases in the Relational Model and SQL. Morgan Kaufmann.
De Morais, W. O., Lundstrom, J., Wickstrom, N. (2014). Active in-database processing to support ambient assisted living systems. Sensors, vol. 14, no. 8, p. 14785.
He, W., Xu, L. D. (2014). Integration of Distributed Enterprise Applications: A Survey. IEEE Trans. Ind. Informat., vol. 10, no. 1, pp. 35-42.
Inductive Automation white paper (2012). SQL: The Next Big Thing in SCADA, How SQL is Redefining SCADA. Available: www.inductiveautomation.com
Jeon, S. M., Gitae, K. (2016). A survey of simulation modeling techniques in production planning and control (PPC). Production Planning & Control, pp. 1-18.
Jia, Z., Iannucci, B., Hennessy, M., Xiao, S., Kumar, S., Pfeffer, D., Aljedia, B., Ren, Y., Griss, M., Rosenberg, S., Cao, J., Rowe, A. (2013). Sensor Data as a Service - A Federated Platform for Mobile Data-centric Service Development and Sharing. Services Computing (SCC), 2013 IEEE Int. Conf. on, pp. 446-453.
Latronico, E., Lee, E. A., Lohstroh, M., Shaver, C., Wasicek, A., Weber, M. (2015). A Vision of Swarmlets. Internet Computing, IEEE, vol. 19, no. 2, pp. 20-28.
Lee, E. A. (2015). The Past, Present and Future of Cyber-Physical Systems: A Focus on Models. Sensors, vol. 15, no. 3, pp. 4837-4869.
Leitao, P., Marik, V., Vrba, P. (2013). Past, Present, and Future of Industrial Agent Applications. Industrial Informatics, IEEE Trans., vol. 9, no. 4, pp. 2360-2372.
Leitao, P., Barbosa, J., Papadopoulou, M.-E. C., Venieris, I. S. (2015). Standardization in cyber-physical systems: The ARUM case. Industrial Tech. (ICIT), 2015 IEEE Int. Conf., pp. 2988-2993, Mar 2015.
Locke, D. (2010). MQ Telemetry Transport v3.1 protocol specification. IBM developerWorks Technical Library.
Mahnke, W., Leitner, S., Damm, M. (2009). OPC Unified Architecture.
Nickel, M., Murphy, K., Tresp, V., Gabrilovich, E. (2016). A Review of Relational Machine Learning for Knowledge Graphs. Proc. of the IEEE, vol. 104, no. 1, pp. 11-33.
Rajhans, A., Bhave, A., Ruchkin, I., Krogh, B. H., Garlan, D., Platzer, A., Schmerl, B. (2014). Supporting Heterogeneity in Cyber-Physical Systems Architectures. Automatic Control, IEEE Trans. on, vol. 59, no. 12, pp. 3178-3193.
Schuldt, H. (2008). Agent and Databases: A Symbiosis? Cooperative Information Agents XII, 12th International Workshop, CIA 2008, Prague, September 10-12.
Stiel, F., Michel, T., Teuteberg, F. (2016). Enhancing manufacturing and transportation decision support systems with LCA add-ins. J. of Cleaner Production, vol. 110, pp. 85-89.
Yang, S., Khot, T., Kersting, K., Natarajan, S. (2016). Learning Continuous-Time Bayesian Networks in Relational Domains: A Non-Parametric Approach. Proceedings of the 30th AAAI Conference on Artificial Intelligence.
