Posted: July 1st, 2015

AUTONOMOUS RESOURCE CAPACITY PLANNING FOR SHORT-SHELF LIFE FOOD PRODUCTS SUPPLY CHAIN

Table of Contents

GLOSSARY OF ACRONYMS

Chapter 1
1     INTRODUCTION
1.1     Research gap
1.2     Scope
1.3     Aim and Objectives
1.4     Research questions
1.5     Contribution to knowledge

Chapter 2
2     Overview of existing methods on data integration and transformation
2.1     Introduction
2.2     Data exchange concepts
2.2.1     Point to point connection
2.4     The concept of ontology
2.5     Problem domain of data sharing and system integration
2.5.1     Data distribution
2.5.2     Incompatible terminology
2.6     Physical system Integration level
2.6.1     Data translation
2.6.2     Data transfer
2.6.3     Scenario Navigator-based data transfer
2.7     Discussion/Conclusions

Chapter 3
3     Overview of PackML
3.1     Introduction
3.2     Overview of the PackML framework
3.2.1     PackML modes
3.2.2     Assignment of modes and states
3.3     Standard format for data exchange
3.4     Existing methods and their limitations
3.4.1     Weihenstephan Standards
3.4.2     PackML
3.4.3     XML
3.4.4     CAEX
3.5     Research problem
3.5.1     Research gap
3.5.2     Advantage
3.5.3     Integration and transformation

Chapter 4
4     Research methodology
4.1     Introduction
4.2     Research Methodology
4.2.1     Research methodology vs Research methods
4.2.2     Quantitative methodology
4.2.3     Qualitative methodology
4.2.4     Mixed research methods
4.3     Research Method Steps
4.3.1     Step one: Data collection
4.3.2     Step two: Semantic definition
4.3.3     Step three: Identification of the components
4.3.4     Step four: Definition of design data
4.3.5     Step five: Definition of the relations
4.3.6     Step six: Selection of the elements
4.3.7     Step seven: New approach to data modelling strategy
4.4     Proposed strategy and its advantages

Chapter 5
5     Experimental results

Chapter 6
6     Conclusion

References

GLOSSARY OF ACRONYMS

Acronym        Definition

CAEX           Computer Aided Engineering Exchange
PackML         Packaging Machine Language
IEC            International Electrotechnical Commission
XML            Extensible Markup Language
EM             Equipment module
CM             Control module
P&ID           Piping and instrumentation diagram
PLC            Programmable logic controller
COLLADA        Collaborative Design Activity (interchange file format for interactive 3D applications)

Chapter 1

1     INTRODUCTION

Production and engineering systems employ different sequences of phases focused on delivering a specified product. Production engineering, like other engineering domains, is characterized by the contribution of different professional groups, each executing individual engineering phases in order to realize a given product. These individual engineering steps are executed using different modern engineering tools. A given product may therefore be associated with many changes along the chain of production activities, and the more changes a product undergoes, the more often information is passed along the tool chain between different modules. This creates the need to define a communication protocol that can be employed among the different phases involved in the development and production of a given product. At some point the need to modify the whole plant also arises, for example as a result of adaptability and interoperability requirements (Aarts, 2005).

According to the International Academy for Production Engineering (CIRP, 2004), the ability to make permanent changes within an industry can provide competitive advantage, not only in the market but also in terms of the production efficiencies achieved. This points to the need to undertake permanent changes within production lines in order to realize full competitive advantage as part of the manufacturing strategy.

The concept of adaptability is considered to lead the German strategic research agenda. It drives the core considerations relied upon when deriving a marketing strategy and manufacturing competitive advantage. Other issues that are equally incorporated for the purpose of attaining competitive advantage in the market are the digital factory, energy efficiency, high performance, miniaturization, and product design and configuration (Beamer and Gillick, 2010).

This clearly defines the need for an integrated data exchange system for engineering production systems. Integrated data exchange would facilitate the sharing of production information among different discrete event simulation tools. It should, however, be noted that the few standard data exchange formats established so far are inefficient. Additionally, the existing data exchange interfaces are tool-specific, lossy and proprietary. This is a logical consequence of the historically divided evolution of the production engineering tool chain (Boettner, 2014).

The exchangeability of engineering production data is increasingly regarded as a core value. This calls for a standardized data exchange format that can span all phases of the engineering life cycle (Boettner, 2014). Several attempts have been made in the past to establish standardized data exchange formats. The most recently developed format is CAEX, which is considered the current state of the art and has been endorsed by the IEC (Brekelov, Borisov and Barygin, 2015).

Creating flexible and automated production facilities within a complex production process is critical to the efficiency and reliability of the whole system. The current CAEX standard is known to work across different software tools. The main objective of this research is to evaluate the data exchange formats that are available for engineering production systems. The efficiency and adaptability of the interfaces will be evaluated, as will the integration capability of the standard data exchange formats. A model will then be derived as a novel modelling strategy that allows seamless integration and transformation of data from different software tools.

1.1      Research gap

The existing software tools are considered to be tool-specific. Others, such as CAEX, are considered proprietor-specific, with structurally incompatible data sources. There are few standardized models that can integrate all the software tools and facilitate the exchange of engineering data. It is therefore clear that there is a need to develop a novel modelling strategy that will interface different software tools, and a model for exchanging data that allows seamless integration as well as transformation of data from different types of software tools. The research will not only review the existing data exchange platforms but will also develop a model that can be used as an interface among the different software tools.

1.2       Scope

The research will be limited to approaches for systems integration from the data modelling point of view. It will evaluate the information transfer approaches among the different standardized data models. The standardized data exchange will then be developed on the basis of the software tools incorporated in the research methodology section. The development of the novel modelling strategy will incorporate standard software tools that are used for sharing information within the manufacturing industry. This will form part of the Masters submission.

The project is intended to be continued for the PhD in order to realize a seamless integration and transformation of the already established software tools.

1.3     Aim and Objectives

 

The aim of this research is to derive a modelling strategy that allows seamless integration and transformation of data between different engineering software tools. The objectives are to:

  1. Evaluate the standardized data exchange formats that are available for engineering production systems
  2. Evaluate the efficiency and adaptability of the data exchange interfaces
  3. Examine the integration capability of the standard data exchanges
  4. Derive a model that will allow seamless integration and transformation of data from different software tools

1.4       Research questions

What are the existing standardized data exchange formats?

What criteria should be used to qualify the most standardized data exchange formats?

What is the problem with the current data exchange models?

Why is the proposed data exchange model preferable to the existing formats?

What other benefits are associated with the proposed data exchange model?

What is the development methodology of the proposed data exchange model?

What are the other applications of the proposed data exchange model?

Which aspects of PackML data integration and transformation can be automated? How are they distinguished from the manual aspects, and how can the manual aspects be minimised?

1.5     Contribution to knowledge

This research evaluates the existing data exchange standards for software tools. An elaborate set of criteria is used to evaluate the suitability of the most appropriate technique for solving the research problem. The research will highlight the concept of data exchange in relation to the seamless integration and transformation of data from the software tools.

Chapter 2

2     Overview of existing methods on data integration and transformation

2.1   Introduction

This section reviews the existing data exchange approaches that have been implemented in the past. Different data exchange techniques are discussed, together with other concepts related to the data exchange paradigm.

As noted in Chapter 1, production engineering is characterized by the contribution of different professional groups, each executing individual engineering phases with different modern engineering tools in order to realize a given product. A product may therefore be associated with many changes along the chain of production activities, and the more changes it undergoes, the more often information is passed along the tool chain between different modules. This creates the necessity of defining a communication format that can be employed among the different phases involved in the development and production of a given product. At some point the need to modify the whole plant arises, for example as a result of adaptability and interoperability requirements (Cevallos, 2014).

2.2     Data exchange concepts

The object-oriented paradigm is the most widely preferred method for data modelling. Object-oriented technology encapsulates the development of pattern solutions in the form of classes, and each class can be used to generate an arbitrary number of individual instances. The advantage of this approach is the reuse of predefined and proven pattern solutions (Daniel L. Smith, 1995).
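
The reuse of class-based pattern solutions can be illustrated with a short Python sketch. The class name and attributes below are hypothetical examples chosen for illustration and are not taken from any of the exchange standards discussed later.

    # Minimal sketch of the object-oriented pattern-reuse idea described above.
    # "ConveyorPattern" and its attributes are hypothetical examples.

    class ConveyorPattern:
        """A predefined, proven pattern solution (a class)."""

        def __init__(self, name, length_m, speed_mps):
            self.name = name            # instance-specific identifier
            self.length_m = length_m    # instance-specific design data
            self.speed_mps = speed_mps

        def describe(self):
            return f"{self.name}: {self.length_m} m at {self.speed_mps} m/s"

    # The same pattern solution is instantiated an arbitrary number of times.
    line = [ConveyorPattern(f"Conveyor_{i}", 2.5 * i, 0.8) for i in range(1, 4)]
    for module in line:
        print(module.describe())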

2.2.1     Point to point connection

Data exchange of this kind requires bidirectional interconnection of tool pairs through proprietary point-to-point connections, as illustrated in the figure below. Each tool pair comprises an interface (adapter) that establishes the required connection (Data exchange, 1990).

Figure 2.1 Data exchange by means of individual point-to-point connections (Data exchange, 1990)

Every additional tool raises the need for more adapters in the model: each new tool requires an adapter to every existing tool, so the number of adapters grows quadratically and the approach can only support a limited number of tools. Even XML-based platforms typically assign such an adapter to every software tool that requires one. The number of adapters can, however, be reduced if the data format is syntactically standardized. This implies the need for a standardized syntax definition for exchanging the data, with the semantics provided by the adapters (Data/telex exchange, 1983). In this way a standardized interface is realized in which each device in the engineering production line can understand how to utilize the information of the other devices in the line. The concept is illustrated in the following figure.

Figure 2.2 Data exchange by means of standardized interfaces (Data exchange, 1990)
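
The scaling argument behind the two figures can be made concrete with a short sketch, assuming hypothetical tool counts: with point-to-point coupling every tool pair needs its own adapter, whereas a standardized interface needs only one adapter per tool.

    # Sketch comparing the integration effort of point-to-point coupling with
    # that of a standardized (neutral-format) interface.

    def point_to_point_adapters(n_tools: int) -> int:
        # one bidirectional adapter per tool pair
        return n_tools * (n_tools - 1) // 2

    def standardized_adapters(n_tools: int) -> int:
        # one adapter per tool, translating to and from the neutral format
        return n_tools

    for n in (3, 5, 10):
        print(f"{n} tools: point-to-point = {point_to_point_adapters(n)}, "
              f"standardized = {standardized_adapters(n)}")
    # 10 tools: point-to-point = 45, standardized = 10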

2.4 The concept of ontology

Another critical concept in data exchange is the use of ontologies.

Figure 2.3 Data Sharing by using Ontology (Miriam & Rainer, 2008)

The use of ontologies establishes a common semantics that can be used for data modelling. The concept is not yet fully developed, largely because companies prefer individual standards with their own wording, their own understanding of granularity and their own data structures (Miriam & Rainer, 2008).

An ontology defines a unified semantic system with a common wording. Additionally, the concept embraces standardized semantic definitions for data storage, as well as for the exchange of each data item. Within the ontology concept, the NE100 activity pursues standardization of device properties.

One application that has already embraced the concept of ontology is the production monitoring and control system ProVis.Agent (Design and Implementation of a new XML-Signcryption scheme to protect the XML document, 2003). The following figure illustrates the concept of data exchange by means of an ontology.

Figure 2.4 Data Exchange using ontology

As can be seen in the diagram above, ontology-based data integration is accomplished through a series of identified data sources, which define the sources of the data to be integrated. Business rules govern how the data processing and the transformations are to be evaluated. The ontology also requires domain modelling and semantic specifications, which define the information that must be represented in it. Templates provide a collection of standardized data sources that are used for processing and are in turn referred back to the ontology library, as depicted in the figure.
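
A minimal sketch of the term-alignment idea behind ontology-based data sharing is given below. The ontology concepts and the vendor-specific vocabularies are hypothetical and serve only to show how tool-specific wordings can be resolved to a common concept before integration.

    # Sketch of ontology-based term alignment. The ontology concepts and the
    # vendor vocabularies below are hypothetical examples.

    ONTOLOGY = {"Conveyor", "InductiveSensor", "Gripper"}   # shared concepts

    VENDOR_VOCABULARY = {
        "tool_a": {"Foerderband": "Conveyor", "Naeherungsschalter": "InductiveSensor"},
        "tool_b": {"belt_unit": "Conveyor", "prox_switch": "InductiveSensor"},
    }

    def to_common_concept(tool: str, local_term: str) -> str:
        """Map a tool-specific term onto the shared ontology concept."""
        concept = VENDOR_VOCABULARY[tool].get(local_term)
        if concept is None or concept not in ONTOLOGY:
            raise KeyError(f"{local_term!r} from {tool} is not covered by the ontology")
        return concept

    print(to_common_concept("tool_a", "Foerderband"))   # Conveyor
    print(to_common_concept("tool_b", "prox_switch"))   # InductiveSensor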

2.5       Problem domain of data sharing and system integration

The problems that occur in data sharing and system integration concern the challenge of making processes interlink seamlessly. They result from interoperability problems that include data translation, system integration and modelling issues (Chen, et al., 2002).

The current tools lack a neutral interface that allows easy and quick integration. The literature highlights the solutions currently available, but there is no proper solution that aligns the incompatible data.

Current integration and transformation work has not yet shown an in-depth solution for streamlining data, mainly because of major challenges such as data distribution and incompatible terminology (Chen, et al., 2002).

2.5.1     Data distribution

In manufacturing systems and beyond, data and information extraction uses simulation technologies such as Computer Aided Engineering (CAE) and Product Data Management (PDM) model software to predict the manufacturing process. These tools do not use the same data formats as input and therefore offer few interfaces between the simulation model software packages (Chen, et al., 2002).

Drath (2010) identified the need for a seamless, coherent approach with the capability to import the input of one model into another. Researchers have so far only succeeded in developing product information models capable of importing standardized data that have been processed appropriately (Peters, et al., 2001).

A standard for data exchange between different simulation tools is not really established, because most of the available data exchange mechanisms are tool-specific and lack a supporting standard (Miriam & Rainer, 2008).

Likewise, the approaches in (Miriam & Rainer, 2008) and (Chen, et al., 2002) show that simulation technology is part of the future of manufacturing, but the greatest challenge is how the simulation models can work when they are combined and have to share data and information seamlessly.

2.5.2   Incompatible terminology

Modelling elements and their associated attributes, in terms of the model information for manufacturing products, are the key to any system integration and exchange. Most of the available models use different names or terms to describe the products under development with a specific tool, which leads to customized models (Skoogh & Johansson, 2007).

These differences in terminology have created varied interpretations and misunderstandings in knowledge sharing and exchange within the engineering discipline (Waltersdorfer & Zoitl, 2010).

Much has nevertheless been achieved in this regard. (Lee & Yan, 2005) use a common database to describe data exchange for machine shop simulation, using an XML file and a generic database to extract the raw data. (Skoogh & Johansson, 2007) present an Excel interface with a common terminology to support information exchange and dynamic simulation models.

However, none of the current methods highlighted above can satisfy all the simulation tools and models involved in manufacturing engineering in exchanging data, information and inputs (McLean & Leong., 2001). Seamless sharing and common data exchange between different packages is therefore one of the novel aspects of this research.

2.6       Physical system Integration level

Physical integration is more related to implementation issues, in particular to the way the data of each modelling and simulation software package is translated.

2.6.1     Data translation

In the context of the current research, automated data translation is important to eliminate or reduce the time it takes a human to interpret different data. To achieve data translation among the customized simulation tools, the various simulation models also need to share common relationships and concepts (Miriam & Rainer, 2008).

As highlighted in (Kahng & McLeod, 1998), ontologies are an early research area that allows semi-automated transfer of data between different simulation models (Viviana, et al., 2008). According to (Dejing, et al., 2004), an ontology is a formal specification of the concepts of a vocabulary (source) and of the axioms relating them, which together describe the semantics of the data.

Notwithstanding the many ontologies that have been developed and researched, and the many achievements realized, there is no clear, robust solution for data integration and translation between different data modelling software packages (Euzenat & Shvaiko, 2008).

Therefore, a more novel approach is needed: a seamless, open format that reduces the many converters currently required and automates the process. This would enable each simulation package to support the common data format; to achieve it, a common philosophy among the software tools is needed.

2.6.2     Data transfer

Data transfer between different simulation software tools can be classified according to a common set of principles: a central database, file exchange or message exchange, depending on the individual application. A common principle for data transfer among different tools is demonstrated in figure 2.5 below.

Figure 2.5: File-based data conversion

File-based data conversion is one method of transferring data from one simulation model to another in automation modelling. (Miriam & Rainer, 2008) used the CAEX (Computer Aided Engineering Exchange) format to transform hierarchical information between different manufacturing simulation tools. An XML-based method was also used in (Barnes & Arnaud, 2006) to transform data for exchange between kinematics and 3D CAD models.

Both (Miriam & Rainer, 2008) and (Barnes & Arnaud, 2006) used XML-based files for their exchanges. These methods possess some benefits: the files are easily processed, because most simulation packages have built-in support for XML, and they are simple to query.

However, (Woolf & Hohpe, 2011) highlight a limitation of (Miriam & Rainer, 2008) and (Barnes & Arnaud, 2006): the integration approach requires a protocol to read the file that is to be transformed into another model, so the translator has to create a separate location for each file. This becomes problematic because extra referencing is required, especially when dealing with multiple files. A more flexible mechanism is required that allows seamless translation of any condition from either System A or System B.
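
A minimal sketch of file-based data conversion is shown below: tool A exports its records into a neutral XML exchange file and tool B imports that file. The element and attribute names are hypothetical and do not follow the CAEX or COLLADA schemas.

    # Sketch of file-based data conversion through a neutral XML exchange file.
    # Element and attribute names are illustrative, not taken from CAEX/COLLADA.
    import xml.etree.ElementTree as ET

    def export_from_tool_a(records, path):
        root = ET.Element("ExchangeFile", version="1.0")
        for rec in records:
            ET.SubElement(root, "Module", name=rec["name"], type=rec["type"])
        ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

    def import_into_tool_b(path):
        tree = ET.parse(path)
        return [{"name": m.get("name"), "type": m.get("type")}
                for m in tree.getroot().findall("Module")]

    export_from_tool_a([{"name": "Filler_01", "type": "Filler"},
                        {"name": "Capper_01", "type": "Capper"}], "exchange.xml")
    print(import_into_tool_b("exchange.xml"))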

2.6.3     Scenario Navigator-based data transfer

Scenario Navigator-based data transfer is another method, in which a centralized navigating database system addresses the limitation of needing a protocol to read files from separate locations. Building on scenario strategies such as the basics of scenario functions (Arnim & Scholz, 2006), the scenario manager enables the models to have standard access to data through one interface, without the need for an individual interface for each software model (Kagioglou, et al., 2007).

Figure 2.6 Scenario Navigator-based data transfer

To address the limitations of the Scenario Navigator-based and file-based data conversion approaches mentioned above, (Waltersdorfer, et al., 2010) developed a database process that gives data from different models a centralized storage location and detects conflicts among the available data.

However, (Woolf & Hohpe, 2011) highlight a limitation of Scenario Navigator-based data transfer: a large database creates an ever larger structure when data is exchanged among too many models, making it impossible to hide information and undermining the ability of the integrated systems and simulation software to adopt changes.

2.7     Discussion/Conclusions

In general, the problems that limit data exchange and the integration of processes are modelling issues such as incompatibility and interdisciplinarity, namely: (i) data distribution, (ii) description level (the problem of a supporting standard), (iii) independent variables and attributes, (iv) format representation (basic incompatibility), for example ACCESS versus XML versus tabulated data, and (v) expressiveness, for example separate entity composition versus single entities.

The future trends do not look promising for improving data distribution and integration among different software packages, because engineering specialization keeps increasing, the number of disciplines involved keeps growing (for example through the growth of IT technology), and the number of conventional tools will continue to increase.

The makers of these software tools cannot be influenced or compelled to change their habits in order to make their products more user friendly and better aligned with other tools; research activities should therefore focus on developing strategies and solutions for improving data sharing and integration.

Although approaches such as Scenario Navigator-based data transfer, file-based data conversion, CAEX data exchange and ontology-based data sharing have had a major impact in this field, they did not consider the issue of incompatible nomenclatures, which arises because the software tools involved possess some similarities yet embody different understandings of terminology.

The literature has shown that the existing software tools are tool-specific and that others are proprietor-specific. There is no standardized model that can integrate all the software tools and facilitate the exchange of engineering data. It is therefore clear that there is a need to develop a novel modelling strategy that will interface different software tools.

 

 

Chapter 3

3     Overview of PackML

3.1       Introduction

Production engineering is characterized by the contribution of different professional groups that execute individual engineering phases in order to realize a given product. This creates the necessity of establishing an integrated data exchange system for engineering production systems, which would facilitate the sharing of production information among different engineering tools. This research proposes the use of PackML as an integration tool, and an overview of the PackML technology is therefore provided in this section. It should, however, be noted that no widely adopted standard data exchange format has been established so far. Instead, the existing data exchange interfaces are tool-specific, lossy and proprietary, a logical consequence of the historically divided evolution of the production engineering tool chain (Cevallos, 2014).

The exchangeability of engineering production data is increasingly regarded as a core value, which calls for a standardized data exchange format that can span all phases of the engineering life cycle. Several attempts have been made in the past to establish standardized data exchange formats. The most recently developed format is CAEX, which is considered the current state of the art and has been endorsed by the IEC (Communication Shift, 2011).

CAEX (Computer Aided Engineering Exchange) is a data format developed to enhance transparency in data exchange processes. It is an object-oriented data format based on XML, used to depict logical and real plant objects in the form of data objects. CAEX allows the predefinition of pattern solutions in the form of classes that can be instantiated several times (Coyle, 2002).

3.2       Overview of the PackML framework

PackML stands for Packaging Machine Language. The language is generally used in the batch control industry. The model provides guidelines on techniques for the design and packaging of software for automated machinery, and it brings a common look and feel to all machines associated with a given production line. PackML was initially designed for the packaging industry, but it is now evident that it can be applied across different production lines to meet data exchange requirements (Coyle, 2002).

The PackML state model is illustrated in figure 3.1 below. As can be seen, the model enforces that states are activated only in a sequential manner, with the exception of the Abort and Stop functions. The model is therefore effective for monitoring the current state of the production line. It can also be used to drive the next state and hence serve as a high-level master for machine sequencing. Figure 3.1 illustrates the architecture of the PackML model.

Figure 3.1 Architecture of the PackML model

3.2.1     PackML modes

The first decision in data exchange using PackML is the determination of the modes that will be included and their names.

PackML allows an unlimited number of machine modes, and the model contains three common modes of operation: Automatic, Maintenance and Manual. Automatic mode is used for normal production; Maintenance mode is used to run all sections of the machine in a dry cycle for setup and debugging; and Manual mode relates to the manual operation of individual mechanisms on the machine.
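
The sequential-state behaviour and the three common modes can be sketched as a small state machine. The transition table below is a simplified illustration of the PackML idea and deliberately omits most of the states defined by the standard.

    # Simplified sketch of a PackML-style unit: states advance only along the
    # allowed transitions, while Stop and Abort are reachable from (almost)
    # any state. This is an illustration, not the full PackML state set.

    TRANSITIONS = {
        "Stopped":    {"reset": "Resetting"},
        "Resetting":  {"sc": "Idle"},          # "sc" = state complete
        "Idle":       {"start": "Starting"},
        "Starting":   {"sc": "Execute"},
        "Execute":    {"sc": "Completing"},
        "Completing": {"sc": "Complete"},
        "Complete":   {"reset": "Resetting"},
    }

    class PackMLUnit:
        def __init__(self, mode="Automatic"):
            self.mode = mode          # e.g. Automatic, Maintenance or Manual
            self.state = "Stopped"

        def command(self, cmd):
            if cmd == "abort":                      # allowed from any state
                self.state = "Aborted"
            elif cmd == "stop" and self.state not in ("Aborted", "Stopped"):
                self.state = "Stopped"
            elif cmd in TRANSITIONS.get(self.state, {}):
                self.state = TRANSITIONS[self.state][cmd]
            else:
                raise ValueError(f"'{cmd}' is not allowed in state {self.state}")
            return self.state

    unit = PackMLUnit()
    for cmd in ("reset", "sc", "start", "sc"):
        print(cmd, "->", unit.command(cmd))   # ends in the Execute state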

3.2.2     Assignment of modes and states

The user is allowed to define the state names as well as the transitions, and hence which machine functions happen in each state. The user's decisions ought to be clearly identified in a document so that all project engineers achieve a uniform understanding.

One aspect closely associated with PackML is modularity: the PackML state model can be implemented in a modular way. Modularity facilitates the separation of the code into logical modules that match the physical machine construction. The hierarchy of code modules consists of six levels, ranging from the global company level down to the data related to the machine, equipment and control signals. The hierarchy is commonly driven by commands that are generated at the Unit Machine level and then flow down through the equipment modules to the control modules, as shown below.

Figure 3.2   Illustration of the data flow in a PackML application

It should be noted that the model deals with the code of one single machine at a time. The machine is divided into equipment modules that contain one or more control modules. At this point an elaborate decision is required to define the EMs and CMs for logical and efficient operation.
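
The command flow from the Unit Machine level down through the equipment modules to the control modules can be sketched as a simple composite structure. The module names used here are hypothetical.

    # Sketch of the modular hierarchy: a command issued at the Unit Machine
    # level propagates down to the equipment modules (EMs) and their control
    # modules (CMs). Module names are hypothetical examples.

    class ControlModule:
        def __init__(self, name):
            self.name = name

        def apply(self, command):
            print(f"  CM {self.name}: executing '{command}'")

    class EquipmentModule:
        def __init__(self, name, control_modules):
            self.name = name
            self.control_modules = control_modules

        def apply(self, command):
            print(f" EM {self.name}: forwarding '{command}'")
            for cm in self.control_modules:
                cm.apply(command)

    class UnitMachine:
        def __init__(self, equipment_modules):
            self.equipment_modules = equipment_modules

        def command(self, command):
            print(f"Unit Machine: issuing '{command}'")
            for em in self.equipment_modules:
                em.apply(command)

    infeed = EquipmentModule("Infeed", [ControlModule("Belt"), ControlModule("Gate")])
    filler = EquipmentModule("Filler", [ControlModule("Pump")])
    UnitMachine([infeed, filler]).command("start")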

3.3     Standard format for data exchange

There are many data exchange standards used for different applications. The available standards define the transfer of general-purpose information, such as text-based file formats and XML-based documents. Engineering applications, however, involve complex production phases that require a special type of standard data exchange. It is normally considered difficult to share the production information contained in a piping and instrumentation diagram (P&ID), as well as the information carried by a PLC, and such a platform has not been fully exploited. Most of the data formats are specific to brands, proprietors and even software tools. This implies that a PLC will only understand the information contained in STEP 7 or whichever other programming software the PLC understands. The program information contained in a P&ID cannot be understood by the PLC, yet both can be involved in the same engineering production (Drath, 2010).

Such technicalities can be eliminated if a standard format is developed that the discrete event simulation tools can understand. The interface should therefore incorporate all the critical software tools in its design in order to be acceptable. A lot of time and effort is lost in manual or even semi-automatic conversion between different software tools, and the lack of a standard data exchange platform also hampers the development of innovative engineering methods. The most appropriate standard data exchange format will have to be developed on the basis of separating the data from its original tools: the exchange format should automatically evaluate the data contained in the other formats and use the same data for the next engineering step. One such platform that has been established is CAEX. Other formats that have been developed include the Weihenstephan Standards and PackML (East, 1995).

3.4     Existing methods and their limitations

3.4.1     Weihenstephan Standards

These standards have been developed and used to connect production data acquisition systems with bottling plants. They provide an efficient data interface that is capable of physically connecting the control devices, and they can also be connected physically to the machine controllers. The standards also specify how data is represented and exchanged among different processes. The design of this standard gives each device and machine a corresponding device description file, which specifies the type of data that the machine or device can provide (Euzenat, Napoli and Baget, 2003).

3.4.2   PackML

PackML is another standard that has been developed for machines and has been implemented successfully in the packaging industry. The packaging industry, just like other engineering domains, involves different product development phases that require data integration. The use of PackML in the packaging industry has demonstrated a high level of efficiency, which makes it appropriate to adopt in this research. The standard defines a common machine language used to communicate among different packaging machines and was developed by the OMAC Packaging Workgroup. Additionally, the standard provides a universal PackML State Model, which describes the states of the packaging machines in the automatic operation mode and greatly simplifies the corresponding workflow (Euzenat, Napoli and Baget, 2003).

3.4.3   XML

XML is one of the few formats whose standards have been accepted worldwide, and many data models and tools currently make use of it. XML also provides a description language that can be embraced by automation technology. It can store information sets that are not limited to information about field devices, and it therefore offers adequate granularity for most of the software tools (Exchange of Diagrams and Data between Radiotherapy Centres, 1945).
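
A short example of why XML is attractive as a carrier: a device description can be parsed with standard tooling in a few lines. The element and attribute names below are made up for illustration.

    # Minimal example of parsing an XML device description with the Python
    # standard library. The element and attribute names are illustrative only.
    import xml.etree.ElementTree as ET

    DEVICE_XML = """
    <FieldDevice name="TempSensor_01" vendor="ExampleVendor">
      <Parameter name="Range" unit="degC">0..150</Parameter>
      <Parameter name="Accuracy" unit="degC">0.5</Parameter>
    </FieldDevice>
    """

    device = ET.fromstring(DEVICE_XML)
    print(device.get("name"), "from", device.get("vendor"))
    for p in device.findall("Parameter"):
        print(f"  {p.get('name')}: {p.text} {p.get('unit')}")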

3.4.4     CAEX

CAEX stands for Computer Aided Engineering Exchange. It is an abstract, object-oriented data format based on XML that was developed to enhance transparency in data exchange processes. The format is used to depict logical and real plant objects in the form of data objects, and it allows the predefinition of pattern solutions in the form of classes that can be instantiated several times (Fishburn, 2013).

The CAEX data format embraces an open data exchange paradigm between standard regulations. The format was developed in a cooperation between the Chair of Process Control Engineering at RWTH Aachen and the ABB Research Center, and it is specified in IEC CDV 62424 as well as DIN V 44366. Initially, the data format was developed for the storage of process control information: the first part defines the graphical representation of the control information in relation to the P&ID, and the second part defines the format of the data and the flow of information among the engineering tools (Fishburn, 2013).
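
The class-library idea behind CAEX can be sketched by generating a small XML document that separates reusable class definitions from the instances that reference them. The element names imitate the CAEX style (SystemUnitClassLib, InstanceHierarchy, InternalElement), but the output is a simplified illustration and not a schema-valid CAEX file.

    # Sketch of the CAEX idea: reusable classes are kept in a library and
    # instantiated several times in an instance hierarchy. Element names
    # imitate the CAEX style; the output is not schema-valid CAEX.
    import xml.etree.ElementTree as ET

    caex = ET.Element("CAEXFile", FileName="example_plant.xml")

    library = ET.SubElement(caex, "SystemUnitClassLib", Name="ExampleLib")
    ET.SubElement(library, "SystemUnitClass", Name="ConveyorClass")

    hierarchy = ET.SubElement(caex, "InstanceHierarchy", Name="PlantA")
    for i in (1, 2):
        ET.SubElement(
            hierarchy, "InternalElement",
            Name=f"Conveyor_{i}",
            RefBaseSystemUnitPath="ExampleLib/ConveyorClass",  # link to the class
        )

    print(ET.tostring(caex, encoding="unicode"))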

3.5     Research problem

Efficient and effective use of simulation model technology has been achieved in different areas such as computer science and engineering, but the interrelationships that allow different processes to collaborate with each other are less developed. This has created challenges in simulation data exchange and transformation.

The major concern is the lack of research effort towards achieving seamless integration, an effective modelling strategy for data exchange, and efficient tools for discrete event simulation (Declan, et al., 2014).

There is a problem of interfacing different discrete event simulation tools: no standard interface exists that can facilitate the exchange of data among them. There is therefore a need to develop a novel model that will strategically exchange data among different discrete event simulation tools.

3.5.1   Research gap

The existing discrete event simulation tools are considered to be tool-specific, and others are proprietor-specific. There is no standardized model that can integrate all the discrete event simulation tools and facilitate the exchange of engineering data. It is therefore clear that there is a need to develop a novel modelling strategy that will interface different discrete event simulation tools, together with a model for exchanging data that allows seamless integration as well as transformation of data from the different types of discrete event simulation tools. The research will not only review the existing data exchange platforms but will also develop a model that can be used as an interface among the different discrete event simulation tools.

3.5.2 Advantage

Closing this research gap will go a long way towards introducing a new modelling strategy that facilitates the exchange of data among different software tools. This will in turn improve the efficiency of communication among different production lines in the manufacturing setup.

3.5.3   Integration and transformation

The model provides easy communication of status both upstream and downstream among different pieces of equipment. This is possible even when the equipment being integrated comes from different vendors and uses different machine controllers. The integration process makes use of a standard communication protocol to relay the data among the different equipment in the production lines or across the industrial network. Interfacing is normally accomplished through digital I/O triggers that are instantiated by the current PackML Mode and State.

Chapter 4

4     Research methodology

4.1     Introduction

This section outlines the approach that will be used to develop the standard data exchange format and reviews the approach used to generate an automated model for integrating software tools. The development will incorporate parallel activities that will be implemented for the control code. It should, however, be noted that simulation will be employed to test the essential parts of the control engineering information; the tests will therefore not be carried out with real data from the plant. Simulation will provide an adequate platform for evaluating the developed control code (Jacobs, Otter and Reijer, n.d.).

The outcomes of this research will form the basis for the PhD. At this stage the work is limited to approaches for systems integration from the data modelling point of view. The development of the novel modelling strategy will incorporate standard software tools that are used for sharing information within the manufacturing industry. This will form part of the MPhil, and the project is intended to be continued in the PhD in order to realize a seamless integration and transformation of the already established software tools.

 

4.2     Research Methodology

Kothari (2008) defines research as "a systematised effort in gaining new knowledge", comprising the defining and redefining of problems, suggesting solutions, collecting, organising and evaluating data, reaching conclusions and making suggestions.

4.2.1     Research methodology vs Research methods

Research methods are the techniques adopted for conducting research, while research methodology refers to the way of systematically solving a research problem; it can be understood as the science of studying how research is conducted scientifically (Zhang & Steven, 2006).

According to (Walle, 2005), research methodologies are of three types: qualitative, quantitative and mixed strategies.

4.2.2     Quantitative methodology

Quantitative research involves the aspects that can be expressed in terms of quantity, that is, the measurement of amounts (Wen & Chi-Ming, 1992). In quantitative research, the various available econometric and statistical techniques are applied for analysis, and the evidence is refined iteratively using mathematical models, hypotheses and theory (Liu & Yan, 2012).

Quantitative methods suit such analysis because they produce hard numbers and are most often used for large surveys; they ask questions such as when, how, how often and where. This research approach is an objective, formal and systematic process in which the findings are numerical rather than subjective.

Advantages of the quantitative approach include (Todd, 1989):

  1. Quantitative data are statistics-driven and can provide a lot of information, with many variants to be investigated.
  2. It is easier to compile the data into a chart or graph because of the numbers that are made available.
  3. The results are replicable.
  4. The research can be conducted on a large scale and yields much more information in terms of values and statistics; future work can also be identified.

Steps in quantitative methodology (Zhang & Steven, 2006):

  1. Identify the variables and problems to be investigated.
  2. State the research questions or hypotheses, select the research approach and decide on the measures to be used to analyse the variables.
  3. Select a sample and a data collection method, and carry out experiments on the variables.
  4. Be aware of ethical and cultural issues.
  5. Carry out the experiments and visualize the results in the form of a table.

A limitation of quantitative research is that some variables are not fully measurable, so assumptions cannot be made outside the scope of the samples (Morse, 1991). Another drawback is that the numbers change often, which requires the experiments to be repeated frequently to fill the gaps when the numbers change.

In quantitative research the researcher does not know what to expect, which is why the questions why, where and how are asked and the statistics are compiled in order to work out what the data means.

4.2.3     Qualitative methodology

According to (Beach, et al., 2001), qualitative methodology allows the researcher to have an idea of what to expect and how it will happen. Qualitative research gathers data and transforms it into a bigger picture of how and why things occur.

According to (Michael, 2005), qualitative methodology analyses data from direct observation, open-ended interviews and in-depth written documents, and makes knowledge claims based on a constructivist perspective. The methods used in qualitative research are reviews, focus groups, interviews and observations.

Steps in qualitative research (Wen & Chi-Ming, 1992):

  1. Collect and code the data, including interview content, analysis, observation and focus groups.
  2. Analyse the focused information to identify problems.
  3. Write detailed descriptions of the organisation's techniques, deliverables, trends and entities.

Nevertheless, (Beach, et al., 2001) describe qualitative methodology as more costly, time consuming and more challenging.

Advantages of qualitative methodology (Michael, 2005):

  1. It allows flexibility and the attainment of a valid understanding of the subject.
  2. It is easier to understand the target data because of the questions asked during the research process, such as why, how and when.

4.2.4     Mixed research methods

The mixed method combines elements of both the quantitative and the qualitative approach. (Sharon & Halcomb, 2006) explain that mixed methods permit the collection of quantitative and qualitative data in the same study; it is left to the researchers to determine the extent to which they base their work on one approach over the other.

Advantages of mixed research methods (Sharon & Halcomb, 2006):

  1. It enables the researcher to build a study based on the strength of both qualitative and quantitative research methods.
  2. It provides a complete picture of research problems.

Each of the techniques possesses advantages and disadvantages. To reduce the limitations and the effect of one method over the other, the study applies the concept of triangulation (Jick, 1979). Both qualitative and quantitative methods, used separately and in combination through triangulation, will be adopted for this research, since triangulation enables the collection of data in numerical format (attributes, numbers, etc.) and in text format (surveys, interviews, etc.).

The methodology adopted in this research therefore mixes quantitative methods, such as statistical analysis, with qualitative analysis, such as the collaborators' inputs.

4.3     Research Method Steps

 

The first step of this research will be to build and develop the new model for integrating and transforming the data originating from the different software tools used in manufacturing processes. The research is limited to approaches for systems integration from the data modelling point of view; the project is intended to be continued in the PhD in order to realize a seamless integration and transformation of the already established software tools.

 

Figure 4.1 Proposed research framework

4.3.1     Step one:   Data collection

This research is part of a Technology Strategy Board (TSB) project aimed at the AMDT (Autonomous Model Development Tool). The initial data for the proposed method will therefore be collected from the different collaborators involved in the development of the PackML model. Additionally, a case study of a collaborator's production line data will be used to evaluate the success of the implementation of the solution.

4.3.2     Step two: Semantic definition

The first segment of modelling with PackML is the definition of the semantics, which helps to obtain accurate results from the novel model. In this stage the structure of the static object information is defined, together with the data types that will be handled by the system. The system is thus made aware of the form of data to expect from each specified section of the engineering system and can put in place the necessary tools to process that data (Lou, 2009). The following figure illustrates the successful transfer of data between the stack magazine module and the changer module using the methodology described here.

Figure 4.2 Exchange of data using PackML between the Stack Magazine Module and the Changer Module

The data and the object structure that are expected to be stored are defined at this stage. The illustration above demonstrates how the composition layout and the object relations are defined in a hierarchical structure. In this way the system is capable of presenting basic usages of PackML elements that can be used for the description of the IMC FBs (Lou, 2009).
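
A minimal sketch of this semantic-definition step is given below: the expected object structure and the data types are declared up front so that incoming data can be validated against them. The module and attribute names follow the example of Figure 4.2 but are illustrative assumptions only.

    # Sketch of the semantic definition: the object structure and the data
    # types the model expects are declared before any data is exchanged.
    # Module and attribute names are illustrative assumptions.

    SEMANTIC_MODEL = {
        "StackMagazineModule": {
            "attributes": {"Capacity": int, "WorkpieceType": str},
            "children": ["ChangerModule"],
        },
        "ChangerModule": {
            "attributes": {"CycleTime_s": float},
            "children": [],
        },
    }

    def validate(module_type, data):
        """Check that incoming data matches the declared semantic definition."""
        expected = SEMANTIC_MODEL[module_type]["attributes"]
        for key, expected_type in expected.items():
            if key not in data or not isinstance(data[key], expected_type):
                raise TypeError(f"{module_type}.{key} must be {expected_type.__name__}")
        return True

    print(validate("StackMagazineModule", {"Capacity": 24, "WorkpieceType": "Tray"}))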

4.3.3     Step three:   Identification of the components

The identification of the components (attributes and modelling elements) is also critical to the implementation approach. Identifying components involves outlining the roles associated with the abstract functions of the components. The requirements and properties of each role are then stored as attributes within the corresponding RoleClass element. The InductiveSensorRole element, for example, can be used to specify the operating temperature, size, sensing range and physical parameters of a component. All this information is captured within the PackML module together with the correct dimensions, sensitivity, suitable temperature tolerance and other features associated with correct operation of the system. The system unit class library will be used to store and categorize the details of the components (Lou, 2009). Typical components of the data exchange model are illustrated in the following figure.

Figure 4.3 Components of data exchange model
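
The component-identification step can be sketched by storing the role requirements as attributes of a role class, in the spirit of the InductiveSensorRole element described above. The attribute names and values below are hypothetical.

    # Sketch of step three: component roles and their requirements are stored
    # as attributes of a role class (modelled here as a small Python class).
    # The requirement values are hypothetical.

    class RoleClass:
        def __init__(self, name, **requirements):
            self.name = name
            self.requirements = requirements   # e.g. sensing range, temperature

        def matches(self, component):
            """A component satisfies the role if it meets every requirement."""
            return all(component.get(k) == v for k, v in self.requirements.items())

    inductive_sensor_role = RoleClass(
        "InductiveSensorRole",
        sensing_range_mm=4, max_temperature_c=70, size="M12",
    )

    candidate = {"sensing_range_mm": 4, "max_temperature_c": 70, "size": "M12"}
    print(inductive_sensor_role.matches(candidate))   # True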

4.3.4   Step four: Definition of design data

The design data modules incorporate data that goes beyond the core design data. This data can be distributed among the discrete event simulation tools in order to make the model self-adapting. The definition of the data is derived from the software tool from which it is obtained. The most important aspect of the definition is the link to the module containing the core design data element that is described by the information stored in the software tool. A template file is also attached to this link; at this stage the template file is empty, and it is expected to carry the detailed design data and to gather the necessary design data described in the software tool (Lyons, 1977).

Adding a design data module to a core design element gives rise to a link between the two sets of information. There are several options for establishing such a link, and the option chosen depends on the implementation method. The file containing the information about the detailed design data of the software tool ought to be stored in the folder that represents the core data element, and the name of the file can then be set in accordance with the template file (McGrath, 2007).
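
The linking mechanism of step four can be sketched as a core design element that is given a reference to an initially empty template file, which will later carry the detailed design data. The file paths and field names are hypothetical.

    # Sketch of step four: a design data module links a core design element to
    # a template file that will later hold the detailed, tool-specific design
    # data. File paths and field names are hypothetical.
    import json
    from pathlib import Path

    def attach_design_data(core_element: str, folder: Path) -> Path:
        """Create an empty template file and record the link to it."""
        folder.mkdir(parents=True, exist_ok=True)
        template = folder / f"{core_element}_design_data.json"
        template.write_text(json.dumps({}))         # empty template, filled later
        link = {"core_element": core_element, "template_file": template.name}
        (folder / "links.json").write_text(json.dumps(link, indent=2))
        return template

    path = attach_design_data("Conveyor_01", Path("design_data/Conveyor_01"))
    print("template created at", path)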

4.3.5     Step five:   Definition of the relations

This stage analyses the relationships between the different modelling elements (MEs) within the different forms of data. The system is expected to handle different types of data, and their nature and interrelations are defined at this stage. Among the elements used to define the relationships among the data types are the SupportedRoleClass elements, which are used to define inheritance relations.

The last stage in the modelling is the provision of the core design data enriched with the detailed design data, including the data modules and other aspects of the software tools involved in the interdisciplinary engineering production processes.

 

4.3.6     Step six:   Selection of the elements

This stage involves the selection of the classes associated with the components. The selection process normally starts with the instantiation of the internal elements, followed by the assembled hierarchical elements. This facilitates the definition of the internal structure of the system: all the components used to create the module, together with their attributes, are defined, and their details, roles and requirements are stipulated.

The last part of the definition is realized by defining the external interfaces of the components. Here, definitions for the workpiece flow direction, the connection ports and the types of workpiece are provided. The interface concept provides a definition mechanism for the semantics of the automatic connections established within the system (Lawrence, 1984).

4.3.7     Step seven:   New approach to data modelling strategy

The new approach proposed here makes use of the PackML standard format for exchanging data. The format is well suited to monitoring and controlling engineering production. Additional information relating to visualization and conversion can be integrated into PackML files while remaining consistent with the XML schema. The data model for the proposed approach constitutes the communication of automation components in the form of PackML classes referred to as templates (Nuclear Data Exchange, 1963).

The results of the implementation process describe the corresponding PackML classes, which provide a description of the structure as well as the semantics of the data items. A system-independent data representation is also provided for the syntactic definition, and meta-mechanisms are used to define the required semantics. The PackML classes are defined by the system unit class, and the information relevant to the software tool system is stored within the PackML class description. This was found to decouple the data representation from the software tools.

Additionally, it allowed algorithmic access to the data structures that were provided. The mapping was then applied to the current PackML data associated with the software tools, which created the possibility of monitoring and integrating the data from the software tools.

The existing information, structured according to the manufacturer's format, was transformed into the PackML format. This facilitated the expected transformation of information into other software tools, as long as an appropriate PackML description is provided for the available data (Ortin et al., 2014). The following figure illustrates the flow chart of the modelling strategy.

Figure 4.4 Flow chart of modeling strategy
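
The transformation outlined above, in which manufacturer-structured information is mapped onto a PackML-style description so that other tools can reuse it, can be sketched as follows. The template fields and the mapping table are assumptions made for illustration.

    # Sketch of the proposed strategy: manufacturer-structured data is mapped
    # onto a PackML-style template so that other tools can consume it. The
    # template fields and the mapping table are illustrative assumptions.

    PACKML_TEMPLATE = {"UnitName": None, "Mode": None, "State": None, "Throughput": None}

    # mapping from one (hypothetical) manufacturer's field names to the template
    MANUFACTURER_MAPPING = {
        "machine_id": "UnitName",
        "op_mode": "Mode",
        "status": "State",
        "rate_ppm": "Throughput",
    }

    def to_packml(manufacturer_record: dict) -> dict:
        result = dict(PACKML_TEMPLATE)
        for source_field, target_field in MANUFACTURER_MAPPING.items():
            if source_field in manufacturer_record:
                result[target_field] = manufacturer_record[source_field]
        return result

    plc_export = {"machine_id": "Filler_01", "op_mode": "Automatic",
                  "status": "Execute", "rate_ppm": 120}
    print(to_packml(plc_export))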

This concept was used for the exchange of information from the field level to the enterprise level, across the different software tools described in the model. In this case the information originating from the PLC was successfully re-used for the production monitoring and control of other phases of the production (Osborne, 1990).

4.4     Proposed strategy and its advantages

The extended PackML is equally important to the novel model, since it provides standardized data exchange and storage facilities that are planned in a syntactic way. It therefore follows that the software tool format used in a given production line cannot cause any changes to the XML description format employed in processing the data.

Furthermore, the data templates defined by the authors can be used to provide the semantic specification of the required object structures (Poussart and Olsson, 2004).

The extended PackML makes it possible to process planning data using PackML files in a manufacturer-independent way. Additionally, it is possible to analyse the data and to visualize the differences that may lie between two entities.

The extended PackML also constitutes a to-do list that contains the calculated differences between the software tools involved. A transparent data exchange is thus realized, with defined templates that help to realize the required mapping.

The presentation of the differences makes it possible to construct a mapping even without detailed technical background knowledge. Two PackML files can be mapped even when they come from two different manufacturers, by evaluating the structure templates of the corresponding system libraries. This means that a control-system PackML file and a PackML file that comes with a PLC can be related: the corresponding control-system PackML class and PLC PackML class are examined on the basis of the available structures and possible attributes. The data types and the units corresponding to the attributes can also be defined (Processing XML Queries Using XML Materialized Views: Decomposition of a Path Expression and Result Integration, 2003).

Such a definition can also incorporate the syntactic and semantic results, which can be used to provide better integration of content obtained in the future. The mechanism by which the authors integrate the required information in the PackML model together with the extended PackML files is shown below.

Figure 4.5   Illustration of data exchange model
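
The to-do list of calculated differences can be sketched by comparing the attribute structures of two PackML-style class descriptions, for example one from a control system and one delivered with a PLC. Both structures below are hypothetical.

    # Sketch of the "to-do list" idea: the attribute structures of two
    # PackML-style class descriptions are compared and every difference is
    # listed for later manual or automatic mapping. Both structures are
    # hypothetical examples.

    control_system_class = {"UnitName": "string", "State": "enum",
                            "Speed": "float:m/s"}
    plc_class = {"UnitName": "string", "State": "enum",
                 "Speed": "float:mm/s", "AlarmCode": "int"}

    def difference_todo(left: dict, right: dict) -> list:
        todo = []
        for attr in sorted(set(left) | set(right)):
            if attr not in left:
                todo.append(f"'{attr}' appears only in the PLC description")
            elif attr not in right:
                todo.append(f"'{attr}' appears only in the control-system description")
            elif left[attr] != right[attr]:
                todo.append(f"'{attr}' differs: {left[attr]} vs {right[attr]}")
        return todo

    for item in difference_todo(control_system_class, plc_class):
        print("TODO:", item)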

It is also worth noting that most of the engineering processes are explained using the PackML description of the plant. The available engineering information can be presented as system-independent data, sent to the monitoring and control system through the internet, and validated using PackML (Ra, 2006). The following figure illustrates the engineering workflow of the underlying process.

Figure 4.6   Illustration of the engineering workflow

The information received in this way can be separated into engineering-relevant and visualization-relevant data using an XSL transformation. The system is thus automatically enabled to generate PackML files by finding the relevant links in the database (Shen et al., 2010).
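
The separation of received information into engineering-relevant and visualization-relevant parts can be sketched with a small XSL transformation; the sketch below uses the third-party lxml package, and the document structure and the category attribute are assumptions made for illustration.

    # Sketch of splitting received data with an XSL transformation (requires
    # the third-party lxml package). The document structure and the "category"
    # attribute are illustrative assumptions.
    from lxml import etree

    SOURCE = etree.fromstring(
        "<Plant>"
        "<Item category='engineering' name='PumpSpeed'>1450</Item>"
        "<Item category='visualization' name='SymbolColour'>green</Item>"
        "</Plant>"
    )

    ENGINEERING_XSLT = etree.fromstring("""<xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/Plant">
        <EngineeringData>
          <xsl:copy-of select="Item[@category='engineering']"/>
        </EngineeringData>
      </xsl:template>
    </xsl:stylesheet>""")

    transform = etree.XSLT(ENGINEERING_XSLT)
    print(str(transform(SOURCE)))   # keeps only the engineering-relevant items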

 

Chapter 5

5 Experimental results

The methodology and analysis outlined above match the requirements of PackML, and there are many benefits associated with using PackML as a standard data exchange platform. Sections 4.3.7 and 4.4 above describe the strategies and the experimental results that can be obtained by implementing PackML as a standard novel model for data exchange. This section will also present the benefits and critical characteristics of PackML as a standard data exchange platform. Additionally, a case study will be provided to evaluate the success of the implementation of the solution (McLaughlin, 2000).

 

 

 

 

 

 

 

Chapter 6

6     Conclusion

The exchangeability of engineering production data has been discussed in detail, and the available standardized data exchange formats have been reviewed. Additionally, a model has been developed to integrate and transform data originating from different software tools. The model developed here can serve as an appropriate strategy for introducing seamless integration of the data files involved. Several attempts have been made in the past to establish standardized data exchange formats; the most recent of these is PACKML, which is considered the current data exchange format and has been endorsed by the IEC.

It is also worth mentioning that the novel model developed in this research incorporates the object-oriented paradigm. Object-oriented technology encapsulates the development of pattern solutions as classes, and individual classes can be used to generate an arbitrary number of instances. This concept has been combined with the file descriptors of the different software tools to construct the model.
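Purely as an illustration of this idea (the names below are hypothetical and far simpler than the model itself), a class can act as the pattern solution and be instantiated once per software-tool file descriptor:

# Illustrative only: one class, arbitrarily many instances, one per tool.
from dataclasses import dataclass

@dataclass
class ToolFileDescriptor:
    tool_name: str      # e.g. the PLC engineering tool or the control system
    file_format: str    # native export format of that tool
    packml_class: str   # PACKML class onto which the exported data is mapped

descriptors = [
    ToolFileDescriptor("ControlSystemTool", "xml", "ControlSystemClass"),
    ToolFileDescriptor("PLCEngineeringTool", "xml", "PLCClass"),
]
for d in descriptors:
    print(f"{d.tool_name}: {d.file_format} -> {d.packml_class}")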

 

 

 

References

Aarts, J. (2005). Coins, money and exchange in the Roman world. A cultural-economic perspective. Archaeological Dialogues, 12(01), p.1.

Arnim, W. & Scholz, W., 2006. Functions of scenarios in transition processes. Futures, 38(7), pp. 740–766.

Beamer, A. and Gillick, M. (2010). ScotlandsPlaces XML: bespoke XML or XML mapping?. Program: electronic library and information systems, 44(1), pp.13-27.

Beach, R. et al., 2001. The role of qualitative methods in production management research. International Journal of Production, 74(1-3), pp. 201-212.

 

Barnes, M. & Arnaud, R., 2006. COLLADA, Mass: Peters Wellesley.

Boettner, B. (2014). Destabilizing data in CF. Science-Business eXchange, 7(34).

Brekelov, V., Borisov, E. and Barygin, I. (2015). Integration Testing Automation: Case Study of Financial Data Exchange Modules Based on FIX-protocol. St. Petersburg State Polytechnical University Journal. Computer Science. Telecommunications and Control Systems., 212(1), pp.88-96.

Cameron, M. (1996). More exchange of ideas needed. Computational Statistics & Data Analysis, 23(1), p.198.

Carey, P. (2003). XML. Cambridge, Mass.: Course Technology.

Cevallos, C. (2014). Automatic generation of 3D geophysical models using curvatures derived from airborne gravity gradient data. GEOPHYSICS, 79(5), pp.G49-G58.

Chen, J. L., Yücesan, E. & Snowdon, L., 2002. Simulation Software and Model Reuse: A Polemic. New York, Association for Computing Machinery Press.

Child language data exchange system. (1984). J. Child Lang., 11(03).

Communication Shift. (2011). Marketing Review St. Gallen, 28(5), pp.62-62.

Coyle, F. (2002). XML. Addison-Wesley Professional.

Cupek, R., Ziebinski, A. and Franek, M. (2013). FPGA based OPC UA embedded industrial data server implementation. Journal of Circuits, Systems and Computers, 22(08), p.1350070.

Daniel L. Smith (1995). How an Independent is Using 3-D Seismic and CAEX Technology to Reduce Risk: ABSTRACT. Bulletin, 79.

Data exchange moves forward. (1986). Displays, 7(4), p.200.

Data exchange. (1990). Computational Statistics & Data Analysis, 9(2), pp.248-249.

Data/telex exchange. (1983). Computer Communications, 6(5), p.271.

Dejing, D., Drew, M. & Peishen, Q., 2004. Ontology Translation on the Semantic Web. Journal on Data Semantics II, pp. 35-57.

Design and Implementation of a new XML-Signcryption scheme to protect the XML document. (2003). The KIPS Transactions: Part C, 10C(4), pp.405-412.

Drath, R. (2010). Datenaustausch in der Anlagenplanung mit AutomationML. Berlin, Heidelberg: Springer-Verlag Berlin Heidelberg.

East, E. (1995). The standard data exchange format for critical path method scheduling. Champaign, Ill.: US Army Corps of Engineers, Construction Engineering Research Laboratories.

Euzenat, J., Napoli, A. and Baget, J. (2003). XML et les objets (objectif XML). Objet, 9(3), pp.11-37.

Euzenat, J. & Shvaiko, P., 2008. Ten Challenges for Ontology Matching: On the Move to Meaningful Internet Systems. pp. 1164-1182.

Exchange of Diagrams and Data between Radiotherapy Centres. (1945). Nature, 155(3939), pp.508-508.

Fishburn, C. (2013). Oxford goes big (data). Science-Business eXchange, 6(20).

Forberg, J. and Barth, M. (1985). Profiler standard format for data exchange. Boulder, Colo.: U.S. Dept. of Commerce, National Oceanic and Atmospheric Administration, Environmental Research Laboratories.

Frankel, J. and Froot, K. (1985). Using survey data to test some standard propositions regarding exchange rate expectations. Cambridge, MA: National Bureau of Economic Research.

Funding for cattle data exchange system. (2015). Veterinary Record, 176(15), pp.377-377.

Goldberg, K. and Castro, E. (2009). XML. Berkeley, Calif.: Peachpit.

Jacobs, J., Otter, P. and den Reijer, A. (2012). Information, data dimension and factor structure. Journal of Multivariate Analysis, 106, pp.80-91.

Jacobs, J., Otter, P. and Reijer, A. (n.d.). Information, Data Dimension and Factor Structure. SSRN Journal.

Jiang, Z., Fan, Y., Zhang, C. and Wang, L. (2014). Realization of Public Building Energy Efficiency Monitoring Simulation System Based on OPC UA. AMR, 926-930, pp.615-618.

Kahng, J. & McLeod, D., 1998. Dynamic Classification Ontologies: Mediation of Information Sharing on Cooperative Federated Database Systems. Cooperative Information Systems Trends and Directions, 2(5), pp. 179-203.

Kagioglou, M., Aouad, G. & Bakis, N., 2007. Distributed product data sharing environments. Automation in Construction, 16(5), pp. 586-595.

Kim, C. and Habib, A. (2009). Object-Based Integration of Photogrammetric and LiDAR Data for Automated Generation of Complex Polyhedral Building Models. Sensors, 9(7), pp.5679-5701.

Kothari, C., 2008. Research Methodology: Methods and Techniques. 2nd ed. Australia: Wiley Eastern Limited.

Kim, J. and Kang, H. (2009). XML Fragmentation for Resource-Efficient Query Processing over XML Fragment Stream. The KIPS Transactions: Part D, 16D(1), pp.27-42.

Kosek, J. (2003). Sazba XML. Zpravodaj CSTUG, 13(1), pp.3-6.

Kotz, J. (2012). Bringing patient data into the open. Science-Business eXchange, 5(25).

Kumar, B. (2013). Automated Surveillance System and Data Communication. IOSR-JCE, 12(2), pp.31-38.

Lawrence, K. (1984). Data entry in foreign exchange dealing. Data Processing, 26(7), pp.42-43.

Lawrence, P. (1959). Exchange of Genetic Material. Evolution, 13(4), p.570.

Lee, Y. T. & Yan, L., 2005. Data Exchange for Machine Shop Simulation. New Jersey, Institute of Electrical and Electronics Engineers, Inc.

Liu, M. & Yan, C., 2012. Mathematical Model and Quantitative Research Method on the Variability of Task Execution-time. China: IEEE.

Lou, K. (2009). Smaller samples, more data. Science-Business eXchange, 2(20), pp.1-2.

Lyons, N. (1977). An automatic data generating system for data base simulation and testing. ACM SIGMIS Database, 8(4), pp.10-13.

Mateosian, R. (2001). XML: Learning XML: Creating self-describing data [Book Reviews]. IEEE Micro, 21(2), pp.95-95.

McGrath, M. (2007). XML. Southam: Computer Step.

McKinnon, A. (2003). XML. Australia: Thomson.

McLaughlin, B. (2000). Java and XML. Cambridge, Mass.: O’Reilly.

McLean, C. & Leong, L., 2001. The expanding role of simulation in future manufacturing. New Jersey: Institute of Electrical and Electronics Engineers, Inc.

Michael, Q. P., 2005. Qualitative Research. 2nd ed. London: John Wiley & Sons, Ltd.

Miriam, S. & Rainer, D., 2008. The system-independent data exchange format CAEX for supporting an automatic configuration of a production monitoring and control system. s.l.: IEEE.

Morse, J., 1991. Approaches to Qualitative-Quantitative Methodological Triangulation. Research, 40(2), pp. 120-123.

Nitta, M., Ogawa, K. and Aomura, K. (1975). Studies on the Catalytic Properties of Synthetic Zeolite A. V. Site Selectivity of Silver Ion in (Agex, Caex)-A. Bulletin of the Chemical Society of Japan, 48(6), pp.1939-1940.

Nuclear Data Exchange. (1963). Physics Today, 16(7), p.66.

O’Beirne, R. (2000). XML websites. Reference Reviews, 14(5), pp.10-11.

Ortin, F., Zapico, D., Quiroga, J. and Garcia, M. (2014). Automatic Generation of Object-Oriented Type Checkers. LNSE, 2(4), pp.288-293.

Osborne, B. (1990). Information Exchange. Exchange of NIR data. NIR news, 1(1), p.12.

Peters, J. S., Smith, D. J. & Medeiros, M. W., 2001. The Expanding Role of Simulation in Future Manufacturing. Gaithersburg.

Poussart, J. and Olsson, L. (2004). Effects of Data Uncertainties on Estimated Soil Organic Carbon in the Sudan. Environmental Management, 33(S1).

Présentations libres. (2010). Pédagogie Médicale, 11, pp.S31-S53.

Processing XML Queries Using XML Materialized Views: Decomposition of a Path Expression and Result Integration. (2003). The KIPS Transactions: Part D, 10D(4), pp.621-638.

Pumsa-ard, K., Uchai, W. and Yan, Y. (2006). Meson exchange theory for high energy proton-proton scattering. Int. J. Mod. Phys. E, 15(01), pp.109-119.

Ra, Y. (2006). XML Schema Evolution Approach Assuring the Automatic Propagation to XML Documents. The KIPS Transactions: Part D, 13D(5), pp.641-650.


Romano, F. (2001). XML. Paramus, N.J.: NAPL.

Sharon, A. & Halcomb, E., 2006. Mixed methods research is an effective method of enquiry for community health research. Mixed methods research, 23(5), pp. 145-153.

Shen, B., Qi, D., Fan, L. and Meier, H. (2010). Co-Service Manufacturing System Integrated OPC UA/FDT for Complex Equipments & Field Devices. AMM, 34-35, pp.344-349.

Simon, S. (2001). XML. New York: McGraw-Hill.

Skoogh, A. & Johansson, B., 2007. Time-consumption analysis of input data activities in discrete event simulation projects. Gothenburg, Swedish Production Symposium.

SNA and document exchange. (1986). Data Processing, 28(4), p.216.

St Amant, K. (2001). Cultures, computers, and communication: evaluating models of international online production. IEEE Trans. Profess. Commun., 44(4), pp.291-295.

St. John, D. (1976). 24. The Illinois Automated Long-Term Care System—Three Years of Experience. Medical Care, 14(Supplement), pp.192-197.

Stein, M. and Dellwig, I. (2002). XML. Boston: Addison-Wesley.

Stemple, D. (1975). A data base management facility for automatic generation of data base managers. ACM SIGIR Forum, 10(3), p.14.

Suwanrungruang, K., Kamsa-ard, S. and Wiangnon, S. (2014). Asian Pacific Journal of Cancer Prevention, 15(18), pp.7985-7987.

Treese, W. (2002). XML, web services, and XML. netWorker, 6(3).

Todd, J., 1989. Mixing Qualitative and Quantitative Methods: Triangulation in Action. Administrative Science Quarterly, 24(4), pp. 602-611.

Valdes-Cruz, L., Sahn, D., Larson, D. and Scagnelli, S. (1984). Quantitation of Ventricular Septal Defect Shunting: 2D Echo Contrast Studies in Animals, Using a Standardized Experimental Echo Contrast Agent. Pediatr Res, 18, pp.121A-121A.

Venkatesh, R., Sekhar, M. and Sandeep, K. (2014). Secured Luxuries Transporting System. IOSRJECE, 9(2), pp.91-110.

Vergeest, J. (1996). Product Data Exchange. Computer-Aided Design, 28(8), p.665.

Vesga, J., Cori, A., van Sighem, A. and Hallett, T. (2014). Estimating HIV incidence from case-report data. AIDS, 28, pp.S489-S496.

Viviana, M., Angela, L. & Paolo, R., 2008. Automatic Ontology Matching via Upper Ontologies: A Systematic Evaluation. Genova: IEEE Computer Society.

Volume G: Definition and Exchange of Crystallographic Data. (2006). Chemistry International — Newsmagazine for IUPAC, 28(5).

Walle, A., 2005. Work in Progress - Research Methodologies: Why, What, and How. Indianapolis: IEEE.

Waltersdorfer, F. & Zoitl, A., 2010. Version Management and Conflict Detection across Heterogeneous Engineering Data Models in Industrial Informatics. London: IEEE.

Waltersdorfer, F., Moser, T. & Zoitl, A., 2010. Version management and conflict detection across heterogeneous engineering data models. 8th IEEE International Conference on Industrial Informatics, pp. 928-935.

Wen, P. & Chi-Ming, C., 1992. A Quantitative Measure for Different Testing Methodologies. Taiwan: IEEE.

Wiman, H. (1998). Automatic Generation of Digital Surface Models Through Matching in Object Space. The Photogrammetric Record, 16(91), pp.83-91.

Woolf, B. & Hohpe, G., 2011. Enterprise Integration Patterns: Designing, Building and Deploying Solutions. Reading: Addison-Wesley.

 

XML ersetzt GAEB. (2003). Bautechnik, 80(4), pp.272-272.

XML. (2004). IT Prof., 6(2), pp.61-61.

1. Peng and M. Ohura (2000). Remote Automated Environmental Control System for Insect Production. Applied Engineering in Agriculture, 16(6), pp.715-721.

YIN, L. and TIAN, H. (2010). XML weak functional dependency and inference rule based on XML Schema. Journal of Computer Applications, 30(9), pp.2314-2316.

Zhang, W.-B. & Steven, S., 2006. PATH Innovative Research on ITS Technologies and Methodologies for Manufacturing Solutions. Toronto, IEEE.
