IEEE International Conference on Smart Computing and Electronic Enterprise (ICSCEE2018) ©2018
FlexiDC: A Flexible Platform for Database Conversion

SK Ahammad Fahad
Faculty of Computer and Information Technology, Al-Madinah International University, Malaysia

Wael M.S. Yafooz
Faculty of Computer and Information Technology, Al-Madinah International University, Malaysia
[email protected]
Abstract—Database conversion is a process to transfer data from one database to another along with its structure. Since many database systems are created by organizations or individuals, such systems can be of diverse types such as Access, Oracle and MySQL. The progression in technology requires some systems to be upgraded to a newer system (e.g. adding new record structures, changing platform) or migrated (e.g. adapting to a newer version of a system). In order to work with different types of databases, a common platform is needed to perform data integration or conversion due to their heterogeneity and platform diversity. This paper presents a computerized tool, namely FlexiDC, which is implemented using the Java programming language to provide a single platform for database conversion. This platform uses Oracle as a working platform that allows records from various formats and types of databases to be integrated and manipulated before producing a single or multiple databases. The novelties of this work are column-level conversion and flexible changing of data types. Therefore, the cost and time to deal with any database enhancement, migration, integration, conversion and new development can be reduced in order to accommodate the changing requirements of existing databases.

KEYWORDS—Database, database conversion, heterogeneous database types, platform, FlexiDC, computerized tool.
I. INTRODUCTION
Computers manage and organize data to make daily work more efficient. Managing and organizing data can be performed through database applications. Databases are common tools to store, manipulate and retrieve information for personal and business purposes [1]. For example, databases are used to keep track of students' results and registrations in universities, and of payrolls, booking and billing systems in organizations or companies.
Normally, a database consists of a massive amount of data. A database is an integrated collection of logically related records or files consolidated into a common pool that provides data for one or multiple uses [16, 27]. It is common practice for every business to have its own database system that uses any of the database engines available in the market, such as Oracle, MS Access, SQL Server or Sybase. A company might use more than one database management system, such as Oracle for the payroll system, SQL Server for the billing system and MS Access for the stock system. The company (end user) sometimes needs specific data from more than one database, which causes delays and obstacles. A lot of effort is required to obtain the stored data due to database construction, settings and new query transformation.

Database transformation involves changing the engine between one database application and another while maintaining its original structure. One way to solve the matter is by developing a database conversion tool that is able to create a master database [17]. A master database can be created based on constructed attributes and entities. Data is imported from different databases that already exist in an organization by using a database connection such as Open Database Connectivity (ODBC) or Java Database Connectivity (JDBC). ODBC is Microsoft's strategic interface for accessing data in a heterogeneous environment of relational and non-relational database management systems; it is an open standard application programming interface (API) for accessing a database [28]. JDBC is an application programming interface, introduced by Sun Microsystems, to handle the interaction between a programming language and a wide range of databases [32]. By using ODBC or JDBC statements in developing applications, files can be accessed from different databases, such as Access, DBase, DB2, MS Excel, Oracle or SQL Server. The purpose of this study is to design software for database conversion. It should be able to import requested attributes, entities and data from different databases of the same or different types, such as MS Access, Oracle, Sybase, SQL Server and many others, into a master database. By having the master database, end users can perform specific tasks such as retrieving and managing data from the master database easily. In fact, the data conversion performed from the different databases to the master database saves the time, cost and effort of end users who spend time on enquiries and shifting between database applications. Besides, it saves the time of designing a new database system and of data entry for the new database system.

The rest of this paper is organized as follows: Section 2 presents an overview of related work, while the system architecture of the proposed model and expected results are described in Section 3. Section 4 presents the system functions. The system testing is discussed in Section 5. Section 6 concludes this paper.
II. RELATED WORK
In recent times, society depends on data management to achieve its work. Therefore, database applications usually change over time; they depend on user requirements or company structures, and new business units arise when boundaries between departments change. Due to these reasons, there are many types of database applications in the same firm, especially database applications that are implemented at different times to perform specific tasks. However, when the end user needs to retrieve particular data from more than one database application, more time and effort are required and sometimes the data cannot be retrieved precisely. Research works have focused on database access, database integrity and database management techniques.
A. Database Access
Database access is a method that allows developers to have the right to use a database. Data in the database can be manipulated through several operations such as retrieve, update and delete, and these operations must be performed with the proper rights. In addition, data definition, such as creating a new table or deleting an unwanted table, can also be performed [7, 23]. It is also possible to perform data queries by using an application programming interface (API) that acts as the interface between the application and the database. The standardized interfaces for database access, namely ODBC, OLE DB and JDBC, are available for free for database programmers as well as database administrators. They combine a common interface, a common gateway and a common protocol to provide direct access between clients and servers. Open Database Connectivity (ODBC) and Java Database Connectivity (JDBC) are industry-standard protocols for connecting directly to tables and reporting views in relational databases [24]. JDBC consists of interfaces and classes written in Java. It is a way to communicate between Java database applications and a database schema. Therefore, data can be manipulated in different types of databases by using method calls which have built-in SQL statements [9]. Furthermore, JDBC allows an application to access a variety of databases and run on any computer platform with a Java Virtual Machine. The ODBC API is difficult to master and use, and it decreases the safety, robustness and portability of the applications it develops [10]. In addition, it is possible to join reporting views/tables across multiple databases. In general, JDBC connections perform somewhat better than ODBC. Therefore, this study uses JDBC and ODBC so that the software application can access and retrieve any data from multiple databases on different platforms.
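To make the role of JDBC concrete, the short sketch below opens connections to two different engines through the same API and runs the same query against both. It is only an illustration: the JDBC URLs, credentials and the EMPLOYEE table are placeholder assumptions, not FlexiDC's actual configuration, and the corresponding vendor drivers are assumed to be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class JdbcAccessDemo {

    // Placeholder JDBC URLs and credentials; real deployments supply
    // their own hosts, service names, schemas and passwords.
    private static final String[][] SOURCES = {
        {"jdbc:oracle:thin:@//localhost:1521/ORCLPDB1", "scott", "tiger"},
        {"jdbc:mysql://localhost:3306/payroll",         "root",  "secret"}
    };

    public static void main(String[] args) {
        for (String[] src : SOURCES) {
            // try-with-resources closes the connection, statement and result set.
            try (Connection con = DriverManager.getConnection(src[0], src[1], src[2]);
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM EMPLOYEE")) {
                if (rs.next()) {
                    System.out.println(src[0] + " -> EMPLOYEE rows: " + rs.getInt(1));
                }
            } catch (SQLException e) {
                System.err.println("Cannot read from " + src[0] + ": " + e.getMessage());
            }
        }
    }
}
```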
B. Database Integrity
Data integrity refers to the correctness of the database, which means that the data is reliable. It is easy to assign data integrity constraints to any attribute in a table. The advantage of these constraints is that the DBMS enforces the conditions for every operation on the table regardless of the source or method of data entry [38]. The two components that show the correctness of the database are data validation and referential integrity.

1) Validation
Data validation is a vital part of any application in order to make sure that the entered data matches the business rules of the selected application. Therefore, in huge and complex database applications, a powerful service for controlling the semantic correctness of the data is required (Grefen & Apers, 1993; Tandel et al., 2014). However, the correctness of data depends on data integrity, where two main complementary components of database integrity are used: the validity of data, which ensures that all incorrect information is excluded from the database, and the completeness of data, which ensures that all correct information is included in the database.

The database system is responsible for data correctness. In other words, the database management system protects the database against incorrect data that does not reflect the real-world data or business rules (Türker & Gertz, 1997). During database design, semantic integrity constraints can be specified by assigning the business rules and database constraints to the database entities; these depend on the relationships between the entities (tables), which is called referential integrity.

2) Referential Integrity
A database schema contains more than one entity, called tables, and a link between two tables is called a relationship. In order to verify the correctness of the relationships between tables, referential integrity can be used. It is a database constraint that ensures the references between data are valid and intact. The database management system also ensures that a foreign key value in a table matches a primary key in the referenced table. For many reasons, referential integrity is often violated, for example when tables with similar information, extracted from different sources, are integrated.

There are several advantages of using referential integrity in a database, such as improved data quality, faster development, fewer bugs and consistency across applications. Referential integrity relationships are defined by primary keys and foreign keys during the process of table creation; thus the relationships are automatically maintained during load, update, delete and insert operations. However, in a relational database system, referential integrity becomes the main global constraint. "Referential integrity can be violated basically for two reasons: (1) a relation (table) coming from a different source database is integrated into a central database, such as a data warehouse; (2) a database operation at the row level introduces a referential integrity violation when a constraint is disabled or nonexistent" [30].

Referential integrity also arises in data warehousing, data modeling, data quality assurance and database integration to achieve the consistency and competence of a database.
C. Database Management Techniques
Recently, database management systems (DBMS) have been improving their performance through database applications that concentrate on data management in all businesses. In this study, techniques in data management are discussed.

1) Database Migration
Relational databases are widely used and keep growing to satisfy user needs and to optimize business processes. Most database applications deal with a large database and also with new versions of database systems. Therefore, database applications must meet the users' needs, especially when IT organizations consolidate storage, purchase or lease new storage arrays, or refresh their
technology. Therefore, it is very important to know what is meant by database migration and which techniques are available, because most relational database systems and object database systems support method migration and data migration.

Data migration is necessary when a company upgrades its database or system software. Thus, database migration is the upgrading of the database system from one version (the old version) to another (the new version) by moving the required volumes of data. Data migration can involve not only records in a database but also images, text files, paper documents and spreadsheets [39, 40].
There are three migration strategies for data migration, namely (1) data and code conversion, (2) language transformation and (3) data propagation between two database management systems. All three strategies are commonly used by commercial products that can be found on the market [6]. The migration process involves converting data and applications. The conversion of data has three stages: capture the source data, map or create a relational database model, and migrate the source data [26]. The migration process can therefore be performed in stages. The first stage accesses the source database so that all relationships between the entities and their constraints can be captured. In the second stage, the constraints and referential integrity between the database schemas are validated before the destination database schema is implemented. In the third stage, the logical data from the source schema is copied to the destination schema. A data conversion process is involved during the data migration.
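The three stages can be sketched with plain JDBC as follows. This is only an outline under simplifying assumptions (a single table, no constraint translation, Java 11+ for String.repeat); the naive type mapping in the middle stage is exactly what a real conversion tool would have to replace with vendor-specific translation.

```java
import java.sql.*;
import java.util.ArrayList;
import java.util.List;

/** Sketch of the three migration stages, capture / map / migrate, for one table. */
public class TableMigrationSketch {

    public static void migrateTable(Connection source, Connection target, String table)
            throws SQLException {
        // Stage 1: capture the source structure from the JDBC metadata.
        List<String> columnDefs = new ArrayList<>();
        List<String> columnNames = new ArrayList<>();
        try (ResultSet cols = source.getMetaData().getColumns(null, null, table, null)) {
            while (cols.next()) {
                String name = cols.getString("COLUMN_NAME");
                String type = cols.getString("TYPE_NAME");
                int size = cols.getInt("COLUMN_SIZE");
                columnNames.add(name);
                // Stage 2: map the column to a type of the destination engine.
                // A real tool would translate vendor-specific types here; this sketch
                // only appends a length to character types.
                boolean sized = type.toUpperCase().contains("CHAR");
                columnDefs.add(name + " " + type + (sized ? "(" + size + ")" : ""));
            }
        }
        try (Statement st = target.createStatement()) {
            st.execute("CREATE TABLE " + table + " (" + String.join(", ", columnDefs) + ")");
        }

        // Stage 3: migrate the rows from the source schema to the destination schema.
        String select = "SELECT " + String.join(", ", columnNames) + " FROM " + table;
        String insert = "INSERT INTO " + table + " VALUES ("
                + "?,".repeat(columnNames.size() - 1) + "?)";
        try (Statement st = source.createStatement();
             ResultSet rows = st.executeQuery(select);
             PreparedStatement ps = target.prepareStatement(insert)) {
            while (rows.next()) {
                for (int i = 1; i <= columnNames.size(); i++) {
                    ps.setObject(i, rows.getObject(i));
                }
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}
```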
2) Database Conversion
The performance of developed database applications only becomes realizable if the database interacts with end users. Today there are several database applications developed by different vendors, each of which has different features. Data conversion is the process of converting values of data items from one database to another database to improve productivity and flexibility. Data conversion can also refer to the movement of data from a system or application to a replacement application or subsystem. The conversion process is performed between various databases and file formats to convert data from a legacy system to a new system [41]. Database conversion is also the process needed to convert one database schema to another database schema. This is done to improve the system performance, to get more storage and to use the new features of each vendor's products. Therefore, the databases must be schema-equivalent during data conversion; that is, all the constraints in the second database (destination) must be the same as in the first database (source).
The two main techniques in the database conversion process are schema evolution and schema versioning [36, 27]. Schema evolution refers to the changes that happen to the content of the database. In contrast, schema versioning creates and develops a new schema while preserving the data of the old schema. There are two methods of data conversion: the first is the single table version (STV) and the second is the multiple table versions (MTV). Both methods fail to manage the space of the database schema after conversion due to the redundancy of attributes. Therefore, Wei and Elmasri suggested a new concept called partial multiple table versioning (PMTV) to solve the problem of managing space in single and multiple tables. It uses the concept of temporal normalization, reduces the complexity and simplifies the database conversion process. "[When] a new attribute is added, it creates a new bi-temporal relation with (only) the new attribute, plus the key attribute of the relation being modified. This way, no null values will be introduced, no searching for the overlapped current versions is needed, no database restructuring and data duplication is required, and no extra effort is needed for the problem of mandatory version creation." [26, 27]

3) Database Integration
An integrated database is a database that acts as the data storage for multiple applications in a distributed or local area. The integrated database combines data across database applications. Integrated databases need a schema that brings all their client applications into a single physical database management system for data management [31]. In this setting, data sharing between applications does not require an extra layer of integration services on the applications. Any changes to data made in a single application are made available to all applications at the time the database transaction is committed, thus keeping the applications' data in a synchronized state.

There are several reasons why many big businesses have spent large amounts of money implementing data warehouses through database integration. The main benefit of using data warehouses is that they store and present information in such a way that business executives can make important decisions and look at the company as a whole. Another benefit of data integration is that it can obtain information with good quality and precise meaning from different locations (distributed databases) by using the semantic relations between the integrated data. There are two general approaches to database integration: the data warehouse approach and the federated database approach. The data warehouse approach emphasizes data translation, while the federated approach emphasizes query translation [3, 33]. The warehouse approach transforms data from diverse sources (distributed databases) into a local warehouse that executes all queries and operations on the local data warehouse rather than on the distributed sources of that data in different places. This approach overcomes several problems, including slow response times and network bottlenecks, and also improves query efficiency because queries can be performed locally.

The federated database approach, which is very popular today, is used in database integration systems [31]. It focuses on query translation using middleware that works at runtime. A federated database is responsible for the translation of a user query on a single federated schema into queries on the local schemas. Therefore, the mapping is between the source schema and the local federated schema. This process allows the translation of queries between the federated schema and the source schemas. Database integration can also happen through attribute integration based on attribute and entity identification, which uses information derived from the attribute values. [5] highlighted a robust attribute identification method that matches attribute groups for integration.
Furthermore, full details of the data instances can be obtained. In addition, the related attributes to integrate can be identified, and cases of misleading database schemas can be resolved.

III. SYSTEM ARCHITECTURE
System architecture is the conceptual design of any system that shows the structure of the system and its components. The system architecture of this proposal is illustrated in Figure 1 to show all the important components:
First is the user. Users are the people who use computers for personal or business purposes to carry out their tasks more productively. In this research, the users are responsible for browsing the databases and selecting the attributes or entities from the different databases that are required to design the new database. New databases are designed based on the previous ones.

Second is JDBC, which is the way of accessing the databases and manipulating data from any platform. It was developed by Sun Microsystems, as discussed in the literature review.

Third is Database Conversion. The process of converting data is the most important practical component of this research and is carried out automatically by the system. The automatic process confirms and maintains relationships and constraints as well as checking the validation of data between the entities in the databases, based on the selection of columns and tables. Five elements must be performed for the database conversion process to be successful:
1) Data validation is the first step of data conversion. It checks whether there is data in the source databases.
2) Referential integrity checks the referential integrity between the tables. It matches the primary key with the foreign key, taking the data type of the column into consideration.
3) Constraints checks whether the source database contains other constraints such as unique or rule constraints.
4) Creation of database and conversion creates the target database and automatically converts the data from the source databases to the target database.
5) Verification is the final step, checking that all data has been transferred from the source databases to the target database.

Fourth is the Graphical User Interface, which interacts with the users in order to display the available (source) databases. It is also the way to access the databases (source or target) as well as the means by which the creation of the target database is performed automatically, based on the user's requested attributes.
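A rough outline of how these five steps could be chained is given below; the method names are illustrative rather than FlexiDC's actual API, steps 2 to 4 are left as placeholders, and the verification step is reduced to its simplest form, a row-count comparison between source and target.

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

/** Illustrative outline of the five-step conversion process described above. */
public class ConversionPipeline {

    public void convert(Connection source, Connection target, String table) throws SQLException {
        if (!hasData(source, table)) {                 // 1) data validation
            throw new SQLException("Source table " + table + " is empty or missing");
        }
        checkReferentialIntegrity(source, table);      // 2) primary/foreign key check
        checkOtherConstraints(source, table);          // 3) unique and rule constraints
        createAndConvert(source, target, table);       // 4) create target and copy rows
        if (countRows(source, table) != countRows(target, table)) { // 5) verification
            throw new SQLException("Row counts differ after conversion of " + table);
        }
    }

    private boolean hasData(Connection con, String table) throws SQLException {
        return countRows(con, table) > 0;
    }

    private long countRows(Connection con, String table) throws SQLException {
        try (Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    // Steps 2-4 are placeholders here; the earlier sketches show how metadata
    // inspection and row copying can be implemented with JDBC.
    private void checkReferentialIntegrity(Connection con, String table) throws SQLException { }
    private void checkOtherConstraints(Connection con, String table) throws SQLException { }
    private void createAndConvert(Connection src, Connection dst, String table) throws SQLException { }
}
```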
Fig. 1. System architecture
Fifth is Single or Multiple Databases. These are the databases that already exist in the organization; they can be of the same or different types and are also known as the source databases. End users can select tables and columns from those databases. Data transfer is performed after the new database design is completed.

Sixth is the Master Database, the new (target) database that is prepared in Oracle. This database is designed automatically by the program based on the fields and tables selected by the users.

IV. SYSTEM FUNCTIONS
There are four main elements in the FlexiDC tool: the Tree Table, the Temporary Table, Direct Conversion and Target Database Management.
A. Tree Table
The first important element is the Tree Table. It appears automatically when the user selects a database source. It is a tree view that represents the database schema; the database schema contains the tables and attributes of the different (source) databases. Figure 2 shows all entities and attribute descriptions. The contents of the Tree Table are the database type, tables, columns and a direct conversion button. The users of FlexiDC can select any attributes from the Tree Table, and the selected attributes are directly transferred to the Temporary Table. In addition, when FlexiDC's users double-click on a table name, the columns of the selected table are expanded. When users single-click on a column name, all the column descriptions are copied to the Temporary Table.
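The tree can be populated directly from the JDBC metadata of the selected source. The sketch below is a simplified, text-only stand-in for the graphical tree view: it assumes an already opened connection and prints the database type, its tables and their column descriptions with indentation.

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.ResultSet;
import java.sql.SQLException;

/** Prints a tables-and-columns tree for one source database. */
public class SchemaTreePrinter {

    public static void printTree(Connection source) throws SQLException {
        DatabaseMetaData meta = source.getMetaData();
        System.out.println(meta.getDatabaseProductName());           // database type (root node)
        try (ResultSet tables = meta.getTables(null, null, "%", new String[] {"TABLE"})) {
            while (tables.next()) {
                String table = tables.getString("TABLE_NAME");
                System.out.println("  " + table);                    // table node
                try (ResultSet cols = meta.getColumns(null, null, table, "%")) {
                    while (cols.next()) {
                        System.out.println("    " + cols.getString("COLUMN_NAME")
                                + " : " + cols.getString("TYPE_NAME")
                                + "(" + cols.getInt("COLUMN_SIZE") + ")"); // column node
                    }
                }
            }
        }
    }
}
```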
B. Temporary Table
The second important element of the FlexiDC tool is the Temporary Table, so called because it temporarily holds the entities selected from the Tree Table. It holds the description of the selected columns, such as the column name, column data type, column size, source table name and source database, to be merged into the target database, as well as the target table name and the SQL statement. The user can change the target table name and edit the SQL statement, which is created automatically while the user selects columns from the Tree Table.
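A possible shape for such a column description, together with the automatic generation of the target CREATE TABLE statement, is sketched below; the SelectedColumn class and its fields are assumptions made for illustration, not FlexiDC's internal data structure.

```java
import java.util.List;
import java.util.stream.Collectors;

/** Holds the description of one selected column, as kept in the temporary table. */
public class SelectedColumn {
    final String sourceDatabase;   // e.g. "payroll (MS Access)"
    final String sourceTable;      // e.g. "EMPLOYEE"
    final String columnName;       // e.g. "EMP_NAME"
    final String dataType;         // e.g. "VARCHAR"
    final int size;                // e.g. 50

    SelectedColumn(String sourceDatabase, String sourceTable,
                   String columnName, String dataType, int size) {
        this.sourceDatabase = sourceDatabase;
        this.sourceTable = sourceTable;
        this.columnName = columnName;
        this.dataType = dataType;
        this.size = size;
    }

    /** Builds the CREATE TABLE statement for the target table from the selected columns. */
    static String createTableSql(String targetTable, List<SelectedColumn> selection) {
        String columns = selection.stream()
                .map(c -> c.columnName + " " + c.dataType + (c.size > 0 ? "(" + c.size + ")" : ""))
                .collect(Collectors.joining(", "));
        return "CREATE TABLE " + targetTable + " (" + columns + ")";
    }
}
```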
C. Direct Conversion
The third important element in FlexiDC is Direct Conversion. It can perform a full conversion of tables from the different databases (source databases) to the target database (destination database).
D. Target Database Management
The fourth important element is Target Database Management. It gives users the flexibility of managing the target database, such as showing, editing, dropping and renaming both tables and columns. In addition, users can change the data type and size of a column. Adding a new column is not supported.
V. SYSTEM TESTING
In this section, the testing of the system is discussed. Testing is an important part of checking the system in a practical environment and of knowing the correct and useful results of the system. Therefore, testing takes place on the FlexiDC system to check its functionality. In addition, the usability of the system is checked in order to get feedback from the users who tested its functionality. Thus, this section discusses the functionality and usability of FlexiDC.
A. Functionality
FlexiDC has been carefully designed and developed to achieve many useful functions. These functions assist the system users (database administrators and database programmers) in managing, converting and merging different databases into a target database. These functions are:
1) Direct Conversion
The FlexiDC tool performs direct conversion from different databases (source databases) to the target database (destination database). It is the direct transformation of entities from the source database to the target database: the tables are transformed as per the user's request, after the user has selected the entities, and then converted to the target database.
2) Partial Conversion
FlexiDC can accomplish a partial conversion from different databases (source databases) to the target database (destination database). The system users can select any attributes from one or more source databases to be converted to a single target database. Thus, the system users can define the relationships between the attributes during the process of merging the tables' attributes. Additionally, the system users can assign and unassign a primary key on one or more selected attributes. For instance, consider a shipping company with two databases: the first for the payroll system and the second for the vacations system. Both databases contain the EMPLOYEE table but with different attributes. Additionally, the payroll database system is developed using MS Access and the vacations database system using Oracle.
Fig. 2. Merging and converting processes
Figure 2 illustrates a practical example of how the FlexiDC tool merges and converts data from two different databases. When the users need to get information from both database systems, FlexiDC can perform the partial merging and converting process as per the user's request.
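The sketch below illustrates this payroll/vacations scenario with plain JDBC: selected columns of the two EMPLOYEE tables are merged into one target table keyed on the employee id. The column names, the EMPLOYEE_MERGED table and the assumption that both sources expose an EMP_ID key are illustrative choices, not details taken from FlexiDC.

```java
import java.sql.*;

/** Sketch of partially converting two EMPLOYEE tables into one target table. */
public class PartialMergeSketch {

    public static void merge(Connection payroll, Connection vacations, Connection target)
            throws SQLException {
        try (Statement st = target.createStatement()) {
            // The target table keeps only the attributes the user selected from each source.
            st.execute("CREATE TABLE EMPLOYEE_MERGED ("
                     + "EMP_ID NUMBER PRIMARY KEY, SALARY NUMBER(10,2), VACATION_DAYS NUMBER)");
        }
        String insert = "INSERT INTO EMPLOYEE_MERGED VALUES (?, ?, ?)";
        try (Statement pay = payroll.createStatement();
             ResultSet payRows = pay.executeQuery("SELECT EMP_ID, SALARY FROM EMPLOYEE");
             PreparedStatement vac = vacations.prepareStatement(
                     "SELECT VACATION_DAYS FROM EMPLOYEE WHERE EMP_ID = ?");
             PreparedStatement out = target.prepareStatement(insert)) {
            while (payRows.next()) {
                int empId = payRows.getInt("EMP_ID");
                out.setInt(1, empId);
                out.setBigDecimal(2, payRows.getBigDecimal("SALARY"));
                vac.setInt(1, empId);
                try (ResultSet vacRows = vac.executeQuery()) {
                    // Default to 0 vacation days if the employee is missing in the second source.
                    out.setInt(3, vacRows.next() ? vacRows.getInt(1) : 0);
                }
                out.addBatch();
            }
            out.executeBatch();
        }
    }
}
```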
3) Restructuring the Target Database
FlexiDC can perform target database management: it can manage the target database by dropping a column or table, changing the data types of columns and changing column sizes. Additionally, it can rename a column or table as well as show the data of a table. All these options give system users the flexibility of handling the target database after the process of merging and converting. During target database management, users can restructure tables with an option of keeping the content (values) of the columns.
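Since the master database is prepared in Oracle, such restructuring operations map naturally onto ALTER TABLE statements issued over JDBC. The snippet below is an illustrative sequence only; the table and column names are carried over from the earlier merging sketch and are not FlexiDC's.

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

/** Illustrative restructuring operations on the Oracle target database. */
public class TargetRestructuring {

    public static void restructure(Connection target) throws SQLException {
        try (Statement st = target.createStatement()) {
            // Rename a table and a column (Oracle syntax; existing data is kept).
            st.execute("ALTER TABLE EMPLOYEE_MERGED RENAME TO EMPLOYEE_MASTER");
            st.execute("ALTER TABLE EMPLOYEE_MASTER RENAME COLUMN VACATION_DAYS TO LEAVE_DAYS");

            // Change a column's data type and size.
            st.execute("ALTER TABLE EMPLOYEE_MASTER MODIFY (SALARY NUMBER(12,2))");

            // Drop a column that is no longer required.
            st.execute("ALTER TABLE EMPLOYEE_MASTER DROP COLUMN LEAVE_DAYS");
        }
    }
}
```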
4) Target Database Description
The target database description is used to create the master database (Oracle). FlexiDC gives the flexibility of creating the tablespace and identifying the size of the target database with an extend option. Additionally, FlexiDC can create user accounts for the target database. The system users can extend the tablespace as well as change the tablespace.

B. Usability
In this section, the usability of FlexiDC is discussed. Usability is an important part of application development; it concerns how the system is operated and tested in practice. Therefore, FlexiDC has been used by 20 database users for testing and evaluating the functionality of the system. Additionally, they checked all the functions of FlexiDC; most of the users preferred
to perform the database conversion from MS Excel and MS Access.
The users were divided into three groups: database administrators (DBA), database programmers (DBP) and normal users (NU). The testing and evaluating process was conducted according to the functionality and usability of the system, in order to know how the system is used in a practical environment as well as how it achieves the objectives in database conversion and merging. The evaluation was based on the feedback from the users, collected via a checklist within a questionnaire. There are six factors in the evaluation process: Easiness, Interface, Navigating, Impression, Interesting and Recommend. Figure 3 shows the rate of testing and evaluation of FlexiDC by the three groups.
Fig. 3. Overall results of FlexiDC usability testing for the three groups of users

Fig. 3 shows the evaluation rate of FlexiDC for the three groups of users based on five factors: easiness, interface, navigating, impression and interest. The database administrators (DBA) and database programmers show their interest in FlexiDC, since their percentages nearly reach 100%, while the normal users, who seldom use database applications and have less knowledge about database conversion, show less interest, at approximately 55%. However, the normal users seem impressed with the tool, with an impression value of 85%, close to the 90% impression of the DBA users and DB programmers. For the interface design of FlexiDC, the DBA users' rate is 98%, the DB programmers' 95% and the normal users' 78%. This shows that the knowledgeable users agree that the graphical user interface design is user friendly and allows interaction from users. Furthermore, the easiness of using FlexiDC is almost 100% for DBA users and DB programmers, as this software helps to simplify most of their work, while the normal users, at approximately 55%, do not have a clear idea of how difficult the work of the database administrators is.

A survey on the users' opinion on recommending the use of this software to their organizations was also conducted. Figure 4 illustrates the result of the survey.

Fig. 4. Percentage of users who recommend using FlexiDC in their universities/organizations/institutes

Overall, the results demonstrate that the database administrators and database programmers gave positive feedback on using FlexiDC for database restructuring and data conversion. Furthermore, the normal users, who are not familiar with database management techniques, gave less support in recommending FlexiDC, with an approximately 70% rate based on the previous five factors.
VI. CONCLUSIONS
In this paper, we have presented a comparative study of database conversion tools and techniques in database management. All the existing database conversion tools provide data sharing, migration and merging within existing tables with similar attributes. Therefore, we propose a new system architecture for database conversion that provides flexibility in the record length and the number of attributes within a table. In addition, JDBC connectivity is used for an open platform. With such an approach, data sharing, conversion, migration and integration can be achieved. The added feature of automatic restructuring eases upgrading and expanding an existing database system without incurring the cost of developing a new database system, as well as coping with the current demands of database management and advances in computer technology. FlexiDC can be used in many organizations to help them manage their different databases. The exact use of this tool is to assist database administrators and database programmers in developing new database applications based on the existing databases. In this way, they can merge and convert their data from the existing databases to a new database. Therefore, it saves cost, time and effort for those who rely on different databases, apart from individuals who manage multiple databases. It can merge and convert different databases as an alternative to designing a new system and re-entering data. Therefore, it is a cost-saving system for database management, and this tool would have high commercial value if published on the Internet for a wide range of users.

REFERENCES
[1] Brunette, W., Sundt, M., Dell, N., Chaudhri, R., Breit, N., & Borriello, G. (2013, February). Open Data Kit 2.0: expanding and refining information services for developing regions. In Proceedings of the 14th Workshop on Mobile Computing Systems and Applications (p. 10). ACM.
[2] Camara, L., Li, J., Li, R., & Xie, W. (2014). Distortion-free watermarking approach for relational database integrity checking. Mathematical Problems in Engineering.
[3] Cheung, K.H., Smith, A.K., Yip, K.Y.L., Baker, C.J.O., & Gerstein, M.B. (2006). Semantic web approach to database integration in the life sciences. Elsevier Science Publishers B.V., Amsterdam, the Netherlands.
[4] Calvé, A.L., & Savoy, J. (2000). Database merging strategy based on logistic regression. Pergamon Press, Inc., Tarrytown, NY, USA.
[5] Chua, E.H., Chiang, H.L., & Lim, E.P. (2003). Instance-based attribute identification in database integration. Springer-Verlag New York, Inc., Secaucus, NJ, USA.
[6] Chatterjee, P., Narayanan, A., Ranganathan, L., & Enoch, S. (2013). U.S. Patent No. 8,370,597. Washington, DC: U.S. Patent and Trademark Office.
[7] Chen, T. H., Shang, W., Hassan, A. E., Nasser, M., & Flora, P. (2016, May). Detecting problems in the database access code of large scale systems: an industrial experience report. In Proceedings of the 38th International Conference on Software Engineering Companion (pp. 71-80). ACM.
[8] Database Content Compare and Merge Tool, Altova. Retrieved 4 June 2016, from http://www.altova.com/databasespy/database-compare-tool.html
[9] Dietrich, S.W., Urban, S.D., & Kyriakides, I. (2002). JDBC demonstration courseware using servlets and Java Server Pages. ACM, New York, NY, USA.
[10] Database Converter, Sanmaxi. Retrieved 6 April 2016, from http://www.databaseconversionsoftware.com/
[11] Database Converter, DRPU. Retrieved 25 June 2016, from http://www.dmusoftware.com/drpusoft/mysql-mssql-converter.html
[12] Data Sync, Spectralcore. Retrieved 25 June 2016, from http://www.spectralcore.com/datasync/
[13] DBConvert Product Line, DMSoft Technologies. Retrieved 25 June 2016, from http://www.dmsofttech.com/projects-dbconvert.html
[14] Database Conversion Tool, Pro Data Doctor. Retrieved 25 June 2016, from http://www.prodatadoctor.com/prodata/databaseconversion-tool.html
[15] Developers Book, JDBC Architecture. Retrieved 30 June 2016, from http://www.developersbook.com/jdbc/interview-questions/jdbc-interview-questions-faqs.php
[16] Hu, Z. L., Park, C. A., Wu, X. L., & Reecy, J. M. (2013). Animal QTLdb: an improved database tool for livestock animal QTL/association data dissemination in the post-genome era. Nucleic Acids Research, 41(D1), D871-D879.
[17] Harrington, J. L. (2016). Relational Database Design and Implementation. Morgan Kaufmann.
[18] Data Migration Tool 5.7, SwisSQL. Retrieved 30 June 2016.
[19] Codd, E.F. (1970). A relational model of data for large shared data banks. Communications of the ACM, 13.
[20] Codd, E.F. (1990). The Relational Model for Database Management: Version 2. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA.
[21] Full Convert Enterprise, Spectralcore. Retrieved 2010, from http://www.spectralcore.com/fullconvert
[22] Grefen, P., & Apers, P.M.G. (1993). Integrity control in relational database systems. Elsevier Science Publishers B.V., Amsterdam, the Netherlands.
[23] Hackathorn, R.D. (1993). Enterprise Database Connectivity. Wiley Professional Computing.
[24] He, G. L., Wu, S., & Yao, J. P. (2013, April). Application of design pattern in the JDBC programming. In Computer Science & Education (ICCSE), 2013 8th International Conference on (pp. 1037-1040). IEEE.
[25] IBM. IBM i Access. Retrieved 20 July 2010, from http://www-03.ibm.com/systems/i/software/access/windows/odbc/index.html
[26] Lin, C.Y. (2008). Migrating to relational systems: problems, methods, and strategies. Contemporary Management Research.
[27] McFadden, F.R., & Hoffer, J.A. (1999). Modern Database Management, Oracle Edition. Prentice Hall.
[28] MSDN. Microsoft Open Database Connectivity (ODBC). Retrieved 2010, from http://msdn.microsoft.com/en-us/library/ms710252(VS.85).aspx
[29] McFadden, F.R., & Hoffer, J.A. (1985). Database Management. Benjamin/Cummings Publishing Company.
[30] Ordonez, C., & Garcia-Garcia, J. (2008). Referential integrity quality metrics. Elsevier Science Publishers B.V., Amsterdam, the Netherlands.
[31] Parent, C., & Spaccapietra, S. (1998). Issues and approaches of database integration. ACM, New York, NY, USA.
[32] Sun Microsystems Inc. (1999). JDBC 2.1 API. Seth White and Mark Hapner.
[33] Shao-hua, H., & Xie, H. (2010). Heterogeneous relational database integration model based on XML. Computer Engineering and Design, 24, 034.
[34] Tandel, S., Money, C. B., De, J. G. D. C. E., Eichelberger, R. A., & Malheirwa, H. G. (2013). U.S. Patent Application No. 15/037,341.
[35] Wick, M., Rohanimanesh, K., & Schultz, K. (2008). A unified approach for schema matching, coreference and canonicalization. In KDD'08: Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
[36] Wei, H.C., & Elmasri, R. (2001). Schema versioning and database conversion techniques for bi-temporal databases. Kluwer Academic Publishers, Hingham, MA, USA.
[37] Elmasri, R., & Navathe, S. (2011). Database Systems (Vol. 9). Pearson Education.
[38] Davie, B., Koponen, T., Pettit, J., Pfaff, B., Casado, M., Gude, N., & Chanda, A. (2017). A database approach to SDN control plane design. ACM SIGCOMM Computer Communication Review, 47(1), 15-26.
[39] Sinha, S. (2017). Database migration. In Beginning Laravel (pp. 49-52). Apress, Berkeley, CA.
[40] Mamertino, M., & Sinclair, T. M. (2018). Migration and online job search: a gravity model approach.
[41] Akhtar, A., Abdelsalam, M., & Nick, R. (2017). Relational database migration: a perspective.