Conceptual Crowdsourcing Models for E-Learning

Carlos Eduardo Barbosa¹,², Vanessa Janni Epelbaum¹, Marcio Antelio¹,³, Jonice Oliveira⁴, José A. Rodrigues Nt.¹, Sergio Palma J. Medeiros¹, Jano Moreira de Souza¹

¹ COPPE – Graduate School and Research in Engineering, Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, Brazil – {eduardo, epelbaum, marcioantelio, rneto, palma, jano}@cos.ufrj.br

² CASNAV – Center for Naval Systems Analyses, Brazilian Navy, Rio de Janeiro, Brazil

³ CEFET-RJ – Federal Center for Technological Education Celso Suckow da Fonseca, Rio de Janeiro, Brazil

⁴ Department of Computing Science (DCC/IM), Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, Brazil – [email protected]

Abstract: Crowdsourcing tools have improved e-learning over the last few years. However, these tools vary in many aspects. This work discusses and analyzes the basic e-learning processes based on these tools, defining ways to classify them through these processes. Finally, we define conceptual models tailored to each type of process, using a standard notation for software development. These conceptual models may be used to build new e-learning crowdsourcing tools.

Keywords: collaborative, learning, distance education.

I. INTRODUCTION

Crowdsourcing is focused on improving the connectivity and collaboration of people, organizations, and societies. According to [1], crowdsourcing is the outsourcing of a function previously performed by employees to an undefined crowd of people in the form of an open call. It is an online production model that uses the crowd as a source of information, relying on the concept of the wisdom of crowds [2]. The wisdom of crowds is based on the fact that the success of a solution depends on how big the group is, since, under certain circumstances, the group as a whole is often much more intelligent than the most intelligent person in it [3]. The most common forms of work in crowdsourcing are work groups, institutions, communities, industries, governments, and global societies.

Crowdsourcing does not necessarily offer the best educational experience; however, it is a natural way of learning [4]. There are three reasons that support research on crowdsourcing: crowdsourcing techniques are needed to provide quality education in certain fields, the existing techniques are ready to be applied, and online education is a new, relatively unexplored experience [4].

One application of crowdsourcing is in the distance learning field. Crowdsourcing tools allow educators to reach a scale previously unimaginable. Several methodologies have been created by different types of educators, each with its own specificity. The Massive Open Online Course (MOOC) [5] pertains to this group of tools, but it is only a subset of them. Although there are a few open source initiatives in the area, there are no conceptual models to guide the creation of new tools. Even the processes behind these tools are not standardized. Our goal is to create standard processes and models for crowdsourcing tools applied to e-learning. These models allow a quick start for the development of a new e-learning crowdsourcing tool.

II. CROWDSOURCING E-LEARNING PROCESSES

There are several crowdsourcing tools available on the Internet, each with its own objectives and specific models. Our goal is to design generic process models for crowdsourcing tools for distance learning purposes. This work analyzed tools with very different focuses, ranging from traditional tools for online courses to tools for the creation and evaluation of educational video playlists, including: Connexions [6], Course Hero [7], Coursera [8], Curriki [9], edX [10], Khan Academy [11], Knowmia [12], Lynda [13], MathTV [14], MentorMob [15], Minerva Project [16], MIT OpenCourseWare [17], Mozit [18], Open.Michigan [19], Saylor [20], Sophia [21], Udacity [22], Udemy [23], University of the People [24], Venture Lab [25], YouTube EDU [26], and Wikiversity [27].

In a previous work [28], we classified the same e-learning crowdsourcing tools using 11 dimensions. This classification was used as a basis to define the types of crowdsourcing e-learning. To create a crowdsourcing e-learning framework, it is necessary to formalize the roles of the users in a crowdsourcing tool, to classify the kinds of crowdsourcing e-learning processes, and to define a generic framework for crowdsourcing e-learning.

A. Roles in Crowdsourcing

Vuković [29] defined a set of key roles and operations for a common crowdsourcing process with two basic roles: the crowdsourcing requestor and the crowdsourcing provider. The requestor submits the task request, defines the acceptance criteria, and is responsible for paying the responding providers. The provider is responsible for executing the crowdsourcing tasks in exchange for money. The crowdsourcing platform is a trusted broker, ensuring payment to providers. In [28] we presented a deeper analysis of these roles applied to this context.

B. Types of Crowdsourcing E-learning

Given the differences in purpose and methodology among crowdsourcing tools, it was possible to observe the existence of two groups: traditional crowdsourcing e-learning tools, which are distance learning tools modified to work with crowds, designed to teach in a course-like format and to provide a certificate of conclusion; and peer-learning tools, which are platforms designed to enable anyone to share knowledge in a tutorial-like format, providing knowledge collaboratively but without concerns about certification. This classification is an evolution of the work done in [28], which already showed these two groups of crowdsourcing e-learning tools but did not assign names to them.

1) Traditional Crowdsourcing E-learning

The crowdsourcing tools focused on e-learning started as a natural extension of traditional distance education courses. These massive online courses use the crowdsourcing methodology to multiply their reach. What distinguishes a crowdsourcing e-learning tool from a distance education tool is the class size, and the maximum size varies: tools like Coursera have classes with thousands of students [30], while a traditional distance education course usually splits its class if it exceeds a hundred students.

In order to provide scalability [31] to the course, the tools were modified to allow peer evaluation, in which a student's assessment is evaluated by several peers. To encourage participation in these peer evaluations, the student's participation is itself evaluated quantitatively: peer evaluation is a course activity and must be done in order to achieve a perfect score. The reviews are not in a one-to-one proportion, since there are abstentions, and the end result is the average of the peer reviews (a minimal sketch of this grading scheme follows the tool list below). Another common way to evaluate students is via multiple-choice tests.

It is important that peer evaluations provide complete feedback to the student, since the activities evaluated by peers are subjective and nothing prevents the reviewer from doing a biased evaluation. The feedback helps students understand how the activity was evaluated, and they can even disagree with the peer's opinion. This feedback is often corrective, but a careful student is free to provide other types of feedback. Even when the student disagrees, the rating is not revised, which justifies the one-to-many peer evaluation ratio.

The tools that follow this classification are: Coursera, Curriki, edX, Minerva Project, MIT OpenCourseWare, Open.Michigan, Saylor, Udacity, University of the People, and Venture Lab.
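As an illustration of the peer-grading scheme just described, the sketch below averages whatever peer scores were actually submitted and drops abstentions. It is a hypothetical helper of our own; the function and field names are not taken from any of the tools listed above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PeerReview:
    """One peer's evaluation of a submitted activity (score=None means abstention)."""
    reviewer_id: str
    score: Optional[float]        # e.g. 0..100, or None if the peer abstained
    feedback: str = ""            # corrective feedback shown to the student

def activity_grade(reviews: list[PeerReview]) -> Optional[float]:
    """Average of the peer scores that were actually submitted.

    Abstentions are dropped, so the one-to-many review ratio does not have
    to be exact. Returns None when nobody reviewed the activity.
    """
    scores = [r.score for r in reviews if r.score is not None]
    return sum(scores) / len(scores) if scores else None

# Example: three peers were assigned, one abstained.
reviews = [
    PeerReview("peer-1", 80.0, "Clear argument, but one reference is missing."),
    PeerReview("peer-2", 90.0, "Well structured."),
    PeerReview("peer-3", None),   # abstention
]
print(activity_grade(reviews))    # 85.0
```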

2) Peer-learning

Crowdsourcing tools that rely on the crowd to create and transmit knowledge in less formal ways were classified as peer-learning. Peer-learning tools are classified this way because their users can take the role of either requestor or provider, depending only on the content created or accessed. They should not be confused with crowdsourcing e-learning tools that merely use peer revisions, since in that case revision is the only aspect in which the requestor replaces the provider. In peer-learning, a user is sometimes a provider, from the beginning to the end of the creation of content, and sometimes a requestor, from the beginning to the end of accessing content.

Peer-learning tools are built to count on both experts and non-experts to create content (such as tutorials). Thus, the requestors' evaluations rank the content, which facilitates access to popular content. The review of the content is responsible for providing feedback to the provider. This feedback is often reinforcing or suggestive, but users are free to provide corrective or didactic feedback. Within this classification we can see some differences in approach: Knowmia is moving towards traditional crowdsourcing e-learning, with a tool for assignments, while Mozit invested in providing a complete marketplace. Peer-learning tools are usually free, do not offer certificates, and use specifically-sorted videos to pass content to the requestor.

The tools that follow this classification are: Connexions, Course Hero, Khan Academy, Knowmia, Lynda, MathTV, MentorMob, Mozit, Sophia, Udemy, Wikiversity, and YouTube EDU.

C. The E-Learning Framework

Analyzing the generic process models, we propose an abstraction model for the e-learning framework, shown in Fig 1. The steps indicate the order of events for the transmission of knowledge in the tools. This framework can be applied to many types of knowledge transfer beyond the scope of e-learning. The framework steps are detailed below (a minimal code rendering of the step sequence follows Fig 1):

• Preparation: step responsible for the creation of knowledge.
• Subscription: step responsible for the selection of the knowledge to be acquired.
• Learning: step responsible for the transfer of knowledge.
• Evaluation: step responsible for the evaluation of the received knowledge.
• Conclusion: step responsible for the end of the transfer of the knowledge, as well as certification.

Fig 1. Proposed e-learning framework.
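A minimal code rendering of the five framework steps as an ordered enumeration, assuming nothing beyond the step names and ordering listed above; the advance helper is our own illustration.

```python
from enum import IntEnum
from typing import Optional

class FrameworkStep(IntEnum):
    """The five steps of the proposed e-learning framework, in order."""
    PREPARATION = 1   # creation of knowledge
    SUBSCRIPTION = 2  # selection of the knowledge to be acquired
    LEARNING = 3      # transfer of knowledge
    EVALUATION = 4    # evaluation of the received knowledge
    CONCLUSION = 5    # end of the transfer, including certification

def next_step(current: FrameworkStep) -> Optional[FrameworkStep]:
    """Return the step that follows `current`, or None after CONCLUSION."""
    return FrameworkStep(current + 1) if current < FrameworkStep.CONCLUSION else None

step: Optional[FrameworkStep] = FrameworkStep.PREPARATION
while step is not None:
    print(step.name)
    step = next_step(step)
```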


Fig 2. Traditional generic e-learning process.

Fig 3. Generic peer-learning process.

III. CONCEPTUAL CROWDSOURCING E-LEARNING MODELS

Based on the concepts presented in the previous section, we propose conceptual models which represent the tools and serve as guidelines for the construction of new ones. The models were made using UML [32], which is the most common abstraction notation for software design used in the business community. UML is a simple and standardized notation for describing object-oriented models, and it is supported by textbooks, journal papers, other publication resources, and software tools. Although there are more than a dozen UML diagrams, we focused on the three primary ones (class diagram, use case diagram, and activity diagram) and on the two main actors (Provider and Requestor) of the process. The Crowdsourcing actor represents the actions performed automatically by the tool or manually by its managers.


A. Class Diagram

The class diagram is the most important UML diagram; the other diagrams are drawn based on it. The class diagram is at the core of the architecture and is an important tool for documenting a system. A class diagram presents a set of classes, interfaces, and collaborations, as well as their relationships. It is used to model a static view of the system design and involves modeling the system's vocabulary, collaborations, or schemas [32]. In this work, we built separate class diagrams to model e-learning and peer-learning.

1) E-learning

The class diagram was built to support the traditional crowdsourcing e-learning process shown in Fig 2. The model classes and the diagram are presented in TABLE I and Fig 4 (a code-level rendering of these classes follows Fig 4).

TABLE I. CLASS DESCRIPTION FOR TRADITIONAL CROWDSOURCING E-LEARNING TOOLS.
Class: Registers the course being taught, with its topic, start date, and duration.
Lecture: A specific lecture, having an issue that is a subset of the topic of the course.
Activity: An activity to be performed by students, related to a specific class, with a defined deadline and concepts. It is divided into one or more tasks.
Task: Composes activities in minimum units, thus allowing the correction/evaluation of students' answers.
Answer: Stores a specific response to a task.
Evaluation: Stores the evaluation result of the answer to a task.
Feedback: Relates an answer to an evaluation, including feedback from the evaluator about the evaluation.
Benefit: Defines a benefit granted to the user for completing a course.
Provider: Stores the information of the professor who teaches/publishes the course contents.
Requestor: Stores the information of the students in the course.
Team: Assembles groups of students to accomplish activities.
Executor: Allows the alternation of students or groups to perform tasks.
Communication: Registers the communication between those involved in the course.


Fig 4. Class diagram for traditional generic crowdsourcing e-learning tool.
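As a rough, code-level reading of TABLE I and Fig 4, the sketch below declares the main model classes as Python dataclasses. Attribute names follow the diagram where legible, the UML class named "Class" is rendered as Course here for readability, associations are flattened into plain references, and the UML multiplicities are not enforced, so this is only an approximation of the model.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Provider:
    name: str                      # professor who teaches/publishes the course contents

@dataclass
class Requestor:
    name: str                      # student enrolled in the course

@dataclass
class Team:
    name: str
    members: list[Requestor] = field(default_factory=list)

@dataclass
class Lecture:
    topic: str                     # a subset of the course theme

@dataclass
class Course:                      # "Class" in TABLE I
    theme: str
    start_date: date
    duration: int                  # e.g. in weeks
    provider: Provider
    lectures: list[Lecture] = field(default_factory=list)
    students: list[Requestor] = field(default_factory=list)

@dataclass
class Task:
    description: str
    max_score: int

@dataclass
class Activity:
    deadline: date
    total_score: int
    tasks: list[Task] = field(default_factory=list)

@dataclass
class Answer:
    task: Task
    text: str
    author: Requestor              # or a Team, via the Executor class in the full model

@dataclass
class Evaluation:
    answer: Answer
    score: int

@dataclass
class Feedback:
    evaluation: Evaluation
    comment: str                   # the evaluator's comment on the evaluation

@dataclass
class Benefit:
    description: str               # e.g. a certificate granted on course completion

@dataclass
class Communication:
    topic: str
    comment: str
    author: Requestor
```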

2) Peer-learning

Similarly, we defined a class diagram to support the peer-learning process, which is described in Fig 3. The model classes and the diagram are presented in TABLE II and Fig 5 (a corresponding code sketch follows Fig 5).

TABLE II. CLASS DESCRIPTION FOR PEER-LEARNING TOOLS.
Lecture: A specific lecture, having a free issue.
Provider: Relates to the user who created and posted the content.
Requestor: Relates to the users who accessed the content.
Evaluation: Stores the result of the content evaluated by a Requestor.


Fig 5. Class diagram for generic peer-learning crowdsourcing tool.
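The peer-learning model of TABLE II and Fig 5 is considerably smaller. A corresponding sketch follows; again it is only an approximation, with class names borrowed from the table and a hypothetical ranking helper added for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Provider:
    name: str                      # user who created and posted the content

@dataclass
class Requestor:
    name: str                      # user who accesses the content

@dataclass
class Evaluation:
    requestor: Requestor
    score: int                     # rating given by the requestor

@dataclass
class Lecture:
    subject: str                   # free issue chosen by the provider
    provider: Provider
    evaluations: list[Evaluation] = field(default_factory=list)

def average_score(lecture: Lecture) -> float:
    """Rank content by the mean of requestor evaluations (0.0 if none yet)."""
    if not lecture.evaluations:
        return 0.0
    return sum(e.score for e in lecture.evaluations) / len(lecture.evaluations)
```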

B. Use Case Diagram

According to [32], use case diagrams are essential for modeling the behavior of a system, a subsystem, or a class. A use case diagram shows a set of use cases and actors and their relationships, and it involves modeling the context of a system. Use case diagrams are important for visualizing, specifying, and documenting the behavior of an element. The use case diagram of the proposed conceptual model is shown in Fig 6.

Fig 6. Use case diagram.

The use cases from the perspectives of the requestor and the provider are listed with their descriptions in TABLE III.

TABLE III. USE CASE DESCRIPTION.
Create course (Perspective: Provider; Optional: Yes): Creation of a course, specifying the theme, start date, and duration.
Create lecture (Perspective: Provider; Optional: Yes): A package with the explanation of a subset of the topic of a course, or a generic issue in the case of a free theme.
Create assignment (Perspective: Provider; Optional: No): The assignment related to one or more lectures. The evaluation may (but not necessarily) be done by peers.
Enroll (Perspective: Requestor; Optional: No): The registration of the requestor in a selected learning activity, whether it is a course or a crowdsourcing system.
Publish content (Perspective: Provider; Optional: No): Publish one or more lectures. It is the main part of an e-learning system.
Access content (Perspective: Requestor; Optional: No): The action of accessing learning content and consuming it. It is the main part of an e-learning crowdsourcing system.
Peer evaluation (Perspective: Requestor; Optional: Yes): Peer evaluation. This may represent the review of an activity or a suggestion for an improved answer. Generates feedback to the requestor.
Evaluation (Perspective: Provider; Optional: Yes): Evaluate the requestor's knowledge of a given subject. Generates feedback to the requestor.
Receive feedback (Perspective: Requestor; Optional: Yes): Send the peer reviews back to the requestor.
Compute grades (Perspective: Provider; Optional: Yes): Compute the final grades for the course.
Certify (Perspective: Provider; Optional: Yes): Generate certifications for the course.

C. Activity Diagrams

Activity diagrams are one of the UML diagrams used to model the dynamic aspects of systems. An activity diagram is essentially a flowchart showing the control flow from activity to activity [32]. It involves modeling the sequence and parallelism of the steps of a computational process. We defined two activity diagrams, one from the perspective of the requestor (Fig 7) and another from the perspective of the provider (Fig 8). The activities from the requestor's and the provider's perspectives are listed in TABLE IV and TABLE V, respectively, each followed by a schematic code sketch.

Fig 7. Requestor’s activity diagram.

TABLE IV. DESCRIPTION OF THE ACTIVITY DIAGRAM (REQUESTOR).
Seek knowledge: Act of the requestor seeking to acquire knowledge for a certain subject.
Request unavailable knowledge: Act of asking for the future inclusion of knowledge that the requestor is seeking but is currently unavailable.
Subscribe: Act of the requestor committing himself to acquiring certain knowledge.
Receive knowledge: Act of the requestor gaining knowledge, either through a class with audiovisual resources or by reading texts.
Evaluate knowledge: Act of the requestor quantifying how much knowledge he was able to absorb.
Finish course: Act of the requestor formalizing the completion of the knowledge transfer. This only happens when a subscription is performed.
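Read as code, the requestor's flow of Fig 7 and TABLE IV might look like the sketch below; the catalogue and the print statements are placeholders standing in for a real tool's behaviour.

```python
from typing import Optional

# Placeholder catalogue; a real tool would query its own content base instead.
CATALOGUE = {"python-basics": "Introductory Python lectures"}

def seek_knowledge(subject: str) -> Optional[str]:
    """Seek knowledge: look the subject up in the catalogue."""
    return CATALOGUE.get(subject)

def requestor_flow(subject: str, subscribed: bool = True) -> None:
    """One pass through the requestor activities of TABLE IV."""
    content = seek_knowledge(subject)                          # Seek knowledge
    if content is None:
        print(f"Requesting unavailable knowledge: {subject}")  # Request unavailable knowledge
        return
    if subscribed:
        print(f"Subscribed to: {content}")                     # Subscribe
    print("Receiving knowledge (videos, texts, ...)")          # Receive knowledge
    print("Evaluating how much knowledge was absorbed")        # Evaluate knowledge
    if subscribed:
        print("Finishing the course")                          # Finish course (only with a subscription)

requestor_flow("python-basics")
requestor_flow("quantum-chemistry")   # triggers the 'request unavailable knowledge' branch
```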

TABLE V. DESCRIPTION OF THE ACTIVITY DIAGRAM (PROVIDER).
Idealize learning: Act of the provider defining a theme to be passed as knowledge.
Define scope: Act of the provider defining a level of theme depth in accordance with the time available to pass the knowledge, the media chosen, and the level of background knowledge expected of requestors.
Select original knowledge: Act of the provider searching for and selecting knowledge sources for use.
Process knowledge: Act of the provider processing the knowledge sources and combining the best available resources.
Packing knowledge: Act of the provider splitting the processed knowledge into pieces that can be passed individually or in small groups.
Create evaluations: Act of the provider creating evaluations from a set of knowledge packages.
Publish knowledge: Act of the provider passing knowledge packages and their evaluations.
Evaluation: Act of the provider rating the knowledge level of requestors from their evaluations. To achieve this, the provider may use peer evaluation, automated tools, or correction by sampling.
Publish results: Act of the provider publishing the average rating of the requestors.
Certify: Act of the provider certifying the requestor as possessing the specific knowledge that is the theme of the learning.

Fig 8. Provider's activity diagram.
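The provider's side (Fig 8 and TABLE V) is essentially a linear pipeline. A schematic rendering follows, with entirely made-up data; each line is annotated with the TABLE V activity it stands in for.

```python
def provider_flow(theme: str) -> None:
    """Schematic run through the provider activities of TABLE V (all data is made up)."""
    scope = f"{theme}: introductory level, 6 weeks, video lectures"   # Idealize learning / Define scope
    sources = ["textbook chapters", "slides", "recorded talks"]       # Select original knowledge
    material = ", ".join(sources)                                     # Process knowledge (combine sources)
    packages = [f"Lecture {i} [{material}]" for i in (1, 2, 3)]       # Packing knowledge
    quizzes = [f"Quiz for {p}" for p in packages]                     # Create evaluations
    print("Publish knowledge:", scope, packages, quizzes)             # Publish knowledge
    grades = {"student-1": 8.5, "student-2": 9.0}                     # Evaluation (peer, automatic, or by sampling)
    print("Publish results:", sum(grades.values()) / len(grades))     # Publish results
    print("Certify requestors who reached the passing grade")         # Certify

provider_flow("Crowdsourcing for e-learning")
```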

IV. DISCUSSION

This work analyzed most of the crowdsourcing tools available online. We didn’t find any study providing models for elaborating new crowdsourcing e-learning tools which could be used by others. At this time, there is no evidence that the current conceptual models for crowdsourcing e-learning tools will merge in the near future. Including new tools in the proposed models could result in the creation of new models and the alteration of the current ones, since they will be expanded in the future. In the worst-case scenario, some current models may be broken into two or more.


This work has a classification limitation: some of the paid tools were harder to analyze, mostly because they are restricted to their students and do not provide official information about their methodology. These tools were classified in a speculative way. However, this does not affect the generic processes or the creation of the conceptual models.

Some crowdsourcing tools in the e-learning context were analyzed and considered to be beyond the scope of this work; for example, Google's Course Builder [33], the XBlock SDK [34], Duolingo [35], the Cerberus Game [36], and the Spectral Game [37]. Course Builder is an online crowdsourcing course engine which allows teachers to publish their course material, including lessons, student activities, and assessments [33]. XBlock is a component architecture which may be used to create new course components, called XBlocks, that are seamlessly combined with other components within an online course. XBlocks may use data from a variety of sources: they may use the traditional text and video formats and may connect to sophisticated environments and online laboratories [34]. Both tools fall outside the scope of this work because they are engines; that is, they may be used to build new MOOCs rather than being e-learning tools themselves.

Duolingo offers free language education. It uses a leveling system to evaluate students and match their exercises to their level. As the student completes a series of exercises, he gains experience and may unlock subsequent levels. The exercises include listening comprehension, speaking, translation, and vocabulary. The student may translate sections of web pages, review translations, and suggest better translations [35]. Duolingo is beyond the scope of this work because it is a very specific tool which cannot be grouped with others at this time.

The Cerberus Game and the Spectral Game are both games with a purpose (GWAP) [38]. The first is focused on the study of the surface of Mars. The second is about helping to organize spectra files submitted to ChemSpider [39], identifying missing, low-quality, and incorrect spectra files. Both use crowdsourcing techniques and enable some learning, but their goal is to use humans to compute large amounts of data. Each game has a specific model and they cannot be grouped together, so we considered GWAPs to be beyond the scope of this work.

V. CONCLUSIONS

This work presented an attempt to unify different designs of e-learning tools which use the concept of crowdsourcing. After an analysis, the crowdsourcing tools were separated into two main groups: traditional crowdsourcing e-learning and peer-learning. Each group was analyzed, and we presented process models suitable for crowdsourcing e-learning, as well as standardized models for software development that follow the UML notation. However, some tools are too domain-specific to be part of a group or to form a new group, for example Duolingo [35], the Cerberus Game [36], and the Spectral Game [37], and so they were considered to be beyond the scope of this work. New groups may emerge in further work, especially from GWAPs.

ACKNOWLEDGMENTS

We would like to thank CAPES, CNPq, and CASNAV for their financial support of this work.

REFERENCES

[1] Y. Zhao and Q. Zhu, "Evaluation on crowdsourcing research: Current status and future direction", Inf. Syst. Front., Apr. 2012.
[2] J. Surowiecki, The Wisdom of Crowds. Anchor, 2005.
[3] D. C. Brabham, "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases", Converg. Int. J. Res. New Media Technol., vol. 14, no. 1, pp. 75–90, Jan. 2008.
[4] D. S. Weld, E. Adar, L. Chilton, R. Hoffmann, and E. Horvitz, "Personalized Online Education—A Crowdsourcing Challenge", 2012.
[5] L. Pappano, "The Year of the MOOC", N. Y. Times, vol. 4, 2012.
[6] "Connexions", 2013. [Online]. Available: http://cnx.org/. [Accessed: 31-Mar-2013].
[7] "Course Hero", 2013. [Online]. Available: http://www.coursehero.com/. [Accessed: 31-Mar-2013].
[8] "Coursera", 2013. [Online]. Available: https://www.coursera.org/. [Accessed: 31-Mar-2013].
[9] "Curriki", 2013. [Online]. Available: http://www.curriki.org/. [Accessed: 31-Mar-2013].
[10] "edX", 2013. [Online]. Available: https://www.edx.org/. [Accessed: 31-Mar-2013].
[11] "Khan Academy", 2013. [Online]. Available: https://www.khanacademy.org/. [Accessed: 31-Mar-2013].
[12] "Knowmia", 2013. [Online]. Available: http://www.knowmia.com. [Accessed: 31-Mar-2013].
[13] "lynda.com", 2013. [Online]. Available: http://www.lynda.com/. [Accessed: 31-Mar-2013].
[14] "MathTV", 2013. [Online]. Available: http://www.mathtv.com/. [Accessed: 06-May-2013].
[15] "MentorMob", 2013. [Online]. Available: http://www.mentormob.com/. [Accessed: 31-Mar-2013].
[16] "The Minerva Project", 2013. [Online]. Available: http://www.minervaproject.com/. [Accessed: 31-Mar-2013].
[17] "MIT OpenCourseWare", 2013. [Online]. Available: http://ocw.mit.edu/. [Accessed: 31-Mar-2013].
[18] "Mozit", 2013. [Online]. Available: http://mozit.tv/. [Accessed: 31-Mar-2013].
[19] "Open.Michigan", 2013. [Online]. Available: http://open.umich.edu/. [Accessed: 31-Mar-2013].
[20] "Saylor.org", 2013. [Online]. Available: http://www.saylor.org/.
[21] "Sophia Learning", 2013. [Online]. Available: http://www.sophia.org/. [Accessed: 31-Mar-2013].
[22] "Udacity", 2013. [Online]. Available: https://www.udacity.com/. [Accessed: 31-Mar-2013].
[23] "Udemy", 2013. [Online]. Available: https://www.udemy.com/. [Accessed: 31-Mar-2013].
[24] "University of the People", 2013. [Online]. Available: http://www.uopeople.org/.
[25] "Venture Lab", 2013. [Online]. Available: https://venture-lab.org/. [Accessed: 31-Mar-2013].
[26] "YouTube EDU", 2013. [Online]. Available: http://www.youtube.com/education.
[27] "Wikiversity", 2013. [Online]. Available: http://en.wikiversity.org/. [Accessed: 31-Mar-2013].
[28] C. E. Barbosa, V. J. Epelbaum, M. Antelio, J. Oliveira, and J. M. de Souza, "Crowdsourcing Environments in e-Learning Scenario: A classification based on educational and collaboration criteria", in 2013 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2013, pp. 687–692.
[29] M. Vukovic, "Crowdsourcing for Enterprises", 2009, pp. 686–692.
[30] C. Severance, "Teaching the World: Daphne Koller and Coursera", Computer, vol. 45, no. 8, pp. 8–9, Aug. 2012.
[31] A. B. Bondi, "Characteristics of scalability and their impact on performance", in Proceedings of the 2nd International Workshop on Software and Performance, New York, NY, USA, 2000, pp. 195–203.
[32] G. Booch, I. Jacobson, and J. Rumbaugh, The Unified Modeling Language User Guide. Addison-Wesley, 1999.
[33] "Course Builder", 2013. [Online]. Available: https://code.google.com/p/course-builder/. [Accessed: 06-May-2013].
[34] edX, "edX Takes First Step toward Open Source Vision by Releasing XBlock SDK", 14-Mar-2013. [Online]. Available: https://www.edx.org/press/xblock-announcement. [Accessed: 16-May-2013].
[35] "Duolingo", 2013. [Online]. Available: http://duolingo.com/. [Accessed: 31-Mar-2013].
[36] "Cerberus Game", 2013. [Online]. Available: http://www.cerberusgame.com/. [Accessed: 31-Mar-2013].
[37] "The Spectral Game", 2013. [Online]. Available: http://www.spectralgame.com/. [Accessed: 31-Mar-2013].
[38] L.-J. Chen, B.-C. Wang, and K.-T. Chen, "The design of puzzle selection strategies for GWAP systems", Concurr. Comput. Pract. Exp., vol. 22, no. 7, pp. 890–908, 2010.
[39] "ChemSpider", 2013. [Online]. Available: http://cs.m.chemspider.com/. [Accessed: 16-May-2013].
