2011 Panhellenic Conference on Informatics
Touch Your Program With Hands: Qualities in Tangible Programming Tools for Novice

Theodosios Sapounidis
Department of Electronics, Technological Educational Institute of Thessaloniki, Thessaloniki, Greece
[email protected]

Stavros Demetriadis
Department of Informatics, Aristotle University of Thessaloniki, Thessaloniki, Greece
[email protected]
Abstract—This paper presents two tangible programming tools designed to make programming concepts more accessible to novice programmers of all ages. Our design approach consists of two tangible environments with different features: the first system programs a virtual butterfly in a maze, and the second is dedicated to programming a Lego NXT robot. Both approaches are presented and their operation is briefly analyzed. Moreover, the paper describes the advantages and innovations of the design approach. Finally, we compile a list of open questions for future research in the domain of tangible programming interfaces.

Keywords—Educational technology; Educational robot; Tangible programming; Programming; Education

I. INTRODUCTION

Recent research on tangible interfaces, as defined by Ishii and Ullmer [1], has provided great opportunities for the innovative application of technology in the classroom [2,3]. A field that seems to have benefited from this kind of technology is that of tangible programming environments for education. A tangible programming interface may produce the same results as a text-based or visual-based language; the particularity of such environments lies in the fact that real objects of the physical world are used instead of visual objects or written commands [4,5,6]. In the late '70s, Radia Perlman, a researcher at the MIT Media Lab, found that most children under the age of 11 were not yet ready to start programming in the traditional way, that is, by typing program code (Logo) on a computer keyboard [7]. The children encountered severe difficulties not only in writing the code but also in using the interface of the programming environment. Perlman therefore began to design interfaces that would allow even preschool children to learn how to program a turtle. The outcome of those efforts was the Tortis Slot Machine [8], the first interface of this kind. Since then, various design proposals for tangible programming interfaces have followed.

II. BACKGROUND

The field of tangible user interfaces has produced systems that have been used in a variety of applications over the years [9,8,10]. The majority of these systems were designed for novice users and were applied in fields such as music [11,12], mathematics [13], learning of dynamic concepts [14], and three-dimensional modeling [15]. One of the areas showing strong interest in tangible user interfaces is learning to program [16,17,18]. The systems that have been presented over the years and are aimed at programming can be divided into two broad categories: active systems [19] and passive systems [20]. Active systems embed electronic circuits in the objects, whereas the operation of passive systems is based on image recognition. Despite the various design efforts, the advantages of using a tangible system during the learning process in a real classroom environment have not been demonstrated with clarity [21,22,23]. Moreover, the features that make tangible user interfaces more useful and valuable have not yet been defined [24,25,26].

III. DESIGN OF THE PROGRAMMING SPACE

Aiming to contribute to the innovative design of tangible programming languages, we used microcontrollers and embedded in our systems several features that have not been exploited in the past. The incorporation of features such as weight, shape, and interaction with the user on the interface itself can contribute significantly to the dialogue about which features of tangible systems add advantages when learning programming concepts with tangibles [27,28,29]. We created T_Butterfly and T_ProRob, two tangible programming systems with new features that can be used in a complementary manner. With T_Butterfly, users write programs that lead a butterfly through a virtual maze; with T_ProRob, users program a Lego NXT robot. Each system is an active tangible programming language based on tangible command blocks. Using these command blocks, novice users over the age of 4 are able to create a program [27,16]. In neither case do users type commands with a keyboard or use a mouse, so they do not even need to become familiar with these devices, which would increase their cognitive load [30].

A. T_Butterfly

This is our first version of a tangible programming language; through its simplicity, it aims to give young children the opportunity to take their first steps in programming [7,31,21]. The system involves cubic command blocks sized 8.5 x 2 cm and 5 x 5 cm.
The user connects the command blocks to a master-box base and, by pressing a push button, sees a butterfly on the screen 'flying' according to the commands. The butterfly always begins its 'flight' from the starting point, its house, aiming to reach a flower. The master-box base is connected to the computer via an RS232 cable. The data transfer between the command blocks takes place without a connection to a common data bus, and it is worth mentioning that many blocks can communicate with each other at the same time. The construction of the command blocks is based on a low-cost microcontroller, the 16F628 from Microchip Technology Inc., and an equally low-cost connector. Photo 1 shows the creation of a simple program consisting of four commands.
Photo 1. Creation of a program with T_Butterfly
The sequence of commands is: one step forward, turn left, one step forward, a double step forward. What the user sees on the screen is shown in Photo 2.

Photo 2. Execution of a program
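To make this execution model concrete, the following minimal Python sketch interprets such a four-block program on a grid. It is only an illustration: the command names, the grid, and the run function are our own assumptions rather than part of the T_Butterfly implementation.

# Illustrative sketch only: a host-side interpreter for a T_Butterfly-style
# block sequence; names and coordinates are hypothetical.
MOVES = {"N": (0, 1), "E": (1, 0), "S": (0, -1), "W": (-1, 0)}
LEFT_OF = {"N": "W", "W": "S", "S": "E", "E": "N"}

def run(blocks, start=(0, 0), heading="E"):
    """Execute the command blocks and return the butterfly's path."""
    x, y = start
    path = [(x, y)]
    for block in blocks:
        if block == "turn_left":
            heading = LEFT_OF[heading]
        else:
            steps = 2 if block == "double_step" else 1
            for _ in range(steps):
                dx, dy = MOVES[heading]
                x, y = x + dx, y + dy
                path.append((x, y))
    return path

# The four-block program of Photo 1: step, turn left, step, double step.
print(run(["step", "turn_left", "step", "double_step"]))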
1) The special features of this system can be summarized as follows:
• The shape formed in the real environment by the blocks is similar to the route that the butterfly will follow in order to achieve its goal.
• The command blocks are sized so that they can easily be grasped by a young child's palm [32].
• The system can easily be used in a real classroom environment.
• The educator can easily create the labyrinth with simple clicks.
• Batteries are not required, so the system's autonomy is not limited.
• Stable operating conditions are not required, and users do not need any particular surface on which to form their program.

B. T_ProRob

The T_ProRob system consists of 28 cubic commands and 16 smaller cubic parameters. The users of this system can arrange the cubic commands and, by pressing a single button, program the NXT to run the sequence of commands formed by the cubes. An indicative program is shown in Photo 3.

Photo 3. Creation of a program with T_ProRob

In this program the Lego NXT robot will run the following sequence three times:
• two steps forward,
• delay,
• make a sound.
When the loop has been completed, the robot will carry out a check using the light sensor; if there is light, it will take another step forward. The user can start with very simple actions, such as turning on a light or making a sound. At the same time, the user can make the robot move around using commands such as move a step forward or backward and turn right or left. Moreover, for and if commands are available in the system, supporting more complicated combinations such as nested loops. A special cube, on which the user can save his/her program code and reuse it later, completes the set of commands.
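To illustrate one way such a block arrangement could be represented once it has been read from the cubes, the sketch below models the program of Photo 3 as a nested structure and prints the resulting actions. The command names, the light threshold, and the sensor value are hypothetical and are not taken from the T_ProRob software.

# Illustrative sketch only: a nested representation of the Photo 3 program.
def execute(program, sensors):
    """Walk the nested command structure and print the resulting robot actions."""
    for cmd in program:
        if cmd[0] == "repeat":            # ("repeat", count, body)
            for _ in range(cmd[1]):
                execute(cmd[2], sensors)
        elif cmd[0] == "if_light":        # ("if_light", threshold, body)
            if sensors["light"] > cmd[1]:
                execute(cmd[2], sensors)
        else:                             # primitive command block
            print("robot:", *cmd)

photo3_program = [
    ("repeat", 3, [("forward", 2), ("delay",), ("sound",)]),
    ("if_light", 50, [("forward", 1)]),   # the light-sensor check after the loop
]
execute(photo3_program, sensors={"light": 72})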
The T_ProRob parameters are smaller cubes that are connected to the commands and change their operation. For example, the movement command of one step forward is converted into a movement of three steps in the same direction by adding the parameter three. The available parameters for the if commands deal with specific conditions, for example: whether the touch sensor is active or not, whether the light sensor detects light above a defined brightness level, whether the ultrasonic sensor detects an obstacle within a defined distance, and whether the sound sensor detects a volume above a defined level. For the construction of this system, the Microchip Technology Inc. microcontrollers 18F2620 and 18F4550 have been used.
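A minimal sketch of this parameter mechanism is shown below: a parameter cube attached to a command simply rewrites the command's arguments. The identifiers, the threshold value, and the apply_parameter function are illustrative assumptions only.

# Illustrative sketch only: parameter cubes modifying the command they attach to.
def apply_parameter(command, parameter):
    """Return a copy of the command with the attached parameter applied."""
    kind, args = command
    name, value = parameter
    if name == "times":            # e.g. the "three" cube turns one step into three
        return (kind, {**args, "steps": value})
    if name == "threshold":        # e.g. the brightness level used by an if block
        return (kind, {**args, "threshold": value})
    return command                 # unknown parameters leave the command unchanged

step = ("forward", {"steps": 1})
print(apply_parameter(step, ("times", 3)))            # ('forward', {'steps': 3})

if_light = ("if_light", {"threshold": None})
print(apply_parameter(if_light, ("threshold", 60)))   # ('if_light', {'threshold': 60})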
Photo 4. The construction

The block diagram of the T_ProRob system is shown in Diagram 1.
Diagram 1. The T_ProRob system

The user connects the commands to the base (master-box) in order to form the program. Then, by pressing the run button on the master-box, communication between the blocks and the master-box starts so that the program can be read successfully. The next task undertaken by the master-box is to communicate with a remote computer using Bluetooth or RS232. This computer records in a database information about the commands that have been used, as well as statistical data about the program created by the user. Once the recording is finished, the computer sends the program via Bluetooth to the NXT robot, which runs it. Because all communications are two-way, the system can detect possible mistakes during data exchange; in such a case it can, for example, ask the blocks to send again the commands they represent. In addition, these two-way communications provide increased interaction with the users. While the program is running, the robot can, for example, inform the block of an if command that the result of a brightness measurement was positive; the block then informs the user by turning on the corresponding LED.
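The sketch below outlines this computer-side flow under our own assumptions: verify a received frame, log it to a database, and forward it to the robot, or ask the blocks for a retransmission. The frame format, checksum, and database schema are hypothetical, since the paper does not describe the protocol at this level of detail.

# Illustrative sketch only: the computer-side flow between master-box, database
# and NXT robot. Frame format, checksum and schema are assumptions.
import sqlite3

def checksum_ok(frame):
    """Toy integrity check standing in for whatever the real protocol uses."""
    *payload, check = frame
    return sum(payload) % 256 == check

def handle_program(frame, request_resend, send_to_nxt):
    """Log a received program and forward it to the robot, or ask for a resend."""
    if not checksum_ok(frame):
        request_resend()                     # two-way link: ask the blocks to repeat
        return
    commands = frame[:-1]
    with sqlite3.connect(":memory:") as db:  # in-memory database for the sketch
        db.execute("CREATE TABLE IF NOT EXISTS runs (commands TEXT)")
        db.execute("INSERT INTO runs VALUES (?)", (repr(commands),))
    send_to_nxt(commands)                    # e.g. forwarded over Bluetooth

# Hypothetical frame: command codes followed by a simple checksum byte.
handle_program([10, 20, 30, 60],
               request_resend=lambda: print("asking blocks to resend"),
               send_to_nxt=lambda cmds: print("sent to NXT:", cmds))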
1) The special features of this system can be summarized as follows:
• The subsystem is active, with embedded intelligence in each box.
• There is a satisfactory collection of commands and parameters [7].
• The system can hide the computer from the users, given that the communications can also be wireless.
• The user programs in the real environment with real cubes and sees the outcome of his/her program in the same physical space [33,34].
• The system sets appropriate restrictions [35] on the users, so it is not possible to connect the blocks the wrong way round. In this way we reduce the users' cognitive load, since it is not necessary to tell them what they should not do.
• Batteries are not required for the blocks or for the system in general, so continuous operation without recharging is assured.
• The size of the blocks has been adjusted appropriately to the age of the children who are going to use them.
• Stable operating conditions are not required, and users do not need any particular surface on which to form their program. This characteristic makes it easier to use such systems in a real classroom environment.

IV. INNOVATIONS THAT WE INTRODUCE

With these two systems we introduce some innovations that have not been used before in tangible programming environments.
• In the T_Butterfly system the shape formed in the real environment by the command blocks is similar to the route of the butterfly, so the interface itself helps the children understand the route that is going to be followed.
• There is the opportunity to save and reuse program code that was created at an earlier time or by other users [36,37,6].
• In both systems the number of commands that can be connected by the users is not limited by the I/O ports of the microcontroller that has been used [32].
• We have also exploited physical features such as weight. For example, parameter 4 has twice the weight of parameter 2 [38,28].
The T_ProRob system also provides increased interaction with the users on the interface [27]:
• The user learns the outcome of an if command carried out by the robot through LEDs on the interface.
• The user is informed synchronously [39] of a potentially wrong structure through an indication on the parameter: if the user attaches a parameter to a command that does not accept that particular parameter, he/she immediately sees the LED indication on the parameter, without running the program (a sketch of such a check follows this list).
• The user gets an indication of correct operation from the blocks, which carry out self-check procedures in various domains and also check the quality of the connection with neighboring blocks.
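A minimal sketch of such a synchronous compatibility check is given below; the compatibility table, block names, and error_led callback are assumptions made for illustration and are not taken from the T_ProRob firmware.

# Illustrative sketch only: synchronous validation of a parameter against the
# command it is attached to, with immediate LED feedback on rejection.
ACCEPTS = {
    "forward": {"times"},          # movement commands accept repeat-count cubes
    "if_light": {"threshold"},     # if blocks accept sensor-threshold cubes
    "sound": set(),                # no parameters accepted
}

def attach(command, parameter, error_led):
    """Return True if the parameter is accepted, otherwise light the error LED."""
    if parameter in ACCEPTS.get(command, set()):
        return True
    error_led()                    # feedback appears before any program run
    return False

attach("forward", "times", error_led=lambda: print("LED on"))      # accepted
attach("sound", "threshold", error_led=lambda: print("LED on"))    # rejected, LED lights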
V. CONCLUSION
In this article we presented two systems for tangible programming. These systems can be used by novice users, especially those of a young age. Despite their differences, the systems can be used in a complementary manner to achieve the common goal of bringing the concepts of programming closer to novice users. The first system programs a virtual object, whereas the other programs a real robot. In both cases, the user learns in a more natural way [11,27], using physical cube commands instead of a keyboard or a mouse. Finally, we presented certain innovative features that were embedded in our systems, offering some new design ideas in the field of tangible programming environments.
VI. FUTURE RESEARCH
Our goal is to enrich our systems with new features and to examine their efficiency in relation to the age, sex, and prior knowledge of the users [40,41]. Moreover, we intend to extend our research with these systems into special education, that is, to examine the possibility of training individuals with special needs, such as blindness or motor difficulties.
ACKNOWLEDGMENT

We thank everyone involved in the development of the systems, especially Dimitra Baltzi for her help with the children, Aristotle Kazakopoulos for the equipment he offered in the Microprocessors II laboratory at the Department of Electronics of the Technological Educational Institute of Thessaloniki, and Paul Zorpidis for his contribution.
REFERENCES
[1] H. Ishii and B. Ullmer, "Tangible bits: Towards seamless interfaces between people, bits and atoms," Proc. SIGCHI Conference on Human Factors in Computing Systems, 1997, pp. 234-241.
[2] H. Ichida, Y. Itoh, Y. Kitamura and F. Kishino, "ActiveCube and its 3D applications," Proc. IEEE VR, March 2004.
[3] Y. Itoh, S. Akinobu, H. Ichida, R. Watanabe, Y. Kitamura and F. Kishino, "TSU.MI.KI: Stimulating children's creativity and imagination with interactive blocks," Proc. Second International Conference on Creating, Connecting and Collaborating through Computing, 2004, pp. 62-70.
[4] P. Wyeth and H. C. Purchase, "Using developmental theories to inform the design of technology for children," Proc. Conference on Interaction Design and Children, 2003, pp. 93-100.
[5] A. C. Smith, "Using magnets in physical blocks that behave as programming objects," Proc. 1st International Conference on Tangible and Embedded Interaction, 2007, pp. 147-150.
[6] P. Frei, V. Su, B. Mikhak and H. Ishii, "Curlybot: Designing a new class of computational toys," Proc. SIGCHI Conference on Human Factors in Computing Systems, 2000, pp. 129-136.
[7] A. Cockburn and A. Bryant, "Leogo: An equal opportunity user interface for programming," Journal of Visual Languages and Computing, vol. 8, 1997, pp. 601-619.
[8] C. Kelleher and R. Pausch, "Lowering the barriers to programming," ACM Computing Surveys, vol. 37, 2005, pp. 83-137.
[9] T. S. McNerney, "From turtles to tangible programming bricks: Explorations in physical language design," Personal and Ubiquitous Computing, vol. 8, 2004, pp. 326-337.
[10] T. Sapounidis and S. Demetriadis, "Tangible programming interfaces: A literature review," Proc. 4th Balkan Conference in Informatics, Thessaloniki, Greece, 2009, pp. 70-75.
[11] B. Schiettecatte and J. Vanderdonckt, "AudioCubes: A distributed cube tangible interface based on interaction range for sound design," Proc. 2nd International Conference on Tangible and Embedded Interaction, 2008, pp. 3-10.
[12] H. Newton-Dunn, H. Nakano and J. Gibson, "Block jam: A tangible interface for interactive music," Journal of New Music Research, vol. 32, 2003, pp. 383-393.
[13] E. Schweikardt and M. D. Gross, "roBlocks: A robotic construction kit for mathematics and science education," Proc. 8th International Conference on Multimodal Interfaces, 2006, pp. 72-75.
[14] O. Zuckerman and M. Resnick, "System blocks: A physical interface for system dynamics learning," Proc. 21st International System Dynamics Conference, 2003, pp. 5-6.
[15] D. Anderson, J. L. Frankel, J. Marks, D. Leigh, E. Sullivan, J. Yedidia and K. Ryall, "Building virtual structures with physical blocks," Proc. 12th Annual ACM Symposium on User Interface Software and Technology, 1999, pp. 71-72.
[16] P. Wyeth and H. Purchase, "Designing technology for children: Moving from the computer into the physical world with electronic blocks," Information Technology in Childhood Education Annual, vol. 2002, 2002, pp. 219-244.
[17] A. C. Smith, "Simple tangible language elements for young children," Proc. 8th International Conference on Interaction Design and Children, 2009, pp. 288-289.
[18] M. S. Horn, E. T. Solovey, R. J. Crouser and R. J. K. Jacob, "Comparing the use of tangible and graphical programming languages for informal science education," Proc. 27th International Conference on Human Factors in Computing Systems, 2009, pp. 975-984.
[19] H. Suzuki and H. Kato, "AlgoBlock: A tangible programming language, a tool for collaborative learning," Proc. 4th European Logo Conference, 1993, pp. 297-303.
[20] M. S. Horn, Tangible Computer Programming: Exploring the Use of Emerging Technology in Classrooms and Science Museums, PhD thesis, Tufts University, 2009.
[21] P. Marshall, "Do tangible interfaces enhance learning?," Proc. International Conference on Tangible and Embedded Interaction, 2007, pp. 163-170.
[22] D. Xu, "Design and evaluation of tangible interfaces for primary school children," Proc. 6th International Conference on Interaction Design and Children, 2007, pp. 209-212.
[23] C. O'Malley and S. Fraser, "Literature review in learning with tangible technologies," Report 12, NESTA Futurelab Publications, Bristol, 2004.
[24] B. Zaman, V. Vanden Abeele, P. Markopoulos and P. Marshall, "Tangibles for children: The challenges," Proc. 27th International Conference Extended Abstracts on Human Factors in Computing Systems, 2009, pp. 4729-4732.
[25] A. N. Antle, "Designing tangibles for children: Games to think with," Proc. Tangible Play Workshop, Intelligent User Interfaces Conference, 2007, pp. 21-24.
[26] S. Price, "A representation approach to conceptualizing tangible learning environments," Proc. 2nd International Conference on Tangible and Embedded Interaction, 2008, pp. 151-158.
[27] O. Zuckerman, S. Arida and M. Resnick, "Extending tangible interfaces for education: Digital montessori-inspired manipulatives," Proc. SIGCHI Conference on Human Factors in Computing Systems, 2005, pp. 859-868.
[28] A. Manches, C. O'Malley and S. Benford, "The role of physical representations in solving number problems: A comparison of young children's use of physical and virtual materials," Computers & Education, vol. 54, 2010, pp. 622-640.
[29] L. Xie, A. N. Antle and N. Motamedi, "Are tangibles more fun?: Comparing children's enjoyment and engagement using physical, graphical and tangible user interfaces," Proc. 2nd International Conference on Tangible and Embedded Interaction, 2008, pp. 191-198.
[30] T. McNerney, "Tangible computation bricks: Building-blocks for physical microworlds," Proc. CHI 2001.
[31] A. N. Antle, "The CTI framework: Informing the design of tangible systems for children," Proc. 1st International Conference on Tangible and Embedded Interaction, 2007, pp. 195-202.
[32] O. Shaer, "Tangible user interfaces: Past, present, and future directions," Foundations and Trends in Human-Computer Interaction, vol. 3, 2009, pp. 1-137.
[33] J. Patten, L. Griffith and H. Ishii, "A tangible interface for controlling robotic toys," Proc. Conference on Human Factors in Computing Systems, 2000, pp. 277-278.
[34] A. Cockburn and A. Bryant, "Do it this way: Equal opportunity programming for kids," Proc. IEEE Sixth Australian Conference on Computer-Human Interaction, 1996, pp. 246-251.
[35] B. Ullmer, H. Ishii and R. J. K. Jacob, "Token+constraint systems for tangible interaction with digital information," ACM Transactions on Computer-Human Interaction (TOCHI), vol. 12, 2005, pp. 81-118.
[36] K. Kahn, "Drawings on napkins, video-game animation, and other ways to program computers," Communications of the ACM, vol. 39, 1996, pp. 49-59.
[37] P. Wyeth and H. C. Purchase, "Tangible programming elements for young children," Proc. CHI '02 Extended Abstracts on Human Factors in Computing Systems, 2002, pp. 774-775.
[38] A. Blackwell, "Cognitive dimensions of tangible programming languages," Proc. First Joint Conference of the Empirical Assessment in Software Engineering and Psychology of Programming Interest Groups, 2003, pp. 391-405.
[39] Y. Kitamura, Y. Itoh and F. Kishino, "Real-time 3D interaction with ActiveCube," Proc. CHI '01 Extended Abstracts on Human Factors in Computing Systems, 2001, pp. 355-356.
[40] L. Buechley, N. Elumeze, C. Dodson and M. Eisenberg, "Quilt snaps: A fabric based computational construction kit," Proc. IEEE International Workshop on Wireless and Mobile Technologies in Education, 2005, pp. 219-221.
[41] Y. Fernaeus and J. Tholander, "Finding design qualities in a tangible programming space," Proc. SIGCHI Conference on Human Factors in Computing Systems, 2006, pp. 447-456.