Development of a Distributed Hardware-In-The-Loop Simulation System for Transportation Networks

by

Roelof J. Engelbrecht
Research Associate
TransLink® Research Center
Texas Transportation Institute
CE/TTI Building Suite 405I
Texas A&M University System
College Station, Texas 77843-3135
Tel: (409) 862-3559 • Fax: (409) 845-6001
E-mail: [email protected]

Christopher M. Poe, P.E.
Director
TransLink® Research Center
Texas Transportation Institute
CE/TTI Building Suite 411C
Texas A&M University System
College Station, Texas 77843-3135
Tel: (409) 862-3558 • Fax: (409) 845-6001
E-mail: cpoe@tamu.edu

Kevin N. Balke, P.E.
Assistant Research Engineer
Texas Transportation Institute
CE/TTI Building Suite 309G
Texas A&M University System
College Station, Texas 77843-3135
Tel: (409) 845-9899 • Fax: (409) 845-6254
E-mail: [email protected]

Offered for Presentation and Publication Review
78th Annual Meeting of the Transportation Research Board
January 1999

Word count: 7491


ABSTRACT

Microscopic traffic simulation is increasingly used for planning, evaluation, and research. Hardware-in-the-loop traffic simulation enhances the advantages of software-only traffic simulation by replacing the traffic control component of the simulation with real traffic signal control hardware. This increases the realism of the simulation and provides access to controller features currently not available in software-only simulation models.

This paper presents the hardware-in-the-loop simulation concept, traces the development in recent years, and describes a distributed hardware-in-the-loop traffic simulation system architecture developed by the Texas Transportation Institute’s TransLink® Roadside Equipment Laboratory. The described simulation system architecture allows any microscopic traffic simulation model to be used, provided that it supplies methods to “export” simulated detector actuations and “import” phase indications. The architecture also allows the simulation model to communicate with most types of traffic signal control devices.

The simulation system was evaluated during the development and testing of a real-time bus priority algorithm. The system performed well and allowed an extensive evaluation of the bus priority algorithm. In addition, it enabled researchers to identify problems that could not have been foreseen during the design of the algorithm, but would otherwise have become evident during the first field implementation.

A hardware-in-the-loop traffic simulation system such as the one described in this paper could benefit research organizations, universities, and all organizations that design and maintain traffic signal systems. This environment will advance the state-of-the-practice by allowing consistent evaluation of controller hardware, software, and operating strategies, which could ultimately lead to an improvement in traffic operations.

Key Words: Traffic simulation, hardware-in-the-loop, traffic signal controller


INTRODUCTION

Traffic simulation should be an important tool in the toolbox of present-day traffic engineers and analysts. Since the development of the first microscopic traffic simulation models 30 years ago, the capabilities of these models have improved vastly, primarily due to phenomenal increases in computer processing power. The migration to the personal computer (PC) platform during the last few years made microscopic traffic simulation more accessible and easier to use. Models such as CORSIM (1), Integration® (2), and SimTraffic (3) are good examples of the state-of-the-art in PC-based microscopic traffic simulation models. These models all include graphical user interfaces, graphical network editing capabilities, and simulation visualization, and they support various traffic control strategies.

Microscopic traffic simulation is typically used to estimate the performance of a traffic system under a particular traffic control strategy. Field studies should ideally be conducted to measure system performance, but, in general, simulation is an attractive alternative for a number of reasons. Simulation is less costly than field studies, and results are obtained more quickly. Simulation yields extensive measures of effectiveness, some of which are very difficult to measure in the field, e.g. fuel consumption and emissions. Traffic operations may be disrupted when traffic control strategies are changed in the field for evaluation purposes, while simulation does not have this drawback. Most importantly, simulation gives the analyst full control over all traffic, geometric, and control variables. Using simulation, physical changes to the roadway or control system (e.g. addition of lanes, changes in detector location, etc.) can be evaluated easily and quickly. Simulation is also the only way in which the effects of increases in demand can be analyzed.

Microscopic traffic simulation is increasingly used for planning, evaluation, and research. There is, however, a gap between the simulation and real traffic operation. One tool that can bridge this gap is hardware-in-the-loop traffic simulation. This paper presents the hardware-in-the-loop simulation concept, traces the development in recent years, and illustrates the utility of the concept through an application of testing transit priority on a traffic signal system.


HARDWARE-IN-THE-LOOP TRAFFIC SIMULATION

Existing microscopic traffic simulation models typically rely on internal controller emulation logic to perform signal control. Although many models include emulation for actuated signals and basic signal coordination, some types of advanced signal control may be difficult or impossible to implement in some models. Examples of such strategies include certain advanced signal coordination strategies, cycle transition algorithms, and signal preemption capabilities. Also, vendor-specific controller capabilities may not be available in the emulation. In addition, the emulated controller logic in simulation models is not required to conform to any specification; therefore, users are not guaranteed that any emulated controller will function exactly as expected.

Hardware-in-the-loop traffic simulation offers a solution to this dilemma. Hardware-in-the-loop simulation refers to a computer simulation in which some of the components of the simulation have been replaced with actual hardware. Hardware-in-the-loop simulation has been successfully used in the aerospace and defense industries for a number of years (4). The application of hardware-in-the-loop traffic simulation, however, is relatively new. In 1995, Urbanik and Venglar described the "SMART" diamond system, which included a real-time traffic simulation component based on hardware-in-the-loop simulation (5). In 1998 Bullock and Catarella described a real-time simulation environment for evaluating traffic signal systems that is based on hardware-in-the-loop simulation (6).

In hardware-in-the-loop traffic simulation, the traffic control component of the simulation (i.e. the internal controller emulation logic) is replaced with real traffic signal control hardware. To achieve this type of external simulation control, the simulation model and the controller hardware should be able to communicate. The simulation runs in real time (i.e. one model second is simulated in one real second), since controller hardware usually performs input and output in real time. In the case of traffic-actuated control, the simulation generates simulated detector actuations by modeling simulated vehicles crossing simulated detectors. The simulated detector actuations are then sent to the controller hardware, which reacts to them as it would to real detector actuations, for example by changing phase indications, according to the phasing and timing plan programmed in the controller. The phase indications are then read back from the controller hardware to the simulation and assigned to the simulated traffic signals. The simulated vehicles then react to the simulated traffic signals by stopping or departing as appropriate. The flow of data between the simulation model and a traffic-actuated controller is shown in Figure 1.

Figure 1. Flow of data between the simulation model and a traffic-actuated controller.
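
The basic control loop implied by Figure 1 can be sketched in a few lines of code. The following Python sketch is illustrative only; the simulation and cid objects, with their step(), write_detectors(), read_phases(), and set_signal_indications() methods, are hypothetical stand-ins for a simulation model and a controller interface device driver, not the interface of any model discussed in this paper.

```python
import time

def run_hardware_in_the_loop(simulation, cid, step_seconds=1.0):
    """Minimal hardware-in-the-loop loop: one model second per real second.

    `simulation` and `cid` are hypothetical objects standing in for the
    traffic simulation model and the controller interface device driver.
    """
    while not simulation.finished():
        start = time.monotonic()

        # Advance the traffic model one step and collect the actuations
        # generated by simulated vehicles crossing simulated detectors.
        actuations = simulation.step(step_seconds)

        # Send the simulated actuations to the real controller hardware...
        cid.write_detectors(actuations)

        # ...and read back the phase indications the controller has selected.
        phases = cid.read_phases()

        # Drive the simulated signal heads with the real controller's output.
        simulation.set_signal_indications(phases)

        # Keep the simulation synchronized with wall-clock time.
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, step_seconds - elapsed))
```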

Hardware-in-the-loop traffic simulation offers several advantages over regular traffic simulation:

• It adds realism to the simulation because real control hardware is used.

• It increases the confidence of practitioners in the simulation.

• It allows the testing and use of advanced signal control strategies and vendor-specific controller capabilities in the simulation.

• It allows the technician or engineer to set up signal controllers in the signal shop or laboratory using simulation before deploying in the field. Thus, gross errors may be identified beforehand, resulting in improved traffic operations and increased traffic safety.

Hardware-in-the-loop traffic simulation also has certain disadvantages over regular traffic simulation:

• Regular traffic simulation models can run at speeds faster than real time, especially on small networks and fast computers. Since hardware-in-the-loop traffic simulation runs in real time, it is much slower than regular traffic simulation, and it will take longer to perform the same number of replications as regular traffic simulation.

• When large networks are simulated, more powerful computers may be required for hardware-in-the-loop traffic simulation to ensure that the simulation runs in real time.

• Special care should be taken when replicating hardware-in-the-loop simulation experiments, since the control component of the simulation is under external control, and may be in an unknown state at the start of the simulation.

• Hardware-in-the-loop traffic simulations which contain many real controllers may be problematic due to the limited input-output capabilities of typical computers running the simulation. In addition, not enough real control hardware may be available for large networks.

• Input-output processing may use a significant proportion of the processing power of the computer running the simulation, thereby reducing the size of the network that may be simulated in real time.

With these items identified, there are a number of applications that can take advantage of hardware-in-the-loop simulation. Next, the implementation details of hardware-in-the-loop simulation systems will be described. The application of such a system will then be demonstrated with a specific example of testing a selective bus priority algorithm on a coordinated traffic signal system.

THE CONTROLLER-COMPUTER INTERFACE

The critical component of a hardware-in-the-loop traffic simulation system is the controller interface device (CID), which allows a computer to communicate with traffic control hardware. The CID allows the simulation model to send detector actuations to the control device and to read phase indications back from the control device. Most of the controllers in use today communicate with the signal assembly by direct (hard-wire) connection. The detector inputs of the controller are connected to the detector units while the controller outputs are connected to the signal load switches. The CID takes the place of the detector units and the load switches, in effect "fooling" the controller into "thinking" that it is communicating with a signal assembly.

The connection between the controller interface device and the controller is established according to the type of controller. For controllers adhering to the NEMA TS 1 specification (7), the CID is connected to the controller with the standard TS1 A, B, and C connectors. For controllers adhering to the California Department of Transportation (Caltrans) 170/170E/2070 ATMS controller unit specifications (8), the CID is connected to the controller with the standard C1 connector. The NEMA TS 2 specification (9) introduced the Port 1 connector, which allows the controller to communicate with the load switches and detector units using a high-speed, serial communications interface. A CID connecting with a NEMA TS2 controller should interface via the Port 1 connector using the appropriate communication interface and protocol.

In 1995 the Texas Transportation Institute developed a controller interface device for use in the "SMART" diamond project funded by the Texas A&M ITS Research Center of Excellence (5). The design is based on the National Instruments™ multiple channel, parallel, digital input/output (I/O) PC interface board architecture and data acquisition driver software. The interface boards are specifically designed for high-performance data acquisition and control, while the driver software allows the user to access that functionality from any application programming environment.

The digital I/O interface board is used to interface with the input and output pins of the traffic controller hardware. The controller output pins (usually connected to the signal load switches) are connected to the input channels of the digital I/O board, while the controller input pins (usually connected to the detector units) are connected to the output channels of the digital I/O board. The connections are not direct, but made through optically isolated, solid-state I/O modules (relays) on mounting racks. These I/O modules are used for two reasons:

• The logic (voltage) levels of the digital I/O interface board and the controller hardware are typically different. For example, the digital I/O interface board and computer operate at 5V logic levels, while NEMA TS1 traffic signal controllers operate at 24V logic levels (7). The I/O modules allow connection across these different logic levels.

• The I/O modules are optically isolated to protect the delicate electronics of the digital I/O interface board and the computer from damage due to over-voltage transients.

The Texas Transportation Institute TransLink® Research Center recently expanded on the original "SMART" diamond CID design when it developed controller interface devices for use in its Roadside Equipment Laboratory (REL). The configuration of the device is shown in Figure 2.
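
To make the role of the I/O modules and interface board more concrete, the sketch below shows one possible mapping between digital I/O channels and controller signals: load-switch outputs are read and translated into per-phase indications, and simulated detector calls are written to the controller's detector inputs. The channel assignments and the io_board object are assumptions for illustration only; actual wiring depends on the interface board and the cabinet configuration.

```python
# Hypothetical channel assignments: three input channels (green, yellow, red)
# per phase read from the controller's load-switch outputs, and one output
# channel per simulated detector driving the controller's detector inputs.
PHASE_CHANNELS = {1: (0, 1, 2), 2: (3, 4, 5)}   # phase -> (green, yellow, red)
DETECTOR_CHANNELS = {1: 0, 2: 1, 3: 2}          # detector -> output channel

def read_phase_indications(io_board):
    """Translate load-switch outputs into per-phase indications."""
    indications = {}
    for phase, (green, yellow, red) in PHASE_CHANNELS.items():
        if io_board.read_input(green):
            indications[phase] = "GREEN"
        elif io_board.read_input(yellow):
            indications[phase] = "YELLOW"
        elif io_board.read_input(red):
            indications[phase] = "RED"
    return indications

def write_detector_actuations(io_board, actuations):
    """Energize the controller's detector inputs for actuated detectors."""
    for detector, channel in DETECTOR_CHANNELS.items():
        io_board.write_output(channel, actuations.get(detector, False))
```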

In 1998 Bullock and Catarella described another controller interface device (6). This CID was developed at Louisiana State University for the Federal Highway Administration under contract from ITT Systems & Sciences Corporation (previously Kaman Sciences Corporation). The configuration of the device is shown in Figure 3. The CID was developed to allow traffic engineers and analysts to replace the emulated controller logic of the CORSIM simulation model with a real controller based on the NEMA TS1 standard. The CID consists of Opto 22 SNAP I/O® optically isolated input/output hardware mounted in a portable suitcase (10). The CID was designed to connect to any personal computer or notebook computer with a standard RS-232 serial interface. The design allows for multiple CIDs to be daisy-chained together using the RS-422 multi-drop serial protocol. The CORSIM model runs on the PC or notebook connected to the CID(s) and interfaces with the CID through a specially designed CORSIM run-time extension dynamic link library (11) and the appropriate driver software. There is currently a patent pending on the Louisiana State University controller interface device.


Figure 2. REL Controller Interface Device with Controller.

Figure 3. LSU Controller Interface Device with Controller.


The differences between the Roadside Equipment Laboratory CID (REL-CID) and the Louisiana State University CID (LSU-CID) can be summarized as follows:

• The REL-CID uses a parallel interface to the computer, which is much faster than the serial connection used by the LSU-CID. Data transfer between the computer and the controller can take place at higher rates with the REL-CID, with lower processor power requirements.

• The serial connection of the LSU-CID allows it to interface better with a notebook computer, making the LSU-CID system more portable than the REL-CID.

• The LSU-CID is more rugged than the REL-CID, since the REL-CID was designed for laboratory use only. The LSU-CID suitcase allows the CID to be taken out to the field; however, the CID may also be removed from the suitcase, if required, for example for laboratory use.

• The REL-CID design is more flexible than the LSU-CID design. The current LSU-CID design is more difficult to change than the REL-CID design, which can be reconfigured to connect to other control devices such as ramp meter controllers.

• The number of REL-CID I/O interface boards that may be installed in a single computer is limited by the number of expansion slots available on the computer, usually between two and five. However, up to three controllers may be connected to a single interface board. The LSU-CID design allows multiple CIDs to be daisy-chained, but the restricted serial port bandwidth limits the number of units that may be connected.

It is evident that the REL-CID and the LSU-CID designs have advantages in certain areas. The REL-CID is faster and more flexible, while the LSU-CID is easier to connect and more portable. The next section will describe an architecture developed by the TransLink® Roadside Equipment Laboratory to incorporate both these CID designs into a hardware-in-the-loop traffic simulation system.


HARDWARE-IN-THE-LOOP SIMULATION ARCHITECTURE

Overview

As a minimum, a hardware-in-the-loop traffic simulation should consist of a traffic simulation model, one or more controller interface devices, and one or more real controllers. Figure 4 shows the architecture of the hardware-in-the-loop simulation system developed by the Louisiana State University. This system has been used successfully to validate the performance of hardware-in-the-loop traffic simulation (6).

Figure 4. The Louisiana State University hardware-in-the-loop simulation system architecture (6).
(Acronyms: CORSIM = CORSIM Simulation Model; LSU-CID = Louisiana State University Controller Interface Device; TS1-CU = NEMA TS1 Controller Unit (hardware).)

The TransLink® Roadside Equipment Laboratory has expanded the architecture in Figure 4 to address some of the shortcomings of hardware-in-the-loop simulation and controller interface devices described earlier in this paper. The resulting architecture is shown in Figure 5. The main characteristics of the architecture are:

• The architecture is open. It allows the use of different simulation models, controller interface devices, and controllers, in both existing and future designs. To incorporate a new component into the architecture, only a small part of the system needs to be changed. For example, only a new controller interface client (see below) needs to be developed to incorporate a new type of controller interface device into the architecture. Likewise, if a simulation model is changed, it has no effect on the rest of the architecture, provided that it communicates with the architecture in the prescribed manner.

• The architecture is distributed. A client-server arrangement is used to separate the simulation model from controller I/O. The simulation model is run on a server computer while the controller I/O is performed by one or more client computers. The client and server computers are all networked together. Separating the simulation from the controller I/O solves two drawbacks of hardware-in-the-loop simulation: (i) the limited I/O capabilities of typical desktop computers, and (ii) the sharing of processing power between the simulation and I/O processing. Since communication takes place over a network, the simulation model and controllers need not be at the same location. This feature allows the sharing of resources between organizations. For example, one organization may connect remotely to another organization's controllers, if needed. The client and server functions may also be combined on a single computer, allowing the architecture to function like traditional non-distributed simulation models.

• The architecture is flexible. In addition to allowing for hardware-in-the-loop simulation, it provides for real-time control optimization and other applications such as automated controller testing.

Each of the components of the architecture will be described in more detail in the following sections. Specific mention will be made of the existing setup in the TransLink® Roadside Equipment Laboratory.


Figure 5. The TransLink® Roadside Equipment Laboratory hardware-in-the-loop simulation system architecture.
(Acronyms: SIM = Simulation Model; SDI = Simulation Data Interface; RTCA = Real-Time Control Algorithm; CIA = Controller Interface Application; CIS = Controller Interface Server; CIC = Controller Interface Client; CID = Controller Interface Device; CU = Controller Unit (hardware); VCU = Virtual Controller Unit (software implementation).)


Simulation Model

The architecture allows any microscopic traffic simulation model to be used, provided that it supplies methods to "export" simulated detector actuations and "import" phase indications. The TransLink® Roadside Equipment Laboratory currently uses two simulation models in the architecture. The first model is CORSIM Version 4, developed by ITT Systems & Sciences Corporation for the Federal Highway Administration (1). CORSIM communicates with the architecture through the CORSIM run-time extension, which allows for hardware-in-the-loop simulation and other external control applications (11). The CORSIM run-time extension provides an interface to detector and phase information. The Roadside Equipment Laboratory also uses an internal simulation model (TexSIM), developed for research and development purposes by the Texas Transportation Institute (12). TexSIM is the model that was used in the original hardware-in-the-loop simulation system developed for the "SMART Diamond" project (5). The CORSIM simulation model is normally used in the TransLink® Roadside Equipment Laboratory, but TexSIM is sometimes used, for example, when a required feature is not available in CORSIM. Other simulation models such as Integration® (2) and SimTraffic (3) may be integrated into the architecture, provided that the necessary "hooks" are available.
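
To make this requirement concrete, the sketch below defines a minimal adapter interface that a candidate simulation model would have to satisfy before it could be plugged into the architecture. The class and method names are illustrative assumptions and do not represent the actual CORSIM run-time extension or TexSIM interfaces.

```python
from abc import ABC, abstractmethod

class SimulationModelAdapter(ABC):
    """Hooks a microscopic simulation model must expose to join the architecture."""

    @abstractmethod
    def export_detector_actuations(self) -> dict:
        """Return the current simulated detector states, e.g. {detector_id: bool}."""

    @abstractmethod
    def import_phase_indications(self, phases: dict) -> None:
        """Apply externally supplied phase indications, e.g. {phase_id: 'GREEN'},
        to the simulated signal heads."""

    @abstractmethod
    def advance(self, seconds: float) -> None:
        """Advance the simulation clock by the given real-time interval."""
```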

The simulation model runs on the server computer, which is typically a powerful desktop PC. The TransLink® Roadside Equipment Laboratory server contains a 266 MHz Intel® Pentium® II processor and 128MB random access memory (RAM) running the Microsoft® Windows NT® 4.0 Workstation operating system. More powerful server computers will be able to simulate larger networks in real time.

Simulation Data Interface

The simulation data interface (SDI) acts as a real-time repository for detector actuation and phase indication data. The simulation model, the controller interface server (see below), and other applications have access to the detector and phase information in real time. The simulation model writes detector information to the SDI, and reads phase information from the SDI. Other components of the architecture then read detector information from the SDI, and write phase information to the SDI, as appropriate.

In addition to the phase indications, the SDI contains extended phase information such as the Next, On, Call, Omit, Hold, and Force Off status for every phase; the Force Off, Max 2, and Max Inhibit status for every ring; and preempt control state information as defined in the NEMA TS3.5 standard (13). The extended phase indications may be used to implement real-time control strategies.

The simulation data interface was developed in the C++ language and is implemented as a shared memory dynamic link library that may be called from any application that needs access to the phase and detector data. The simulation data interface runs under Microsoft® Windows® 95/98/NT.
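
A rough sketch of the kind of per-controller record the SDI might hold is shown below, written in Python rather than the C++ used for the actual implementation. The field names follow the extended phase information listed above; the real shared-memory layout is not reproduced here, so this structure is illustrative only.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class PhaseStatus:
    # Per-phase state mirrored from the controller (NEMA TS 3.5 style).
    indication: str = "RED"   # current indication: RED, YELLOW, or GREEN
    next: bool = False
    on: bool = False
    call: bool = False
    omit: bool = False
    hold: bool = False
    force_off: bool = False

@dataclass
class ControllerRecord:
    # One record per controller, shared by the simulation model, the
    # controller interface server, and any other application.
    detectors: Dict[int, bool] = field(default_factory=dict)      # written by the simulation
    phases: Dict[int, PhaseStatus] = field(default_factory=dict)  # written by the server
    preempt_state: int = 0  # preempt control state (NEMA TS 3.5)

# The simulation data interface is then a mapping of controller ID to record.
SimulationDataInterface = Dict[int, ControllerRecord]
```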

Controller Interface Server and Clients

The heart of the distributed architecture is the controller interface server and controller interface clients. The architecture allows multiple controller interface clients to connect to a controller interface server over any TCP/IP network such as a local area network (LAN), intranet, or the internet. The controller interface server is located on the server computer running the simulation, while the controller interface clients are located on client computers connected to controller interface devices. The advantage of connecting over a TCP/IP network is that it does not matter whether the client and server computers are in the same room or halfway across the world—as long as all the computers have network connections and IP addresses, the architecture is functional. The architecture also allows for the server and multiple clients to be located on a single computer.

The client-server connection is used to facilitate a two-way data flow between the server and client(s). The controller interface server reads detector data from the simulation data interface and sends it to the controller interface client(s), from where it is written through the controller interface device(s) to the controller unit(s). The controller interface client then reads the phase data from the controller(s) through the controller interface device(s) and sends it to the controller interface server, from where it is written to the simulation data interface.

Each controller in the simulation network is identified by a unique controller ID number. The user provides the controller interface server with a list of all the controller ID numbers in the network, through an initialization file. Before a simulation starts, the controller interface server waits for all the controllers in the network to connect through their respective controller interface clients. Multiple controllers may be connected to one controller interface client, as shown in Figure 5. The user provides each controller interface client with a list of the controller ID numbers it is connected to, through an initialization file. This file also identifies the controller interface server the client should connect to. When connecting to the server, each client sends the ID numbers of all the controllers connected to it to the server. The server then checks each ID number to ensure that the controller is in the list of expected controllers, and that the controller has not already connected.
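
The start-up bookkeeping described above can be sketched as follows. The function and variable names are assumptions for illustration; the sketch shows only the checks the server performs (every expected controller connects, and none connects twice) before the simulation is allowed to start.

```python
def register_clients(expected_ids, client_connections):
    """Accept client registrations until every expected controller has connected.

    `expected_ids` comes from the server's initialization file; iterating
    `client_connections` yields (client, controller_ids) pairs as clients connect.
    """
    expected = set(expected_ids)
    connected = {}  # controller ID -> client handling that controller

    for client, controller_ids in client_connections:
        for controller_id in controller_ids:
            if controller_id not in expected:
                raise ValueError(f"Unexpected controller ID {controller_id}")
            if controller_id in connected:
                raise ValueError(f"Controller ID {controller_id} already connected")
            connected[controller_id] = client
        if set(connected) == expected:
            break  # all controllers accounted for; the simulation may start

    return connected
```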

After all the controllers have connected, the controller interface server starts polling the controller interface clients at a user-defined polling frequency. During this polling process, the current detector data is read from the simulation data interface and sent to the controller interface client(s). The controller interface client(s) reply by sending their current phase data to the controller interface server. To ensure that data latency is minimized, the polling frequency should be set as high as possible. Values depend on the quality of the network connection, and may be as high as 50 times per second on a local area network but much lower on a dial-up internet connection.
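
One poll cycle on the server side might look like the sketch below. The send/receive helpers on each client connection and the read/write helpers on the simulation data interface are hypothetical; the actual components were written in Delphi, so this Python version is only a schematic of the data flow.

```python
import time

def poll_clients(sdi, clients, polls_per_second=10):
    """Exchange detector and phase data with all connected clients."""
    interval = 1.0 / polls_per_second
    while True:
        start = time.monotonic()
        for controller_id, client in clients.items():
            # Push the latest simulated detector actuations to the client...
            detectors = sdi.read_detectors(controller_id)
            client.send_detectors(controller_id, detectors)

            # ...and store the phase data the client reports back.
            phases = client.receive_phases(controller_id)
            sdi.write_phases(controller_id, phases)

        # Hold the polling rate close to the user-defined frequency.
        time.sleep(max(0.0, interval - (time.monotonic() - start)))
```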

To further minimize data latency, the controller interface clients poll their controller interface devices in a separate process. During this polling process, the controller interface client writes its current detector data (as updated by the controller interface server) to the controller through the controller interface device. The controller interface client then waits a user-specified time for the controllers to react, and reads back the phase status. This polling frequency should also be set as high as possible, but depends primarily on the design of the controller interface device. Parallel devices such as the REL-CID can sustain much higher data transfer rates than serial devices such as the LSU-CID, and can therefore have higher polling frequencies. In general, though, most controller interface devices should be able to sustain polling frequencies of at least 10 times per second. The TransLink® Roadside Equipment Laboratory successfully runs controller interface clients on 66 MHz Intel® 486-DX2 processors with 40MB RAM under the Microsoft® Windows NT® 4.0 Workstation operating system.

The controller interface client and server software components of the architecture were developed in the Borland® Delphi™ application development environment and run under Microsoft® Windows® 95/98/NT. The transmission control protocol (TCP) was chosen for communication between the server and clients, since it is connection-oriented and assures reliable transmission. A connection-oriented protocol is important because the server will immediately know when the communication link to any client is broken.

A different controller interface client was developed for each of the two types of controller interface devices currently available in the TransLink® Roadside Equipment Laboratory, i.e. the LSU-CID and REL-CID designs. In addition, a controller interface client that serves as a "wrapper" for any virtual (i.e. software-only) controller was developed. Work is also under way to develop a controller interface client for a TS2 Port 1 controller interface device. It should be noted that the different controller interface clients all communicate with the controller interface server in the same way, although they communicate differently with the different types of controller interface devices. The controller interface devices will be described in more detail in the next section.

Controller Interface Devices and Controllers

Both types of controller interface devices described previously are currently available in the TransLink® Roadside Equipment Laboratory. Three LSU-CIDs were built by the Louisiana State University Remote Sensing and Image Processing Laboratory and funded by the Texas A&M ITS Research Center of Excellence (10). The TransLink® Roadside Equipment Laboratory built seven REL-CIDs. Both these designs are accommodated in the architecture. The REL also owns a number of traffic signal controllers from different manufacturers conforming to the NEMA TS1 (7), NEMA TS2 (9) and Caltrans 2070 ATMS (8) specifications, as well as a number of ramp meter controllers. All of these controllers have been used in the architecture through one of the two CID designs. The LSU-CID has been connected to 8-phase, 4-overlap, 8-detector NEMA TS1 controllers, while the REL-CID has been connected to the following controller hardware:

• 8-phase, 4-overlap, 8-detector NEMA TS1 controllers;

• 16-phase, 4-overlap, 32-detector NEMA TS2 Type 2 controllers;

• 8-phase, 4-overlap, 12-detector NEMA TS2 Type 2 or 2070 ATMS controllers;

• 2-lane, 7-detector ramp meter controllers; and

• 4-lane, 32-detector ramp meter controllers.

It should be reiterated that the architecture allows the simulation to be completely independent of the type of controller interface device or controller used. The only requirement is that the controller reacts properly to the simulated detector actuations and correctly returns the resulting signal indications to the controller interface client.

Virtual Controllers

The architecture allows virtual (emulated, or software-only) controllers to be incorporated into the simulation. Ideally, virtual controllers could be used to run the software portion of real controllers, eliminating the need for physical controllers and controller interface devices. This feature may be useful for controller software vendors to test changes in existing controller software and develop and validate new controller software designs.

Virtual controllers are also useful in reducing the number of controllers required to simulate larger networks. Controllers at non-critical intersections may be replaced with virtual controllers replicating the required control strategies. This addresses another drawback of hardware-in-the-loop simulation—that not enough real control hardware may be available to simulate large networks.


The TransLink® Roadside Equipment Laboratory has developed a very simple fixed-time, coordinated virtual controller in Borland® Delphi™ for use in the architecture. These controllers may be used on the periphery of networks to ensure realistic, platooned vehicle arrivals at downstream signals controlled by real controllers. These virtual controllers may also be used to replace any number of fixed-time, coordinated signals in a network, reducing the required number of real controllers.
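
A fixed-time, coordinated virtual controller of this kind can be surprisingly small, as the sketch below suggests. The actual REL implementation was written in Delphi; this Python version, with its cycle length, offset, and split parameters, is an illustrative assumption of how such a controller could generate phase indications.

```python
def fixed_time_green_phase(clock_seconds, cycle_length, offset, splits):
    """Return the phase that is green at a given time for a fixed-time signal.

    `splits` is an ordered list of (phase_id, seconds) pairs that sums to
    `cycle_length`; `offset` shifts the cycle for coordination.
    """
    point_in_cycle = (clock_seconds - offset) % cycle_length
    elapsed = 0.0
    for phase_id, duration in splits:
        elapsed += duration
        if point_in_cycle < elapsed:
            return phase_id
    return splits[-1][0]  # fallback if rounding leaves a sliver at the cycle end

# Example: a 60-second cycle serving phase 2 for 35 s and phase 4 for 25 s,
# offset 10 s from the system reference point (values are illustrative).
green_phase = fixed_time_green_phase(clock_seconds=72, cycle_length=60,
                                     offset=10, splits=[(2, 35), (4, 25)])
```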

Real-Time Control Algorithms

The architecture was specifically designed to facilitate the incorporation of real-time control algorithms into the simulation. This allows real-time control algorithms such as the Real-Time Traffic Adaptive Signal Control System (RT-TRACS) (14) to be evaluated with this architecture using hardware-in-the-loop simulation.

As shown in Figure 5, any real-time control algorithm implemented in the architecture would have access to the phase and detector information in the simulation data interface. This data can then be used for strategy selection. The architecture is flexible in that the real-time control algorithm may communicate with the controllers along two data paths. Communication may take place internally, through the controller interface server, client, and CID; externally, along an independent data path such as a closed loop network; or both. The TransLink® Roadside Equipment Laboratory recently developed a real-time bus-priority algorithm within the architecture. The algorithm provides selective priority to buses through a system of coordinated traffic signals. The algorithm communicates with the traffic signal controllers using a combination of the internal and external data paths. The application of the algorithm is described in more detail later in the paper.

Controller Interface Applications

In addition to providing support for real-time control algorithms, the architecture also supports controller interface applications. Controller interface applications are best described as general plug-in utility applications that may interact with the user, the simulation data, and the controllers. Controller interface applications provide considerable flexibility to the architecture, since they allow users or other software modules to access the control data and interact with the simulation. The TransLink® Roadside Equipment Laboratory has developed three small controller interface applications. The first application gives a visual display of the data in the simulation data interface, and is very helpful in troubleshooting. The second application is used to manually actuate detectors, and set phase and ring inputs. These two applications together perform exactly like a hardware controller tester, since they allow the user to set certain controller inputs and view the resulting controller outputs. The third application is a data logger that records detector and phase information during the simulation. Amongst other data, it provides the user with the exact times of all phase changes during the simulation, so that the phase order, average phase duration, and cycle length variations may be analyzed.

The fact that there are two data paths to the controllers (see Figure 5 and previous section) opens up the possibility of other controller applications, such as automated controller testing and the testing of closed loop system software. An automated controller tester, for example, would execute a test script, providing predetermined detector and other inputs to one or more controllers. The tester would then read the phase status from the controller(s), compare it to the expected responses, and report any discrepancies.
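
Such a tester could be organized around a simple test script, as sketched below. The script format, the cid object, and the helper names are assumptions made for illustration; the point is the execute, compare, and report cycle rather than any particular tool.

```python
import time

def run_test_script(cid, script):
    """Apply scripted detector inputs and compare controller responses.

    `script` is a list of steps, each giving detector inputs to apply, a time
    to wait, and the phase indications expected from the controller.
    """
    discrepancies = []
    for step_number, step in enumerate(script, start=1):
        cid.write_detectors(step["detectors"])   # predetermined inputs
        time.sleep(step["wait_seconds"])         # give the controller time to respond
        observed = cid.read_phases()             # actual controller output
        if observed != step["expected_phases"]:
            discrepancies.append((step_number, step["expected_phases"], observed))
    return discrepancies  # an empty list means the controller behaved as expected

# Example step: place a call on detector 3 and expect phase 4 to be green
# within 15 seconds (all values are illustrative).
example_script = [{"detectors": {3: True}, "wait_seconds": 15.0,
                   "expected_phases": {2: "RED", 4: "GREEN"}}]
```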

APPLICATION TO THE DEVELOPMENT OF A REAL-TIME BUS PRIORITY ALGORITHM

One of the first applications of the hardware-in-the-loop simulation architecture developed by the TransLink® Roadside Equipment Laboratory was in the development of a real-time bus priority algorithm. The objective of the algorithm is to provide selective priority to buses through a system of coordinated traffic signals, without disrupting coordination. The algorithm is designed to capitalize on advance vehicle tracking technology by using estimates of the arrival time of the bus at the bus stop and the intersection to decide (i) whether or not the bus is sufficiently behind schedule to warrant priority, and (ii) which one of five strategies to implement to provide priority.
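
The first of these two decisions can be stated compactly, as in the sketch below. The five priority strategies are not enumerated in this paper, so strategy selection is only noted in a comment; the five-minute schedule-deviation threshold is the value reported in the evaluation later in this section, and all names are illustrative.

```python
def warrant_priority(estimated_arrival_time, scheduled_arrival_time,
                     threshold_seconds=5 * 60):
    """Return True if the bus is far enough behind schedule to warrant priority.

    The 5-minute threshold is the value reported in the evaluation below; in
    practice it would be a configurable parameter.
    """
    schedule_deviation = estimated_arrival_time - scheduled_arrival_time
    return schedule_deviation >= threshold_seconds

# Selecting one of the five priority strategies (from the predicted bus arrival
# times at the stop and the intersection) is not reproduced here.
```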


Before testing the algorithm in the field, a proof-of-concept study was performed in the TransLink® Roadside Equipment Laboratory using the hardware-in-the-loop simulation system architecture shown in Figure 5. The algorithm was tested under different geometric and demand conditions on a hypothetical three-intersection arterial. The TexSIM simulation model was used since it could provide all the data required by the algorithm. Three NEMA TS2 Type 2 traffic signal controllers were used to control the intersections by implementing the algorithm. The controllers were connected to three REL-CID controller interface devices using the NEMA TS-1 standard A, B, and C connectors. In addition to phase indications and detector actuations, the REL-CIDs provided access to the phase Hold, Omit, and Call functions, as well as to the ring Force Off function, since the algorithm is implemented through these additional functions. The three REL-CIDs were connected to a single client computer. Each controller was also connected to the client computer via the RS232 serial port to form a simple closed loop system. The serial connections were used to access controller timing and coordination data, which are required by the algorithm but are not available through the REL-CID.

Using more than 340 hours of simulation, the hardware-in-the-loop simulation system allowed an extensive evaluation of the bus priority algorithm under many different traffic and signal control conditions. Preliminary results of the evaluation based on measures of effectiveness such as bus travel times, non-bus travel times, stop delay by movement, and overall system delay showed that the algorithm functioned as designed over a wide range of traffic conditions, providing priority only to those buses 5 minutes or more behind schedule, and doing so without disrupting coordination. The simulation also showed that the algorithm can provide priority with only minor impacts to main-street and cross-street non-bus traffic. The hardware-in-the-loop system also allowed the researchers to identify operational problems and constraints that could not have been foreseen during the design of the algorithm, but would otherwise have been evident during the first field implementation.

OTHER APPLICATIONS

Thus far, the simulation system has been used for a number of applications other than the bus priority algorithm development described above. For example, it has been used in the evaluation of alternative diamond interchange control strategies. In this project, hardware-in-the-loop simulation shed new light on the effectiveness of some commonly used operating strategies.

The simulation system has also been used for educational purposes. The graduate-level CVEN 618 (Traffic Engineering: Operations) class at Texas A&M University has used the system to demonstrate the effects of detector placement and controller settings on actuated traffic signal control. In addition, efforts are underway to use the system for training and professional capacity building. The distributed nature of the architecture makes the system particularly suited for distance learning, allowing students to interact remotely with real traffic control hardware. The TransLink® Roadside Equipment Laboratory will also provide other research partners with laboratory testing of control strategies before field implementation. The laboratory research staff will typically code the simulation network and traffic demand, based on information supplied by the sponsoring research partner. This agency can then bring its controller hardware and software to the laboratory, where it is connected to the simulation system through the appropriate controller interface device. The agency then evaluates and optimizes its controller settings in the laboratory, based on the simulation results. The research results can then help improve and expedite field implementation.

CONCLUSIONS

Hardware-in-the-loop traffic simulation is a relatively new technique that holds great promise. It enhances the advantages of traffic simulation by increasing realism and providing access to controller features currently not available in software-only simulation models. Although there are drawbacks to hardware-in-the-loop traffic simulation, the advantages will outweigh the disadvantages in many cases.

The Texas Transportation Institute TransLink® Roadside Equipment Laboratory has developed a hardware-in-the-loop traffic simulation system architecture to address some of the shortcomings of traditional hardware-in-the-loop simulation. An open architecture is used to allow the use of different simulation models, controller interface devices, and controllers. To incorporate a new component into the architecture, only a small part of the system needs to be changed. A client-server approach is used to separate the simulation model from the controller interface. Since communication takes place over a network, the simulation model and controllers need not be at the same location. This feature allows the sharing of resources between organizations. The system architecture provides for real-time control optimization and other applications such as automated controller testing.

The simulation system was evaluated during the development and testing of a real-time bus priority algorithm. During more than 340 hours of simulation the system performed flawlessly. The system allowed an extensive evaluation of the bus priority algorithm and enabled researchers to identify problems that could not have been foreseen during the design of the algorithm.

All indications are that the use of an open, distributed, flexible hardware-in-the-loop traffic simulation system such as the one described in this paper could benefit research organizations, universities, and all organizations that design and maintain traffic signal systems. This environment will advance the state-of-the-practice by allowing consistent evaluation of controller hardware, software, and operating strategies.

It should be noted that, although hardware-in-the-loop traffic simulation may be more realistic than regular traffic simulation, it should never be considered as a substitute for field evaluation of control strategies done in a safe, practical, and cost-effective manner.

ACKNOWLEDGMENTS

The founding partners of the TransLink® Research Center are Texas Transportation Institute, Rockwell International, the U.S. Department of Transportation, the Texas Department of Transportation, and the Metropolitan Transit Authority of Harris County. The support of these organizations, as well as other members and contributors, is gratefully acknowledged. The development of the distributed architecture for hardware-in-the-loop traffic simulation was funded by the U.S. Department of Transportation, Federal Highway Administration. The development of the bus priority algorithm was funded by the Texas A&M ITS Research Center of Excellence.


REFERENCES

1. ITT Systems & Sciences Corporation. CORSIM User's Manual. Version 1.04. FHWA, U.S. Department of Transportation, March 1998.

2. Van Aerde, M. Integration® User's Guide Volume 1: Fundamental Model Features. Blacksburg, VA, July 1998.

3. Husch, D. SimTraffic User's Guide. Trafficware, Berkeley, CA, 1998.

4. NERAC Incorporated. Hardware-in-the-Loop Simulation: Aerospace and Military Systems Applications. Tolland, CT, 1994.

5. Urbanik, T., and S. P. Venglar. Advanced Technology Application: The "SMART" Diamond. In Compendium of Technical Papers, ITE 65th Annual Meeting in Denver, Colorado. ITE, Washington, DC, 1995, pp. 164-168.

6. Bullock, D., and A. Catarella. A Real-Time Simulation Environment for Evaluating Traffic Signal Systems. Paper presented at the 77th Annual Meeting of the Transportation Research Board, Washington, D.C., January 1998.

7. National Electrical Manufacturers Association. NEMA TS 1-1989 Traffic Control Systems. Washington, D.C., 1989.

8. California Department of Transportation. Transportation Electrical Equipment Specifications. Sacramento, CA, 1997.

9. National Electrical Manufacturers Association. NEMA TS 2-1992 Traffic Controller Assemblies. Washington, D.C., 1992.

10. Bullock, D. CORSIM Computer Simulation Interface Equipment. Louisiana State University Remote Sensing and Image Processing Laboratory, Baton Rouge, LA, December 1997.

11. ITT Systems & Sciences Corporation. Traffic Software Integrated System Version 4.2 CORSIM Run-Time Extension. FHWA, U.S. Department of Transportation, March 1998.

12. Koothrappally, J. Optimization of Control Parameters for Isolated and Coordinated Signal Systems. Master's Thesis, Texas A&M University, College Station, TX, 1993.

13. National Electrical Manufacturers Association. NEMA TS 3.5-1998 National Transportation Communications for ITS Protocol (NTCIP) Object Definitions for Actuated Signal Controller Units. Rosslyn, VA, 1996.

14. Pooran, F.J., P.J. Tarnoff, and R. Kalaputapu. RT-TRACS: Development of the Real-Time Control Logic. In Proceedings of the 1996 Annual Meeting of ITS America. ITS America, 1996, pp. 422-430.
