Applied Sciences Article

Communication Architecture for Grid Integration of Cyber Physical Wind Energy Systems

Mohamed A. Ahmed 1,2 and Chul-Hwan Kim 1,*

1 College of Information and Communication Engineering, Sungkyunkwan University, 2066 Seobu-ro, Jangan-gu, Suwon-si, Gyeonggi-do 16419, Korea; [email protected]
2 Department of Communications and Electronics, Higher Institute of Engineering and Technology-King Marriott, Alexandria 23713, Egypt
* Correspondence: [email protected]; Tel.: +82-31-290-7124

Received: 4 September 2017; Accepted: 8 October 2017; Published: 10 October 2017

Abstract: As we move toward increasing the grid integration of large-scale wind farms (WFs), reliable monitoring, protection, and control are needed to ensure grid stability. WFs are considered to be large and complex cyber physical systems owing to the coupling between the electric power system and information and communication technologies (ICT). In this study, we propose a framework for a cyber physical wind energy system (CPWES), which consists of four layers: a WF power system layer, data acquisition and monitoring layer, communication network layer, and application layer. We performed detailed network modeling for the WF system, including the wind turbines, meteorological mast (met-mast), and substation, based on the IEC 61400-25 and IEC 61850 standards. Network parameters and configuration were based on a real WF (Korean Southwest offshore project). The simulation results of the end-to-end delay were obtained for different WF applications, and they were compared with the timing requirements of the IEEE 1646 standard. The proposed architecture represents a reference model for WF systems, and it can be used to enable the design of future CPWESs.

Keywords: wind farm; communication network; cyber physical wind energy system; grid integration; IEC 61850; IEC 61400-25

1. Introduction

There is a growing interest in increasing the penetration rate of renewable energies, such as wind power, solar energy, and biomass. Among them, great attention has been given to wind energy, and several large-scale wind farm (WF) projects are scheduled to be constructed in the near future. In South Korea, the cumulative installed wind power capacity was about 610 MW and 835 MW by the end of 2014 and 2015, respectively [1]. In September 2014, the Korean government announced a long-term plan for new and renewable energies, with a target of obtaining 5.0% of the total primary energy supply from renewable energies by 2020, and 11.0% by 2035. Wind power will contribute about 18.2% of the total power supplied by new and renewable energies by 2035 [2]. The planned WF projects include Southwest phase 1, Tamra, Hanlim, Daejeong, North-East, Shinchong, and Quiduck. The complete list of WF projects, including site location, total capacity, manufacturer, and number of turbines, is available on the Korea wind power industry association (KWEIA) website [3].

As an increasing number of WFs are integrated into the power grid, communication infrastructure will play an important role in enabling the real-time operation, monitoring, and control of both wind turbines and the electric power grid to ensure grid stability [4]. Communication infrastructure in a WF is considered an example of an industrial network that requires special consideration in network design for the following reasons [5-13]: (1) WFs are built in remote locations (onshore/offshore) where abundant wind resources are available, and these remote sites lack established communication infrastructure or cellular coverage; (2) access to a WF is difficult, especially in the case of offshore

Appl. Sci. 2017, 7, 1034; doi:10.3390/app7101034



sites where the only means of access is by boat or helicopter, and weather conditions may postpone or prevent access to the WF for an unspecified period, so remote monitoring and control are highly required; (3) as advancements in the wind turbine industry enable wind turbines to migrate from shallow water to deep water with taller towers and longer blades, recent studies show an increase in the failure rate of large wind turbines, which impacts their availability and their operation and maintenance costs [14], so condition monitoring systems (CMS), structural health monitoring (SHM), and supervisory control and data acquisition (SCADA) systems are used for real-time monitoring; (4) the monitoring scope of large-scale WFs has expanded to cover the operation status of wind turbines, generated power, meteorological masts (met-masts), and substations, so a considerable amount of data needs to be transferred between WFs and their control centers, which increases the burden on the communication infrastructure.

The main components of a WF are wind turbines, electric systems, a substation, and met-masts. Communication infrastructure is used to connect WF components together using wired/wireless technologies. Owing to the coupling between the physical system and the cyber communication network, a WF is considered to be a cyber physical system.

Considerable research has been conducted to study the WF electric power system. However, information on the corresponding communication networks of the proposed models is limited. Furthermore, few studies have investigated the SCADA system and the underlying communication infrastructure. Moreover, it is difficult to find detailed information about how to design the communication network for a wind energy system, as the designs are the proprietary information of each manufacturer.
The only available published information consists of outline descriptions of the communication infrastructure of a number of real WF projects [4,15-17]. The extent of the research on communication infrastructure and its role in supporting the grid integration of large-scale WFs is insufficient.

In this study, we propose a framework for cyber physical wind energy systems (CPWESs), which consists of four layers: a WF power system layer, data acquisition and monitoring layer, communication network layer, and application layer. We specified and explained the communication architecture of a wind turbine, met-mast, substation, and control center. We modeled the data transmission of the monitoring and protection systems in wind turbines, a met-mast, and a substation based on the IEC 61400-25 and IEC 61850 standards. We considered an actual WF (Southwest offshore project, South Korea) as a case study to evaluate the network topology and configuration. We developed a network simulation platform for the performance evaluation of the proposed architecture. The proposed framework contributes by providing a reference architecture model that can be used for the design and implementation of future CPWESs.

The remainder of this paper is organized as follows. Section 2 presents the proposed cyber physical wind energy system. Section 3 explains WF modeling and assumptions. Section 4 discusses and analyzes the simulation results. Finally, Section 5 concludes and gives directions for future work.

2. Cyber Physical Wind Energy System

The CPWES is considered to be a large and complex system, as it comprises multiple heterogeneous domains that interact with each other. Figure 1 shows the proposed CPWES, which consists of four layers: a WF power system layer, data acquisition and monitoring layer, communication network layer, and application layer.
2.1. WF Power System Layer

This layer represents the WF electric power system, which is defined as a group of wind turbines connected together and tied to the utility through a system of transformers, transmission lines, and substations. A typical wind turbine consists of a wind turbine generator, a step-up transformer, and a circuit breaker. The step-up transformer raises the generation voltage of each wind turbine. Wind turbines are divided into groups, and each group is connected to the collector bus through a circuit breaker. A high-voltage transformer is used to step up the voltage to the transmission level.

2.2. Data Acquisition and Monitoring Layer

The data acquisition layer includes sensor nodes and measurement devices, which are the basic data sources. Sensor nodes and measurement devices are connected to different elements of the WF (inside the wind turbine, at the met-mast, etc.), and their main function is to report measurements, such as temperature, speed, voltage, current, and pressure. The CMS of a wind turbine continuously collects a large volume of data signals in real time. Based on data collected from the CMS, sensors, and other devices, the wind turbine controller (WTC) performs the control operation for the wind turbine (different operation modes).

Figure 1. Proposed architecture of the cyber physical wind energy system. SCADA: supervisory control and data acquisition; CCTV: closed-circuit television; VOIP: voice over internet protocol; IED: intelligent electronic device; WTG: wind turbine generator; WTC: wind turbine controller; HV: high voltage; Met. Mast: meteorological mast.

2.3. Communication Network Layer

The communication infrastructure provides the connection between WF elements, and it is divided into two parts: the communication network inside the wind turbine (turbine area network, TAN) and the network between the wind turbines and the control center (farm area network, FAN). Inside a wind turbine, different communication protocols are used, such as field bus, industrial ethernet protocols, and controller area network (CAN). The architecture of the WF communication network is switch-based, consisting of ethernet switches and communication links in every wind turbine. The network configuration is based on point-to-point communication and a local area network (LAN). In order to link the WF network with a remote control center, different wide area network (WAN) technologies, either wired or wireless, could be configured, such as optical fiber cables, microwaves, and satellites. Each WF has a dedicated connection to a local control center for real-time monitoring and control. However, one control center can remotely manage and control one or more WFs.

2.4. Application Layer

The SCADA systems are used for data acquisition, remote monitoring, real-time control, and data recording. There are multiple applications covered by the SCADA systems in a WF. The three main applications are as follows: the turbine SCADA system provides connectivity among wind turbines and enables remote monitoring and control of each wind turbine and its associated sub-systems; the WF SCADA system connects all devices from all wind turbines, as well as the electric substation, together; and the security SCADA system provides IP telephony services and video surveillance. The SCADA system remotely collects the process information from WF components and stores it in historical databases and servers (historical servers, metering server, meteorological server, etc.). Based on the collected information, the control center executes appropriate actions.
3. CPWES Modeling and Assumptions

In this study, the Southwest offshore wind farm located in South Korea was considered as a case study. We aimed to design the communication infrastructure for phase 1 of the project, which consists of 20 wind turbines with a total capacity of 60 MW, as shown in Figure 2. The output voltage of each turbine was 690 V, stepped up to the collector bus at a typical voltage of 34.5 kV. All turbines were connected to an offshore substation. The spacing between turbines was about 800 m (along the rows and between the rows). The longest cable length was about 1.2 km, between turbine No. 3 and the offshore platform, while the shortest was about 524 m, between turbine No. 1 and the offshore platform [18]. The electric topology configuration consisted of three feeders, one bus, and one HV transformer. We assumed that the communication network topology followed the WF electric topology, where the optical fiber cables are integrated with the submarine cables. The WF communication architecture is shown in Figure 3.

Figure 2. Configuration of the Southwest wind farm project (Test WF).


Figure 3. Proposed OPNET model for the wind farm communication architecture. OPNET: optimized network engineering tool.

3.1. Traffic Model for Wind Turbine Based on IEC 61400-25

In order to model the wind turbine subsystems, we considered the IEC 61400-25 standard, which is an adaptation of IEC 61850 [19]. IEC 61400-25 provides information models for the monitoring and control of wind power plants. We assumed that a wind turbine consists of 10 logical nodes (LNs): WROT, WTRM, WTGEN, WCNV, WNAC, WYAW, WTOW, WTRF, WMET and WFOU, as shown in Equation (1). Each LN represents a wind turbine sub-system, as given in Table 1.

WT_LN = {WROT, WTRM, WTGEN, WCNV, WNAC, WYAW, WTOW, WTRF, WMET, WFOU},    (1)

Different types of sensor nodes and measurement devices were connected to different turbine parts to measure different parameters, such as voltage (V), current (I), wind speed (WdSpd), wind direction (WdDir), temperature (Tmp), humidity (Hum), displacement (Disp), and pressure (Pres). Equation (2) defines the types of sensor nodes (SN_TYPE) inside a wind turbine sub-system.

SN_TYPE ⊆ {V, I, WdSpd, WdDir, Tmp, Hum, Disp, Pres, ...},    (2)

Each sensor node (SN_i) was identified by a sensor identification ID (SN_ID), a sensor type (SN_TYPE), and the physical location inside the wind turbine (WT_LN), as given in Equation (3).

SN_i = {SN_ID, SN_TYPE, WT_LN},    (3)
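Equations (1)-(3) amount to a small data model. The sketch below (Python; all names are ours, for illustration only) shows one way to encode a sensor node as the tuple of Equation (3):

```python
from dataclasses import dataclass

# Equation (1): the 10 logical nodes (LNs) of a wind turbine
WT_LN = {"WROT", "WTRM", "WTGEN", "WCNV", "WNAC",
         "WYAW", "WTOW", "WTRF", "WMET", "WFOU"}

# Equation (2): sensor types found inside the turbine sub-systems
SN_TYPE = {"V", "I", "WdSpd", "WdDir", "Tmp", "Hum", "Disp", "Pres"}

@dataclass(frozen=True)
class SensorNode:
    """Equation (3): a sensor is identified by (ID, type, hosting LN)."""
    sn_id: int
    sn_type: str
    wt_ln: str

    def __post_init__(self):
        # Reject types or locations outside Equations (1) and (2)
        assert self.sn_type in SN_TYPE and self.wt_ln in WT_LN

# Example: a displacement sensor mounted on the tower (WTOW)
disp_sensor = SensorNode(sn_id=7, sn_type="Disp", wt_ln="WTOW")
```

This is only a structural reading of the equations, not part of the authors' simulation model.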


Regarding the data acquisition and monitoring layer, the authors in Ref. [20] designed a data acquisition system to be installed on a real wind turbine. The total number of sensor nodes was about 29. The sensor nodes were divided into two groups with sampling rates of 50 Hz (low speed) and 20 kHz (high speed). In our model, the total number of sensor nodes defined inside a wind turbine was 108 [21]. Figure 4 shows a schematic diagram of the cyber physical model for a wind turbine system, where seven LNs are located at the turbine nacelle and three LNs are at the bottom of the tower. Two data collection units were considered for aggregating the traffic from the different turbine subsystems.

Table 1. Wind turbine logical nodes [18].

LN       Description
WROT     Wind turbine rotor information
WTRM     Wind turbine transmission information
WTGEN    Wind turbine generator information
WCNV     Wind turbine converter information
WTRF     Wind turbine transformer information
WNAC     Wind turbine nacelle information
WYAW     Wind turbine yawing information
WTOW     Wind turbine tower information
WFOU     Wind turbine foundation information
WMET     Wind power plant meteorological information

Figure 4. Schematic diagram of the cyber physical model for the wind turbine system.

To determine the amount of data generated from each sensor node (data rate, SN_DR), we defined the sample size (N_B), the sampling rate (F_S), and the number of channels (N_C), as shown in Equation (4).

SN_DR = N_B × F_S × N_C,    (4)

Taking the displacement sensor as an example, the sensor generates 10 samples/s and the number of channels required for measuring the data is 2. With a sample size of 16 bits, the total amount of data is 320 bits/s (40 bytes/s). Table 2 shows how to calculate the measuring requirements for different sensor nodes. We calculated the sensing data for all 108 sensor nodes inside a wind turbine, and Table 3 shows the traffic configuration at each wind turbine subsystem.


Table 2. Measuring requirements for sensor nodes.

Sensor Type     Sampling Frequency   Sample Size   Number of Channels   Data Rate
Temperature     1 Hz                 16 bits       1                    2 bytes/s
Wind Speed      1 Hz                 16 bits       1                    2 bytes/s
Displacement    10 Hz                16 bits       2                    40 bytes/s
Vibration       200 Hz               16 bits       3                    1200 bytes/s
Voltage         2048 Hz              16 bits       3                    12,288 bytes/s
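Equation (4) and the rows of Table 2 are easy to sanity-check in a few lines (a sketch; the helper name is ours):

```python
def sensor_data_rate(sample_bits: int, fs_hz: int, n_channels: int) -> float:
    """Equation (4): SN_DR = N_B x F_S x N_C, converted to bytes/s."""
    return sample_bits * fs_hz * n_channels / 8

# Reproduce the Data Rate column of Table 2 (16-bit samples throughout)
assert sensor_data_rate(16, 1, 1) == 2          # temperature, wind speed
assert sensor_data_rate(16, 10, 2) == 40        # displacement (320 bits/s)
assert sensor_data_rate(16, 200, 3) == 1200     # vibration
assert sensor_data_rate(16, 2048, 3) == 12288   # voltage
```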

Table 3. Traffic configuration for a wind turbine.

LN       # Sensors   Data Rate
WROT     14          642 bytes/s
WTRM     18          2828 bytes/s
WTGEN    14          73,764 bytes/s
WCNV     14          74,060 bytes/s
WTRF     12          73,740 bytes/s
WNAC     12          112 bytes/s
WYAW     7           220 bytes/s
WTOW     4           8 bytes/s
WFOU     6           1434 bytes/s
WMET     7           228 bytes/s
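Summing the Data Rate column of Table 3 gives a rough estimate of the aggregate sensing traffic generated inside one turbine (the generator, converter, and transformer measurements dominate):

```python
# Per-LN sensing traffic from Table 3, in bytes/s
ln_rates = {
    "WROT": 642, "WTRM": 2828, "WTGEN": 73764, "WCNV": 74060, "WTRF": 73740,
    "WNAC": 112, "WYAW": 220, "WTOW": 8, "WFOU": 1434, "WMET": 228,
}

total_bytes_s = sum(ln_rates.values())    # aggregate for one wind turbine
total_mbit_s = total_bytes_s * 8 / 1e6    # about 1.8 Mbit/s, well below 100 Mbps
print(total_bytes_s, round(total_mbit_s, 2))
```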

3.2. Traffic Model for an Offshore Substation Based on IEC 61850

In this work, a WF is divided into different protection zones: the wind turbine zone, collector feeder zone, collector bus zone, and high-voltage transformer zone. Each zone has one or more protection devices associated with it, such as a wind turbine protection relay, step-up transformer protection relay, feeder protection relay, bus protection relay, and distance relay for the transmission line [22]. We considered three types of IEDs: a breaker (CB) IED, merging unit (MU) IED, and protection and control (P&C) IED [23-26]. Based on IEC 61850, the function of an MU IED is to acquire the voltage and current signals from the field CTs and PTs. The function of a CB IED is to monitor the state and control the operation of the circuit breaker; furthermore, it receives control signals from the P&C IEDs and reports status changes to the protection IEDs. The P&C IED is a universal device that integrates protection and control functionalities and operates at the bay level in the substation.

All acquired signals from the different zones were transmitted to the central protection and control (CPC) system on the offshore substation using point-to-point fiber communication. We considered one CPC system, which is responsible for all protection and control devices in the WF, as shown in Figure 5. In addition, each wind turbine generator (WTG) was configured with a merging unit to acquire current, voltage, and breaker signals for wind turbine protection. Data packets were transmitted over a communication network to a central relaying unit (CRU) for the P&C functions of the whole WF.

For the MU-IEDs, we assumed that the sampling frequency of the voltage and current data was 6400 Hz for a 50 Hz power system, and each sample was represented by 2 bytes [27]. Considering the 3-phase voltage and current measurements, the MU-IEDs send updated values of 76,800 bytes/s to the P&C server at the local control center [25]. Moreover, the CB-IED sends a status value of 16 bytes/s to the P&C server. The details of the process level, bay level, and station level are as follows:

• Process Level: The primary equipment includes merging units (MUs), actuators, and I/O devices.
• Bay Level: The secondary equipment includes different IEDs and protection relays.
• Station Level: Located on the offshore platform, it includes a central protection and control unit, remote terminal units (RTUs), and a human machine interface (HMI).
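The MU-IED figure of 76,800 bytes/s stated above follows directly from the sampling assumptions (a quick check; the variable names are ours):

```python
# Sampled-value stream of one MU-IED (assumptions from the text)
fs = 6400           # samples/s per channel for a 50 Hz power system
sample_bytes = 2    # each sampled value is represented by 2 bytes
channels = 6        # 3-phase voltage + 3-phase current

mu_rate = fs * sample_bytes * channels
assert mu_rate == 76800   # bytes/s sent to the P&C server

cb_rate = 16              # bytes/s of breaker status from the CB-IED
```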


Figure 5. Schematic diagram of the cyber physical model for the wind farm substation.

The communication network was constructed based on an ethernet-based architecture. The offshore substation consisted of three feeders, one bus, and one transformer. The protection devices at the wind turbines, feeders, and bus were each modeled as a subnet consisting of one CB-IED, one MU-IED, one P&C-IED, and one ethernet switch. The step-up transformer was modeled as a subnet consisting of one CB-IED, one MU-IED, two P&C-IEDs, and an ethernet switch. Table 4 shows the IED configuration. The traffic configuration and data flow for the CB-IED and MU-IED considered in our model are given in Table 5. The IEDs and relays were connected through a 100 Mbps ethernet-based architecture. At the station level, there were the following: a station PC, a protection and control server, ethernet switches, and communication links.

Table 4. IED configuration in the wind farm substation. IED: intelligent electronic device.

Zone                     CB IED   MU IED   P&C IED
Wind Turbine             1        1        1
Collector Feeder         1        1        1
Collector Bus            1        1        1
Substation Transformer   2        1        1

Table 5. Traffic configuration for IEDs.

Zone     Measurement                    From     To
CB IED   Breaker Status                 CB IED   Station Server
MU IED   Sampled 3-phase V, I message   MU IED   Station Server


3.3. Traffic Model for Meteorological Mast

We considered a met-mast to measure the wind condition in the WF area. Based on Reference [28], we defined the number, types, and locations of the sensor nodes and measuring instruments installed at the met-mast, as shown in Figure 6. The platform height was 10 m with respect to the mean sea level, and the top of the met-mast was about 97 m. Anemometers were installed at 97 m, 96 m, 86 m, 76 m, 66 m, 56 m, 46 m and 26 m. Wind vanes were installed at 96 m, 76 m, 56 m and 46 m. Barometers and temperature and humidity sensors were installed at 94 m and 13 m. The details of the sensor types and their locations at the met-mast are shown in Table 6. The acoustic doppler current profiler (ADCP) is used to measure the current direction and velocity at sea level.

Figure 6. Schematic diagram of the cyber physical model for the met-mast.

Table 6. Locations, types of sensors, and measurement devices installed at the met-mast.

Type              Level        Sampling Rate   Data Rate
Anemometer        97 m         3 Hz            6 bytes/s
Anemometer        96 m         3 Hz            6 bytes/s
Wind Vane         96 m         3 Hz            6 bytes/s
Temperature       93 m         1 Hz            2 bytes/s
Humidity          93 m         1 Hz            2 bytes/s
Pressure Sensor   93 m         1 Hz            2 bytes/s
Anemometer        93 m         100 Hz          200 bytes/s
Anemometer        86 m         3 Hz            6 bytes/s
Wind Vane         76 m         3 Hz            6 bytes/s
Anemometer        76 m         3 Hz            6 bytes/s
Anemometer        66 m         3 Hz            6 bytes/s
Anemometer        56 m         3 Hz            6 bytes/s
Wind Vane         56 m         3 Hz            6 bytes/s
Anemometer        46 m         3 Hz            6 bytes/s
Wind Vane         46 m         3 Hz            6 bytes/s
Anemometer        26 m         3 Hz            6 bytes/s
Humidity          14 m         1 Hz            2 bytes/s
Temperature       14 m         1 Hz            2 bytes/s
Pressure Sensor   14 m         1 Hz            2 bytes/s
Rain Sensor       10 m         4 Hz            8 bytes/s
ADCP              Foundation   -               150 bytes/s

Appl. Sci. 2017, 7, 1034

3.4. Wind Farm Timing Requirements

In a power system, it is important to meet the latency requirements of the different real-time monitoring and protection applications. Therefore, the WF communication network should ensure the end-to-end delay for data transmission between the wind turbines, the met-mast, the substation, and the control center. The IEC 61400-25 standard does not provide any specific timing requirements for the WF. In this study, we considered latency requirements based on the electric power system. Table 7 shows the communication timing requirements for electric substation automation based on IEEE 1646 [29]. We mapped the WF monitoring and control information as follows: protection information from IEDs was mapped to protection information, analogue measurements and status information were mapped to monitoring and control information, and meteorological data from the met-mast were mapped to operation and maintenance.

Table 7. Wind farm communication requirements based on the IEEE 1646 standard.

Information Types | Delay Requirements
Protection Information (high speed) | 4 ms
Analogue Measurement and Status Information | 16 ms
Meteorological data | 1 s
Audio and Video data stream | 1 s
Image files | 10 s
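The mapping above can be captured as a simple lookup used to judge simulated delays; a sketch (the class names and helper are ours, the budgets come from Table 7):

```python
# IEEE 1646 delay budgets from Table 7, expressed in milliseconds.
# The class names are our shorthand for the mapping described in the text.
IEEE_1646_BUDGET_MS = {
    "protection": 4,                # protection information from IEDs
    "monitoring_control": 16,       # analogue measurement and status information
    "operation_maintenance": 1000,  # meteorological data from the met-mast
}

def meets_deadline(info_class: str, delay_ms: float) -> bool:
    """True if a simulated end-to-end delay fits its IEEE 1646 budget."""
    return delay_ms <= IEEE_1646_BUDGET_MS[info_class]

# Example with delays reported later in Section 4.4:
print(meets_deadline("protection", 8.5))   # 100 Mbps case
print(meets_deadline("protection", 0.9))   # 1 Gbps case
```

This is the acceptance test applied to the simulation results in Section 4: a configuration passes only if every information class stays within its budget.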

4. Simulation Results The WF communication architecture is modeled using the OPNET Modeler [30], as shown in Figure 3. The proposed communication architecture for the Southwest WF consists of 22 subnetworks (20 wind turbines, one met-mast, and one offshore substation) [31]. Upstream data from WF subsystems are transmitted to the local control center at the offshore substation. We simulated different scenarios for WF monitoring, protection, and control, as shown in Table 8. The details of simulation cases are as follows:

• Standalone wind turbine: includes data transmission of the condition monitoring system during different operation modes.
• Wind farm: consists of 20 wind turbines configured in a cascaded architecture.
• Met-mast: includes data transmission of sensors and measurement devices installed at the met-mast.
• Substation: consists of protection and control devices (IEDs) at the wind turbines, feeders, and offshore substation.

Table 8. Summary of simulation cases.

Scale | Level | Options
1 | Standalone Wind Turbine | 10 Mb/s, 100 Mb/s
2 | Wind Farm | 100 Mb/s, 1 Gb/s
3 | Met-Mast | 10 Mb/s, 100 Mb/s
4 | Substation | 10 Mb/s, 100 Mb/s, 1 Gb/s

4.1. Standalone Wind Turbine Results

We considered four operation modes for a wind turbine, namely idle, start-up, power generation, and shutdown, as shown in Figure 7. Based on the wind conditions, the WTC determines the turbine operation mode [8]. We assumed that the control center operator turned on a wind turbine for operation. The WTC first checked the condition of the wind turbine. If no faults were detected in the wind turbine system, the WTC activated operation in the idle mode. During the idle mode, if the average wind speed was greater than the cut-in wind speed for a certain time, the WTC activated the start-up operation mode. During the power production mode, the WTC adjusted the blades' pitch


angle and the position of the nacelle in order to optimize turbine operation. In the case of wind speeds greater than the cut-out speed, the WTC activated the shutdown mode.

Figure 7. Power curve and general operation modes of a typical wind turbine.

We classified the sensor nodes inside the wind turbine into four categories: mechanical measurements (rotor speed, pitch angle, displacement, vibration, etc.), electrical measurements (voltage, current, power, frequency, etc.), meteorological data (wind speed, wind direction, temperature, humidity, etc.), and foundation measurements, as shown in Table 9. The amount of monitoring data forwarded to the WTC differed with the turbine operation mode: some sensor nodes worked continuously, while others generated monitoring data only in particular operation modes.

Table 9. Classification of data types based on different operation modes.

Data Type | Idle Mode | Operation Mode | Shutdown Mode
Status | √ | √ | X
Mechanical | √ | √ | X
Electrical | X | √ | X
Meteorological | √ | √ | X
Foundation | √ | √ | X

The OPNET Modeler was used to simulate data transmission of the condition monitoring system for a wind turbine during three operation modes (shutdown, idle, and operation). The nacelle dimension was configured as 12 m × 4 m. The total number of monitoring parameters and sensors was 108. The height of the tower was configured as 85 m (the distance between the DCU and the main Ethernet switch). Based on the IEC 61400-25 standard, different profiles were defined, configured, and assigned to each LN. The simulation time was 20 min.

Figure 8a shows the total received traffic at the WTC for different operation modes. The maximum received traffic was about 227,036 bytes/s (225,544 AM, 58 SI, and 1434 FOU) during turbine operation, while the received traffic was about 5808 bytes/s during the idle mode. The difference was due to the sensing data of the electrical measurements (voltage, current, power, power factor, and frequency), which amounted to about 221,228 bytes/s. No traffic was received at the WTC while the turbine was in the shutdown mode.

Figure 8b shows the end-to-end delay inside the wind turbine between the sensor nodes and the WTC. The link capacities inside the wind turbine were configured as 10 Mbps and 100 Mbps. During the power production mode, the end-to-end delay was about 1.1 ms and 0.098 ms for link capacities of 10 Mbps and 100 Mbps, respectively.

Figure 8. (a) Total received traffic at WTC for different operation modes; (b) End-to-end delay for the internal communication inside the wind turbine.
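The per-mode totals in Figure 8a follow directly from the category rates quoted in the text and the availability matrix of Table 9. A sketch of that bookkeeping ("other_analogue" is our label for the non-electrical remainder of the 225,544 bytes/s analogue stream):

```python
# Per-category data rates in bytes/s, from Section 4.1.
category_rates = {
    "status": 58,
    "electrical": 221_228,                # voltage, current, power, PF, frequency
    "foundation": 1_434,
    "other_analogue": 225_544 - 221_228,  # remaining analogue measurements
}

# Which categories report in each mode (Table 9); shutdown generates nothing.
active_in = {
    "idle":      {"status", "foundation", "other_analogue"},
    "operation": {"status", "electrical", "foundation", "other_analogue"},
    "shutdown":  set(),
}

def wtc_traffic(mode: str) -> int:
    """Total bytes/s arriving at the WTC in a given operation mode."""
    return sum(r for cat, r in category_rates.items() if cat in active_in[mode])

print(wtc_traffic("operation"), wtc_traffic("idle"), wtc_traffic("shutdown"))
```

This reproduces the 227,036, 5808, and 0 bytes/s reported for the operation, idle, and shutdown modes.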

The received traffic from the pitch angle sensor, wind speed sensor, and tilt sensor was 18 bytes/s, 36 bytes/s, and 40 bytes/s, respectively, as shown in Figure 9a. The measurements were received at the WTC just after the control center operator activated the wind turbine for operation at t = 300 s. For the electrical measurements, the received traffic for the current and voltage was 73,728 bytes/s and 147,456 bytes/s, respectively. Note that the electrical measurements were transmitted to the WTC at t = 420 s, and no data were received during the idle mode, as shown in Figure 9b.

Figure 9. Traffic received at the WTC for different sensors. (a) Wind speed, pitch angle, tilt; (b) Voltage and current.

In the future, wind turbines will integrate more and more sensor nodes and monitoring devices. This will require the communication network to support even higher bandwidth. We therefore studied the end-to-end delay for cases where the number of sensor nodes inside the wind turbine is doubled (case2×) and tripled (case3×). The results in Figure 10 show the received traffic and the end-to-end delay at the WTC during different operation modes. The end-to-end delay increased from 1.1 ms (base case with 108 sensor nodes) to 3.08 ms for case3×. The ETE delay was proportional to the network traffic and the number of sensor nodes inside the wind turbine, as shown in Table 10.

Figure 10. (a) Total received traffic at the WTC for different operation modes; (b) ETE delay for the internal communication inside the wind turbine.

Table 10. ETE delay inside WTs with different numbers of sensor nodes.

Case | # Sensor Nodes | Idle Mode | Operation Mode
Case1× | 108 | 0.53 ms | 1.1 ms
Case2× | 216 | 0.65 ms | 1.77 ms
Case3× | 324 | 0.69 ms | 3.08 ms
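The superlinear growth in Table 10 is what elementary queueing theory predicts as the offered load approaches the link capacity. The following is purely illustrative and is not the paper's OPNET model; the link rate, frame size, and packet rates are assumed values chosen only to show the trend:

```python
# Illustrative M/M/1 delay estimate (NOT the OPNET simulation from the paper).
LINK_BPS = 10_000_000        # assumed 10 Mbps internal turbine link
PACKET_BITS = 8 * 1500       # assumed MTU-sized frames

def mm1_delay_ms(packets_per_s: float) -> float:
    """Mean sojourn time of an M/M/1 queue in front of the link, in ms."""
    service_rate = LINK_BPS / PACKET_BITS    # packets/s the link can serve
    assert packets_per_s < service_rate, "offered load exceeds capacity"
    return 1000.0 / (service_rate - packets_per_s)

BASE_PACKET_RATE = 150.0     # assumed aggregate packet rate for the 108-sensor base case
for factor in (1, 2, 3):     # base case, case2x, case3x
    print(f"case{factor}x: {mm1_delay_ms(BASE_PACKET_RATE * factor):.2f} ms")
```

Doubling and tripling the sensor population raises the estimated delay faster than linearly, matching the qualitative behavior of Table 10.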

4.2. Wind Farm Results

We configured the WF communication network based on data from real large-scale WF projects (Greater Gabbard WF in the UK [16], Horns Rev WF in Denmark [15], and Yeung Heung in South Korea [17]). The network topology was configured as a cascaded architecture. A main Ethernet switch was located on the offshore platform. Dedicated optical fibers connected the main Ethernet switch to the nearest wind turbines, and connectivity between turbines was provided through the Ethernet switches installed at the base of each turbine.

Figure 11 shows the total traffic received at the control center (SCADA server). The traffic received for analogue measurements, status information, foundation measurements, and protection information was 4,510,880 bytes/s (225,544 × 20), 1160 bytes/s (58 × 20), 28,680 bytes/s (1434 × 20), and 1,536,320 bytes/s (76,816 × 20), respectively. Figure 12 shows the average ETE delay of real-time monitoring data for all wind turbine applications. The average ETE delay was about 7.83 ms and 0.78 ms for link capacities of 100 Mbps and 1 Gbps, respectively. For the 20 wind turbines in the WF, Table 11 shows the maximum and minimum ETE delays for SCADA, MU-IED, and CB-IED traffic between the wind turbines and the control center.

Table 11. Simulation results of the ETE delay from the wind turbine zone to the control center.

Wind Turbine Zone | Channel Speed | ETE Delay Min (ms) | ETE Delay Max (ms)
SCADA | 100 Mbps | 11.87 | 13.06
SCADA | 1 Gbps | 1.16 | 1.28
CB-IED | 100 Mbps | 0.56 | 5.40
CB-IED | 1 Gbps | 0.20 | 0.45
MU-IED | 100 Mbps | 9.36 | 11.62
MU-IED | 1 Gbps | 0.53 | 1.13
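The aggregate figures above are simply the per-turbine rates from Section 4.1 scaled by the 20 turbines of the Southwest WF; a quick check:

```python
# Per-turbine data rates (bytes/s) from Section 4.1, scaled to the 20-turbine WF.
PER_TURBINE_BYTES_PER_S = {
    "analogue_measurements": 225_544,
    "status_information": 58,
    "foundation_measurements": 1_434,
    "protection_information": 76_816,
}
N_TURBINES = 20

farm_totals = {k: v * N_TURBINES for k, v in PER_TURBINE_BYTES_PER_S.items()}
for name, total in farm_totals.items():
    print(f"{name}: {total:,} bytes/s")
```

The computed totals match the four values quoted for Figure 11.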

Figure 11. Total received traffic at the SCADA server in the control center (20 wind turbines).

Figure 12. Average ETE delay for the wind farm data (20 wind turbines).

4.3. Met-Mast Results

We assumed that the dimensions of the met-mast platform were 10 m × 8 m. Four data collection units (DCUs) were installed at different levels (90 m, 70 m, 50 m, and 20 m) to collect the measurement data. All sensing data were collected at the measurement PC located on the met-mast platform. Each DCU had a dedicated communication link to the measurement PC so that data reception was maintained in case of faults in the other DCUs. The electric power needed for the measurement equipment and other devices at the met-mast could be supplied from the WF or from photovoltaic panels. Figure 13 shows the received traffic at the measurement PC for pressure, ADCP, humidity, wind speed, and wind direction, which was 150 bytes/s, 400 bytes/s, 4 bytes/s, 48 bytes/s, and 24 bytes/s, respectively. Figure 14a shows the total received data at the measurement PC of about 638 bytes/s, while Figure 14b shows an ETE delay of about 0.54 ms and 0.058 ms for link capacities of 10 Mbps and 100 Mbps, respectively. The amount of data stored at the measurement PC was about 55 MB per day, and the data were transmitted to the met-mast server at the local control center for storage.


Figure 13. Traffic received at the measurement PC for different sensors. (a) Voltage and current; (b) Wind speed, pitch angle, and tilt.

Figure 14. (a) Total received traffic at the measurement PC for different sensors; (b) ETE delay for the met-mast sensors.

In the real system, the data acquisition system at the met-mast uses wireless data transmission (WCDMA) to transmit the collected data from the measurement PC to the remote control center [32]. We assumed that the met-mast was connected to the nearest wind turbine in the WF (WT16). The real-time measured data were stored at the measurement PC and transmitted to the local control center (met-mast server) through the WF communication network. The ETE delay of the met-mast data through the WF network is shown in Figure 15.

Figure 15. ETE delay for the met-mast data (link capacity 100 Mbps and 1 Gbps).

4.4. Substation Automation Results

The communication network for the offshore WF substation is shown in Figure 3. The architecture is based on IEC 61850 with three levels: a process level, a bay level, and a station level [33]. We assumed that all MU-IEDs and CB-IEDs were sending metering values (76,800 bytes/s) and the status of the breaker (16 bytes/s) to the central P&C unit at the station level on the offshore platform. Three scenarios were configured with link capacities of 10, 100, and 1000 Mbps. To validate the results, we compared the amount of traffic generated by the different IEDs with the traffic received at the server. The FTP traffic received at the server was about 416 bytes/s (26 CB-IEDs × 16 bytes/s) for the CB-IEDs and 1,920,000 bytes/s (25 MU-IEDs × 76,800 bytes/s) for the MU-IEDs. Figure 16 shows the traffic received, which agrees with these calculations.
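The server-side traffic check described above is straightforward arithmetic:

```python
# Expected FTP traffic at the central P&C server (device counts and per-device
# rates from Section 4.4).
cb_total = 26 * 16        # 26 CB-IEDs, 16 bytes/s each
mu_total = 25 * 76_800    # 25 MU-IEDs, 76,800 bytes/s each
print(cb_total, mu_total)
```

Both values match the received traffic reported in Figure 16, confirming that no data were lost at 100 Mbps and 1 Gbps.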


Figure 16. Traffic received at the measurement central P&C unit (channel capacity 100 Mbps, 1 Gbps). (a) CB-IEDs; (b) MU-IEDs.


Figure 17 shows the average ETE delay for all protection and control information considering link capacities of 100 Mbps and 1 Gbps. Compared with the timing requirements of the IEEE 1646 standard, the link capacity of 100 Mbps cannot satisfy the delay requirement owing to a high delay of about 8.5 ms. The link capacity of 1 Gbps can satisfy the protection requirement with a delay of less than 1 ms, as shown in Figure 17.

Figure 17. Average ETE delay for P&C information in the WF (channel capacity 100 Mbps, 1 Gbps).

It was observed that a channel capacity of 10 Mbps was not sufficient, as the amount of generated traffic was much higher than the link capacity. Figures 18 and 19 show the worst-case scenario with 10 Mbps, where data loss and higher delays occur. Table 12 shows the ETE delay of the MU-IEDs and CB-IEDs for the feeders (F1, F2, and F3), bus, and transformer with link bandwidths of 100 Mbps and 1 Gbps. Figure 20 shows the ETE delay of the MU-IED and CB-IED at the collector bus.
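Why 10 Mbps fails is clear from a unit conversion (the conversion is ours; the paper reports the resulting loss and delay in Figures 18 and 19): the MU/CB traffic alone exceeds the link rate.

```python
# Offered substation load versus a 10 Mbps link (byte rates from Section 4.4).
offered_bytes_per_s = 25 * 76_800 + 26 * 16    # MU-IEDs + CB-IEDs
offered_mbps = offered_bytes_per_s * 8 / 1e6   # convert to megabits per second

print(f"{offered_mbps:.2f} Mbps offered vs 10 Mbps capacity")
```

With more than 15 Mbps offered to a 10 Mbps link, queues grow without bound, which explains the data loss and high delays observed in the 10 Mbps scenario.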

Figure 18. Traffic received at the measurement central P&C unit (channel capacity 10 Mbps). (a) CB-IEDs; (b) MU-IEDs.

Figure 19. ETE delay for P&C information in the WF (channel capacity 10 Mbps).

Table 12. End-to-end delay of the IEDs at different zones.

Zone | IED Type | ETE Delay (100 Mbps) | ETE Delay (1 Gbps)
Collector Feeder 1 | CB-IED | 1.31 ms | 0.20 ms
Collector Feeder 1 | MU-IED | 14.05 ms | 1.35 ms
Collector Feeder 2 | CB-IED | 1.14 ms | 0.18 ms
Collector Feeder 2 | MU-IED | 13.87 ms | 1.33 ms
Collector Feeder 3 | CB-IED | 1.32 ms | 0.19 ms
Collector Feeder 3 | MU-IED | 13.96 ms | 1.35 ms
Collector Bus | CB-IED | 1.00 ms | 0.16 ms
Collector Bus | MU-IED | 13.77 ms | 1.32 ms
Substation Transformer | CB1-IED | 1.62 ms | 0.32 ms
Substation Transformer | CB2-IED | 0.82 ms | 0.15 ms
Substation Transformer | MU-IED | 13.62 ms | 1.31 ms
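Cross-checking Table 12 against the 4 ms protection budget of Table 7; a sketch with a few representative rows (the delay values are copied from Table 12):

```python
# Selected Table 12 delays (ms) checked against the IEEE 1646 protection budget.
PROTECTION_BUDGET_MS = 4.0

rows = [  # (zone, IED, delay at 100 Mbps, delay at 1 Gbps), from Table 12
    ("Collector Feeder 1", "MU-IED", 14.05, 1.35),
    ("Collector Bus", "CB-IED", 1.00, 0.16),
    ("Substation Transformer", "MU-IED", 13.62, 1.31),
]

for zone, ied, d100, d1g in rows:
    print(zone, ied,
          "100Mbps:", "pass" if d100 <= PROTECTION_BUDGET_MS else "fail",
          "1Gbps:", "pass" if d1g <= PROTECTION_BUDGET_MS else "fail")
```

Only the 1 Gbps configuration keeps all protection-related traffic within the 4 ms budget, consistent with the conclusion drawn from Figure 17.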

Figure 20. ETE delay for the MU-IED and CB-IED at the collector bus (link capacity 100 Mbps, 1 Gbps).

5. Conclusions

In this work, we proposed a framework for the grid integration of a cyber physical wind energy system (CPWES) that consists of four layers: a wind farm power system layer, a data acquisition and monitoring layer, a communication network layer, and an application layer. A real WF project (Korean Southwest, phase 1) was considered as a case study. We developed a communication network model for the WF based on the OPNET Modeler. The network architecture was modeled based on the IEC 61400-25 and IEC 61850 standards. Different scenarios were configured for evaluating the communication network performance at different levels, including data transmission of the condition monitoring system for a standalone wind turbine during different operation modes, the communication network for a real WF consisting of 20 wind turbines, data transmission of the sensors and measurement devices installed at the met-mast, and the communication network for the protection and control devices at the wind turbines, feeders, and offshore substation. The proposed architecture was evaluated with respect to network topology, link capacity, and end-to-end delay for different applications. Simulation results indicated that a channel capacity of 1 Gbps satisfied the requirements of the IEEE 1646 standard for the WF system. This work contributes by providing a reference architecture for modeling, simulating, and evaluating the communication network of a WF, and it can be used for the design and implementation of future CPWESs.

Acknowledgments: This work was supported by the National Research Foundation of Korea under a grant funded by the Korean government (MSIP) (No. 2015R1A2A1A10052459).

Author Contributions: All the authors contributed to publish this paper.

Conflicts of Interest: The authors declare no conflicts of interest.

Appl. Sci. 2017, 7, 1034

References

1. Global Wind Energy Council. Global Wind Statistics 2015. 2016. Available online: http://www.gwec.net/wp-content/uploads/2015/02/GWEC_GlobalWindStats2014_FINAL_10.2.2015.pdf (accessed on 1 August 2017).
2. Park, J. Report on Wind Power Industry 2015, South Korea; Project Officer Wind Energy & High Tech. 2016. Available online: https://www.rvo.nl/sites/default/files/2016/03/Rapport%20Windenergie%20Zuid-Korea.pdf (accessed on 1 August 2017).
3. Korea Wind Power Industry Association (KWEIA). Wind power statistics information. Available online: http://www.kweia.or.kr/ (accessed on 1 August 2017).
4. Yu, F.; Zhang, P.; Xiao, W.; Choudhury, P. Communication systems for grid integration of renewable energy resources. IEEE Netw. 2011, 25, 22–29.
5. Gao, Z.; Geng, J.; Zhang, K.; Dai, Z.; Bai, X.; Peng, M.; Wang, Y. Wind Power Dispatch Supporting Technologies and Its Implementation. IEEE Trans. Smart Grid 2013, 4, 1684–1691.
6. Gallardo-Calles, J.-M.; Colmenar-Santos, A.; Ontañon-Ruiz, J.; Castro-Gil, M. Wind control centres: State of the art. Renew. Energy 2013, 51, 93–100.
7. Singh, B.K.; Coulter, J.; Sayani, M.A.G.; Sami, S.M.; Khalid, M.; Tepe, K.E. Survey on communication architectures for wind energy integration with the smart grid. Int. J. Environ. Stud. 2013, 70, 765–776.
8. Karthikeya, B.R.; Schutt, R.J. Overview of Wind Park Control Strategies. IEEE Trans. Sustain. Energy 2014, 5, 416–422.
9. Alizadeh, S.M.; Ozansoy, C. The role of communications and standardization in wind power applications—A review. Renew. Sustain. Energy Rev. 2016, 54, 944–958.
10. Moness, M.; Moustafa, A.M. A Survey of Cyber-Physical Advances and Challenges of Wind Energy Conversion Systems: Prospects for Internet of Energy. IEEE Internet Things J. 2016, 3, 134–145.
11. Pettener, A.L. SCADA and communication networks for large scale offshore wind power systems. In Proceedings of the IET Conference on Renewable Power Generation (RPG 2011), Edinburgh, UK, 6–8 September 2011; pp. 1–6.
12. Kunzemann, P.; Jacobs, G.; Schelenz, R. Application of CPS within Wind Energy—Current Implementation and Future Potential. In Industrial Internet of Things: Cybermanufacturing Systems; Springer: Cham, Switzerland, 2017; pp. 647–670.
13. Ahmed, M.A.; Kang, Y.-C.; Kim, Y.-C. Modeling and simulation of ICT network architecture for cyber-physical wind energy system. In Proceedings of the 2015 IEEE International Conference on Smart Energy Grid Engineering (SEGE), Oshawa, ON, Canada, 17–19 August 2015; pp. 1–6.
14. Tchakoua, P.; Wamkeue, R.; Ouhrouche, M.; Slaoui-Hasnaoui, F.; Tameghe, T.A.; Ekemb, G. Wind Turbine Condition Monitoring: State-of-the-Art Review, New Trends, and Future Challenges. Energies 2014, 7, 2595–2630.
15. Kristoffersen, J.R.; Christiansen, P. Horns Rev Offshore Windfarm: Its Main Controller and Remote Control System. Wind Eng. 2003, 27, 351–359.
16. Goraj, M.; Epassa, Y.; Midence, R.; Meadows, D. Designing and deploying ethernet networks for offshore wind power applications—A case study. In Proceedings of the 10th IET International Conference on Developments in Power System Protection (DPSP 2010), Manchester, UK, 29 March–1 April 2010; p. 84.
17. Park, J.Y.; Kim, B.J.; Lee, J.K. Development of condition monitoring system with control functions for wind turbines. World Acad. Sci. Eng. Technol. 2011, 5, 269–274.
18. Demonstration Project: Southwest Offshore Wind Park Development; Ministry of Trade, Industry and Energy. 2015. Available online: http://www.motie.go.kr/common/download.do?fid=bbs&bbs_cd_n=6&bbs_seq_n=63153&file_seq_n=2 (accessed on 1 August 2017).
19. IEC 61400-25-2. International Standard, Wind Turbines Part 25-2: Communications for Monitoring and Control of Wind Power Plants—Information Models; International Electrotechnical Commission (IEC): Geneva, Switzerland, 2006.
20. Cruden, A.; Booth, C.; Catterson, Y.M.; Ferguson, D. Designing wind turbine condition monitoring systems suitable for harsh environments. In Proceedings of the 2nd IET Renewable Power Generation Conference (RPG 2013), Beijing, China, 9–11 September 2013; pp. 1–4.
21. Ahmed, M.; Kim, Y.-C. Hierarchical Communication Network Architectures for Offshore Wind Power Farms. Energies 2014, 7, 3420–3437.
22. Cardenas, J.; Muthukrishnan, V.; McGinn, D.; Hunt, R. Wind farm protection using an IEC 61850 process bus architecture. In Proceedings of the 10th IET International Conference on Developments in Power System Protection (DPSP 2010), Managing the Change, Manchester, UK, 29 March–1 April 2010; pp. 1–5.
23. Wei, M.; Chen, Z. Intelligent control on wind farm. In Proceedings of the 2010 IEEE PES Innovative Smart Grid Technologies Conference Europe (ISGT Europe), Gothenburg, Sweden, 11–13 October 2010; pp. 1–6.
24. Golshani, M.; Taylor, G.A.; Pisica, I. Simulation of power system substation communications architecture based on IEC 61850 standard. In Proceedings of the 2014 49th International Universities Power Engineering Conference (UPEC), Cluj-Napoca, Romania, 2–5 September 2014; pp. 1–6.
25. Thomas, M.S.; Ali, I. Reliable, Fast, and Deterministic Substation Communication Network Architecture and its Performance Simulation. IEEE Trans. Power Deliv. 2010, 25, 2364–2370.
26. Sidhu, T.S.; Yin, Y. Modelling and Simulation for Performance Evaluation of IEC61850-Based Substation Communication Systems. IEEE Trans. Power Deliv. 2007, 22, 1482–1489.
27. Wei, M.; Chen, Z. Communication Systems and Study Method for Active Distribution Power Systems. In Proceedings of the 9th Nordic Electricity Distribution and Asset Management Conference, Aalborg, Denmark, 6–7 September 2010.
28. Oh, K.-Y.; Kim, J.-Y.; Lee, J.-K.; Ryu, M.-S.; Lee, J.-S. An assessment of wind energy potential at the demonstration offshore wind farm in Korea. Energy 2012, 46, 555–563.
29. IEEE Std 1646-2004. Standard Communication Delivery Time Performance Requirements for Electric Power Substation Automation. 2005. Available online: http://ieeexplore.ieee.org/servlet/opac?punumber=9645 (accessed on 1 August 2017).
30. OPNET Technologies. OPNET Modeler. Available online: https://www.riverbed.com (accessed on 1 August 2017).
31. Ahmed, M.; Kim, Y.-C. Communication Network Architectures for Southwest Offshore Wind Farm. J. Korean Inst. Commun. Inf. Sci. 2017, 42, 1–10.
32. Ko, S.-W.; Ju, Y.-C.; Jang, M.-S. The Measurement using a Wireless Transmission System with a WCDMA Communication for Metrological Data. J. Int. Counc. Electr. Eng. 2013, 3, 330–333.
33. Zhabelova, G.; Vyatkin, V. Multiagent Smart Grid Automation Architecture Based on IEC 61850/61499 Intelligent Logical Nodes. IEEE Trans. Ind. Electron. 2012, 59, 2351–2362.

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
