Environmental Impact Assessment Review 62 (2017) 195–204


Conceptualizing impact assessment as a learning process

Luis E. Sánchez a,⁎, Ross Mitchell b

a Escola Politécnica, University of São Paulo, Av. Prof. Mello Moraes, 2373, 05508-900 São Paulo, Brazil
b Shell International Exploration & Production BV, Netherlands

ARTICLE INFO

Article history:
Received 11 December 2015
Received in revised form 3 June 2016
Accepted 9 June 2016
Available online 18 June 2016

Keywords: Effectiveness; Environmental assessment; Knowledge management; Organizational learning; Social learning

ABSTRACT

This paper explores how project developers and their consultants, government regulators and stakeholders can learn from the impact assessment (IA) process, thus potentially improving its effectiveness and enhancing project sustainability. Despite the benefits that learning can bring to an organization, failure to learn appears commonplace both within the IA process and, once approved, subsequent industrial development. To nurture organizational learning through IA, enabling structures that foster information sharing and interpretation and enhance organizational memory are needed. In this paper learning outcomes are grouped into three categories: acquisition of knowledge and skills, developing new behaviors and developing sustainability-oriented norms and values. Means to achieve such outcomes include education and training, experiential learning, learning through public participation (social learning) and a 'learning organization approach'. Societal expectations increasingly demand not only projects that 'pass' the review criteria of regulators, financiers and the community, but IA processes capable of delivering sustainable outcomes that include learning and sharing of knowledge. It is proposed that learning be treated as a purposeful – not as an accidental – outcome of IA, and facilitated by adopting a 'learning organization approach' coupled with best practice such as early stakeholder engagement.

© 2016 Elsevier Inc. All rights reserved.

⁎ Corresponding author. E-mail addresses: [email protected] (L.E. Sánchez), [email protected] (R. Mitchell).

http://dx.doi.org/10.1016/j.eiar.2016.06.001
0195-9255/© 2016 Elsevier Inc. All rights reserved.

1. Introduction

The acquisition, interpretation and use of knowledge have always been key ingredients of impact assessment (IA). In assessing the impacts of a proposed development, an interdisciplinary team combines the knowledge, skills and competencies of each team member in order to: (1) identify direct, indirect and cumulative impacts and risks; (2) make predictions on the future state of the environment both with and without the proposed project; (3) assess the significance of impacts, considering the perspectives of affected communities, civil society organizations (CSOs), government agencies and other stakeholders; and (4) make recommendations on effective means to mitigate (avoid, reduce, restore or compensate) harmful impacts and to enhance beneficial ones.

While the importance of knowledge for IA practice is well established, the role that learning plays, or could play, in the IA process remains relatively fresh, albeit with increasing interest from researchers and practitioners. Indeed, recent reflections on the effectiveness of IA - a recurrent theme in the literature - consider knowledge and learning alongside other effectiveness criteria (Bond et al., 2013). Still, learning appears mainly in the 'softer' aspects associated with IA, such as the post-approval phases of development where adaptive management planning is increasingly practiced (Walkerden, 2005). Appraising the evolution of IA, Jacobs et al. (1993) describe the IA process as one of "continuous evaluation, learning, adaptation and feedback" (p. 14), noting that "over the past two decades, developers have 'learned' about environmental impacts, environmentalists have 'learned' about development, governments have 'learned' about consultation, and the art of mitigation has been advancing" (p. 24; emphasis added). If so, many questions remain. How has this learning been integrated into IA practice? Is learning a beneficial side-effect of IA, or can it be explicitly treated as a desired outcome of knowledge sharing or co-creation? Can robust, mutual learning be achieved within a strict IA timeframe among diverse affected communities and other stakeholders, IA specialists and the developer?

Our paper enquires how project developers and their consultants, government regulators and stakeholders can learn throughout the IA process, thus potentially improving its effectiveness, with arguably more sustainable outcomes. The findings are applicable to any form of IA, such as Environmental Impact Assessment (EIA) or Environmental, Social and Health Impact Assessment (ESHIA). While recognizing that learning is relevant to all levels of IA, from policy to project, we focus mostly on IA from a project-level perspective.

Following this introduction, the paper is structured around three questions: (1) who can learn in IA, and for which purpose? (2) what are the possible learning outcomes? and (3) how can learning be achieved? Each question is developed in a separate section (Sections 2–4), following the structure shown in Table 1. After reviewing the



literature on these questions, in Section 5, we examine key conditions that enable learning in IA and, in Section 6, propose learning as a purposeful outcome. Conclusions are presented in the final section.

Table 1. A taxonomy of learning in impact assessment.

Who can learn? [learners]
Categories: all participants in the impact assessment process as individuals, groups or organizations – focus on organizational learning and social learning.
Examples: project developer; consultant team; government regulator; stakeholders (directly and indirectly affected); other individuals and groups (e.g. scientists, media).

What can be learned? [learning outcomes]
Categories: skills and knowledge (equivalent meanings: single-loop learning / instrumental learning / improving performance within existing processes); new behaviors (equivalent meanings: double-loop learning / improving the process / communicative learning); norms and values (equivalent meanings: triple-loop learning / transformative learning).
Examples: increased scientific knowledge; increased capacity to mitigate impacts and enhance benefits; preparing better IA documents (terms of reference, environmental impact statements or reports, environmental and social management plans, etc.); adopting more effective communication strategies; new political strategies to influence government planning and decision-making; recognizing the need for negotiating multiple objectives and trade-offs when using sustainability as a policy goal; developing project alternatives consistent with sustainability objectives; addressing gaps in legislation/regulation that hinder the effectiveness of IA; reaching mutual understanding (≠ agreement) in disputes; sharing of knowledge; sustainability-oriented learning.

How can learning be achieved? [processes to facilitate learning / to deliver learning outcomes]
Categories: formal education; experience; public participation; learning-organization approach.
Examples: training and capacity building; critical reflection (e.g. after-action review technique); early engagement with stakeholders; collaborative learning activities (e.g. joint fact-finding); employing community liaison officers; participatory or community-based monitoring; establishing and maintaining internal structures that facilitate organizational learning (information sharing, information interpretation, organizational memory).

2. Agents and purposes of learning in impact assessment

Learning is an ample concept with several meanings depending on the ontological context (e.g., educational, political, cultural). It is commonly described as some kind of activity or process of gaining knowledge or skill (Merriam-Webster, 2015). In the management field, learning is defined as "increased capacity to take effective action" (Kim, 1993, p. 38). Business dictionaries often treat learning more as an outcome, such as "measurable and relatively permanent change in behavior through experience, instruction, or study" (Business Dictionary, 2015). Thus, learning is goal-oriented: a process of acquiring not only new knowledge and skills, but also new behaviors and values. Although individual learning underpins IA by way of specialized knowledge sharing, collective levels of learning such as group, organizational and social learning are fundamental to its practice and dissemination.

Impact assessment is typically undertaken by consultant firms under contract with a project proponent, and done to conform to applicable regulations and company policies or standards. External expert and public review are also part of the process, while early public engagement is a generally recommended best practice. Thus both group and organizational learning offer possibilities for social learning, which is discussed in this section.

The concept of group and organizational learning (OL) was developed in the managerial sciences as a metaphor to explain the extent to which learning by individuals within organizations is transferred and becomes "embedded in an organization memory and structure" (Kim, 1993, p. 37). Enabling OL in natural resource management is understood to benefit from information, structure and culture (Genskow and Wood, 2011). Huber (1991) lists four 'constructs' of OL: knowledge acquisition, information distribution, information interpretation and organizational memory. For Fitzpatrick (2006), the 'structures' internal to an organization that facilitate learning are similar to the 'constructs' of Huber (1991), and include information sharing, information interpretation and organizational memory. For Gazzola et al. (2011), three internal 'conditions' influence an organization's capacity to learn through IA: cultural (e.g., values and shared beliefs), structural (e.g., the degree of inter-department coordination and collaboration and the approach to information exchange) and behavioral (e.g., routines).

Various hierarchical and other conceptualizations illustrate how learning can occur at various levels or degrees among people within a group or organization. Following Argyris and Schön (1996), the literature distinguishes between single- and double-loop learning. Single-loop learning focuses on matching actions and results - the 'adaptive learning' necessary for the organization to survive; namely, acting to change behavior, or what is commonly referred to as 'change management' (also adaptive management in the context of resource development). When a mismatch between action and outcome is detected, future actions are altered accordingly in order to prevent similar mistakes. In contrast, double-loop learning occurs when serious problems are detected and the organization's norms and values consequently change. Double-loop learning focuses on the actions and the assumptions behind the actions - the 'generative learning' necessary for the organization to thrive. Argyris and Schön (1996) call this "a change in the values of theory-in-use, as well as [change in] strategies and assumptions" (p. 21). Subsequent research introduced the notion of triple-loop learning; namely, a change in assumptions and actions from a normative, moral or ethical sense, and also evolutionary or experiential learning (e.g., Kransdorff, 2006). Triple-loop learning is both normative and transformative, "helping individuals create a shift in personal perceptions through questioning inconsistencies and incongruencies in organizations" (Kransdorff, 2006, p. 177).

This 'loop level' learning in groups or organizations can also be considered within a 'collaborative learning' environment. Among the first to describe collaborative learning, Daniels and Walker (1996)


conceptualized it as a model for effective public participation in natural resource planning and policy-making. The collaborative learning approach "emphasizes activities that encourage systems thinking, joint learning, open communication, and focuses on appropriate change" (p. 81). Since "no single party, agency, organization or discipline holds the key to understanding a particular resource management situation" (p. 75), various participants must learn from one another. Likewise, shifting from single-loop to higher learning levels requires support throughout the organization, from top to bottom and between units.

Social learning is yet another construct, one that has been diversely conceptualized, resulting in some confusion and criticism. Building on Habermas's theory of 'communicative action', Webler et al. (1995) consider social learning as a 'cooperative discourse' in which communities of people with both diverse and common interests can reach agreement on collective action to solve a shared problem. In her review of social learning in research on natural resource management, Rodela (2011) notes three divergent research approaches: (1) an individual-centric perspective (participatory); (2) a network-centric approach (group and organizational); and (3) a systems-centric approach (a social-ecological system moving towards a more sustainable trajectory).

Social learning has also been applied to impact assessment. With a conceptual framework akin to the systems-centric approach described by Rodela (2011), Sinclair et al. (2008) treat the IA process as "an essential element of socio-ecological governance", with potential to enable individual and social learning outcomes, which, in turn, are needed in the "transition to sustainability" (p. 419).
Group, organizational and social learning can be distinguished by asking 'who learns?' or 'what is the purpose of learning?' Extending these questions through his transformational learning theory, Mezirow (1997) posits two domains of adult learning: instrumental and communicative. The former is task-oriented and conducive to problem solving; the latter relates to understanding others' values and purposes. Influenced by Paulo Freire and Jürgen Habermas, Mezirow divided 'transformative learning' into three distinct knowledge types: instrumental, communicative and emancipatory. With the latter type, individuals may change their frames of reference by critically reflecting on their assumptions and beliefs and consciously making and implementing plans that bring about new worldviews (Mezirow, 1997).

Building on the pioneering work of Mezirow, several authors discuss the notion of transformative learning through IA, particularly with respect to public involvement and social learning (e.g., Diduck et al., 2012, 2013). Going further, Sinclair et al. (2008) conceptualize individual and social learning outcomes in IA as transformative when they are consistent with sustainability criteria. These authors position IA as "a powerful force that can shape a person's values, understanding, attitudes and behavior" (p. 418), based on the assumption that "dialogue and critical reflection can promote cognitive development, socio-political empowerment and social change" (p. 420).


3. Learning outcomes in impact assessment

In this section, we outline three kinds of learning outcomes drawn from the literature: knowledge and skills, new behaviors, and norms and values (see Table 1 for examples of each). Other taxonomies of learning outcomes in IA exist, notably the framework of Diduck et al. (2012), who classify learning outcomes under instrumental, communicative, transformative and sustainability-oriented domains. These categories intersect with our proposed classification, although there are some differences. As an example, skills are generally associated with instrumental learning, whereas knowledge is not restricted to practical application but instead relates to cognition and comprehension of phenomena and situations. New behaviors, on the other hand, can arise as a result of a transformational experience, such as a change in worldviews, or, pragmatically, as a means to achieve a desired end, like obtaining approval for a project.

3.1. Acquiring knowledge and skills

Diverse knowledge and skills can be acquired by participants in the IA process. Knowledge created for IA purposes can also spill over to people or groups not involved in IA. Such is the case with new scientific knowledge: examples include finding and describing new species, extending the geographical distribution of known species, new archeological findings, advancing groundwater knowledge and mapping sensitive habitats. One of the authors, involved in an IA for a proposed mine expansion in Suriname, even located a previously unknown community of indigenous migrants, which led to a major change in the project design. Such discoveries are perhaps not surprising, considering IA is supposed to inform decision-making on the basis of sound scientific evidence. Testing this premise is beyond the purpose of this paper, but IA is generally recognized as a driver for advancing science. Still, the opportunity to advance scientific knowledge should be treated as a positive side-effect given that, for proponents, the central purpose of IA is a functional one, focused on obtaining development approval.

Hence, a second and arguably more important learning outcome of IA is how to design and manage projects that mitigate harmful effects (following the mitigation hierarchy of avoid, reduce, remedy or compensate) and enhance benefits, as a possible measure of the substantive effectiveness of the IA process (Cashmore et al., 2004; Chanchitpricha and Bond, 2013). Learning how to better design and manage projects is certainly a key purpose of IA legislation worldwide. The quality of IA-related documents appears to have improved over time in many countries (Glasson et al., 1997; Wende, 2002; Landim and Sánchez, 2012), with more robust mitigation and management measures; while this could be taken as a signal of learning, much room for improvement remains. As an example, few studies have documented improvements in decommissioning or reclamation plans as a result of OL, which is odd considering that projects requiring these plans typically have long lifecycles, thus potentially offering multiple progressive learning opportunities. A case in point: in a recent study of mine closure planning practice in Canada and Australia, notable shortcomings included inadequate social impact assessment, community consultation, analysis of alternative options and monitoring of post-closure activities (Kabir et al., 2015).

Other outcomes have been documented in the literature, such as mutual learning among developers and stakeholders (Sinclair and Diduck, 2001), and CSOs that alter "the media and message of their intervention, to strengthen their message" (Fitzpatrick, 2006, p. 173). Such 'messages' may improve stakeholder performance in the IA process and contribute to the development of new political strategies that influence government planning and decision-making (Diduck et al., 2013, p. 178).

3.2. Developing new behaviors

Many scholars have explored learning in IA by studying behavioral change in organizations, with both positive and negative results. Since this literature is quite broad in scope, selected examples are provided that illustrate concepts discussed in this paper.

Among the first to study OL related to EIA, Hill and Ortolano (1976) and Brendecke and Ortolano (1981) enquired about the effects of the United States National Environmental Policy Act on two government agencies: the Army Corps of Engineers and the Soil Conservation Service. They reported that the interagency and public review required by the Act "resulted in measurable benefits in terms of improved planning and decision-making" (Hill and Ortolano, 1976, p. 1099). New behaviors emerged in both agencies and resulted in decisions such as project modifications and recommendations of different alternatives. Brendecke and Ortolano (1981) found evidence of behavioral change in the Corps of Engineers' adoption of "interdisciplinary planning" and



acceptance of "two-way communication with the public", in contrast with previous planning practices.

In a contrasting case, Shepherd and Ortolano (1997) studied organizational change associated with EIA in the Electricity Generating Authority of Thailand, a developer whose internal support for EIA arose as a means of clearing 'environmental regulatory hurdles' to obtain permitting approval, but did not evolve to accommodate learning. Although the organization became a "nationally recognized leader in EIA work" (p. 353), with its environmental unit evolving from a 'cosmetic' role to a 'project-oriented' division, support from top management faded and the mobilization of internal resources to effect change did not result in a lasting structure that retained lessons learned.

The study of learning in environmental agencies was pioneered by Sánchez-Triana and Ortolano (2001), who researched the Valle del Cauca Corporation in Colombia, where 'learning by imitation' and 'learning by doing' took place as a result of new regulatory requirements. While learning in the EIA process helped the organization to meet its goals, this was not necessarily conducive to better environmental protection outcomes. These findings echo concerns that learning in IA can be 'manipulative' (Morrison-Saunders, 2009), i.e., learning how to manipulate the process to attain particular ends, such as proponents embellishing environmental impact statements to make conclusions more palatable, or opponents learning how to find procedural flaws in order to prompt litigation. There is ample literature on EIAs prepared to justify decisions made without genuine consideration of impacts, sometimes dubbed 'cosmetic' EIAs (Fearnside, 2015).

More recently, Diduck et al. (2013) surveyed learning outcomes associated with IA of two hydropower projects in India and, as mentioned above, organized these into instrumental, communicative, transformative and sustainability-oriented domains.
When local residents were asked what they had learned from their involvement, most reported learning outcomes in the communicative domain, whereas CSO, government and proponent participants generally did not. Among all stakeholder groups, the predominant communicative outcome was insight into others' interests. Overall, however, the highest prevalence of outcomes for all participant groups was in the solutions-oriented instrumental domain. A key theme for local residents was the need for long-term, sustainable benefits; for government respondents, it was protecting the rights of local residents. Only one clear instance of a transformative outcome was found, highlighting the difficulty that learning in IA presents; recognizing the study's limitations, Diduck et al. (2013) assert that time for long-term critical reflection is needed to determine whether transformative outcomes can result.

3.3. Linking learning to sustainability-oriented norms and values

The literature provides documented evidence of learning from IA, particularly instrumental learning. That IA could foster transformative, sustainability-oriented learning is claimed by a number of authors, but unsurprisingly little evidence of actual achievements is available. Learning in the context of strategic environmental assessment was considered by Jha-Thakur et al. (2009), who see learning as playing a role in "transforming individual, professional and organizational norms and practices in support of sustainable development" (p. 134). However, after interviewing practitioners in three European countries, they found evidence only of single-loop learning outcomes: knowledge acquisition, improved comprehension and increased analytical capabilities.
While IA has traditionally been treated as a means of obtaining governmental approval for proposed developments, in many instances a social license may be harder to gain (Boutilier, 2014), which has led many companies – especially in the resources sector – to position IA as a business risk-management tool. Therefore, the need to engage fairly and meaningfully with stakeholders (Mathur et al., 2008) - many of whom aim to prevent the project from going ahead - may lead to mutual or collaborative learning. This third kind of normative learning outcome transcends the previous

and can be associated with triple-loop learning, i.e., the process of asking 'how do we decide what is right?' (Mitchell and Leach, forthcoming). A mutual learning perspective would create constructive conditions for engagement through the less tangible but valuable benefits associated with meaningful dialogue for a freely shared, hence democratic, future. As Young (2000) puts it, "The normative legitimacy of a democratic decision depends on the degree to which those affected by it have been included in the decision-making process and have had the opportunity to influence the outcomes" (pp. 5–6).

Sharing of knowledge can result from collaborative learning in IA. Collaborative learning has the potential to change not only project design, but also future decisions about, for example, what stakeholders want the reclaimed land (post-development) to look like and how it is expected to function decades later. In Canada, as in most countries, few examples can be shown of the incorporation of stakeholder perspectives in environmental decision analyses; rather, an adversarial process and a lack of incentive to understand multiple perspectives have been the more typical pattern (Kiker et al., 2005).

One of the authors worked on a project in 2010 with significant promise for participatory learning to sustainably shape the future: a conceptual reclamation plan in the oil sands region of northern Alberta (Canada). A formal decision process may not only discourage stakeholder participation, but may also bypass opportunities for integrating perspectives, learning about new alternatives and building consensus. In contrast, this conceptual reclamation plan was a break from conventional planning approaches in that regulatory land use direction was not a prerequisite; rather, a sustainability-oriented learning decision analysis process was designed to translate diverse stakeholder desires into a ranking of preferred reclamation options.
Transparency and collaboration were also key elements in the documentation, analysis and reporting. The outcome was a stakeholder-friendly dashboard which would allow users to adjust the weightings of variables to derive alternative reclamation scenarios. Such learning opens up possibilities for a 'learning landscape' whereby all stakeholders can participate in a shared desired outcome built on understanding each other's perspectives and values. In this way, the project employed imagination, creativity and innovation to develop a learning-driven process culminating in a decision-making tool flexible enough to accommodate both stakeholder and professional input.

4. Achieving learning outcomes in impact assessment

In this section, we review selected means to achieve the potential learning outcomes described in Section 3. Again, the literature approaches learning processes from a variety of perspectives. In the conceptual scheme of Diduck et al. (2012), for example, learning is conceived as being triggered by an event and then proceeding through individual and social action, critical self-reflection or reflective discourse in participatory processes. The vast array of learning theories and models (Blackmore, 2007) suggests that choices are inevitable in assessing whether learning occurs in professional or social domains and, if so, under what conditions or among which groups. The four categories of how learning can be achieved from Table 1 are retained here.

4.1. Education and training for impact assessment

Knowledge and skills relevant to IA can be acquired in various ways: by education, training, experience (experiential or tacit knowledge), study or research (scientific knowledge), cultural rules and norms (traditional or generational knowledge), or through the interaction of people with their biophysical environment (local or cultural knowledge).
Building from this premise, at least three main paths exist for acquiring knowledge and skills for IA practice: (1) tertiary education and professional training, including continuous or adult education; (2) experiential learning, whether passed on by other IA specialists or


gained through 'hands-on' experience; and (3) research. Each of these paths offers opportunities for learning.

On the first path, IA professionals come from a range of disciplines, including the natural and social sciences, engineering and the health professions. Formal IA teaching is now widespread, featuring a common set of principles and practices. Universities and colleges in many countries teach some form of IA, especially EIA, in departments such as regional planning, geography, environmental sciences and environmental engineering. Impact assessment education has been reviewed at the country level (e.g., Stelmack et al. (2005) for Canada, Ramos et al. (2008) for Portugal, Sánchez (2010) for Brazil, and Weiland (2012) for Germany), at the European level (Gazzola, 2008; Fischer and Jha-Thakur, 2013), and at the international level (Sánchez and Morrison-Saunders, 2010; Pollack et al., 2015). Still, many IA professionals enter the field without prior formal exposure to the discipline. On-the-job training and capacity building are preferred forms of disseminating IA knowledge, skills and tools. Effective ways of delivering training have been extensively discussed at the annual conferences of the International Association for Impact Assessment (IAIA) and reviewed in the early EIA literature (Bisset and Tomlinson, 1985; Lee, 1988). The second path - experiential learning - is discussed in the next subsection, while learning by researching has a very broad scope and will not be reviewed here.

4.2. Experiential learning

Learning from experience and from seasoned professionals is arguably the backbone of IA. Consolidated knowledge is condensed in several sources, including: (1) textbooks, available in several languages (e.g., Hanna, 2016); (2) reflections by experienced professionals (Greenberg, 2012; Lawrence, 2013); (3) guidance documents, such as the Environmental, Health and Safety Guidelines of the World Bank (www.ifc.org/ehsguidelines) and the International Finance Corporation (IFC) Performance Standards (http://www.ifc.org/performancestandards), or resources made available by professional bodies such as IAIA (http://www.iaia.org/publications-resources); and (4) technical guidance provided by government agencies in some countries (Sánchez and Morrison-Saunders, 2011). Some research critically reviews and, to some extent, validates experiential knowledge, whether about particular types of projects or impacts - for example, Bergström et al. (2014) on offshore wind farms and Leeney et al. (2014) on the impacts of wave energy - or under the wider lens of a retrospective review of a whole sector - for example, the review of the environmental and social impacts of large dams (World Commission on Dams, 2000).

Knowledge and skills 'for' IA are outcomes of instrumental and communicative learning by individuals and groups professionally involved in IA. Still, IA is not the exclusive realm of professionals or researchers. Various forms of knowledge are relevant for IA, ranging from those of a more personal nature (e.g., lay, tacit or implicit knowledge) (Polanyi, 1958, 1997) to those embedded in and interacting with cultural rules and norms, such as traditional (Berkes, 1993; Stevenson, 1996) and local knowledge (Baines et al., 2003; Raymond et al., 2010). The latter is not necessarily 'traditional' from an indigenous or ethnic view, but has evolved "from the interaction of the people's cultural values and social organization with the physical environment in which they dwell" (Baines et al., 2003, p. 26).

As the primary purpose of IA is to influence decision-making towards sustainability, and possibly deliver more sustainable outcomes (Sinclair et al., 2008; Morrison-Saunders and Retief, 2012), it has been argued that IA has potential to be a powerful educational tool. As such, we can reflect on the extent to which diverse decision-makers (developers, regulators, others), affected and interested parties (i.e., stakeholders) and the public at large can learn 'through' the IA process. From a learning perspective within the context of IA, combined with a drive for sustainability in project design, construction and


operations, two essential learning factors discussed below are changing behaviors and transforming organizations.

4.3. Learning through public participation

Public participation in IA is credited with facilitating social learning (e.g., O'Faircheallaigh, 2010; Glucker et al., 2013), and much evidence supports this understanding: for example, Webler et al. (1995) for a waste disposal facility in Switzerland, Chávez and Bernal (2008) on hydroelectric development in Mexico, Saarikoski (2000) on waste management plans in Finland, and Sinclair et al. (2008) in their review of previous studies of different types of projects in Canada. Webler et al. (1995) consider that the criteria of “good public decision-making processes” (p. 444) include fairness, competence and social learning. They conceptualize social learning as both cognitive enhancement and moral development, whereby people can judge right from wrong in a transformative experience associated with participating in the IA process. Having studied the IA process for a waste management facility – a classic ‘not in my backyard’ project – they concluded that “to effectively cope with the tendency for people to want to pursue egoistic aims before collective ones” (p. 460), public participation should foster social learning. Obviously, fulfilling this potential presupposes genuine participatory processes. Fitzpatrick (2006) noted that identifying process strengths, such as the aspects of the IA process that encourage dialogue (as opposed to testimony in public hearings), can facilitate learning. Learning opportunities through public participation in IA are not restricted to the formal channels established by regulations in each jurisdiction: Diduck et al. (2013) found civil society-led opportunities for participation in India to be conducive to learning. Learning arising from public participation can be lasting, as found by Bull et al.
(2008), who studied a consultation process in England and concluded that it “has shifted people's understanding of resource management issues” (p. 713).

Worth noting is that opportunities for social learning have blossomed in the past decade or so. New collaborative technologies blend formal, informal and social approaches; the mix may include webinars, virtual conferencing, video- and photo-sharing, blogging, wikis, chat rooms, virtual worlds and instant messaging. These new approaches may also expand possibilities for the public and developers alike to actively participate in learning forums involving development projects. Such new techniques, with great learning potential, have not been well tested in the IA space but offer much promise.

4.4. Learning organizations

The organizations involved in IA – consultants, regulators, CSOs, developers – may have an interest in engaging in learning for an array of reasons, including improving their efficiency. Learning can be actively pursued in organizations by adopting ‘planned interventions’ (Easterby-Smith and Araujo, 1999). Learning in environmental consultancy firms has been considered by several authors. Based on a case study of a landfill in Brazil, for example, Bond et al. (2010) argue that ‘informal knowledge’ sharing between team members in the consultancy is essential for sustainability-oriented IA. Observing that a ‘typical EIA’ leads to reductionism at the expense of knowledge integration, they recommend a learning organization approach, highlighting the importance of information exchange and knowledge validation. In another example from Brazil, Costanzo and Sánchez (2014) studied eight environmental consultancy firms – from small local firms to branches of multinational consultancies – and mapped their knowledge management practices.
While knowledge socialization practices were employed by most of the consultancies, ‘after-action reviews’ were not conducted to identify lessons learned – a missed opportunity for the firms to learn from both positive and negative experience with the projects they supported.


L.E. Sánchez, R. Mitchell / Environmental Impact Assessment Review 62 (2017) 195–204

Knowledge management in government environmental agencies with regulatory powers for EIA approvals was studied by Sánchez and Morrison-Saunders (2011) in Western Australia and by Sánchez and André (2013) in Quebec. In both cases, staff were asked how knowledge repositories were accessed and maintained, and whether they helped foster information sharing and interpretation and maintain organizational memory. In Western Australia, preparing and updating technical guidance on the basis of acquired experience was a valued means of storing and transferring knowledge obtained from previous assessments. In the Quebec case, an after-action review (Ramalingam, 2006) was used to share learning from experience and develop shared understanding. A ‘post-hoc review’ of IA was also observed by Fitzpatrick (2006) in other Canadian jurisdictions.

Organizational learning within CSOs was also observed by Fitzpatrick (2006), but studies of OL related to IA in proponent organizations seem rare or harder to conduct. In Fitzpatrick's study, for example, corporate actors were ‘underrepresented’ despite efforts to include them. A purposeful learning approach for proponent organizations was devised by Sánchez (2012), whereby data and information obtained through follow-up of projects submitted to the IA process generate a learning opportunity; this, in turn, enables the transformation of post-IA results into useful knowledge for the management of the project itself (single-loop learning) and for the assessment of future projects (double-loop learning).

5. Failures and conditions enabling learning in impact assessment

Despite the potential benefits of learning explored in this paper, situations exist where virtually no learning can be observed or acknowledged by stakeholders. This may be more typical than one would hope. Moreover, there are situations where unlearning may occur, i.e., organizations seem to forget lessons learned.
In this section we discuss some failures to learn and, conversely, some conditions that can enable learning in an IA context.

5.1. Failures to learn from impact assessment

The simple transfer of knowledge that often characterizes the proponent-consultant relationship (Kågström and Richardson, 2015; Gallardo et al., 2016) is not intrinsically conducive to learning. Causes of failure to learn include capacity differences among the various actors in development projects and an institutional inability to ensure that knowledge is adequately retained and shared. For example, in Fitzpatrick's (2006) study, not all actors achieved the same learning outcomes, as different structures were in place to enable OL; Jha-Thakur et al. (2009) found very little evidence of double-loop learning in European case studies; in Colombia, Sánchez-Triana and Ortolano (2001) found that although learning took place within the organization they studied to meet its own goals, it did not spill over to enhance environmental protection for public benefit; and Shepherd and Ortolano (1997) found that learning was not retained in a proponent organization in Thailand.

A common failing is that many organizations simply do not learn through IA, whether by accident (i.e., missed opportunity) or by design (e.g., deciding not to conduct an after-action review because of cost or fear of what might be uncovered). One illustrative example of a proponent organization failing to learn from its involvement in several IAs is the federal Brazilian Department of Transportation Infrastructure. An audit conducted in 2011 by the Federal Accounting Tribunal found the Department's OL capacity to be “severely limited” (Sánchez, 2013, p. 198). Moreover, by outsourcing most environmental assessment and management activities, this organization repeatedly replicated deficiencies across the EIAs of its construction projects.
The audit recommended that after major construction projects are completed, an evaluation report should highlight lessons learned and make recommendations for future assessments and environmental management. In this case, we can hypothesize that the organization was weak in the three requisite

structures for any learning organization: information sharing, information interpretation and organizational memory.

Learning from serious accidents with severe environmental and social damage – such as the 2014 Mount Polley (British Columbia, Canada) and 2015 Mariana (Brazil) tailings dam failures, or the 2010 Deepwater Horizon oil rig blowout in the Gulf of Mexico (United States) – should be an expected outcome for regulators. New or strengthened conditions should be imposed on developers both during the IA process and as part of the daily management and monitoring of these developments. However, the severity of the consequences of ‘not getting it right’ may be worse than ever. As Bowker and Chambers (2015) point out, while fewer tailings storage facility failures occur today, the ones that do occur are much more serious. In their study of failures over the period 1910–2010, 55% of all catastrophic (‘very serious’) failures in 100 years occurred after 1990, and 74% of all failure events after 2000 were ‘serious’ or ‘very serious’. These empirical findings demonstrate that learning from disasters has far to go. Are we learning from such failures by collaborating more to implement better prevention and mitigation measures, i.e., change management (single loop)? Is our awareness of potential failures increasing so that we are collaboratively building upon past experience, reframing problems and then taking that learning forward (double loop)? Or are we continually ‘failing to learn from failure’? The ‘organizational amnesia’ pathology has different causes (Othman and Hasim, 2004), including changes in top management, such as the departure of IA champions, as well as failures to carefully document experience, to generate new policies based on learning from the past, and to engage relevant staff regularly in discussing lessons learned.
Development failures, including those that have presumably gone through rigorous IA processes, suggest that IA is better described as a potential learning process, as the conditions that enable learning need to be better understood. Indeed, Bond et al. (2013) list learning as one of the dimensions of IA effectiveness: besides complying with applicable requirements (procedural effectiveness) and achieving its intended goals (substantive effectiveness), an effective IA is one that enables learning for those participating in the process. In this sense, the desired outcome could be termed normative effectiveness. Consequently, it should be possible to identify enabling conditions for learning through IA, which we now discuss.

5.2. Conditions that enable learning through impact assessment

The key conditions under which the learning potential of IA may be realized are not well understood. They are generally described as including (1) a supportive regulatory environment and (2) facilitative structures, such as robust public participation. Regarding the first condition, the IA regulatory context is understood to play a key role in forcing or convincing proponents to develop projects that demonstrate the best possible adherence to the mitigation hierarchy. More recently, and as a business risk management approach, financial institutions insist that their clients prepare an ESHIA, namely a more robust type of IA, in accordance with the IFC Performance Standards. Under this framework, developers are required to demonstrate management capacity and to adopt an environmental and social management system to mitigate project impacts and risks. An example of the second condition is Fitzpatrick's (2006) survey of learning outcomes in government organizations and CSOs involved in two large projects in northern Canada (a dam and associated transmission lines, and a diamond mine).
The organizations' internal structures identified as important to OL are: information sharing, information interpretation (building shared meaning from experience) and memory (recording and referencing acquired knowledge). This finding supports Argyris and Schön's (1996) single- and double-loop learning theory; namely, the delivery of learning outcomes in IA depends largely on the adequacy of such structures within an organization that enable learning.


Sustainable outcomes are also possible under the right learning conditions. According to Gazzola et al. (2011), in their study of a government agency in Italy involved in strategic environmental assessment, internal cultural, structural and behavioral conditions, as presented in Section 2, also influence an organization's capacity to learn through IA. If properly understood and managed, those conditions could foster sustainability-oriented learning.

An organization may have an incentive to learn if its members acquire knowledge recognized as potentially useful to the organization (Huber, 1991). This is most likely if external pressures are conducive to learning outcomes. According to institutional theory, an organization is subject to ‘coercive, mimetic and normative’ pressures from its institutional environment (DiMaggio and Powell, 1983). This theory has been used to explain the voluntary adoption of ISO 14001 environmental management systems (Zhu et al., 2013; Phan and Baird, 2015) and can help in understanding the extent to which IA can be influential within proponent organizations. Coercion results from “both formal and informal pressures exerted on organizations by other organizations upon which they are dependent” (DiMaggio and Powell, 1983, p. 150), including regulators and financiers. Normative pressures are exerted


by professionals within a field who disseminate information and best practices, while mimetic processes reflect the fact that many organizations model themselves on other organizations, a process influenced by staff turnover, consulting firms and trade associations.

Concerning social learning through public participation, the early provision of information and corresponding engagement with stakeholders are essential to enable mutual learning: the community learns about the project, and the developer learns about stakeholder values and concerns. In addition, Diduck et al. (2012) propose the following key conditions for achieving mutual learning among stakeholders: availability of accurate and complete information, the existence of deliberative forums, and procedural certainty.

6. A learning organization approach as a means to promote learning from impact assessment

In the previous section we reviewed conditions leading to failure or success in learning. We now examine learning as a purposeful result of IA by conceptually contrasting two approaches adopted by developers, dubbed the ‘passive approach’ and the ‘learning organization approach’. The authors' experience suggests that while the former is far more widespread than the latter, many organizations do learn from past experience and are moving towards the latter. This is an analytical typology: it is neither appropriate nor useful to label a particular organization as either ‘passive’ or ‘learning’, as every organization may feature characteristics of both types. Moreover, organizations change over time for several reasons, including new management policies and organizational amnesia.

Passive behavior is understood as a formalist or regulatory approach by which a proponent organization treats the IA process as a ‘must do’ activity required by financing or regulatory agencies. In this minimalist case, the organization sees no value in IA beyond obtaining the necessary approvals; hence, only the minimum is delivered. Examples of such practices include selecting consultants by the lowest bid, showing up at public hearings and other mandatory consultation activities without engaging with stakeholders, or preparing post-approval monitoring reports and immediately shelving them.

In contrast, a learning organization employs best practice such as early engagement with stakeholders and preparing its own internal impact assessments before making decisions and proceeding to the public approval process. Such an organization not only creates and manages knowledge generated from each IA and from the follow-up of previous projects; it also feeds its learning back into the assessment and management of future projects.

Fig. 1 summarizes both approaches along a timeline of key project development and impact assessment activities, from early conception to implementation and follow-up, and illustrates examples of practices conducive to learning.

Fig. 1. A ‘learning organization approach’ in contrast to a passive approach to learning in impact assessment.
However, in order to achieve learning outcomes, the organization needs internal structures capable of effectively sharing and interpreting information while building and maintaining organizational memory. Many practices conducive to learning, such as participatory monitoring, can also be transformative, since working together is transformative for both sides.

7. Conclusion

For many developers, CSOs and other stakeholders, impact assessment is akin to a test one must pass or fail. For a growing number of scholars and practitioners, however, a better descriptor of an effective IA process is one in which sustainability-oriented, participatory learning can be demonstrated. Developers play a central role in IA because they initiate the process. In the past, meeting regulatory requirements for project approval was generally seen as ‘the’ measure of success. Now the bar is being progressively raised: societal expectations increasingly go beyond standard project design and mitigation, which ‘pass’ the review criteria of regulators, financiers and the community, to a process capable of delivering sustainable outcomes. To encourage and achieve the conditions needed for sustainable projects, this paper argues that learning should be treated as purposeful action and designed as an integral component of the IA process, with learning outcomes and targets clearly articulated with stakeholders. Several of the authors cited support similar views (in particular Diduck et al., 2012, 2013; Fitzpatrick, 2006; Gazzola et al., 2011; Sinclair and Diduck, 2001; Sinclair et al., 2008). For organizations to learn from IA, robust enabling learning structures and conditions are essential at a minimum; these must be present not only during the IA itself, but maintained and adapted through all phases of the development, from pre-construction through operation to decommissioning and restoration.
Learning is a multi-faceted process, ideally involving both the holders and the recipients of knowledge throughout an IA process. A project's need for wise decisions and adaptive management following an IA process underscores that robust opportunities for learning should be built into IA from day one. If a successful outcome is desired, then IA practitioners should not only value learning from mistakes or failures (single-loop), but above all appreciate and embed shared understanding in the process. A learning-based approach would differ from a knowledge-based

approach: the latter describes a more traditional, one-sided view of a knowledge holder (e.g., proponent) transferring knowledge to a recipient (e.g., host community), whereas the former would be driven by a spirit of enquiry. Stakeholder engagement should be seen as an opportunity for learning among all stakeholders, not only for the ‘public’ to learn. Learning is a social process in which diverse stakeholders can share a common forum, appreciate each other's values, reflect upon their own and create a shared vision and objectives. Meaningful, transparent dialogue can also help raise awareness, change attitudes and affect behaviors. Yet prevailing practices tend to view stakeholder engagement from a managerial perspective, sometimes from an ethical perspective, less often as a combination of the two and rarely from a social learning perspective. Still, finding mutually acceptable solutions in a learning-driven approach will depend on the institutional framework of the IA process and the willingness of the developer. To what extent can different learning perspectives be considered in a formal IA process? Did genuine learning occur, and, if so, did it lead to sustainable outcomes? More research is needed on these and related questions.

Some caveats bear mentioning. A learning approach may demand unacceptable timeframes from the developer's perspective or be restricted by regulatory requirements. It could also add costs for robust multiple-loop learning and social learning, and multiple stakeholders would likely need to be engaged with greater intensity and openness than in a typical IA process. However, companies striving for a social license to operate, or those that have experienced serious delays in their projects, may well find that the additional investment in learning-based approaches results in smoother IA as well as post-approval phases.

Acknowledgements

We are thankful to an anonymous reviewer who provided insightful and challenging comments.
References

Argyris, C., Schön, D., 1996. Organizational Learning II – Theory, Method and Practice. Addison-Wesley, Reading, Mass.
Baines, J., McClintock, W., Taylor, N., Buckenham, B., 2003. Using local knowledge. In: Becker, H., Vanclay, F. (Eds.), The International Handbook of Social Impact Assessment: Conceptual and Methodological Advances. Edward Elgar, Cheltenham, pp. 26–41.
Bergström, L., Kautsky, L., Malm, T., Rosenberg, R., Wahlberg, M., Åstrand-Capetillo, N., Wilhelmsson, D., 2014. Effects of offshore wind farms on marine wildlife – a generalized impact assessment. Environ. Res. Lett. 9 (3), 034012.
Berkes, F., 1993. Traditional ecological knowledge in perspective. In: Inglis, J.T. (Ed.), Traditional Ecological Knowledge: Concepts and Cases. Canadian Museum of Nature/International Development Research Centre, Ottawa.
Bisset, R., Tomlinson, P., 1985. EIA training courses organized by the Centre for Environmental Management and Planning, University of Aberdeen: an analysis of experience. Environ. Impact Assess. Rev. 5, 279–281.
Blackmore, C., 2007. What kinds of knowledge, knowing and learning are required for addressing resource dilemmas? A theoretical overview. Environ. Sci. Pol. 10, 512–525.
Bond, A., Viegas, C.V., Coelho, C.C.S.R., Selig, P.M., 2010. Informal knowledge processes: the underpinning for sustainability outcomes in EIA? J. Clean. Prod. 18, 6–13.
Bond, A., Morrison-Saunders, A., Howitt, R., 2013. Framework for comparing and evaluating sustainability assessment practices. In: Bond, A., Morrison-Saunders, A., Howitt, R. (Eds.), Sustainability Assessment: Pluralism, Practice and Progress. Routledge, London, pp. 117–131.
Boutilier, R.G., 2014. Frequently asked questions about the social licence to operate. Impact Assess. Proj. Apprais. 32 (4), 263–272.
Bowker, L.N., Chambers, D.M., 2015. The Risk, Public Liability, & Economics of Tailings Storage Facility Failures. Technical Report, Earthworks, July 21.
Brendecke, C., Ortolano, L., 1981.
Environmental considerations in Corps planning. Water Resour. Bull. 17, 248–254.
Bull, R., Petts, J., Evans, J., 2008. Social learning from public engagement: dreaming the impossible? J. Environ. Plan. Manag. 51 (5), 701–716.
Business Dictionary, 2015. Learning. Available at http://www.businessdictionary.com/definition/learning.html#ixzz3oLodnIzF (last accessed Dec. 10, 2015).
Cashmore, M., Gwilliam, R., Morgan, R., Cobb, D., Bond, A., 2004. The interminable issue of effectiveness: substantive purposes, outcomes and research challenges in the advancement of environmental impact assessment theory. Impact Assess. Proj. Apprais. 22 (4), 295–310.

Chanchitpricha, C., Bond, A., 2013. Conceptualising the effectiveness of impact assessment processes. Environ. Impact Assess. Rev. 43, 65–72.
Chávez, B.V., Bernal, A.S., 2008. Planning hydroelectric power plants with the public: a case of organizational and social learning in Mexico. Impact Assess. Proj. Apprais. 26, 163–176.
Costanzo, B.P., Sánchez, L.E., 2014. Gestão do conhecimento em empresas de consultoria ambiental. Production 24 (4), 742–759.
Daniels, S.E., Walker, G.B., 1996. Collaborative learning: improving public deliberation in ecosystem-based management. Environ. Impact Assess. Rev. 16, 71–102.
Diduck, A., Sinclair, A.J., Hostetler, G., Fitzpatrick, P., 2012. Transformative learning theory, public involvement, and natural resource and environmental management. J. Environ. Plan. Manag. 55 (10), 1311–1330.
Diduck, A.P., Pratap, D., Sinclair, A.J., Deane, S., 2013. Perceptions of impacts, public participation, and learning in the planning, assessment and mitigation of two hydroelectric projects in Uttarakhand, India. Land Use Policy 33, 170–182.
DiMaggio, P.J., Powell, W.W., 1983. The iron cage revisited: institutional isomorphism and collective rationality in organizational fields. Am. Sociol. Rev. 48 (2), 147–160.
Easterby-Smith, M., Araujo, L., 1999. Organizational learning: current debates and opportunities. In: Easterby-Smith, M., Burgoyne, J., Araujo, L. (Eds.), Organizational Learning and the Learning Organization. Sage, London, pp. 1–21.
Fearnside, P., 2015. Brazil's São Luiz do Tapajós dam: the art of cosmetic environmental impact assessments. Water Altern. 8 (3), 373–396.
Fischer, T.B., Jha-Thakur, U., 2013. Environmental assessment and management related level degree programmes in the EU: baseline, trends, challenges and opportunities. J. Environ. Assess. Policy Manag. 15, 1350020 (26 pages).
Fitzpatrick, P., 2006.
In it together: organizational learning through participation in environmental assessment. J. Environ. Assess. Policy Manag. 8, 157–182.
Gallardo, A.L.C.F., Aguiar, A.O., Sánchez, L.E., 2016. Linking environmental assessment and management of highway construction in Southeastern Brazil. J. Environ. Assess. Policy Manag. 18 (1), 1650002 (27 pages).
Gazzola, P., 2008. Trends in education in environmental assessment: a comparative analysis of European EA-related Master programmes. Impact Assess. Proj. Apprais. 26, 148–158.
Gazzola, P., Jha-Thakur, U., Kidd, S., Peel, D., Fischer, T., 2011. Enhancing environmental appraisal effectiveness: towards an understanding of internal context conditions in organisational learning. Plann. Theor. Pract. 12 (2), 183–204.
Genskow, K.D., Wood, D.M., 2011. Improving voluntary environmental management programs: facilitating learning and adaptation. Environ. Manag. 47, 907–916.
Glasson, J., Therivel, R., Weston, J., Wilson, E., Frost, R., 1997. EIA – learning from experience: changes in the quality of environmental impact statements for UK planning projects. J. Environ. Plan. Manag. 40 (4), 451–464.
Glucker, A., Driessen, P.P.J., Kolhoff, A., Runhaar, H.A.C., 2013. Public participation in environmental impact assessment: why, who and how? Environ. Impact Assess. Rev. 43, 104–111.
Greenberg, M.R., 2012. The Environmental Impact Statement After Two Generations: Managing Environmental Power. Routledge, Abingdon.
Hanna, K.S. (Ed.), 2016. Environmental Impact Assessment: Practice and Participation, third ed. Oxford University Press, Toronto.
Hill, W.W., Ortolano, L., 1976. Effects of NEPA's review and comment process on water resources planning: results of a survey of planners in the Corps of Engineers and Soil Conservation Service. Water Resour. Res. 12 (6), 1093–1100.
Huber, G.P., 1991. Organizational learning: the contributing processes and the literatures. Organ. Sci. 2 (1), 88–115.
Jacobs, P., Mulvihill, P.R., Sadler, P., 1993.
Environmental assessment: current challenges and future prospects. In: Kennet, S.A. (Ed.), Law and Process in Environmental Management. Canadian Institute of Resources Law, Calgary, pp. 13–17.
Jha-Thakur, U., Gazzola, P., Peel, D., Fischer, T.B., Kidd, S., 2009. Effectiveness of strategic environmental assessment – the significance of learning. Impact Assess. Proj. Apprais. 27 (2), 133–144.
Kabir, S.M.Z., Rabbi, F., Chowdhury, M.B., Akbar, D., 2015. A review of mine closure planning and practice in Canada and Australia. World Rev. Bus. Res. 5 (3), 140–159.
Kågström, M., Richardson, T., 2015. Space for action: how practitioners influence environmental assessment. Environ. Impact Assess. Rev. 54, 110–118.
Kiker, G.A., Bridges, T.S., Varghese, A., Seager, T.P., Linkov, I., 2005. Application of multicriteria decision analysis in environmental decision making. Integr. Environ. Assess. Manag. 1 (2), 95–108.
Kim, D.H., 1993. The link between individual and organizational learning. Sloan Manag. Rev. 35 (1), 37–50.
Kransdorff, A., 2006. Corporate DNA: Using Organizational Memory to Improve Poor Decision-Making. Gower Publishing, Burlington, VT.
Landim, S.N.T., Sánchez, L.E., 2012. The contents and scope of environmental impact statements: how do they evolve over time? Impact Assess. Proj. Apprais. 30 (4), 217–228.
Lawrence, D.P., 2013. Environmental Impact Assessment: Practical Solutions to Recurrent Problems. John Wiley & Sons, Hoboken.
Lee, N., 1988. Training requirements for environmental impact assessment. In: Wathern, P. (Ed.), Environmental Impact Assessment: Theory and Practice. Unwin Hyman, London, pp. 143–158.
Leeney, R.H., Greaves, D., Conley, D., O'Hagan, A.M., 2014. Environmental impact assessments for wave energy developments – learning from existing activities and informing future research priorities. Ocean Coast. Manag. 99, 14–22.
Mathur, V.N., Price, A.D.F., Austin, S., 2008.
Conceptualizing stakeholder engagement in the context of sustainability and its assessment. Constr. Manag. Econ. 26 (6), 601–609.
Merriam-Webster, 2015. Learning. Merriam-Webster, Inc. Available at http://www.merriam-webster.com/dictionary/learning.


Mezirow, J., 1997. Transformative learning: theory to practice. New Direct. Adult Contin. Educ. 74, 5–12.
Mitchell, R.E., Leach, B., 2016. Knowledge integration and collaborative learning: lessons from the mining sector. Environ. Policy Govern. (forthcoming).
Morrison-Saunders, A., 2009. Keynote address: beyond project approval – the learning legacy of EIA. International Association for Impact Assessment South Africa 2009 Annual Conference: Environmental Impact Assessment and Beyond: What Have We Learned? Where Are We Now? Where Are We Going? August, Wilderness, Southern Cape, South Africa, pp. 23–26. Available at http://researchrepository.murdoch.edu.au/28878/1/beyond_project_approval.pdf.
Morrison-Saunders, A., Retief, F., 2012. Walking the sustainability assessment talk – progressing the practice of environmental impact assessment (EIA). Environ. Impact Assess. Rev. 36, 34–41.
O'Faircheallaigh, C., 2010. Public participation and environmental impact assessment: purposes, implications, and lessons for public policy making. Environ. Impact Assess. Rev. 30, 19–27.
Othman, R., Hasim, N.A., 2004. Typologizing organizational amnesia. Learn. Organ. 11 (3), 273–284.
Phan, T.N., Baird, K., 2015. The comprehensiveness of environmental management systems: the influence of institutional pressures and the impact on environmental performance. J. Environ. Manag. 160, 45–66.
Polanyi, M., 1958. Personal Knowledge: Towards a Post-Critical Philosophy. Routledge and Kegan Paul Ltd., London.
Polanyi, M., 1997. Tacit knowledge. In: Prusak, L. (Ed.), Knowledge in Organisations. Butterworth-Heinemann, Boston.
Pollack, K.M., Dannenberg, A.L., Botchwey, N.D., Stone, C.L., Seto, E., 2015. Developing a model curriculum for a university course on health impact assessment in the USA. Impact Assess. Proj. Apprais. 33 (1), 80–85.
Ramalingam, B., 2006. Tools for Knowledge and Learning: A Guide for Development and Humanitarian Organisations. Overseas Development Institute, London.
Ramos, T., Cecílio, T., Melo, J.J., 2008. Environmental impact assessment in higher education and training in Portugal. J. Clean. Prod. 16 (5), 639–645.
Raymond, C.M., Fazey, I., Reed, M.S., Stringer, L., Robinson, G.M., Evely, A.C., 2010. Integrating local and scientific knowledge for environmental management. J. Environ. Manag. 91, 1766–1777.
Rodela, R., 2011. Social learning and natural resource management: the emergence of three research perspectives. Ecol. Soc. 16 (4), 30.
Saarikoski, H., 2000. Environmental impact assessment (EIA) as collaborative learning process. Environ. Impact Assess. Rev. 20, 681–700.
Sánchez, L.E., 2010. Environmental impact assessment teaching at the University of São Paulo: evolving approaches to different needs. J. Environ. Assess. Policy Manag. 12 (3), 245–262.
Sánchez, L.E., 2012. Information and knowledge management in furthering environmental impact assessment. In: Perdicoúlis, A., Durning, B., Palframan, L. (Eds.), Furthering Environmental Impact Assessment: Towards a Seamless Connection between EIA and EMS. Edward Elgar, Cheltenham, pp. 19–38.
Sánchez, L.E., 2013. Development of environmental impact assessment in Brazil. UVP-Report 27 (4+5), 193–200.
Sánchez, L.E., André, P., 2013. Knowledge management in environmental impact assessment agencies: a study in Québec, Canada. J. Environ. Assess. Policy Manag. 15 (3), 1350015 (32 pages).
Sánchez, L.E., Morrison-Saunders, A., 2010. International survey of impact assessment education. Impact Assess. Proj. Apprais. 28 (3), 245–250.
Sánchez, L.E., Morrison-Saunders, A., 2011. Learning about knowledge management for improving environmental impact assessment in a government agency: the Western Australian experience. J. Environ. Manag. 92, 2260–2271.
Sánchez-Triana, E., Ortolano, L., 2001. Organizational learning and environmental impact assessment at Colombia's Cauca Valley Corporation. Environ. Impact Assess. Rev. 21, 223–229.
Shepherd, A., Ortolano, L., 1997. Organizational change and environmental impact assessment at the Electricity Generating Authority of Thailand: 1972–1988. Environ. Impact Assess. Rev. 17, 329–356.
Sinclair, A.J., Diduck, A.P., 2001. Public involvement in environmental assessment in Canada: a transformative learning perspective. Environ. Impact Assess. Rev. 21 (2), 113–136.
Sinclair, A.J., Diduck, A., Fitzpatrick, P., 2008. Conceptualizing learning for sustainability through environmental assessment: critical reflections on 15 years of research. Environ. Impact Assess. Rev. 28, 415–428.
Stelmack, C.M., Sinclair, A.J., Fitzpatrick, P., 2005. An overview of the state of environmental assessment education at Canadian universities. Int. J. Sustain. High. Educ. 6, 36–53.
Stevenson, M.G., 1996. Indigenous knowledge in environmental assessment. Arctic 49 (3), 278–291.
Walkerden, G., 2005. Adaptive management planning projects as conflict resolution processes. Ecol. Soc. 11 (1), 48 (11 pages).
Webler, T., Kastenholz, H., Renn, O., 1995. Public participation in impact assessment: a social learning perspective. Environ. Impact Assess. Rev. 15 (5), 443–463.
Weiland, U., 2012. Teachings on environmental impact assessments in geography reflecting demands of the working practice: a German case study. J. Environ. Assess. Policy Manag. 14, 1250019 (14 pages).
Wende, W., 2002. Evaluation of the effectiveness and quality of environmental impact assessment in the Federal Republic of Germany. Impact Assess. Proj. Apprais. 20 (2), 93–99.
World Commission on Dams, 2000. Dams and Development: A New Framework for Decision-Making. Earthscan, London.
Young, I.M., 2000. Inclusion and Democracy. Oxford University Press, Oxford, UK.



Zhu, Q., Cordeiro, J., Sarkis, J., 2013. Institutional pressures, dynamic capabilities and environmental management systems: investigating the ISO 9000 – environmental management system implementation linkage. J. Environ. Manag. 114, 232–242.

Luis E. Sánchez teaches environmental planning and management at Escola Politécnica, the engineering school of the University of São Paulo, Brazil, where he is Full Professor of Mining Engineering. He holds a Ph.D. in economics of natural resources from the Paris School of Mines, France. He has been a visiting lecturer at the University of Montreal, Canada, and a research fellow at Murdoch University, Australia. His research interests include all forms of impact assessment as well as sustainability in the extractive industries.

Ross Mitchell is a senior social and environmental specialist who has led environmental and industrial projects and advised on resource-based development in many countries. He holds a Ph.D. in Environmental Sociology from the University of Alberta. He currently works for Shell, based in The Hague, where he is responsible for coordinating the early identification, proactive management and integration of social and environmental risks and opportunities in upstream deals, ventures and projects. He has extensive experience in analyzing and integrating social and environmental issues and indicators into sustainable business operations, stakeholder engagement, and policy analysis in both public and private contexts. His specialties include risk assessments, social and environmental impact assessments, public participation, international and community development, and sustainable approaches to resource management.