Digital Protection
Editors: Michael Lesk, [email protected]; Martin R. Stytz, [email protected]; Roland L. Trope, [email protected]

Acting Responsibly with Geospatial Data

Given their potential for usefulness, who would have thought that mapping applications such as Google Earth (http://earth.google.com) and Google Maps (http://maps.google.com) could cause trouble, let alone become new flashpoints for international disputes? Taiwan recently complained about being referred to as a province of China in Google Earth,1 and when Google responded by revising the reference, China's mainland media suggested a possible boycott of Google's China service.2 Malaysia and Thailand have raised concerns about the potential ramifications for national security (the maps reveal sensitive infrastructure information), and even India's president, Abdul Kalam, has warned that terrorists could use the application to find targets.3

These nifty mapping applications have prompted apprehensions and policy debates about access to and use of sensitive geospatial data, and the issues are extremely complex yet subtle. Moreover, Google has accentuated the decentralized control of geospatial data by releasing its application programming interface (API), which gives anyone with unrestricted access to the Internet the ability to combine geomapping software and images with other data for new and varied uses. The fear of providing maps or other geospatial data to states, organizations, or people who would deliberately or inadvertently misuse the data drives law enforcement and security agencies to avoid, limit, or track disclosure of such information.

Representatives of the same interests, however, are also pursuing a geospatial data policy of “build once and share or use many times.” How best, then, to balance the ability to obtain and use such information with the need to restrict access? How much geospatial detail must be made accessible? How does an originator of geospatial data know that the data he or she creates and releases contains time-, security-, or privacy-sensitive information? What criteria should we use to categorize geospatial data?

Geospatial data's origins

Geospatial data is simply a "language of the landscape;" it can, for the occurrence of every event, "provide position-based knowledge" (see www.fgdc.gov/publications/documents/geninfo/Beyond_Boundaries_Lo.pdf). It consists of "information that identifies the geographic location and characteristics of natural or constructed features and boundaries on the Earth. This information may be derived from, among other things, remote sensing, mapping, and surveying technologies. Statistical data may be included in this definition at the discretion of the collecting agency."4 To the extent that such data is time-sensitive and focused on operational deployments, movements, and schedules, it can originate from widely available portable technologies.

Statistics and other data are increasingly important because they permit the creation of profiles with myriad uses. Knowing that a street has restaurants, for example, is less useful than knowing the location of the Thai, Indian, or Southwestern restaurants on that street, the passage of a patrol car this evening, or the delay of a targeted individual's arrival at one of those restaurants. The availability of richly informative geospatial data has spawned a geographic information industry with significant exports. Countries are increasingly aware that access to geospatial data is critical in bolstering economic growth and job creation because it spurs innovation.

The military's effort to optimize geospatial data for decision-making prior to and during critical operations has led to the recent establishment of the US National Geospatial-Intelligence Agency (NGA), which has a mandate to provide timely, relevant, and accurate geospatial intelligence in support of national security. When combined with simulation technologies, the use of such data is readily seen in military and homeland security contexts, both for incident-response planning and actual response activities. We should not assume, however, that the military applies a "more data is better" rule.
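Returning to the restaurant example above: a minimal Python sketch, using entirely invented names, coordinates, and timestamps, can make concrete how layering attribute and time-sensitive data onto bare coordinates produces the kind of profile this column describes. The data and identifiers below are hypothetical and illustrative only.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Feature:
    """A bare geospatial feature: a name and a position, nothing more."""
    name: str
    lat: float
    lon: float

# Layer 1: positions only (hypothetical values).
features = [
    Feature("Restaurant A", 40.7421, -73.9911),
    Feature("Restaurant B", 40.7423, -73.9904),
]

# Layer 2: attributes from another source, keyed by feature name.
cuisine = {"Restaurant A": "Thai", "Restaurant B": "Indian"}

# Layer 3: time-sensitive observations from yet another source.
expected_arrival = {"Restaurant A": datetime(2005, 11, 18, 19, 30)}

# Joining the layers yields a profile far more revealing than any single layer.
for f in features:
    profile = {
        "name": f.name,
        "position": (f.lat, f.lon),
        "cuisine": cuisine.get(f.name),
        "expected_arrival": expected_arrival.get(f.name),
    }
    print(profile)
```

The sensitivity in such a mashup comes less from the coordinates themselves than from what the joined layers let an observer infer.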

E. Michael Power, Gowling Lafleur Henderson LLP
Roland L. Trope, Trope and Schramm LLP

Getting the balance right

How big are the security problems surrounding geospatial data? Are the fears about it unfounded? Will the increasingly available technologies for capturing, transmitting, displaying, and interpreting geospatial data ultimately equip terrorists and criminals with data they currently cannot get? In 2004, RAND published a study, Mapping the Risks: Assessing the Homeland Security Implications of Publicly Available Geospatial Information, which indicated that although targets are more easily identified with geospatial data, planning an attack requires more reliable, detailed, and timely information than is usually publicly available. Attackers seek such information from non-geospatial data sources, including (as in the case of an off-shore mining platform cited in the report) an episode of the X-Files and Scuba Diving magazine. In preparing its report, RAND examined 5,000 US government Web sites to assess the nature of publicly available federal geospatial information and its usefulness (for bad guys) and uniqueness. It found that less than 6 percent of federal data sets appeared to have the potential to be useful for a potential attacker, no data sets were critical (that is, necessary for an attack), and less than 1 percent were both potentially useful and unique.

From a security perspective, this seems to be less of a problem than people might think, but "release" decisions about sensitive geospatial data will continue to be made, so decision-makers need a plan for handling and releasing such data responsibly. In May 2004, the Homeland Security Working Group of the US Federal Geographic Data Committee (FGDC) issued, for public comment, its proposed Guidelines for Providing Appropriate Access to Geospatial Data in Response to Security Concerns (www.fgdc.gov/fgdc/homeland/access_guidelines.pdf). As the FGDC explained, "The Guidelines provide procedures to identify sensitive information content of geospatial data sets. Should such content be identified, the Guidelines help organizations decide what access to provide to such data and still protect sensitive information content." The FGDC sought, in the Guidelines, to balance two competing principles: "encouraging access to information and effectively safeguarding information that is truly sensitive" (see www.fgdc.gov/fgdc/homeland/response_to_comments.pdf). In June 2005, after consideration of the submitted comments, the FGDC completed the final Guidelines, and, in August 2005, adopted and issued them (www.fgdc.gov/fgdc/steer/2005/steer062305.html).

The FGDC published the Guidelines chiefly for the benefit of data originators, with the aim of providing such organizations with a set of procedures for identifying sensitive geospatial data and a methodology for making decisions whether to publish such data, the kind and quantity to release thereby, and the extent to which some of the data should be changed. The Guidelines recommend that originators only release sensitive geospatial data that are not a unique source of such information and for which the security risks do not outweigh the benefits of disclosure. Data not meeting these criteria must be safeguarded, which can be accomplished by either changing the data or restricting its release. The Guidelines are based on a series of principles derived from federal and state laws and policies, but they essentially boil down to "right to know" versus "need to secure." The Guidelines contain an extensive 14-step decision tree (see Figure 1). The steps fall into three phases: Who decides to apply safeguards? Does the data need to be safeguarded? What safeguards are appropriate?

Section 2 of the Guidelines goes to the heart of the matter: Is the geospatial data "sensitive" and thus in need of partial or complete safeguarding? It is here that the Guidelines assist in making the necessary determination. Perhaps surprisingly, sensitivity has little to do with accuracy. Where something is located is less important than what can be discerned from the information provided—for example, does the geospatial data somehow permit an observer to discern information about security procedures?

An all-too-common answer in any assessment as to whether sensitive geospatial data should be released is to say, "the potential for trouble is present and therefore the data should not be released." The Guidelines attempt to avert such counterproductive results by introducing a "net benefit" test to encourage the decision-maker to pause and ask, if the data has the potential to be sensitive, should it be released anyway because of its net benefit to society? The decision involves a two-step process, with the first step having security as its focus: would the release of the sensitive geospatial data cause a significant

• increase in the likelihood of attack?
• decrease in the difficulty of executing an attack?
• increase in the damage caused by an attack?

For the second step, the security risk is juxtaposed with potential benefits—namely, whether disclosure of the geospatial data would contribute to

• business or personal productivity;
• public health, public safety, or the government's regulatory functions;
• the support of legal rights and public involvement; or
• the maintenance of a data source where no alternative of equal quality at the same cost exists.

Perhaps the most significant aspect is that incorporating a net benefit test avoids—or at least diminishes the probability of—ill-considered, short-sighted decisions to restrict releases of sensitive geospatial data as a matter of principle. Given the economic importance of geospatial information, the Guidelines provide a valuable tool for data governance in an area that many organizations have not considered yet.
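As a reading aid only, the two-step test just described can be sketched in code. The following Python fragment is our own paraphrase of the Guidelines' criteria; the class and function names are invented, and the booleans stand in for what the Guidelines treat as a judgment of relative magnitude, not a mechanical check.

```python
from dataclasses import dataclass

@dataclass
class ReleaseAssessment:
    # Step 1 (security focus): would release cause a significant...
    increases_attack_likelihood: bool   # ...increase in the likelihood of attack?
    eases_attack_execution: bool        # ...decrease in the difficulty of executing an attack?
    increases_attack_damage: bool       # ...increase in the damage caused by an attack?
    # Step 2 (potential benefits): would disclosure contribute to...
    aids_productivity: bool             # ...business or personal productivity?
    aids_public_health_safety_or_regulation: bool
    supports_legal_rights_or_public_involvement: bool
    sole_source_of_equal_quality_and_cost: bool

def significant_security_risk(a: ReleaseAssessment) -> bool:
    """Step 1: does release pose a significant security risk?"""
    return (a.increases_attack_likelihood
            or a.eases_attack_execution
            or a.increases_attack_damage)

def societal_benefit(a: ReleaseAssessment) -> bool:
    """Step 2: are there benefits to weigh against that risk?"""
    return (a.aids_productivity
            or a.aids_public_health_safety_or_regulation
            or a.supports_legal_rights_or_public_involvement
            or a.sole_source_of_equal_quality_and_cost)

def net_benefit_favors_release(a: ReleaseAssessment) -> bool:
    """Illustrative shortcut: benefits are present and the security factors are not.
    In practice the Guidelines call for weighing magnitudes, not checking flags."""
    return societal_benefit(a) and not significant_security_risk(a)
```

The value of the sketch is only that it fixes the order of the questions: security consequences first, societal benefits second, and a deliberate pause before restricting release as a matter of principle.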

Figure 1. The Guidelines' 14-step decision tree, organized into three sections: Section I, "Is it your decision to apply safeguards to these data?"; Section II, "Do these data need to be safeguarded?"; and Section III, "What safeguards are authorized and justified?" Its numbered decision points and endpoints are: (1) Did your organization originate these data? (2) Follow instructions of originating organization. (3) Document your use of the decision procedure. (4) Are these data useful for selecting specific targets, and/or for planning and executing an attack on a potential target? (5) Is the information unique to these data? (6) Do the security costs outweigh the social benefits of active dissemination of these data? (7) Safeguarding isn't justified. (8) Would the public still be served, and the security risk be mitigated, by changing these data? (9) Do you have the authority to change these data? (10) Change these data (have the sensitivity concerns been addressed by the changes to data?). (11) Do you have the authority to restrict these data? (12) Will the appropriate decision-maker give permission to restrict these data? (13) Decide the extent of the restrictions. (14) Safeguarding isn't authorized. The Guidelines provide a logical decision-making process that should help organizations identify sensitive geospatial data, ensure such data is protected, and make such decisions in a uniform and effective manner. (Figure courtesy of the US Federal Geographic Data Committee.)
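The routing through Figure 1 can likewise be sketched as straight-line code. The wiring and section groupings below are our reading of the flowchart, not an official rendering; the step numbers refer to the figure, and the parameter names are our own.

```python
def fgdc_decision(originated_by_us: bool,
                  useful_for_attack: bool,
                  information_unique: bool,
                  risk_outweighs_benefit: bool,
                  change_serves_public_and_mitigates_risk: bool,
                  authority_to_change: bool,
                  authority_to_restrict: bool,
                  restriction_permitted: bool) -> str:
    # Section I: is it your decision to apply safeguards to these data?
    if not originated_by_us:
        return "Follow instructions of originating organization"   # step 2
    # Step 3: document your use of the decision procedure (assumed throughout).

    # Section II: do these data need to be safeguarded?
    if not useful_for_attack:                                       # step 4
        return "Safeguarding isn't justified"                       # step 7
    if not information_unique:                                      # step 5
        return "Safeguarding isn't justified"
    if not risk_outweighs_benefit:                                  # step 6
        return "Safeguarding isn't justified"

    # Section III: what safeguards are authorized and justified?
    if change_serves_public_and_mitigates_risk and authority_to_change:  # steps 8-9
        return "Change these data, then reassess their sensitivity"      # step 10
    if authority_to_restrict:                                       # step 11
        return "Decide the extent of the restrictions"              # step 13
    if restriction_permitted:                                       # step 12
        return "Decide the extent of the restrictions"
    return "Safeguarding isn't authorized"                          # step 14
```

For example, a call such as fgdc_decision(True, True, True, True, False, False, True, False) routes an originator that cannot usefully change the data but has authority to restrict it to step 13, deciding the extent of the restrictions.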

However, scrutiny of the Guidelines prompts further questions and reveals significant omissions.

The most serious omissions appear in the Guidelines' express assumptions. For example, the Guidelines assume that organizations, particularly those that originate or receive sensitive geospatial data, already have procedures for handling such data internally. We doubt this assumption, if empirically tested, would be true for even half of the commercial organizations in any country. The data breaches reported during 2005 suggest that most companies have yet to develop reliable procedures for enterprise-wide data governance and are slow to consider adopting a lifecycle approach to data governance. With such shortcomings, it is highly improbable that most companies would have developed reliable internal procedures for responsible control of sensitive geospatial data.

The Guidelines also imply that the intended audience—organizations that originate geospatial data and the people who manage or oversee such organizations—understand the potential security risks posed by dissemination of geospatial data. Unfortunately, this assumption begs several crucial questions—for example, because terrorists' and criminals' harmful intentions define the set of targets and thus the subject matter and time-sensitivity of the data, how can commercial firms reliably label sensitive data? How can such firms know when previously sensitive data is now appropriate to release?

Although the Guidelines make some unwarranted assumptions, they offer sound recommendations that, if followed, could help an organization improve its chances of responsibly handling sensitive geospatial data. The Guidelines caution that if a geospatial-data-originating organization has doubts or is uncertain about the "potential security consequences of disseminating geospatial data," it "should seek advice from others including legal counsel, security organizations, and facility operators." Implicit in such a recommendation is the belief that the organization should initiate such consultations proactively. Moreover, organizations that want the full benefits of such consultations must do so early enough for senior decision-makers to issue orders that implement the improved internal procedures, educate personnel, and train and test personnel to verify that enterprise-wide procedures are having the intended beneficial effects.

Second, the Guidelines recommend that originating organizations should "make every effort to learn about the laws and regulations that affect dissemination of their data and should carefully consider the magnitude of the security risk incurred versus the benefits that accrue from the dissemination of any particular data." Few organizations will welcome this recommendation because they are already overwhelmed with efforts to monitor, in multiple jurisdictions throughout the world, the potentially applicable laws concerning the handling of sensitive information such as personal data.

Organizations must recognize that geospatial data can be created to contain highly sensitive data and that responsible handling of such data will not detract from a firm's commercial opportunities—in fact, it could help it avert severe reputational damage. Originating organizations will find that as the data they handle become increasingly sensitive, the procedures for deciding whether to withhold or change such data before their release must be well-established and periodically revised to ensure that organizations handle such data responsibly.

Acknowledgments

The views expressed in this article are solely those of the authors, and do not reflect the views of, and should not be attributed to, the US Military Academy, US Department of Defense, or the US government.

References

1. L. Haines, "Taiwan Huffs and Puffs at Google Earth," The Register, 4 Oct. 2005; www.theregister.co.uk/2005/10/04/taiwan_google_earth/.
2. L. Haines, "Irate Chinese Threaten Google Boycott," The Register, 20 Oct. 2005; www.theregister.co.uk/2005/10/20/china_google_strop/.
3. P. Goodyear, "India Smarts at the Google Earth Imagery," Earthtimes.org, 19 Oct. 2005; www.earthtimes.org/articles/show/4270.html.
4. Section 1, Executive Order 13286, Federal Register, vol. 68, no. 43, 5 Mar. 2003, pp. 10619–10633.

E. Michael Power is a partner in the Ottawa, Canada, office of Gowling Lafleur Henderson LLP, where he provides strategic and legal advice on privacy, information security, electronic commerce, and electronic government. He has a BA, an LLB, and an MBA from Dalhousie University, Canada. Power and Roland L. Trope recently co-authored Sailing in Dangerous Waters: A Director's Guide to Data Governance (American Bar Association, 2005). Contact him at [email protected].

Roland L. Trope is a partner in the New York City office of Trope and Schramm LLP and an adjunct professor in the Department of Law, US Military Academy. He provides strategic and legal advice on mergers and acquisitions, export and defense trade controls, trade sanctions, anti-money laundering, personal data protection, information security, intellectual property, cyberspace law, and defense procurements. Trope has a BA in political science from the University of Southern California, a BA and an MA in English language and literature from Oxford University, and a JD from Yale Law School. Contact him at [email protected].
