Algorithmic Management Software: Exploring the Adverse Effects of Digitally Enabled Freelance Labor
Candidate Number: 82841 Word Count: 8,564 Supervisor: Dr. Carsten Sørensen
ABSTRACT
In the affiliate marketing industry, Algorithmic Management Software (AMS) is leveraged by firms for all tasks associated with staffing – onboarding, oversight, regulation and quality assurance – that pertain to freelance and telecommuter labor; labor that is almost entirely conducted remotely via hyperspecialized information systems. In practice, individuals need only an internet connection, and the qualifications listed by a firm, to begin working and earning a commission-based salary almost instantaneously. A firm realizes the benefit when it can drastically expand its pool of available talent without incurring the overhead costs typically associated with an enlarged labor force, such as human-resource expenditures, larger office accommodations, and additional equipment and furniture. Hyperspecialized information systems can yield an incredibly powerful and sustained competitive advantage for a firm. Yet one-off instances of budget misallocation and ineffectual quality assurance and verification methods, which originate directly from the AMS design logic, can render these systems a highly volatile investment. Just as properly utilized data analytics can help a 'good' firm achieve greatness, so too can a hyperspecialized information system enable a firm to boost revenues and realize other economic benefits, provided the AMS is engineered, implemented and maintained satisfactorily. However, it is my contention that AMS is not well maintained in the face of the increasing amount of fraud in the digital domain. Because fraud is not adequately looked for, the result is over-reliance on, and unquestioning trust in, the value of AMS. Hence my thesis: the detection rate (PD) for fraud is poor, and the amount of fraud going undetected is high and increasing over time.
Keywords: Information System Platforms | Telecommuting | Hyperspecialization | Actor-Network Theory | Algorithmic Management Software | High-Low Quality Leads | Intrusion Detection Systems
LIST OF ACRONYMS
AMS – Algorithmic Management Software
RPA – Robotic Process Automation
IS – Information Systems
IT – Information Technologies
AM – Affiliate Marketing
ANT – Actor-Network Theory
IDS – Intrusion Detection System
QA – Quality Assurance
LIST OF FIGURES & TABLES (Figures & Table in order of appearance)
Figure 1: Algorithmic Management Software (AMS) Overview
Figure 2: Quantitative Dataset Overview
Figure 3: Computation of Detection Rate (PD) and Error Rate (PF)
Figure 4: Countries Represented in Transaction Datasets
Figure 5: Correlational Analysis Decision Tree
Table 1: Proposed Dataset Metrics for Continuing Research
Figure 6: Proposed Actor Network Theory (ANT) Alignment
TABLE OF CONTENTS
ABSTRACT
LIST OF ACRONYMS
LIST OF FIGURES & TABLES
1.) Introduction
2.) Theoretical Overview & Corresponding Literature
3.) Research Objectives & Methodology
  3.1) Research Design
  3.2) Data Collection
4.) Data Analysis Strategies & Methods
  4.1) Correlational Analysis Outcomes
  4.2) Limitations
  4.3) Further Research
5.) Discussion & Conclusion
BIBLIOGRAPHY
APPENDIX
  A.) Sample of Transaction Datasets
  B.) Correlational Analysis of Transaction Datasets
  C.) Quantitative Dataset Overview
  D.) Acknowledgements
Supplement (Raw Data)
1.) Introduction
Independent contractors, colloquially known as hired guns, have been an integral part of the organizational labor force since the late 1980s (Beniger, 1989; Felstead et al., 1999). Boundaries associated with geographic distance have been all but eliminated, thanks in large part to Information System Platforms: any tool, artifact, or other digital, online-enabled technology that challenges traditional "gainfully employed" norms, allowing contractors to work remotely for one or more firms simultaneously from any location on the planet provided there is an internet connection (Marotta-Wurgler, 2011). As a result, contractors or telecommuters – where telecommuting is defined as periodic work out of the principal office one or more days per week (Nilles, 1998) – can end up moving from firm to firm in rapid succession to pursue new assignments and challenges.
At the heart of this 'work-from-home' industry is a smattering of hyperspecialized Information System Platforms. Hyperspecialization refers to having been exclusively designed and developed for a single, extremely specialized, set of functionalities (Schwab, 2016). The five primary classifications of hyperspecialized information systems, observed by Peterson (2005), are: Communication, Lead & Transaction Tracking, Office Automation, Enterprise Resource Planning, and Human Resource Management. In this thesis a single type of hyperspecialized information system platform was studied in depth, rather than attempting to adequately address all of Peterson's (2005) uniquely identified classifications. A Lead & Transaction Tracking information system – more specifically an affiliate marketing-based platform that is entirely automated and regulated via Algorithmic Management Software – was explored in order to shed light on the potential adverse side effects of fraud that appear to stem from ineffectual Algorithmic Management Software alignment and implementation.
"Affiliate marketing is built on a performance-based system where a business rewards one or many associates for their part in driving and securing a sale: 'In affiliate advertising, an online retailer places a link for its business at a host business's site. The host earns a commission whenever a visitor clicks the link and consummates a transaction with the sponsor (Papatla et al., 2002)' (MY401 essay, 2018)".
Algorithmic Management Software (AMS) refers to a digital system that manages a preset cluster of tasks via a set of 'black and white', 'IF statement' rules: for example, if a unique IP address is routed to a landing page as a result of the user clicking a banner advertisement, dispense $X as a commission payment to the freelancer for their efforts in facilitating the transaction. In the case of the AMS being analyzed, this system is responsible for performing the various background screening and reference checks necessary to vet new freelance entity applications, verifying and distributing commission payments to freelancers, as
well as quality assurance and verification of leads in order to ascertain whether they are of low or high quality (Nilles, 1998; Papatla et al., 2002; Matin, 2007). Additionally, AMS will contain an Intrusion Detection System (IDS) security feature. This security feature contains a set of 'white-listed' and 'black-listed' sources, and monitors the activities of all abnormal – 'black-listed' – sources. "The IDS observes the transaction but does not know whether it came from a normal or abnormal source. The goal of the IDS is to classify each transaction as legal or fraudulent, and to give a warning signal to security management in case of a fraudulent activity" (Cavusoglu et al., p.71, 2004).
As part of the quality assurance auditing, transactions will be classified as either low or high quality to a varying degree. This enables firms and affiliates alike to analyze and continually refine the various methods by which they drive sales. "Unlike traditional media, online advertising provides an advertiser with real-time information of an advertisement's efficacy" (Matin, p. 536, 2007). The benefit of this quality assurance feature is undeniable, allowing a firm to adjust its budget allocations instantaneously to maximize high quality transactions while eliminating sources that generate low quality transactions. This continual refinement strategy is one of the key benefits of AMS, and allows all involved parties to drastically improve on the occasionally abysmal 'click-through rates' – otherwise referred to as the quality of a lead. "Some studies suggest that click-through-rates, which measure how often a banner advertisement attracts a response, have fallen as low as 0.36%" (Papatla et al., p.69, 2002). In contrast, affiliate marketing (AM) platforms that successfully implement and achieve sustained refinement strategies can experience high quality conversions 10 to 20 percent of the time (an industry-standard figure based on my personal experience).
Quality leads or transactions, as classified by the AMS, are essentially datasets consisting of information deemed to be of value by a firm. Included are names, addresses, online order preferences, phone numbers, dates of birth, and other pertinent details utilized by a firm in order to market products and eventually generate revenue. Complete datasets with verified, accurate, and up-to-date information yield a greater likelihood of producing tangible sales and are therefore labeled High-Quality Leads; affiliates strive to produce the maximum number of high quality leads given that they fetch a significantly higher price than low-quality leads. In contrast, Low-Quality Leads tend to be incomplete, lacking in accuracy, and have a far lesser chance of generating a sale or producing revenue for the firm (Papatla et al., 2002; Matin, 2007; Dinev et al., 2008).
Due to the ever-increasing amount and complexity of the variables to be managed within the digitally enabled world, an Actor-Network Theory conceptual framework should be taken into consideration for all hyperspecialization-related business decisions. Actor-Network Theory (ANT) was developed in an effort to address the increasingly complex socio-technical modern world. "Hybrids of human and nonhuman elements continue to proliferate, and the boundaries…between human and machine capabilities, are frequently contested and always negotiable" (Walsham, p. 477, 1997). In essence, the ANT offers a viable method for identifying
and understanding the key strengths and weaknesses pertaining to a specific 'actor' within their overarching network [i.e. the pros and cons that originate from the information system platform as a direct result of how it is situated and aligned within the digital AM ecosystem-network – Figure 1]. All of which can contribute to an increasingly effective alignment between business processes and information system platforms (Walsham, 1997).
Increased instances of quality assurance failure – as exemplified by higher overall amounts of low-quality transactions – can arise from Information System-empowered freelance / contractor employees; failures that appear to stem from ineffectual AMS functionality. While in theory instances of work that are solely managed by output and non-human metrics can offer a sustained competitive advantage for a firm, the focus of this thesis concerns the adverse side effects of AMS. Prior to implementing effectual proactive measures, we must first understand all of the implications – beneficial or otherwise – that originate from a business decision to replace aspects of a firm with AMS. This is because the business processes that end up managed entirely by algorithmic software are unable to produce results of the same caliber as those of their human counterparts (Zuboff, 1988; Nilles, 1998; Felstead et al, 1999; Barley et al., 2004; Dinev et al., 2008; Marotta-Wurgler, 2011). Hence, an IDS that yields poor detection rates (PD) – from an AMS quality assurance standpoint – will undoubtedly damage a firm's credibility and reputation. In this thesis I focus on the PD for fraud because this metric is arguably the most significant.
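To make the 'IF statement' rule logic and the IDS warning signal described above concrete, the following is a minimal sketch in Python. It is illustrative only: the field names, the commission amount, and the white-list are hypothetical placeholders, not the platform's actual implementation.

```python
# Minimal sketch of an AMS 'IF statement' rule combined with a basic IDS check.
# Field names, the commission amount, and the white-list are hypothetical.

COMMISSION = 12.50          # $X paid to the affiliate per qualifying click-through
WHITELISTED_SOURCES = {"affiliate-site-01.example", "affiliate-site-02.example"}

def process_click(transaction: dict) -> dict:
    """Apply the rule: if a unique IP reached the landing page via a banner
    click, dispense a commission; otherwise pay nothing."""
    qualifies = (
        transaction["landing_page_reached"]
        and transaction["referrer_type"] == "banner"
        and transaction["ip_is_unique"]
    )
    payout = COMMISSION if qualifies else 0.0

    # Rudimentary IDS step: traffic from an unknown (non-white-listed) source
    # is flagged for security management rather than silently paid out.
    flagged = transaction["source"] not in WHITELISTED_SOURCES
    return {"payout": payout, "flagged_for_review": flagged}

if __name__ == "__main__":
    example = {
        "landing_page_reached": True,
        "referrer_type": "banner",
        "ip_is_unique": True,
        "source": "unknown-site.example",
    }
    print(process_click(example))   # {'payout': 12.5, 'flagged_for_review': True}
```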
Figure 1: Algorithmic Management Software (AMS) Overview. As shown in green the Tracking Software or AMS is the central actor in the company – sales force – customer network.
2.) Theoretical Overview & Corresponding Literature
Upon review of the various articles, journals and miscellaneous literature centered on the use of algorithmically managed information system platforms for hyperspecialized labor, a handful of recurring themes became apparent. These themes identify the numerous benefits that can be derived from an algorithmically managed hyperspecialized platform, as well as the ever-present and exponentially growing problem of click fraud, in addition to other negative ramifications for a firm that extend far past simple monetary damages and losses (Lynne Markus, 1983; Beniger, 1989; Nilles, 1998; Felstead et al., 1999; Barley et al., 2004; Dinev et al., 2008; Malone et al., 2011; Gratton, 2011; Fidock et al., 2012; Schwab, 2016; Willcocks et al., 2016).
Hyperspecialization in the Information Systems field, while still relatively new, is touched on in a handful of publications including the Harvard Business Review (Malone et al., 2011), and in texts by Nilles, Beniger, and Felstead (Nilles, 1998; Beniger, 1989; Felstead et al., 1999). Additionally, the topic of Robotic Process Automation (RPA) is discussed by Willcocks et al. (2016): essentially the process of replacing routine human back-office work with software counterparts. "RPA is a software-based solution, a so-called 'software robot'" (Willcocks et al., p.66, 2016), and if implemented effectively it can yield efficient business processes for a fraction of the cost required to mimic the same outputs via human actors alone. AMS and RPA solutions are fundamentally virtual assistant employees that, when successfully implemented within an organization, can significantly catalyze the productivity and overall outputs produced by their human actor counterparts. Furthermore, Peterson (2005) accurately groups and classifies the various types of work that can now be outsourced via a hyperspecialized information system and helps establish its relevance in the current digital ecosystem (Peterson, 2005; Schwab, 2016; Willcocks et al., 2016).
Malone et al. (2011) outline the benefits derived from the utilization of an Information Systems (IS) platform for hyperspecialized labor, although the theorized drawbacks are only briefly discussed. These include the rise of 'digital sweatshops' where workers are huddled in small windowless rooms, laboring away in front of a computer screen, collecting next to nothing in wages, entirely at the mercy of a large faceless corporation (MY401 essay, 2018). The digital sweatshop appears to mirror the equally appalling conditions that arose during the era of 'homework' in the early 1900s; "for the most part the weekly wages are monstrously small, and the hours of work cruelly long…. In short, the economic conditions are as bad as they could be…." (Felstead et al., p. 88, 1999). Another theorized drawback comes from a freelancer incidentally working on a project with which their moral compass does not align. The perceived benefit derived from 'black-boxing' aspects of a project – thereby facilitating encapsulation – might inadvertently prevent the freelancer from thoroughly understanding the project's end goal; e.g., a pacifist contributes their time and effort to a military endeavor (Malone et al., 2011). Both are technically valid issues that a freelancer may experience when engaged with digitally enabled hyperspecialized labor.
Furthermore, the theoretical secondary and tertiary negative implications for a firm that decides to hyperspecialize aspects of its business processes are not
discussed at length – i.e., financial damage resulting from defamation after admitting publicly to having been defrauded or being the victim of a cyberattack (McManus, 2001; Edelman, 2008).
Key advantages of AMS – a modern form of service automation – are thoroughly explored and discussed by Willcocks et al. (2016): "once software is designed, the marginal cost of producing additional copies is nearly zero, potentially offering huge economies of scale" (Willcocks et al., p. 38, 2016), and by Papatla: "No upfront costs to either the host or sponsor…the number of hosts that a sponsor can use is virtually limitless" (Papatla et al., p. 70, 2002). This feature in essence allows a firm to produce digital copies of a quality assurance management employee for minimal initial cost and zero recurring costs, enabling a firm to rapidly bolster its workforce with freelance and contractor (1099) labor – '1099' referring to the US Internal Revenue Service tax document submitted by employers to contract employees – while simultaneously avoiding the need to incur traditional expenses associated with expanding the workforce. "If configured correctly, the RPA software should do the work better, faster, and much cheaper than the HR specialist" (Willcocks et al., p. 67, 2016). Granted, all the perceived benefits hinge on the 'correct configuration' or successful alignment of the AMS actor within the network; all of which is adequately covered by the relevant literature.
The five primary classifications of hyperspecialized information systems, highlighted by Peterson (2005), are as follows:
– Communication: Skype, WhatsApp, Microsoft Lync and any other IS that allow entities to interface.
– Lead & Transaction Tracking: affiliate marketing and freelancer platforms such as Upwork or Elance that promote the exchange of services between a contractor and client.
– Office Automation: such as the Microsoft suite of products.
– Enterprise Resource Planning: any and all inventory management or merchandise planning platforms.
– Human Resources: any and all IS platforms specifically for payroll tasks, personnel management and so forth.
There exist slightly more detailed classifications according to Nilles (1998) and Felstead (1999), although their focus is centered on Telecommuting, which bears a striking resemblance to hyperspecialized affiliate freelance labor, save for one important aspect: the outputs produced by the affiliate marketing platform are regulated and audited exclusively by AMS. As observed by Felstead: "home-located production often calls forth a degree of self-conscious invention of technologies of the self because many of those who make their living at home do not have established traditions to draw on or culturally prescribed role models to emulate" (Felstead et al, p. 118, 1999). The AMS is designed to help guide those inventive uses of technology in such a way that benefits all involved parties, while simultaneously maintaining all business processes. As shown in Figure 1, the AMS is the key moderating and regulatory actor within the AM network (Walsham, 1997). Teleworkers, argues Nilles, are perpetually "inventing telecommuting for themselves and somehow convincing their supervisors to let them work at home part of the time" (Nilles, preface p. x, 1998). The original work-from-home revolution thrived given the ability for human managers
to occasionally meet with and evaluate outputs face to face. Yet, in the modern landscape consisting almost entirely of AMS-driven quality assurance methods, it is necessary to ascertain the key benefits as well as the negative implications associated with this alternative management approach. AMS attempts to maximize the benefits of hyperspecialized labor – by enabling the mass employment of freelancers, while simultaneously providing quality assurance verification turnover at a faster and far cheaper rate than a human manager – yet it is devoid of the one key characteristic that is vital for long-term success, namely the human manager's ability to continually refine and adapt the quality assurance techniques within their verification process (Nilles, 1998; Beniger, 1989; Felstead et al., 1999; Willcocks et al., 2016).
AMS quality assurance shortcomings can originate from a multitude of sources, both as a naturally occurring byproduct of the AM industry and as a result of intentional manipulation and corruption of the IS – otherwise classified as cybercrime. Cybercrime is either not detected or can be intentionally ignored due to the negative repercussions that immediately follow news of an IS breach. Furthermore, long-term secondary and tertiary adverse side effects can compound, such as "loss of customers that switch to competitors, inability to attract new customers because of perceived poor security…potential future legal liabilities arising out of the breach…" (Cavusoglu et al., p. 67, 2004). Consider the Facebook – Cambridge Analytica scandal, whereby the initial fallout resulting from IS manipulation paled in comparison to the overall long-term damage suffered to Facebook's credibility, which directly resulted in shareholder losses and impending government sanctions. Cavusoglu argues that "security lapses can lead to breach of consumer confidence and trust in addition to lost business and third-party liability" (Cavusoglu et al., p. 67, 2004). Hence, the emphasis of this quantitative study is on IS manipulation and cybercrime that goes undetected by the QA measures and features in an AMS hyperspecialized information system, whereby low-quality leads can be disguised and registered as high-quality leads. This in turn throws off the Pareto equilibrium of the digital marketing ecosystem, and can catalyze negative secondary and tertiary side effects (hypothetical examples would include increased insurance fees and loss of consumer confidence). Matin and Dinev estimate that between 10% and 15% of all 'ad clicks' are fraudulent, resulting in ~$500m – $1b in losses per year, and this amount only continues to grow as additional instances of AMS are leveraged by firms without an adequate implementation and alignment focus (Cavusoglu et al., 2004; Matin, 2007; Dinev et al., 2008).
A number of studies focus on a wide range of benefits that can be derived from the business decision to leverage telecommuting – hyperspecialized labor. A variety of observations pertaining to the benefits of telecommuting are discussed at length: e.g. the elimination of ~3,000 cars from commuting into a hypothetical downtown per day (Nilles, 1998), drastically improving the traffic flow situation. Additionally, there is the retention of energy that would otherwise have been spent on the commute into work, thereby allowing teleworkers to be more active in local functions, "participating in local political & service groups, more frequently" (Nilles, p. 164, 1998). Lastly, Nilles briefly discusses the benefits of effectually aligning a firm with
telecommuters in quantifiable terms. He suggests that cost-to-benefit ratios routinely exceed 1:100 once all the telecommuters are trained and miscellaneous expenses are covered (Nilles, 1998). Yet, does that conversely mean the theoretical disadvantages of telecommuting would result in a benefit of 1 per cost of 100 for any firm that fails to adequately align aspects of their AMS with other core business processes? Consider the vivid example presented by Beniger (1989), which contrasts the outcomes of a stagecoach accident with those of a steam-powered train accident: "stages often overturned or smashed, but for the most part such accidents usually resulted in injuries rather than death and involved only a few people. By contrast, a single train carried hundreds, at speeds up to 30 miles an hour. The railroad disaster, with its potential to kill or maim scores, if not hundreds, held a special terror" (Beniger, p. 223, 1989). Thus, it is vital to keep in mind that the magnitude of the potential negative outcomes associated with an innovative technology tends to scale with the magnitude of its positive outcomes.
The potential adverse side effects of replacing human managers with AMS counterparts have yet to be discussed or extrapolated on sufficiently, and are thus the basis and underpinning for the current quantitative research. As the world continues to embrace and intertwine digital technologies, AMS among them, all the benefits as well as possible pitfalls need to be recognized so that proactive measures can be taken. Effectual proactive measures can offset or entirely prevent adverse effects such as damage to a firm's credibility and overall reputation in the industry.
3.) Research Objectives & Methodology
The objective of this study is to quantify any adverse effects associated with AMS solutions that are implemented without regard for the ANT configuration of a particular firm. To do this, we will identify a measurable and repeatable 'digital fingerprint' that exists within the transaction datasets and that, when integrated with an IDS, will yield an increased detection rate (PD) while simultaneously decreasing the error rate (PF). Based on a scoring system implemented through correlational analysis, the individual transaction datasets will be classified as either legal or fraudulent. In theory, this digital fingerprint could be leveraged by firms that engage with forms of hyperspecialized freelance – contractor labor on a regular basis. This will be accomplished by improving the IDS detection rate (PD) (Cavusoglu et al., 2004) for AMS platforms so as to reduce the number of false negative (i.e., the classification of illegal transactions as legal transactions) and false positive (i.e., the classification of legal transactions as illegal transactions) occurrences. The goal is to delve deeper into how an IDS comes to misclassify transactions – producing false positives and false negatives that go unnoticed (Brooks, 1987; Strong et al., 2010) – and to offer a practical and viable solution for improving overall IDS detection rates (PD).
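Expressed in terms of classification outcomes, the detection rate (PD) and error rate (PF) targeted here can be read off a simple confusion matrix. The sketch below uses hypothetical counts purely to illustrate the definitions implied above: reducing false negatives raises PD, while false positives drive PF.

```python
# Sketch: detection rate (PD) and error rate (PF) from classification counts.
# The counts are hypothetical; PD = share of fraudulent transactions caught,
# PF = share of legal transactions wrongly flagged (false positives).

def rates(true_pos: int, false_neg: int, false_pos: int, true_neg: int):
    pd = true_pos / (true_pos + false_neg)   # detection rate
    pf = false_pos / (false_pos + true_neg)  # error (false-alarm) rate
    return pd, pf

if __name__ == "__main__":
    # e.g. 800 frauds caught, 200 missed; 150 legal transactions flagged, 6,252 passed
    pd, pf = rates(800, 200, 150, 6252)
    print(f"PD = {pd:.2%}, PF = {pf:.2%}")   # PD = 80.00%, PF = 2.34%
```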
3.1) Research Design
This study is focused on the manipulation and stress testing of 7,402 transaction datasets. These datasets were obtained from a variety of former business associates and colleagues in the AM industry who made use of the same Lead & Transaction Tracking information system being researched in this study. This Lead & Transaction Tracking platform has a functionality that allows accounts to share and exchange transaction datasets so long as both accounts are approved members within the system. Given that I worked in the affiliate marketing industry for roughly seven years, my account was granted access to any and all transaction datasets that could be obtained from these prior associate and colleague accounts. All data originated from firms within the United States; and although "the Internet also provides a means of crossing international boundaries…." (Papatla et al., p. 73, 2002), the US is still overwhelmingly represented in the transaction datasets themselves, accounting for ~86% of the total.
The AM platform only stores transaction datasets for the past two calendar years; any datasets that exceed a two-year window from the index date are purged to optimize server allocation and minimize the overall resources associated with long-term data storage. Due to this design limitation, only 7,402 out of a potential 11,937 transaction datasets (~62%) remained from the range theoretically available. These datasets span a multitude of time frames and industries (Figure 2).
A variety of quantitative research analysis methods, including descriptive, correlational, causal-comparative/quasi-experimental, and experimental research, were considered before settling on a correlational analytical approach (Marshall, 1996; Black, 1999; Papatla et al., 2002). Ideally a relationship or link between any of the 16 independent variables that each transaction dataset is comprised of (Appendix A) can be established as a way to confirm the authenticity of the transaction. Therefore, the objective of this quantitative study is to devise an analysis method that can yield the largest detection rate (PD) while simultaneously maintaining a minimal error rate (PF) (Figure 3). The 7,402 total transaction datasets were clustered into random groups of 7 so as to better draw a representative sample from the total population (Marshall, 1996; Myers et al., 2011). In order to visualize a transaction from the perspective of the IDS, the 16 individual data points that make up a transaction dataset (Appendix A) were analyzed in an effort to uncover any patterns or inconsistencies that might be factored into an IDS's detection rate (PD). The aim is to identify a way to generate an inverse relationship between detection rate and error rate of the form PD = t / PF, where the digital fingerprint t acts as the constant of proportionality.
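The clustering of the 7,402 transaction datasets into random groups of seven can be reproduced with a few lines of Python. The sketch below assumes the transactions have already been parsed into a list of records; the loading step and field names are hypothetical.

```python
import random

# Sketch: partition the transaction datasets into random groups of 7 so that
# samples drawn for inspection are spread across the whole population.
# `transactions` is assumed to be the full list of 7,402 parsed datasets.

def cluster_randomly(transactions, group_size=7, seed=42):
    shuffled = transactions[:]              # copy so the source order is preserved
    random.Random(seed).shuffle(shuffled)   # fixed seed keeps the grouping repeatable
    return [shuffled[i:i + group_size] for i in range(0, len(shuffled), group_size)]

# Example with placeholder records:
groups = cluster_randomly([{"transaction_id": i} for i in range(7402)])
print(len(groups), len(groups[0]))   # 1058 groups, 7 records in the first
```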
Figure 2: Quantitative Dataset Overview. Shown here are the total number of datasets available, datasets analyzed, pertinent dataset characteristics, and the industries with the highest representation.
"Assume that probability distributions for legal and fraudulent traffic are fL(x) and fF(x) respectively and normally distributed (Figure 3). Given xL and xF, PD and PF can be computed numerically for various values of t…as t decreases, both PD and PF increase" (Cavusoglu et al., p.72, 2004). Therefore, any identifiable and repeatable value or correlation value derived from the transaction data points – one that can be assigned to t and results in an increase in PD while simultaneously decreasing PF – can be leveraged as a viable digital fingerprint for further research to extrapolate on. The following is a high-level overview of the analysis steps to be taken; the specific analysis methodology is discussed in depth in the next section.
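The computation quoted from Cavusoglu et al. can be reproduced numerically. The sketch below assumes illustrative (not fitted) means and standard deviations for the legal and fraudulent score distributions fL and fF; with these placeholder parameters it simply demonstrates that lowering the threshold t raises both PD and PF.

```python
from scipy.stats import norm

# Sketch of the PD / PF computation from Cavusoglu et al. (2004), Figure 3.
# The means and standard deviations of the legal (fL) and fraudulent (fF)
# score distributions are illustrative placeholders, not fitted values.
MU_L, SIGMA_L = 2.0, 1.0   # legal traffic scores
MU_F, SIGMA_F = 5.0, 1.5   # fraudulent traffic scores (higher on average)

def detection_and_error_rate(t: float):
    pd = norm.sf(t, loc=MU_F, scale=SIGMA_F)  # P(score > t | fraudulent)
    pf = norm.sf(t, loc=MU_L, scale=SIGMA_L)  # P(score > t | legal)
    return pd, pf

for t in (6.0, 4.0, 2.0):   # as t decreases, both PD and PF increase
    pd, pf = detection_and_error_rate(t)
    print(f"t={t}: PD={pd:.3f}, PF={pf:.3f}")
```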
3.2) Data Collection
All data (see Supplement) has been anonymized with respect to the firms, affiliates, and clients in order to protect any personal information from being misappropriated. Nonetheless, all datasets have been filtered and sorted by the firm of origination and date of occurrence. Rather than being able to draw a representative sample from the complete 'population', as instructed by Marshall (1996), I was forced to make do with the data that was readily obtainable given the time constraints and the AM platform's data storage policy. As a result, the transaction datasets that were collected are classified as a convenience sample.
Within one firm there are 12 unique affiliates – freelance employees (Firm A.1). Assuming a similar ratio holds across all of the datasets for this time period (11,937 in total), that would indicate approximately 3,850 unique affiliates partnered with firms of a similar size to Firm A.1. The implication is that AMS is entirely responsible for the initial quality assurance screening of all outputs generated by several thousand affiliate – freelance employees.
Figure 3: Computation of Detection Rate (PD) and Error Rate (PF) from Cavusoglu et al. (2004).
Figure 4: Countries Represented in Transaction Datasets. As shown, the transactions originated from a number of countries but the overwhelming majority hail from the United States.
Finally, the data has been manipulated and filtered by a variety of metrics, including: unique cities represented, unique states / countries represented, transaction averages, and so forth. These are reviewed in Appendix C.
4.) Data Analysis Strategies & Methods
All design decisions were made with an emphasis on identifying unique and identifiable trends within the data. Additionally, a focus on combining theory and methodology was factored into the correlational analysis method, in an effort to adhere to the framework presented by ANT (Walsham, 1997). Furthermore, Black (1999) touches on the importance of designing a quantitative study that can be replicated by another independent researcher. This correlational analysis approach can be applied to any transaction dataset collected from the same Lead & Transaction Tracking information system, and is therefore repeatable should follow-up research be desired.
The research analysis method is derived from correlational relationships arising between the 16 independent data points that comprise each transaction: [1] Transaction ID, [2] Order ID, [3] Firm, [4] Affiliate ID, [5] Affiliate Name, [6] Media Type, [7] Call Result, [8] $ Payout, [9] $ Fees, [10] City, [11] State/Region, [12] Caller ID, [13] Phone Type, [14] Duration, [15] Keypresses, [16] Call Start Time (local). Appendix B contains a visual overview of the 5-step correlational analysis process and the subsequent flagging of a suspicious / potentially fraudulent transaction using the prescribed digital fingerprint scoring method.
Step 1 – Data points [4] & [5]: establish the baseline digital fingerprint. All linked Affiliate ID [4] and Affiliate Name [5] data point values should remain unchanged across all transaction datasets. Mismatched or inconsistent values for data points [4] & [5] can be a precursor that the transaction has been corrupted, manipulated, or is otherwise fraudulent in some capacity. Given that the analysis of data points [4] & [5] is necessary to establish a baseline digital fingerprint – but is not a standalone indicator of foul play – a transaction is assigned a value of +1 for one or more mismatched data points, and a value of -1 for no mismatched data points. These weighting numbers (+1, -1), as well as the ones that follow, are admittedly arbitrary. However, analysis using these weighted variables generated an output (Figure 5) that was balanced with regard to outcome, and maintained an unbiased skewing of transactions during the correlational analysis process. Initial attempts to achieve a balanced scoring system failed because certain steps were weighted far heavier than necessary. For example, user error can occur when a human actor manually interacts with a transaction, producing a mismatched data point that appears fraudulent but is actually legal. Hence, the lower positive values associated with each step of the correlational analysis process indicate the potential for fraudulent activity, whereas higher positive values represent a more concrete instance of fraud or system manipulation.
Step 2 – Data points [6], [10], [11] & [12]: verification of City/State corresponding with Caller ID. Auto-dialer programs often contain inaccurate or out-of-date phone numbers; typically the first three digits of a Caller ID [12] code will not correspond with the correct City/State [11]. E.g. Appendix A – Row 2: the caller ID code beginning 678 corresponds with GA (Georgia) rather than with the state LA (Louisiana) that is listed. Incorrectly corresponding state and caller ID values indicate a high likelihood of fraudulent activity, so a transaction is assigned a value of +3 when data points [11] & [12] are mismatched; a value of -1 is assigned when data points [11] & [12] contain accurate information.
Step 3 – Data point [16]: check for repeated sequences (e.g. fixed 2- or 3-minute intervals) in the Call Start Time, indicating a robo-dialer. E.g. Appendix A – Rows 1 to 4: a call originates at 13:00 followed by 13:02, then 13:08 followed by 13:10. Such 2-minute interval spacing is highly indicative of robotic dialing software, so a value of +2 is assigned when a discernible pattern or apparent trend in the Call Start Time data point [16] emerges; a value of -1 is assigned when no discernible patterns can be identified.
Step 4 – Data points [6] & [13]: verification of Media Type corresponding with Phone Type. Auto-dialer software will also occasionally provide illogical correlating values for data points [6] & [13]. E.g. Appendix A – Row 9 (highlighted in red): a Landline [13] phone number somehow conducted a Mobile: Search [6] in order to complete this specific transaction. Given that it would be nearly impossible for a landline phone to navigate to a search engine such as Google and facilitate the transaction, this represents a mismatched value between data points [6] & [13], is an indicator of fraudulent activity, and is therefore assigned a value of +5, immediately flagging the transaction for further review.
Step 5 – Transactions that have a final digital fingerprint value of 4 or higher are reclassified as suspicious / potentially fraudulent. The reclassified transactions are then forwarded to the relevant security and compliance departments for further review. Figure 5 breaks down the various iterations of the t value based on what stage the correlational analysis is at, and how prior data points have impacted its overall score.
Figure 5: Correlational Analysis Decision Tree. In this analysis the weighting of the variables was as described in the text, and yields a balanced outcome.
The above five-step correlational analysis method attempts to derive a value for the digital fingerprint (t) that can be factored into Cavusoglu's detection rate (PD) calculation [Figure 3], thereby increasing overall IDS functionality. Based on the respective transaction dataset values at the conclusion of the correlational analysis, it can be expected that values greater than or equal to 4 are likely fraudulent, and values less than or equal to 3 are likely valid and legal transactions (Black, 1999; Cavusoglu et al., 2004).
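A minimal sketch of the five-step scoring follows. The area-code lookup, field names, and the simplified robo-dialer pattern check are hypothetical stand-ins for the full analysis; only the weights (+1/-1, +3/-1, +2/-1, +5) and the >= 4 flagging threshold mirror the method described above.

```python
from datetime import datetime

# Sketch of the five-step digital-fingerprint scoring.  AREA_CODE_STATE and
# the field names are hypothetical stand-ins; only the weights and the >= 4
# threshold mirror the method described in the text.
AREA_CODE_STATE = {"678": "GA", "504": "LA", "212": "NY"}   # tiny illustrative lookup
KNOWN_AFFILIATES = {"AFF-1029": "Acme Media LLC"}           # established ID/name pairs

def score_transaction(tx: dict, prior_start_times: list) -> int:
    t = 0

    # Step 1: Affiliate ID [4] and Affiliate Name [5] must stay linked.
    expected_name = KNOWN_AFFILIATES.get(tx["affiliate_id"])
    t += 1 if expected_name != tx["affiliate_name"] else -1

    # Step 2: Caller ID [12] area code must match the listed State [11].
    area_code = tx["caller_id"][:3]
    t += 3 if AREA_CODE_STATE.get(area_code) != tx["state"] else -1

    # Step 3: Call Start Time [16] repeating at a fixed interval suggests a robo-dialer.
    # Simplified check: the gap to the previous call equals the previous gap.
    times = sorted(prior_start_times + [tx["call_start"]])
    gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
    robo_pattern = len(gaps) >= 2 and len(set(gaps)) == 1
    t += 2 if robo_pattern else -1

    # Step 4: Media Type [6] vs Phone Type [13] - a landline cannot run a mobile search.
    if tx["phone_type"] == "Landline" and tx["media_type"].startswith("Mobile"):
        t += 5

    return t

def is_suspicious(score: int) -> bool:
    # Step 5: scores of 4 or higher are reclassified as suspicious / potentially fraudulent.
    return score >= 4

if __name__ == "__main__":
    tx = {
        "affiliate_id": "AFF-1029", "affiliate_name": "Acme Media LLC",
        "caller_id": "6785551234", "state": "LA",
        "media_type": "Mobile: Search", "phone_type": "Landline",
        "call_start": datetime(2018, 5, 1, 13, 10),
    }
    prior = [datetime(2018, 5, 1, 13, 6), datetime(2018, 5, 1, 13, 8)]
    s = score_transaction(tx, prior)
    print(s, is_suspicious(s))   # 9, True
```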
4.1) Correlational Analysis Outcomes
There are two outcomes that seem most significant given the various factors and resources at our disposal for this quantitative study. Firstly, that despite all of the granular verification and quality assurance aspects, AMS and IDSs are still susceptible to cybercrime and system manipulation. Secondly, that a digital fingerprint, or reproducible and tangible output, can be identified in order to improve the overall effectiveness of the IDSs found within AMS.
Outcome one: a human operator will be able to identify transactions deemed suspicious or fraudulent that the AMS and IDS fail to identify. Regardless of how well-conceived and engineered an algorithmic management software solution is, there will still be one-off scenarios that the intrusion detection system fails to identify as low-quality or fraudulent, whereas a human counterpart is able to flag suspicious transactions with far greater accuracy.
Outcome two: through the correlational analysis of the transaction datasets, a relationship or link between a number of the 16 independent data points emerged. The benefit of this is that a unique and repeatable method of identifying fraudulent transactions within the datasets can be incorporated into the IDS, improving overall detection rates (PD) in addition to reducing detection error rates (PF) – as exemplified by Cavusoglu (2004) in Figure 3.
Of the 7,402 transaction datasets that were analyzed using the prescribed correlational analysis method, ~2,665 scored greater than or equal to four and were thus reclassified as theoretically fraudulent. This amounts to ~36% of the total transaction datasets, more than double the 10-15% described by Matin (2007) and Dinev (2008). Although the data are difficult to analyze, a handful of industries are represented far more heavily than others, namely: human resource solutions, banking – finance, healthcare insurance, auto insurance, and miscellaneous home services. The significance is that these industries, assuming the correlational analysis results are valid, appear to be experiencing large amounts of fraudulent activity. In total 24 unique industries were represented as part of this study, details of which can be found in Appendix C.1.
4.2) Limitations
Due to the decentralized nature of how the datasets are stored (each respective firm is responsible for securing and maintaining its own datasets), there exist some gaps in transaction history and other miscellaneous information. This is why only 7,402 transactions, or ~62%, could be analyzed rather than the entire set of 11,937 transactions. Each firm upholds a slightly different operating procedure, with some firms possessing a full two years' worth of data, whereas others only maintained records for six months prior to purging the data, and so on. This, coupled with the time constraints associated with this project, might play into the
results and overall conclusions. As seen through the instance of multi-directional inheritance (Winter et al., 2014), minor nuances in the data can skew the overall end output. Therefore, all conclusions and assumptions drawn from this quantitative study should be treated with caution until further research can be conducted and the final results verified. The three key limiting factors associated with the quantitative research datasets are as follows:
A.) Substantial amounts of relevant literature exist on contractor, freelance and work-from-home labor. Yet very little has been published on the information system platforms and tools that facilitate much of the modern work-from-home dynamic, specifically with regard to the adverse effects that are produced when AMS fails to work as advertised.
B.) The overall sample pool of data is incomplete. Additionally, the lack of complete access to firm-related information resulted in a Convenience Sample dataset. In any subsequent research a dataset with 100% coverage would be ideal [Table 1: Proposed Dataset Metrics for Continuing Research].
Table 1: Proposed Dataset Metrics for Continuing Research. Datasets 2 and 3 represent the theoretical transactions that could be obtained and analyzed for any further research purposes.
C.) Based on my own "identified shortcomings", originating from roughly seven years of experience in the hyperspecialized digital marketing field, desired results may have been subconsciously predetermined, or the analyzed data may be subject to minor forms of bias (such as the weights attached to the correlational analysis variables).
Felstead identifies a key limitation regarding the research conducted to date on hyperspecialized information systems, namely that "there are currently only a few multivariate studies from which to draw" (Felstead et al., p.
85, 1999). Additionally, the research only briefly touches on the aspect of AMS and therefore all assumptions and conclusions should be taken with caution.
4.3) Further Research
Additional research should be conducted on a larger dataset with fewer gaps in transaction data. In theory, the previously missing data will display the same characteristics as the data from the original datasets; if it does not, the divergence would itself warrant further research to confirm whether the gaps in the datasets caused the difference between the two sets. Follow-up research should focus on obtaining data from a far larger sample pool, and should place an emphasis on data from at most one or two specific sectors (as compared to the 5+ industries the current study encompasses). This would allow for a quantitative study that compares results across individual sectors, and could be particularly beneficial for determining whether certain industries are prone to higher instances of adverse effects than others. Most importantly, future research should consist of a quantitative case study centered on exploring any statistical causal inferences found within the pertinent firm data; this is important for highlighting any indirectly observable trends and patterns that cannot be identified solely via qualitative research.
Analysis of these data suggests the possibility of a higher than previously reported amount of fraudulent traffic going undetected by AMS. The significance of this is the clear need for further research on a larger dataset, with differing analytical methods, and so forth. For example, there was no clearly definable 'digital fingerprint' that could be used to distinguish between naturally occurring low-quality transactions and artificially manipulated ones.
5.) Discussion & Conclusion “Your ultimate focus should always be on the results” (Nilles, Managing Telework)
Rapid proliferation of digital technologies has contributed to the steady increase in demand for hyperspecialized freelance labor (Zuboff, 1988; Nilles, 1998; Felstead et al., 1999; Barley et al., 2004; Willcocks et al., 2016). Thus, AMS became a robust, inexpensive and rapidly deployable solution for the Lead & Transaction Tracking industry, specifically when attempting to leverage economies of scale by onboarding additional teleworker – freelance labor without the requirement of additional human resources headcount (Willcocks et al., 2016). Furthermore, it enabled numerous AM networks to expand their reach from local to regional, and eventually to national markets, simply through the effective implementation of 'automatic control': the process of supplanting human tasks with machine functionality (Beniger, 1989). Yet, "despite all
of the advancements automation can provide business and services and industries with, the system must be programmed, taught and monitored by people" (Willcocks et al., p. 221, 2016), resulting in a severe limitation to AMS's adaptability and continued effectiveness over time. This reliance of AMS on the human operator creates a gap in its quality assurance and anti-fraud capabilities, given the static 'black and white' IDS security functionality that cannot update independently. This essentially means that if a software update containing the latest security and low-quality transaction deterrence methods is installed on Monday, and an innovative way to manipulate the AMS is concocted on Tuesday, the entire AMS becomes essentially useless and requires further resources to be invested into the prevention of this latest way of manipulating the AM system.
Given that the repeated costs associated with attempting to remain one step ahead of all cybercrime-related activities can quickly negate the cost-saving benefits of AMS, Nilles argues that the demand for honest brokers will increase; "These individuals / agencies will provide some level of certification and support to their clients" (Nilles, p. 300, 1998). The establishment of strong working relationships between unique firms that utilize the same AM platform governed by AMS should limit the total number of low-quality transactions that go undetected. As presented by Beniger: "comparison of data from two different sources…served as a reliable check on the honesty and efficiency of these [freelance] employees" (Beniger, p. 231, 1989). Figure 6 presents an alternative ANT organizational structure that places an emphasis on communication between such firms, and on the mutually beneficial, unbiased auditing of AMS outputs by a 'colleague firm' that is utilizing the same AM platform.
Figure 6: Proposed Actor Network Theory (ANT) Alignment. AMS remains the key moderating actor in the overall network, with the addition of an unbiased audit of the correlational analysis outputs by a partner agency.
Incorporating a peer review – audit layer into the network further contributes to the continual refinement and overall benefits that can be derived from AM, and by extension hyperspecialized freelance labor. As a teleworker – freelancer, the new-found freedoms must be woven into a well-organized and structured work routine. Human counterparts are more effective at gauging whether a freelancer is adhering to a 'work-hard, play-hard' mentality, or whether they are effectively being paid to engage in personal activities from their living room (Nilles, 1998; Gratton, 2011). "Clerks who had performed one small operation in a long paper chain were, for the first time, using computer technology to accomplish all the functions associated with a single product…people seated at their partitioned workstations, staring into the screens of desktop terminals" (Zuboff, preface, 1988). Today, by contrast, the emphasis appears to be on reconstructing the original chain in digital form, consisting of virtual laborers, tethered together via hyperspecialized work systems, commuting no further than their coffee table for employment. Effectual implementation and maintenance of a hyperspecialized AMS – RPA process can yield an ROI of 6:1, argues Willcocks et al. (2016).
The deterrence measures in place to help police the algorithmic management software are not "high enough" to deter 100% of cybercrime-related activities. Instances of fraud that are trivial in amount – e.g. $500 misappropriated as a result of the artificial manipulation of the digital platform – cannot practically be pursued, given that the plaintiff would need to incur substantial fees in order to: track down the freelance employee and serve them with a court summons; cover the legal expenses until a guilty verdict or judgement is levied; and even at the end of this lengthy
process the freelancer may not have an adequate asset portfolio to make pursuing the legal route a viable action for the firm (Bendiek et al., 2015). Cavusoglu argues that dual security features can complement one another, contingent on their overall effectiveness and alignment within the network (Cavusoglu et al., 2004). Given that the AMS researched during this study already possesses IDS capabilities, I propose incorporating a peer review – 3rd party audit process. This additional security feature – QA practice would require no initial financial investment and, in theory, if structured accordingly, could serve as a form of user training exercise, whereby colleagues from firms in adjacent – but comparable – industries can gain practical insight and expertise, while simultaneously improving their individual QA / fraud detection capabilities through the audit and analysis of 3rd-party transaction datasets. Nilles (1998) emphasizes the need for additional authentication and verification of freelance – telecommuter outputs, as a way to trust-but-verify all work produced by an individual that exists strictly as a digital entity. "In addition to the security issues already discussed, the internet intensifies the problem of authentication; that is, how can you be really, really, really sure that the person logging on to your network is actually an authorized teleworker" (Nilles, p. 91, 1998). This peer review – 3rd party audit process is a purely theoretical solution, yet one clear conclusion can be drawn from the literature as well as the research analysis findings: a plethora of security features, complementing one another and implemented at the lowest combined cost, can only benefit a firm.
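In practice, the proposed peer audit could be as simple as two partner firms exchanging their independently computed flag lists and reviewing every disagreement. The sketch below illustrates only that comparison step; the transaction IDs are hypothetical placeholders.

```python
# Sketch of the proposed peer-review audit: Firm A and a partner firm score the
# same shared transactions independently, then review every disagreement.
# The flag sets below are hypothetical placeholders.

firm_a_flags = {"TX-1007", "TX-1031", "TX-1088"}      # flagged by Firm A's own analysis
partner_flags = {"TX-1031", "TX-1088", "TX-1102"}     # flagged by the auditing partner

agreed = firm_a_flags & partner_flags           # both firms consider these fraudulent
missed_by_a = partner_flags - firm_a_flags      # partner caught something Firm A's AMS did not
missed_by_partner = firm_a_flags - partner_flags

print("Agreed fraud:", sorted(agreed))
print("Escalate to Firm A security review:", sorted(missed_by_a))
print("Escalate to partner security review:", sorted(missed_by_partner))
```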
BIBLIOGRAPHY
§ AlKalbani, A., Deng, H., Kam, B., and Zhang, X. (2017) "Information Security Compliance in Organizations: An Institutional Perspective", Data and Information Management, Vol.1, No.2, pp. 104-114
§ Attride-Stirling, J. (2001) "Thematic networks: an analytic tool for qualitative research", Qualitative Research, Vol.1, No.3, pp. 385-405
§ Barley, S. and Kunda, G. (2004) "Gurus, Hired Guns, and Warm Bodies: Itinerant Experts in a Knowledge Economy", Princeton University Press, ISBN 0-691-11943, pp. 1-91
§ Beniger, J. (1989) "The Control Revolution: Technological and Economic Origins of the Information Society", Harvard University Press, ISBN 0-674-16986-7, pp. vi-436
§ Bendiek, A. and Metzger, T. (2015) "Deterrence theory in the cyber-century: Lessons from a state-of-the-art literature review", Working Paper RD EU/Europe, Stiftung Wissenschaft und Politik German Institute for International and Security Affairs, pp. 1-17
§ Black, T. (1999) "Doing Quantitative Research in the Social Sciences", SAGE Publications, ISBN 0-7619-5352-3, pp. 5-847
§ Brooks, F. (1987) "No Silver Bullet – Essence and Accidents of Software Engineering", IEEE & University of North Carolina at Chapel Hill, Vol.20, No.4, pp. 10-19
§ Cavusoglu, H., Cavusoglu, H., and Raghunathan, S. (2004) "Economics of IT Security Management: Four Improvements to Current Security Practices", Communications of the Association for Information Systems, Vol.14, pp. 65-75
§ Dinev, T., Hu, Q., and Yayla, A. (2008) "Is There an On-line Advertisers' Dilemma? A Study of Click Fraud in the Pay-Per-Click Model", International Journal of Electronic Commerce, Vol.13, No.2, pp. 29-60
§ Edelman, B. (2008) "Deterring Online Advertising Fraud Through Optimal Payment in Arrears", Harvard Business School Working Paper, pp. 1-15
§ Felstead, A. and Jewson, N. (1999) "Global Trends in Flexible Labour", Macmillan Press, ISBN 0-333-72999-4, pp. 1-194
§ Felstead, A. and Jewson, N. (1999) "In Work, At Home: Towards an understanding of homework", Routledge, ISBN 0-415-16300-5, pp. 79-125
§ Fidock, J. and Carroll, J. (2012) "Theorising about the Life Cycle of IT Use: An appropriation perspective", Information Systems Foundations: Theory Building in Information Systems, ANU Press, pp. 79-112
§ Gratton, L. (2011) "The Shift: The Future of Work is Already Here", HarperCollins Publishers, ISBN 978-0-00-742793-2, pp. 28-271
§ Heyink, J.W. and Tymstra, TJ. (1993) "The Function of Qualitative Research", Social Indicators Research, Vol.29, No.3, pp. 291-305
§ Lynne Markus, M. (1983) "Power, politics, and MIS implementation", Communications of the ACM, Vol.26, No.6, pp. 430-444
§ Malone, T., Laubacher, R. and Johns, T. (2011) "The Big Idea: The Age of Hyperspecialization", Harvard Business Review, pp. 1-11
§ Marotta-Wurgler, F. (2011) "Some Realities of Online Contracting", Supreme Court Economic Review, Vol.19, No.1, pp. 11-23
§ Marshall, M. (1996) "Sampling for qualitative research", Oxford University Press, Vol.13, No.6, pp. 522-525
§ Matin, S. (2007) "Clicks Ahoy – Navigating Online Advertising in a Sea of Fraudulent Clicks", Berkeley Technology Law Journal, Vol.22, No.1, pp. 534-554
§ McManus, B. (2001) "Rethinking Defamation Liability for Internet Service Providers", Suffolk University Law Review, Vol.35, Rev. 647, pp. 647-669
§ Myers, M. and Klein, H. (2011) "A Set of Principles for Conducting Critical Research in Information Systems", MIS Quarterly, Vol.35, No.1, pp. 17-36
§ MY401 course summative essay, MISDI (2018)
§ Nilles, J. (1998) "Managing Telework: Strategies for Managing the Virtual Workforce", John Wiley & Sons, Inc., ISBN 0-471-29316-4, pp. viii-232
§ Papatla, P. and Bhatnagar, A. (2002) "Choosing the Right Mix of On-Line Affiliates: How Do You Select the Best?", Advertising and the New Media, Vol.31, No.3, pp. 69-81
§ Peterson, D. J. (2005) "IT in Business and Industry", Russia and the Information Revolution, 1st ed., RAND Corporation, pp. 33-48
§ Schwab, K. (2016) "The Fourth Industrial Revolution", Crown Publishing Group, World Economic Forum, ISBN 978-1-5247-5886-8, pp. 1-115
§ Strong, D. and Volkoff, O. (2010) "Understanding Organization – Enterprise System Fit: A Path to Theorizing the Information Technology Artifact", MIS Quarterly, Vol.34, No.4, pp. 731-756
§ Thacker, C. and Dayton, D. (2008) "Using Web 2.0 to Conduct Qualitative Research: A Conceptual Model", Technical Communication, Vol.55, No.4, pp. 383-391
§ Walsham, G. (1997) "Actor-network theory and IS research", Chapman & Hall, pp. 466-480
§ Willcocks, L. and Lacity, M. (2016) "Service Automation: Robots and the Future of Work", Steve Brooks Publishing, ISBN 978-0-956414-56-4, pp. 27-299
§ Winter, S., Berente, N., Howison, J., and Butler, B. (2014) "Beyond the organizational 'container': Conceptualizing 21st century sociotechnical work", Information and Organization, Vol.24, pp. 250-269
§ Zuboff, S. (1988) "In the Age of the Smart Machine: The Future of Work and Power", Basic Books Inc. Publishers, ISBN 0-465-03211-7, pp. 33-361
APPENDIX
A.) Sample of Transaction Datasets
Appendix A: Sample of Transaction Datasets
B.) Correlational Analysis of Transaction Datasets Step by step analysis of the 16 individual transaction data points: [1] Transaction ID, [2] Order ID, [3] Firm, [4] Affiliate ID, [5] Affiliate Name, [6] Media Type, [7] Call Result, [8] $ Payout, [9] $ Fees, [10] City, [11] State/Region, [12] Caller ID, [13] Phone Type, [14] Duration, [15] Keypresses, [16] Call Start Time (local).
Appendix B.1: Data points [4] & [5]: establish baseline digital fingerprint
B.2: Data points [6], [10], [11] & [12]: verification of City/State corresponding with Caller ID
B.3: Data point [16]: check for repeated sequences (e.g. 3-minute intervals) in the Call Start Time
B.4: Data points [6] & [13]: verification of Media Type corresponding with Phone Type
B.5: Fraudulent transaction flagged & submitted for further review
C.) Quantitative Dataset Overview
Appendix C.1: Industries & media platforms present in the transaction datasets
Appendix C.2: States present in the transaction datasets
D.) Acknowledgements
– Dad, you've been a source of inspiration my entire life. Thank you for everything.
– Dr. C. Sørensen, there is no way I could have made it through the programme without your guidance. Thank you for helping me grasp the significance of understanding a problem before attempting to solve it.
– Prof. J. Vaughn, thank you for taking the time to answer my incessant questions. You sparked my interest in technology.
– J. Heinz, had you not taken me under your wing and patiently explained how 'things worked' I would have failed tremendously. Thank you for being an exceptional manager and for sharing your wisdom.
– A. Kittelberger, you're the best roommate a guy could have asked for.
Supplement (Raw Data) Please see attached supplement for the first 1000 transaction datasets. The remaining 6,402 transaction datasets are available upon request for any pertinent follow up research.