Policy Significance Statement
The commentary identifies that corporate social responsibility and corporate digital responsibility mechanisms are not adequate for managing data responsibility. These findings are key for understanding suitable paths forward as data-driven technology is further integrated into society, and for shaping the discussion as the field moves forward. The commentary suggests the need for a comprehensive approach to ensure that data are managed responsibly and that consumers are protected.
1. Introduction
Data, and data-driven technologies, are seen as holding great promise for overcoming the complex challenges facing society (e.g., climate change and food insecurity), as well as the potential to improve the lives of all, particularly those in poverty. However, the data being generated and the technology to make sense of it are mainly controlled by the private sector, with governments having to form public–private partnerships even to improve the use of the data that they themselves generate and use. Technological progress is largely driven by the private sector, with the public sphere typically playing catch-up, especially when legislating or contracting the private sector for new projects.
Although the impact and role that these private companies have on societal issues are not new, especially with respect to environmental protection, climate change, and the role of corporations in developing countries, technological development brings a new dimension to that impact. As digital transformation expands, and data technologies play a larger role in people's day-to-day lives, demand is growing for data-responsible approaches, whether through private governance or through regulatory means. For many in the private sector, data responsibility is increasingly seen through the lens of corporate social responsibility (CSR) and the emerging related idea of corporate digital responsibility (CDR).
This paper examines the status of CSR and its relation to data responsibility. It then assesses the adequacy of CSR, and of the emerging CDR, as an approach to governing and implementing data responsibility and protecting the interests of wider society. The paper engages with the debates about technology, data responsibility, and governance. By reflecting on CSR and CDR, it showcases the current trajectory of the discussion regarding the responsible use of data and digital tools. In doing so, the paper aims to ensure that the lessons from CSR are considered as digital transformation expands and governments grapple with its societal implications. It must be noted that this commentary is limited to the Global North notion of CSR/CDR and does not reflect on the way the concept is being implemented by governments and corporations in the Global South.
2. Corporate Social Responsibility: The Evolution
In the broadest sense, CSR can be described as the idea that corporations should go beyond their minimum legal obligations and consider their impact on society when making strategic and operational decisions and taking actions (Russell et al., 2016). It is a concept developed in the Global North with a heavy focus on the impact of multinational corporations as they expanded globally (Bernard, 2021; Marques and Utting, 2010). In searching for a precise definition of CSR, it becomes clear that, despite it being a heavily researched field, there is still no common definition. Nor is there consensus on what the core principles are, how a company achieves social responsibility, or, in fact, whether a company is even obliged to be socially responsible (Crane et al., 2008). This may be due to the multitude of theories that form the foundation of our understanding of CSR, theories coming from varying fields of knowledge, such as economics, political science, and sociology.
Despite the lack of consensus, the existing theories and examinations of CSR can be categorized into groupings (Melé, 2008). Each of these theories is descriptive of the ways in which businesses approach their social responsibilities, but can also be understood from a normative perspective.
(1) Corporate Social Performance: This theory holds that companies have four responsibilities, namely, wealth creation, economic responsibilities, legal responsibilities, and addressing societal problems.

(2) Shareholder Value Theory: This theory holds that a business's only responsibilities are to generate profit and increase economic value for shareholders while upholding its legal obligations. This may be seen as the "classical" approach to CSR, captured in Friedman's definition: "the social responsibility of the business is to increase profits" (Friedman, 1962; Grigore et al., 2017, as cited by Melé, 2008).

(3) Stakeholder Theory: Under stakeholder theory, a business has a responsibility to anybody who has a "stake" in its activities, beyond just those recognized legally.

(4) Corporate Citizenship: Rather than a single full theory, corporate citizenship can be seen as an umbrella term for several emerging theories about the role of businesses within society. Rather than treating a business's responsibility as an external affair, this theoretical group argues that a business is an integral part of society. Views on this have been derived from instances where a corporation enters the realm of citizenship, fulfilling roles similar to those of governments.
Corporate responsibility has evolved over the past few decades as different practical approaches and theoretical thinking emerged and as awareness of corporate impact has grown. The most dominant of these is the classical approach, shareholder value theory, in which a corporation's ultimate responsibility is toward maximizing profits. Corporate citizenship is the most recent approach to emerge, and a full theory has therefore yet to develop (Grigore et al., 2017; Melé, 2008).
One of the tensions with CSR is that it can be applied as a set of discrete corporate activities or as an expansive guiding ethos that is holistic in nature and reflective of the corporation's "personality." This dialectic can create challenges when corporations attempt to obscure their intentions by using CSR activities performatively. In essence, they can render CSR simply a public relations tool rather than a normative and principle-driven governing ethos (Bernard, 2021).
2.1. CSR and practical implementation
In practice, CSR is not one mechanism, but rather several implementation styles and accountability mechanisms that are not mutually exclusive. The result is a complex notion of CSR: how it is, and should be, implemented, and how it should be enforced. There is no "one-size-fits-all" (Maon et al., 2017). This creates a challenge for governments that aim to enforce CSR through soft or hard law or through self-regulation. As the concept of CSR and its foundations are expanded to address emerging data-driven impacts, it is imperative that a solution to these challenges be found.
2.1.1. Self-regulation
One of the mechanisms for upholding CSR is self-regulation through mechanisms and instruments in place at the firm or sector level, including codes of ethics and conduct, responsible investment, and so forth. These aim to establish and uphold industry standards and norms, which can be enforced through hard and/or soft law. Self-regulation is argued to be an important mechanism for sectors experiencing a growth of emerging practices and technologies that fall outside, or in the gray zones of, existing government regulations. It is seen as a way to allow continued innovation and development while countering the risk that harder governance mechanisms would hinder innovation (Berkowitz and Souchaud, 2019).
Self-regulated industry norms and standards are often set through meta-organizations (MOs), organizations that are composed of other organizations. The role of these MOs is not only to generate CSR solutions, but also to facilitate and ensure reporting, accountability, and capacity building. MOs can be divided into three main categories (Berkowitz et al., 2017):
(1) Traditional MOs, such as trade associations.

(a) American Petroleum Institute—"Advocacy, research and statistics, standards, certification, and education."

(2) Specialized MOs dealing with specific challenges on which businesses can work together to research a solution. Members are usually made up of only businesses.

(a) Conservation of Clean Air and Water in Europe—Research on environmental issues related to the oil industry.

(b) Asistencia Reciproca Petrolera Empresarial Latinoamericana—Developing and sharing best practices; research and building competency.

(c) World Ocean Council—Collaborative stewardship.

(3) Multistakeholder MOs, which gather business, government, and civil society actors.

(a) UN Global Compact—"Framework for development, implementation, and disclosure" of policies and practices.

(b) Voluntary Principles on Security and Human Rights—Principles.
These organizations are also not necessarily restricted to a specific issue or sector. They can be infrasectoral, focusing on segments of a sector; suprasectoral, bringing together companies from related sectors (e.g., oil and gas, and mining); or cross-sectoral, bringing together companies from unrelated sectors (Berkowitz et al., 2017).
MOs are vital for companies aiming not only to uphold their CSR objectives, but also to determine how existing legal and societal obligations apply to their emerging technologies; they cannot, however, replace government regulation. They may act as a precursor to a future regulatory framework, facilitating cooperation with the government as it explores the implications of emerging technologies and practices (Berkowitz and Souchaud, 2019). We acknowledge that there are other types of self-regulation that sectors and industries impose, such as bar association requirements for lawyers who want to practice law in the United States, including expectations to conduct pro bono work in a calendar year. These are a form of CSR activity.
Self-regulation approaches are by nature self-designed and sometimes voluntary. Corporations engage in them as a means to thwart hard regulations imposed by governments (Lock and Seele, 2016). CSR mechanisms allow firms or industries to signal their intentions to policymakers, stakeholders, and the public, reducing potential information asymmetries (Su et al., 2016). Firms engage in these activities as a tool to maximize their economic profit potential and to signal their capacity to implement such policies. How a firm engages with the political arena reflects its perspective on governments and institutions (Anastasiadis, 2014). Corporations regularly engage in political lobbying when they see opportunities either to enshrine in hard law regulations that would benefit their sectors, or to thwart the establishment of new laws that they oppose (McGreal, 2021). A real tension, however, emerges when there are discrepancies between the stipulated CSR platforms of a firm or industry and the political lobbying that they engage in (Lock and Seele, 2016).
Many companies have pushed to reduce potential regulatory requirements, especially hard law obligations, on their business operations, allowing for flexibility in how they conduct their business, which is argued to be key for innovation and technological development (Berkowitz and Souchaud, 2019). As a result, companies might pursue strategies that mimic government regulations by engaging in self-regulatory practices, indicating to regulators that they are adhering to the best practices or behaviors that would form the foundation of regulation. Often, these approaches are couched under a CSR banner in which companies showcase to regulators, policymakers, and society that they are good stewards and that there is no need for hard laws, or that, if laws are needed, they should be based on the practices companies already follow. We are mindful that companies and even industries may pursue or encourage government regulation to secure their interests and advantage while hurting competitors or limiting entry into their market (see the case study below).
2.1.2. Social pressure
Another mechanism through which businesses can be pushed to adhere to social responsibility obligations is social pressure through stakeholder actions, either individual or collective. Stakeholders can be employees, consumers, shareholders, or competitors. Research has found that the group with the strongest influence on organizational decision-making is the employees themselves. Employees can force their management to adhere to established CSR policies or to recognize new aspects of social responsibility (Helmig et al., 2016). Within the tech sector, this push by employees has been seen at Google, Amazon, and Palantir, with varying levels of success (MacMillan and Dwoskin, 2019; Paul, 2020; The Guardian, 2021). Research on consumer behavior in the United States has shown that when corporations violate government or voluntary standards, punitive actions are taken by both highly conscious and less conscious consumers (Russell et al., 2016). As such, CEOs are increasingly mindful of the impact of aligning CSR strategies with their business models in order to keep stakeholders and consumers satisfied (Henderson, 2018).
3. The Emergence of Corporate Digital Responsibility
Over the past decade, as society has experienced a major shift toward digitalization, there have been increasing calls for companies to engage in "CDR." The concept has broadly been understood as the desired responsibilities of corporations to their users, societies, and governments when it comes to the use of digital tools and data collection (Lobschat et al., 2021). The basic implementation of CDR is the same as that of CSR, and it may be viewed as one part of an overall CSR model. It is a voluntarily established set of policies and self-governing principles, developed, implemented, and overseen by corporations themselves, going beyond what regulation mandates. As a result, a cynical interpretation of CDR is that it offers corporations an opportunity to build a cover for unethical behaviors and practices. More optimistically, it can be interpreted as organizations aiming to ensure the responsible development of technology in a realm that is severely underregulated.
3.1. Understanding CDR
As an emerging field of practice and study, defining CDR is challenging. Conceptually, Richard Mason, writing in the 1980s, articulates the essence of CDR best: "Our moral imperative is clear. We must ensure that information technology, and the information it handles, are used to enhance the dignity of mankind" (Mason, 1986). Competing definitions of CDR have emerged over the past few years, focused on who the constituents of CDR are, arguing that special attention needs to be given to artificial actors (Lobschat et al., 2021), identifying the different branches of responsibility that CDR aims to target (Herden et al., 2021), or making the case for merging sustainability and digitalization (Wade, 2020).
We propose to define CDR as the set of practices, policies, and governance structures of corporations as they relate to the digital transformation. CDR must be centered around accountable digital practices, enforcement mechanisms, sustainable growth and development, and the promotion of trust across the digital ecosystem. CDR practices must engage with how digitalization shapes society and the environment and with the impacts it has on individuals, communities, and states.
States, for their part, are beginning to recognize CDR practices as critical. For example, the French and German governments have both articulated how CDR and building trust in the digital ecosystem will be imperative for corporations moving forward. Both countries articulate the need for CDR to go beyond minimum expected standards and regulations (see the German CDR Initiative and France's Corporate Digital Responsibility for more details).
As part of the discussion on digitalization and ethical and responsible approaches to it, many have begun applying a critical perspective to the impact of data and digitization on society. As a result, we have witnessed a rise in critical studies that use data colonialism and data capitalism as lenses through which to highlight how existing power dynamics now manifest themselves in the digital realm. CDR will have to reckon with this, and with the continued drive toward commodifying and quantifying human behaviors and digital interactions (Sadowski, 2019; Thatcher et al., 2016).
3.2. Separate but overlapping: Why CDR must go beyond CSR
Commentators often tend to think of CDR as part of a CSR mechanism; we argue that it is best to think of CDR as a separate mechanism that overlaps with CSR in multiple areas. CDR and CSR are both voluntary, self-governed approaches to responsible business practices that aim to go beyond the mandatory legal and regulatory minimums established by the states where companies operate. CDR and CSR both share, in theory, a corporate citizenship ethos in that they reflect a deeper interest in the impact of business practices on consumers and broader society. Both maintain that implementing these policies provides an economic and business advantage for firms.
We argue that CDR is separate due to the scale and impact of the digital transformation, and the unique influence and power dynamics that arise from it. Moreover, our concern is that if CDR develops along similar lines to CSR, as voluntary "soft" measures rather than government-enforced obligatory laws, it risks failing in the same way that CSR has failed to be an effective mechanism in the realm of environmental and climate protection (Eavis and Krauss, 2021). As CDR is an emerging mechanism, an opportunity exists to incorporate the effective parts of CSR while including stronger enforcement and accountability mechanisms. We have an opportunity to learn from the last time technologies with such profound impacts emerged, when we did not effectively protect long-term societal interests. CDR will have to manage the impacts associated with environmental degradation resulting from unsustainable practices and increased digital waste. Moreover, CDR will have to target the growing concerns regarding data collection and the problems related to corporate surveillance, with the deep implications these have for societies across the world. Examples of the challenges CDR will have to address are already emerging, with clear implications for the future. In the education sector, there is a need to balance the push for digitalization against concerns about perpetuating a dehumanizing culture that fosters mistrust between educational institutions and students. The criminal justice system is increasingly using algorithms that enshrine unjust and biased practices. Such examples can be seen throughout different sectors, and they are not necessarily new phenomena; rather, they are exacerbated by the turn toward digital tools and algorithms for decision-making support.
The concern is that CDR will be relegated to a secondary or tertiary consideration by organizations that continue to prioritize a business-as-usual approach. More worryingly, without strong accountability mechanisms and independent government regulatory development, it can be used for whitewashing or regulatory capture. Additionally, the challenges that plagued CSR in other sectors will be the same for CDR if they are not actively addressed. CDR must reckon with the fact that it is dealing with a phenomenon that is reshaping human relations and that the responsibility threshold is therefore higher.
4. Data Responsibility and the Limits of CDR
4.1. New tools, old problems
Governments, communities, and corporations are all trying to grasp the full scale and impact of digitization (Dufva and Dufva, 2019) and to assess whether they are well suited to respond to this digital transformation (Cheng et al., 2021). This is happening as a gap emerges between societal expectations and the impact of the proliferation of new technologies and digital tools. This, however, is a phenomenon that recurs throughout history as new material forms and tools emerge. The sociologist William Ogburn argued, a century ago, that a cultural lag arises when new tools and technologies enter the world and societies assess their impact. Ogburn called this a period of maladjustment, in which tools and products exist and proliferate through the economy and society, but society itself has yet to figure out what the norms and regulations ought to be (Ogburn, 1922). We are living through such a period of maladjustment with digital and data responsibility.
One of the challenges we face when regulating or thinking about technology is the debate about whether technology shapes societal norms and values, which proponents refer to as "technological determinism," or whether societal and cultural norms, developed by individuals and groups, go on to shape technologies, control them, and place them at the service of society (Dafoe, 2015; Winner, 1980). These debates have been ongoing for decades, and we suspect they will continue for many years to come.
We argue that it is important for policymakers, academics, and corporate executives to be aware of these tensions and to understand how these debates shape competing perspectives on the role of regulation. These debates help frame the conversations around agency and the degree to which technology exists within the realm of control and governance. Critically, the debates about the nature of technology, or the separation of business and ethics, have real-life implications. They help shape how we go about designing governance structures, whether norms, regulations, or frameworks, to operate and manage these technologies, or how we conclude that the harms far outweigh the potential benefits.
4.2. Tech and data responsibility
For the purposes of this commentary, we adopt the initial United Nations Office for the Coordination of Humanitarian Affairs (UNOCHA) definition of data responsibility: "a set of principles, processes, and tools that support the safe, ethical, and effective management of data" (OCHA Centre for Humanitarian Data, 2021).
Data responsibility and data protection regulations have existed for a while: the first national data privacy law goes back to the Swedish Data Protection Act of 1973, or, at the subnational level, to the German state of Hesse in 1970 (Hofverberg, 2012). These laws first established broad protections and rules about the management of data by entities. Today, more than 66% of countries around the world have passed legislation focused on data protection and privacy, varying in breadth and scope (UNCTAD, n.d.).
The struggle around data and tech responsibility is a struggle between three groups: governments seeking to exert their power and control; corporations seeking to maintain their ability to generate profits and expand market share; and people looking to protect their rights while making use of the modern-day benefits that technological advancement brings.
There is no escaping being part of the data ecosystem. If you live in a country with even a minimal level of data and technological penetration, you are affected by the scope and scale of data collection, the data-generating apparatus, and the ways in which they impact you and your community (Thompson and Warzel, 2019). Every individual who is part of the interconnected economic system generates new data every day simply by existing and participating in it; there is no escaping it, or, as Paola Ricaurte puts it, "refusing to generate data means exclusion" (Ricaurte, 2019).
Concurrently, the commodification of everyday data is here for all to see (Thatcher et al., 2016). Financial reports of tech companies highlight the dollar values that companies expect to generate from user data and engagement. Corporations often resist the push toward hard laws because they view the costs associated with implementation as too high to bear, or as a threat to their business models (Crain, 2018; Sherman, 2020). Interestingly, this clash also occurs between corporations: advertising firms (and Facebook), for example, have reported how Apple's new privacy model has cut deep into their revenues and have voiced their dissatisfaction with Apple's move toward a more privacy-conscious model.
The lack of a global regulatory system or globally accepted standards and principles means that there will remain gaps in implementation and inadequate responsible data practices. Corporations adopting CDR mechanisms may be a first step, but our concern is that this will remain a piecemeal or ad hoc approach limited to one company or a portion of a sector. The challenge is how to design new approaches that can be scaled across sectors and continents, that leverage the potential of digital technologies while empowering users, and that usher in a new era of transparency, agency, and innovation that is sustainable and responsible (Mastercard, 2019).
4.2.1. The “Big-Tech” perspective on data responsibility: Self-governance
Commentators have argued that Big Tech companies push for private governance in order to avoid mandates or rules imposed by government regulators (Schaake, 2021). Analysts have made the case that it is in the best interest of these companies, and of the sector, to preempt government regulations by establishing clear rules and guidelines that would govern the sector. Private governance is seen as a tool to maintain consumer trust, pursue sector-wide strategies, and avoid a "digital tragedy of the commons" (Cusumano et al., 2021). An extreme version of this has been proposed in Nevada, where local tech companies would essentially become their own government, setting rules and regulations in their "innovation zones," levying taxes on constituents, and managing schools and other government functions (Nevett, 2021).
Over the past few years, "Big Tech" companies have pursued self-regulation strategies and developed their own oversight boards in response to increasing public scrutiny of their actions. Oversight boards are, in theory, adjudicatory bodies that review and provide guidance on issues relating to the digital policies and practices of a corporation. They are designed to be external bodies whose recommendations corporate boards and upper management ought to implement. The most notable example has been the Facebook Oversight Board, first introduced in November 2018 and later formed in October 2019 (Oversight Board, n.d.). The board is often referred to as Facebook's Supreme Court: it listens to appeals and is tasked with providing commentary on Facebook's policies and evaluating appeal processes vis-à-vis content moderation. The concern is that an oversight board or similar mechanism, while in theory a welcome process, can in practice end up as a regulatory dodge, utilizing a CDR mechanism as a facade of accountability while shielding a corporation from true regulation.
The Facebook Oversight Board has been criticized as not having any teeth. A ruling of the board applies only to the specific case in question and does not lead to the establishment of company-wide policies or precedents, though the board's scope is open for expansion should it prove successful (Hatmaker, 2021). The board is only able to issue recommendations, and its rulings can be ignored by the company (Klonick, 2021). Moreover, this approach remains ad hoc and limited to one company at a time, rather than providing industry-wide standards and practices, and there is a lack of transparency into these processes. Critically, the scope of the board, and much of the discussion on the regulation of Big Tech, is focused on content moderation, leaving gaps on issues related to data collection and tracking, the environmental and labor costs of collecting, managing, and using data, and the extent to which data are sold and bought by data brokers, in the United States for example (Melendez and Pasternack, 2019). Furthermore, the lack of independent access and investigation makes it hard to evaluate the true impact of the Facebook Oversight Board, but the signs are not promising (Bass, 2021; Hegelich, 2020; Wall Street Journal, 2021).
4.2.2. The nonprofit perspective on data responsibility: Guidelines, frameworks, and operationalized approaches
One sector that deals with data responsibility in a volatile environment daily is the humanitarian sector. Humanitarian actors collect (and generate) a wide range of data related to the conflict or disaster areas in which they operate, including data on some of the most vulnerable groups. They increasingly use sophisticated tools such as remote sensing, biometric data collection, and information communication technologies to conduct their work, and see connectivity and digitalization as central components of aid. Due to the nature of their work, humanitarians are increasingly wary of the dangers and potential risks of digitization and increased data collection: individuals' lives and livelihoods could be in danger if, for example, data are leaked or misused.
As a result, the humanitarian sector (broadly defined) has pushed toward the development of sector-wide guidelines, frameworks, and ethical codes to guide the work of humanitarians. Certain codes, like the Signal Code, apply existing international human rights law and humanitarian law to the information rights of individuals in crises and establish clear responsibilities for actors responding to crises and collecting data on those affected.
The International Committee of the Red Cross Data Protection Guidelines, the UNOCHA Data Responsibility Guidelines, and the Inter-Agency Standing Committee Operational Guidance on Data Responsibility in Humanitarian Action look at how data protection guidelines can be operationalized and how organizations across the sector can implement them. The Sphere Handbook, which establishes the Humanitarian Charter and Minimum Standards and is relied upon by hundreds of organizations around the world, has continuously expanded its guidelines and its focus on data protection and responsible digital use.
It must be noted that many humanitarian agencies operate under agreements that provide them with immunities and protections from prosecution, some of which private corporations might very much desire. The purpose here is to showcase the sector’s approach to and thinking about these issues. The authors believe that the humanitarian sector has, in many ways, led the way in developing data responsibility approaches that reduce risks and harms while maximizing the benefit of digital tools.
4.2.3. Clash of the Titans
The limits of a CDR approach were exposed when governments clashed with tech companies while seeking to combat monopolies and impose antitrust regulations, institute fines (Graham, Reference Graham2017), threaten to break up companies (Staff, 2007), or impose new rules on companies (Perrigo, Reference Perrigo2020). In early 2021, Facebook and Google both clashed with the government of Australia over its proposed media law, which would have required companies such as Google and Facebook to share profits emanating from search results with media companies (Kaye, Reference Kaye2021). Both companies threatened to limit their service offerings in the country before stepping back and reaching settlements with the government (Kaye and Packham, Reference Kaye and Packham2021).
This episode underlines the power struggle that will be ongoing over the next decades, and the limits of CDR models. As governments reckon with continued digital transformation and look for new ways of retaining their control and power, tech companies will continuously push against regulations while lobbying for concessions or reaching settlements that do not impact their profit margins. Concurrently, tech companies may lobby governments to impose certain regulations in order to undermine a competitor. Microsoft, for example, backed the Australian government in its row with Google, seeing an opportunity to gain a larger share of the search market (Shead, Reference Shead2021). This dynamic underscores that balancing government regulation, the private interests of tech companies (individually and as a sector), and the rights and trust of the general population will require tremendous effort, one that is not resolved simply by pursuing a CSR or CDR strategy.
5. Implications
5.1. Data responsibility must be made the core of business
Data responsibility must be understood as one of the most important principles for protecting the rights of individuals and communities moving forward. Otherwise, as new technologies and new forms of extraction are developed, the guardrails will become harder and harder to establish. Data responsibility and the ethical use of technology have to be embedded throughout the entire business cycle (Martin and Freeman, Reference Martin and Freeman2004). Therefore, treating data responsibility exclusively through a CSR or a CDR lens does not meet the seriousness and critical nature of the issue. That approach both undermines the higher ideal of developing widespread responsible practice and understates the true value of data and the actual impact that data have at the individual, societal, and corporate levels. Viewing data responsibility as a CSR or CDR policy shifts it from being centered in the processes, design, and implementation of corporate practices to being secondary or, worse, an afterthought completed to complement the business operations of a company.
5.2. Connecting digital and environmental
There is a growing divergence between digital strategies and environmental/climate change strategies. Many proposed solutions to digital technology challenges, such as biased algorithms, involve collecting ever more data, which currently conflicts with strategies to become more environmentally friendly. Furthermore, as Kate Crawford highlights in her groundbreaking book Atlas of AI, “calls for labor, climate, and data justice are at their most powerful when they are united” (Crawford, Reference Crawford2021). Therefore, the social responsibilities of these technology companies must be assessed, and the companies held accountable, in a holistic manner.
5.3. Acknowledging and addressing the societal impact in a holistic manner rather than piecemeal
Developing an approach that balances the protection of individual rights with technological growth and innovation is a challenge. Ignoring or undermining one component for the sake of another will lead to further inequity and the continued accumulation of power and control by a seemingly ever-smaller share of entities that control the data ecosystem. Furthermore, due to the foundational role that data-driven technologies will have in society, this disproportionate power will inadvertently translate into many, if not all, other areas of society.
Data responsibility and CSR/CDR must grapple with issues linked to data exploitation, data justice, and data colonialism. Data responsibility must also reckon with the growing digital divide and how the rush toward a digital transformation that is not equitable will leave hundreds of millions of people behind and outside the economic system.
5.4. Understanding new stakeholders and the level of impact
We are particularly concerned with companies that contribute to the growth of data brokers by selling data to these third-party groups. Data brokers facilitate the ability of companies and states to subvert CDR commitments or, in the case of governments, allow them to curtail certain rights (Roderick, Reference Roderick2014). Data brokers take advantage of a lack of regulation and gray areas to collect massive amounts of information on people, package the data, and then sell them.
Funding Statement
This work received no specific grant from any funding agency, commercial, or not-for-profit sectors.
Competing Interests
The authors declare no competing interests exist.
Author Contributions
The authors have contributed equally to the design, research, writing, and editing of the manuscript.
Data Availability Statement
There are no primary data used in this manuscript, and all references are linked below, some of which may require institutional access.
Abbreviations
CDR, corporate digital responsibility; CSR, corporate social responsibility; IASC, Inter-Agency Standing Committee; ICRC, International Committee of the Red Cross; MOs, meta-organizations; UNOCHA, United Nations Office for the Coordination of Humanitarian Affairs