A creative methodology for software developer un-bias
Kelsie Nabben & Michael Zargham
14 November 2020
“Wisdom is knowledge about certain principles and causes.” — Aristotle, 350 B.C.
The origins of infrastructure matter in relation to the social outcomes it supports, enables or undermines. Yet as emerging technologies become more enmeshed in everyday life, and more powerful, it is rarely highlighted how crucial it is for technology developers to be aware of the values they imbue into their creations.
The Beginning of Infrastructures
Digital infrastructures are the material objects constructed by people, and the complex ecology of their relationships with other social and technical objects (Jewett & Kling, 1991; Star & Ruhleder, 1996; Star, 1999).
Anthropologist Brian Larkin emphasises how intent and politics are structured and released through the ‘technopolitics’ of built infrastructures (2013). Similarly, scholars argue that the cognitive biases of technology developers are reflected as “algorithmic injustice” in their creations (Gupta, et al., 2020). Where an infrastructure originates, including the ontology of its makers, affects how and where it grows, in response to legal, political, environmental and social factors (Larkin, 2013). The same is true of technological systems as digital infrastructure.
Larkin focuses on how “infrastructures reveal forms of political rationality that underlie technological projects” once a technology has been released into the world (2013).
The ontology of a technology comprises elements internal to the object itself, which form an infrastructure consisting of technical, administrative and financial components (Larkin, 2013). These artefacts interact to contribute to certain goals (Hughes, 1987). Here, we draw out how the elements internal to an object are embedded by its designer.
In contrast to ‘technopolitics’, which aims to surface the politics of infrastructure, this piece offers a methodological tool to help developers and systems engineers surface biases and assumptions in the design and development phases of new technologies, and to reflect on their own ontologies. We call this tool ‘techno-reflexivity’.
Just as “social research is almost inevitably digital” (Pink, 2019), so is the digital inherently social. Ethnography in a digital world requires methodological creativity (Pink & Postill, 2019). Defining ‘techno-reflexivity’ as a tool is a necessary approach towards methodological creativity to bridge common language and practice across the fields of social science and system engineering. In taking this liberty, we borrow the concept of ‘reflexivity’ from ethnography, apply it to the process of systems engineering, and create a shared language for the co-creation of more self-aware digital infrastructures.
In order to apply the concept of ‘techno-reflexivity’, we frame technology developers as researchers and apply ethnographic and anthropological qualitative social research strategies to the development process for greater awareness of subjectivity in technology design.
The Ontology of Infrastructure
“No tool is neutral” (Star, 1996). In order to write code, you have to assume things.
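As a minimal illustration (the function and names here are our own invention, not drawn from any particular codebase), even a trivial piece of code encodes unstated assumptions about the world:

```python
def format_name(full_name: str) -> str:
    """Return a name as 'Last, First', under several unstated assumptions."""
    # Assumption 1: every person has exactly two name parts.
    # Assumption 2: names are ordered given-name-first (a Western convention).
    # Assumption 3: a single space reliably separates the parts.
    first, last = full_name.split(" ")
    return f"{last}, {first}"

print(format_name("Ada Lovelace"))  # → Lovelace, Ada
# A mononym ("Björk") or a three-part name ("Gabriel García Márquez")
# raises ValueError: the code's ontology simply excludes such people.
```

None of these assumptions were deliberate design decisions; they were defaults inherited from the developer's own position, which is precisely what techno-reflexivity asks us to notice.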
Ethnography has various approaches for identifying and positioning assumptions and worldviews known as ‘positionality’ and ‘reflexivity’.
‘Positionality’ is the social and political context that creates your identity, and how your identity influences and potentially biases your understanding of, and outlook on the world.
“The position adopted by a researcher affects every phase of the research process, from the way the question or problem is initially constructed, designed and conducted to how others are invited to participate, the ways in which knowledge is constructed and acted on and, finally, the ways in which outcomes are disseminated and published.” (Coghlan & Brydon-Miller, 2014).
When re-applied from qualitative research to technology development, positionality refers to the positioning of the developer as a researcher — in relation to the social and political context of the tool or infrastructure, as well as the community, the organisation or the real or perceived participant ‘user’ group.
Implicit in positionality are the notions of power and intent, in relation to other actors in the network, or in this case, the infrastructure.
Additionally, ‘reflexivity’ is a process of reflection involving both emotion and cognition, that results in new understandings of a phenomenon (Boud, 2010).
“Researchers need to evaluate how intersubjective elements influence data collection and analysis. Reflexivity — where researchers engage in explicit, self-aware analysis of their own role — offers one tool for such evaluation.” (Finlay, 2002).
When applied to technology development, reflexivity is self-awareness of one’s subjective responses; of the intersubjective dynamics between the creator and other influencing factors such as co-creators, funders, social norms, and real or perceived ‘users’; and of the development process itself. Techno-reflexivity is a conscious awareness of the subjective nature of one’s own ontology, and of the ways in which this influences the biases, beliefs, processes and development of what one creates. In technology development, this includes the knowledge, positions and beliefs of the developer, in relation to the perceived or actual users of the technology.
Techno-reflexivity is especially important in designing governance models for complex, dynamic, multi-agent digital systems (Tan & Zargham, 2020), where positionality, engineering processes, and system settings interact and shape one another.
Subjectivity, vis à vis objectivity
Developers are taught objectivity: to think in terms of ‘right’ and ‘wrong’ in order to communicate with computers through the unambiguous logic of code. Subjectivity is rarely emphasised, practised or celebrated as a valuable contribution to system design.
Objectivity creates the risk of binary judgements, such as the notion that technology can solve the world’s problems. The risk of technological determinism is that it disregards the implicit social factors which shape both the design and outworking of technology in society.
In contrast, a social constructivist perspective highlights the social and shared nature which constructs what is ‘true’. One risk associated with this perspective is that it ignores the influence and impact of technological tools.
In reality, neither perspective is sufficient on its own. Rather, an interplay of both the technical and the social is needed to clarify the socio-technical reality of technology infrastructures as they operate in society.
“Our technologies mirror our societies. They reproduce and embody the complex interplay of professional, technical, economic and political factors.” (Bijker & Law, 1992). These real-world constraints require compromise, and an awareness of ideals to surface these trade-offs.
The concept of ‘trust modelling’ is an example of an attempt to articulate a tool to surface assumptions from a sociological perspective in technology design (Nabben, 2020). This approach is based on an interdisciplinary research report on decentralised technology development (Wagner, McKelvey & Nabben, 2020), and inspired by the text ‘Engineering a Safer World’ (Leveson, 2012).
The aim of ‘trust modelling’ is to surface ‘trust assumptions’. Translating trust models into technology design is not new (Zimmermann, 1994). However, this tool emphasises both technical, and social trust in a socio-technical infrastructure, to surface systems objectives, assumptions, and dependencies.
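One way a development team might make such trust assumptions explicit is to record them as first-class artefacts alongside the code. The sketch below is our own illustrative format, not a schema prescribed by the cited works; it simply shows how both technical and social assumptions can be surfaced and reviewed together:

```python
from dataclasses import dataclass

@dataclass
class TrustAssumption:
    """An explicit record of who or what the system trusts, and why it matters."""
    component: str    # the part of the system relying on the assumption
    assumption: str   # what is assumed to hold true
    kind: str         # 'technical' or 'social'
    consequence: str  # what fails if the assumption is wrong

# A hypothetical register for a small system.
ASSUMPTIONS = [
    TrustAssumption(
        component="login service",
        assumption="users keep their passwords secret",
        kind="social",
        consequence="account takeover",
    ),
    TrustAssumption(
        component="update mechanism",
        assumption="the release signing key is uncompromised",
        kind="technical",
        consequence="malicious code shipped to all users",
    ),
]

for a in ASSUMPTIONS:
    print(f"[{a.kind}] {a.component}: {a.assumption} -> risk: {a.consequence}")
```

Reviewing such a register in design meetings turns implicit trust into a discussable, contestable object, which is the point of the exercise.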
Systems thinking and ‘techno-reflexivity’
Technological infrastructures re-configure society (Larkin, 2013). Scholars suggest that ‘some technologies are inherently power laden and are therefore sensitive for both researchers and society at large’, such as mapping systems as a representation of the world and cyberspace as a gateway to knowledge (Atkins, 2004).
“Large systems with high momentum tend to exert a soft determinism on other systems, groups and individuals in society.” (Hughes, 1987).
Techno-reflexivity, as awareness of how one’s own ontological framework is projected in one’s work, is especially pertinent to automated infrastructures. Black-box platforms whose operative code is not publicly disclosed structure and govern society in opaque ways. This is even more pronounced with the evolution of technological automation, such as machine learning, automated decision making (ADM) and artificial intelligence (AI), which are given a set of ‘rules’, processes and procedures by which to make and execute decisions.
Beliefs, biases and intentions contribute to the processes that shape our technology, and in turn, the way our societies are organised. Our understanding of these processes might help us to create different or better technologies that either sustain or drive societal change (Bijker & Law, 1992). In designing and building technology, variations in one’s theoretical starting point will create very different results. The impetus for techno-reflexivity becomes even more pertinent when technological tools and systems are composed to become digital infrastructures.
Note: this piece is the first in a series on digital infrastructure, and key themes and ideas related to Kelsie’s PhD project. Feedback and engagement are most welcome.
Get these stories to your inbox for free — subscribe here: https://kelsienabben.substack.com/people/1619235-kelsie-nabben
About the Authors:
Kelsie is an ethnographic researcher on the opportunities and implications of decentralised technology infrastructures at the RMIT University Blockchain Innovation Hub, and a PhD candidate in the School of Media & Communications. She is particularly interested in privacy, civil society and socio-technical resilience.
Zargham is a systems engineer specialising in large scale social and economic systems with automated components. He holds a PhD in Systems Engineering from the University of Pennsylvania. His work is supported by private research and design firm BlockScience and is academically affiliated with WU Vienna.
Aristotle, Metaphysics, 350 B.C.E.
Bijker, W & Law, J, Shaping Technology / Building Society, The MIT Press, 1992. https://mitpress.mit.edu/books/shaping-technology-building-society. Accessed 14 Nov. 2020.
Boud, David. Relocating Reflection in the Context of Practice. 2010.
Coghlan, David & Brydon-Miller, Mary. The SAGE Encyclopedia of Action Research. SAGE Publications, 2014. https://www.bokus.com/bok/9781473925304/sage-encyclopedia-of-action-research/. Accessed 14 Nov. 2020.
Finlay, Linda. “Outing” the Researcher: The Provenance, Process, and Practice of Reflexivity, 2002. https://journals.sagepub.com/doi/10.1177/104973202129120052. Accessed 16 Nov. 2020.
Gupta, Abhishek, et al. “The State of AI Ethics Report (October 2020).” ArXiv:2011.02787 [Cs], Nov. 2020. arXiv.org, http://arxiv.org/abs/2011.02787.
Hughes, Thomas P. “The Evolution of Large Technological Systems.” 1987, pp. 86–89.
Jewett, Tom & Kling, Rob. The Dynamics of Computerization in a Social Science Research Team: A Case Study of Infrastructure, Strategies, and Skills, 1991. https://journals.sagepub.com/doi/10.1177/089443939100900205. Accessed 11 Nov. 2020.
Larkin, Brian. “The Politics and Poetics of Infrastructure.” Annual Review of Anthropology, vol. 42, no. 1, 2013, pp. 327–43. Annual Reviews, doi:10.1146/annurev-anthro-092412-155522.
Leveson, Nancy. Engineering a Safer World. The MIT Press, 2012. https://mitpress.mit.edu/books/engineering-safer-world. Accessed 16 Nov. 2020.
Nabben, Kelsie. From Threat Models to Trust Models for Technology We Can Trust, 2020. https://firstname.lastname@example.org/technology-we-can-trust-from-threat-models-to-trust-models-790e05bc70b. Accessed 16 Nov. 2020.
Pink, Sarah. “Digital Social Futures Research.” Journal of Digital Social Research, vol. 1, no. 1, 1, Aug. 2019, pp. 41–48. jdsr.se, doi:10.33621/jdsr.v1i1.13.
Star, Susan L. & Ruhleder, Karen. “Steps Toward an Ecology of Infrastructure: Design and Access for Large Information Spaces.” Information Systems Research, vol. 7, no. 1, 1996, pp. 111–134. https://pubsonline.informs.org/doi/10.1287/isre.7.1.111. Accessed 11 Nov. 2020.
Star, Susan L. The Ethnography of Infrastructure, 1999. https://journals.sagepub.com/doi/10.1177/00027649921955326. Accessed 11 Nov. 2020.
Tan, Joshua & Zargham, Michael. Introducing Govbase. 8 Nov. 2020, Medium, https://thelastjosh.medium.com/introducing-govbase-97884b0ddaef.
Wagner, E, McKelvey, K & Nabben, K. Decentralization Off The Shelf: 7 Maxims., 2020, https://decentpatterns.xyz/report/.
Zimmermann, P, PGP User’s Guide, Volume I: Essential Topics. 1994. https://web.pa.msu.edu/reference/pgpdoc1.html. Accessed 16 Nov. 2020.