Introducing: Amy Sample Ward
Amy Sample Ward is the CEO of NTEN, a nonprofit creating a world where missions and movements are more successful through the skillful and equitable use of technology. Their work focuses on the questions around tech for social good.
What do you work on as a Fellow of the Robert Bosch Academy?
I want to be part of an equitable world. We need to reshape our relationship with technology, as well as the technologies themselves, if we are to create an equitable world together.
What does an Internet built on the values of sovereignty, safety, and freedom look like? This is the question I am asking myself, asking in conversations with many others, and using as a frame for research here. I’m not expecting “answers” or “solutions” but reflections and experiences, feelings, and hopes. Ultimately, I think we know sovereignty when we have experienced our rights ignored; we know safety when we have been hurt; we know freedom when we have lived without it. This conflict is inherent in sharing space, so it is not about creating a world and an Internet free of harm, but one that has ample and accessible accountability for us to work out conflict and coexist.
This question, of course, seems to generate many more questions: What about the billions of people who aren’t online? How can community and individual protections be built in a sea of corporate software? What might accountability and restitution look like beyond current state government and corporate institution structures? Where do we start building something different?
The question I return to more than any of the whats and hows and wheres is this: who begins? Who builds something different? We only need to look around to see what has come from the last 40 years of building up technology by those most privileged – with access, training, funding, and resources. I think all of the rest of us begin together, in all of our own ways from wherever we are, to build in a new way.
What are the most relevant issues in your field?
There are so many important and challenging issues to consider across the field of technology: from internet access to the right to repair, algorithmic bias to media racism, and digital surveillance to data privacy. Within all of the various conversations, the same threads of safety, sovereignty, and freedom can be woven.
There are a few particular issues that have shaped my work history and informed my interests, including:
- Community power: How can technology be used to support mobilization, organizing, resource-sharing, and change-making efforts within and by communities? Nonprofit organizations are often part of these efforts, but there are also challenges and constraints that come with organizations “owning” or otherwise “leading” them, when the ultimate goals we may be reaching toward include a world where those organizations should no longer need to exist because the inequities and needs their missions exist to address have been met.
- Human rights: Access to the internet is necessary in many ways for adequately and effectively accessing services, civic participation, education and employment, and even communication with family, friends, and community. As such, it needs to be considered a human right, both protected and supported by the accountability frameworks that come with that designation.
- Accountability: The economic and political contexts in which we live are deeply entrenched in the ways most technologies are developed and maintained. The internet facilitates different kinds of communication, collaboration, and connection than an offline world does. Yet governments defined by geographic boundaries and political landscapes maintained through violence expect to divide and control our information and communication technologies. Instead, we need proactive policies that can be created by communities themselves and that provide for mechanisms of accountability back to those communities.
How do you perceive current debates about the use of technologies for the better?
It’s important to name the underlying question here: can technology be neutral? Can a technology be “for good”? And, of course, to look at who benefits (or, rather, who profits) when posing this question.
In the most recent book that I co-authored with Afua Bruce, The Tech That Comes Next, we argue that technologies created by people cannot ever be neutral because people are not neutral. When we believe the myth that technology can be neutral – a myth that is regularly and intentionally presented to us by technology companies – then we separate technologies from their political and social impact. Of course, technology is political! But thinking that it is not benefits technology companies, especially the largest ones, by easing or erasing expectations for our individual user and community rights and protections.
As an extension of this, it’s clear that no technology can exclusively or entirely be “for good” because who it is good for, how it is accessed, how it is maintained, and so on are all subjective impacts for different groups. This mindset also sets us up to think that we could divide technologies into those for good and those for bad. This is, again, a distraction from focusing instead on the needs, autonomy, and accountability that should be clear for all technologies and the communities that need, use, and ultimately even own them.
What insights for your work are you expecting to gain during your fellowship?
I’m open to insights from many different sectors, geographies, and perspectives. For more than two decades, I have been a student of abolition, and I find that my deepest inspiration and strength, both for questioning the systems I experience today and for imagining a new future, have come from the lessons of abolitionist work and thinking. Seeing the ways that carcerality, power, and supremacy are ingrained in both the government structures and the corporate ones around us can be overwhelming and immobilizing. But there is strength in seeing plainly what is happening, as it allows us to better understand how different something else could and would need to be.
Thankfully, I’m not the only person interested in the future of the internet and our world! I’m excited and grateful for the opportunity to have time to explore these conversations, ideas, and questions – to sit in active dreaming – with so many others. I am hopeful for insights into a few areas, including:
- Where else have we built from these values? How can we document, learn from, and share examples of communities – however community may be defined – building with sovereignty, safety, and freedom, regardless of scope or outcome, so that we can show these values to be more than ideals or ideas and to be practicable ways of engaging?
- What other questions can we ask? How do we best provoke and open our minds to what an equitable world could be, so that we can in turn more regularly talk about these visions with each other? When we root conversation and design in the problems of today, we anchor our focus there, with an outsized emphasis on issues, challenges, and existing systems; by rooting our conversations instead in the world we could build, we better position ourselves to build bridges back to today.
What makes Berlin and Germany relevant for your work?
There is value to me in getting out of my usual space to reflect and see things from a different perspective. Berlin is an excellent spot, as the time zone difference is enough that I’m separated from the day-to-day of my work. But what I have found almost instantly here is that the people I have met have both an interest in and a capacity for big questions without clear answers.
And, even more so than what I have found to be true of Berlin in general, being part of the Robert Bosch Academy has already yielded incredible conversations among the group of Fellows, as is only and always the case with interdisciplinary and diverse groups. It is a privilege to see the ways my questions are informed by, and are informing, those of folks working in health, history, democracy, and climate change. We are always stronger together, and I am deeply grateful for this first-hand experience of that truth.