Hype as Governance
An interview with Andreu Belsunces on sociotechnical fictions, AI hype, purposeful frictions, and legitimacy
In this conversation, I speak with Andreu Belsunces, a sociologist of technology and design who studies how politics, imagination, and the economy shape one another. Andreu describes how his career was shaped by a political drive based on democratic values, climate urgency, and a commitment to human justice. That path led him from cultural analysis of digital platforms to the politics embedded in technology. He introduces sociotechnical fictions as claims that lack firm evidence but gain authority because they are voiced within scientific and technical contexts. From there, he connects hype, venture capital, and legitimacy as mechanisms that govern the future by steering attention, investment, and public imagination. He also shares concrete teaching practices, from analyzing venture capital narratives to student video essays and worldbuilding exercises that design counter-institutions. Finally, he argues for useful frictions that reduce our vulnerability to hype, including new forms of literacy and clearer recognition of belief, desire, and power in techno-futures.
Enjoy the interview below and let us know what sociotechnical fictions, questions, or ideas it sparks in you!
Politics, freedom, and why technology is never neutral
Question: What motivates your current work? What are the values, visions, and metaphors that are driving what you’re doing today?
Andreu: Since the beginning of my professional career, my work has been politically driven. I started working for the Spanish International Cooperation Agency in Uruguay, and then for the United Nations and UNESCO. As a sociologist, I have always been interested in culture and knowledge. When I became more interested in technology, I approached it from a cultural perspective. I wanted to understand how emerging digital platforms and digital content, especially in the early 2010s, changed the way we depict the world, relate to each other, and tell stories. I was used to thinking about the politics of the entertainment industry, which has often framed how we relate to each other and how we think of ourselves. Little by little, I developed the intuition that there were forms of politics behind technology.
Freedom is very important to me, but not freedom in a US libertarian sense. I mean that each individual should be able to live life the way they want and be the person they want to be, without oppression or undue influence, with self-respect and respect for others. As the climate crisis became more visible, climate and environmental concerns also became more important to me, along with protecting democratic values and systems. Over the last several years, I have seen democratic institutions and procedures become endangered, not only through narratives but also through technologies that undermine our understanding of, and trust in, democratic institutions. These technologies increasingly support alternative financial and governance systems aimed at anti-democratic forms of relating, such as cryptocurrency.1
Sociotechnical fictions as “true enough” stories with real power
Question: How do you define sociotechnical fictions in your work, and what do you think is their role in AI?
Andreu: It has been a long journey to define sociotechnical fictions in an operative way. I define them as a specific kind of fiction that exists within the boundaries of science and technology. They are meaningfully different from cinematographic and literary fiction because they are presented within, and protected by, science and technology.
We can talk about Harry Potter or Star Wars, and we do not take them at face value. But if a tech guru starts talking about Artificial General Intelligence (AGI), which is a technology that lacks empirical evidence, we often take it at face value. Why? Because, as modern beings, as sons and daughters of modernity and the Enlightenment, we tend to believe that what comes from scientific and technological discourse is true.
Sociotechnical fictions describe fictional statements that are treated, presented, and understood as objective and truthful information. Because I come from the field of Science and Technology Studies (STS), I am interested in performativity2 and in what things do, including non-human actions. I started to approach fiction as something that makes things happen.
When a fictional statement is protected by a scientific or technical setting, it becomes more performative. For example, if it appears in a white paper or a think tank report, journalists, regulators, policy makers, and citizens start thinking about this particular future more seriously. That gives these statements more power to become self-fulfilling prophecies.
From AGI to Worldcoin: examples of fictions that build products and agendas
Andreu: In the last three years, I have looked closely at two examples, Artificial General Intelligence and Worldcoin. If you read the Worldcoin white paper, you will find that they justify the existence of their device and crypto product based on a future that does not exist yet. That future is the idea that AGI will become a reality sooner or later, and that AI agents will start stealing identities in online environments.
Cybersecurity is a real problem, but we already have cybersecurity solutions. Some democratic countries, like Spain, already have robust digital identity systems. We do not need a new device to protect online identity; we need stronger public solutions – and still less do we need a financial network that operates outside democratic oversight or control, which is what Worldcoin is trying to build as a cryptocurrency.
Worldcoin is an example of a sociotechnical fiction because it reacts to a problem that is not yet real, or at least was not when it was launched. It also responds to challenges posed by existential risk narratives. Existential risk frameworks are mostly speculative. They are not like refugees, genocide, the war in Ukraine, the housing crisis, or the casualization of labor. “AI taking over humanity” is itself a sociotechnical fiction, even when taken up by academia or expert actors. Worldcoin responds to this and materializes hypothetical speculative frameworks like longtermism.
Speculative research: studying futures as they emerge, not just imagining alternatives
Question: How do you use speculative methods in your research or teaching? What are some examples of speculative artifacts you or your students created?
Andreu: I have been using speculative design for many years in lectures with my students, especially in Design, Cinema, and Art. For the last three or four years, I have been more interested in speculative research than speculative design – in fact, Alex Wilkie, one of the authors of Speculative Research, is my PhD supervisor.
Speculative design, as a form of critical design, often tries to picture alternative futures to challenge how we think about technology and society. Speculative research, by contrast, understands the future as open and constantly unfolding. It studies continuously emerging futures as cultural, economic, and political phenomena. It is research that tries to intervene in how futures are thought and made, paying attention to the futures that are emerging now and identifying where those futures appear.
It also requires awareness that novelty is a social construction.3 I am close to post-growth4 communities, and part of my work is about making post-growth futures more understandable and desirable. But the way post-growth and degrowth create novelty often does not resonate as much as futures created by the tech industry, because we are culturally accustomed to framing the future through science and technology. Paying attention to how technoscience produces novelty, as opposed to how post-growth tries to produce novelty, helps us understand how future narratives mobilize desire, investment, and political support.
Venture capital mobilizes visions of the future
Andreu: Let me give a specific recent example. Most recently, together with my students, I’ve been researching how the venture capital industry mobilizes visions of the future. Venture capital is a gatekeeper between possible futures and the present because it mobilizes billions of dollars, resources, political influence, and media attention.
By paying attention to venture capital portfolios, narratives, manifestos, and aesthetics, we get a living and contemporary object of study for observing how futures emerge as spaces of political struggle. We approach them through expectations, promises, fictions, hype, and financial speculation. Based on this analysis, I ask students to create images, campaigns, or video essays that explore these futures.
In one business design degree where I teach, I asked students to extrapolate the futures venture capital is trying to make real. Then we did a worldbuilding exercise based on the features of those futures. I invited them to create an organization, action, technology, or institution that might counter those technolibertarian and authoritarian futures, for example, through post-growth alternatives. We imagine a future based on actual statements, and then within that future, we try to find ways of altering it.
Hype as governance: how attention politics reshapes what society funds and fears
Question: In your view, what is the role of speculative methods in AI literacy and, more broadly, AI design and governance?
Andreu: Right now, I am co-leading a group called Hype Studies, a wide network of scholars and professionals who are concerned by the phenomenon of hype across science and technology. We are curating a series of articles at Tech Policy Press on hype, power, and governance. In STS, sociotechnical imaginaries describe visions of the future sustained by institutions, governments, and corporations. They are normative in guiding research and development, investment, and public discourse. When an imaginary becomes hegemonic, it becomes performative in influencing how we move toward those futures.
Today, there are many AI imaginaries. In the last three or four years, across public funding for science, technology, and academia, everyone wants to talk about AI. This draws attention to AI systems, while other topics become sidelined. It can feel as if climate change is no longer important, because now everyone is talking about AI. That is dangerous.
There is also growing discussion about technological sovereignty. It is not only about owning and controlling AI and data infrastructure. We should also care about the imaginaries that shape how we develop technology and think about AI.
AI hype has been driven by tech leaders, and it frames how we think about technology and its dangers. For example, the discourse around congressional hearings and the letter calling to pause AI, which was partially funded by Elon Musk, used alarmism to frame concerns. By amplifying alarmism, it influenced how we think about consequences. In that sense, hype is a form of future governance.
What the Hype Studies conference surfaced: fandom, attention, and power
Question: What were some of the key learnings from the Hype Studies Conference last year?
Andreu: The main learning is that many people are concerned about this. Addressing hype is a transdisciplinary endeavor. I am interested in political, economic, and financial dimensions, but others focus on psychological and communicative dynamics, hype as scientific communication, and the ambivalence of hype. Sometimes hype can be toxic and harmful for scientific research, but sometimes it can help raise attention and funding for a new technology – or even for criticism!
Something very interesting to me is the relationship between fandom5 and hype. Hype is not only technological. There is hype in music and video games. In tech, we talk about early adopters or tech bros, but not in terms of fandom. Thinking in these terms helps us understand how people emotionally engage with technology and tech leaders.
At the conference we had a conversation with Jack Stilgoe, the director of the STS department at UCL. He told me about a case that caught my attention. One of his projects centered on researching electric cars and responsible innovation for self-driving vehicles. His stance was critical in the sense of weighing both their benefits and downsides. At the conference, he shared that he had received hate attacks on social media because he was seen as attacking something sacred. This showed me how fandom dynamics are pervasive in technology and how hype can produce fandom. There is also a gender dimension in how popular culture depicts fandom, and we should be able to name these dynamics, maybe even through comedy.
Useful frictions: reducing vulnerability to hype and exposing belief in tech futures
Question: As part of this Speculative Friction project, I explore how carefully designed frictions can create space for reflection, oversight, shared understanding, and improved human agency in how AI is built and governed. In your view, what are examples of such meaningful frictions?
Andreu: In terms of AI, one useful friction would allow people to be less vulnerable to hype narratives. That is why we are developing a Hype Literacy toolkit. It is in progress and will be finished in January.
Another useful friction could come from paying attention to how belief, almost like spiritual or religious belief, shapes how tech futures are presented. I have been working on a speculative project titled Metabolism of Techno-Financial Worlding, where I explore venture capital as a kind of portal that uses our need for the sacred and ineffable to bring imaginary entities into reality. In the early stages, many startups are imaginary, and through money and communication, they become real and have an impact.
Some emerging technologies never reach stability and inhabit our world like phantoms. Take the metaverse: people tried to make it real, but it did not stick, and it became a technological zombie. Being aware that some discourses about the future are structured by belief and are fictitious can help.
Legitimacy: how fictions scale through universities, media, and institutions
Question: In your view, how is legitimacy produced for sociotechnical futures, and how might we build legitimacy for alternative imaginaries that support the goals of human agency, democratic oversight, and ecological responsibility?
Andreu: Over the last five years, I have been studying how legitimacy for fictional statements is built. For example, when researchers at important universities talk about existential risk, that creates a framework justifying that these risks are important and need attention, even though most of the risks are still hypothetical and therefore fictional. Because these fictional statements come from legitimate universities, we take them at face value.
When the New York Times, the Financial Times, or The Economist describe an emerging technology as something that will change everything, like cryptocurrencies, the internet of things, big data, cloud computing, or AI, they tell the rest of society to follow this new miracle. We believe it because it comes from legitimate institutions.
I would love someone to explore how movement-building, such as the post-growth movement, and other technological alternatives mobilize fictions and legitimize possibilities. Social movements can become attractive because they are prefigurative: they envision and enact more beautiful, just, and sustainable ways of being together, picturing futures and engaging people through desirable alternatives.
Law has its own way to participate in sociotechnical fictions, and I’m eager to engage in discussion with legal scholars. Building on Roberto Mangabeira Unger’s essay called “Legal analysis as institutional imagination,” I’m curious about how legal analysis as speculation could help protect democracy from technocratic visions of the future. I am tired of tech moguls trying to scare us about fictional futures because it shapes how we think, and we are not paying enough attention to creating alternative social futures.
2. Performativity: the idea that descriptions, models, categories, and technologies don’t just represent the world; they can help bring a particular world into being through practice. STS asks: When people use a concept, a metric, a model, or a tool, what realities does it enact? What behaviors and institutions does it make more likely? What becomes “true” because it’s being measured, predicted, and organized around?
3. In science and technology studies (STS), social construction describes how what we treat as objective “facts” or inevitable technologies are shaped by human decisions, institutions, power, and culture.
4. Post-growth: an approach and paradigm shift away from prioritizing GDP and infinite economic expansion, focusing instead on human well-being, ecological sustainability, and equity within planetary boundaries.
5. Fandom refers to a community of people who are passionate about something, whether it’s a film, a band, a television show, a book, or a sports team.


