Gruppo Ippolita

7 July, Madrid: Roundtable for the series Politics of Technoscientific Futures

Meeting organized by Mauro Turrini and Núria Vallès Peris

Participants: Gruppo Ippolita, Klaus Hoeyer, Noortje Marres


Predictive models associated with big data, digital algorithms and machine learning are now part of our everyday lives. Drivers relying on navigation apps or cell phones to choose the quickest route to a destination, or suggestions about which song to listen to or which book to read, have become part of our routines and feel familiar. At the same time, other programs, which identify the neighborhoods where a crime is most likely to occur, quantify the creditworthiness of a potential borrower, or create individualized health risk profiles, fuel dystopian scenarios that interpret digital technologies in light of their capacity for collective and individual monitoring and control, raising important issues of discrimination and privacy protection.

The will to anticipate the future is certainly not a new phenomenon, but rather yet another attempt to cope with uncertainty. However, as the examples above show, today's forms of prediction are characterized by an unprecedented degree of technological mediation. Predicting now means forecasting through automated calculation, on the basis of quantities of data that no human mind could manage. These practices tend to convey a specific framing of the future as a space of optimization, to be achieved through complex technological systems that integrate connected devices, algorithms and platforms.

The automatic construction of knowledge and of the distinction between true and false, poses urgent epistemological and juridical challenges about the mode of functioning of algorithms. Despite the transparency and unlimited openness conveyed by digital technologies, we urgently need to open the black box of algorithms and question their validity, utility, transparency and responsibility. Extending our scrutiny to the digital ecosystem of platforms, predictive models need huge physical infrastructures concentrated in the hands of a few private players of digital sector. The underlying oligopolistic tendency of techno-capitalism also raise questions about the normativity of this emerging socio-technical assemblages. In an increasingly interconnected society where the virtual and the real become blurred, predictive technologies offer a source of knowledge, meaning and truth that reconfigures the very relationships and realities they describe and analyze. The resulting relations are endlessly or chronically brought into being in a continuing process of production, reproduction and amplification of existing social inequalities, of the stigmatization of certain behaviors or socio- demographic features, of the obliteration or the hyper-control of certain segments of the population due to the exclusion of access or binding access to digital technologies – digital divide, free services, etc. The new predictive techniques are therefore techniques capable of forms of knowledge, meaning, truth and falsification, through which new forms of governmentality as well as resistance are emerging. This subplenary intends to explore them by keeping together academics and activists.