We are currently being drawn into a technological revolution that will bring fundamental changes to the labour market and to the way we interact. Now more than ever, the humanities have a significant role to play.

Project: 'As if it were a person - social robotics and human self-understanding'

Grant: EUR 0.8 million (DKK 5.9 million) from VELUX FONDEN

By Johanna Seibt

‘The Robot Revolution’, as the media have dubbed it, is powered by social robotics, which will make robots with social qualities part of society. Today, robot designers are no longer merely programming the machine’s physical movements; they are programming its social behaviour. This means that we are no longer creating culture solely as a product of Man’s freedom of thought and volition, but are beginning to design and construct culture by producing artificial agents that engage us in their programmed routines and repertoires.

Can robot designers also be cultural engineers? 

Robot developers, computer scientists and researchers across a range of disciplines are cautioning that the robot revolution poses a threat to humanity’s core values. But who has the requisite expertise to advise on this new form of cultural design? None other than humanities researchers. According to numerous new global initiatives for responsible robotics and ethical engineering, state-of-the-art engineers need humanities research right now. And it is not solely moral and ethical concepts such as dignity and justice that are at issue, but also sociocultural values associated with autonomy, flexibility, freedom of scope, the authentic encounter, individual empowerment and equality. We face the advent of new domains that demand a new type of social responsibility. Governments, grant-makers and researchers in the humanities, among others, each bear responsibility for shaping the robot revolution to match society’s sociocultural values.

Robophilosophy - a new field of research

Thanks to the visionary approach of VELUX FONDEN, the School of Culture and Society at Aarhus University has been able to define and present a new interdisciplinary field of philosophical research: ‘robophilosophy’. 
The aim of robophilosophy is to meet the challenge posed by the robot revolution. Put in more academic terms, this branch of investigation is defined as ‘philosophy of, for and by social robotics’. It includes another branch of philosophy, ‘roboethics’, but also mobilises the theoretical disciplines of philosophy – ontology, epistemology and philosophy of science – in its ambition to understand the interaction between humans and robots.

Robophilosophy will help us to understand interaction between humans and robots

The grant from VELUX FONDEN (January 2012-March 2015) made it possible to employ four junior researchers and, in collaboration with the ATR/Hiroshi Ishiguro Robotics Lab in Japan, to establish the first international interdisciplinary research team within robophilosophy.
The research team has conducted empirical studies of changes in people’s attitudes to robots: in a long-term project at an experimental physical rehabilitation centre, Vikærgården Rehabiliteringscenter (2013-2014); in a pilot project among students at the SOSU Nord health college (October 2014-February 2015); and during a museum exhibition entitled ‘Robot or Not – What’s Your View?’ at the Krydsfelt Skive museum (September 2015-February 2016). One of the study topics was what it takes to convince people that a robot is capable of conscious thought.
But more theoretical and abstract topics have also been addressed. Key questions include: Is a human’s interaction with a social robot actually a social interaction? How do we need to expand our notions of social behaviour to accommodate simulated and programmed interaction between humans and robots, and do we even have sound reasons for doing so, regardless of the advances in social robot technology?

Who holds responsibility? 

The research team’s efforts will map and shed light on the complex areas of responsibility that arise from social robot technology, since these are central to legislation and development policy. After all, who is responsible if things go wrong? Who is going to be responsible for ensuring that only certain designs are realised, and others not? Who is responsible for addressing systemic adverse consequences of using social robot technology, and who is responsible for devising applications of social robot technology that underpin human values?
These questions are complex because they always involve at least three groups of human agents – the robotics researchers who design the robots, the people who interact with them, and the surrounding cultural community. A key premise of the research project is consequently that the various types of responsibility can only be separated and defined if the advance of robotics applications is understood as a social group action entailing collective responsibility.

The robotic moment

To raise awareness of the extensive social and cultural implications of social robotics, the research team organised a major international research conference in 2014 entitled ‘Robophilosophy 2014 – Sociable Robots and the Future of Social Relations’. The conference was such a success that additional grants made it possible to set up the recurring series of Robophilosophy conferences 1) and the research network TRANSOR, Transdisciplinary Studies in Social Robotics 2). The 2014 conference also generated five publications.
In addition, 2016 saw the world’s largest conference within humanities research in and on social robotics: a twinned event that combined the second Robophilosophy conference with a conference of the TRANSOR network, under the title ‘What Social Robots Can and Should Do’.
The decade from 2015 to 2025 has been dubbed in advance ‘the robotic moment’ in human history - ‘[the] way we contemplate [robots] on the horizon says much about who we are and who we are willing to become’ 3). Contributing competently and, not least, proactively to this contemplation is not only an intriguing new challenge for the humanities; it is also an obligation with historic implications.

Notes: 1) www.robo-philosophy.org 2) www.transor.org 3) Sherry Turkle, Alone Together  

JOHANNA SEIBT

Johanna Seibt, PhD, D.Phil., Professor (with Special Responsibilities), School of Culture and Society, Aarhus University