Autonomous systems are affecting virtually all aspects of society, so future designs must be guided by a broad range of societal stakeholders. That’s according to a new report led by scientists in the Oden Institute for Computational Engineering and Sciences at The University of Texas at Austin.
Ufuk Topcu of the Department of Aerospace Engineering and Engineering Mechanics led a yearlong effort, involving more than 100 autonomy experts nationwide, to complete a report titled “Assured Autonomy: Path Toward Living With Autonomous Systems We Can Trust.”
From spacecraft design to health care, and from vehicles to smart-cities planning, autonomous systems now routinely shape how society runs. Yet the safety, security and regulation of these systems are still not prioritized.
“The management of autonomous systems needs to be at the intersection of science, technology, society, policy and governance,” Topcu said.
Harmoniously
Commissioned by the Computing Community Consortium (CCC), which enables high-impact research through state, industry and academic engagement within the computing community, the report begins by listing technologies that have disrupted society in the past: from the printing press “democratizing knowledge” to the industrial revolution “replacing man with machine” to, more recently, the internet making instant global communication a given.
We are now witnessing the age of autonomous systems, in which humans and human intelligence can live harmoniously with machines and machine intelligence, provided the right approach is taken.
“Science and technology have always disrupted the status quo,” the report said. “But these systems are different from earlier technologies. Replacing humans and human intelligence with machines and machine intelligence is happening within existing frameworks of laws, ethics, morality and norms, as well as within existing technology.”
Technical flaw
With autonomy, a technical flaw in the software of a system, for example, can’t be seen as merely a bug. “That bug could be a potential violation of law and/or morality,” Topcu said.
The challenges faced require interdisciplinary approaches. “Autonomous systems are not just engineering marvels. They have an influence on individuals, groups, and even the culture as a whole,” said Art Markman of UT’s College of Liberal Arts and director of the IC2 Institute. “Engaging experts from a range of disciplines, including the humanities and the social and behavioral sciences, will be a crucial step to avoid unintended consequences of the deployment of new technologies.”
The direction taken by autonomous technologies is currently guided almost exclusively by scientists and engineers. According to Karen Willcox, director of the Oden Institute, it is time to “stretch our interdisciplinary thinking beyond the STEM fields.”
Mathematical modeling
“Autonomy is a prime example of a domain where the interdisciplinary approaches of computational science – weaving together rigorous mathematical modeling with advanced computing and domain expertise – will play a critical role in developing better, safer systems,” Willcox said.
“But there is an urgent need to build deep collaborations in research and education with humanists and social scientists. This is a challenging but exciting future prospect.”
The report’s core recommendation is the creation of a network of institutes that can readily share ideas and concerns and, ultimately, develop a regulatory and quality assurance framework to underpin future advancements. The CCC calls for the implementation of a national research strategy for assurance, with stakeholders in government, academia, industry and society all playing their part.
Socioeconomic opportunity
“Autonomy is a socioeconomic opportunity as well as a challenge, and the public will both perceive and be affected by it unevenly,” Topcu said.
The broader impact of autonomous technologies is a growing priority for researchers at UT Austin.
“Both basic and applied research in autonomous and AI technologies, particularly at UT, is increasingly focused on values-based designs,” said interim Vice President for Research Alison R. Preston. “Outlining the potential value of new technologies at the earliest design stages can limit the potential for unintended, negative consequences once a new technology is deployed. Understanding how to design autonomous systems that provide value to multiple stakeholders requires true collaboration, not just within any one university but among them. I’m proud that the Oden Institute is helping lead that charge.”
“Assured Autonomy: Path Toward Living With Autonomous Systems We Can Trust” is the product of several workshops facilitated by Ufuk Topcu and participating members of the CCC organizing committee: Nadya Bliss and Nancy Cooke (Arizona State University), Missy Cummings (Duke University), Ashley Llorens (Johns Hopkins University), Howard Shrobe (MIT), and Lenore Zuck (University of Illinois at Chicago).
Illustration: The Computing Community Consortium (CCC)