This paper considers what an approach to human rights and the ethical governance of critical technologies could entail for Quad members. Its focus is data-driven technologies, such as artificial intelligence.
The key insight of the paper is that policymaking and diplomacy on critical technologies should proceed from a recognition that the uses and impacts of technology are heavily shaped by social factors, including local culture, context and legal traditions. Quad membership is often defined in contrast to autocratic or non-democratic powers. Yet there are also considerable divergences within and between Quad members, and among other partners, over what the responsible development, use and governance of technology (and related data) comprises. Like-minded countries also differ, both among and within themselves, on whether particular technologies are perceived to threaten or to enhance security, economic and social interests and values.