The final link in the chain is the role of corporate governance and the importance it plays in fast-paced industries such as the AV sector. Whether driven by regulation or rooted in corporate purpose, both the success of the company in question and the sector as a whole rely on responsible governance and business ethics. Sound technology development, adherence to regulatory requirements, and ethical and responsible deployment all hinge on the leadership and governance of an organisation. However, the Institute of Business Ethics37 reports that the growing use of integrated AI and data has thrown large challenges in the path of Boards, some of whom are either struggling with, or are ill prepared for, the decisions and nature of oversight required.
Organisational literacy
Not unique to the Boards of AV developers, but rather a ubiquitous challenge for all in the era of the 4th Industrial Revolution, is the ability to interrogate information and make informed decisions – not just about the technology itself, but about the issues it gives rise to, such as the rule of law, data privacy, and ethics. Boards ultimately have to take accountability for the actions of their companies, but not every board member is, or ought to be, an expert on the technology deployed. However, where boards are in the main balancing risk and opportunity, it is vital that they have some mechanism by which to assess the real scale and nature of the choices they face. This responsibility is not simply academic corporate governance theory. As an illustration, the UK Government’s principles of vehicle cyber security for connected and automated vehicles, issued in 2017, make clear where accountability sits for cyber security.
Principle 1 – Organisational security is owned, governed and promoted at board level
Principle 2 – Security risks are assessed and managed appropriately and proportionally, including those specific to the supply chain
Principle 3 – Organisations need product aftercare and incident response to ensure systems are secure over their lifetime
Principle 4 – All organisations, including sub-contractors, suppliers and potential 3rd parties, work together to enhance the security of the system
Principle 5 – Systems are designed using a defence-in-depth approach
Principle 6 – The security of all software is managed throughout its lifetime
Principle 7 – The storage and transmission of data is secure and can be controlled
Principle 8 – The system is designed to be resilient to attacks and respond appropriately when its defences or sensors fail
The key principles of vehicle cyber security for connected and automated vehicles, UK Government
A simple way of explaining some of the dilemmas faced is described by the IBE as ‘a need to draw a line between the desire to deliver good outcomes for most people and willingness to accept the need to explain how and why poor decisions are made for a few’38. The House of Lords Select Committee on AI stated that ‘it is not acceptable to deploy any artificial intelligence system which could have a substantial impact on an individual’s life, unless it can generate a full and satisfactory explanation for the decisions it will take’. Boards and development teams therefore face a choice between using technologies which can or cannot be easily explained.
These are complex issues, but ultimately boards need to have an acute appreciation of the risks being taken, their risk appetite, and their risk management strategies. Arguably they also ought to have a very clear AI governance framework, with checks and controls, which is deployed throughout the business. The companies leading research into AI in the US and China, including Google, Amazon, Microsoft, Baidu, SenseTime and Tencent, have taken very different approaches on issues such as whether to develop technology that can ultimately be used for purposes such as military applications and surveillance. For instance, Google has said it will not sell facial recognition services to governments, while Amazon and Microsoft both do so.
Importantly, although the buck stops with directors, corporate governance is not delivered by boards alone. The awareness and understanding of these complex considerations needs to exist throughout organisations. Some companies have already started to roll out education and awareness programmes to bridge the gap. In some cases external oversight boards have been established to provide a second pair of eyes – although the jury is still out on their effectiveness after a number of high-profile false starts. Whichever methods are employed, the principles of transparency, accountability, privacy and the rule of law ought to be commonplace in all stages of development and decision making.
Corporate culture and responsibility
Playing heavily into literacy is the role of corporate culture. It has been argued that society’s primary challenge today is to cope with the effects of accelerated innovation and the disruptive technologies it generates.39 At the forefront of this challenge are the businesses driving the technological development. The social impact of the decisions made internally can be far reaching, in ways which were not anticipated – one only needs to consider the dark web, or the mental health impacts of social media.
Understanding and accepting this form of moral responsibility, as opposed to focusing on legal responsibility, is quite distinct from traditional models of compliance training. A reflection of the difference is the degree to which some of these internal dialogues might also need to be external – engaging with other disciplines, between firms in the sector, with governments and, importantly, with civil society. Appreciating the value of external input, collaborating for the overall benefit of all, and reflecting on the views of civil society will, for some in the sector, represent a significant change in governance approach.
Aside from pressure from regulators and consumers, internally boards are facing a more engaged workforce. For instance, Googlers have taken to the streets to protest against the company’s involvement in work such as ‘Project Dragonfly’, which planned to develop a censored search engine for China, or its involvement with the Pentagon.40
Responsibility extends far in a sector with a complex and global supply chain. A poorly designed algorithm in a sensor component, built using poorly collected data and in turn unable to recognise certain types of people, could have serious consequences. This is made even more complex when sourcing from countries which do not share the same views on issues such as data privacy. In China, for instance, CloudWalk, Yitu and SenseTime have all partnered with the Chinese government to roll out facial recognition and predictive policing, particularly among minority groups such as Uighur Muslims.41
A further area of potential weakness relates to the maturity of the companies in question. The field of AI is synonymous with start-ups, where the regulatory gap runs the risk of being profound. Many fly under the radar until the technology they have developed takes off – but the speed at which this can happen outstrips the normal business development curve which most regulators can deal with. This leaves only corporate culture and governance to act as a check and balance on quality and ethics – a challenge for companies unlikely to have a mature governance framework.
These differences in approach need to be known, understood and then managed within the supply chain. Although interrogating the supply chain is difficult and time-consuming, it can be achieved, and ought to remain high on the risk register.
Ultimately governance failures are bad for business. As Ginni Rometty of IBM has said, companies are judged not just by how they use data, but by whether they are trusted stewards of other people’s data42 – a concept of stewardship that can be applied much more broadly. The Institute of Business Ethics argues that there is a clear link between taking an ethical approach and competitiveness. The case seems straightforward; however, past governance failures, such as the VW emissions scandal, show that there are fault lines hidden within corporate culture. With such a large prize at stake in the AV sector, the pressure on companies to hold fast to ethical principles will remain intense – a risk made all the more acute by the current lack of clear regulatory frameworks and standards.