UK Government-Linked AI Firm Developing Technology for Military Drones

Concerns Raised Over Faculty AI's Work with Defense Industry After Partnerships with NHS and AI Safety Institute

Faculty AI, a British artificial intelligence startup with close ties to the UK government, is developing AI technology for military drones, raising ethical concerns. The company, which has previously worked with the NHS and the government's AI Safety Institute (AISI), is reportedly developing and deploying AI models onto unmanned aerial vehicles (UAVs), according to defense industry partner Hadean.

Faculty AI distinguishes itself from companies like OpenAI by focusing on reselling and consulting on the use of existing AI models, particularly those from OpenAI, rather than developing its own foundation models. The company gained prominence through its data analysis work for the Vote Leave campaign and subsequent collaborations with Boris Johnson's government, including pandemic-related projects, with its CEO, Marc Warner, attending government scientific advisory meetings. More recently, Faculty has conducted AI model testing for the AISI, established under former Prime Minister Rishi Sunak.

While confidentiality agreements keep much of Faculty's defense work undisclosed, a press release from Hadean, a London-based startup partnering with Faculty, mentioned collaboration on "subject identification, tracking object movement, and exploring autonomous swarming development, deployment and operations." It is understood that this specific work did not involve weapons targeting. However, Faculty declined to comment on whether its work includes drones capable of lethal force. A Faculty spokesperson stated that the company helps develop AI models for "safer, more robust solutions" for its defense partners, adheres to "rigorous ethical policies and internal processes," and follows Ministry of Defence ethical guidelines on AI. The spokesperson also emphasized Faculty's decade of experience in AI safety, including work on combating child sexual abuse and terrorism.

The development of AI for military drones raises significant ethical and legal questions, particularly regarding the potential for autonomous weapons systems. Experts and politicians have urged caution in introducing such technologies, with a House of Lords committee advocating for international agreements to clarify the application of humanitarian law to lethal drones. The Green party has called for a complete ban on lethal autonomous weapons systems. The Scott Trust, owner of the Guardian, holds a minority share in Faculty through its investment in Mercuri VC (formerly GMG Ventures).