
AI regulators in the UK are ‘under-resourced’, warns science committee chairman

Outgoing science committee chair Greg Clark has said AI regulators are under-resourced (James Manning / PA).

Artificial intelligence regulators in the UK are “under-resourced” in comparison to developers of the technology, the Commons science committee chairman has warned.

The Science, Innovation and Technology Committee said in a report into the governance of AI that £10 million announced by the Government in February to help Ofcom and other regulators respond to the growth of the technology was “clearly insufficient”.

It added that the next government should announce further financial support “commensurate to the scale of the task”, as well as “consider the benefits of a one-off or recurring industry levy” to help regulators.

Outgoing committee chairman Greg Clark said he was “worried” that UK regulators were “under-resourced compared to the finance that major developers can command”.

The report, published on Tuesday, also expressed concern at suggestions the new AI Safety Institute has been unable to access some developers’ models to perform pre-deployment safety testing that was intended to be a major focus of its work.

The committee has called on the next government to name any developers that refused access — in contravention of the agreement at the November 2023 summit at Bletchley Park — and report their justification for refusing.

It adds that the Government and regulators should safeguard the integrity of the election campaign by taking “stringent enforcement action” against online platforms hosting deepfake content which “seeks to exert a malign influence on the democratic process”.

Former business secretary Mr Clark said it was important to test the outputs of AI models for biases “to see if they have unacceptable consequences”, as biases “may not be detectable in the construction of models”.

Commenting on the report, Mr Clark said: “The Bletchley Park summit resulted in an agreement that developers would submit new models to the AI Safety Institute.

“We are calling for the next government to publicly name any AI developers who do not submit their models for pre-deployment safety testing.

“It is right to work through existing regulators, but the next government should stand ready to legislate quickly if it turns out that any of the many regulators lack the statutory powers to be effective.

“We are worried that UK regulators are under-resourced compared to the finance that major developers can command.”

In its report, the committee states that the “most far-reaching challenge” of AI may be the way it can operate as a “black box” – in that the basis of, and reasoning for, its output may be unknowable.

The MPs add that if a chain of reasoning cannot be viewed, there must be stronger testing of the outputs of AI models as a means to assess their power and acuity.

The committee states that the conclusions and recommendations of the report apply to whoever is in government after the General Election on July 4.

In its last report of the current Parliament on the topic, the committee writes: “It is important that the timing of the General Election does not stall necessary efforts by the Government, developers and deployers of AI to increase the level of public trust in a technology that has become a central part of our everyday lives.”

It adds that any new government should be ready to produce AI-specific legislation should the current approach “prove insufficient to address current and potential future harms associated with the technology”.

The Department for Science, Innovation and Technology said the UK was taking steps to regulate AI and upskilling regulators as part of a wider £100 million funding package.