Tony Blair and Nick Clegg Host Dinner to Connect Tech Leaders with UK Ministers

Earlier this year, Tony Blair and Nick Clegg organized a private dinner where a group of technology entrepreneurs had the opportunity to meet influential ministers, as revealed by official documents.

As a long-time champion of the tech industry, the former prime minister hosted the dinner at a high-end London hotel on behalf of his political consultancy, the Tony Blair Institute (TBI).

Together with the former deputy prime minister Mr. Clegg, who was a senior executive at Meta at the time, he invited the leaders of six tech firms to meet ministers including Poppy Gustafsson, the government’s investment minister tasked with encouraging businesses to invest in the UK.

Mr. Blair is a passionate advocate for the transformative potential of technology in public services and has actively sought partnerships with industry leaders. His consultancy has produced several policy papers that advocate for placing artificial intelligence at the core of government initiatives.

However, some critics have expressed concern that Mr. Blair, known for his close ties to Keir Starmer’s administration, has been able to influence the agenda without adequate public oversight. There are also questions about the reliance of Mr. Blair’s consultancy on significant contributions from the Silicon Valley billionaire Larry Ellison, an acquaintance of Donald Trump and Elon Musk.

Mr. Ellison, who briefly claimed the title of the world’s richest person this year, has donated or committed over $300 million to Mr. Blair’s consultancy.

Documents obtained by the Guardian through freedom of information laws reveal that 12 attendees discussed the government’s evolving stance on artificial intelligence at a gathering described as a “salon dinner.”

Do you have any information about this story? Email henry.dyer@theguardian.com or message (using your non-work phone) Signal or WhatsApp to +44 7721 857348. For the most secure communications, visit theguardian.com/tips.

The dinner took place at the luxurious Corinthia Hotel in late January; guests included Ron Jaffe, managing director of Insight Partners, a US venture capital firm that invests in tech companies.

Also present were Alex Kendall, CEO of the self-driving car company Wayve; Nigel Toon, head of the computer chip maker Graphcore; and Mark Warner, CEO of Faculty AI, who last year collaborated with TBI on a paper about using AI to improve public services.

A representative for Mr. Clegg noted, “During his tenure at Meta, Nick Clegg frequently interacted with government ministers and other tech CEOs, which is standard for his role in policy and global affairs.”

According to a TBI spokesperson, “The event brought tech leaders together with ministers to discuss a range of issues. No companies were charged to attend.”

The companies that participated said they neither donate to nor have hired TBI.

The dinner illustrates how Mr. Blair’s consultancy is advancing pro-technology policies. The rapidly growing TBI is active in 45 countries and employs over 900 staff. Its most recent financial statement reported revenues of $145 million in 2022 from advisory services and donations, although many donors and clients remain undisclosed.

The consultancy faces criticism for potentially allowing donor interests to influence its policy positions, a claim it disputes. It has also been criticized for continuing financial ties with Saudi Arabia following the murder of the journalist Jamal Khashoggi in 2018. Mr. Blair is expected to play a significant role in the reconstruction of Gaza following the war.

Weeks before the dinner, the government confidentially shared an outline of its AI action plan with TBI shortly before it was due to be publicly released. Feryal Clark, then a minister at the Department for Science, Innovation and Technology (DSIT), agreed to share it on January 9 at the prompting of TBI’s director of science policy, Jacob Mokander.

The following day, an aide to Ms. Clark wrote to Mr. Mokander: “It was a pleasure speaking with you. As a follow-up, here is the confidential action plan summary. Thank you for amplifying the plan through your networks and supportive quotes on Monday.” Mr. Mokander replied: “Thanks for sharing the action plan (in confidence).” The action plan, which aims to bolster the UK’s role in AI development and deployment, was published with Mr. Blair’s endorsement on January 13.

When asked why the document was shared with TBI so early, a DSIT spokesperson said, “We make no apology for our regular engagement with stakeholders. It is standard to share embargoed information with them ahead of publication.”

A spokesperson for TBI remarked, “It’s typical for governments to consult experts and engage various stakeholders when crafting policy. As indicated in the footnotes, the AI Opportunity Action Plan accurately references our published work.”




Source: www.theguardian.com

UK Minister’s Advisor Contends that AI Companies Should Not Compensate Creatives

An aide to the science secretary has said that AI firms should not be required to compensate creators for using their content to train their systems.

Kirsty Innes, recently appointed as special advisor to Liz Kendall, the secretary of state for science, innovation and technology, remarked: “Philosophically, whether or not you believe large companies should compensate content creators, there is no legal obligation for them to do so.”

The government is currently consulting on how creatives should be compensated by AI companies, prompting well-known British artists, including Mick Jagger, Kate Bush, and Paul McCartney, to urge Ms. Kendall’s predecessor to stand up for creators’ rights and safeguard their work.

Innes, who previously worked at the Tony Blair Institute (TBI) think tank, made the remarks in since-deleted posts on X in February, before her appointment.

TBI received donations amounting to $270 million (£220 million) last year from Larry Ellison, the billionaire co-founder of the tech firm Oracle. Oracle is backing Stargate, the $500bn project to build AI infrastructure in the US with OpenAI and the Japanese investment firm SoftBank.

Tensions with the UK creative community have risen since Labour began consulting on reform of copyright law. The government’s suggested approach would allow AI companies to use copyrighted works without permission unless the rights holders opt out.

Some content creators, including the Guardian and the Financial Times, have struck agreements with OpenAI, the maker of ChatGPT, to license their content for use by the San Francisco startup.

The government now says the opt-out approach is no longer its preferred option, and it has assembled working groups from the creative and AI sectors to develop solutions.

Ed Newton-Rex, founder of Fairly Trained, a nonprofit that certifies generative AI companies that respect creators’ rights, said: “I hope she takes this into account with her advisor. This perspective aligns more closely with public sentiment, which is rightly concerned about the implications of AI and the influence of large technology firms.”


He said Kendall’s appointment was an opportunity to reset an increasingly fraught relationship between the creative industries and big technology companies, which have demanded copyright reform without any obligation to compensate creators.

Both Innes and Kendall declined to comment.

Beeban Kidron, a crossbench peer who has campaigned against the loosening of copyright law, said: “Last week the creative industries sent a letter to the prime minister asking him to clearly define the rights of individuals in their scientific and creative endeavors, especially in light of the unintended consequences of government policies that overlook British citizens.”

Source: www.theguardian.com

OpenAI Chief and UK Tech Secretary Discussed Nationwide ChatGPT Plus Initiative

The head of the organization behind ChatGPT and the UK’s tech secretary recently discussed a multibillion-pound initiative to offer premium AI tool access across the nation, the Guardian has learned.

Sam Altman, OpenAI’s co-founder, had conversations with Peter Kyle regarding a potential arrangement that would enable UK residents to utilize its sophisticated products.

Informed sources indicate that the concept emerged during a broader discussion in San Francisco about opportunities for collaboration between OpenAI and the UK.

Individuals familiar with the talks noted that Kyle was somewhat skeptical about the proposal, largely due to the estimated £2 billion cost. Nonetheless, the exchange reflects the Technology Secretary’s willingness to engage with the AI sector, despite prevailing concerns regarding the accuracy of various chatbots and issues surrounding privacy and copyright.

OpenAI provides both free and subscription versions of ChatGPT, with the paid ChatGPT Plus version costing $20 per month. This subscription offers quicker response times and priority access to new features for its users.

According to UK government transparency data, Kyle dined with Altman in March and April. In July, he formalized an agreement with OpenAI to incorporate AI into public services throughout the UK. The non-binding agreement could give OpenAI access to government data and see its tools applied in education, defense, security, and justice.

Peter Kyle, secretary of state for science, innovation and technology. Photograph: Thomas Krych/Zuma Press Wire/Shutterstock

Kyle is a prominent advocate for AI within the government and uses it in his own role. In March, it emerged that he had consulted ChatGPT on job-related questions, including the barriers to AI adoption among British businesses and which podcasts he should appear on.

The minister spoke to PoliticsHome in January about his approach to the technology.

The UK is among OpenAI’s top five markets for paid ChatGPT subscriptions. An OpenAI spokesperson said “[a memorandum of understanding] aims to assess how the government can facilitate AI growth in the UK.”

“In line with the government’s vision of leveraging this technology to create economic opportunities for everyday individuals, our shared objective is to democratize AI access. The wider the reach, the greater the benefits for everyone.”

The company has recently been in talks with several governments, and has secured a contract with the UAE to use its technology in public sectors such as transportation, healthcare, and education, enabling nationwide ChatGPT adoption.

The UK government is eager to attract AI investment from the US, having struck a deal with OpenAI’s competitor Google earlier this year.

Kyle has said that over the next ten years the makeup of any new UN security council will be significantly influenced by technology, especially AI, which he believes will play a fundamental role in determining global power dynamics.


Similar to other generative AI tools, ChatGPT is capable of generating text, images, videos, and music upon receiving user prompts. This functionality raises concerns about potential copyright violations, and the technology has faced criticism for disseminating false information and offering poor advice.

The minister has expressed support for planned amendments to copyright law that would permit AI companies to utilize copyrighted materials for model training, unless the copyright holder explicitly opts out.

The consultations and reviews by the government have sparked claims from creative sectors that the current administration is too aligned with major tech companies.

UKAI, the UK’s foremost trade body for the AI industry, has repeatedly argued that the government’s strategy is overly focused on large tech players and neglects smaller firms.

A government representative said: “We do not recognize these claims. We are collaborating with OpenAI and other leading AI firms to explore investment in UK infrastructure, enhance public services, and rigorously test the security of emerging technologies before their introduction.”

The Department for Science, Innovation and Technology said that discussions about making ChatGPT Plus accessible to UK residents had not advanced, and that it had not conferred with other departments on the matter.

Source: www.theguardian.com

Proposed phone bill for young teens faces opposition from government ministers, sparking safety concerns

The bill seeking to ban addictive smartphone algorithms targeting young teenagers was weakened after opposition from the technology secretary, Peter Kyle, and the education secretary, Bridget Phillipson.

The Safer Phones Bill, introduced by the Labour MP Josh MacAlister, is set to be debated in the Commons on Friday. Despite support from various MPs and child protection charities, the government has opted to investigate the issue further rather than implement immediate changes.

Government sources indicate that the watered-down proposal will be accepted, as MacAlister’s original bill did not have ministerial support.

The government believes more time is needed to assess the impact of mobile phones on teenagers and to evaluate the emerging content controls being developed by phone companies.

Peter Kyle opposed the stronger bill, which some advocates had hoped would become a second online safety act.

A source close to Kyle said that while he is not fundamentally against government intervention on the issue, the work is still at an early stage.

The original proposal included requirements for social media companies to exclude young teens from their algorithms and limit addictive content for those under 16. However, these measures were removed from the final bill.

Another measure to ban mobile phones in schools was also dropped after objections from Bridget Phillipson, who believes schools should self-regulate. There are uncertainties regarding potential penalties for violations.

Health Secretary Wes Streeting has been vocal about addressing addictive smartphone use and publicly supported MacAlister’s bill.

The revised private member’s bill instructs the chief medical officer, Chris Whitty, to investigate the health impacts of smartphone use.


MacAlister hopes the bill will prompt the government to address addictive smartphone use among children more seriously, rather than focusing only on harmful or illegal content.

If the minister commits to adopting the new measures as anticipated, MacAlister will not push for a vote on the bill.

The government has pledged to “publish a research plan on the impact of social media use on children” and seek advice from the UK’s chief medical officer on parents’ management of their children’s smartphone and social media usage.

Polls indicate strong public support for measures restricting young people’s use of social media, with a majority favoring a ban on social media for those under 16.

Source: www.theguardian.com

Britain postpones AI regulation as ministers aim to align with Trump administration

Ministers have postponed the regulation of artificial intelligence in order to align with the Trump administration, the Guardian has learned.

Three Labour sources said the AI bill, originally planned for release before Christmas, is now expected to be delayed until the summer.

Ministers had intended to introduce a short, focused bill shortly after taking office.

The bill was intended to address concerns about the potential risks that advanced AI models pose to humanity and to clarify the use of copyrighted material by AI companies, which is the subject of separate proposals.

However, Trump’s election prompted a rethink. Senior Labour sources said the bill was being carefully reconsidered and that there are no firm proposals yet on its content. One source added that they had aimed to pass it before Christmas, but it is now delayed until the summer.

Another Labour source familiar with the legislation said earlier drafts of the bill had been prepared months ago but are now being held back because Trump’s actions could negatively affect British businesses; ministers are reluctant to proceed without addressing those concerns.

Trump’s actions have undermined Biden’s plans for AI regulation, including revoking an executive order aimed at ensuring technology safety and reliability. The future of the US AI Safety Institute is uncertain following the resignation of its director. Additionally, US Vice President JD Vance opposed planned European technical regulations at the AI Summit in Paris.

The UK government opted to align with the US by not signing the Paris Declaration endorsed by 66 other countries at the summit. UK Ambassador to Washington Peter Mandelson reportedly proposed making the UK a major US AI investment hub.

During a December committee meeting, Science and Technology Secretary Peter Kyle hinted that the AI bill was at an advanced stage. However, the science minister Patrick Vallance said earlier this month that there is no bill currently in place.

A government spokesperson stated, “This government remains committed to enacting legislation that will ensure the safe realization of the significant benefits of AI for years to come.

“We are actively refining our proposals for publication soon to ensure an effective approach to this rapidly evolving technology. Consultations will begin shortly.”

The minister also faces pressure over separate plans to allow AI companies to train models on online material, including creative works, without requiring copyright permission.


Artists like Paul McCartney and Elton John have criticized this move, warning that it could undermine traditional copyright laws protecting artists’ livelihoods.

Source: www.theguardian.com

AI tool can offer ministers a ‘vibe check’ on how MPs will receive their policies

A new artificial intelligence tool known as Parlex can warn ministers when a policy is likely to be unpopular within their own party by running a “parliamentary mood check.”

Parlex is just one of the AI tools being developed for ministers and civil servants to anticipate issues with backbenchers and pinpoint supportive legislators.

By inputting a policy outline, such as a 20mph speed limit, the tool can predict how legislators will respond based on their past contributions in parliament. A demonstration video on the government website shows historical opposition from Conservative MPs and support from Labour MPs for traffic calming measures.

Described as a “vibe check,” the tool helps policy teams understand the political landscape and develop response strategies before a policy is formally proposed in parliament.

According to a report by The Times, key MPs like Iain Duncan Smith and former MP Tobias Ellwood oppose the 20mph limit, while Labour MP Kerry McCarthy supports traffic calming measures.

The tool is expected to be more useful to civil servants than to ministers, who should already have a good understanding of parliamentary opinion.

Prime Minister Keir Starmer recently announced an AI plan involving significant investment in Britain’s computing capacity to integrate the technology into the nation’s infrastructure.

The government’s initiatives include releasing public data to foster AI businesses, including anonymized NHS data for research and innovation purposes with strong privacy safeguards in place.

Ministers believe AI could stimulate Britain’s economic growth and generate an estimated economic boost of up to £470bn over the next decade.

Parlex is just one of many AI tools being developed within the government, with other tools like Redbox aimed at automating document analysis for civil servants.


The tool will soon be available to all civil servants in the Cabinet Office and DSIT, aiming to streamline manual processes and increase efficiency.

Another tool, called Consult, is designed to save money by automating the analysis of consultation responses, allowing civil servants to better analyze and act on public opinion.

The Department for Work and Pensions has also used AI, including a tool known as ‘white mail’ that analyzes the letters it receives daily so that information can be communicated more effectively.

However, challenges have arisen, such as underperforming government algorithms producing inaccuracies in identifying housing benefit fraud suspects.

Source: www.theguardian.com

Ministers around the world targeted by Russian hackers on WhatsApp

Government-linked hackers from Russia targeted WhatsApp accounts of government officials worldwide by sending emails inviting them to join user groups on the messaging app.

This tactic by a hacking group called Star Blizzard is a new approach. The UK’s National Cyber Security Center (NCSC) has connected Star Blizzard to Russia’s FSB domestic spy agency, accusing them of trying to undermine trust in politics in the UK and similar countries.

According to Microsoft, victims received an email from an attacker posing as a US government official, instructing them to scan a QR code. Rather than joining a group, scanning the code linked the victim’s WhatsApp account to the attacker’s device through WhatsApp’s linked-device or web portal feature, giving the attacker access to the account.

Microsoft stated, “Threat actors gain access to messages within WhatsApp accounts and the ability to exfiltrate this data.”

The fake email invited recipients to join a WhatsApp group about supporting NGOs in Ukraine. Ministers and officials from various countries, especially those involved in Russia-related affairs, defense policy, and Ukraine support, were targeted.

In 2023, NCSC revealed that Star Blizzard had targeted British MPs, universities, and journalists to interfere with British politics. The group is likely affiliated with Russia’s FSB Center 18 unit.

Microsoft warned that despite the WhatsApp campaign ending in November, Star Blizzard continues to use spear phishing tactics to steal sensitive information.


Microsoft advised targeted sectors to be cautious with emails, especially those with external links. They recommend verifying email authenticity by contacting the sender through a known email address.

WhatsApp, owned by Meta, offers end-to-end encryption, ensuring message privacy between sender and recipient unless account access is compromised.

A WhatsApp spokesperson emphasized using only official WhatsApp-supported services to link accounts and advised users to click only on links from people they know and trust.

Source: www.theguardian.com

UK Government Ministers Officially Announce Ban on Mobile Phone Use in Schools

Ministers have confirmed plans to ban the use of mobile phones in English schools and have published guidance for headteachers, which some unions believe includes practices that are already widely adopted.

One headteacher welcomed the Department for Education’s (DfE) plan, saying it would help give schools the confidence to make changes that would benefit pupils, even if it may be met with opposition from parents.

This non-statutory guidance offers schools a range of ways to enforce the ban, from having pupils leave mobile phones at home to storing them in inaccessible lockers, and aims to address the distraction caused by smartphones in schools as well as concerns about bullying and social pressures.

Education Secretary Gillian Keegan stated that the guidance aims to “empower” schools that do not currently ban phones and to “provide clarity and consistency.” The guidance emphasizes that schools should be places for learning, interaction, and friendship rather than constant mobile phone use.

There are also concerns about children’s access to harmful content on phones, leading to calls for technology companies and mobile phone manufacturers to take action.

The 13-page DfE guidance says that mobile phone policies should be clearly communicated to students, with the reasons behind them explained, and that parents should be involved in the ban.

Geoff Barton, general secretary of the Association of School and College Leaders, expressed concerns about the amount of time some children spend on their phones and stated that the new guidance is not impactful, as most schools already have policies in place to address mobile phone use.

Vic Goddard, who leads two schools in Essex, said Passmores Academy had introduced a total phone ban that was well received by both parents and students, and that the guidance would help schools navigate potential conflicts with parents.

Source: www.theguardian.com

Ministers urged to update computer evidence laws to prevent another Horizon scandal

Legal experts are calling for immediate changes to the law so that courts can recognize when a computer is at fault, warning that without reform there is a risk of a repeat of the Horizon scandal.

Under English and Welsh law, computers are presumed to be working correctly unless proven otherwise, a presumption critics say effectively reverses the burden of proof in criminal cases.

Stephen Mason, a barrister and electronic evidence expert, said: “If someone says, ‘There’s something wrong with this computer,’ they have to prove it, even though it is the party relying on the computer that holds the information.”


Mason, along with eight other legal and computer experts, proposed changes to the law in 2020 after the High Court’s ruling against the Post Office. However, their recommendations were never implemented.

The legal presumption of computer reliability comes from the old common law principle that “mechanical instruments” should be presumed to be in good working order unless proven otherwise.

The Police and Criminal Evidence Act 1984 provided that computer evidence was admissible only if it could be shown that the computer was working properly, but that provision was repealed in 1999.

The international influence of English common law means that the presumption of reliability is widespread, with examples from New Zealand, Singapore, and the United States supporting this standard.

Noah Waisberg, CEO of the legal AI platform Zuva, emphasized the urgency of re-evaluating the law in the context of AI systems and the need to avoid assuming computer programs are error-free.

Of such systems, he added: “It would be difficult to say that it would be reliable enough to support a conviction.”

James Christie, a software consultant, suggested changing the law in two stages: requiring those submitting computer evidence to demonstrate that the system was responsibly developed and maintained, and to disclose records of known bugs.

The Ministry of Justice declined to comment on the matter.

Source: www.theguardian.com