Tuesday, April 21, 2026
Dubai News TV

How do you know when AI is powerful enough to be dangerous? Regulators try to do the math

by Web Desk
2 years ago
in Tech, Technology, Top News

How do you know if an artificial intelligence system is so powerful that it poses a security danger and shouldn’t be unleashed without careful oversight?

For regulators trying to put guardrails on AI, it’s mostly about the arithmetic. Specifically, an AI model trained using 10 to the 26th floating-point operations must now be reported to the U.S. government and could soon trigger even stricter requirements in California.

Say what? Well, if you’re counting the zeroes, that’s 100,000,000,000,000,000,000,000,000, or 100 septillion, calculations in total over the course of training, using a measure known as flops, short for floating-point operations.
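The figure is easier to trust when written out. A minimal sketch in Python, doing nothing more than the arithmetic described above:

```python
# The U.S. reporting threshold: 10 to the 26th power, written out.
THRESHOLD_FLOPS = 10**26

# Confirm the figures in the text: a 1 followed by 26 zeroes,
# i.e. 100 septillion (a septillion is 10**24).
assert THRESHOLD_FLOPS == 100_000_000_000_000_000_000_000_000
assert THRESHOLD_FLOPS == 100 * 10**24

print(f"{THRESHOLD_FLOPS:,}")  # 100,000,000,000,000,000,000,000,000
print(len(str(THRESHOLD_FLOPS)) - 1, "zeroes after the leading 1")  # 26
```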

What it signals to some lawmakers and AI safety advocates is a level of computing power that might enable rapidly advancing AI technology to create or proliferate weapons of mass destruction, or conduct catastrophic cyberattacks.

Those who’ve crafted such regulations acknowledge they are an imperfect starting point to distinguish today’s highest-performing generative AI systems — largely made by California-based companies like Anthropic, Google, Meta Platforms and ChatGPT-maker OpenAI — from the next generation that could be even more powerful.

Critics have pounced on the thresholds as arbitrary — an attempt by governments to regulate math.

“Ten to the 26th flops,” said venture capitalist Ben Horowitz on a podcast this summer. “Well, what if that’s the size of the model you need to, like, cure cancer?”

An executive order signed by President Joe Biden last year relies on that threshold. So does California’s newly passed AI safety legislation — which Gov. Gavin Newsom has until Sept. 30 to sign into law or veto. California adds a second metric to the equation: regulated AI models must also cost at least $100 million to build.

Following in Biden’s footsteps, the European Union’s sweeping AI Act also counts floating-point operations, or flops, but sets the bar 10 times lower, at 10 to the 25th power. That covers some AI systems already in operation. China’s government has also looked at measuring computing power to determine which AI systems need safeguards.
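To get a rough sense of where these bars sit, one can use the common back-of-the-envelope approximation (not from this article) that total training compute is about 6 × parameter count × training tokens. The model figures below are illustrative assumptions, not disclosed numbers for any real system:

```python
US_THRESHOLD_FLOPS = 10**26   # Biden executive order reporting trigger
EU_THRESHOLD_FLOPS = 10**25   # EU AI Act bar, ten times lower

def training_flops(params: float, tokens: float) -> float:
    """Rough estimate of total training compute using the widely cited
    C ~= 6 * N * D approximation (N = parameters, D = training tokens)."""
    return 6 * params * tokens

# Hypothetical model: 70 billion parameters, 15 trillion training tokens.
compute = training_flops(70e9, 15e12)  # ~6.3e24 flops

print(f"{compute:.2e} flops")
print("Over EU bar:", compute > EU_THRESHOLD_FLOPS)  # False
print("Over US bar:", compute > US_THRESHOLD_FLOPS)  # False
```

Under that approximation, such a model would fall below both thresholds; scaling parameters or tokens up by roughly an order of magnitude would cross the EU bar first, since it is ten times lower.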

No publicly available models meet the higher California threshold, though it’s likely that some companies have already started to build them. If so, they’re supposed to be sharing certain details and safety precautions with the U.S. government. Biden employed a Korean War-era law to compel tech companies to alert the U.S. Commerce Department if they’re building such AI models.

AI researchers are still debating how best to evaluate the capabilities of the latest generative AI technology and how it compares to human intelligence. There are tests that judge AI on solving puzzles, logical reasoning or how swiftly and accurately it predicts what text will answer a person’s chatbot query. Those measurements help assess an AI tool’s usefulness for a given task, but there’s no easy way of knowing which one is so widely capable that it poses a danger to humanity.

“This computation, this flop number, by general consensus is sort of the best thing we have along those lines,” said physicist Anthony Aguirre, executive director of the Future of Life Institute, which has advocated for the passage of California’s Senate Bill 1047 and other AI safety rules around the world.

Floating point arithmetic might sound fancy “but it’s really just numbers that are being added or multiplied together,” making it one of the simplest ways to assess an AI model’s capability and risk, Aguirre said.

“Most of what these things are doing is just multiplying big tables of numbers together,” he said. “You can just think of typing in a couple of numbers into your calculator and adding or multiplying them. And that’s what it’s doing — ten trillion times or a hundred trillion times.”
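Aguirre’s description can be made concrete with standard flop accounting (an illustration, not something he states): multiplying an m×k matrix by a k×n matrix is conventionally counted as about 2·m·k·n flops, one multiply and one add per term.

```python
def matmul_flops(m: int, k: int, n: int) -> int:
    """Conventional flop count for multiplying an (m x k) matrix by a
    (k x n) matrix: each of the m*n output entries needs k multiplies
    and k-1 adds, rounded to 2*m*k*n in the usual convention."""
    return 2 * m * k * n

# A single multiply of two 4096 x 4096 matrices, a typical building
# block of the "big tables of numbers" Aguirre describes:
print(matmul_flops(4096, 4096, 4096))  # about 1.4e11 flops
```

Training a large model repeats operations like this many trillions of times, which is how the totals climb toward the 10-to-the-26th range.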

For some tech leaders, however, it’s too simple and hard-coded a metric. There’s “no clear scientific support” for using such metrics as a proxy for risk, argued computer scientist Sara Hooker, who leads AI company Cohere’s nonprofit research division, in a July paper.

“Compute thresholds as currently implemented are shortsighted and likely to fail to mitigate risk,” she wrote.

Venture capitalist Horowitz and his business partner Marc Andreessen, founders of the influential Silicon Valley investment firm Andreessen Horowitz, have attacked the Biden administration as well as California lawmakers for AI regulations they argue could snuff out an emerging AI startup industry.

For Horowitz, putting limits on “how much math you’re allowed to do” reflects a mistaken belief there will only be a handful of big companies making the most capable models and you can put “flaming hoops in front of them and they’ll jump through them and it’s fine.”

In response to the criticism, the sponsor of California’s legislation sent a letter to Andreessen Horowitz this summer defending the bill, including its regulatory thresholds.

Regulating at over 10 to the 26th flops is “a clear way to exclude from safety testing requirements many models that we know, based on current evidence, lack the ability to cause critical harm,” wrote state Sen. Scott Wiener of San Francisco. Existing publicly released models “have been tested for highly hazardous capabilities and would not be covered by the bill,” Wiener said.




© 2024 Dubai News TV - Powered by Global Biz International.
