
Tiongkok makes first global achievement that could impact certain US policy: 'Keen to show the world that their program is unstoppable'

k1976


China Makes Huge Nuclear Leap in World First for Clean Energy


PUBLISHED
NOV 25, 2025 AT 05:21 AM EST

By Sam Stevenson

Associate News Editor

China has launched the world's first commercial supercritical carbon dioxide (sCO2) power generator, a clean-energy breakthrough developed by the China National Nuclear Corporation (CNNC).
A pioneering generator that uses carbon dioxide instead of steam to transfer heat has been connected to the grid at a steel plant in Guizhou, southwestern China. The system converts waste heat into electricity, according to a November 10 post by CNNC's Nuclear Power Institute of China, the Hong Kong-based South China Morning Post reported.


Why It Matters

China's rapid advances in nuclear technology signal a potentially transformative shift in the global energy landscape.
With game-changing developments such as the world's first commercial sCO2 power generator and successful thorium-to-uranium breeding in a molten salt reactor, China is positioning itself to address both domestic environmental concerns and global challenges in clean energy supply, energy security and technology leadership.
These innovations could reshape nuclear power's safety profile and cost structure, with implications for U.S. energy policy, international competition and future global decarbonization efforts.
 
China is quickly becoming the global leader in nuclear power, with nearly as many reactors under construction as the rest of the world combined. While its dominance of solar panels and electric vehicles is well known, China is also building nuclear plants at an extraordinary pace. By 2030, China’s nuclear capacity is set to surpass that of the United States, the first country to split atoms to make electricity.

Many of China’s reactors are derived from American and French designs, yet China has overcome the construction delays and cost overruns that have bogged down Western efforts to expand nuclear power.

At the same time, China is pushing the envelope, making breakthroughs in next-generation nuclear technologies that have eluded the West. The country is also investing heavily in fusion, a potentially limitless source of clean power if anyone can figure out how to tame it.

Beijing’s ultimate objective is to become a supplier of nuclear power to the world, joining the rare few nations — including the United States, Russia, France and South Korea — that can design and export some of the most sophisticated machines ever invented.


A dome being placed on the Unit 1 reactor building of the Zhejiang San’ao nuclear power plant in Zhejiang Province, China, in 2022.


Visual China Group, via Getty Images
“The Chinese are moving very, very fast,” said Mark Hibbs, a senior fellow at the Carnegie Endowment for International Peace who has written a book on China’s nuclear program. “They are very keen to show the world that their program is unstoppable.”

As the United States and China compete for global supremacy, energy has become a geopolitical battleground. The United States, particularly under President Trump, has positioned itself as the leading supplier of fossil fuels like oil, gas and coal. China, by contrast, dominates the manufacturing of solar panels, wind turbines and batteries, seeing renewable power as the multi-trillion-dollar market of the future.

Nuclear power is enjoying a resurgence of global interest, especially as concerns about climate change mount. That’s because nuclear reactors don’t spew planet-warming emissions, unlike coal and gas plants, and can produce electricity around the clock, unlike wind and solar power.
 

Google’s TPU Chip Has Taken AI By Storm and Knocked Nvidia Stock Down. Here’s Everything to Know.

By Adam Levine


Updated Nov 28, 2025 11:00 am EST / Original Nov 28, 2025 2:30 am EST



Alphabet CEO Sundar Pichai has helped bring Google back into the AI conversation. (NATHAN LAINE/BLOOMBERG)

Google is suddenly at the center of the artificial-intelligence trade. It started with the celebrated release of the company’s new Gemini 3 AI model, which was trained on Google’s own AI chips. The rally picked up steam on a report from The Information that Meta Platforms was in talks with Google to buy those chips, known as Tensor Processing Units, to fill an artificial-intelligence data center—the domain of Nvidia’s red-hot graphics processing units, or GPUs.
 

Explained: Difference Between Nvidia GPU And Google TPU


TOI Tech Desk | TIMESOFINDIA.COM | Dec 27, 2025, 17:26 IST



Nvidia lost billions of dollars in market value after a report claimed that Meta, the parent company of Facebook and Instagram, is partnering with Google to train its AI models on the search giant’s Tensor Processing Units (TPUs).

Because Meta is one of Nvidia’s largest customers, its shift toward alternative hardware sent shockwaves through the chipmaker’s stock, triggering a cascading sell-off. In this article, we explore the key differences between Nvidia’s GPUs and Google’s TPUs, while also examining the LPUs from Groq – the startup Nvidia recently acquired in a strategic $20 billion defensive move.
 

Nvidia GPU vs Google TPU: Architecture and purpose


Nvidia originally designed Graphics Processing Units, or GPUs, for rendering 3D graphics. They have thousands of small cores that can handle many tasks simultaneously. Because they are general-purpose, they can be used for AI, gaming, crypto mining and scientific simulations.

Google’s TPU, on the other hand, is an Application-Specific Integrated Circuit (ASIC) designed from the ground up by Google for one purpose: speeding up the “tensor” math that powers machine learning.
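The “tensor” math referred to here is, at its core, large-scale matrix multiplication. As a rough illustration only (plain Python, no real TPU or ML library involved), this is the kind of operation a TPU's matrix units are built to perform in bulk:

```python
# Illustrative sketch: the core "tensor" operation behind neural-net
# layers is multiplying a batch of inputs by a weight matrix.
# A TPU's hardware matrix units do exactly this, at enormous scale.

def matmul(a, b):
    """Multiply matrix a (m x n) by matrix b (n x p), giving an m x p result."""
    n = len(b)
    p = len(b[0])
    return [[sum(row[k] * b[k][j] for k in range(n)) for j in range(p)]
            for row in a]

# A 2-sample batch of 3-feature inputs times a 3x2 weight matrix.
inputs = [[1.0, 2.0, 3.0],
          [4.0, 5.0, 6.0]]
weights = [[0.1, 0.2],
           [0.3, 0.4],
           [0.5, 0.6]]

outputs = matmul(inputs, weights)
print(outputs)  # roughly [[2.2, 2.8], [4.9, 6.4]], up to float rounding
```

A production model does this with matrices holding millions of weights, which is why a chip hard-wired for the operation can beat a general-purpose one.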
 
Feature     | Nvidia GPU                                     | Google TPU
Flexibility | High. Can run almost any AI model or software. | Low. Optimized specifically for deep learning.
Speed       | Excellent, but has overhead for general tasks. | Ultra-fast for specific AI training/inference.
Efficiency  | Consumes more power per task.                  | Highly energy-efficient for AI workloads.
Software    | Uses CUDA (the industry standard).             | Optimized for TensorFlow and JAX.


Nvidia GPU vs Google TPU: Availability and access

Nvidia GPUs can be bought outright. Whether it is a $1,000 gaming card or a $30,000 H100 enterprise chip, companies can purchase them and run them in their own data centres. The larger the number of chips, the greater the processing power and the better the model training.
When it comes to Google TPUs, you cannot buy them. Google does not sell the physical chips; instead, it rents them out through Google Cloud Platform, allowing partners to train their models there. This means that if you want TPUs, you must stay in Google's ecosystem and cannot run the hardware in your own data centre.

Nvidia GPU vs Google TPU: Training and inference

The critical difference is speed. Nvidia GPUs are the ‘king’ of training, which means teaching an AI model from scratch. Most major AI labs, including OpenAI, Meta, xAI and Google itself, use massive clusters of Nvidia H100s/B200s to build their models.

Google TPUs are dominant in inference – running the trained model and getting an answer. Since TPUs are designed for this kind of speed, they can serve AI “answers” to millions of users simultaneously with less lag than GPUs.
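The training/inference split described above can be sketched with a toy model. This is an illustrative plain-Python example of a hypothetical one-parameter model, not how any of these labs actually work: training repeatedly runs a forward pass plus a gradient update, while inference is a single forward pass per request.

```python
# Toy sketch of why training and inference stress hardware differently.

def forward(w, x):
    """Inference: a single forward pass through a one-parameter model."""
    return w * x

def train_step(w, x, target, lr=0.1):
    """Training: forward pass PLUS gradient computation and weight update."""
    pred = forward(w, x)
    grad = 2 * (pred - target) * x   # derivative of squared error (w*x - target)^2
    return w - lr * grad             # gradient-descent update

# Training: many repeated forward+backward iterations (compute-heavy).
w = 0.0
for _ in range(50):
    w = train_step(w, x=1.0, target=3.0)

# Inference: forward pass only (latency-sensitive, millions of requests).
print(round(forward(w, 2.0), 2))  # close to 6.0 once w has converged near 3.0
```

Real training multiplies this loop by billions of parameters and trillions of tokens, which is why GPUs dominate there, while inference hardware is judged on how cheaply and quickly it can run the forward pass alone.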
 
One reason is the difference in software. AI researchers know how to use Nvidia’s CUDA, which is effectively the ‘language’ of AI. Moving to Google’s TPU often requires rewriting code or switching frameworks. Google Cloud CEO Thomas Kurian recently said that the company makes its models compatible with Nvidia GPUs and considers Nvidia a partner.

What do Groq LPUs offer

Groq’s Language Processing Unit is a new category of processor built for AI inference that the company claims delivers “AI compute speed, quality, and affordability at scale” – much like Google’s TPUs.

“Groq created and built the LPU from the ground up to meet the unique needs of AI. LPUs run Large Language Models (LLMs) and other leading models at substantially faster speeds and, on an architectural level, up to 10x more efficiently from an energy perspective compared to GPUs,” the company said. By offering LPUs alongside GPUs, Nvidia is essentially aiming to provide power, speed and efficiency as a one-stop shop.
 

Google TPUs Vs Nvidia GPUs

Great Speculations

By Trefis Team, Contributor, for Great Speculations

Published Sep 11, 2025 at 05:54 am EDT; updated Sep 12, 2025 at 12:51 am EDT
Google has begun placing its Tensor Processing Units (TPUs) in data centres run by smaller cloud providers that have long relied on Nvidia's GPUs. This is more than a product rollout—it's a calculated move in the AI infrastructure chess game.

The effects extend across the AI ecosystem, creating both opportunity and uncertainty for investors, and individual AI names like Google and Nvidia could see heightened volatility as the landscape shifts.


What Are TPUs?

TPUs are Google's purpose-built AI chips designed specifically for machine-learning tasks. While Nvidia's GPUs are versatile general-purpose processors, TPUs act like specialized scalpels optimized for AI workloads. The latest Ironwood chip delivers 42.5 exaflops of compute per pod with more than 9,200 chips per unit—“more than 10x improvement” over Google's previous generation.
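Taking the quoted pod figures at face value, one can back out a rough per-chip number. The exact chip count of 9,216 used below is an assumption (the article says only "more than 9,200 chips per unit"):

```python
# Back-of-the-envelope per-chip compute from the quoted Ironwood pod figures.
# Assumption: exactly 9,216 chips per pod (article: "more than 9,200").
pod_exaflops = 42.5
chips_per_pod = 9216             # assumed exact count, not confirmed by the article

pod_flops = pod_exaflops * 1e18              # exaflops -> flops
per_chip_petaflops = pod_flops / chips_per_pod / 1e15

print(round(per_chip_petaflops, 2))  # ~4.61 petaflops per chip
```

Under that assumption, each chip contributes on the order of 4.6 petaflops, which is how a pod of thousands of chips reaches the quoted 42.5 exaflops.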
 
If tiongkok's nuclear tech really works, they would have placed a reactor inside their newest aircraft carrier
 