Karen Hao on the Empire of AI, AGI evangelists, and the cost of belief | TechCrunch

At the center of every empire is an ideology, a belief system that propels the system forward and justifies expansion, even when the costs of that expansion directly contradict the ideology's stated mission.

For the European colonial powers, it was Christianity and the promise of saving souls while extracting resources. For today's AI empire, it is artificial general intelligence that will "benefit all of humanity." OpenAI is its chief evangelist, spreading that fervor across the industry and reshaping how AI is built along the way.

"When I was interviewing people, their voices would tremble with their passion for AGI," journalist and bestselling author Karen Hao, who wrote "Empire of AI," said on the latest episode of Equity.

In her book, Hao compares the AI industry, and OpenAI in particular, to an empire.

"The only way to really understand the scope and scale of OpenAI's behavior … is to recognize that they have already grown more powerful than almost any nation in the world, and that they have consolidated not just economic power but also political power," Hao said. "They are terraforming the Earth. They are rewiring our geopolitics and all of our lives. And so you can only describe it as an empire."

OpenAI describes AGI as "highly autonomous systems that outperform humans at most economically valuable work," systems that will somehow "elevate humanity by increasing abundance, turbocharging the global economy, and aiding in the discovery of new scientific knowledge that changes the limits of possibility."

That vague promise has fueled the industry's exponential growth: its enormous resource demands, its scraping of oceans of data, its strain on power grids, and its willingness to release untested systems into the world, all in pursuit of a future that many experts say will never arrive.


That path is not inevitable, and scaling is not the only way to make further progress in AI, Hao said.

"You can also develop new techniques in algorithms," she said. "You can improve the existing algorithms to reduce the amount of data and compute they need to use."

But that strategy would mean sacrificing speed.

"When you define the quest to build beneficial AGI as one where the winner takes all, which is what OpenAI did, then the most important thing is speed over anything else," Hao said. "Speed over efficiency, speed over safety, speed over exploratory research."

Sam Altman, CEO of OpenAI, speaks during a Kakao media day in Seoul.
Image Credits: Kim Jae-Hwan/SOPA Images/LightRocket/Getty Images

For OpenAI, the surest way to guarantee speed was to lean on existing techniques and, as Hao put it, "just do something intellectually cheap," namely pumping more data and more supercomputers into those existing techniques.

Rather than fall behind, other tech companies fell in line.

"And because the AI industry has successfully captured most of the world's top AI researchers, and those researchers no longer exist in academia, you now have an entire discipline being shaped by the agendas of these companies rather than by real scientific exploration," Hao said.

The spending has been, and will be, astronomical. Last week, OpenAI said it expects to burn through $115 billion in cash by 2029. Meta said in July it would spend as much as $72 billion building AI infrastructure this year. Google expects to hit $85 billion in capital expenditures in 2025, most of it going toward expanding AI and cloud infrastructure.

At the same time, the goalposts keep moving, and the grand "benefits to humanity" have yet to materialize, even as the harms mount: job losses, wealth concentration, and AI chatbots that can fuel delusions and mental illness. In her book, Hao also documents workers in developing countries such as Kenya and Venezuela who were exposed to disturbing content, including child sexual abuse material, and were paid very low wages, around $1 to $2 an hour, for roles such as content moderation and data labeling.

Hao said pitting AI progress against present-day harms is a false trade-off, especially when other forms of AI offer real benefits.

She pointed to Google DeepMind's Nobel Prize-winning AlphaFold, which is trained on amino acid sequence data and the folded structures of proteins, and can now accurately predict a protein's 3D structure from its amino acid sequence, something enormously useful for drug discovery and for understanding disease.

"Those are the types of AI systems that we need," Hao said. "AlphaFold does not create mental health crises in people. AlphaFold does not lead to huge environmental harms … because it is trained on far less infrastructure. [The datasets don't contain] all the toxic garbage you get when you scrape the internet."

Alongside the quasi-religious commitment to AGI, another narrative central to the race is that the U.S. must beat China in the AI competition so that Silicon Valley can have a liberalizing effect on the world.

"Literally the opposite has happened," Hao said. "The gap between the U.S. and China has continued to narrow, and Silicon Valley has had an illiberalizing effect on the world … The only actor to come out of this unscathed is Silicon Valley itself."

Of course, many would argue that OpenAI and other AI companies have benefited humanity by releasing ChatGPT and other large language models, which promise to automate tasks such as coding, writing, research, customer support, and other knowledge work.

But the way OpenAI is structured, part nonprofit and part for-profit, complicates how its impact on humanity is defined and measured. That got even more complicated this week with the news that OpenAI reached an agreement with Microsoft that brings it a step closer to an eventual public listing.

Two former OpenAI safety researchers told TechCrunch they fear the lab has begun conflating its for-profit and nonprofit missions, treating the fact that people enjoy ChatGPT and other products built on LLMs as proof that its work benefits humanity.

Hao echoed those concerns, describing the danger of being so consumed by a mission that the facts on the ground get ignored.

"Even when there is evidence that what they are building is actually harming large numbers of people, the mission papers over it," Hao said. "There is something really dangerous and dark about [being] so wrapped up in a belief system that you lose your connection to reality."
