
Bombs and Porn Are Bad Reasons to Build More Data Centers

Data center construction isn’t going as planned. Bloomberg reported earlier this month that nearly half of the 12 gigawatts’ worth of data center computing capacity planned for this year has been delayed or canceled. Just a third of those projects are currently under construction, the market intelligence firm Sightline Climate estimates in a forthcoming report. Less than a third of the 21.5 GW worth of data center projects announced for 2027 are currently under construction.

That’s thanks in part to shortages of electrical equipment like transformers and batteries. But many also face challenges from a growing, bipartisan backlash to data center construction. Maine’s legislature recently passed the country’s first-ever statewide moratorium on data center construction for projects over 20 megawatts, to last until November 2027. Similar bills have been introduced in at least a dozen states. The Milwaukee suburb of Port Washington voted by a margin of roughly 2-to-1 for a referendum requiring voter approval before the city can extend any preferential tax treatment to projects valued at or costing $10 million or more. The referendum was a reaction to the city approving tax incentives for a $15 billion data center project to be operated by Oracle and OpenAI. (That project will not be impacted by the vote.) In Festus, Missouri, last week, voters kicked out all four incumbents who’d voted to approve a $6 billion data center plan from the developer CRG.

Not all data centers are being built for AI hyperscalers. The International Energy Agency projects that roughly half of the electricity demand from new projects planned through 2030 will be for facilities equipped to meet needs for generative AI like ChatGPT, as opposed to the less energy-intensive data centers handling cloud storage and more traditional computing tasks. The upsides of those AI-specific projects aren’t self-evident, and there’s a growing divide between the glorious futures promised by big AI developers and what people see it actually doing—generating eerie school papers and TikTok content, for instance, or flooding X with AI-generated child pornography. In addition to concerns about rising electricity bills, air pollution, and noise, fights over data centers seem to be channeling deeper frustrations. What and whom, in other words, is all this stuff actually for?

OpenAI CEO Sam Altman last year wrote that “the gains to quality of life from AI driving faster scientific progress and increased productivity will be enormous; the future can be vastly better than the present.” On Thursday, meanwhile, Florida officials opened an investigation into whether OpenAI’s ChatGPT had assisted in the planning of a mass shooting last year at Florida State University, and the extent to which chatbots might “facilitate criminal activity, empower America’s enemies, or threaten our national security,” per Florida Attorney General James Uthmeier. Court documents examined by a local news outlet show that the suspected shooter messaged extensively with ChatGPT about video games, dating, his feelings of isolation, and—eventually, less than a year before the shooting last April—guns. On the day of the shooting, where two FSU students were killed, he asked, “If there was a shooting at FSU, how would the country react?” and “What time is it the busiest in the FSU student union?” ChatGPT responded that the busiest time at the student union is “typically between 11:30 a.m. and 1:30 p.m.”

News also broke this week that OpenAI is backing an Illinois bill that could exempt companies from liability in the event that frontier models—those trained with more than $100 million of computational costs—cause “critical harms,” like creating a weapon of mass destruction, killing more than 100 people, or causing at least $1 billion in property damage. U.S. bombs in February killed between 175 and 180 people at a primary school in southern Iran—mostly girls under the age of 12—with the help of an AI targeting system developed by Palantir for the Department of Defense. Since 2024, the Pentagon has awarded the defense contractor multiyear contracts for that system worth up to $1.4 billion.

On the more quotidian end of things, AI seems to be helping students cheat on their schoolwork, filling social media feeds with news of fake TV shows and bizarre AI fruit-cutting videos, and leading otherwise rational people to fall in love with chatbots. Sloppily added large language model, or LLM, features in apps, email services, and search engines churn out useless summaries of two-line emails and false information spelled out in authoritative tones. While AI’s full impact on the U.S. job market remains “guesswork,” former Biden administration official Jennifer M. Harris argued last week, it’s deepening already historic levels of inequality. Investors are rewarding companies that announce AI-fueled layoffs with surging share prices. “What’s worse,” she adds, is that “much of the trillion-plus-dollar investment in the AI boom isn’t happening in the stock market at all—it’s happening in private funds out of reach to all but the wealthiest, most connected among us.”

Despite claims from AI developers that their technology will eventually solve climate change and run on renewable energy, for now—and into the foreseeable future—they are using a lot of gas. Meta is planning to fund the construction of seven gas plants to provide 5.2 GW worth of power to its Manhattan-size Hyperion data center complex in rural Louisiana. The state’s regulators previously greenlit Entergy to build three gas plants, generating 2.3 GW for the project. As part of Meta’s agreement with Entergy, it has also agreed to finance the construction of 240 miles of transmission lines, battery storage, and nuclear power upgrades. More speculatively, Meta made a “commitment” to “help” fund “up to 2,500 megawatts of new renewable resources.” As The Atlantic’s Matteo Wong notes, greenhouse gas emissions from data centers could more than double by the end of the decade—long before AI developers’ well-advertised investments in fusion power are likely to pay off. There is still scant data available on how much electricity data centers actually use.

Unsurprisingly, all this hasn’t made AI especially popular. A Quinnipiac poll published late last month found that just 35 percent of U.S. residents are either “very excited” (6 percent) or “somewhat excited” (29 percent) about AI. Sixty-two percent are “not so excited” (29 percent) or “not excited at all” (33 percent). Eighty percent of poll respondents were “very” or “somewhat concerned” about it, and 55 percent think AI will do more harm than good in their day-to-day lives. Nearly two-thirds think AI will do more harm than good in education. Seventy percent think AI will decrease job opportunities. Sixty-five percent of respondents—including 78 percent of Democrats and 56 percent of Republicans—would oppose building an AI data center in their community.

So, again, why is the U.S. embarking on a state-sponsored spending-and-building binge for a technology that most people here think will make the world—and their lives—worse? Data center developers and supportive politicians promise construction jobs and additional tax revenues that can translate into bigger municipal budgets and tax decreases for residents of the places where data projects are built. Data centers don’t employ huge numbers of people over the long term, though, and tax upsides for their neighbors are often undercut by generous tax incentives offered to developers. The Texas Tribune this week reported that the Lone Star State is expected to lose out on $3.2 billion in sales tax revenue over the next two years as a result of tax exemptions offered to data center developers.

To make their case, AI boosters typically pitch their products in graver terms than just jobs and tax revenue. The Trump administration, prominent Democrats, and AI hyperscalers have all framed “winning the AI race” as a national security imperative, raising fears that China will beat the U.S. to achieve a mysterious state known as “artificial general intelligence,” or something even more powerful called “superintelligence.” These terms are not well defined, and neither is the material threat posed by China “winning” and the U.S. “losing.” The United States is not at war with China. China’s government does not seem especially eager to start a war with the U.S. Our government has in the last few months kidnapped a head of state, threatened to annex Greenland, and started a stupid, reckless war of aggression against Iran—a war in which it’s used AI to kill more than a hundred children. At home, ICE is using Palantir’s AI to hunt down and disappear migrants as the Trump administration demands universities hand over lists of Jews. Criticisms of China’s domestic and foreign policy shouldn’t obscure the fact that the U.S. government is already doing extraordinarily dangerous things with AI. The companies building it are under zero obligation to further the interests of the U.S. government, much less those of most of the people who live here. If something called superintelligence is indeed real, which seems doubtful, do we really want Sam Altman or Donald Trump—who threatened to wipe out an entire civilization last week—to control it?

It isn’t a coincidence that AI hyperscalers in the U.S. have sold their models to the public, policymakers, and investors in terms of what’s likely to happen down the road. The prospect of a foreign power gaining access to a godlike, world-destroying entity certainly inspires more urgency than, say, B2B software, vibe coding, and AI therapists. But rather than taking executives’ predictions about an inevitable utopian/apocalyptic future at face value, conversations about the future of AI infrastructure should be grounded in what most people are presently getting out of it. For now, the answer is not much.
