


Researchers found Meta’s popular Llama 3.1 70B has a capacity to recite passages from 'The Sorcerer's Stone' at a rate much higher than could happen by chance.



Keep Track of the Compost with LoRaWAN


Composting doesn’t seem difficult: pile up organic matter, let it rot. In practice, however, it’s a bit more complicated: if you want that sweet, sweet soil amendment in a reasonable amount of time, and to make sure any food-borne pathogens and weed seeds don’t come through, you need a “hot” compost pile. How to tell if the pile is hot? Well, you could go out there and stick your arm in like a schmuck, or you could use [Dirk-Willem van Gulik]’s “LORAWAN Compostheap solarpowered temperaturesensor” (sic).

The project is exactly what it sounds like, once you add some spaces: a solar-powered temperature sensor that uses LoRaWAN to track temperatures inside (and outside, for comparison) the compost heap year round. Electronically it is pretty simple: a Heltec CubeCell AB01 LoRaWAN module is wired up with three DS18B20 temperature sensors, a LiPo battery, and a solar panel. (The AB01 has the required circuitry to charge the battery via solar power.)
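
Just to illustrate how little firmware a node like this needs, here is a minimal Arduino-style sketch that polls three DS18B20 probes on a shared 1-Wire bus and hands the readings to a placeholder radio routine. This is a sketch only, not [Dirk-Willem]’s actual code: the pin assignment, the hourly delay, and the send_lorawan_payload() stub are assumptions, and a real CubeCell build would use Heltec’s own LoRaWAN stack and deep-sleep support instead.

```cpp
// Illustrative sketch only -- not the project's actual firmware.
// Assumes the OneWire and DallasTemperature Arduino libraries and three
// DS18B20 probes sharing one data pin (pin 5 here is an arbitrary choice).
#include <OneWire.h>
#include <DallasTemperature.h>

const uint8_t ONE_WIRE_PIN = 5;          // assumed data pin
OneWire oneWire(ONE_WIRE_PIN);
DallasTemperature sensors(&oneWire);

// Placeholder: a real node would hand these bytes to the CubeCell LoRaWAN stack.
void send_lorawan_payload(const int16_t *centiDegrees, size_t count) {
  for (size_t i = 0; i < count; i++) {
    Serial.print(centiDegrees[i]);
    Serial.print(i + 1 < count ? ',' : '\n');
  }
}

void setup() {
  Serial.begin(115200);
  sensors.begin();                       // enumerate the probes on the bus
}

void loop() {
  sensors.requestTemperatures();         // start a conversion on all three probes
  int16_t payload[3];
  for (uint8_t i = 0; i < 3; i++) {
    // Pack core / tube / ambient readings as hundredths of a degree C.
    payload[i] = (int16_t)(sensors.getTempCByIndex(i) * 100.0f);
  }
  send_lorawan_payload(payload, 3);
  delay(3600000UL);                      // crude hourly cadence; a real node would deep-sleep
}
```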

The three temperature sensors are spread out: one inside a handmade metal spike to measure the core of the heap, one partway up the metal tube holding said spike to measure the edge of the pile, and one in the handsome 3D printed case to measure the ambient temperature. These three measurements, and the differences between them, should give a very good picture of the metabolism of the pile, and cue an observant gardener when it is time to turn it, water it, or declare it done.

Given that it only wakes every hour or so for measurements (compost piles aren’t a fast-moving system like an RBMK) and has a decent-sized panel, the LiPo battery isn’t going to see much stress and will likely last many years, especially in the benevolent Dutch climate. [Dirk] is also counting on that climate to keep the printed PLA enclosure intact. If one were to recreate this project for Southern California or Northern Australia, a different filament would certainly be needed, but the sun doesn’t beat down nearly as hard in Northern Europe, and PLA will probably last at least as long as the battery.

Of course with this device it’s still up to the gardener to decide what to do with the temperature data and get out to do the hard work. For those who prefer more automation and less exercise, this composter might be of interest.

Our thanks to [Peter de Bruin] for the tip about this finely-tuned temperature-sensing tip. If you, too, want to bask in the immortal fame brought by a sentence of thanks at the end of a Hackaday article (or perhaps a whole article dedicated to your works?), submit a tip and your dreams may come true.


hackaday.com/2025/06/23/keep-t…



Video Game Preservation Through Decompilation


Unlike computer games, which smoothly and continuously evolved along with the hardware that powered them, console games have up until very recently been constrained by a generational style of development. Sure there were games that appeared on multiple platforms, and eventually newer consoles would feature backwards compatibility that allowed them to play select titles from previous generations of hardware. But in many cases, some of the best games ever made were stuck on the console they were designed for.

Now, for those following along as this happened, it wasn’t such a big deal. For gamers, it was simply a given that their favorite games from the Super Nintendo Entertainment System (SNES) wouldn’t play on the Nintendo 64, any more than their Genesis games could run on their Sony PlayStation. As such, it wasn’t uncommon to see several game consoles clustered under the family TV. If you wanted to go back and play those older titles, all you had to do was switch video inputs.

But gaming, and indeed the entertainment world in general, has changed vastly over the last couple of decades. Telling somebody today that the only way they can experience The Legend of Zelda: A Link to the Past is by dragging out some yellowed, thirty-odd-year-old console from the attic is like telling them the only way they can see a movie is by going to the theater.

These days, the expectation is that entertainment comes to you, not the other way around — and it’s an assumption that’s unlikely to change as technology marches on. Just like our TV shows and movies now appear on whatever device is convenient to us at the time, modern gamers don’t want to be limited to their consoles, they also want to play games on their phones and VR headsets.

But that leaves us with a bit of a problem. There are some games which are too significant, either technically or culturally, to just leave in the digital dust. Like any other form of art, there are pieces that deserve to be preserved for future generations to see and experience.

For the select few games that are deemed worth the effort, decompilation promises to offer a sort of digital immortality. As several recent projects have shown, breaking a game down to its original source code can allow it to adapt to new systems and technologies for as long as the community wishes to keep them updated.

Emulation For Most, But Not All


Before we get into the subject of decompilation, we must first address a concept that many readers are likely familiar with already: emulation.

Using a console emulator to play an old game is not entirely unlike running an operating system through a virtual machine, except in the case of the console emulator, there’s the added complication of having to replicate the unique hardware environment that a given game was designed to run on. Given a modern computer, this usually isn’t a problem when it comes to the early consoles. But as you work your way through the console generations, the computational power required to emulate their unique hardware architectures rapidly increases.
Nintendo put emulation to work with their “Mini” consoles.
The situation is often complicated by the fact that some games were painstakingly optimized for their respective console, often making use of little-documented quirks of the hardware. Emulators often employ title-specific routines to try and make these games playable, but they aren’t always 100% successful. Even on games that aren’t particularly taxing, the general rule of emulation is to put performance ahead of accuracy.

Therein lies the key problem with emulation when it comes to preserving games as an artistic medium. While the need for ever-more powerful hardware is a concern, Moore’s Law will keep that largely in check. The bigger issue is accuracy. Simply running a game is one thing, but to run it exactly how it was meant to run when the developers released it is another story entirely.

It’s fairly common for games to look, sound, and even play slightly differently when under emulation than they did when running on real hardware. In many cases, these issues are barely noticeable for the average player. The occasional sound effect playing out of sync, or a slightly shifted color palette isn’t enough to ruin the experience. Other issues, like missing textures or malfunctioning game logic can be bad enough that the game can’t be completed. There are even games, few as they may be, that simply don’t run at all under emulation.

Make no mistake, emulation is usually good enough for most games. Indeed, both Nintendo and Sony have used emulation in various capacities to help bring their extensive back catalog of games to newer generations. But the fact remains that there are some games which deserve, and sometimes even require, a more nuanced approach.

Chasing Perfection


In comparison, when a game is decompiled to the point that the community has the original C code that it was built from, it’s possible to avoid many of the issues that come with emulation. The game can be compiled as a native executable for modern platforms, and it can take advantage of all the hardware and software improvements that come with it. It’s even possible to fix long-standing bugs, and generally present the game in its best form.

Those who’ve dabbled in reverse engineering will know that decompiling a program back into usable C code isn’t exactly a walk in the park. While there are automated tools that can help get through a lot of the work, there’s still plenty of human intervention required. Even then, the original code for the game would have been written to take advantage of the original console’s unique hardware, so you’ll need to either patch your way around that or develop some kind of compatibility layer to map various calls over to something more modern and platform-agnostic. It’s a process that can easily take years to complete.
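
To make the “compatibility layer” idea concrete, here is a minimal, purely hypothetical sketch of the pattern: the decompiled game code calls a narrow platform interface, and each modern target supplies its own backend. None of the names below come from a real decompilation project, and actual ports such as the Ocarina of Time recompilation expose far larger interfaces covering graphics, audio, and input.

```cpp
// Hypothetical illustration of a platform-abstraction layer for a decompiled game.
// The decompiled code calls only the PlatformOps interface; each modern target
// (desktop, phone, VR headset) provides its own implementation.
#include <cstdint>
#include <cstdio>

struct PlatformOps {
  void (*swap_buffers)();              // present the finished frame
  uint64_t (*ticks_ms)();              // monotonic time for the game loop
  void (*play_sample)(int soundId);    // fire-and-forget sound effect
};

// A do-nothing "headless" backend, standing in for an SDL/Vulkan/mobile backend.
static void stub_swap() { std::puts("frame presented"); }
static uint64_t stub_ticks() { static uint64_t t = 0; return t += 16; }  // pretend 60 fps
static void stub_sound(int id) { std::printf("sound %d\n", id); }

static const PlatformOps kHeadlessBackend{stub_swap, stub_ticks, stub_sound};

// What a routine lifted from the original console code might look like once its
// hardware register pokes have been replaced with PlatformOps calls.
void game_frame(const PlatformOps &p) {
  static uint64_t last = 0;
  uint64_t now = p.ticks_ms();
  if (now - last >= 16) {      // ~60 Hz tick, as on the original hardware
    last = now;
    p.play_sample(42);         // arbitrary sound id for the demo
    p.swap_buffers();
  }
}

int main() {
  for (int i = 0; i < 3; i++) game_frame(kHeadlessBackend);
}
```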

Because of this, decompilation efforts tend to be limited to the most critically acclaimed titles. For example, in 2021 we saw the first efforts to fully reverse The Legend of Zelda: Ocarina of Time. Released in 1998 on the N64, it’s often hailed as one of the greatest video games ever made. Although the effort started with Ocarina, by 2024, the lessons learned during that project led to the development of tools which can help decompile and reconstruct other N64 games.

Games as Living Documents


For the most part, an emulated game works the same way it did when it was first released. Of course, the emulator has full control over the virtual environment that the game is running in, so there are a few tricks it can pull. As such, additional features such as cheats and save states are common in most emulators. It’s even possible to swap out the original graphical assets for higher resolution versions, which can greatly improve the look of some early 3D games.

But what if you wanted to take things further? That’s where having the source code makes all the difference. Once you’ve gotten the game running perfectly, you can create a fork that starts adding in new features and quality of life improvements. As an example, the decompilation for Animal Crossing on the GameCube will allow developers to expand the in-game calendar beyond the year 2030 — but it’s a change that will be implemented in a “deluxe” fork of the code so as to preserve how the original game functioned.

At this point you’re beyond preservation, and you’ve turned the game into something that doesn’t just live on, but can actually grow with new generations of players.


hackaday.com/2025/06/23/video-…



#USA-#Iran, the war of mirrors


altrenotizie.org/primo-piano/1…


Iran attacks American bases in the Middle East


@Notizie dall'Italia e dal mondo
Missiles fired at the American military base of Al-Udeid, in Qatar. Alarms are also sounding at bases in Iraq.
The article “Iran attacks American bases in the Middle East” originally appeared on Pagine Esteri.

pagineesteri.it/2025/06/23/med…



Head to Print Head: CNC vs FDM


It’s a question new makers often ask: “Should I start with a CNC machine or a 3D printer?” Or, once you have both, every project gets the question “Should I use my CNC or my 3D printer?” The answer to both is, of course, “it depends”. In the video embedded below by [NeedItMakeIt], you can see a head-to-head comparison for one specific product he makes: CRATER, a magnetic, click-together stacking tray for tabletop gaming. (He says tabletop gaming, but we think these would be very handy in the shop, too.)

[NeedItMakeIt] takes us through the process for both FDM 3D printing the part in PLA and CNC machining the same part in walnut. Which part is nicer is absolutely a matter of taste; we can’t imagine many wouldn’t choose the wood, but de gustibus non disputandum est: there is no accounting for taste. What there is accounting for is the materials and energy costs, which are both surprising. That walnut is cheaper than PLA for this part is actually shocking, and the amount of power needed for dust collection caught us off guard, too.

Of course, the process is the real key, and given that most of the video follows [NeedItMakeIt] crafting the CNC’d version of his invention, it gives any newbie a good rundown of just how much more work is involved in getting a machined part ready for sale compared to “take it off the printer and glue in the magnets.” (It’s about 40 extra minutes, if you want to skip to the answer.) As you might expect, labour is by far the greatest cost in producing these items if you value your time, which [NeedItMakeIt] does in the spreadsheet he presents at the end.

What he does not do is provide an answer, because in the case of this part, neither CNC nor 3D printing is “better”. It’s a matter of taste, which is the great thing about DIY: we can decide for ourselves which process and which end product we prefer. “There is no accounting for taste”, de gustibus non disputandum est, is true enough that it’s been repeated since Latin was a thing. Which would you rather have, in this case? CNC or 3D print? Perhaps you would rather 3D print a CNC? Or have one machine to do it all? Let us know in the comments for that sweet, sweet engagement.

While you’re engaging, maybe drop us a tip, while we offer our thanks to [Al] for this one.

youtube.com/embed/h6PO_Yxd8io?…


hackaday.com/2025/06/23/head-t…



Fake support service scam: how they hacked the sites of Netflix, Microsoft, and others


@Informatica (Italy e non Italy 😁)
Cybercriminals have devised an insidious attack technique that exploits sponsored listings among search results to steer unsuspecting victims to genuine online support web pages





Europe’s dependence on US digital platforms is a geopolitical vulnerability


@Informatica (Italy e non Italy 😁)
In mid-February, Microsoft deactivated the institutional email account of the International Criminal Court’s chief prosecutor, Karim Ahmad Khan, depriving the ICC of a critical communication channel. Here is what this implies for the




Here is the video archive for our FOIA Forum where we explained how we got records about Massive Blue, a company selling AI personas to cops. #FOIAForum


Kill Switch! Donald Trump’s digital weapon threatening Europe


Donald Trump’s return to the White House has become a painful reminder for Europe of its main digital vulnerability: the de facto “kill switch” controlled by the United States. Political risks that only a few years ago seemed like fantasy are now perceived as a very real threat, capable of paralyzing Europe’s economy and communications.

Over years of economic integration and technological globalization, European countries have become extremely dependent on American cloud services. The security of email, video streaming, industrial processing, and even government communications is directly tied to infrastructure controlled by the three largest American companies: Amazon, Microsoft, and Google. These companies currently serve more than two thirds of the European cloud market.

Concerns about undue US influence over European data have been voiced for some time. American law allows US authorities to access information stored on these companies’ servers anywhere in the world. But since Trump returned to power, such scenarios have moved much closer to reality.

The situation worsened after the International Criminal Court issued arrest warrants for senior Israeli politicians and the Court’s chief prosecutor, Karim Khan, was cut off from his email accounts hosted on Microsoft servers. Although the company itself declined to disclose the details of the shutdown, the incident caused a major stir. Aura Sallah, a former senior Meta lobbyist in Brussels and now a member of the European Parliament, stressed that such a situation clearly shows that the reliability and security of American digital platforms for Europe are seriously in question.

As Zach Myers, director of the CERRE think tank, has observed, for Trump Europe is a competitor and an adversary, not an ally. The idea that American authorities could deliberately switch off cloud services to increase political pressure therefore no longer sounds like science fiction.

In response to the worsening situation, European politicians and companies are stepping up efforts to reduce their technological dependence on the United States. The head of the French company OVHcloud, Benjamin Revkolewski, compared cloud services to a water supply system: familiar and unnoticed until someone shuts the valve. And while the possibility of such a cutoff used to be discussed only in theory, today it is perceived as a real risk.

To reduce at least part of that dependence, the largest American companies have rushed to demonstrate their willingness to engage. Microsoft has written legal guarantees into contracts with European government agencies to maintain access to services even in the event of political decisions from Washington. Amazon has announced a new mechanism for managing its European services, promising to guarantee their “independent and continuous operation” even if the United States introduces new restrictions.

Yet many doubt that such promises would withstand pressure from the White House. As economist Cristina Caffarra of University College London points out, even with the best intentions, companies will not be able to stand up to their own government if the political confrontation reaches a new level.

Against this backdrop, calls are spreading within the EU to build its own independent digital infrastructure. One such initiative is the EuroStack project, with a planned investment of 300 billion euros. Its goal is to ensure Europe’s full independence in cloud technology and software. The plan calls for priority government contracts for local IT companies, subsidies, and a support fund.

But the ambitious project will be extremely difficult to carry out. As even its supporters admit, the scale of the investment is comparable to the budgets of the largest infrastructure reforms of recent decades. Skeptics, including representatives of American lobby groups, argue that the real costs could exceed 5 trillion euros.

EU policymakers have to balance the desire for technological sovereignty against the fear of being accused of protectionism, which could trigger a harsh response from the United States. Member states are divided: France is adamant about the need to shield data from American influence, while the Netherlands, traditionally loyal to the United States, has in the past taken a more cautious stance. The political turbulence of recent months, however, has forced even them to reconsider their approach.

The problem is compounded by the fact that legislative initiatives aimed at strengthening digital sovereignty are stalled. One of the key projects, which provides for mandatory certification of cloud solutions for government agencies, is stuck in the approval stage. The idea was that the highest certification level would guarantee protection of data from interference by third countries, including the United States. But under pressure from Washington the negotiations have dragged on, and the European Commission refuses to disclose its correspondence with the American side, citing the “need to maintain trust”.

Meanwhile, Brussels is increasingly insisting on a rigorous, pragmatic policy. As Henna Virkkunen, the EU official responsible for technological sovereignty, admits, Europe faces for the first time a situation in which its economic and technological dependence can be used as a weapon in international conflicts.

The financial, technological, and political stakes are enormous. Europe must decide whether it is willing to pay for independence, or whether it prefers to keep hoping that the switch abroad is never flipped.

The article “Kill Switch! Donald Trump’s digital weapon threatening Europe” originally appeared on il blog della sicurezza informatica.





Details about how Meta's nearly Manhattan-sized data center will impact consumers' power bills are still secret.



'A Black Hole of Energy Use': Meta's Massive AI Data Center Is Stressing Out a Louisiana Community


A massive data center for Meta’s AI will likely lead to rate hikes for Louisiana customers, but Meta wants to keep the details under wraps.

Holly Ridge is a rural community bisected by US Highway 80, gridded with farmland, with a big creek—it is literally named Big Creek—running through it. It is home to rice and grain mills and an elementary school and a few houses. Soon, it will also be home to Meta’s massive, 4 million square foot AI data center, hosting thousands of perpetually humming servers that draw billions of watts of power. And that energy-guzzling infrastructure will be partially paid for by Louisiana residents.

The plan is part of what Meta CEO Mark Zuckerberg said would be “a defining year for AI.” On Threads, Zuckerberg boasted that his company was “building a 2GW+ datacenter that is so large it would cover a significant part of Manhattan,” posting a map of Manhattan along with the data center overlaid. Zuckerberg went on to say that over the coming years, AI “will drive our core products and business, unlock historic innovation, and extend American technology leadership. Let's go build! 💪”

Mark Zuckerberg (@zuck) on Threads
This will be a defining year for AI. In 2025, I expect Meta AI will be the leading assistant serving more than 1 billion people, Llama 4 will become the leading state of the art model, and we’ll build an AI engineer that will start contributing increasing amounts of code to our R&D efforts. To power this, Meta is building a 2GW+ datacenter that is so large it would cover a significant part of Manhattan.


What Zuckerberg did not mention is that "Let's go build" refers not only to the massive data center but also to three new Meta-subsidized gas power plants and a transmission line to fuel it, serviced by Entergy Louisiana, the region’s energy monopoly.

Key details about Meta’s investments with the data center remain vague, and Meta’s contracts with Entergy are largely cloaked from public scrutiny. But what is known is the $10 billion data center has been positioned as an enormous economic boon for the area—one that politicians bent over backward to facilitate—and Meta said it will invest $200 million into “local roads and water infrastructure.”

A January report from NOLA.com said that the state had rewritten zoning laws, promised to change a law so that it no longer had to put state property up for public bidding, and reworked what was supposed to be a tax incentive for broadband internet meant to bridge the digital divide so that it was only an incentive for data centers, all with the goal of luring in Meta.

But Entergy Louisiana’s residential customers, who live in one of the poorest regions of the state, will see their utility bills increase to pay for Meta’s energy infrastructure, according to Entergy’s application. Entergy estimates that amount will be small and will only cover a transmission line, but advocates for energy affordability say the costs could balloon depending on whether Meta agrees to finish paying for its three gas plants 15 years from now. The short-term rate increases will be debated in a public hearing before state regulators that has not yet been scheduled.

The Alliance for Affordable Energy called it a “black hole of energy use,” and said “to give perspective on how much electricity the Meta project will use: Meta’s energy needs are roughly 2.3x the power needs of Orleans Parish … it’s like building the power impact of a large city overnight in the middle of nowhere.”

404 Media reached out to Entergy for comment but did not receive a response.

By 2030, Entergy’s electricity prices are projected to increase 90 percent from where they were in 2018, although the company attributes much of that to hurricane damage to its infrastructure. The state already has a high energy cost burden, in part because of storm damage to infrastructure and balmy heat, made worse by climate change, that drives air conditioner use. The state's homes are largely not energy efficient, with many porous older buildings that don’t retain heat in the winter or stay cool in the summer.

“You don't just have high utility bills, you also have high repair costs, you have high insurance premiums, and it all contributes to housing insecurity,” said Andreanecia Morris, a member of Housing Louisiana, which is opposed to Entergy’s gas plant application. She believes Meta’s data center will make it worse. And Louisiana residents have reasons to distrust Entergy when it comes to passing off costs of new infrastructure: in 2018, the company’s New Orleans subsidiary was caught paying actors to testify on behalf of a new gas plant. “The fees for the gas plant have all been borne by the people of New Orleans,” Morris said.

In its application to build new gas plants and in public testimony, Entergy says the cost of Meta’s data center to customers will be minimal and has even suggested Meta’s presence will make their bills go down. But Meta’s commitments are temporary, many of Meta’s assurances are not binding, and crucial details about its deal with Entergy are shielded from public view, a structural issue with state energy regulators across the country.

AI data centers are being approved at a breakneck pace across the country, particularly in poorer regions where they are pitched as economic development projects that will boost property tax receipts and bring in jobs, and where they’re offered sizable tax breaks. Data centers typically don’t hire many people, though, with most jobs in security and janitorial work, along with temporary construction work. And the costs to the utility’s other customers can remain hidden because of a lack of scrutiny and the limited power of state energy regulators. Many data centers—like the one Meta is building in Holly Ridge—are being powered by fossil fuels, which brings respiratory illness and other health risks and emits greenhouse gases that fuel climate change. In Memphis, a massive data center built to launch a chatbot for Elon Musk’s AI company is powered by smog-spewing methane turbines, in a region that leads the state in asthma rates.

“In terms of how big these new loads are, it's pretty astounding and kind of a new ball game,” said Paul Arbaje, an energy analyst with the Union of Concerned Scientists, which is opposing Entergy’s proposal to build three new gas-powered plants in Louisiana to power Meta’s data center.

Entergy Louisiana submitted a request to the state’s regulatory body to approve the construction of the new gas-powered plants, which would generate 2.3 gigawatts of power and cost $3.2 billion, at the 1,440-acre Franklin Farms megasite in Holly Ridge, an unincorporated community in Richland Parish. It is the first big data center announced since Louisiana passed large tax breaks for data centers last summer.

In its application to the public utility commission for gas plants, Entergy says that Meta has a planned investment of $5 billion in the region to build the gas plants in Richland Parish, Louisiana, where it claims in its application that the data center will employ 300-500 people with an average salary of $82,000 in what it points out is “a region of the state that has long struggled with a lack of economic development and high levels of poverty.” Meta’s official projection is that it will employ more than 500 people once the data center is operational. Entergy plans for the gas plants to be online by December 2028.

In testimony, Entergy officials refused to answer specific questions about job numbers, saying that the numbers are projections based on public statements from Meta.

A spokesperson for Louisiana’s Economic Development told 404 Media in an email that Meta “is contractually obligated to employ at least 500 full-time employees in order to receive incentive benefits.”

When asked about jobs, Meta pointed to a public facing list of its data centers, many of which the company says employ more than 300 people. A spokesperson said that the projections for the Richland Parish site are based on the scale of the 4 million square foot data center. The spokesperson said the jobs will include “engineering and other technical positions to operational roles and our onsite culinary staff.”

When asked if its job commitments are binding, the spokesperson declined to answer, saying, “We worked closely with Richland Parish and Louisiana Economic Development on mutually beneficial agreements that will support long-term growth in the area.”

Others are not as convinced. “Show me a data center that has that level of employment,” says Logan Burke, executive director of the Alliance for Affordable Energy in Louisiana.

Entergy has argued the new power plants are necessary to satiate the energy need from Meta’s massive hyperscale data center, which will be Meta’s largest data center and potentially the largest data center in the United States. It amounts to a 25 percent increase in Entergy Louisiana’s current load, according to the Alliance for Affordable Energy.

Entergy requested an exemption from a state law meant to ensure that it develops energy at the lowest cost by issuing a public request for proposals, claiming in its application and testimony that this would slow them down and cause them to lose their contracts with Meta.

Meta has agreed to subsidize the first 15 years of payments for construction of the gas plants, but the plant’s construction is being financed over 30 years. At the 15 year mark, its contract with Entergy ends. At that point, Meta may decide it doesn’t need three gas plants worth of energy because computing power has become more efficient or because its AI products are not profitable enough. Louisiana residents would be stuck with the remaining bill.
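
As a rough back-of-the-envelope illustration (ignoring interest, depreciation, and the actual rate structure): $3.2 billion financed over 30 years comes to roughly $107 million a year, so a 15-year commitment covers only about half of it, leaving on the order of $1.6 billion of plant still to be paid off after Meta’s contract ends.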

“It's not that they're paying the cost, they're just paying the mortgage for the time that they're under contract,” explained Devi Glick, an electric utility analyst with Synapse Energy.

When asked about the costs for the gas plants, a Meta spokesperson said, “Meta works with our utility partners to ensure we pay for the full costs of the energy service to our data centers.” The spokesperson said that any rate increases will be reviewed by the Louisiana Public Service Commission. These applications, called rate cases, are typically submitted by energy companies based on a broad projection of new infrastructure projects and energy needs.

Meta has technically not finalized its agreement with Entergy but Glick believes the company has already invested enough in the endeavor that it is unlikely to pull out now. Other companies have been reconsidering their gamble on AI data centers: Microsoft reversed course on centers requiring a combined 2 gigawatts of energy in the U.S. and Europe. Meta swept in to take on some of the leases, according to Bloomberg.

And in the short-term, Entergy is asking residential customers to help pay for a new transmission line for the gas plants at a cost of more than $500 million, according to Entergy’s application to Louisiana’s public utility board. In its application, the energy giant said customers’ bills will only rise by $1.66 a month to offset the costs of the transmission lines. Meta, for its part, said it will pay up to $1 million a year into a fund for low-income customers. When asked about the costs of the new transmission line, a Meta spokesperson said, “Like all other new customers joining the transmission system, one of the required transmission upgrades will provide significant benefits to the broader transmission system. This transmission upgrade is further in distance from the data center, so it was not wholly assigned to Meta.”

When Entergy was questioned in public testimony on whether the new transmission line would need to be built even without Meta’s massive data center, the company declined to answer, saying the question was hypothetical.

Some details of Meta’s contract with Entergy have been made available to groups legally intervening in Entergy’s application, meaning that they can submit testimony or request data from the company. These parties include the Alliance for Affordable Energy, the Sierra Club and the Union of Concerned Scientists.

But Meta—which will become Entergy’s largest customer by far and whose presence will impact the entire energy grid—is not required to answer questions or divulge any information to the energy board or any other parties. The Alliance for Affordable Energy and Union of Concerned Scientists attempted to make Meta a party to Entergy’s application—which would have required it to share information and submit to questioning—but a judge denied that motion on April 4.

The public utility commissions that approve energy infrastructure in most states are the main democratic lever to assure that data centers don’t negatively impact consumers. But they have no oversight over the tech companies running the data centers or the private companies that build the centers, leaving residential customers, consumer advocates and environmentalists in the dark. This is because they approve the power plants that fuel the data centers but do not have jurisdiction over the data centers themselves.

“This is kind of a relic of the past where there might be some energy service agreement between some large customer and the utility company, but it wouldn't require a whole new energy facility,” Arbaje said.

A research paper by Ari Peskoe and Eliza Martin published in March looked at 50 regulatory cases involving data centers, and found that tech companies were pushing some of the costs onto utility customers through secret contracts with the utilities. The paper found that utilities were often parroting rhetoric from AI-boosting politicians—including President Biden—to suggest that pushing through permitting for AI data center infrastructure is a matter of national importance.

“The implication is that there’s no time to act differently,” the authors wrote.

In written testimony sent to the public service commission, Entergy CEO Phillip May argued that the company had to bypass a legally required request for proposals and requirement to find the cheapest energy sources for the sake of winning over Meta.

“If a prospective customer is choosing between two locations, and if that customer believes that location A can more quickly bring the facility online than location B, that customer is more likely to choose to build at location A,” he wrote.

Entergy also argues that building new gas plants will in fact lower electricity bills because Meta, as the largest customer for the gas plants, will pay a disproportionate share of energy costs. Naturally, some are skeptical that Entergy would overcharge what will be by far their largest customer to subsidize their residential customers. “They haven't shown any numbers to show how that's possible,” Burke says of this claim. Meta didn’t have a response to this specific claim when asked by 404 Media.

Some details, like how much energy Meta will really need, the details of its hiring in the area and its commitment to renewables are still cloaked in mystery.

“We can't ask discovery. We can't depose. There's no way for us to understand the agreement between them without [Meta] being at the table,” Burke said.

It’s not just Entergy. Big energy companies in other states are also pushing out costly fossil fuel infrastructure to court data centers, and pushing the costs onto captive residents. In Kentucky, the energy company that serves the Louisville area is proposing two new gas plants for hypothetical data centers that have yet to be contracted by any tech company. The company, PPL Electric Utilities, is also planning to offload the cost of new energy supply onto its residential customers just to become more competitive for data centers.

“It's one thing if rates go up so that customers can get increased reliability or better service, but customers shouldn't be on the hook to pay for new power plants to power data centers,” said Cara Cooper, a coordinator with Kentuckians for Energy Democracy, which has intervened on an application for new gas plants there.

These rate increases don’t take into account the downstream effects on energy; as the supply of materials and fuel are inevitably usurped by large data center load, the cost of energy goes up to compensate, with everyday customers footing the bill, according to Glick with Synapse.

Glick says Entergy’s gas plants may not even be enough to satisfy the energy needs of Meta’s massive data center. In written testimony, Glick said that Entergy will have to either contract with a third party for more energy or build even more plants down the line to fuel Meta’s massive data center.

To fill the gap, Entergy has not ruled out lengthening the life of some of its coal plants, which it had planned to close in the next few years. The company already pushed back the deactivation date of one of its coal plants from 2028 to 2030.

The increased demand for gas power for data centers has already created a widely reported bottleneck for gas turbines, the majority of which are built by just three companies. One of those companies, Siemens Energy, told Politico that turbines are “selling faster than they can increase manufacturing capacity,” which the company attributed to data centers.

Most of the organizations concerned about the situation in Louisiana view Meta’s massive data center as inevitable and are trying to soften its impact by getting Entergy to utilize more renewables and make more concrete economic development promises.

Andreanecia Morris, with Housing Louisiana, believes the lack of transparency from public utility commissions is a bigger problem than just Meta. “Simply making Meta go away, isn't the point,” Morris says. “The point has to be that the Public Service Commission is held accountable.”

Burke says Entergy owns less than 200 megawatts of renewable energy in Louisiana, a fraction of the fossil fuel capacity it is proposing to fuel Meta’s center. Entergy was approved by Louisiana’s public utility commission to build out three gigawatts of solar energy last year, but has yet to build any of it.

“They're saying one thing, but they're really putting all of their energy into the other,” Burke says.

New gas plants are hugely troubling for the climate. But ironically, advocates for affordable energy are equally concerned that the plants will sit disused, with Louisiana residents stuck financing their construction and upkeep. Generative AI has yet to prove its profitability, and the computing-heavy strategy of American tech companies may prove unnecessary given the less resource-intensive alternatives coming out of China.

“There's such a real threat in such a nascent industry that what is being built is not what is going to be needed in the long run,” said Burke. “The challenge remains that residential ratepayers in the long run are being asked to finance the risk, and obviously that benefits the utilities, and it really benefits some of the most wealthy companies in the world. But it sure is risky for the folks who are living right next door.”

The Alliance for Affordable Energy expects the commission to make a decision on the plants this fall.






Eulogy for the Satellite Phone


We take it for granted that we almost always have cell service, no matter where we go around town. But there are places — the desert, the forest, or the ocean — where you might not have coverage. In addition, there are certain jobs where you must be able to make a call even if the cell towers are down, for example, after a hurricane. Recently, a combination of technological advancements has made it possible for your ordinary cell phone to connect to a satellite for at least some kind of service. But before that, you needed a satellite phone.

On TV and in movies, these are simple. You pull out your cell phone that has a bulkier-than-usual antenna, and you make a call. But the real-life version is quite different. While some satellite phones were connected to something like a ship, I’m going to consider a satellite phone, for the purpose of this post, to be a handheld device that can make calls.

History


Satellites have been relaying phone calls for a very long time. Early satellites carried voice transmissions in the late 1950s. But it would be 1979 before Inmarsat would provide MARISAT for phone calls from sea. It was clear that the cost of operating a truly global satellite phone system would be too high for any single country, but it would be a boon for ships at sea.

Inmarsat started as a UN organization created to provide a satellite network for naval operations. It would grow to operate 15 satellites and become a private British-based company in 1998. However, by the late 1990s, there were competing companies like Thuraya, Iridium, and Globalstar.

An IsatPhone-Pro (CC-BY-SA-3.0 by [Klaus Därr])
The first commercial satellite phone call was in 1976. The oil platform “Deep Sea Explorer” had a call with Phillips Petroleum in Oklahoma from the coast of Madagascar. Keep in mind that these early systems were not what we think of as mobile phones. They were more like portable ground stations, often with large antennas.

For example, here was part of a press release for a 1989 satellite terminal:

…small enough to fit into a standard suitcase. The TCS-9200 satellite terminal weighs 70lb and can be used to send voice, facsimile and still photographs… The TCS-9200 starts at $53,000, while Inmarsat charges are $7 to $10 per minute.


Keep in mind, too, that in addition to the briefcase, you needed an antenna. If you were lucky, your antenna folded up and, when deployed, looked a lot like an upside-down umbrella.

However, Iridium launched specifically to bring a handheld satellite phone service to the market. The first call? In late 1998, U.S. Vice President Al Gore dialed Gilbert Grosvenor, the great-grandson of Alexander Graham Bell. The phones looked like very big “brick” phones with a very large antenna that swung out.

Of course, all of this was during the Cold War, so the USSR also had its own satellite systems: Volna and Morya, in addition to military satellites.

Location, Location, Location


The earliest satellite phone systems relied on geosynchronous satellites, which make one orbit of the Earth each day; that means they orbit at a very specific height. Higher orbits would cause the Earth to appear to move under the satellite, while lower orbits would have the satellite racing around the Earth.
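
(For the curious, that specific height falls out of Kepler’s third law: with Earth’s gravitational parameter GM ≈ 3.986 × 10^14 m³/s² and a sidereal day of roughly 86,164 seconds, the orbital radius r = (GM·T²/4π²)^(1/3) comes out to about 42,164 km from the Earth’s center, or roughly 35,786 km above the equator, where the satellite’s revolution exactly matches the Earth’s rotation.)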

That means that, from the ground, it looks like they never move. This gives reasonable coverage as long as you can “see” the satellite in the sky. However, it means you need better transmitters, receivers, and antennas.
Iridium satellites are always on the move, but blanket the earth.
This is how Inmarsat and Thuraya worked. Unless there is some special arrangement, a geosynchronous satellite only covers about 40% of the Earth.

Getting a satellite into a high orbit is challenging, and there are only so many available “slots” in the exact orbit required to be geosynchronous. That’s why other companies like Iridium and Globalstar wanted an alternative.

That alternative is to have satellites in lower orbits. It is easier to talk to them, and you can blanket the Earth. However, for full coverage of the globe, you need at least 40 or 50 satellites.

The system is also more complex. Each satellite is only overhead for a few minutes, so you have to switch between orbiting “cell towers” all the time. If there are enough satellites, it can be an advantage because you might get blocked from one satellite by, say, a mountain, and just pick up a different one instead.
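
As a toy model of that constant hand-off (purely illustrative, and not how any real Iridium or Globalstar terminal actually works), the terminal-side decision can be thought of as tracking which satellites sit above some minimum elevation mask and latching onto the highest one; the mask angle and satellite names below are made up.

```cpp
// Toy model of LEO hand-off: pick the visible satellite with the highest
// elevation above a mask angle, switching as the constellation drifts overhead.
#include <optional>
#include <string>
#include <vector>
#include <cstdio>

struct SatelliteView {
  std::string name;
  double elevationDeg;   // elevation as seen from the phone; below the mask means unusable
};

std::optional<SatelliteView> pick_satellite(const std::vector<SatelliteView> &sky,
                                            double maskDeg = 8.0) {
  std::optional<SatelliteView> best;
  for (const auto &sat : sky) {
    if (sat.elevationDeg < maskDeg) continue;              // blocked by horizon, hills, trees
    if (!best || sat.elevationDeg > best->elevationDeg) best = sat;
  }
  return best;   // empty means "no service" until the next satellite rises
}

int main() {
  std::vector<SatelliteView> sky{{"SV-12", 4.0}, {"SV-31", 27.5}, {"SV-07", 61.2}};
  if (auto sat = pick_satellite(sky))
    std::printf("using %s at %.1f deg\n", sat->name.c_str(), sat->elevationDeg);
}
```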

Globalstar used 48 satellites, but couldn’t cover the poles. They eventually switched to a constellation of 24 satellites. Iridium, on the other hand, operates 66 satellites and claims to cover the entire globe. The satellites can beam signals to the Earth or each other.

The Problems


There are a variety of issues with most, if not all, satellite phones. First, geosynchronous satellites won’t work if you are too far north or south, since the satellite will sit so low on the horizon that you’ll bump into things like trees and mountains. Of course, they don’t work if you are on the wrong side of the world, either, unless there is a network of them.

Getting a signal indoors is tricky. Sometimes, it is tricky outdoors, too. And this isn’t cheap. Prices vary, but soon after the release, phones started at around $1,300, and then you paid $7 a minute to talk. The geosynchronous satellites, in particular, are subject to getting blocked momentarily by just about anything. The same can happen if you have too few satellites in the sky above you.

Modern pricing is a bit harder to figure out because of all the different plans. However, expect to pay between $50 and $150 a month, plus per-minute charges ranging from $0.25 to $1.50 per minute. In general, networks with less coverage are cheaper than those that work everywhere. Text messages are extra. So, of course, is data.

If you want to see what it really looked like to use a 1990s-era Iridium phone, check out the [saveitforparts] video below.

youtube.com/embed/omerPV8CPZQ?…

If you prefer to see an older non-phone system, check him out with an even older Inmarsat station in this video:

youtube.com/embed/mOvUxoA7Ngs?…

So it is no wonder these never caught on with the mass market. We expect that if providers can link normal cell phones to a satellite network, these older systems will fall by the wayside, at least for voice communications. Or, maybe hacker use will get cheaper. We can hope, right?


hackaday.com/2025/06/23/eulogy…




Operation Midnight Hammer, the US strike that changes the Middle East. Caruso’s analysis

@Notizie dall'Italia e dal mondo

On the night between Saturday, June 21 and Sunday, June 22, 2025, the United States took an unprecedented step in its modern history: a direct attack on Iran’s nuclear infrastructure. “Operation Midnight Hammer” represents a



Video surveillance and state trojans: Berlin’s state government wants to expand police powers


netzpolitik.org/2025/videouebe…



Cloudflare mitigates a 7.3 terabit-per-second attack. Imagine 9,350 HD movies downloaded in 45 seconds


In mid-May 2025, Cloudflare blocked the largest DDoS attack ever recorded: a full 7.3 terabits per second (Tbps). The event came shortly after the publication of the DDoS threat report for the first quarter of 2025, released on April 27, 2025, which had highlighted attacks reaching 6.5 Tbps and 4.8 billion packets per second (pps).

By today’s standards, 37.4 terabytes is not a staggering amount of data, but downloading 37.4 terabytes in just 45 seconds is.
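
(The two headline figures are consistent with each other: 37.4 TB is roughly 299 terabits, and 299 terabits spread over 45 seconds averages out to about 6.6 Tbps, with the attack peaking at the reported 7.3 Tbps.)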

It is the equivalent of flooding the network with more than 9,350 HD movies, or streaming 7,480 hours of high-definition video without interruption (nearly a year of back-to-back TV series marathons), in just 45 seconds.

If it were music, you would be downloading roughly 9.35 million songs in under a minute, enough to keep a listener busy for 57 years straight. Imagine taking 12.5 million high-resolution photos with your smartphone without ever running out of storage: even at one photo a day, you would be clicking away for 4,000 years, yet here it all happens in 45 seconds.

The attack targeted a Cloudflare customer, a hosting provider that uses Magic Transit to defend its IP network. Hosting providers and critical Internet infrastructure are increasingly frequent targets of DDoS attacks, as reported in the latest DDoS threat report.

The image below shows an attack campaign carried out between January and February 2025, which launched more than 13.5 million DDoS attacks against Cloudflare’s infrastructure and hosting providers protected by Cloudflare.

The attack carpet-bombed an average of 21,925 destination ports on a single IP address, peaking at 34,517 destination ports per second. The attack traffic also came from a similarly broad distribution of source ports.

The 7.3 Tbps attack was a multi-vector DDoS attack. Roughly 99.996% of the attack traffic was classified as UDP flood. The remaining 0.004%, about 1.3 GB of attack traffic, was identified as QOTD reflection, Echo reflection, NTP reflection, Mirai UDP flood, Portmap flood, and RIP amplification attacks.

The article “Cloudflare mitigates a 7.3 terabit-per-second attack. Imagine 9,350 HD movies downloaded in 45 seconds” originally appeared on il blog della sicurezza informatica.



The EU investigates xAI’s acquisition of Elon Musk’s X platform

The article comes from #Euractiv Italia and was reshared on the Lemmy community @Intelligenza Artificiale
The Commission has sent a request for information under the EU’s online rulebook, the Digital Services Act (DSA), in order to clarify the structure




Filomena Gallo takes part in the screening of the film “La stanza accanto”


Attorney Filomena Gallo, National Secretary of the Associazione Luca Coscioni, takes part in a discussion and in the screening of Pedro Almodóvar’s film La stanza accanto (The Room Next Door). The screening is organized by Rete dei Diritti, in collaboration with Arianteo.

The event is scheduled for Wednesday, July 9, 2025 at 8:45 PM at Palazzo Reale in Milan.


Also taking part alongside Filomena Gallo is Valeria Imbrogno, psychologist and partner of DJ Fabo. Ilio Pacini Mannucci, a judge at the Court of Milan, presents and moderates. Tickets can be purchased on site.

The article “Filomena Gallo takes part in the screening of the film ‘La stanza accanto’” originally appeared on Associazione Luca Coscioni.



Data sharing between authorities: interior ministers are putting trust in the treatment of mental illness at risk


netzpolitik.org/2025/datenaust…



Earth’s Oxygen Levels and Magnetic Field Strength Show Strong Correlation



Time series of O2 (blue) and VGADM (red). (Credit: Weijia Kuang, Science Advances, 2025)
In an Earth-sized take on the age-old ‘correlation or causality’ question, researchers have come across a fascinating match between Earth’s magnetic field and its oxygen levels since the Cambrian explosion, about 500 million years ago. The full results by [Weijia Kuang] et al. were published in Science Advances, where the authors speculate that this high correlation between the geomagnetic dipole and oxygen levels as recorded in the Earth’s geological mineral record may be indicative of the Earth’s geological processes affecting the evolution of lifeforms in its biosphere.
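
For readers who want to poke at the published time series themselves, the headline statistic boils down to a linear correlation between two records. The snippet below computes a plain Pearson coefficient for two equally sampled series; it is a generic illustration rather than the authors’ actual analysis pipeline, and the handful of sample values is invented.

```cpp
// Generic Pearson correlation between two equal-length time series,
// e.g. an oxygen proxy and a geomagnetic dipole-moment record.
#include <cmath>
#include <cstdio>
#include <vector>

double pearson(const std::vector<double> &x, const std::vector<double> &y) {
  const size_t n = x.size();
  double mx = 0, my = 0;
  for (size_t i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
  mx /= n; my /= n;
  double num = 0, sx = 0, sy = 0;
  for (size_t i = 0; i < n; i++) {
    num += (x[i] - mx) * (y[i] - my);
    sx  += (x[i] - mx) * (x[i] - mx);
    sy  += (y[i] - my) * (y[i] - my);
  }
  return num / std::sqrt(sx * sy);
}

int main() {
  // Made-up sample values, standing in for binned O2 and dipole-moment data.
  std::vector<double> o2     {18.0, 21.5, 23.0, 26.5};
  std::vector<double> dipole { 5.1,  6.0,  6.4,  7.3};
  std::printf("r = %.3f\n", pearson(o2, dipole));
}
```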

As with any such correlation, one has to entertain the notion that said correlation might be spurious or indirectly related before assuming a strong causal link. It is already known, for example, that the solar wind affects the Earth’s atmosphere, with a more intense solar wind increasing the loss of oxygen into space; it also distorts the geomagnetic field, but it changes the field’s shape rather than its strength. The question is thus whether there is a mechanism that would affect this field strength and consequently cause the loss of oxygen to the solar wind to spike.

Here the authors suggest that the Earth’s core dynamics – critical to the geomagnetic field – may play a major role, with conceivably the core-mantle interactions over the course of millions of years affecting it. As supercontinents like Pangea formed, broke up and partially reformed again, the impact of this material solidifying and melting could have been the underlying cause of these fluctuations in oxygen and magnetic field strength levels.

Although it is hard to say at this point in time, it may very well be that this correlation is causal, albeit with both records being symptoms of activity in the Earth’s core and liquid mantle.


hackaday.com/2025/06/23/earths…



What does the US want from tech?


IT'S MONDAY, AND THIS IS DIGITAL POLITICS. I'm Mark Scott, and here's a quick scheduling update. I'm taking the first two weeks off in July, so next week's newsletter will be the last until July 14.

Don't worry. I've got some programming planned for my upcoming break, but FYI.

— The United States is pursuing a "cake-and-eat-it" strategy on digital policymaking. It's leaving many confused, at home and abroad.

— The United Kingdom just passed the world's second mandate requiring social media companies to open up to outsiders in the name of accountability and transparency.

— Social media dominates how we all access information online. But artificial intelligence tools are likely to upend that status quo.

Let's get started:



digitalpolitics.co/newsletter0…