Good question; the problem is: "How do you calculate said numbers?"
Hi everyone, I am organising LOCO 2024, the 1st International Workshop on Low Carbon Computing.
It's hybrid and will be held on 3 Dec 2024 in Glasgow (Scotland) and online.
Online attendance is free; in-person attendance is £30.
Deadline: 1 Oct (full talk) / 8 Oct (lightning talk).
Please consider submitting, whether you're academic or not, and please spread the word.
@drwho Shit like this makes me hate not just #snap but @letsencrypt, because that's more code than the entire backend for @cacert ... or the acme.sh & #CertBot scripts they made, and certainly not more than the #API for #CAcert back in its day... I think there needs to be more and harder pushes for #FrugalComputing, because there's no valid reason for them to basically shove an entire #OS onto an existing one...
Glad to see scientists with a higher profile than me saying the same things on the environmental costs of "AI".
The carbon impact of artificial intelligence
https://www.nature.com/articles/s42256-020-0219-9
It takes a lot of energy for machines to learn – here's why AI is so power-hungry
https://theconversation.com/it-takes-a-lot-of-energy-for-machines-to-learn-heres-why-ai-is-so-power-hungry-151825
Energy consumption of AI poses environmental problems
https://www.techtarget.com/searchenterpriseai/feature/Energy-consumption-of-AI-poses-environmental-problems
Silicon Valley and the Environmental Costs of AI
https://www.perc.org.uk/project_posts/silicon-valley-and-the-environmental-costs-of-ai/
In 2020: 27 kilowatt-hours of energy to train an AI model.
In 2022: 1 million kilowatt-hours.
https://semiengineering.com/ai-power-consumption-exploding/
via https://mastodon.green/@gerrymcgovern/111770694336468868
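To put that jump in perspective, here's a trivial back-of-the-envelope calculation. Only the two figures quoted above go in; the growth factors are simply derived from them:

```python
# Back-of-the-envelope growth calculation from the figures quoted above.
kwh_2020 = 27            # kWh to train an AI model in 2020 (figure quoted above)
kwh_2022 = 1_000_000     # kWh to train an AI model in 2022 (figure quoted above)

total_growth = kwh_2022 / kwh_2020      # ~37,000x over two years
yearly_growth = total_growth ** 0.5     # ~190x per year, if growth were uniform

print(f"Total growth 2020 -> 2022: {total_growth:,.0f}x")
print(f"Implied yearly growth factor: {yearly_growth:,.0f}x")
```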
OK, I just did a more rigorous study and I conclude that a single ChatGPT query requires between 30x and 50x more energy than a conventional search query. We can't afford this.
(blog post to follow soon)
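In the meantime, a minimal sketch of the shape of that estimate. The per-query energy figures below are illustrative assumptions, not measured values; they are chosen only to show how a 30x–50x range falls out of the division:

```python
# Illustrative estimate of the ChatGPT-vs-search energy ratio.
# The per-query energies below are ASSUMED values for the sake of the sketch,
# not measurements; the point is the structure of the calculation.

search_query_wh = 0.3                # assumed energy per conventional search query (Wh)
llm_query_wh_range = (9.0, 15.0)     # assumed energy per ChatGPT-style query (Wh)

ratios = [llm / search_query_wh for llm in llm_query_wh_range]
print(f"LLM query uses {ratios[0]:.0f}x to {ratios[1]:.0f}x the energy of a search query")
```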
"Digitalization is unlikely to be the environmental silver bullet it is sometimes claimed to be. On the contrary, the way digitalization changes society, making it ever faster, more connected, and allowing us unprecedented levels of efficiency might in fact lead to a backfire."
Digital Rebound – Why Digitalization Will Not Redeem Us Our Environmental Sins
Over the past few days I've again been thinking a lot about one of the thought/design models most influential on my own #OpenSource practice: Frank Duffy's architectural pace layers (and Stewart Brand's subsequent extension of them to other contexts), with their different timescales and interactions as a basis for resilient system design:
1. Each layer exists & operates independently and moves at its own timescale (from seconds to millennia and beyond)
2. Each layer influences, and only interacts with, its direct neighbors
"Fast layers innovate, slow ones stabilize." — S.Brand
I've always found that model super helpful for analyzing individual projects and components, deciding how to deal with them in terms of focus/effort, and asking myself which layer a given thing might/should be part of. Lately, though, I keep trying to figure out how to better use that model to orient my own future practice, also with respect to the loose theme of #PermaComputing: how to frame and better organize my own approaches to it, incl. how to reanimate or repurpose some of the related, discontinued, but not invalid research & projects I've been doing along these lines over the past 15 years...
I understand and appreciate most of the focus on #FrugalComputing & #RetroComputing-derived simplicity as starting points and grounding concepts for attempting to build more sustainable, personal, comprehensible and maintainable tech. But these too can quickly become overly dogmatic, and maybe too constraining to ever become "truly" permanent (at least on the horizon of a few decades). I think the biggest hurdles to overcome are social rather than technological (e.g. the need for post-consumerist, post-spectacular behaviors), so I'm even more interested in Illich/Papert/Nelson/Felsenstein-inspired #ConvivialComputing, #SocialComputing, IO/comms/p2p, #Accessibility, UI, protocol and other resiliency design aspects becoming a core part of that research. I also think the idea of pace layering can be a very powerful tool to take into consideration here, at the very least for guiding (and questioning) how to approach and structure any perma-computing related research itself...
Given the current physical and political climate shifts, is it better to continue working "upwards" (aka #BottomUpDesign), i.e. primarily focusing on first defining slow-moving, low-level layers as new/alternative foundations (an example here would be the flurry of VM projects, incl. my own)? Or does the situation instead call for a more urgent focus on fast-moving pace-layer experiments, continuously accumulating learnings as fallout/sediment to allow the formation of increasingly stable, but also more holistically informed, slower-moving structural layers to build upon further?
It's a bit of a chicken-vs-egg question! In my mind, the best approach right now seems to be a two-pronged one: alternating from both sides, each time informing upcoming work/experiments on the opposite end (fast/slow), and each time involving as diverse a set of non-techbro minds from various fields as possible...
* Make software that works on older devices, the older the better.
* Make software that will keep on working for a very long time.
* Make software that uses the least amount of total energy to achieve its results (see the measurement sketch after this list).
* Make software that also uses the least amount of network data transfer, memory and storage.
* Make software that encourages the user to use it in a frugal way.
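On the total-energy point above, a minimal sketch of how one might actually measure it on Linux via the RAPL powercap interface. This assumes an Intel-style /sys/class/powercap/intel-rapl:0/energy_uj counter (often root-readable only), measures the whole CPU package rather than a single process, and ignores counter wrap-around for brevity:

```python
# Rough energy measurement of a Python workload via Intel RAPL on Linux.
# Assumes /sys/class/powercap/intel-rapl:0/energy_uj exists and is readable;
# the counter wraps around, which this sketch ignores for brevity.

from pathlib import Path

RAPL_ENERGY = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

def read_energy_uj() -> int:
    return int(RAPL_ENERGY.read_text())

def measure_joules(workload) -> float:
    """Return the package energy (in joules) consumed while running workload()."""
    before = read_energy_uj()
    workload()
    after = read_energy_uj()
    return (after - before) / 1e6   # microjoules -> joules

if __name__ == "__main__":
    joules = measure_joules(lambda: sum(i * i for i in range(10_000_000)))
    print(f"Workload consumed roughly {joules:.2f} J (whole package, not just this process)")
```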
"The climate cost of the AI revolution"
On the energy cost of Large Language Models, what their widespread adoption could mean for global CO₂ emissions, and what could be done about it.
https://limited.systems/articles/climate-cost-of-ai-revolution/