shakedown.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A community for live music fans with roots in the jam scene. Shakedown Social is run by a team of volunteers (led by @clifff and @sethadam1) and funded by donations.

#entropy
Don Curren 🇨🇦🇺🇦

“They have proposed nothing less than a new law of #nature, according to which the #complexity of entities in the universe increases over time with an inexorability comparable to the … inevitable rise in #entropy, a measure of disorder”

https://www.wired.com/story/why-everything-in-the-universe-turns-more-complex/?utm_source=nl&utm_brand=wired&utm_mailing=WIR_Daily_060825&utm_campaign=aud-dev&utm_medium=email&utm_content=WIR_Daily_060825&bxid=5bd67d5524c17c104802a904&cndid=49072736&hasha=92da85d78a132859c626d704a48daaf8&hashc=54340b2d3b1ebc399607e4dd91d3784593e18ef6368cfa0e6cfe524bc4038950&esrc=BottomStories&utm_term=WIR_Daily_Active
Replied in thread

@bullivant What’s the case for Fascist intelligence? Stupidity, I guess, as the AI scam hits critical mass, on the basis of language acquisition and spatial recognition being recursive. No work, no money from IP, none to pay and profit from its theft. Partial intelligibility is comprehensive, so not constituted by the Control of Information Processing! #FascistTech #OligarchsFinalSolution #ReplacingPeople #AiScam #Ai #CIP #FascistStupidity #FascistFanBoys #DuoLingo #Entropy #Collapse

For today's #DailyDoodle, I traced and sketched the cover art for the #PhilipGlass album #GlassWorks while listening to the first track on repeat.

It's always been a favorite of mine, and I love how ordered his music is. I think you either love his stuff or utterly hate it, either completely enjoy listening to it, or NOPE right out of it after about a minute.

Here is the track, via invidious: https://inv.nadeko.net/watch?v=_2vRbNehGB0

I traced the basic outline of the image via the line tool (rather, the line mode of the pen tool) in #Xournal++, and then freestyled the shading with the album art to the side, rather than traced.

For tracing, I just put the Xournal++ window above the swayimg window, and made it mostly transparent.

I think I will start using this account for my daily doodles, as it'd be nice to have them gathered together in a separate space from my usual account (@rl_dane@polymaths.social), and I don't take that many photos, anyway.

It's very interesting how much #entropy there is in hand-drawn sketches. Even when I was trying to save space by converting the image to monochrome (which I didn't do with this one), the file sizes for the sketches were still pretty large, and even attempting to convert to #JPEG or #WebP didn't save much space over the monochrome PNG unless I really cranked down the compression factor and turned it into a blurry mess.

It's also fascinating that the compressed XML source for the image is about the same size as the PNG. ^__^ (204KiB vs 158KiB)
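The "entropy in hand-drawn sketches" observation above can be made concrete: the Shannon entropy of the pixel distribution lower-bounds how many bits per pixel any lossless coder (such as the DEFLATE stage inside PNG, under a memoryless model) can achieve. A minimal sketch with synthetic data standing in for real scan pixels (the specific gray-level ranges are made-up illustrations, not measurements from the actual doodle):

```python
import math
import random
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy, in bits per symbol, of a byte/pixel sequence.
    This lower-bounds the average bits per symbol a lossless coder
    can reach when it treats pixels as independent draws."""
    counts = Counter(data)
    total = len(data)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

# A flat, ordered region (a single gray level) carries no information...
flat = bytes([200] * 1000)

# ...while pencil-like shading spread over many gray levels is close to
# log2(number of levels) bits/pixel, which is one reason scanned
# sketches stay large even as monochrome PNGs.
random.seed(0)
shaded = bytes(random.randrange(180, 256) for _ in range(1000))

print(shannon_entropy(flat))    # → 0.0
print(shannon_entropy(shaded))  # a bit under log2(76) ≈ 6.25
```

Cranking up JPEG/WebP compression works by deliberately discarding exactly this kind of high-entropy detail, which is why the sketch turns blurry before the file gets small.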
Replied in thread

@GovTrack

A thought experiment.

Do you understand the difference between Entropy and Random?

Consider a shaker with 50 Green balls and 50 Red balls inside, all uniform except for color.

Are the contents of the shaker Random? The answer is No. We know the contents; they are well defined.

But, we know the contents have max Entropy.

Next, we shake one ball out of the shaker and drop it into a box we cannot see inside.

What have we learned? Two things. We now know that the Entropy inside the shaker has dropped. And we know we have a Random bit in the box.

At this point, we still have max Entropy in the overall system (shaker and box), but we still do not have our Random bit.

We have to Observe the contents of the box, and Record the state of the ball in the box. Red=0, Green=1 or the reverse depending upon Choice.

You now have a Random bit.

You now put the ball back in the shaker which now restores the max Entropy.

Then, you shake again, repeating the process until you have collected the states of as many Random bits as desired.

This process is not fast. Beware of fast Random.
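The shake, observe, record, and return loop described above can be simulated directly. A minimal sketch (the function name and Green=1/Red=0 encoding are my own choices, matching the post's convention):

```python
import random

def shaker_bits(n_bits, greens=50, reds=50, seed=None):
    """Simulate the thought experiment: shake one ball out of a
    well-mixed shaker, observe its color in the box, record a bit
    (Green=1, Red=0), then return the ball to the shaker so the
    shaker is back at maximum entropy before the next shake."""
    rng = random.Random(seed)
    shaker = ["G"] * greens + ["R"] * reds
    bits = []
    for _ in range(n_bits):
        ball = rng.choice(shaker)              # shake one ball into the box
        bits.append(1 if ball == "G" else 0)   # observe and record its state
        # The ball goes back into the shaker, so its composition
        # (and hence its entropy) is unchanged for the next round.
    return bits

print(shaker_bits(8, seed=42))
```

Note the simulation only *models* randomness with a seeded pseudo-random generator; the post's point stands that a physical source must actually be observed and recorded, one slow draw at a time, to yield each random bit.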

#PhysicsJournalClub
"Temperature as joules per bit"
by C.A. Bédard, S. Berthelette, X. Coiteux-Roy, and S. Wolf

Am. J. Phys. 93, 390 (2025)
doi.org/10.1119/5.0198820

Entropy is an important but largely misunderstood quantity. A lot of this confusion arises from its original formulation within the framework of Thermodynamics. Looking at it from a microscopic point of view (i.e. approaching it as a Statistical Mechanics problem) makes it a lot more digestible, but its ties to Thermodynamics still create a lot of unnecessary complications.
In this paper the authors suggest that by removing the forced connection between entropy and the Kelvin temperature scale, one can rethink entropy purely in terms of the information capacity of a physical system, which removes many of the difficulties that usually plague the understanding of what entropy is actually about.
I don't think the SI will ever take up their suggestion to drop the kelvin as a base unit and include bits, but this paper will be a great boon to any student banging their head against the idea of entropy for the first (or second, or third) time.
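A rough sketch of the unit change the title describes, stated from standard statistical mechanics rather than from the paper itself: if entropy is counted in bits instead of J/K, temperature becomes an energy cost per bit of entropy.

```latex
% Entropy re-expressed in bits:
S_{\mathrm{bits}} = \frac{S}{k_B \ln 2}
% Temperature then carries units of joules per bit:
T_{\mathrm{J/bit}} = \frac{\partial E}{\partial S_{\mathrm{bits}}} = k_B T \ln 2
% At room temperature (T = 300\,\mathrm{K}):
k_B T \ln 2 \approx 2.9 \times 10^{-21}\ \mathrm{J\ per\ bit}
```

This is the same quantity as Landauer's minimum energy to erase one bit, which is presumably why a bit-based temperature unit reads so naturally.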

What Is Entropy? A Measure of Just How Little We Really Know.

Exactly 200 years ago, a French engineer introduced an idea that would quantify the universe’s inexorable slide into decay. But entropy, as it’s currently understood, is less a fact about the world than a reflection of our growing ignorance. Embracing that truth is leading to a rethink of everything from rational decision-making to the limits of machines.

By Zack Savitsky

quantamagazine.org/what-is-ent