shakedown.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A community for live music fans with roots in the jam scene. Shakedown Social is run by a team of volunteers (led by @clifff and @sethadam1) and funded by donations.

Server stats: 270 active users

#accessibility

53 posts · 48 participants · 4 posts today

Quick #accessibility tip for iPhone users: you can turn on a gesture that lets iOS read your screen to you under Settings > Accessibility > Spoken Content.
Then swipe down with two fingers to hear a spoken version of your screen. Really handy for checking how your post will be read to blind people.
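If you want something similar for checking web posts, here's a rough browser-side analogue (my own suggestion, not part of the iOS feature): the Web Speech API can read a post's text aloud from the dev-tools console, which gives a quick sense of how the wording sounds when spoken. The `article` selector is an assumption about the page's markup.

```ts
// Rough browser-side analogue of the tip (an assumption, not the iOS feature):
// read a post's text aloud with the Web Speech API to hear how it sounds.
const post = document.querySelector('article')?.textContent ?? '';
const utterance = new SpeechSynthesisUtterance(post);
window.speechSynthesis.speak(utterance);
```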

Thanks, Jack White!

"On Sunday night, the Cactus Club launched a campaign with a video on social media, explaining that if all of their followers donated just one dollar, they would have more than enough funds to pay for the remainder of the construction. On Monday, they received a surprise donation from none other than Jack White, who kicked in the $20,000 necessary to fund the bump out extension."

onmilwaukee.com/articles/jack-

OnMilwaukee · Jack White makes surprise donation to Cactus Club accessibility initiative. The surprise donation on Monday comes as the Bay View venue faced increased costs for a ramp to be installed in front of their main entrance.
Replied in thread

@IAmDannyBoling @georgetakei We need to have a bigger discussion about the emotional labor of this (and the etiquette of demanding it from others in public), if only because it is 2025 and automation for this is becoming a realistic expectation.

It is time to get sanctimonious, not at users but at developers of screen readers (and of client software, though those days are also drawing to a close), because #accessibility for all and #AltText for all should be an automatic privilege.

This isn’t on the same level as hiding topics behind warning tags.

I am posting this because of the developer's wonderful enthusiasm regarding accessibility. When I asked whether this app is accessible with TalkBack, this was the response I received.

"Comment from Hakanft on R/Android

Thanks for asking — and that's a really important point!

The app was built using React Native (with Expo), so it should support basic TalkBack functionality out of the box. However, I haven’t fully optimized or tested it for screen reader accessibility yet. 🙁

Your question made me realize I need to improve this — and I will! If you're open to sharing feedback after trying it, I'd truly appreciate it."

The app itself is not only a great idea but can also be extremely helpful. It is a fun way to gauge your water intake and make sure you are drinking enough water each day. It also works in twenty-nine languages! Below is the full discussion on Reddit, complete with iOS and Android links.

reddit.com/r/Android/comments/
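For anyone curious what "basic TalkBack support out of the box" builds on, here's a minimal sketch of the accessibility props React Native exposes. This is my own illustration, not the app's actual code; the component and prop names are made up, but the accessibility props themselves are standard React Native.

```tsx
// Minimal sketch (not the actual app's code): how a React Native component
// can describe itself to TalkBack/VoiceOver via accessibility props.
// "WaterProgress" and its props are hypothetical names for illustration.
import React from 'react';
import { Pressable, Text, View } from 'react-native';

type Props = {
  glassesDrunk: number;   // glasses logged so far today
  dailyGoal: number;      // target number of glasses
  onLogGlass: () => void; // called when the user logs another glass
};

export function WaterProgress({ glassesDrunk, dailyGoal, onLogGlass }: Props) {
  return (
    <View>
      {/* Group the progress text into one element so the screen reader
          announces a single, meaningful summary instead of fragments. */}
      <Text
        accessible={true}
        accessibilityLabel={`Water intake: ${glassesDrunk} of ${dailyGoal} glasses today`}
      >
        {glassesDrunk} / {dailyGoal}
      </Text>

      {/* Give the control a role, label, and hint so TalkBack announces
          "Log a glass of water, button" and explains what it does. */}
      <Pressable
        accessibilityRole="button"
        accessibilityLabel="Log a glass of water"
        accessibilityHint="Adds one glass to today's total"
        onPress={onLogGlass}
      >
        <Text>+1 glass</Text>
      </Pressable>
    </View>
  );
}
```

Most of the screen-reader polish the developer mentions wanting to add would happen in props like these, on top of what React Native provides by default.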

Replied in thread
@Alex Feinman @Nora Reed Alt-text must never include explanations! Explanations must always go into the post itself!

Not everyone can access alt-text. Sighted people need a mouse/trackball/touchpad/trackpoint or a touch screen to access alt-text. And in order to operate that, they need at least one working hand. But not everyone has working hands. Just like not everyone can see, which is why you describe your images in the first place, right?

For those who can't access alt-text, any information only available in alt-text and neither in the post text nor in the image itself is inaccessible and lost. They can't open it, they can't read it.

Here are three relevant pages in my (very early WIP) wiki about image descriptions and alt-text:

#Long #LongPost #CWLong #CWLongPost #AltText #AltTextMeta #CWAltTextMeta #Disability #A11y #Accessibility
hub.netzgemeinde.eu · Jupiter Rowland - jupiter_rowland@hub.netzgemeinde.eu

I've submitted a proposal to WHATWG about a provenance/origin classifier for alt text on images.

github.com/whatwg/html/issues/

The current HTML specification provides the alt attribute on images to supply alternative text for users who cannot see visual content. However, there is no way to indicate whether the alt text was authored by a human or generated automatically by a machine (e.g., via AI or computer vision).

GitHub · Proposal: New attribute alt-origin to indicate provenance of alt text (machine vs. human authored) · Issue #11448 · whatwg/html · By Johnr24
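To make the idea concrete, here's a small sketch of what marking provenance could look like. To be clear, alt-origin is only a proposed attribute (issue #11448), not part of the HTML spec, and the "human"/"machine" values are my assumption based on the issue title.

```ts
// Illustrative sketch only: "alt-origin" is a proposal (whatwg/html #11448),
// not part of the HTML spec; the value names "human" and "machine" are
// assumptions drawn from the issue title.
function appendDescribedImage(
  parent: HTMLElement,
  src: string,
  altText: string,
  altWrittenByHuman: boolean,
): HTMLImageElement {
  const img = document.createElement('img');
  img.src = src;
  img.alt = altText;

  // Unknown attributes are tolerated in HTML, so a page could annotate
  // provenance this way today; assistive tech just wouldn't act on it
  // unless the proposal (or something like it) were standardized.
  img.setAttribute('alt-origin', altWrittenByHuman ? 'human' : 'machine');

  parent.appendChild(img);
  return img;
}
```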

Venting.

If I have to open the browser's dev tools, dig through the DOM to find where you've set the background colour of the page, and then change it manually, just to be able to read what you've written, you have failed utterly at web design, user experience, and accessibility simultaneously.

Yes, I'm looking at you, meditationsinanemergency.com/p .

Meditations in an Emergency · Please Shout Fire. This Theater Is Burning. The United States is being destroyed from within, and mainstream journalism isn't making that clear. When I was a kid, there was a popular phrase: "what if they had a war and no one came?" What if we were in a war and no one noticed? Obviously all the people…
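For anyone stuck on a page like that, the manual fix described above can be collapsed into a couple of lines pasted into the dev-tools console. This is a reader-side workaround, not an excuse for the design, and the colour values are arbitrary.

```ts
// Quick reader-side override: force readable colours on the page root.
// If the background is set on some nested wrapper element instead of
// <html>/<body>, that element may need the same treatment.
document.documentElement.style.setProperty('background-color', '#ffffff', 'important');
document.body.style.setProperty('background-color', '#ffffff', 'important');
document.body.style.setProperty('color', '#111111', 'important');
```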

Tuba: a desktop #fediverse client for #GNU #Linux. Here's a nice review: news.itsfoss.com/tuba/ If you install #Debian (ver. 13) with #GNOMEDesktop, it is already installed. It can also be installed with #APT commands from the terminal. I like it so far, but am looking for #GUI customizations. Glad to learn there's an #accessibility preferences panel.

It's FOSS News · Sail The Fediverse With Tuba Client for Linux. Tuba Fediverse client lets you connect with your Mastodon instance, and more. Check it out here!