#socialcomputing


The past few days I've been thinking a lot again about one of the thought/design models most influential on my own #OpenSource practice: Frank Duffy's architectural pace layers (and Stewart Brand's subsequent extension of them to other contexts), with their different timescales and interactions as a basis for resilient system design:

1. Each layer exists & operates independently and moves at its own timescale (from seconds to millennia and beyond)
2. Each layer influences, and only interacts with, its direct neighbors (sketched in code below)

"Fast layers innovate, slow ones stabilize." — S.Brand

I've always found that model super helpful for analyzing and deciding how to deal with individual projects and components in terms of focus/effort, and for asking myself which layer a given thing might/should be part of. Lately though, I keep trying to figure out how to better use that model to orient my own future practice, also with respect to the loose theme of #PermaComputing: how to frame and better organize my own approaches to it, incl. how to reanimate or repurpose some of the related, discontinued, but not invalid research & projects I've been doing along these lines over the past 15 years...

I understand and appreciate most of the focus on #FrugalComputing & #RetroComputing-derived simplicity as starting points and grounding concepts for building more sustainable, personal, comprehensible and maintainable tech, but these too can quickly become overly dogmatic and maybe too constraining to ever become "truly" permanent (at least on the horizon of a few decades). I think the biggest hurdles to overcome are social rather than technological (e.g. the need for post-consumerist, post-spectacular behaviors), so I'm even more interested in Illich/Papert/Nelson/Felsenstein-inspired #ConvivialComputing, #SocialComputing, IO/comms/p2p, #Accessibility, UI, protocol and other resiliency design aspects becoming a core part of that research. I also think the idea of pace layering can be a very powerful tool to take into consideration here, at the very least for guiding (and questioning) how to approach and structure any perma-computing related research itself...

Given the current physical and political climate shifts, is it better to continue working "upwards" (aka #BottomUpDesign), i.e. primarily focusing on first defining slow-moving, low-level layers as new/alternative foundations (an example here would be the flurry of VM projects, incl. my own)? Or does the situation instead call for a more urgent focus on fast-moving pace-layer experiments, continuously accumulating learnings as fallout/sediment to allow the formation of increasingly stable, but also more holistically informed, slower-moving structural layers to build upon further?

It's a bit of chicken vs. egg! In my mind, the best approach right now seems to be a two-pronged one, alternating from both sides, each time informing upcoming work/experiments on the opposite end (fast/slow), and each time involving as diverse a set of non-techbro minds from various fields as possible... 🤔

Personal: I am very happy to announce that I have accepted a tenure-track Assistant Professor position at the University of Groningen. Looking forward to further collaborations, and glad to continue working within the Information Systems Group at the Bernoulli Institute.

@academicchatter get in touch if you are working on #networkscience, #datascience, #misinformation, #polarization, #privacy, #crowdsourcing, #SocialComputing, #complexsystems @academicsunite

Thought I'd post an (academic) #introduction. :D I am an Assistant Professor at the University of #Groningen, with a current research focus on #largescale #networkanalysis (#networkscience), working within the #InformationSystems Group; I previously worked on #humancomputation (don't like the term), #socialcomputing, and #distributedsystems at #tuwien. Interested in socio-tech issues, including #ethics, #privacy, and tech #policy.
#academicsofmastodon #complexnetworks #complexsystems
@academicchatter

post-migration re-#introduction:

I'm an assistant professor in the School of Information at #umich. I develop computational methods to study conversations, in the vein of #computationalsocialscience #datascience #nlp #nlproc, with #socialcomputing #cscw #emca #hci #linguistics in my peripheral vision.

I have a dog whose yawns, out of context, can be construed as screaming. I maintain a messy mapping of books to cafés at tisjune.github.io/recreation/


Been thinking more about how to introduce “design friction” to improve our online interactions. Mastodon has many such frictions, such as the lack of a quote-post function, that constrain virality. But we lose the pro-social effects of virality, such as visibility into a conversation going on outside of our own “filter bubble.”

More on the antivirality of Mastodon: uxdesign.cc/mastodon-is-antivi