shakedown.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A community for live music fans with roots in the jam scene. Shakedown Social is run by a team of volunteers (led by @clifff and @sethadam1) and funded by donations.

#eyetracking


New #accessibility rollout from #Slack: simplified layout mode in the desktop app.

From the blog post[1], simplified layout mode "helps you focus by showing one section of Slack at a time," and "minimizes distractions which may benefit single-taskers and people using assistive technology."

This was originally built to "make Slack easier to use for #eyeTracking and #headMouse users," but has since been determined to also benefit "other switch users as well as: #screenReader users (by reducing the amount of stuff on screen at once), #neurodivergent people who find Slack overwhelming or distracting, and even people with limited screen real estate."[2]

[1] slack.com/help/articles/412145

[2] web-a11y.slack.com/archives/C0


In our latest project, we taught irony to school children (and the parents rejoiced).

Irony comprehension requires going beyond the literal meaning of words and is challenging for children. In this pre-registered study, we investigated how teaching metapragmatic knowledge in classrooms impacts written irony comprehension in 10-year-old Finnish-speaking children (n = 41, 21 girls) compared to a control group (n = 34, 13 girls).

At pre-test, children read ironic and literal sentences embedded in stories while their eye movements were recorded. Next, the training group was taught about irony, and the control group was taught about reading comprehension. At post-test, the reading task and eye-tracking were repeated.

Irony comprehension improved after metapragmatic training on irony, suggesting that metapragmatic knowledge serves an important role in irony development. However, the eye movement data suggested that training did not change the strategy children used to resolve the ironic meaning. The results highlight the potential of metapragmatic training and have implications for theories of irony comprehension.

doi.org/10.1017/S0305000925000

Cambridge Core – Learning Irony in School: Effects of Metapragmatic Training | Journal of Child Language

📉🎓 2x 100% TV-L 13 positions for #PhD or #postdoc 🎓📉

For the new Emmy Noether project "#EEG in motion" in Stuttgart, I'm looking for new members to join our group and work out how to understand brain activity during self-, object-, and eye-movements.

Methods are #EEG #eyetracking #machinelearning #RSE

s-ccs.de/emmynoether

pdf-link: s-ccs.de/assets/hiring/2024-05

Thanks for boosting & recommending to students directly 🙏🏼

Just noticed that I haven't written an #introduction since I joined 😀

I'm a PhD student at the University of St. Gallen (CH), where I study how ubiquitous #personalization systems can make our interactions with our environment more efficient, safer, and more inclusive, and how they can be built in a responsible and societally beneficial way. My research combines #mixedreality, #UbiComp, #privacy, #algorithms, and #eyetracking.
Feel free to reach out, if you'd like to have a chat!

#ISOQOL content that conference attendees liked on other platforms:

Julie Ratcliffe presented, among other things, on their team's work, including insights from the application of #EyeTracking technology when filling out the #EQ5D:
link.springer.com/article/10.1

🔥Of the papers published in Quality of Life Research in 2023, this is so far the most accessed!🥳

SpringerLink – Feasibility of self-reported health related quality of life assessment with older people in residential care: insights from the application of eye tracking technology (Quality of Life Research)

Purpose: Increasingly there are calls to routinely assess the health-related quality of life (HRQoL) of older people receiving aged care services; however, the high prevalence of dementia and cognitive impairment remains a challenge to implementation. Eye-tracking technology facilitates detailed assessment of engagement and comprehension of visual stimuli, and may be useful in flagging individuals and populations who cannot reliably self-complete HRQoL instruments. The aim of this study was to apply eye-tracking technology to provide insights into self-reporting of HRQoL among older people in residential care with and without cognitive impairment.

Methods: Residents (n = 41), recruited based on one of three cognition subgroups (no, mild, or moderate cognitive impairment), completed the EQ-5D-5L on a computer with eye-tracking technology embedded. Number and length of fixations (i.e., eye gaze in seconds) for key components of the EQ-5D-5L descriptive system were calculated.

Results: For all dimensions, participants with no cognitive impairment fixated for longer on the Area of Interest (AOI) for the response option they finally chose, relative to those with mild or moderate cognitive impairment. Participants with cognitive impairment followed similar fixation patterns to those without. There was some evidence that participants with cognitive impairment took longer to complete and spent relatively less time attending to the relevant AOIs, but these differences generally did not reach statistical significance.

Conclusions: This exploratory study applying eye-tracking technology provides novel insights and evidence of the feasibility of self-reported HRQoL assessments for older people in aged care settings, where cognitive impairment and dementia are highly prevalent.
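The methods described above boil down to counting fixations and summing dwell time per Area of Interest. As an illustration only (this is a minimal sketch, not the study's actual analysis pipeline, and the AOI labels and durations are made up), that aggregation might look like:

```python
from collections import defaultdict

def summarize_fixations(fixations):
    """Aggregate fixation count and total dwell time (seconds) per AOI.

    `fixations` is a list of (aoi_label, duration_s) tuples, i.e. the
    output of an upstream fixation-detection step on raw gaze samples.
    """
    summary = defaultdict(lambda: {"count": 0, "total_s": 0.0})
    for aoi, duration_s in fixations:
        summary[aoi]["count"] += 1
        summary[aoi]["total_s"] += duration_s
    return dict(summary)

# Hypothetical fixations while a participant answers one questionnaire item
fixations = [
    ("question_text", 0.35),
    ("response_1", 0.20),
    ("response_3", 0.45),
    ("response_3", 0.30),  # the finally chosen option is re-fixated
]
summary = summarize_fixations(fixations)
# e.g. summary["response_3"] -> {"count": 2, "total_s": 0.75}
```

Comparing such per-AOI summaries across groups (e.g. by cognitive-impairment subgroup) is what lets the authors say who dwelt longer on their chosen response option.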

Wow. In 24 hours, we have gone from zero to 4.4K followers – that's crazy. Thank you for the warm welcome and excellent tips. I gave up on replying to all of you after someone pointed out that I was spamming thousands of people – sorry! Also, please do not read too much into it if we do not respond or take a long time responding; we are a busy bunch and may simply miss your posts or messages. Mastodon allows long posts and I am taking advantage of that, so here are a few things that you may – or may not – want to know.

—Who are we?—

Research in the Icelandic Vision Lab (visionlab.is) focuses on all things visual, with a major emphasis on higher-level or “cognitive” aspects of visual perception. It is co-run by five Principal Investigators: Árni Gunnar Ásgeirsson, Sabrina Hansmann-Roth, Árni Kristjánsson, Inga María Ólafsdóttir, and Heida Maria Sigurdardottir. Here on Mastodon, you will most likely be interacting with me – Heida – but other PIs and potentially other lab members (visionlab.is/people) may occasionally also post here as this is a joint account. If our posts are stupid and/or annoying, I will however almost surely be responsible!

—What do we do?—

Current and/or past research at IVL has looked at several visual processes, including #VisualAttention, #EyeMovements, #ObjectPerception, #FacePerception, #VisualMemory, #VisualStatistics, and the role of #Experience / #Learning effects in #VisualPerception. Some of our work concerns the basic properties of the workings of the typical adult #VisualSystem. We have also studied the perceptual capabilities of several unique populations, including children, synesthetes, professional athletes, people with anxiety disorders, blind people, and dyslexic readers. We focus on #BehavioralMethods but also make use of other techniques including #Electrophysiology, #EyeTracking, and #DeepNeuralNetworks.

—Why are we here?—

We are mostly here to interact with other researchers in our field, including graduate students, postdoctoral researchers, and principal investigators. This means that our activity on Mastodon may sometimes be quite niche: boosting posts from others on research papers, conferences, or work opportunities in specialized fields, or partaking in discussions on debates in our field, data analysis, or the scientific review process. Science communication and outreach are hugely important, but this account is not about that as such. So we take no offence if that means you unfollow us; that is perfectly alright :)

—But will there still sometimes be stupid memes as promised?—

Yes. They may or may not be funny, but they will be stupid.

visionlab.is – Icelandic Vision Lab

#Introduction
#migration

Just migrated to lingo.lol to be nearer to my people.

I’m a professor at #McGill in #Montreal #Canada who studies #language, #multilingualism, #psycholinguistics & #cognition using a variety of tools that include #eyetracking, #social #networks, and some #neuro approaches.

Looking forward to learning about people's work on this and other servers.

Lab web site: mcgill.ca/language-lab/

Mastodon Lab web site: @McGill_Multilingualism_Lab

Language & Multilingualism Lab – Department of Psychology, McGill University

Hi all, I’m a researcher working on brain-computer interfaces at Snap Inc.

My goal is to prove that noninvasive #BCI and #neurofeedback can provide value outside of the lab.

Currently for me that includes (but is not limited to) applying #deeplearning methods to #timeseries data like #EEG, but also to other sensors like #eyetracking or inertial measurement units (#IMUs).



Also a #python geek. I maintain an M/EEG analysis toolbox
➡️ github.com/nbara/python-meegki

GitHub – nbara/python-meegkit: 🔧🧠 MEEGkit: MEG & EEG processing toolkit in Python 🧠🔧

#Introduction

Happy to see this community expanding, and I’m hoping for it to overtake that other place!

I’m a professor at #McGill in #Montreal #Canada who studies #language, #multilingualism, & #cognition using a variety of tools that include #eyetracking, #social #networks, and some #neuro approaches.

Lab web site: mcgill.ca/language-lab/

Mastodon groups to follow for people in my area to get started here:
@linguistics
@cognition
