shakedown.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A community for live music fans with roots in the jam scene. Shakedown Social is run by a team of volunteers (led by @clifff and @sethadam1) and funded by donations.

#eyetracking

James Scholes<p>New <a href="https://dragonscave.space/tags/accessibility" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>accessibility</span></a> rollout from <a href="https://dragonscave.space/tags/Slack" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Slack</span></a>: simplified layout mode in the desktop app.</p><p>From the blog post[1], simplified layout mode "helps you focus by showing one section of Slack at a time," and "minimizes distractions which may benefit single-taskers and people using assistive technology."</p><p>This was originally built to "make Slack easier to use for <a href="https://dragonscave.space/tags/eyeTracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyeTracking</span></a> and <a href="https://dragonscave.space/tags/headMouse" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>headMouse</span></a> users," but has since been determined to also benefit "other switch users as well as: <a href="https://dragonscave.space/tags/screenReader" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>screenReader</span></a> users (by reducing the amount of stuff on screen at once), <a href="https://dragonscave.space/tags/neurodivergent" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neurodivergent</span></a> people who find Slack overwhelming or distracting, and even people with limited screen real estate."[2]</p><p>[1] <a href="https://slack.com/help/articles/41214514885907-Use-simplified-layout-mode-in-Slack?locale=en-US" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">slack.com/help/articles/412145</span><span class="invisible">14885907-Use-simplified-layout-mode-in-Slack?locale=en-US</span></a></p><p>[2] <a href="https://web-a11y.slack.com/archives/C042TSFGN/p1747324169693879" rel="nofollow noopener" translate="no" target="_blank"><span 
class="invisible">https://</span><span class="ellipsis">web-a11y.slack.com/archives/C0</span><span class="invisible">42TSFGN/p1747324169693879</span></a></p>
Matt<p>Excited to receive a new GazePoint GP3 eyetracker today for our testing lab!</p><p>This eyetracker is made in Canada, so if you're a researcher looking for one, consider supporting Canadian tech by checking out GazePoint or SR Research</p><p><a href="https://mstdn.ca/tags/eyetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyetracking</span></a> <a href="https://mstdn.ca/tags/psychology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>psychology</span></a> <a href="https://mstdn.ca/tags/neuroscience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neuroscience</span></a> <a href="https://mstdn.ca/tags/MadeInCanada" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MadeInCanada</span></a></p>
Tuomo H<p>In our latest project, we taught irony to school children (and the parents rejoiced).</p><p>Irony comprehension requires going beyond the literal meaning of words and is challenging for children. In this pre-registered study, we investigated how teaching metapragmatic knowledge in classrooms impacts written irony comprehension in 10-year-old Finnish-speaking children (n = 41, 21 girls) compared to a control group (n = 34, 13 girls). </p><p>At pre-test, children read ironic and literal sentences embedded in stories while their eye movements were recorded. Next, the training group was taught about irony, and the control group was taught about reading comprehension. At post-test, the reading task and eye-tracking were repeated. </p><p>Irony comprehension improved after metapragmatic training on irony, suggesting that metapragmatic knowledge plays an important role in irony development. However, the eye movement data suggested that training did not change the strategy children used to resolve the ironic meaning. The results highlight the potential of metapragmatic training and have implications for theories of irony comprehension.</p><p><a href="https://doi.org/10.1017/S0305000925000054" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">doi.org/10.1017/S0305000925000</span><span class="invisible">054</span></a></p><p><a href="https://mementomori.social/tags/irony" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>irony</span></a> <a href="https://mementomori.social/tags/IronyComprehension" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>IronyComprehension</span></a> <a href="https://mementomori.social/tags/EyeTracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EyeTracking</span></a></p>
Benedikt Ehinger<p>📉🎓 2x 100% TVL13 positions for <a href="https://scholar.social/tags/PhD" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>PhD</span></a> or <a href="https://scholar.social/tags/postdoc" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>postdoc</span></a> 🎓📉</p><p>For the new Emmy-Noether project "<a href="https://scholar.social/tags/EEG" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EEG</span></a> in motion" in Stuttgart, I'm looking for new members to join our group to work out how to understand brain activity of self-, object- and eye-movements. </p><p>Methods are <a href="https://scholar.social/tags/EEG" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EEG</span></a> <a href="https://scholar.social/tags/eyetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyetracking</span></a> <a href="https://scholar.social/tags/machinelearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>machinelearning</span></a> <a href="https://scholar.social/tags/RSE" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>RSE</span></a> </p><p><a href="https://www.s-ccs.de/emmynoether" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="">s-ccs.de/emmynoether</span><span class="invisible"></span></a> </p><p>pdf-link: <a href="https://www.s-ccs.de/assets/hiring/2024-05-27_emmyNoether.pdf" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">s-ccs.de/assets/hiring/2024-05</span><span class="invisible">-27_emmyNoether.pdf</span></a></p><p>Thanks for boosting &amp; recommending to students directly 🙏🏼</p>
IT News<p>DIY Eye and Face Tracking for the Valve Index VR Headset - The Valve Index VR headset has been around for a few years now. It doesn’t come wi... - <a href="https://hackaday.com/2024/05/19/diy-eye-and-face-tracking-for-the-valve-index-vr-headset/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">hackaday.com/2024/05/19/diy-ey</span><span class="invisible">e-and-face-tracking-for-the-valve-index-vr-headset/</span></a> <a href="https://schleuss.online/tags/virtualreality" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>virtualreality</span></a> <a href="https://schleuss.online/tags/mouthtracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>mouthtracking</span></a> <a href="https://schleuss.online/tags/facetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>facetracking</span></a> <a href="https://schleuss.online/tags/eyetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyetracking</span></a> <a href="https://schleuss.online/tags/valve" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>valve</span></a> <a href="https://schleuss.online/tags/vr" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>vr</span></a></p>
Jannis<p>Just noticed that I haven't written an <a href="https://hci.social/tags/introduction" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>introduction</span></a> since I joined 😀</p><p>I'm a PhD student at the University of St. Gallen (CH) where I study how ubiquitous <a href="https://hci.social/tags/personalization" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>personalization</span></a> systems can make our interactions with our environment more efficient, safer, and more inclusive, and how they can be built in a responsible and societally beneficial way. My research combines <a href="https://hci.social/tags/mixedreality" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>mixedreality</span></a>, <a href="https://hci.social/tags/UbiComp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>UbiComp</span></a>, <a href="https://hci.social/tags/privacy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>privacy</span></a>, <a href="https://hci.social/tags/algorithms" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>algorithms</span></a>, and <a href="https://hci.social/tags/eyetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyetracking</span></a>. <br>Feel free to reach out if you'd like to have a chat!</p>
IT News<p>Experiencing Visual Handicaps and Their Impact on Daily Life, With VR - Researchers presented an interesting project at the 2024 IEEE Conference on Virtua... - <a href="https://hackaday.com/2024/03/29/experiencing-visual-handicaps-and-their-impact-on-daily-life-with-vr/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">hackaday.com/2024/03/29/experi</span><span class="invisible">encing-visual-handicaps-and-their-impact-on-daily-life-with-vr/</span></a> <a href="https://schleuss.online/tags/medicalsimulation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>medicalsimulation</span></a> <a href="https://schleuss.online/tags/visualimpairments" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>visualimpairments</span></a> <a href="https://schleuss.online/tags/virtualreality" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>virtualreality</span></a> <a href="https://schleuss.online/tags/medicalhacks" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>medicalhacks</span></a> <a href="https://schleuss.online/tags/eyediseases" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyediseases</span></a> <a href="https://schleuss.online/tags/eyetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyetracking</span></a> <a href="https://schleuss.online/tags/research" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>research</span></a> <a href="https://schleuss.online/tags/vr" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>vr</span></a></p>
Matt Willemsen<p>Transparent Camera Tech Aims to Revolutionize Eye Tracking<br><a href="https://petapixel.com/2024/03/25/transparent-camera-tech-aims-to-revolutionize-eye-tracking/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">petapixel.com/2024/03/25/trans</span><span class="invisible">parent-camera-tech-aims-to-revolutionize-eye-tracking/</span></a> <a href="https://mastodon.social/tags/Technology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Technology</span></a> <a href="https://mastodon.social/tags/EmergingTechnology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EmergingTechnology</span></a> <a href="https://mastodon.social/tags/ImageSensor" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ImageSensor</span></a> <a href="https://mastodon.social/tags/newtechnology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>newtechnology</span></a> <a href="https://mastodon.social/tags/science" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>science</span></a> <a href="https://mastodon.social/tags/SensorTechnology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SensorTechnology</span></a> <a href="https://mastodon.social/tags/TransparentSensor" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TransparentSensor</span></a> <a href="https://mastodon.social/tags/EyeTracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EyeTracking</span></a></p>
Harry Underwood<p>This is very good news IMO <a href="https://union.place/tags/lenses" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lenses</span></a> <a href="https://union.place/tags/camera" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>camera</span></a> <a href="https://union.place/tags/vr" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>vr</span></a> <a href="https://union.place/tags/xr" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>xr</span></a> <a href="https://union.place/tags/EyeTracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EyeTracking</span></a> <a href="https://union.place/tags/tech" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>tech</span></a></p><p><a href="https://petapixel.com/2024/03/25/transparent-camera-tech-aims-to-revolutionize-eye-tracking/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">petapixel.com/2024/03/25/trans</span><span class="invisible">parent-camera-tech-aims-to-revolutionize-eye-tracking/</span></a></p>
Jan R. Boehnke<p><a href="https://mastodon.social/tags/ISOQOL" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISOQOL</span></a> content that conference attendees liked on other platforms:</p><p>Julie Ratcliffe presented among other things on their work including insights from the application of <a href="https://mastodon.social/tags/EyeTracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EyeTracking</span></a> technology when filling out the <a href="https://mastodon.social/tags/EQ5D" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EQ5D</span></a><br><a href="https://link.springer.com/article/10.1007/s11136-023-03488-w" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">link.springer.com/article/10.1</span><span class="invisible">007/s11136-023-03488-w</span></a></p><p>🔥For papers published in Quality of Life Research in 2023, this is so far the most accessed paper!🥳</p><p><a href="https://mastodon.social/tags/HealthEconomics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HealthEconomics</span></a> <a href="https://mastodon.social/tags/HRQL" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HRQL</span></a> <a href="https://mastodon.social/tags/Psychometrics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Psychometrics</span></a></p>
IT News<p>DIY Eye Tracking for VR Headsets, From A to Z - Eye tracking is a useful feature in social virtual reality (VR) spaces because it ... - <a href="https://hackaday.com/2023/08/05/diy-eye-tracking-for-vr-headsets-from-a-to-z/" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">hackaday.com/2023/08/05/diy-ey</span><span class="invisible">e-tracking-for-vr-headsets-from-a-to-z/</span></a> <a href="https://schleuss.online/tags/virtualreality" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>virtualreality</span></a> <a href="https://schleuss.online/tags/eyetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyetracking</span></a> <a href="https://schleuss.online/tags/opensource" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>opensource</span></a> <a href="https://schleuss.online/tags/vrchat" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>vrchat</span></a> <a href="https://schleuss.online/tags/diy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>diy</span></a> <a href="https://schleuss.online/tags/vr" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>vr</span></a></p>
IT News<p>Hackaday Prize 2023: Eye-Tracking Wheelchair Interface Is a Big Help - For those with quadriplegia, electric wheelchairs with joystick controls aren’t mu... - <a href="https://hackaday.com/2023/05/14/hackaday-prize-2023-eye-tracking-wheelchair-interface-is-a-big-help/" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">hackaday.com/2023/05/14/hackad</span><span class="invisible">ay-prize-2023-eye-tracking-wheelchair-interface-is-a-big-help/</span></a> <a href="https://schleuss.online/tags/2023hackadayprize" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>2023hackadayprize</span></a> <a href="https://schleuss.online/tags/thehackadayprize" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>thehackadayprize</span></a> <a href="https://schleuss.online/tags/eyetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyetracking</span></a> <a href="https://schleuss.online/tags/mischacks" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>mischacks</span></a> <a href="https://schleuss.online/tags/webcam" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>webcam</span></a></p>
Icelandic Vision Lab<p>Wow. In 24 hours, we have gone from zero to 4.4K followers, that’s crazy. Thank you for a warm welcome and excellent tips. I gave up on replying to all of you after someone pointed out that I was spamming thousands of people – sorry! Also, please do not read too much into it if we do not respond or take a long time responding; we are a busy bunch and may simply sometimes miss your post or messages. Mastodon allows long posts so I am taking advantage of that, so here are a few things that you may – or may not – want to know.</p><p>—Who are we?—</p><p>Research in the Icelandic Vision Lab (<a href="https://visionlab.is" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="">visionlab.is</span><span class="invisible"></span></a>) focuses on all things visual, with a major emphasis on higher-level or “cognitive” aspects of visual perception. It is co-run by five Principal Investigators: Árni Gunnar Ásgeirsson, Sabrina Hansmann-Roth, Árni Kristjánsson, Inga María Ólafsdóttir, and Heida Maria Sigurdardottir. Here on Mastodon, you will most likely be interacting with me – Heida – but other PIs and potentially other lab members (<a href="https://visionlab.is/people" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="">visionlab.is/people</span><span class="invisible"></span></a>) may occasionally also post here as this is a joint account. 
If our posts are stupid and/or annoying, I will however almost surely be responsible!</p><p>—What do we do?—</p><p>Current and/or past research at IVL has looked at several visual processes, including <a href="https://neuromatch.social/tags/VisualAttention" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VisualAttention</span></a> , <a href="https://neuromatch.social/tags/EyeMovements" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EyeMovements</span></a> , <a href="https://neuromatch.social/tags/ObjectPerception" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ObjectPerception</span></a> , <a href="https://neuromatch.social/tags/FacePerception" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>FacePerception</span></a> , <a href="https://neuromatch.social/tags/VisualMemory" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VisualMemory</span></a> , <a href="https://neuromatch.social/tags/VisualStatistics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VisualStatistics</span></a> , and the role of <a href="https://neuromatch.social/tags/Experience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Experience</span></a> / <a href="https://neuromatch.social/tags/Learning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Learning</span></a> effects in <a href="https://neuromatch.social/tags/VisualPerception" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VisualPerception</span></a> . Some of our work concerns the basic properties of the workings of the typical adult <a href="https://neuromatch.social/tags/VisualSystem" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VisualSystem</span></a> . 
We have also studied the perceptual capabilities of several unique populations, including children, synesthetes, professional athletes, people with anxiety disorders, blind people, and dyslexic readers. We focus on <a href="https://neuromatch.social/tags/BehavioralMethods" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>BehavioralMethods</span></a> but also make use of other techniques including <a href="https://neuromatch.social/tags/Electrophysiology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Electrophysiology</span></a>, <a href="https://neuromatch.social/tags/EyeTracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EyeTracking</span></a>, and <a href="https://neuromatch.social/tags/DeepNeuralNetworks" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DeepNeuralNetworks</span></a></p><p>—Why are we here?—</p><p>We are mostly here to interact with other researchers in our field, including graduate students, postdoctoral researchers, and principal investigators. This means that our activity on Mastodon may sometimes be quite niche. This can include boosting posts from others on research papers, conferences, or work opportunities in specialized fields, partaking in discussions on debates in our field, data analysis, or the scientific review process. Science communication and outreach are hugely important, but this account is not about that as such. So we take no offence if that means that you will unfollow us; that is perfectly alright :)</p><p>—But will there still sometimes be stupid memes as promised?—</p><p>Yes. 
They may or may not be funny, but they will be stupid.</p><p><a href="https://neuromatch.social/tags/VisionScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VisionScience</span></a> <a href="https://neuromatch.social/tags/CognitivePsychology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CognitivePsychology</span></a> <a href="https://neuromatch.social/tags/CognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CognitiveScience</span></a> <a href="https://neuromatch.social/tags/CognitiveNeuroscience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CognitiveNeuroscience</span></a> <a href="https://neuromatch.social/tags/StupidMemes" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>StupidMemes</span></a></p>
Debra Titone<p><a href="https://lingo.lol/tags/Introduction" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Introduction</span></a> <br><a href="https://lingo.lol/tags/migration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>migration</span></a> </p><p>Just migrated to lingo.lol to be nearer to my people.. </p><p>I’m a professor at <a href="https://lingo.lol/tags/McGill" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>McGill</span></a> in <a href="https://lingo.lol/tags/Montreal" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Montreal</span></a> <a href="https://lingo.lol/tags/Canada" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Canada</span></a> who studies <a href="https://lingo.lol/tags/language" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>language</span></a>, <a href="https://lingo.lol/tags/multilingualism" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>multilingualism</span></a>, <a href="https://lingo.lol/tags/psycholinguistics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>psycholinguistics</span></a> &amp; <a href="https://lingo.lol/tags/cognition" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>cognition</span></a> using a variety of tools that include <a href="https://lingo.lol/tags/eyetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyetracking</span></a>, <a href="https://lingo.lol/tags/social" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>social</span></a> <a href="https://lingo.lol/tags/networks" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>networks</span></a>, and some <a href="https://lingo.lol/tags/neuro" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neuro</span></a> approaches..</p><p>Looking forward to learning about 
people's work at this and other servers..</p><p>Lab web site: <a href="https://www.mcgill.ca/language-lab/" rel="nofollow noopener" target="_blank"><span class="invisible">https://www.</span><span class="">mcgill.ca/language-lab/</span><span class="invisible"></span></a></p><p>Mastodon Lab web site: <span class="h-card"><a href="https://fediscience.org/@McGill_Multilingualism_Lab" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>McGill_Multilingualism_Lab</span></a></span></p>
Nicolas Barascud<p>Hi all, I’m a researcher working on brain-computer interfaces at Snap Inc.</p><p>My goal is to prove that noninvasive <a href="https://sigmoid.social/tags/BCI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>BCI</span></a> and <a href="https://sigmoid.social/tags/neurofeedback" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neurofeedback</span></a> can provide value outside of the lab.</p><p>Currently for me that includes (but is not limited to) applying <a href="https://sigmoid.social/tags/deeplearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>deeplearning</span></a> methods to <a href="https://sigmoid.social/tags/timeseries" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>timeseries</span></a> data like <a href="https://sigmoid.social/tags/EEG" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EEG</span></a>, but also other sensors like <a href="https://sigmoid.social/tags/eyetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyetracking</span></a> or inertial sensors (<a href="https://sigmoid.social/tags/IMUs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>IMUs</span></a>)
</p><p>
Also a <a href="https://sigmoid.social/tags/python" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>python</span></a> geek. I maintain an M/EEG analysis toolbox<br>➡️ <a href="https://github.com/nbara/python-meegkit" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">github.com/nbara/python-meegki</span><span class="invisible">t</span></a> </p><p>

<a href="https://sigmoid.social/tags/introduction" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>introduction</span></a> <a href="https://sigmoid.social/tags/neuroscience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neuroscience</span></a> <a href="https://sigmoid.social/tags/altac" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>altac</span></a> <a href="https://sigmoid.social/tags/neurotech" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neurotech</span></a></p>
Debra Titone<p><a href="https://mstdn.social/tags/Introduction" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Introduction</span></a></p><p>Happy to see this community expanding, and I’m hoping for it to overtake that other place!</p><p>I’m a professor at <a href="https://mstdn.social/tags/McGill" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>McGill</span></a> in <a href="https://mstdn.social/tags/Montreal" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Montreal</span></a> <a href="https://mstdn.social/tags/Canada" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Canada</span></a> who studies <a href="https://mstdn.social/tags/language" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>language</span></a>, <a href="https://mstdn.social/tags/multilingualism" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>multilingualism</span></a>, &amp; <a href="https://mstdn.social/tags/cognition" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>cognition</span></a> using a variety of tools that include <a href="https://mstdn.social/tags/eyetracking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyetracking</span></a>, <a href="https://mstdn.social/tags/social" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>social</span></a> <a href="https://mstdn.social/tags/networks" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>networks</span></a>, and some <a href="https://mstdn.social/tags/neuro" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neuro</span></a> approaches..</p><p>Lab web site: <a href="https://www.mcgill.ca/language-lab/" rel="nofollow noopener" target="_blank"><span class="invisible">https://www.</span><span class="">mcgill.ca/language-lab/</span><span class="invisible"></span></a></p><p>Mastodon groups to follow for 
people in my area to get started here:<br><span class="h-card"><a href="https://a.gup.pe/u/linguistics" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>linguistics</span></a></span> <br><span class="h-card"><a href="https://a.gup.pe/u/cognition" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>cognition</span></a></span></p>