Multisensory Continuous Psychophysics
Continuous psychophysics is cool, new, and exciting. I want to apply it to multisensory integration: can we find evidence that perception is more precise and faster when we have information from multiple senses? I am using variations on a heading perception task in which we ask participants to point a joystick in the direction they think they are moving. The goal is to put people on a motion platform to add vestibular cues to the visual cues they already have. I also have an audio-visual version in the works.
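For readers unfamiliar with the method, the core trick can be sketched in a few lines: instead of collecting discrete trials, you cross-correlate a continuously moving stimulus with the continuous response. Below is a minimal toy simulation in plain NumPy; the parameters (lag, noise levels, signal statistics) are invented for illustration and are not our actual task or analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy continuous-tracking simulation (hypothetical parameters): the target's
# heading velocity is white noise, and the simulated participant's joystick
# response tracks it with a fixed lag plus independent noise.
n = 5000
lag = 20                                          # response lag in samples
target_vel = rng.normal(0.0, 1.0, n)              # target heading velocity
response_vel = np.roll(target_vel, lag) + rng.normal(0.0, 1.0, n)
response_vel[:lag] = 0.0                          # no response before trial onset

# Cross-correlate stimulus and response velocities; the peak of the
# cross-correlogram recovers the tracking lag, which is the standard
# continuous-psychophysics readout.
xcorr = np.correlate(response_vel - response_vel.mean(),
                     target_vel - target_vel.mean(), mode="full")
lags = np.arange(-n + 1, n)
estimated_lag = lags[np.argmax(xcorr)]
print(estimated_lag)
```

The width and height of that cross-correlogram peak are what carry the precision information: sharper, earlier peaks mean faster, more reliable tracking, which is exactly what we hope to see when cues from several senses are available.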
Aubert-Whomst?
As per two long-dead guys from Germany (Hermann Aubert) and Austria (Ernst Fleischl von Marxow), among others, we allegedly perceive objects to move more slowly when we follow them with our eyes than when we keep our eyes on, say, a fixation cross. We wondered whether we could make this effect go away by giving the visual system better information about how fast the participant's eyes are moving, specifically by adding relative motion between object and background. Such relative motion is, of course, the norm in pretty much any day-to-day situation, and in most experimental designs too.
However, it looks like the story is not that simple. Watch this space; we will probably post a preprint soon!
Biases in Heading Perception
Guess I’m becoming a heading perception guy. We know that, depending on the exact task, perceived heading can be attracted towards the participant’s straight-ahead or repelled from it. We want to add a new (you guessed it, continuous-psychophysics-y) task into the mix and see what happens.
Motion prediction, self-motion and gravity
We are also looking to extend the basic concept of predicting motion during self-motion by adding another component: the internal model of gravity, or gravity prior. If (1) self-motion introduces biases in motion extrapolation, and (2) we use this gravity prior to predict motion more accurately, then self-motion should bias motion prediction less when the observed motion is consistent with gravity.
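The logic of that prediction can be illustrated with a toy precision-weighted (Bayesian) combination. Every number below is an assumption chosen for illustration, not a fitted or measured value, and the model is deliberately oversimplified:

```python
g = 9.81                 # mean of the gravity prior (m/s^2)
sigma_sens = 3.0         # assumed sensory uncertainty about acceleration
sigma_prior = 2.0        # assumed width of the gravity prior
self_motion_bias = 2.0   # assumed bias that self-motion adds to the sensory estimate

# Reliability-weighted combination of the (biased) sensory estimate and the prior.
w_prior = sigma_sens**2 / (sigma_sens**2 + sigma_prior**2)

def predicted_accel(true_accel):
    sensed = true_accel + self_motion_bias        # self-motion biases the estimate
    return (1 - w_prior) * sensed + w_prior * g   # shrink toward the gravity prior

# For gravity-consistent motion, the prior pulls the estimate back toward the
# truth, so the residual self-motion bias is smaller than with no prior at all.
bias_with_prior = predicted_accel(g) - g
print(bias_with_prior, "vs.", self_motion_bias)
```

In this sketch the prior shrinks the self-motion-induced bias only when the motion actually matches gravity; for gravity-inconsistent motion the same prior drags the estimate toward 9.81 m/s² and adds its own error, which is the asymmetry the experiment is designed to probe.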