Axes of self-motion and object motion shape how we perceive world-relative motion


Sci Rep. 2026 Mar 10;16(1):8914. doi: 10.1038/s41598-026-42955-5.

ABSTRACT

When we move through the environment, the direction of objects in the optic array changes, producing optic flow. To perceive world-relative object motion during self-motion, complex flow vectors are decomposed in a process called flow parsing. The real world and realistic VR environments contain abundant depth and distance cues, including size and binocular disparities. When targets move in various directions, these distance signals can potentially aid the flow parsing process. We designed two experiments with our wide-field stereoscopic environment. Participants observed target motions during visually simulated self-motion and indicated the direction of target motion with respect to a scene depicting a large room (Experiment 1) or a cluster of 3D objects (Experiment 2). Forward-backward and left-right target motions, as well as self-motions, were simulated. Optic flow and motion vectors were controlled across conditions to examine cues to target distance and motion in depth, such as binocular disparity and object size, and the change in these signals (e.g. looming, change in disparity, interocular velocity difference). During left-right locomotion through both environments, flow parsing gains were significantly lower for left-right than for forward-backward moving targets. However, during forward-backward locomotion, left-right moving targets yielded significantly higher flow parsing gains than forward-backward moving targets. Overall, flow parsing gains were higher when self-motion and target motion were orthogonal to each other than when they were parallel. These findings provide evidence that depth and distance cues are integrated in perceiving world-relative object motion during self-motion. Availability of such signals improves the effectiveness of flow parsing.
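The flow parsing gain reported here can be understood as the fraction of the self-motion-induced flow component that the visual system subtracts from a target's retinal motion. A minimal sketch of that computation, assuming a simple least-squares estimate over 2D retinal velocity vectors (this is an illustrative model, not the authors' analysis code; the function name and example values are hypothetical):

```python
import numpy as np

def flow_parsing_gain(retinal_motion, perceived_motion, self_motion_flow):
    """Estimate the gain g in: perceived = retinal - g * self_motion_flow.

    Solved by least squares, projecting the subtracted residual onto the
    self-motion flow component:
        g = (retinal - perceived) . flow / |flow|^2
    Inputs are 2D retinal velocity vectors (e.g. deg/s).
    """
    residual = np.asarray(retinal_motion, dtype=float) - np.asarray(perceived_motion, dtype=float)
    flow = np.asarray(self_motion_flow, dtype=float)
    return float(residual @ flow / (flow @ flow))

# Hypothetical example: the target's retinal motion (1, 2) combines its
# world-relative motion (1, 0) with a self-motion-induced flow component
# (0, 2). If the observer perceives (1, 0.4), then 80% of the flow
# component was parsed out.
g = flow_parsing_gain(retinal_motion=[1.0, 2.0],
                      perceived_motion=[1.0, 0.4],
                      self_motion_flow=[0.0, 2.0])
print(round(g, 2))  # 0.8
```

A gain of 1.0 would mean complete subtraction of the self-motion component (veridical world-relative perception); gains below 1.0, as typically observed, indicate partial flow parsing.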

PMID:41803316 | PMC:PMC12988190 | DOI:10.1038/s41598-026-42955-5