Brain decoding can predict visual perception from non-invasive electrophysiological data by combining information across multiple channels. However, decoding methods typically confound the multi-faceted and distributed neural processes underlying perception, so it is unclear which specific aspects of the neural computations involved in perception are reflected in this type of macroscale data. Using MEG data recorded while participants viewed a large number of naturalistic images, we analytically separated the brain signal into a slow 1/f drift (< 5 Hz) and an oscillatory response in the theta frequency band. Combined with a method for capturing between-trial variability in how stimuli are processed, this analysis revealed at least three dissociable components that carry distinct stimulus-specific information: a 1/f component, reflecting the temporally stable aspect of the stimulus representation; a global phase shift of the theta oscillation, related to differences in processing speed between stimuli; and differential patterns of theta phase across channels, likely related to stimulus-specific computations. We demonstrate that common cognitive interpretations of decoding analyses can be flawed if the multicomponent nature of the signal is ignored, and suggest that acknowledging this fact allows a more accurate interpretation of commonly observed phenomena in the study of perception.
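The separation of a recorded signal into a slow drift and a theta-band oscillation, as described above, can be illustrated with zero-phase band filtering. The sketch below is not the authors' analysis pipeline; it is a minimal illustration using standard scipy filters, with an assumed low-pass cutoff of 5 Hz and an assumed theta band of 5–7 Hz (the exact band used in the study is not specified here), applied to a synthetic single-channel signal.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def separate_components(x, fs, slow_cutoff=5.0, theta_band=(5.0, 7.0)):
    """Split a single-channel signal into a slow (< slow_cutoff Hz)
    component and a theta-band component via zero-phase filtering.
    Cutoff and band edges are illustrative assumptions."""
    nyq = fs / 2.0
    # low-pass for the slow drift component
    b_lo, a_lo = butter(4, slow_cutoff / nyq, btype="low")
    slow = filtfilt(b_lo, a_lo, x)
    # band-pass for the theta oscillation
    lo, hi = theta_band
    b_bp, a_bp = butter(4, [lo / nyq, hi / nyq], btype="band")
    theta = filtfilt(b_bp, a_bp, x)
    return slow, theta

# synthetic example: a 2 Hz drift plus a 6 Hz "theta" oscillation
fs = 200.0
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 6 * t)
slow, theta = separate_components(x, fs)
```

Because `filtfilt` applies the filter forward and backward, the recovered components preserve the phase of the underlying oscillation, which matters when the quantity of interest is a phase shift rather than amplitude.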
bioRxiv Subject Collection: Neuroscience