Introduction

In this data set, individuals were presented with 3D panoramic indoor and outdoor scenes. They viewed each image for a set period of time, and were told to study the images for a later memory (recognition) task. Participants sat in the centre of a sphere, but were free to move their head and eyes while studying the scenes.

The goal of this project/analysis is to document patterns in individuals’ head and eye movements, and to identify whether there are consistent individual differences in those patterns.

I took a slightly more conservative approach by looking at only the outdoor/landscape scenes instead of both indoor and outdoor scenes, in case there were systematic differences in how individuals processed the two types of scenes. Later in the project, I hope to incorporate the indoor scenes.

Note: This is a comprehensive document where I walk through my decision-making process, and it contains a very detailed/thorough set of visualizations of the data. If you’re not interested in that level of detail, I have provided a brief summary of what I did at the start of the model-based clustering analysis document, so you can refer to that document if you just want to see the variables included and the results.

 

Load libraries

Load data

Note: 4 of 32,652 data points from the landscape data set were removed due to missing values in the eye or head movement coordinates; given the small number of observations removed, this should not meaningfully affect the interpretation of the results.
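As a concrete illustration, the missing-data filter looks roughly like this in base R. The toy data frame and its values are invented for the example; the coordinate column names (fixx, fixy, fix_headx, fix_heady) are taken from the code later in this document:

``` r
# toy stand-in for the landscape data frame used in this document
land <- data.frame(
  fixx      = c(10.2, NA,  8.1),
  fixy      = c( 5.5, 4.0, 6.3),
  fix_headx = c( 0.1, 0.2, NA),
  fix_heady = c( 0.0, 0.1, 0.2)
)

coord_cols <- c("fixx", "fixy", "fix_headx", "fix_heady")

# keep only rows where every eye/head coordinate is present
land_clean <- land[complete.cases(land[, coord_cols]), ]

nrow(land) - nrow(land_clean)  # number of rows dropped
```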

(Quick) Data Exploration/Visualization

First, I spent some time trying to understand the data, starting with the eye/head movements of one participant during one trial, then looking at the eye/head movement patterns of one participant across all trials.

All outdoor scene viewing for 1 participant

Detailed Look at Aggregate-Level Data

For this section, I will be examining a few aspects of the data in detail in an attempt to capture eye/head movement accurately. These variables/aspects are listed below:

  1. Eye/head movement “pattern” as captured by a measure of 2-dimensional spread (i.e., the standard deviation ellipse, SDE)
  2. Saccade/head movement direction
  3. Saccade/head movement angle
  4. Saccade/head movement amplitude
  5. Fixation duration
  6. Head rotation

For each variable, I will first look at the data at the fixation level for every participant (that is, each observation in the visualization is one fixation), then look at the data at the trial level for each participant (i.e., each observation is an aggregate measure of the fixations for each scene presented).
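As a rough sketch of that fixation-to-trial aggregation: the toy data frame, its values, and the fix_dur column name below are invented for illustration (only subject and stim_name appear in the actual code):

``` r
# toy fixation-level data: one row per fixation
fix_level <- data.frame(
  subject   = rep(c("s01", "s02"), each = 4),
  stim_name = rep(c("beach", "forest"), times = 4),
  fix_dur   = c(210, 180, 305, 250, 190, 220, 400, 310)
)

# trial-level summary: one row per participant x scene,
# with the mean fixation duration as the aggregate measure
trial_level <- aggregate(fix_dur ~ subject + stim_name,
                         data = fix_level, FUN = mean)
trial_level  # 4 rows: one per participant x scene
```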

Constructing relevant data frames:

SDE:

# sde.look<-NULL
# sde.head<-NULL
# 
# # calculating SDE using calc_sde_mod (modified from calc_sde in the 'aspace' package)
# for(s in sujs){ # subset by participant
#   persuj<-subset(land, subject==s)
#   for(i in stim){ # subset by each stim
# 
#     # include only the coordinates columns for eye movements for a given trial
#     perstim.look<-persuj[persuj$stim_name==i,][c("fixx", "fixy")]
# 
#     # include only the coordinates columns for head movements for a given trial
#     perstim.head<-persuj[persuj$stim_name==i,][c("fix_headx", "fix_heady")]
# 
#     # skip SDE calculation if there are fewer than 3 data points per trial
#     if (nrow(perstim.look)<3) {
#       next
#     }
# 
#     # calculate SDE if there are at least 3 data points
#     else {
#       # obtaining SDE stats for eye movements
#       look<-calc_sde_mod(id=paste0(s, "_", i),
#                      # filename=paste0("SDE_look_", s, "_", i, ".txt"),
#                      centre.xy=NULL, calccentre=TRUE,
#                      weighted=FALSE, weights=NULL, points=perstim.look, verbose=FALSE)
# 
#       head<-calc_sde_mod(id=paste0(s, "_", i),
#                      # filename=paste0("SDE_head_", s, "_", i, ".txt"),
#                      centre.xy=NULL, calccentre=TRUE,
#                      weighted=FALSE, weights=NULL, points=perstim.head, verbose=FALSE)
#     }
# 
#     # store calc_sde results
# 
#     sde.look<-rbind(sde.look, look)
#     sde.head<-rbind(sde.head, head)
#   }
# }
# 
# beep(sound = 3)
# 
# # separate the name column into subject and stim_name
# sde.look<-cbind(sde.look, strcapture("(.*)_(.*)", as.character(sde.look$id), data.frame(subject = "", stim_name="")))
# sde.head<-cbind(sde.head, strcapture("(.*)_(.*)", as.character(sde.head$id), data.frame(subject = "", stim_name="")))
# 
# # write data frames to file
# write.csv(sde.look, "VR_Basic Scene Viewing_Eye Movement SDE.csv", row.names=F)
# write.csv(sde.head, "VR_Basic Scene Viewing_Head Movement SDE.csv", row.names=F)

Angle change:

# df.angle<-NULL
# 
# # in order to calculate difference scores, the first instance of the dir variables is set to 0
# fix<-land
# fix$sacc_dir<-car::recode(fix$sacc_dir, "NA=0")
# fix$fix_headdir<-car::recode(fix$fix_headdir, "NA=0")
# 
# 
# for (s in sujs){ # create data frame per participant
#   persuj<-subset(fix, subject==s)
#   for (i in stim){ # subset it based on stim
#     perstim<-subset(persuj, stim_name==i)
# 
#     # skip if this participant has no data for this stim
#     # (otherwise 1:nrow(perstim) would run on an empty data frame)
#     if (nrow(perstim)==0) next
# 
#     # for every row
#     for (j in 1:nrow(perstim)){
#       if (perstim$fixnum[j]==1){
#         perstim$saccDelta[j]<-NA
#         perstim$headDelta[j]<-NA
#         perstim$saccDirDelta[j]<-NA
#         perstim$headDirDelta[j]<-NA
#       } else if (perstim$fixnum[j]>1){
#         perstim$saccDelta[j]<-perstim$sacc_dir[j]-perstim$sacc_dir[j-1] # raw difference score between eye movement dir
#         perstim$headDelta[j]<-perstim$fix_headdir[j]-perstim$fix_headdir[j-1] # raw difference score between head dir
# 
#         # calculate minimum absolute angle change between two vectors for saccades
#         perstim$saccDirDelta[j]<-ifelse(abs(perstim$saccDelta[j])<=360-abs(perstim$saccDelta[j]),
#                                         abs(perstim$saccDelta[j]),
#                                         360-abs(perstim$saccDelta[j]))
#         # do the same for head direction
#         perstim$headDirDelta[j]<-ifelse(abs(perstim$headDelta[j])<=360-abs(perstim$headDelta[j]),
#                                         abs(perstim$headDelta[j]),
#                                         360-abs(perstim$headDelta[j]))
#       }
#     }
#     df.angle<-rbind(df.angle, perstim)
#   }
# }
# 
# 
# beep(sound=3)
# 
# # recode first instance as NA instead of 0
# df.angle$sacc_amp<-car::recode(df.angle$sacc_amp, "0=NA")
# df.angle$sacc_dir<-car::recode(df.angle$sacc_dir, "0=NA")
# df.angle$fix_headamp<-car::recode(df.angle$fix_headamp, "0=NA")
# df.angle$fix_headdir<-car::recode(df.angle$fix_headdir, "0=NA")
# 
# write.csv(df.angle, "VR_Basic Scene Viewing_Angle Direction.csv", row.names = F)

Trial-Level Summary Data:

1. Central tendency/elliptical spread

The aspace package has code that calculates the standard deviation ellipse (SDE) of points on a 2D plane. I simply modified the code a bit so that it returns the summary output instead of the shapefile information.

Where is the centroid of people’s eye/head movements?

Here’s a look at the central tendency/spread with the areas overlaid:

Elliptical Spread

The black dots represent the centre of the standard deviation ellipse, and the area covered by the ellipse is outlined in colour.

The calc_sde function breaks this down into several aspects. Below, I’ve provided visual representations of the data at the participant level, with each observation representing aggregate responses for each trial/scene.

Area of Ellipse

The distributions generally look Gaussian; as such, mean and SD will be used to capture this set of data.

Eccentricity

Eccentricity is a ratio that measures how “stretched out” an ellipse is; an eccentricity of 0 corresponds to a circle, and the closer eccentricity is to 1, the more elongated the ellipse.
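For reference, eccentricity can be computed from the ellipse’s semi-major axis a and semi-minor axis b; this small helper is mine, not part of the analysis code:

``` r
# eccentricity of an ellipse: e = sqrt(1 - b^2 / a^2),
# where a is the semi-major axis and b the semi-minor axis.
# a == b (a circle) gives 0; a very stretched ellipse approaches 1.
eccentricity <- function(a, b) sqrt(1 - (b^2 / a^2))

eccentricity(1, 1)   # circle -> 0
eccentricity(10, 1)  # strongly stretched -> ~0.995
```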

Given the severe negative skew in the trial-level data per participant, it may be best to try to model this using an exponential function.

Theta

Theta is a measure of the tilt of the ellipse. For instance, an ellipse stretched along the horizon would have a Theta of roughly 90 degrees.

I displayed the mean and SD to see how much variation there was (especially in the means). Even though most Thetas were centered around 90 degrees (i.e., along the horizon), there is still considerable variation in mean Theta from participant to participant.

As such, given the approximately Gaussian distribution, mean and SD will be used to capture this set of data.

2. Saccade/Head Direction

Saccade/head direction indicates the direction of movement. I’ve presented the data in a polar histogram.

Fixation-Level Data

In examining the fixation-level data, it’s more accurate to think of saccade/head direction as having a bimodal distribution, with one peak at around 90 degrees and the other at around 270 degrees. As such, simply calculating a mean of these values would not be representative of the data (i.e., the mean will be at 180 degrees, but the majority of the points actually fall at either 90 or 270 degrees).

There doesn’t seem to be a straightforward way to capture this bimodal distribution (though I’m open to ideas). For now, I’ve used the mean and SD to represent the trial-level data.
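One idea worth noting (my own sketch, not something used in this analysis): because the two peaks sit directly opposite one another, the directions could be treated as axial data via the angle-doubling trick, which collapses the 90/270 bimodality onto a single axis:

``` r
# circular mean of angles in degrees, wrapped to [0, 360)
circ_mean_deg <- function(deg) {
  rad <- deg * pi / 180
  (atan2(mean(sin(rad)), mean(cos(rad))) * 180 / pi + 360) %% 360
}

dirs <- c(85, 95, 90, 265, 275, 270)  # bimodal: peaks near 90 and 270

mean(dirs)  # linear mean: 180, where almost no data fall

# double the angles, take the circular mean, then halve:
# the two opposite peaks collapse onto one shared axis
axis_deg <- circ_mean_deg((2 * dirs) %% 360) / 2
axis_deg  # ~90: the axis of the 90/270 peaks
```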

3. Angle Change

These are metrics derived from the sacc_dir and fix_headdir variables. Angle change is different than saccade/head direction, as it indicates the change between two angle vectors.

The two key variables calculated are saccDirDelta and headDirDelta, the minimum absolute angle change between consecutive eye or head movement directions (the sign/direction of the change is not retained). Since they are minimum angle changes, these variables are always at most 180 angular degrees.
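For clarity, the minimum-angle computation from the code above can be written as a small standalone helper (the function name is mine):

``` r
# minimum absolute angle between two directions in degrees:
# the smaller of |a - b| and 360 - |a - b|, so the result is in [0, 180]
min_angle_delta <- function(a, b) {
  d <- abs(a - b)
  pmin(d, 360 - d)
}

min_angle_delta(10, 350)  # 20, not 340
min_angle_delta(90, 270)  # 180 (directly opposite)
```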

Here is a histogram of the minimum angle changes for eye and head movement side by side:


 

 

Consistent with the previous data, it seems that people generally tend not to make really big head movement direction changes compared to eye movements.

The same graph but separated for each participant:

Fixation-Level Data

4. Saccade/Head Movement Amplitude

Fixation-Level Data

5. Fixation Duration (Saccades Only)

6. Head Rotation (Head Movement Only)