Using Eye-Tracking Data - Dataset (cleaned, N = 44)

Citation Author(s):
General Dynamics Information Technology, Inc.
Consortium of Universities
Air Force Research Laboratory, Wright-Patterson AFB
Submitted by:
Sasha Willis
Last updated:
Thu, 01/13/2022 - 15:08
Data Format:



Efficient evaluation strategies are essential when reviewing computer code for trustworthiness and potential reuse. Previous researchers have examined factors that influence these assessments, and the HSMC proposes two information-processing strategies to explain this process: heuristic and systematic processing. However, researchers have yet to empirically demonstrate the direct influence of the specific factors that affect cognitive effort, which can be inferred through eye-tracking metrics. Programmers (N = 52) were recruited to complete a Java code review task. We manipulated the source, readability, and organization of a single piece of code to varying degrees and analyzed the effects of these factors on eye-tracking data (i.e., fixation count, average fixation duration, total fixation duration) and self-report data (i.e., perceived trustworthiness of the code and reuse intentions). Neither reuse intentions nor trustworthiness perceptions differed significantly across conditions. However, analyses of the eye-tracking data revealed increased fixation counts and durations for degraded code, suggesting that more systematic processing occurred in the degraded conditions than with highly organized, highly readable code from a reputable source. An exploratory analysis of the areas of interest (AOIs) containing readability and organization degradations revealed that misuse of case and misuse of declarations garnered the most attention from participants relative to the rest of the code. The study's implications extend to recommendations for writing code that is easier to reuse because it demands less cognitive effort during review.
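The three eye-tracking metrics named in the abstract are simple aggregates over individual fixation records. As the dataset's actual file format and column names are not documented here, the sketch below assumes a hypothetical record with an AOI label and a fixation duration in milliseconds; it shows how fixation count, total fixation duration, and average fixation duration could be derived per AOI.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    aoi: str            # hypothetical AOI label, e.g. "misuse_of_case"
    duration_ms: float  # duration of this single fixation, in milliseconds

def fixation_metrics(fixations):
    """Aggregate per-AOI: fixation count, total fixation duration,
    and average fixation duration (the metrics named in the abstract)."""
    metrics = {}
    for f in fixations:
        m = metrics.setdefault(f.aoi, {"count": 0, "total_ms": 0.0})
        m["count"] += 1
        m["total_ms"] += f.duration_ms
    for m in metrics.values():
        m["avg_ms"] = m["total_ms"] / m["count"]
    return metrics

# Illustrative usage with made-up fixations:
demo = [
    Fixation("misuse_of_case", 200.0),
    Fixation("misuse_of_case", 300.0),
    Fixation("rest_of_code", 150.0),
]
result = fixation_metrics(demo)
# result["misuse_of_case"] -> {"count": 2, "total_ms": 500.0, "avg_ms": 250.0}
```

Longer total durations and higher counts within a degraded AOI are what the abstract interprets as evidence of more systematic (effortful) processing.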



No associated documentation file.