Using eye-tracking as an aid to design on-screen choice experiments / Cubero Dudinskaya, E.; Naspetti, S.; Zanoli, R. - In: Journal of Choice Modelling, vol. 36 (2020). ISSN 1755-5345. Electronic resource. DOI: 10.1016/j.jocm.2020.100232

Using eye-tracking as an aid to design on-screen choice experiments

Cubero Dudinskaya E.; Naspetti S.; Zanoli R.
2020-01-01

Abstract

Researchers using discrete choice experiments (DCE) often face the difficult decision of selecting the key attributes to include in their analysis. Methods for attribute selection are not well documented in the previous literature, frequently leaving researchers with a wide set of candidate attributes that can lead to complex choice tasks. Moreover, including attributes that respondents may ignore can generate biased results, especially if attribute non-attendance is not taken into consideration. In this paper, we offer a framework for selecting key attributes using eye-tracking software. Our main objective is to investigate whether eye movements during the completion of a choice experiment can provide additional information for selecting attributes when designing a DCE. Pretesting the DCE with an on-screen survey tool and eye-tracking, we estimated three multinomial logit (MNL) models to compare respondents' stated preferences, their self-reported statements on non-attendance, and their visual attention to each attribute. The eye-tracking data revealed that all respondents looked at most attributes for most of the time; however, attention is not the same as attendance. Results show that eye-tracking can complement self-reported statements by providing key information for reducing task complexity and potential attribute non-attendance when designing a DCE (one of seven attributes was dropped).
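The three models mentioned in the abstract are standard multinomial logit specifications. As a generic reminder only (the authors' exact utility specification is not reproduced in this record), the MNL probability that respondent n chooses alternative i from choice set C_n is

P_{ni} = \frac{\exp(\beta' x_{ni})}{\sum_{j \in C_n} \exp(\beta' x_{nj})}

where x_{ni} collects the attribute levels of alternative i and \beta the taste parameters to be estimated. Attribute non-attendance is typically handled by restricting to zero the coefficients of attributes a respondent is assumed to have ignored, which is why identifying potentially ignored attributes at the pretest stage can simplify the final design.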
File in this record:
1-s2.0-S1755534520300312-main.pdf — 2.05 MB, Adobe PDF. Publisher's version (published with the publisher's layout). Licence: all rights reserved. Access restricted to archive administrators; a copy can be requested.


Use this identifier to cite or link to this document: https://hdl.handle.net/11566/287409
Note: the data displayed in this record have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 11
  • Web of Science (ISI): 11