COGAIN Symposium:
Communication by Gaze Interaction

Warsaw, Poland • June 14–17, 2018

Important dates

Abstract due: Jan 12, 2018
Submission due: Jan 26, 2018
Feedback: Feb 21, 2018
Rebuttals: Mar 4, 2018
Decisions: Mar 26, 2018
Camera-ready: Apr 13, 2018


General co-chairs
Carlos H. Morimoto
Thies Pfeiffer

Program co-chairs
Marco Porta
Päivi Majaranta

A farewell message

The 2018 COGAIN Symposium was co-located with ETRA, and despite running in parallel with ETRA's main track, we were graced with a large audience. In total, we had 5 long papers and 5 short papers on diverse themes orbiting gaze-based interaction and accessibility. We would like to thank everyone who supported the COGAIN Symposium and made it a success this year. We hope to see you at the next one!

Best paper award

Advantages of Eye-Gaze over Head-Gaze-Based Selection in Virtual and Augmented Reality under Varying Field of Views
Jonas Blattgerste, Patrick Renner, and Thies Pfeiffer (Bielefeld University, CITEC, Bielefeld, Germany)

Honorable mention

Beyond Gaze Cursor: Exploring Information-Based Gaze Sharing in Chat
Christian Schlösser, Linda Cedli, Benedikt Schröder, and Andrea Kienle (University of Applied Sciences and Arts, Dortmund, Germany)

Acknowledgments

We would like to thank all the authors, reviewers, and the board for their efforts in making the COGAIN Symposium possible this year. We would also like to give special thanks to our panelists on "Trends, challenges, and opportunities in gaze interaction":

  • Robert Cavin (Oculus Research / Facebook Reality Labs)
  • John Paulin Hansen (Technical University of Denmark)
  • Lewis Chuang (Max Planck Institute for Biological Cybernetics in Tübingen)
  • Roman Bednarik (University of Eastern Finland)

The 2018 COGAIN Symposium

The Symposium on Communication by Gaze Interaction, organized by the COGAIN Association, will be co-located with ETRA 2018, the ACM Symposium on Eye Tracking Research & Applications. ETRA 2018 will take place in Warsaw, Poland, from June 14 to 17.

The COGAIN Symposium will be organized as a "special session" at ETRA. By combining our efforts with ETRA, we hope to encourage a broader exchange of knowledge and experiences amongst the communities of researchers, developers, manufacturers, and users of eye trackers.

We invite authors to prepare and submit papers and notes following the same ACM format as ETRA. Papers are up to 8 pages (+ 2 additional pages for references) and notes are up to 4 pages (+ 2 additional pages for references). During the submission process to ETRA 2018, you will be asked whether you would like your paper to be presented at the COGAIN Symposium. COGAIN and ETRA will share the same rigorous two-phase review process, and papers accepted to the COGAIN Symposium will be published as part of the ETRA 2018 ACM Proceedings.

The COGAIN Symposium focuses on all aspects of gaze interaction, from general computer applications and gaze estimation systems to basic eye movement research that is related to, or might have an impact on, understanding and improving human communication with other humans and human interaction with machines. The symposium will present advances in these areas, leading to new capabilities in gaze interaction, gaze-enhanced applications, gaze-contingent devices, and more. Topics of interest include all aspects of gaze interaction and communication by gaze, including, but not limited to:

  • Eye-controlled assistive technology
  • Eye-typing
  • Gaze interfaces for wearable computing
  • Gaze-contingent devices
  • Gaze-aware applications
  • Gaze-enhanced games
  • Gaze interaction with mobile devices
  • Gaze interaction in virtual and augmented environments
  • Gaze interaction paradigms
  • Usability and UX evaluation of gaze-based interfaces
  • User context estimation from eye movements
  • 3D gaze interaction