Navak: Action Research for a Painting Experience by Eyes via Eye-Tracking

Document Type : Original Articles

Authors

1 MSc Student in Computer Arts, School of Multimedia, Tabriz Islamic Art University, Tabriz, Iran

2 MSc in Computer Engineering (Artificial Intelligence), Tehran Central Branch, Islamic Azad University, Tehran, Iran

3 Assistant Professor, School of Multimedia, Tabriz Islamic Art University, Tabriz, Iran

DOI: 10.22122/jrrs.v15i1.3493

Abstract

Introduction: The eyes are a rich source of information in our daily lives. Using the eye as a form of input can enable a computer system to learn more about its user. In addition to the keyboard and mouse, users can employ eye-tracking as a form of input.

Materials and Methods: In this study, a system was developed for experiencing painting with the eyes through eye-tracking. The system was a digital painting application that received data from an eye-tracking device and produced graphical output; it was implemented in the C# programming language. To evaluate this experience, people with a shared design background were invited to describe its quality.

Results: Despite the difficulty of controlling the eyes and their limited proficiency in drawing this way, participants described the experience as pleasant and enjoyable, and expressed a desire to keep working in this environment.

Conclusion: Regardless of the quality of the painted output, the participants found the approach novel and were interested in continuing to work with the software.
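The system described above was implemented in C#; as an illustration only (not the authors' implementation), the core idea of turning a stream of noisy gaze samples into paint marks on a canvas can be sketched as follows. The moving-average smoothing window and the pixel-stamping brush are assumptions for the sake of the example, not details taken from the paper.

```python
# Illustrative sketch: map a stream of raw gaze samples onto a 2-D canvas,
# smoothing with a short moving average before "painting" each position.
# (Hypothetical design; the paper's C# system may differ.)

from collections import deque


class GazePainter:
    def __init__(self, width, height, smooth_window=5):
        self.width, self.height = width, height
        self.canvas = [[0] * width for _ in range(height)]   # 0 = blank, 1 = painted
        self.recent = deque(maxlen=smooth_window)            # recent gaze samples

    def on_gaze_sample(self, x, y):
        """Receive one raw gaze sample; paint at the smoothed position."""
        self.recent.append((x, y))
        sx = sum(p[0] for p in self.recent) / len(self.recent)
        sy = sum(p[1] for p in self.recent) / len(self.recent)
        self._paint(int(sx), int(sy))

    def _paint(self, x, y):
        # Clamp to the canvas so off-screen gaze does not crash the loop.
        if 0 <= x < self.width and 0 <= y < self.height:
            self.canvas[y][x] = 1


# Simulated gaze stream (a real system would poll the eye-tracker's SDK).
painter = GazePainter(100, 100)
for x, y in [(10, 10), (12, 11), (11, 9)]:
    painter.on_gaze_sample(x, y)
```

Smoothing matters here because raw gaze data jitters even during a steady fixation, so painting every raw sample directly would produce scattered dots rather than a continuous stroke.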

Keywords

  • Receive Date: 26 February 2020
  • Revise Date: 01 June 2022
  • Accept Date: 22 May 2022
  • First Publish Date: 22 May 2022