Päivi Majaranta

Orcid: 0000-0002-3010-9408

Affiliations:
  • University of Tampere, Finland


According to our database, Päivi Majaranta authored at least 40 papers between 2000 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.


Bibliography

2024
Comparison of activity trackers in estimating canine behaviors.
Adv. Robotics, July 2024

2020
Gaze Interaction With Vibrotactile Feedback: Review and Design Guidelines.
Hum. Comput. Interact., 2020

2019
Human augmentation: Past, present and future.
Int. J. Hum. Comput. Stud., 2019

Inducing gaze gestures by static illustrations.
Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 2019

2018
Evaluation of Dry Electrodes in Canine Heart Rate Monitoring.
Sensors, 2018

Useful approaches to exploratory analysis of gaze data: enhanced heatmaps, cluster maps, and transition maps.
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, 2018

Happy dogs and happy owners: using dog activity monitoring technology in everyday life.
Proceedings of the Fifth International Conference on Animal-Computer Interaction, 2018

Dog activity classification with movement sensor placed on the collar.
Proceedings of the Fifth International Conference on Animal-Computer Interaction, 2018

2017
Vibrotactile stimulation of the head enables faster gaze gestures.
Int. J. Hum. Comput. Stud., 2017

Technology for Bonding in Human-Animal Interaction.
Proceedings of the Fourth International Conference on Animal-Computer Interaction, 2017

2016
Effects of auditory, haptic and visual feedback on performing gestures by gaze or by hand.
Behav. Inf. Technol., 2016

6th international workshop on pervasive eye tracking and mobile eye-based interaction.
Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2016 ACM International Symposium on Wearable Computers, 2016

Both Fingers and Head are Acceptable in Sensing Tactile Feedback of Gaze Gestures.
Proceedings of the Haptics: Perception, Devices, Control, and Applications, 2016

PursuitAdjuster: an exploration into the design space of smooth pursuit -based widgets.
Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, 2016

2015
Demo hour.
Interactions, 2015

2014
Using gaze gestures with haptic feedback on glasses.
Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational, 2014

Effects of haptic feedback on gaze based auto scrolling.
Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational, 2014

Delayed Haptic Feedback to Gaze Gestures.
Proceedings of the Haptics: Neuroscience, Devices, Modeling, and Applications, 2014

Look and lean: accurate head-assisted eye pointing.
Proceedings of the Eye Tracking Research and Applications, 2014

Haptic feedback to gaze events.
Proceedings of the Eye Tracking Research and Applications, 2014

Gaze gestures and haptic feedback in mobile devices.
Proceedings of the CHI Conference on Human Factors in Computing Systems, 2014

2012
Enhanced gaze interaction using simple head gestures.
Proceedings of the 2012 ACM Conference on Ubiquitous Computing, 2012

2nd International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2012): proposal for a workshop (mini-track) at UbiComp 2012.
Proceedings of the 2012 ACM Conference on Ubiquitous Computing, 2012

Gaze Interaction and Applications of Eye Tracking - Advances in Assistive Technologies.
IGI Global, ISBN: 978-1-613-50098-9, 2012

2011
PETMEI 2011: the 1st international workshop on pervasive eye tracking and mobile eye-based interaction.
Proceedings of the UbiComp 2011: Ubiquitous Computing, 13th International Conference, 2011

2010
Eye-based Direct Interaction for Environmental Control in Heterogeneous Smart Environments.
Proceedings of the Handbook of Ambient Intelligence and Smart Environments, 2010

2009
Eye Tracking.
Proceedings of the Universal Access Handbook, 2009

Special issue: Communication by gaze interaction.
Univers. Access Inf. Soc., 2009

Scrollable Keyboards for Casual Eye Typing.
PsychNology J., 2009

Fast gaze typing with an adjustable dwell time.
Proceedings of the 27th International Conference on Human Factors in Computing Systems, 2009

2008
Now Dasher! Dash away!: longitudinal study of fast text entry by Eye Gaze.
Proceedings of the Eye Tracking Research & Application Symposium, 2008

2006
Effects of feedback and dwell time on eye typing speed and accuracy.
Univers. Access Inf. Soc., 2006

2005
Gaze-based human-computer interaction.
Proceedings of the 10th International Conference on Intelligent User Interfaces, 2005

Static Visualization of Temporal Eye-Tracking Data.
Proceedings of the Human-Computer Interaction, 2005

Eye-Tracking Reveals the Personal Styles for Search Result Evaluation.
Proceedings of the Human-Computer Interaction, 2005

2004
Effects of feedback on eye typing with a short dwell time.
Proceedings of the Eye Tracking Research & Application Symposium, 2004

2003
Proactive Response to Eye Movements.
Proceedings of the Human-Computer Interaction INTERACT '03: IFIP TC13 International Conference on Human-Computer Interaction, 2003

Auditory and visual feedback during eye typing.
Proceedings of the Extended abstracts of the 2003 Conference on Human Factors in Computing Systems, 2003

2002
Twenty years of eye typing: systems and design issues.
Proceedings of the Eye Tracking Research & Application Symposium, 2002

2000
Design issues of iDICT: a gaze-assisted translation aid.
Proceedings of the Eye Tracking Research & Application Symposium, 2000
