A Systematic Mapping Study on Human-Computer Interaction Evaluation Methodologies for Assistive Technology with a Focus on People with Motor Disabilities
Keywords:
evaluation methodologies, human-computer interaction, assistive technology, motor disability

Abstract
Proposing an Assistive Technology (AT) for computer interaction is still a major challenge, since interaction devices must be adapted to users' needs and abilities. This challenge is currently addressed by the field of Human-Computer Interaction (HCI), which covers the design, implementation, and evaluation of interactive computing systems. In the case of an AT device, it is the evaluation that, in addition to other performance factors, validates whether the AT truly serves its target audience. This work explores HCI evaluation methodologies focused on people with upper-limb motor disabilities, as the result of a systematic mapping of the literature. Finally, it includes a proposed taxonomy that classifies these AT devices according to how they capture data.
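As context for the performance factors mentioned above, several of the works listed under References (Fitts, 1954; MacKenzie, 1992; MacKenzie, 2016) use Fitts' law throughput as a standard pointing-device metric. The snippet below is a minimal illustrative sketch of how that metric is commonly computed from trial data; it is not taken from the mapped studies, and the function name and sample values are hypothetical.

import math
from statistics import stdev

def fitts_throughput(distance, movement_times, endpoint_errors):
    """Fitts' law throughput (bits/s) for one amplitude/width condition.

    Shannon formulation ID_e = log2(D / W_e + 1), where the effective
    width W_e is 4.133 times the standard deviation of the endpoint
    deviations (MacKenzie, 1992).
    """
    w_e = 4.133 * stdev(endpoint_errors)                  # effective target width
    id_e = math.log2(distance / w_e + 1)                  # effective index of difficulty (bits)
    mean_mt = sum(movement_times) / len(movement_times)   # mean movement time (s)
    return id_e / mean_mt                                  # throughput (bits/s)

# Hypothetical trials: 256 px movement amplitude, movement times in seconds,
# signed endpoint deviations from the target centre in pixels.
print(fitts_throughput(256, [0.84, 0.91, 0.78, 0.88], [3.1, -4.2, 2.7, -1.9]))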
References
Andrade, A. O., Pereira, A. A., Pinheiro Jr, C. G., and Kyberd, P. J. (2013). Mouse emulation based on facial electromyogram. Biomedical Signal Processing and Control, 8(2):142–152.
Antunes, R. A., Palma, L. B., Coito, F. V., Duarte-Ramos, H., and Gil, P. (2016). Intelligent human-computer interface for improving pointing device usability and performance. In Control and Automation (ICCA), 2016 12th IEEE International Conference on, pages 714–719. IEEE.
Azmi, A., Alsabhan, N. M., and AlDosari, M. S. (2009). The wiimote with sapi: Creating an accessible low-cost, human computer interface for the physically disabled. International Journal of Computer Science and Network Security, 9(12):63–68.
Baranauskas, M. C. C. and Rocha, H. d. (2003). Design e avaliação de interfaces humano-computador. Campinas, SP: NIED/Unicamp.
Bersch, R. (2013). Introdução à Tecnologia Assistiva.
Betke, M., Gips, J., and Fleming, P. (2002). The camera mouse: visual tracking of body features to provide computer access for people with severe disabilities. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 10(1):1–10.
Bian, Z.-P., Hou, J., Chau, L.-P., and Magnenat-Thalmann, N. (2016). Facial position and expression-based human–computer interface for persons with tetraplegia. IEEE Journal of Biomedical and Health Informatics, 20(3):915–924.
Biswas, P. and Langdon, P. (2015). Multimodal intelligent eye-gaze tracking system. International Journal of Human-Computer Interaction, 31(4):277–294.
Cardoso, R. C., Costa, V. K., Rodrigues, A. S., and Tavares, T. A. (2016a). Análise de frameworks para o desenvolvimento de produtos voltados a tecnologia assistiva. XVII Encontro de Pós-Graduação da Universidade Federal de Pelotas.
Cardoso, R. C., da Costa, V. K., Rodrigues, A. S., Tavares, T. A., Xavier, K. F., Peroba, J. A., Peglow, J., and Quadros, C. L. S. M. (2016b). Doce labirinto: Experiência de jogo utilizando interação baseada em movimentos da cabeça e recursos tangíveis. XV Simpósio Brasileiro de Jogos e Entretenimento Digital.
Dhillon, H. S., Singla, R., Rekhi, N. S., and Jha, R. (2009). EOG and EMG based virtual keyboard: A brain-computer interface. In Computer Science and Information Technology, 2009. ICCSIT 2009. 2nd IEEE International Conference on, pages 259–262. IEEE.
Draghici, O., Batkin, I., Bolic, M., and Chapman, I. (2013). The mouthpad: A tongue-computer interface. In Medical Measurements and Applications Proceedings (MeMeA), 2013 IEEE International Symposium on, pages 315–319. IEEE.
Fitts, P. M. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 47(6):381.
Hassenzahl, M., Burmester, M., and Koller, F. (2003). AttrakDiff: Ein Fragebogen zur Messung wahrgenommener hedonischer und pragmatischer Qualität. In Mensch & Computer 2003, pages 187–196. Springer.
Huang, B., Lo, A. H., and Shi, B. E. (2013). Integrating EEG information improves performance of gaze based cursor control. In Neural Engineering (NER), 2013 6th International IEEE/EMBS Conference on, pages 415–418. IEEE.
Huo, X. (2011). Tongue drive: a wireless tongue-operated assistive technology for people with severe disabilities.
Huo, X., Park, H., Kim, J., and Ghovanloo, M. (2013). A dual-mode human computer interface combining speech and tongue motion for people with severe disabilities. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 21(6):979–991.
Jose, M. A. and de Deus Lopes, R. (2015). Human–computer interface controlled by the lip. IEEE Journal of Biomedical and Health Informatics, 19(1):302–308.
Kalunga, E. K., Chevallier, S., Rabreau, O., and Monacelli, E. (2014). Hybrid interface: Integrating BCI in multimodal human-machine interfaces. In Advanced Intelligent Mechatronics (AIM), 2014 IEEE/ASME International Conference on, pages 530–535. IEEE.
Kurauchi, A., Feng, W., Morimoto, C., and Betke, M. (2015). HMAGIC: head movement and gaze input cascaded pointing. In Proceedings of the 8th ACM International Conference on PErvasive Technologies Related to Assistive Environments, page 47. ACM.
Lau, C. and O'Leary, S. (1993). Comparison of computer interface devices for persons with severe physical disabilities. American Journal of Occupational Therapy, 47(11):1022–1030.
Lee, K.-R., Chang, W.-D., Kim, S., and Im, C.-H. (2017). Real-time “eye-writing” recognition using electrooculogram. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 25(1):40–51.
Lévy, P. (1999). Cibercultura. 1ª edição. São Paulo: Editora 34.
Levy, P. C., Antonio, N. S., Souza, T. R., Caetano, R., and Souza, P. G. (2013). ActiveIris: uma solução para comunicação alternativa e autonomia de pessoas com deficiência motora severa. In Proceedings of the 12th Brazilian Symposium on Human Factors in Computing Systems, pages 42–51. Brazilian Computer Society.
Machado, M. et al. (2010). Óculos Mouse: Mouse controlado pelos movimentos da cabeça do usuário. Brazilian Patent INPI n. PI10038213.
MacKenzie, I. S. (1992). Fitts' law as a performance model in human-computer interaction.
MacKenzie, I. S. (2011). Evaluating eye tracking systems for computer input. Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies, page 205.
MacKenzie, I. S. and Buxton, W. (1992). Extending Fitts' law to two-dimensional tasks. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 219–226. ACM.
MacKenzie, I. S. (2016). FittsTaskTwo (2D) - Fitts' Law Software. http://www.yorku.ca/mack/FittsLawSoftware/. [Online; accessed 19-Nov-2016].
Manresa-Yee, C., Muntaner, J. J., and Arellano, D. (2013). A motion-based interface to control environmental stimulation for children with severe to profound disabilities. In CHI'13 Extended Abstracts on Human Factors in Computing Systems, pages 7–12. ACM.
Mariano, D., Freitas, A., Luiz, L., Silva, A., Pierre, P., and Naves, E. (2014). An accelerometer-based human computer interface driving an alternative communication system. In Biosignals and Biorobotics Conference (2014): Biosignals and Robotics for Better and Safer Living (BRC), 5th ISSNIP-IEEE, pages 1–5. IEEE.
Martins, J. M., Rodrigues, J. M., and Martins, J. A. (2015). Low-cost natural interface based on head movements. Procedia Computer Science, 67:312–321.
Mazo, M. (2001). An integral system for assisted mobility [automated wheelchair]. IEEE Robotics & Automation Magazine, 8(1):46–56.
Melo, A. M. and Baranauskas, M. C. C. (2005). Design e avaliação de tecnologia web-acessível. In Congresso da Sociedade Brasileira de Computação, volume 25, pages 1500–1544.
Nielsen, J. (1993). Usability Engineering. Morgan Kaufmann Publishers Inc., San Fran- cisco, CA, USA.
Pedrosa, D. and Pimentel, M. d. G. C. (2014). Text entry using a foot for severely motor-impaired individuals. In Proceedings of the 29th Annual ACM Symposium on Applied Computing, SAC '14, pages 957–963, New York, NY, USA. ACM.
Perez-Maldonado, C., Wexler, A. S., and Joshi, S. S. (2010). Two-dimensional cursor-to-target control from single muscle site sEMG signals. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 18(2):203–209.
Pinheiro, C. G., Naves, E. L., Pino, P., Losson, E., Andrade, A. O., and Bourhis, G. (2011). Alternative communication systems for people with severe motor disabilities: a survey. Biomedical engineering online, 10(1):31.
Preece, J., Rogers, Y., and Sharp, H. (2005). Design de interação. Bookman.
Rodrigues, A. S., da Costa, V., Cardoso, R. C., Machado, M. B., and Tavares, T. A. (2016b). Análise de métricas para avaliação da interação baseada em apontadores: Um estudo de caso para o dispositivo iom.
Rodrigues, A. S., da Costa, V., Machado, M. B., Rocha, A. L., de Oliveira, J. M., Machado, M. B., Cardoso, R. C., Quadros, C., and Tavares, T. A. (2016a). Evaluation of the use of eye and head movements for mouse-like functions by using iom device. In International Conference on Universal Access in Human-Computer Interaction, pages 81–91. Springer.
Sauer, A. L., Parks, A., and Heyn, P. C. (2010). Assistive technology effects on the employment outcomes for people with cognitive disabilities: a systematic review. Disability and Rehabilitation: Assistive Technology, 5(6):377–391.
Soltani, S. and Mahnam, A. (2013). Design of a novel wearable human computer interface based on electrooculography. In Electrical Engineering (ICEE), 2013 21st Iranian Conference on, pages 1–5. IEEE.
Topal, C., Gunal, S., Koçdeviren, O., Dogan, A., and Gerek, O. N. (2014). A low-computational approach on gaze estimation with eye touch system. IEEE Transactions on Cybernetics, 44(2):228–239.
Vickers, S., Istance, H., and Hyrskykari, A. (2013). Performing locomotion tasks in immersive computer games with an adapted eye-tracking interface. ACM Transactions on Accessible Computing (TACCESS), 5(1):2.