Show simple item record

dc.contributor.author: Corredor Camargo, Javier Adolfo [spa]
dc.contributor.author: Peña Cortes, Cesar Augusto [spa]
dc.contributor.author: Pardo Garcia, Aldo [spa]
dc.date.accessioned: 2019-11-12T15:20:50Z
dc.date.available: 2019-11-12T15:20:50Z
dc.date.issued: 2019-03-01
dc.identifier.citation: J. Corredor, C. Peña, A. Pardo, “Evaluación de las emociones de usuarios en tareas con realimentación háptica utilizado el dispositivo Emotiv Insight,” INGE CUC, vol. 15, no. 1, pp. 9–16, 2019. DOI: http://doi.org/10.17981/ingecuc.15.1.2019.01 [spa]
dc.identifier.uri: http://hdl.handle.net/11323/5595 [spa]
dc.description.abstract: Introduction− This study assesses the five performance metrics available on the Emotiv Insight device in a virtual path-tracking task performed through a mobile robot. Objective− To characterize and/or determine whether some EEG metrics are related to primitives of a teleoperation task with haptic feedback, in order to verify whether it would be useful to incorporate the information available from the Emotiv device into a shared control strategy. Methodology− An experimental design was formulated, comprising the recording and analysis of neurosignals from five users with a Brain Computer Interface (BCI) while executing teleoperation tasks with a mobile robot in the VREP (Virtual Robot Experimentation Platform) environment. Results− The results show that engagement and relaxation are emotions that could be useful for identifying demanding situations in path tracking and obstacle evasion, such as the experimental setup proposed in this article. On the other hand, some metrics such as stress, excitement, interest and focus remain, on average, at similar levels during task execution. Conclusions− Including low-cost brain-computer interfaces such as the Emotiv in tasks with haptic feedback offers new possibilities for assessing user performance and has potential for control applications. [eng]
dc.description.abstract: Introducción: Este estudio evalúa las cinco métricas de desempeño, disponibles en el dispositivo Emotiv Insight en una tarea virtual de seguimiento de trayectorias por medio de un robot móvil. Objetivo: Caracterizar y/o determinar si algunas métricas EEG se relacionan con primitivas de una tarea de teleoperación, donde se realimentan señales hápticas, en pro de verificar si puede ser útil incorporar la información disponible del dispositivo Emotiv en una estrategia de control compartido. Metodología: Se formuló un diseño experimental, que incluye el registro y análisis de neuroseñales en cinco usuarios con una Interfaz Cerebro Computador (ICC), ejecutando tareas de teleoperación de un robot móvil en el entorno de VREP (Virtual Robot Experimentation Platform). Resultados: Los resultados muestran que el compromiso y la relajación son emociones que podrían ser de utilidad para identificar situaciones demandantes en tareas de seguimiento y evasión de obstáculos. Por otro lado, se observa que algunas métricas como estrés, excitación, interés y enfoque, en promedio, se mantienen en niveles similares durante la ejecución de la tarea. Conclusiones: Incluir interfaces cerebro computador de bajo costo, como el Emotiv en tareas con realimentación háptica, ofrece nuevas posibilidades para la evaluación del desempeño del usuario y potencialmente para control. [spa]
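The abstract describes comparing the Emotiv Insight's five performance metrics across phases of a teleoperation trial, finding that engagement and relaxation shift during demanding segments while the other metrics stay flat. A minimal sketch of that per-phase comparison, using synthetic data in place of a real device stream (the segment layout, metric names and offsets here are illustrative assumptions, not values from the study):

```python
import numpy as np

# Synthetic stand-in for the Emotiv Insight metric stream:
# values in [0, 1], sampled over one teleoperation trial.
rng = np.random.default_rng(0)
METRICS = ["engagement", "relaxation", "stress", "excitement", "interest"]

n = 600                          # samples in the trial (assumed)
t = np.arange(n)
demanding = (t % 200) >= 150     # mark "obstacle evasion" segments (assumed layout)

series = {m: rng.normal(0.5, 0.05, n) for m in METRICS}
# Mimic the reported finding: engagement rises and relaxation falls
# in demanding segments; the remaining metrics stay level on average.
series["engagement"][demanding] += 0.2
series["relaxation"][demanding] -= 0.2

def phase_means(series, mask):
    """Mean level of each metric inside vs. outside the masked phase."""
    return {m: (v[mask].mean(), v[~mask].mean()) for m, v in series.items()}

means = phase_means(series, demanding)
for m, (dem, base) in means.items():
    print(f"{m:10s} demanding={dem:.2f} baseline={base:.2f}")
```

With this kind of summary, a metric whose demanding-phase mean separates clearly from its baseline (here engagement and relaxation) is a candidate input for a shared control strategy, while flat metrics carry little task-phase information.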
dc.format.extent: 8 páginas [spa]
dc.format.mimetype: application/pdf [spa]
dc.language.iso: spa
dc.publisher: Corporación Universidad de la Costa [spa]
dc.relation.ispartofseries: INGE CUC; Vol. 15, Núm. 1 (2019) [spa]
dc.rights: CC0 1.0 Universal [spa]
dc.rights.uri: http://creativecommons.org/publicdomain/zero/1.0/ [spa]
dc.source: INGE CUC [spa]
dc.title: Evaluación de las emociones de usuarios en tareas con realimentación háptica utilizado el dispositivo Emotiv Insight [spa]
dc.type: Artículo de revista [spa]
dc.identifier.url: https://doi.org/10.17981/ingecuc.15.1.2019.01 [spa]
dc.source.url: https://revistascientificas.cuc.edu.co/ingecuc/article/view/2048 [spa]
dc.rights.accessrights: info:eu-repo/semantics/openAccess [spa]
dc.identifier.doi: 10.17981/ingecuc.15.1.2019.01 [spa]
dc.identifier.eissn: 2382-4700 [spa]
dc.identifier.instname: Corporación Universidad de la Costa [spa]
dc.identifier.pissn: 0122-6517 [spa]
dc.identifier.reponame: REDICUC - Repositorio CUC [spa]
dc.identifier.repourl: https://repositorio.cuc.edu.co/ [spa]
dc.relation.ispartofjournal: INGE CUC [spa]
dc.relation.references: C. Passenberg, A. Glaser and A. Peer, “Exploring the Design Space of Haptic Assistants: The Assistance Policy Module”, IEEE Transactions on Haptics, vol. 6, no. 4, pp. 440–452, Oct. 2013. [Online]. https://doi.org/10.1109/TOH.2013.34 [spa]
dc.relation.references: K. Holewa and A. Nawrocka, “Emotiv EPOC neuroheadset in brain-computer interface”, Proceedings of the 2014 15th International Carpathian Control Conference (ICCC), pp. 149–152, May 2014. https://doi.org/10.1109/CarpathianCC.2014.6843587 [spa]
dc.relation.references: G. S. Taylor and C. Schmidt, “Empirical Evaluation of the Emotiv EPOC BCI Headset for the Detection of Mental Actions”, Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 56, no. 1, pp. 193–197, Sep. 2012. https://doi.org/10.1177/1071181312561017 [spa]
dc.relation.references: R. Lievesley, M. Wozencroft and D. Ewins, “The Emotiv EPOC neuroheadset: an inexpensive method of controlling assistive technologies using facial expressions and thoughts?”, Journal of Assistive Technologies, vol. 5, no. 2, pp. 67–82, Jun. 2011. https://doi.org/10.1108/17549451111149278 [spa]
dc.relation.references: R. Maskeliunas, R. Damasevicius, Martisius and M. Vasiljevas, “Consumer-grade EEG devices: are they usable for control tasks?”, PeerJ 4, e1746, Mar. 2016. https://doi.org/10.7717/peerj.1746 [spa]
dc.relation.references: C.-L. Lin, F.-Z. Shaw, K.-Y. Young, C.-T. Lin and T.-P. Jung, “EEG correlates of haptic feedback in a visuomotor tracking task”, NeuroImage, vol. 60, no. 4, pp. 2258–2273, May 2012. https://doi.org/10.1016/j.neuroimage.2012.02.008 [spa]
dc.relation.references: M. Grunwald, T. Weiss, W. Krause, L. Beyer, R. Rost, I. Gutberlet and H.-J. Gertz, “Power of theta waves in the EEG of human subjects increases during recall of haptic information”, Neuroscience Letters, vol. 260, no. 3, pp. 189–192, Feb. 1999. https://doi.org/10.1016/S0304-3940(98)00990-2 [spa]
dc.relation.references: H. Miura, J. Kimura, N. Matsuda, M. Soga and H. Taki, “Classification of Haptic Tasks based on Electroencephalogram Frequency Analysis”, Procedia Computer Science, vol. 35, Supplement C, pp. 1270–1277, Jan. 2014. https://doi.org/10.1016/j.procs.2014.08.226 [spa]
dc.relation.references: W. Jia, Y. Luo, Y. Hu and J. Zhang, “Adaptive Force Control Tasks Have Far-Transfer Effect on Sustained Attention”, 9th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), vol. 2, pp. 212–217, Aug. 2017. https://doi.org/10.1109/IHMSC.2017.162 [spa]
dc.relation.references: T. Palomaki, “EEG-based brain-computer interface with visual and haptic feedback”, Master’s thesis, Helsinki University of Technology, 2007. Available: http://lib.tkk.fi/Dipl/2007/urn007655.pdf [spa]
dc.relation.references: A. Chatterjee, V. Aggarwal, A. Ramos, S. Acharya and N. V. Thakor, “Operation of a Brain-Computer Interface Using Vibrotactile Biofeedback”, 3rd International IEEE/EMBS Conference on Neural Engineering, pp. 171–174, May 2007. https://doi.org/10.1109/CNE.2007.369639 [spa]
dc.relation.references: L. George, M. Marchal, L. Glondu and A. Lecuyer, “Combining Brain-Computer Interfaces and Haptics: Detecting Mental Workload to Adapt Haptic Assistance”, Haptics: Perception, Devices, Mobility, and Communication, Springer, Berlin, Heidelberg, pp. 124–135, Jun. 2012. https://doi.org/10.1007/978-3-642-31401-8_12 [spa]
dc.relation.references: M. A. Benloucif, C. Sentouh, J. Floris, P. Simon and J. C. Popieul, “Online adaptation of the Level of Haptic Authority in a lane keeping system considering the driver’s state”, Transportation Research Part F: Traffic Psychology and Behaviour, in press, Sep. 2017. https://doi.org/10.1016/j.trf.2017.08.013 [spa]
dc.relation.references: E. Rohmer, S. P. N. Singh and M. Freese, “V-REP: A versatile and scalable robot simulation framework”, IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1321–1326, Nov. 2013. https://doi.org/10.1109/IROS.2013.6696520 [spa]
dc.relation.references: F. Conti, F. Barbagli, R. Balaniuk, M. Halg, C. Lu, D. Morris, L. Sentis, J. Warren, O. Khatib and K. Salisbury, “The CHAI libraries”, Proceedings of Eurohaptics 2003, Dublin, Ireland, pp. 496–500, 2003. [spa]
dc.relation.references: D. J. Block, M. B. Michelotti and R. S. Sreenivas, “Application of the Novint Falcon haptic device as an actuator in real-time control”, Paladyn, Journal of Behavioral Robotics, vol. 4, no. 3, pp. 182–193, 2013. https://doi.org/10.2478/pjbr-2013-0017 [spa]
dc.subject.proposal: Telerobótica [spa]
dc.subject.proposal: Interfaz cerebro computador [spa]
dc.subject.proposal: Robots móviles [spa]
dc.subject.proposal: Control compartido [spa]
dc.subject.proposal: Háptica [spa]
dc.subject.proposal: EEG [spa]
dc.subject.proposal: Telerobotics [eng]
dc.subject.proposal: Brain computer interface [eng]
dc.subject.proposal: Mobile robots [eng]
dc.subject.proposal: Shared control [eng]
dc.subject.proposal: Haptics [eng]
dc.title.translated: Assessment of the users’ emotions in haptic feedback tasks using the Emotiv Insight device [eng]
dc.type.coar: http://purl.org/coar/resource_type/c_6501 [spa]
dc.type.content: Text [spa]
dc.type.driver: info:eu-repo/semantics/article [spa]
dc.type.redcol: http://purl.org/redcol/resource_type/ART [spa]
dc.type.version: info:eu-repo/semantics/acceptedVersion [spa]
dc.relation.citationendpage: 16 [spa]
dc.relation.citationstartpage: 9 [spa]
dc.relation.citationissue: 1 [spa]
dc.relation.citationvolume: 15 [spa]
dc.type.coarversion: http://purl.org/coar/version/c_ab4af688f83e57aa [spa]
dc.rights.coar: http://purl.org/coar/access_right/c_abf2 [spa]
dc.relation.ispartofjournalabbrev: INGE CUC [spa]


Files in this item


This item appears in the following collection(s)

  • Revistas Científicas [1682]
    Research articles published in journals belonging to Editorial EDUCOSTA.


CC0 1.0 Universal
Except where otherwise noted, this item's license is described as CC0 1.0 Universal