Show simple item record

dc.contributor.author    Sanchez-Comas, Andres    [spa]
dc.contributor.author    Synnes, Kåre    [spa]
dc.contributor.author    Hallberg, Josef    [spa]
dc.date.accessioned    2020-09-11T19:06:55Z
dc.date.available    2020-09-11T19:06:55Z
dc.date.issued    2018
dc.identifier.issn    1424-3210    [spa]
dc.identifier.issn    1424-8220    [spa]
dc.identifier.uri    https://hdl.handle.net/11323/7091    [spa]
dc.description.abstract    Activity recognition (AR) from the applied perspective of ambient assisted living (AAL) and smart homes (SH) has become a subject of great interest. Promising a better quality of life, AR applied in contexts such as health, security, and energy consumption can lead to solutions capable of reaching even the people most in need. This study was motivated by the fact that the development, deployment, and transfer of AR solutions to society and industry rest not only on software development but also on the hardware devices used. The current paper identifies contributions of hardware used for activity recognition through a scientific literature review of the Web of Science (WoS) database. This work found four dominant groups of technologies used for AR in SH and AAL (smartphones, wearables, video, and electronic components) and two emerging technologies (Wi-Fi and assistive robots). Many of these technologies overlap across research works. Through bibliometric network analysis, the present review identified gaps and new potential combinations of technologies for advances in this emerging worldwide field. The review also relates the use of these six technologies to health conditions, health care, emotion recognition, occupancy, mobility, posture recognition, localization, fall detection, and generic activity recognition applications. The above can serve as a road map that allows readers to undertake approachable projects, deploy applications in different socioeconomic contexts, and establish networks with the community involved in this topic. This analysis shows that the activity recognition research field accepts that specific goals cannot be achieved with a single hardware technology but can be met with joint solutions; this paper shows how such technologies work together in this regard.    [spa]
dc.language.iso    eng
dc.publisher    Corporación Universidad de la Costa    [spa]
dc.relation.ispartof    https://www.mdpi.com/1424-8220/20/15/4227    [spa]
dc.rights    CC0 1.0 Universal    [spa]
dc.rights.uri    http://creativecommons.org/publicdomain/zero/1.0/    [spa]
dc.source    Sensors    [spa]
dc.subject    Smart home    [spa]
dc.subject    AAL    [spa]
dc.subject    Ambient assisted living    [spa]
dc.subject    Activity recognition    [spa]
dc.subject    Hardware    [spa]
dc.subject    Review    [spa]
dc.title    Hardware for recognition of human activities: a review of smart home and AAL related technologies    [spa]
dc.type    Artículo de revista    [spa]
dc.rights.accessrights    info:eu-repo/semantics/openAccess    [spa]
dc.identifier.doi    https://doi.org/10.3390/s20154227    [spa]
dc.identifier.instname    Corporación Universidad de la Costa    [spa]
dc.identifier.reponame    REDICUC - Repositorio CUC    [spa]
dc.identifier.repourl    https://repositorio.cuc.edu.co/    [spa]
dc.relation.references    70. Khan, M.A.A.H.; Roy, N.; Hossain, H.M.S. Wearable Sensor-Based Location-Specific Occupancy Detection in Smart Environments. Mob. Inf. Syst. 2018, 2018, 4570182.    [spa]
dc.relation.references    71. Iwasawa, Y.; Eguchi Yairi, I.; Matsuo, Y. Combining human action sensing of wheelchair users and machine learning for autonomous accessibility data collection. IEICE Trans. Inf. Syst. 2016, E99D, 1153–1161.    [spa]
dc.relation.references    72. Gupta, H.P.; Chudgar, H.S.; Mukherjee, S.; Dutta, T.; Sharma, K. A Continuous Hand Gestures Recognition Technique for Human-Machine Interaction Using Accelerometer and Gyroscope Sensors. IEEE Sens. J. 2016, 16, 6425–6432.    [spa]
dc.relation.references    73. Saha, J.; Chowdhury, C.; Biswas, S. Two phase ensemble classifier for smartphone based human activity recognition independent of hardware configuration and usage behaviour. Microsyst. Technol. 2018, 24, 2737–2752.    [spa]
dc.relation.references    74. Liu, Z.; Yin, J.; Li, J.; Wei, J.; Feng, Z. A new action recognition method by distinguishing ambiguous postures. Int. J. Adv. Robot. Syst. 2018, 15, 1–8.    [spa]
dc.relation.references    75. Yao, B.; Hagras, H.; Alghazzawi, D.; Alhaddad, M.J. A Big Bang–Big Crunch Type-2 Fuzzy Logic System for Machine-Vision-Based Event Detection and Summarization in Real-World Ambient-Assisted Living. IEEE Trans. Fuzzy Syst. 2016, 24, 1307–1319.    [spa]
dc.relation.references    76. Trindade, P.; Langensiepen, C.; Lee, K.; Adama, D.A.; Lotfi, A. Human activity learning for assistive robotics using a classifier ensemble. Soft Comput. 2018, 22, 7027–7039.    [spa]
dc.relation.references    77. Wang, S.; Chen, L.; Zhou, Z.; Sun, X.; Dong, J. Human fall detection in surveillance video based on PCANet. Multimed. Tools Appl. 2016, 75, 11603–11613.    [spa]
dc.relation.references    78. Eldib, M.; Deboeverie, F.; Philips, W.; Aghajan, H. Behavior analysis for elderly care using a network of low-resolution visual sensors. J. Electron. Imaging 2016, 25, 041003.    [spa]
dc.relation.references    79. Wickramasinghe, A.; Shinmoto Torres, R.L.; Ranasinghe, D.C. Recognition of falls using dense sensing in an ambient assisted living environment. Pervasive Mob. Comput. 2017, 34, 14–24.    [spa]
dc.relation.references    80. Chen, Z.; Wang, Y. Infrared–ultrasonic sensor fusion for support vector machine–based fall detection. J. Intell. Mater. Syst. Struct. 2018, 29, 2027–2039.    [spa]
dc.relation.references    81. Chen, Z.; Wang, Y.; Liu, H. Unobtrusive Sensor based Occupancy Facing Direction Detection and Tracking using Advanced Machine Learning Algorithms. IEEE Sens. J. 2018, 18, 1–1.    [spa]
dc.relation.references    82. Wang, J.; Zhang, X.; Gao, Q.; Feng, X.; Wang, H. Device-Free Simultaneous Wireless Localization and Activity Recognition With Wavelet Feature. IEEE Trans. Veh. Technol. 2017, 66, 1659–1669.    [spa]
dc.relation.references    83. Rus, S.; Grosse-Puppendahl, T.; Kuijper, A. Evaluating the recognition of bed postures using mutual capacitance sensing. J. Ambient Intell. Smart Environ. 2017, 9, 113–127.    [spa]
dc.relation.references    84. Cheng, A.L.; Georgoulas, C.; Bock, T. Fall Detection and Intervention based on Wireless Sensor Network Technologies. Autom. Constr. 2016, 71, 116–136.    [spa]
dc.relation.references    85. Hossain, H.M.S.; Khan, M.A.A.H.; Roy, N. Active learning enabled activity recognition. Pervasive Mob. Comput. 2017, 38, 312–330.    [spa]
dc.relation.references    86. Aziz Shah, S.; Ren, A.; Fan, D.; Zhang, Z.; Zhao, N.; Yang, X. Internet of Things for Sensing: A Case Study in the Healthcare System. Appl. Sci. 2018, 8, 1–16.    [spa]
dc.relation.references    87. Jiang, J.; Pozza, R.; Gunnarsdóttir, K.; Gilbert, N.; Moessner, K. Using Sensors to Study Home Activities. J. Sens. Actuator Netw. 2017, 6, 32.    [spa]
dc.relation.references    88. Luo, X.; Guan, Q.; Tan, H.; Gao, L.; Wang, Z.; Luo, X. Simultaneous Indoor Tracking and Activity Recognition Using Pyroelectric Infrared Sensors. Sensors 2017, 17, 1–18.    [spa]
dc.relation.references    89. Gill, S.; Seth, N.; Scheme, E. A multi-sensor matched filter approach to robust segmentation of assisted gait. Sensors 2018, 18, 16–23.    [spa]
dc.relation.references    90. Sasakawa, D. Human Posture Identification Using a MIMO Array. Electronics 2018, 7, 1–13.    [spa]
dc.relation.references    91. Suyama, T. A network-type brain machine interface to support activities of daily living. IEICE Trans. Commun. 2016, E99B, 1930–1937.    [spa]
dc.relation.references    92. Li, W.; Tan, B.; Piechocki, R. Passive Radar for Opportunistic Monitoring in E-Health Applications. IEEE J. Transl. Eng. Health Med. 2018, 6, 1–10.    [spa]
dc.type.coar    http://purl.org/coar/resource_type/c_6501    [spa]
dc.type.content    Text    [spa]
dc.type.driver    info:eu-repo/semantics/article    [spa]
dc.type.redcol    http://purl.org/redcol/resource_type/ART    [spa]
dc.type.version    info:eu-repo/semantics/acceptedVersion    [spa]
dc.type.coarversion    http://purl.org/coar/version/c_ab4af688f83e57aa    [spa]
dc.rights.coar    http://purl.org/coar/access_right/c_abf2    [spa]
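
The abstract above names bibliometric network analysis as the method used to surface dominant technology combinations and gaps. As a rough illustration only, and not the authors' actual pipeline, the Python sketch below builds such a co-occurrence network by counting how often pairs of technology tags appear together in the same paper; the paper list and tags are hypothetical placeholders.

    from collections import Counter
    from itertools import combinations

    # Hypothetical technology tags per reviewed paper; placeholder data,
    # not the corpus analyzed in the study.
    papers = [
        {"smartphones", "wearables"},
        {"video", "electronic components"},
        {"Wi-Fi", "electronic components"},
        {"wearables", "Wi-Fi"},
        {"assistive robots", "video"},
    ]

    # Each unordered pair of technologies found in the same paper adds one
    # unit of edge weight to the co-occurrence network.
    edges = Counter()
    for tags in papers:
        edges.update(combinations(sorted(tags), 2))

    # Heavy edges suggest dominant technology combinations; pairs that never
    # occur are candidate gaps of the kind the review highlights.
    for (a, b), weight in edges.most_common():
        print(f"{a} -- {b}: {weight}")

In a real bibliometric workflow, the tags would come from keyword exports of the WoS records, and the weighted edge list would feed a network visualization tool.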


Files in this item


This item appears in the following collection(s)

  • Artículos científicos [3154]
    Research articles published by members of the university community.

CC0 1.0 Universal
Except where otherwise noted, this item's license is described as CC0 1.0 Universal