Show simple item record

dc.creator: Sanchez-Comas, Andres
dc.creator: Synnes, Kåre
dc.creator: Hallberg, Josef
dc.date.accessioned: 2020-09-11T19:06:55Z
dc.date.available: 2020-09-11T19:06:55Z
dc.date.issued: 2018
dc.identifier.issn: 1424-3210
dc.identifier.issn: 1424-8220
dc.identifier.uri: https://hdl.handle.net/11323/7091
dc.description.abstract: Activity recognition (AR) from an applied perspective of ambient assisted living (AAL) and smart homes (SH) has become a subject of great interest. Promising a better quality of life, AR applied in contexts such as health, security, and energy consumption can lead to solutions capable of reaching even the people most in need. This study was strongly motivated because the levels of development, deployment, and technology of AR solutions transferred to society and industry rest not only on software development but also on the hardware devices used. The current paper identifies contributions to hardware used for activity recognition through a review of the scientific literature in the Web of Science (WoS) database. This work found four dominant groups of technologies used for AR in SH and AAL—smartphones, wearables, video, and electronic components—and two emerging technologies: Wi-Fi and assistive robots. Many of these technologies overlap across research works. Through bibliometric network analysis, the present review identified gaps and new potential combinations of technologies for advances in this emerging worldwide field. The review also relates the use of these six technologies to applications in health conditions, health care, emotion recognition, occupancy, mobility, posture recognition, localization, fall detection, and generic activity recognition. The above can serve as a road map that allows readers to execute approachable projects, deploy applications in different socioeconomic contexts, and establish networks with the community involved in this topic. This analysis shows that the activity recognition research field accepts that specific goals cannot be achieved with a single hardware technology but can be achieved with joint solutions; this paper shows how such technologies work together in this regard.
dc.language.iso: eng
dc.publisher: Corporación Universidad de la Costa
dc.relation.ispartof: https://www.mdpi.com/1424-8220/20/15/4227
dc.rights: CC0 1.0 Universal
dc.rights.uri: http://creativecommons.org/publicdomain/zero/1.0/
dc.source: Sensors
dc.subject: Smart home
dc.subject: AAL
dc.subject: Ambient assisted living
dc.subject: Activity recognition
dc.subject: Hardware
dc.subject: Review
dc.title: Hardware for recognition of human activities: a review of smart home and AAL related technologies
dc.type: article
dc.type.hasVersion: info:eu-repo/semantics/publishedVersion
dc.rights.accessrights: info:eu-repo/semantics/openAccess
dc.identifier.doi: https://doi.org/10.3390/s20154227


