EgoActive: integrated wireless wearable sensors for capturing infant egocentric auditory–visual statistics and autonomic nervous system function ‘in the wild’

Geangu, Elena, Smith, William A. P., Mason, Harry T., Martinez-Cedillo, Astrid Priscilla, Hunter, David, Knight, Marina I., Liang, Haipeng, Garcia de Soria Bazan, Maria del Carmen, Tse, Zion Tsz Ho, Rowland, Thomas, Corpuz, Dom, Hunter, Josh, Singh, Nishant, Vuong, Quoc C., Abdelgayed, Mona Ragab Sayed, Mullineaux, David R., Smith, Stephen and Muller, Bruce R. (2023) EgoActive: integrated wireless wearable sensors for capturing infant egocentric auditory–visual statistics and autonomic nervous system function ‘in the wild’. Sensors, 23 (18). p. 7930. ISSN 1424-8220

Text: [Sensors] EgoActive_Integrated Wireless Wearable Sensors for Capturing Infant Egocentric Auditory–Visual Statistics and Autonomic Nervous System Function ‘in the Wild.pdf - Published Version (36MB)
Available under License Creative Commons Attribution 4.0.
Official URL: http://doi.org/10.3390/s23187930

Abstract / Description

There have been sustained efforts toward using naturalistic methods in developmental science to measure infant behaviors in the real world from an egocentric perspective, because statistical regularities in the environment can shape and be shaped by the developing infant. However, there is no user-friendly and unobtrusive technology to densely and reliably sample life in the wild. To address this gap, we present the design, implementation, and validation of the EgoActive platform, which addresses limitations of existing wearable technologies for developmental research. EgoActive records the active infant’s egocentric perspective of the world via a miniature wireless head-mounted camera, concurrently with their physiological responses to this input via a lightweight, wireless ECG/acceleration sensor. We also provide software tools to facilitate data analyses. Our validation studies showed that the cameras and body sensors performed well. Families also reported that the platform was comfortable, easy to use and operate, and did not interfere with daily activities. The synchronized multimodal data from the EgoActive platform can help tease apart complex processes that are important for child development, furthering our understanding of areas ranging from executive function to emotion processing and social learning.
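The abstract notes that the platform yields synchronized multimodal data (head-mounted video plus ECG/acceleration). One common first step with such data is aligning physiological samples to video frames by timestamp. The sketch below is purely illustrative — the sample rates, field layout, and nearest-sample strategy are assumptions, not the EgoActive software described in the paper.

```python
import bisect

# Hypothetical alignment step: for each egocentric video frame, find the
# index of the nearest ECG sample by timestamp. Both timestamp lists are
# assumed to be in seconds on a shared clock, sorted ascending.
def nearest_ecg_indices(frame_times, ecg_times):
    indices = []
    for t in frame_times:
        i = bisect.bisect_left(ecg_times, t)
        # Compare the neighbours on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(ecg_times)]
        indices.append(min(candidates, key=lambda j: abs(ecg_times[j] - t)))
    return indices

# Example: 30 fps video frames against 250 Hz ECG samples (assumed rates).
frames = [k / 30.0 for k in range(5)]    # 0.000 s, 0.033 s, ...
ecg = [k / 250.0 for k in range(100)]    # 0.000 s, 0.004 s, ...
print(nearest_ecg_indices(frames, ecg))  # -> [0, 8, 17, 25, 33]
```

A nearest-sample lookup like this is only a starting point; real pipelines typically also correct clock drift between the independent wireless sensors before alignment.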

Item Type: Article
Additional Information: Institutional Review Board Statement: The studies presented in this paper were conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of the Department of Psychology, University of York, for studies involving humans. Informed Consent Statement: Informed consent was obtained from all subjects involved in this study. Written informed consent has been obtained from the individuals appearing in the images. Data Availability Statement: All data, hardware design and software will be made available upon reasonable request sent to the corresponding author (E.G.). Acknowledgments: We would like to express our gratitude to all families who dedicated their time to participate in the validation studies. Without their continued interest in our research and desire to help, these findings would not have been possible. The authors would also like to thank Brigita Ceponyte, Emily Clayton, Fiona Frame, Laura Tissiman, Marc Green, and Anna Childs for their invaluable support and help throughout the project. Conflicts of Interest: The authors declare no conflict of interest. -- The work presented in this manuscript received funding from the Wellcome Leap, the 1 kD Program.
Uncontrolled Keywords: infant; child; wearable sensors; egocentric view; head-mounted camera; ECG; body movement; naturalistic research methods; real-world big data; multimodal measures
Subjects: 000 Computer science, information & general works
100 Philosophy & psychology > 150 Psychology
Department: School of Computing and Digital Media
Depositing User: Mona Abdelgayed
Date Deposited: 11 Dec 2023 12:57
Last Modified: 15 Dec 2023 09:39
URI: https://repository.londonmet.ac.uk/id/eprint/8976
