Regular Articles
Volume: 5 | Article ID: jpi0146
Introducing CatchU™: A Novel Multisensory Tool for Assessing Patients’ Risk of Falling
DOI: 10.2352/J.Percept.Imaging.2022.5.000407 | Published Online: February 2022
Abstract

To date, only a few studies have investigated the clinical translational value of multisensory integration. Our previous research has linked the magnitude of visual-somatosensory integration (measured behaviorally using simple reaction time tasks) to important cognitive (attention) and motor (balance, gait, and falls) outcomes in healthy older adults. While multisensory integration effects have been measured across a wide array of populations using various sensory combinations and different neuroscience research approaches, multisensory integration tests have not been systematically implemented in clinical settings. We recently developed a step-by-step protocol for administering and calculating multisensory integration effects to facilitate innovative and novel translational research across diverse clinical populations and age ranges. Recognizing that patients with severe medical conditions and/or mobility limitations often experience difficulty traveling to research facilities or joining time-demanding research protocols, we deemed it necessary for patients to be able to benefit from multisensory testing. Using an established protocol and methodology, we developed a multisensory falls-screening tool called CatchU™ (an iPhone app) to quantify multisensory integration performance in clinical practice; the tool is currently undergoing validation studies. Our goal is to facilitate the identification of patients who are at increased risk of falls and to promote physician-initiated falls counseling during clinical visits (e.g., annual wellness, sick, or follow-up visits), thereby raising falls awareness and fostering physician efforts to alleviate disability, promote independence, and increase quality of life for older adults. This conceptual overview highlights the potential of multisensory integration in predicting clinical outcomes from a research perspective, while also showcasing the practical application of a multisensory screening tool in routine clinical practice.

  Cite this article 

Jeannette R. Mahoney, Claudene J. George, Joe Verghese, "Introducing CatchU™: A Novel Multisensory Tool for Assessing Patients’ Risk of Falling," in Journal of Perceptual Imaging, 2022, pp. 000407-1 - 000407-6, https://doi.org/10.2352/J.Percept.Imaging.2022.5.000407

  Copyright statement 
Copyright © Society for Imaging Science and Technology 2022
  Article timeline 
  • received February 2021
  • accepted September 2021
  • published February 2022

Preprint submitted to: Journal of Perceptual Imaging (J. Percept. Imaging), ISSN 2575-8144, Society for Imaging Science and Technology
1. Introduction
According to the Centers for Disease Control and Prevention (CDC), more than 25% of Americans over the age of 65 (∼16 million) experience a fall annually [3, 37]. In fact, over 3 million older Americans require an emergency room visit each year because of fall-related injuries, and individuals who fall once are likely to fall again [36, 38]. Falls are the leading cause of injury and injury-related death in older adults and are a significant burden to the U.S. healthcare system, with over $50 billion spent annually on non-fatal and fatal falls. The CDC recommends routine fall-risk screening at least annually; however, according to Sun & Sosnoff [32], screening is currently not systematically integrated into practice. Identified barriers to successful implementation of quantitative falls assessment in older adults include: (1) over-reliance on subjective measurements that are limited in scope and only modestly capture potential fall risk (e.g., patient responses to physician queries about fall history, balance/walking difficulty, and fear of falling); (2) lack of cost-effective technology that assesses falls; (3) the need for on-site testing; and (4) clinical time constraints for physicians and staff to administer and interpret test results.
While several functional mobility tests are readily available for clinical use, such as the Timed Up and Go (TUG) test [30], Performance-Oriented Mobility Assessments [33], the Berg Balance Test [2], FallSkip [29], and Sway [26], these assessments are unfortunately not yet systematically integrated into practice, and most require in-person testing and interpretation of results by a physician or trained staff. The lack of efficient quantitative screening tools and the failure to effectively raise falls awareness in older adults contribute to increased occurrence of falls, increased societal burden of high annual falls-related expenses, and, most importantly, decreased quality of life for our seniors. Thus, there is an unmet need for novel quantitative and research-based digital health screening tools that can raise falls awareness while striving to improve patient outcomes.
Falls are inherently complex on many levels. Aging presents additional challenges to the central nervous system by concurrently disrupting the functionality of cognitive, sensory, and motor systems [27]. Specifically, age-related visual and somatosensory impairments have been linked to slower gait [12], functional decline [13], increased risk of falls [5, 11, 14, 16], and worse quality of life [7]. Balance requires efficient interactions between musculoskeletal and sensory systems [31], which are compromised in aging [15, 38]. Moreover, poor balance is a major predictor of falls, which are the leading cause of injury and death in older Americans [38]. Our laboratory research reveals robust but differential multisensory integration effects (i.e., visual-somatosensory) in healthy aging, while highlighting important associations of visual-somatosensory integration with both cognitive (attention) and motor outcomes (balance, gait, falls) in aging [18–21, 23].
1.1 Multisensory Integration and its Significance in Aging
Multisensory integration, a rapidly growing field of neuroscience as demonstrated by a recent increase in publications and special interest topics, investigates the simultaneous processing of information from multiple sensory systems. Our brains are specifically designed to simultaneously process concurrent information from multiple sensory inputs to produce the most appropriate response to environmental cues [4]. Such responses are vital to functional independence in the real world, including successful completion of daily activities [4, 24, 34].
When simultaneous sensory inputs (e.g., visual and somatosensory) are presented, they combine in the brain by a non-linear process to yield faster responses than their unisensory constituents. Efficient sensory integration depends on intact feedback and feedforward neuronal loops between cortical (primary sensory regions, multisensory areas (e.g., superior temporal sulcus), and motor regions) and subcortical (thalamus) regions [28]. Cortico-cortical and cortico-thalamic loops required for intact multisensory integration and balance performance are notoriously compromised with aging. Unfortunately, multisensory integration processes have not been comprehensively examined, and their relation to clinical outcomes across diverse populations has been recognized as a major knowledge gap in the field [17, 24, 34]. The mission of our aging research is to address this knowledge gap and to demonstrate the clinical utility of multisensory integration processes in predicting cognitive and motor outcomes.
The efficiency of multisensory integration can be quantified using established probabilistic modeling of behavioral performance, such as reaction time and accuracy, to determine the magnitude of multisensory integration. This measure is operationalized as the area under the curve of the difference between actual and predicted cumulative distribution functions (CDFs) of reaction time data. In an effort to increase innovative translational multisensory projects, we have published a step-by-step tutorial for calculating the magnitude of multisensory integration [22]. Briefly, the laboratory paradigm consists of three experimental blocks where 45 unisensory visual, 45 unisensory somatosensory, and 45 visual-somatosensory trials are randomly presented with a random inter-trial-interval ranging from 1 to 3 s. Each stimulus is presented for 100 ms, and the participants are asked to press a foot pedal as soon as they feel, see, or feel and see any stimulation. The three experimental blocks are separated by 20-s rest blocks to reduce fatigue and enhance attention. Valid reaction time data (a maximum of 45 trials per condition) are collected, sorted in ascending order by condition, and binned into percentiles (typically in 5% increments) from fastest reaction time (0.00 percentile) to slowest reaction time (1.00 percentile). Next, reaction time data to the three experimental conditions are submitted to cumulative distribution functions (CDFs)—this provides the probability of a response occurring during any given percentile bin. The “predicted” CDF (the sum of the visual alone CDF and the somatosensory alone CDF, with an upper limit of 1) is subtracted from the “actual” CDF of the combined visual-somatosensory (i.e., multisensory) condition and the resulting difference wave is plotted. Positive values at any given latency (i.e., percentile bin) are indicative of successful multisensory integration and the area under the curve of these values can be calculated and used to determine the magnitude of multisensory integration (see [22] for detailed specifications). As an example, Figure 1 (adapted from Mahoney and colleagues [21]) depicts the cumulative probability difference (y-axis) between actual and predicted CDFs during percentile binned reaction time responses (x-axis). The overall study cohort (n = 345; black dashed trace) reveals positive values during the fastest tenth (0.00–0.10 percentile bins) of responses. Here, the area under the curve during the 0.00–0.10 percentiles (gray shaded box) represents the magnitude of visual-somatosensory integration (VSI) and higher area under the curve values signify greater ability to successfully integrate multisensory inputs.
Figure 1.
Cumulative probability difference wave: The difference wave between “actual” and “predicted” cumulative distribution functions over the trajectory of averaged responses for the total cohort (dashed trace) and for each of the three cognitive status groups (solid traces; adapted from [21]). Area under the curve during the positive portion of the difference wave (gray shaded box) represents the magnitude of visual-somatosensory integration, where higher area under the curve values indicate greater ability to successfully integrate multisensory inputs.
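As an illustration only, the short Python sketch below (using NumPy) walks through the computation just summarized: binning reaction times into a percentile grid, capping the sum of the two unisensory CDFs at 1 to form the "predicted" CDF, subtracting it from the "actual" multisensory CDF, and taking the area under the fastest portion of the difference wave. The function names, the shared latency grid, and the synthetic data are our own simplifications for exposition and do not reproduce the exact implementation published in [22].

```python
import numpy as np

def cdf_at_latencies(rts, latencies):
    """Empirical probability of having responded by each latency cut-off."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, latencies, side="right") / len(rts)

def vsi_magnitude(rt_visual, rt_somato, rt_multi, step=0.05, n_fast_bins=3):
    """Magnitude of visual-somatosensory integration: area under the
    actual-minus-predicted CDF difference wave over the fastest bins.

    By default the area is taken over the first three grid points
    (0.00, 0.05, 0.10), matching the 0.00-0.10 percentile window of Fig. 1.
    """
    n_bins = int(round(1.0 / step))
    probs = np.linspace(0.0, 1.0, n_bins + 1)            # 0.00, 0.05, ..., 1.00
    # Shared latency grid taken from the pooled RT distribution so the three
    # conditions can be compared bin by bin (a simplification of the protocol).
    latencies = np.quantile(np.concatenate([rt_visual, rt_somato, rt_multi]), probs)

    cdf_v = cdf_at_latencies(rt_visual, latencies)
    cdf_s = cdf_at_latencies(rt_somato, latencies)
    cdf_vs = cdf_at_latencies(rt_multi, latencies)       # "actual" multisensory CDF

    predicted = np.minimum(cdf_v + cdf_s, 1.0)           # sum of unisensory CDFs, capped at 1
    diff_wave = cdf_vs - predicted                       # positive values indicate integration

    seg = diff_wave[:n_fast_bins]
    auc = np.sum((seg[:-1] + seg[1:]) / 2.0) * step      # trapezoidal area over fastest bins
    return auc, diff_wave

# Example with synthetic reaction times (seconds):
rng = np.random.default_rng(0)
v, s = rng.normal(0.38, 0.05, 45), rng.normal(0.40, 0.05, 45)
vs = rng.normal(0.33, 0.05, 45)                          # faster multisensory responses
auc, wave = vsi_magnitude(v, s, vs)
print(f"VSI magnitude (AUC over fastest bins): {auc:.3f}")
```

A positive area indicates that responses to the combined stimulus exceeded what the summed unisensory CDFs would predict, i.e., successful multisensory integration as described above.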
In a series of laboratory studies examining the magnitude of visual-somatosensory integration (VSI) using the above-referenced methodology, we demonstrated differential VSI abilities across healthy older adults. These studies underscore the clinical importance of multisensory integration in aging: greater ability to integrate visual and somatosensory information was associated with a lower likelihood of falling. That is, the magnitude of VSI demonstrated incremental predictive validity for falls over balance and other known fall risk factors, suggesting that inefficient multisensory integration could contribute to falls via alternate pathways or mechanisms [18]. We also demonstrated that older adults with intact levels of VSI (area under the curve values > 0) show better balance [18] and faster gait velocity [23] compared to those with deficient levels of VSI (area under the curve values < 0). Our latest work reveals the mediating effect of mild cognitive impairment (MCI) and dementia on the association between magnitude of multisensory integration and mobility measures including balance and gait [21]. Referring back to Fig. 1, note the differential multisensory integration effects: when the overall group difference wave (black dashed trace) was parsed by participants’ cognitive status (normal n = 293, solid light gray trace; MCI n = 40, solid dark gray trace; dementia n = 12, solid black trace), the magnitude of VSI was significantly reduced in older adults with MCI and dementia [21]. In this study, we showed that cognitive status mediates the relationship between magnitude of multisensory integration and mobility outcomes, where those with cognitive impairments demonstrated worse multisensory integration and slower gait/worse balance, which increases their risk for falls. Further, the results indicate that magnitude of VSI was specifically associated with attention-based performance (i.e., Attention Index) on the Repeatable Battery for the Assessment of Neuropsychological Status [21].
But why should cognitive status affect multisensory functioning? Many studies have indicated the critical role of the prefrontal cortex (PFC) in maintaining successful gait and cognition [1]. Studies in primates and young adults also reveal that flexible multisensory integration processes are regulated by specific areas in the PFC, including but not limited to dorsomedial and ventrolateral regions [6, 10]. There is good reason to suspect that impairments in cognition adversely affect the association between magnitude of multisensory integration and mobility measures in aging because: (1) flexible multisensory processing in young adults appears to be regulated by the PFC [6, 10]; (2) selective attention processes modulate multisensory integration in aging [9, 25]; and (3) disruptions in executive attention and cognition in aging compromise both mobility and multisensory integration processes [8, 21, 35]. However, future studies are still needed to pinpoint the exact overlapping neural circuits involved in (multi)sensory, cognitive, and motor functioning in both healthy and impaired older adults.
2. Translating Research into Clinical Practice
While multisensory integration effects have been measured across a wide array of populations using various sensory combinations and different neuroscience research approaches, multisensory integration tests have not been systematically implemented in clinical settings. Though the significance of uncovering the clinical translational value of multisensory integration processes has been recognized [17, 24, 34], relatively few studies have investigated the utility of clinical multisensory tools. Our method for quantifying multisensory interactions demonstrates clear clinical-translational value with regard to predicting motor outcomes like falls. Using our research as a solid foundation and our patent-pending system and methods for testing multisensory integration effects (U.S. Provisional Application No: 62/908,180; U.S. Non-Provisional Application No: 17038974), we have developed an innovative and quantitative iPhone-based multisensory reaction time assessment called CatchU™. The main objective of CatchU™ is to facilitate the identification of patients who are at increased risk of falls and promote the initiation of interventions aimed at reducing falls.
The impetus for creating CatchU™… Before You Fall was to alleviate disability, promote independence, and increase quality of life for our older adults. CatchU™ is a quick (<10 min) multisensory mobile reaction time assessment tool that older adults can complete in the comfort of their own home, residential community, or medical provider’s waiting room. The assessment follows the same experimental design described above. Specifically, patients will be asked to complete the simple reaction time test employing three sensory conditions (visual alone, somatosensory alone, and combined visual-somatosensory; see the iPhone in Figure 2 for an example of the visual stimulus (*) and somatosensory stimulus (vibration)). The addition of 45 control (i.e., “catch”) trials, where no stimulation is presented and no response is expected, affords monitoring of attentional performance throughout the assessment. Patients will be instructed to respond to each stimulus as quickly as possible by pressing a designated response space on the iPhone touchscreen (see the gray “Click Here” response area on the iPhone in Fig. 2) with either their left or right thumb. CatchU™ has been developed for iPhone and will be available through the iOS App Store.
Figure 2.
Introducing CatchU™… Before You Fall: CatchU™ is a quick and accessible mobile multisensory falls-screening tool that is based on over 15 years of multisensory research. Figure 2 depicts the look and feel of the CatchU™ app on an iPhone. Patients will be asked to complete this simple reaction time test by keeping their eyes fixated on the cross, and pressing the gray response area (i.e., “Click Here”) as soon as they see, feel, or see and feel any stimulation. Visual stimulation is presented here as asterisks displayed on the iPhone screen. Somatosensory stimulation is a vibration from Apple’s Taptic Engine. The visual and somatosensory stimulation can occur in isolation or concurrently as in the case of the visual-somatosensory stimulation condition.
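For readers who want a concrete picture of the task structure just described (45 trials per sensory condition plus 45 catch trials, 100 ms stimuli, and random 1–3 s inter-trial intervals), the brief Python sketch below generates one possible randomized trial schedule. It is a hypothetical illustration; the names and data structures are ours and do not represent the actual CatchU™ iOS implementation.

```python
import random
from dataclasses import dataclass

CONDITIONS = ("visual", "somatosensory", "visual-somatosensory", "catch")
TRIALS_PER_CONDITION = 45          # as in the protocol described above
STIMULUS_DURATION_MS = 100         # each stimulus is presented for 100 ms
ITI_RANGE_S = (1.0, 3.0)           # random inter-trial interval, 1-3 s

@dataclass
class Trial:
    condition: str                 # which stimulus is presented (or none, for catch trials)
    stimulus_ms: int               # stimulus duration; 0 for catch trials
    iti_s: float                   # delay before the next trial begins

def build_trial_schedule(seed=None):
    """Return a randomized schedule of 4 x 45 = 180 trials.

    Catch trials present no stimulus and expect no response, so they can be
    used to monitor attentional performance across the session.
    """
    rng = random.Random(seed)
    trials = [
        Trial(
            condition=cond,
            stimulus_ms=0 if cond == "catch" else STIMULUS_DURATION_MS,
            iti_s=rng.uniform(*ITI_RANGE_S),
        )
        for cond in CONDITIONS
        for _ in range(TRIALS_PER_CONDITION)
    ]
    rng.shuffle(trials)            # conditions are randomly interleaved
    return trials

# Example: inspect the first few trials of a reproducible schedule.
if __name__ == "__main__":
    for t in build_trial_schedule(seed=7)[:5]:
        print(t)
```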
3. Current Validation Studies
Our goal is to provide a standardized and mobile multisensory screening test that is quick, easy, affordable, and accessible. However, several studies are currently underway to validate that CatchU™ predicts falls as accurately as our laboratory apparatus, given that the look and feel of the assessment on an iPhone is inherently different from the established laboratory experimental setup and apparatus. Some of these alterations, necessary to move from a clunky and expensive laboratory apparatus to a mobile and accessible iPhone, include differences in the visual and somatosensory stimulators, the inclusion of an iPhone display, and a change in response modality from foot-pedal presses to finger presses on a touchscreen.
It is well known that reaction times differ based on the specifications of the visual and somatosensory stimulators employed. In fact, such reaction time differences have been captured in our laboratory experiments over the years when using different visual inputs (LED lights versus asterisks presented on computer monitors) and different somatosensory inputs (electric square-wave pulses versus pager vibrators versus pneumatic pulses). CatchU™ also requires finger responses, as opposed to the foot responses utilized in our laboratory experimental protocols. Thus, in order to claim that CatchU™ taps into the same visual-somatosensory integration processes that are predictive of falls as reported using our laboratory apparatus, several validation studies are currently being performed. The goal of these validation studies is twofold: (1) to determine the relationship of visual-somatosensory integration processes (collected through CatchU™) with history of falls in the past year (baseline) and incident falls over a 12-month study period (collected bimonthly through telephone interviews) using Cox proportional hazards models; and (2) to determine whether visual-somatosensory integration effects from simple foot reaction times collected in the laboratory are translatable to visual-somatosensory integration effects from simple finger reaction times on an iPhone. We aim to demonstrate acceptable-to-excellent predictive accuracy (0.70–0.90 area under the receiver-operating characteristic (ROC) curve) of CatchU™ for identifying individuals at risk for falls. If finger-initiated visual-somatosensory integration effects fail to predict falls, additional validation studies will be implemented using Bluetooth foot-pedal response pads.
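To make the two validation aims concrete, the sketch below shows how such analyses are commonly run with open-source Python tools (lifelines for the Cox proportional hazards model and scikit-learn for the ROC area under the curve). The data frame, column names, and values are hypothetical placeholders rather than study data, and the actual validation analyses may differ.

```python
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.metrics import roc_auc_score

# Hypothetical per-participant data (illustrative values only): VSI magnitude
# measured with CatchU, months of follow-up to first incident fall or censoring,
# an incident-fall indicator, and self-reported fall history at baseline.
df = pd.DataFrame({
    "vsi_magnitude":   [0.12, -0.03, 0.02, 0.01, -0.05, 0.15],
    "age":             [72, 81, 76, 79, 84, 70],
    "months_followed": [12, 4, 12, 9, 6, 12],
    "incident_fall":   [0, 1, 0, 1, 1, 0],
    "baseline_faller": [0, 1, 0, 1, 0, 0],
})

# (1) Incident falls over the 12-month follow-up: Cox proportional hazards model
# with VSI magnitude (and a covariate) as predictors. A small penalizer keeps the
# fit stable on this tiny illustrative sample.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df[["months_followed", "incident_fall", "vsi_magnitude", "age"]],
        duration_col="months_followed", event_col="incident_fall")
cph.print_summary()

# (2) Discrimination of baseline fall history: area under the ROC curve, with the
# 0.70-0.90 range noted above as the target. Lower VSI is expected to signal
# higher fall risk, so the score is negated before scoring.
auc = roc_auc_score(df["baseline_faller"], -df["vsi_magnitude"])
print(f"ROC AUC for baseline fall history: {auc:.2f}")
```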
4. Future Directions
Much like the cognitive screening test newly implemented during annual wellness visits for adults aged 65 and over, we will propose that older adults receive a CatchU™ assessment to evaluate multisensory integration performance and fall vulnerability. Once a CatchU™ assessment is completed, the ordering physician will receive an email with their patient’s multisensory integration results and a general impression. Based on our research findings and results from our validation studies, patients with poor multisensory integration performance will likely be at higher risk for falls and other mobility impairments.
The CatchU™ report will be designed to provide physicians with convenient access to current CDC guidelines, as well as tailored recommendations that may propose inclusion of falls counseling, health education, and access to other home-based clinical health services like physical therapy and home health safety services to help mitigate future falls for their patient. This report will be delivered electronically to the ordering physician and will be available in the patient portal that is currently being developed. All patients will receive tailored recommendations based on their CatchU™ results as well as their specific endorsed medical co-morbidities. Currently, no specific intervention will be required, but as our research advances, we hope to be able to include more specific multisensory recommendations in the tailored patient report.
We believe that by providing physicians with easy-to-interpret results and recommendations at their fingertips, patients will be made more aware of the potential hazards associated with falls and become more cognizant of their surroundings, which will ultimately provide a greater sense of safety in their own homes. We are working to ensure that older adults receive a CatchU™ assessment 1–2 times per year, since falls are already considered a priority area for examination. However, a CatchU™ assessment can be ordered at any time to facilitate identification of patients who are at increased risk of falls and promote physician-initiated falls counseling. Falls counseling effectively reduces falls in seniors; it simply needs to be integrated systematically into clinical practice, and we believe that inclusion of CatchU™ assessments will help streamline this process.
5. Clinical Implications
Examining the facilitative benefit of multisensory information processing in older adults could have important clinical and public health implications. These include potentially providing insight into the cognitive and physical attributes of the aging process, affording an understanding of the biological basis of aging, and subsequently aiding in the identification of opportunities to introduce sensory, cognitive, and physical remediation programs to older adults. We believe that optimizing integration of visual-somatosensory inputs may ultimately provide the framework for successful interventions that will reduce falls, improve mobility while alleviating disability, and help maintain functional independence in older adults. Moreover, implementation of CatchU™ throughout the United States will afford acquisition of large datasets that can also inform development of such future multisensory-based interventions.
While our primary research efforts focus on healthy aging, we have plans to expand to other advanced aging disease populations at increased risk for falls including, but not limited to, Alzheimer’s disease, diabetes, and HIV. Given the proposed overlap in neural circuitry associated with sensory, motor, and cognitive functioning, we believe that there may also be an opportunity to raise falls-awareness in patients with other sensory (e.g., Autism, Sensory Processing Disorder) and motor (e.g., Parkinson’s disease) disorders.
6. Conclusions
Mobility requires efficient interaction of musculoskeletal and sensory systems (especially visual, somatosensory, and vestibular) to control everyday movements; these systems are compromised in aging and linked to cognitive status. Here, we introduce a novel research-based, clinical-translational multisensory assessment tool, currently undergoing validation studies, to identify older adults at risk for falls. It is our hope that implementation of this product will lead to increased awareness of falls, increased access to preventative care measures, decreased societal burden of annual falls-related expenses, and an influx of knowledge that will aid in the development of future multisensory-based interventions aimed at alleviating disability, enhancing quality of life, and maintaining functional independence in older adults.
Financial Disclosure
This work was supported by the National Institute on Aging at the National Institutes of Health (K01AG049813 to J.R.M., R01AG044007 to J.V., and R01AG036921 to Dr. Roee Holtzer). Additional funding was provided by the Resnick Gerontology Center of the Albert Einstein College of Medicine.
Disclosures
JRM has a financial interest in JET Worldwide Enterprises Inc., a digital health startup spun out of research conducted at Albert Einstein College of Medicine. An exclusive license to the intellectual property for CatchU™ was acquired from Albert Einstein College of Medicine in July 2021.
Acknowledgments
Special thanks to our amazing CatchU™ team, especially Lori Lonczak, RPh, MBA and Bob Kotch, MBA, MSEE for all of their hard work and support. CatchU™ is dedicated to Dr. Mahoney’s grandmother Jean Sisinni (1930–2021) for her never-ending love, strength, support, guidance, and selflessness.
References
1. Beauchet O., Allali G., Annweiler C., Verghese J., "Association of motoric cognitive risk syndrome with brain volumes: results from the GAIT study," J. Gerontol. A Biol. Sci. Med. Sci. 71, 1081–1088 (2016). doi:10.1093/gerona/glw012
2. Berg K. O., Wood-Dauphinee S. L., Williams J. I., Maki B., "Measuring balance in the elderly: validation of an instrument," Can. J. Public Health 83, S7–S11 (1992).
3. Bergen G., Stevens M. R., Burns E. R., "Falls and fall injuries among adults aged ≥65 years - United States, 2014," MMWR Morb. Mortal. Wkly. Rep. 65, 993–998 (2016). doi:10.15585/mmwr.mm6537a2
4. Calvert G. A., Spence C., Stein B. E. (eds.), The Handbook of Multisensory Processes (The MIT Press, Cambridge, Massachusetts, 2004).
5. Camicioli R., Panzer V. P., Kaye J., "Balance in the healthy elderly: posturography and clinical assessment," Arch. Neurol. 54, 976–981 (1997). doi:10.1001/archneur.1997.00550200040008
6. Cao Y., Summerfield C., Park H., Giordano B. L., Kayser C., "Causal inference in the multisensory brain," Neuron 102, 1076–1087.e8 (2019). doi:10.1016/j.neuron.2019.03.043
7. Carabellese C., Appollonio I., Rozzini R., Bianchetti A., Frisoni G. B., Frattola L., Trabucchi M., "Sensory impairment and quality of life in a community elderly population," J. Am. Geriatrics Soc. 41, 401–407 (1993). doi:10.1111/j.1532-5415.1993.tb06948.x
8. Holtzer R., Wang C., Verghese J., "The relationship between attention and gait in aging: facts and fallacies," Motor Control 16, 64–80 (2012). doi:10.1123/mcj.16.1.64
9. Hugenschmidt C. E., Mozolic J. L., Laurienti P. J., "Suppression of multisensory integration by modality-specific attention in aging," Neuroreport 20, 349–353 (2009). doi:10.1097/WNR.0b013e328323ab07
10. Jones E. G., Powell T. P., "An anatomical study of converging sensory pathways within the cerebral cortex of the monkey," Brain: J. Neurol. 93, 793–820 (1970). doi:10.1093/brain/93.4.793
11. Judge J. O., King M. B., Whipple R., Clive J., Wolfson L. I., "Dynamic balance in older persons: effects of reduced visual and proprioceptive input," J. Gerontol. A Biol. Sci. Med. Sci. 50, M263–M270 (1995). doi:10.1093/gerona/50A.5.M263
12. Kaye J. A., Oken B. S., Howieson D. B., Howieson J., Holm L. A., Dennison K., "Neurologic evaluation of the optimally healthy oldest old," Arch. Neurol. 51, 1205–1211 (1994). doi:10.1001/archneur.1994.00540240049015
13. LaForge R. G., Spector W. D., Sternberg J., "The relationship of vision and hearing impairment to one-year mortality and functional decline," J. Aging Health 4, 126–148 (1992). doi:10.1177/089826439200400108
14. Lord S. R., Rogers M. W., Howland A., Fitzpatrick R., "Lateral stability, sensorimotor function and falls in older people," J. Am. Geriatrics Soc. 47, 1077–1081 (1999). doi:10.1111/j.1532-5415.1999.tb05230.x
15. Lord S., Sherrington C., Menz H., Close J. C., Falls in Older People: Risk Factors and Strategies for Prevention (Cambridge University Press, Cambridge, 2007).
16. Lord S. R., Ward J. A., "Age-associated differences in sensori-motor function and balance in community dwelling women," Age Ageing 23, 452–460 (1994). doi:10.1093/ageing/23.6.452
17. Mahoney J. R., Barnett-Cowan M., "Introduction to the special issue on multisensory processing and aging (Part II): links to clinically meaningful outcomes," Multisens. Res. 32, 665–670 (2019). doi:10.1163/22134808-20191509
18. Mahoney J. R., Cotton K., Verghese J., "Multisensory integration predicts balance and falls in older adults," J. Gerontol. A Biol. Sci. Med. Sci. 74, 1429–1435 (2019). doi:10.1093/gerona/gly245
19. Mahoney J. R., Dumas K., Holtzer R., "Visual-somatosensory integration is linked to physical activity level in older adults," Multisens. Res. 28, 11–29 (2015). doi:10.1163/22134808-00002470
20. Mahoney J. R., Holtzer R., Verghese J., "Visual-somatosensory integration and balance: evidence for psychophysical integrative differences in aging," Multisens. Res. 27, 17–42 (2014). doi:10.1163/22134808-00002444
21. Mahoney J. R., Verghese J., "Does cognitive impairment influence visual-somatosensory integration and mobility in older adults?," J. Gerontol. A Biol. Sci. Med. Sci. 75, 581–588 (2020). doi:10.1093/gerona/glz117
22. Mahoney J. R., Verghese J., "Using the race model inequality to quantify behavioral multisensory integration effects," J. Vis. Exp. (2019).
23. Mahoney J. R., Verghese J., "Visual-somatosensory integration and quantitative gait performance in aging," Front. Aging Neurosci. 10, 377 (2018). doi:10.3389/fnagi.2018.00377
24. Meyer G. F., Noppeney U., "Multisensory integration: from fundamental principles to translational research," Exp. Brain Res. 213, 163–166 (2011). doi:10.1007/s00221-011-2803-z
25. Mozolic J. L., Hugenschmidt C. E., Peiffer A. M., Laurienti P. J., "Multisensory integration and aging," in Murray M. M., Wallace M. T. (eds.), The Neural Bases of Multisensory Processes (CRC Press/Taylor & Francis LLC, Boca Raton, FL, 2012).
26. Mummareddy N., Brett B. L., Yengo-Kahn A. M., Solomon G. S., Zuckerman S. L., "Sway balance mobile application: reliability, acclimation, and baseline administration," Clin. J. Sport Med. 30, 451–457 (2020).
27. Saxton S. V., Etten M. J., Perkins E. A., Physical Change & Aging: A Guide for the Helping Professions, 5th ed. (Springer Publishing Company, New York, NY, 2010).
28. Schroeder C. E., Foxe J. J., "Multisensory convergence in early cortical processing," in Calvert G. A., Spence C., Stein B. E. (eds.), The Handbook of Multisensory Processes, 295–309 (MIT Press, Massachusetts, 2004).
29. Serra-Añó P., Pedrero-Sánchez J. F., Inglés M., Aguilar-Rodríguez M., Vargas-Villanueva I., López-Pascual J., "Assessment of functional activities in individuals with Parkinson’s disease using a simple and reliable smartphone-based procedure," Int. J. Environ. Res. Public Health 17, 4123 (2020). doi:10.3390/ijerph17114123
30. Shumway-Cook A., Brauer S., Woollacott M., "Predicting the probability for falls in community-dwelling older adults using the timed up & go test," Phys. Ther. 80, 896–903 (2000). doi:10.1093/ptj/80.9.896
31. Shumway-Cook A., Woollacott M., Motor Control, 4th ed. (Lippincott Williams and Wilkins, New York, NY, 2012).
32. Sun R., Sosnoff J. J., "Novel sensing technology in fall risk assessment in older adults: a systematic review," BMC Geriatr. 18, 14 (2018). doi:10.1186/s12877-018-0706-6
33. Tinetti M. E., "Performance-oriented assessment of mobility problems in elderly patients," J. Am. Geriatrics Soc. 34, 119–126 (1986). doi:10.1111/j.1532-5415.1986.tb05480.x
34. Wallace M. T., Stein B. E., "The impact of multisensory alterations in human developmental disabilities and disease: the tip of the iceberg?," in The New Handbook of Multisensory Processing, 645–656 (The MIT Press, Cambridge, MA, 2012).
35. Yogev-Seligmann G., Hausdorff J. M., Giladi N., "The role of executive function and attention in gait," Mov. Disord. 23, 329–342; quiz 472 (2008). doi:10.1002/mds.21720
37. CDC, Facts about Falls. Available from: https://www.cdc.gov/falls/facts.html
38. Centers for Disease Control and Prevention, Vision, Hearing, Balance, and Sensory Impairment in Americans Aged 70 Years and Over: United States, 1999–2006. http://www.cdc.gov/nchs/products/databriefs/db31.htm