ERIC Number: ED582287
Record Type: Non-Journal
Publication Date: 2017
Pages: 279
Abstractor: As Provided
ISBN: 978-0-3555-0492-7
ISSN: N/A
EISSN: N/A
Mobile Devices as Platforms for Estimation, Control, and Enhanced Interaction with Physical Systems
Frank, Jared Alan
ProQuest LLC, Ph.D. Dissertation, New York University Tandon School of Engineering
In just a short time, mobile devices have revolutionized the way we access information and interact with each other. With an ever-expanding list of sensors and features, these devices are also capable of reshaping our experiences with physical systems. Prior efforts to explore this potential of mobile devices have often considered traditional user interfaces, leaving many of the devices' capabilities untapped. In contrast, we consider the development and evaluation of novel systems that integrate traditional engineering platforms with mobile sensing and computation. Such systems are classified as "mobile cyber-physical systems," as the physical behavior of platforms and the performance of integrated mobile technologies become coupled. By shifting sensing and computation onto mobile devices, the systems have distinct affordances, such as platforms with reduced cost and complexity. Furthermore, by employing visual sensing and interactive graphics on the device, mobile mixed-reality interfaces are designed that yield intuitive and engaging user experiences. However, limitations in mobile sensing and computation introduce challenges in implementation. Thus, a series of experiments and user studies are conducted to characterize the performance needs of such systems and to formulate solutions that address them. First, we investigate how mobile devices enable the design of intuitive metaphors for mapping user interactions with devices to commands for operating physical systems. Then, we explore the potential of mobile devices to aid in the estimation and control of systems built from laboratory test-beds modified according to two fundamentally different approaches. In the first approach, smartphones are mounted directly to test-beds to facilitate inertial- and/or vision-based measurement and control.
In the second approach, tablets are held such that their rear-facing cameras allow for vision-based measurement and control of test-beds, as well as delivery of mobile mixed-reality interfaces for interacting with them. Experiments validate the feasibility of the two approaches and uncover the factors that impact stability and performance. Moreover, we examine the use of the developed systems as educational platforms that engage learners in new and interactive ways. Results of user studies indicate that the utilization of mobile technologies creates opportunities to deliver learning experiences that benefit from access to both concrete physical models and interactive visualizations of concepts, and permits the development of more portable, affordable, and engaging platforms for science and engineering education. Finally, we explore the application of the mobile mixed-reality approach to interactions in shared spaces with various robotic systems that have limited perception and computational power. Results of experiments in which participants command a robot to manipulate objects indicate that mobile mixed reality provides acceptable performance and user experiences compared to conventional approaches, without the need to install sophisticated hardware on or around the robot. A novel vision-based control approach is presented in which the state of a mobile robot is coupled with that of an interactive virtual representation of the robot maintained by a mobile interface. Results of experiments in which participants generate paths and maps for mobile robots to use in planning and navigation show that robots can be driven along desired paths either drawn directly by the user or generated by an algorithm developed to correct user-drawn paths that cause collisions. Then, a technique is presented that uses both a mobile device's visual and inertial data to track, control, and interact with swarms of small, simple mobile robots that have no sensors.
The dissertation concludes by exploring the use of the device's sensor data and a single robot, with or without proprioceptive sensing, to estimate the relative pose between them to support effective and safe collaborations in shared spaces. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone: 1-800-521-0600. Web page: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml.]
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A