California Polytechnic University, California
April 10, 2025
April 12, 2025
10.18260/1-2--55168
https://peer.asee.org/55168
We intend to follow up with a full paper and presentation. Our team is continuing development on the Obstacle Avoidance Senior Capstone project, a mobile application designed to empower visually impaired users by enhancing their ability to navigate unfamiliar spaces autonomously, safely, and with confidence. The project has evolved into a supplementary tool, used alongside a guide dog or white cane, that provides real-time obstacle recognition and feedback using Apple's assistive technologies. We aim to give users detailed feedback, including the identity, distance, and angular position of approaching obstacles, reinforced through haptic feedback and spatialized audio.

Throughout this project we have had an unstructured yet supportive development environment that has allowed us to explore new and innovative avenues we might otherwise have avoided. Our professors have introduced us to a wealth of new topics and industry practices, and while ensuring we had every tool to succeed, they challenged us to take ownership of our ideas, solutions, and development processes.

To ensure user-centered design, we have been working alongside community members such as Dr. Ronald Peterson, an advocate and member of the San Diego blind community, to gather critical insights into users' needs and to better understand how we can offer support using new technologies. Under his guidance we have designed the application to work with technologies such as Apple's VoiceOver for ease of use. This partnership has focused our development on a system that prioritizes safety, autonomy, and reliability for the visually impaired, addressing challenges such as navigating urban environments and avoiding potential hazards.
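As a rough illustration of how distance and angular position might drive the feedback modalities described above, consider the following sketch. It is written in Python for readability (the app itself targets Apple's platforms), and the function names, field-of-view value, and range cutoff are hypothetical choices, not the project's actual implementation:

```python
def pan_from_angle(angle_deg: float, field_of_view_deg: float = 60.0) -> float:
    """Map an obstacle's horizontal angle (negative = left of the user)
    to a stereo pan value in [-1.0, 1.0] for a spatialized audio cue."""
    half_fov = field_of_view_deg / 2.0
    return max(-1.0, min(1.0, angle_deg / half_fov))

def haptic_intensity(distance_m: float, max_range_m: float = 5.0) -> float:
    """Closer obstacles produce stronger haptic pulses; beyond the
    assumed maximum range the intensity falls to zero."""
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - distance_m / max_range_m
```

The key design point is that both cues degrade gracefully: an obstacle straight ahead pans to center, and one at the edge of range produces only a faint pulse, so users are not overwhelmed by distant or peripheral detections.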
This connection was made possible by the support of the faculty running the Senior Capstone course and by the discourse established by previous teams who have gone through the experience. During our planning and design phase we conducted extensive research and wrote user stories to identify the key features and requirements we will address over the next semester, under the guidance of our project mentor, Dr. Charles Wolfe. We have begun refactoring existing code to update frameworks and libraries, integrating the latest Apple capabilities such as Core ML and LiDAR depth sensing.

Guided by our professors and our own research, we adopted a 3-tier architecture for the application, with a user interface layer, a logic layer, and a database layer. Separating these layers gives us a scalable, adaptable, and maintainable solution that can interact seamlessly with pre-trained machine learning models. We pair an object detection model such as YOLOv3 with a semantic segmentation model to achieve the highest possible accuracy while maintaining performance: together, the two models give our decision block a deep understanding of the terrain in front of the user, detecting objects as well as negative space and classifying the detected obstacles.

Tackling a complex and ever-changing problem offers its fair share of challenges. Introductory lessons on development processes, system architecture, and related topics, combined with our own research into how these concepts apply to our project, guided us toward our current 3-tier approach and shaped how we plan to manage and organize application data and system logs.
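To make the fusion step concrete, here is a minimal sketch of how a detector's bounding box can be combined with a LiDAR depth map to estimate an obstacle's distance and angular offset. It is written in Python purely for illustration; the names, the pinhole-style angle approximation, and the single-pixel depth sample are simplifying assumptions, not the project's Core ML/ARKit pipeline:

```python
def obstacle_distance_and_angle(bbox, depth_map, image_width,
                                horizontal_fov_deg=60.0):
    """Estimate an obstacle's distance (meters) and horizontal angle
    (degrees, negative = left) from a detector bounding box
    (x, y, w, h in pixels) and a per-pixel depth map."""
    x, y, w, h = bbox
    # Sample depth at the box center; a robust version would take the
    # median over a small patch to reject noisy LiDAR returns.
    cx, cy = x + w // 2, y + h // 2
    distance_m = depth_map[cy][cx]
    # Horizontal angle: offset of the box center from the image midline,
    # scaled linearly by half the camera's assumed field of view.
    half_width = image_width / 2
    angle_deg = (cx - half_width) / half_width * (horizontal_fov_deg / 2)
    return distance_m, angle_deg
```

In the architecture described above, a function like this would sit in the logic layer: the segmentation model flags walkable surfaces and negative space, the detector supplies the boxes, and the decision block turns each fused estimate into a feedback cue.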
We have benefited from open communication with many professors and faculty, who have offered feedback and direction that would not have been possible without the style and structure of the capstone process thus far. Looking ahead, we plan to conduct iterative testing to evaluate the app's performance under diverse real-world conditions, measuring obstacle detection accuracy, processing speed, and user satisfaction with the feedback modalities, and verifying that built-in accessibility features work correctly and that the app is easily navigable with them. By working closely with end users, we will refine the app to ensure it meets their needs while adhering to federal accessibility standards. We anticipate delivering a transformative solution that bridges the gap between existing assistive technologies and emerging innovations, providing visually impaired individuals with a tool that enhances their independence and quality of life.
Aranda, D. M., Breach, C., & Fernandez, J. (2025, April). Educating Engineers to Enhance Accessibility through User-Centered Design. Paper presented at the 2025 ASEE PSW Conference, California Polytechnic University, California. 10.18260/1-2--55168
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2025 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015