Second Sight is an app that connects users to their smartphones in a whole new way. As we advance our technology toward cooler, sleeker devices, we constantly put some users at a disadvantage. For people who are visually impaired, those disadvantages have grown now that phones no longer have physical buttons to guide their typing. On top of this, if a visually impaired user buys a smartphone, they must rely on a screen reader such as VoiceOver to use the device (Goggin 92). My goal with this app is to level the playing field for not just one portion of the market, but the entire market. Technology is ever expanding and being used in ways we never anticipated; for instance, it is gaining prominence in education (Wong 646) and could be segregating the visually impaired from getting the most out of their learning experience.
So what is the app? I call it Second Sight: an app that builds on the phone's camera to give the user more freedom in using the device. In a moment I will demonstrate the app's key features in a visual component so that you can get a better sense of what it will look like. Before that, I want to give you insight into my research and development process.
Imagine that you have entered a new city, have no idea where you are, and you're blind. Navigating from destination to destination is not going to be easy; in fact, it is going to be a downright struggle. This is something we rarely consider as we advance our technology, yet every day it is the reality for the portion of our society living with low vision or total blindness. Meanwhile, the rest of us have our heads buried in our devices when we walk, when we sit down, even when we are in the company of friends. We are attached to these devices; in some cases they are running our lives. Simple things like adding an appointment, texting a friend, getting directions, or taking a picture and sharing it with others can be a struggle. This is where Second Sight comes into play: an application that enhances the lives of those with visual impairments.
The idea for the app came to me while I was in Memphis, Tennessee covering The American's basketball tournament. While there, we had the chance to walk around an entirely new city, not knowing where we were going, using only the maps on our phones. I traveled into a few packed venues to listen to live music; we were in the home of Elvis, after all. What I realized during that time was how much of the beauty that surrounds us we take in with our eyes, and that there are people who cannot see it at all. I made it my goal that week to create an app that would give those users a better understanding of everything around them.
I never realized just how much goes on when you close your eyes. I sat in the FedExForum during the Temple versus University of Central Florida game with my eyes closed for around five minutes. In that time I realized how difficult it was to enjoy the game of basketball, because I could not grasp who was making which baskets, or even how I was supposed to get to my seat. It took me back to a high school workshop I attended called Welcome To My World, whose goal is to help those without disabilities understand how people with disabilities live and function on a daily basis.
The question is how we connect a world that is fast moving, always evolving with new technologies and new ways of doing things, with those who cannot use new technologies like touchscreen devices in the same way. There are expensive solutions out there, such as screen covers and various applications, but nothing as ambitious as Second Sight exists, and that is why we exist. Our mission is to change one user's experience, one phone at a time. We want to open each user's environment and surroundings to them in a new way, allowing them to visualize and see through devices they can readily get.
The app builds upon several existing applications, such as Google Field Trip and Google Glass. We also take a few influences from an app developed at the University of Palermo in Italy: a virtual cane for the visually impaired that can be used indoors where there is no GPS (Gallo 2013). On top of this, that app uses well-placed QR codes to give the user more information when a code is scanned.
What does this mean for us at Second Sight? There are 285 million people worldwide who are visually impaired; of those, 39 million are blind and 246 million have low vision (Jafri 2013). We have the chance to enhance the lives of a large population. What does the app do? The design is simple: we feed real-time visual data from the camera into the app, and through headphones attached to the phone we describe in detail what the user is "seeing." Say, for instance, you have never been able to see a sunset. Point the phone's camera at the sky, and the app will vividly describe what it looks like, painting the picture for your imagination to create.
Photos are a huge part of everyone's life; any time you go out with a group of friends, you are always taking photos with each other. Those who suffer from visual impairment may not get the same enjoyment out of those photos as the rest of us do, so we have to create an accessible photo library that lives within the app. A survey conducted by Susumu Harada points to three goals: users want to quickly capture a photograph along with a memo and ambient recordings; they want to browse the collection and get a grasp of the content of each photo; and finally, they want to take the lead in sharing their photos with a sighted person (Harada 2013).
Knowing this, we integrate voice commands into the app. For instance, if the user is holding their phone and wants to take a photo, they can initiate the camera by saying "Photo on." The app then describes what the camera is seeing, and once the user is ready they trigger the capture by saying "Take Photo." The photo is captured with ambient sound, and the user then has 10 seconds to audibly add a memo to it. To view the album, the command is "View my photos." This brings up an overview of all their photos, with options to filter by the dates and places where the photos were taken. This way the user can share with friends what pictures they took.
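Since the app is still a concept, here is a minimal sketch of how the spoken commands above could be dispatched. The command phrases come from this essay; the handler names and return strings are my own assumptions, not a shipped implementation.

```python
# Hypothetical sketch of Second Sight's voice-command dispatch.
# The three phrases are from the app description; handlers are stand-ins.

def describe_scene():
    # "Photo on": start narrating what the camera sees.
    return "describing camera view"

def capture_photo():
    # "Take Photo": capture with ambient sound, then open a 10-second memo window.
    return "photo captured; recording 10s memo"

def browse_album():
    # "View my photos": read back an overview filtered by date and place.
    return "reading photo overview by date and place"

COMMANDS = {
    "photo on": describe_scene,
    "take photo": capture_photo,
    "view my photos": browse_album,
}

def handle_command(spoken_text):
    """Normalize the recognized speech and run the matching handler."""
    handler = COMMANDS.get(spoken_text.strip().lower())
    if handler is None:
        return "command not recognized"
    return handler()
```

Normalizing the recognized speech to lowercase keeps the match forgiving, since speech recognizers vary in capitalization.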
Second Sight isn't only about photos; we use video and GPS technology to help the user get around town. The user can prompt the app by saying "Directions to (address)." After the app looks up directions, it presents three options: mass transit, walking directions, or call a taxi. If the user elects walking directions, the camera turns into a cane assistant: the user holds the phone in front of them and receives audible turn-by-turn directions to their destination. For the other two options, the camera can identify coins and dollar bills so the user can hand over the right denomination and check that they get the right change back.
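The navigation prompt above can likewise be sketched out. The "Directions to (address)" phrasing and the three travel options come from this essay; the parsing approach and function names are assumptions for illustration.

```python
import re

# Hypothetical sketch of Second Sight's navigation prompt.
# Option labels are from the app description; the logic is a stand-in.

TRAVEL_OPTIONS = ["mass transit", "walking directions", "call a taxi"]

def parse_direction_request(spoken_text):
    """Extract the destination from a 'Directions to (address)' command."""
    match = re.match(r"directions to (.+)", spoken_text.strip(), re.IGNORECASE)
    return match.group(1) if match else None

def choose_mode(spoken_choice):
    """Match the user's spoken reply against the three offered options."""
    choice = spoken_choice.strip().lower()
    return choice if choice in TRAVEL_OPTIONS else None
```

Returning `None` for unrecognized input lets the app re-prompt audibly rather than guessing a destination or travel mode.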
The app is built fully around mobility; our goal is to make it as mobile as we can. Second Sight runs on the user's existing mobile device. When a user downloads and launches the application for the first time, they are presented with a few audio notifications. These notifications ask the user to grant the application access to a few device features that benefit their experience: the camera, the camera roll, and location services. These services give the user the full experience and are necessary for the application to function as intended.
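The first-launch flow above could be sketched as follows. The three services are from this essay; the prompt wording and function name are assumptions.

```python
# Hypothetical sketch of Second Sight's first-launch permission prompts.
# The three required services are from the app description.

REQUIRED_SERVICES = ["camera", "camera roll", "location services"]

def first_launch_prompts(granted):
    """Return the audio prompts still needed, given already-granted services."""
    return [f"Second Sight needs access to your {service}"
            for service in REQUIRED_SERVICES if service not in granted]
```

Because every prompt is audio rather than on-screen, the app would read each remaining request aloud until all three services are granted.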
The mobile application Second Sight is based on the principle of being accessible to a wide range of users, with the visually impaired as our principal clients; being the most accessible to those users is our priority. Because of this, once the application is opened, the display actually goes into a sleep mode: the app is not primarily designed around the screen, but around audio cues and the camera. Users who have good vision might find this a little annoying, since they are used to using the screen for everything. To make the application more accessible to them as well, we can create a screen interface that works alongside the voice-driven audio cues.
The issue is that, with 90% of the world's visually impaired living in developing countries (Jafri 2013), we have to come up with a program to improve quality of life there. One idea is to recycle users' old devices and donate them to people in these countries. However, simply giving them phones does not solve the issue; we also have to build a network for these devices to work on. That is another challenge we face, as this application depends closely on technology like GPS, which would need to be present wherever the user wants to use the app.
Second Sight is an ambitious application that is out to change the world one user at a time. No matter what type of vision people have, we as a society have a responsibility not just to evolve our technology but to include all types of users as mobile media technologies progress. That philosophy is at the core of this application: giving every user, whatever the condition of their vision, a chance to interact with current technology. From taking a photo to walking to their next destination, we want to make life simpler for those who suffer from visual impairments and let them take advantage of the devices we have all come to love.
Donner, J. (2008). Shrinking Fourth World.
Gallo, P. (2013, December 13). Arianna: Path Recognition for Indoor Assisted Navigation with Augmented Perception. Retrieved from http://arxiv.org on April 21, 2014.
Goggin, G. (2006). Cell Phone Culture: Mobile Technology in Everyday Life.
Harada, S., et al. (2013). Accessible Photo Album: Enhancing the Photo Sharing Experience for People with Visual Impairment. CHI 2013: Changing Perspectives, Paris, France, pp. 2127–2136.
Jafri, R., et al. (2013). Computer Vision-Based Object Recognition for the Visually Impaired Using Visual Tags. The 2013 International Conference on Image Processing, Computer Vision and Pattern Recognition, pp. 400–406.
Wong, M. (2012, October–November). Teaching the Benefits of Smart Phone Technology to Blind Consumers: Exploring the Potential of the iPhone. Journal of Visual Impairment & Blindness, pp. 646–650.