The topic for my app centers on people who have disabilities. For those who don’t know my background, I have grown up volunteering with my family in a program that pairs non-disabled people with people who have disabilities. The experiences I have gained through the T.O.D.A.Y. Program, along with my trip to Tennessee where I did some of the field research, have shaped the app I am developing. The question on everyone’s mind is: what type of app is it? It’s an app that will hopefully enhance the lives of those who are visually impaired.
The app needs to advance the lives not only of those who are blind but also of those who simply have a hard time seeing. Many people are transitioning to smartphones, and many companies are getting out of the “simple” phone business altogether. Right now the losers in this shift are the visually impaired, because there is no easy way for them to feel their way around a flat touchscreen. Various solutions have been proposed for this. One is Tactus, a layer that goes onto the display to create a raised surface, which helps with typing but not with seeing.
The idea would pair the app with a built-in liquid display screen that can be turned on in the phone’s accessibility options. Once the technology is enabled, any time the screen is unlocked the buttons automatically conform to wherever the user needs to interact with the phone. This way they can turn the phone on or off, text someone, or make a call. Obviously a little training would be involved, but I am sure users will go head over heels for it.
That’s just one part of the technology; we would also build an app that better illustrates what we want to do. It will use the camera not only to describe what it sees, such as colors and people, but also, through headphones, we can train the app to remember people by their faces, so that those who are blind no longer need to recall people by voice alone. With this “companion app” the user gets a vivid picture of what the world looks like and a description of where they are. Say the user is on a beach: before, they could only tell by using their senses of smell and touch. Now we can paint a vivid picture with words and describe what they are “seeing.” The goal of the “companion app” is to open up a whole new world for the user.
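To make the “remember people by face” idea concrete, here is a toy sketch of how the companion app might pair a face with a name and announce it later. Everything here is hypothetical: `face_signature` stands in for a real face-recognition model, and the class and method names are mine, not part of any existing design.

```python
def face_signature(image_name):
    # Placeholder: a real build would compute an embedding from
    # camera pixels; here we just use the image name itself.
    return image_name


class FaceMemory:
    """Toy store mapping face signatures to people's names."""

    def __init__(self):
        self.known = {}  # signature -> person's name

    def enroll(self, image_name, person):
        # The user says a name aloud once; the app pairs it with the face.
        self.known[face_signature(image_name)] = person

    def announce(self, image_name):
        # Spoken through headphones when the camera sees a face.
        person = self.known.get(face_signature(image_name))
        if person:
            return f"{person} is in front of you."
        return "I don't recognize this person yet."
```

In use, the app would enroll a friend once (`memory.enroll("face_01", "Alex")`) and from then on announce “Alex is in front of you.” whenever that face appears, sparing the user from relying on voice alone.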
On my trip to Memphis and Nashville I came across a variety of social settings, from an 18-hour car ride through cities and mountains to a two-hour jam session in a chain restaurant. The experience of exploring new cities and sights was amazing, and a person with disabilities, especially someone visually impaired, can only sense the difference through the air and sounds but can’t see it. That’s where I see the app helping, using the technology we develop as well as our “companion app.” Painting the vivid picture that the rest of us take for granted is something we can do for those who are visually impaired.
I was at a symposium on Friday where a guy who had been hearing impaired all his life was able to hear for the first time at the age of 23! All he did was change his hearing aids, and with new advancements in technology he could now hear more than just the low tones. Technology as a whole is so impactful on lives; it changes the way we live and interact with everyone. That is why our “liquid display technology” and the vision “companion app” are so crucial to improving lives.
There are a few articles I have used in my research to better understand some of the other technology out there that can help advance our own plans. The articles I have used are listed below.
Frey, Brian. BrailleTouch: Mobile Texting for the Visually Impaired. http://www.cc.gatech.edu/~mromero/frey_southern_romero_2011.pdf
Kane, Shaun. Slide Rule: Making Mobile Touch Screens Accessible to Blind People Using Multi-Touch Interaction Techniques. http://www.cs.rochester.edu/u/jbigham/pubs/pdfs/slide-rule.pdf
Apple Applications for the Visually Impaired. http://ntac.blind.msstate.edu/consumers/files/Comprehensive_Apple_Apps.pdd
I picture an app that lets the user touch a corner of the screen to activate the camera portion of the app. Once the camera is activated, the user can choose among various modes. The next few lines document some of the features we currently plan to offer under those modes.
Live View Mode:
- Audibly tells the user what is going on in front of them.
- Lets the user input a destination and gives them the best route.
- For walking, it uses the phone’s GPS to navigate the streets and give turn-by-turn directions.
- For bus transportation, it directs the user to the right bus stop. Once they are at the bus stop, it gives them the schedule and how long until the next bus arrives.
- When the user takes a photo, the app describes the photo in detail.
- Say it’s a photo of a sunset; the app would paint the picture in vivid detail using words.
- If the photo is taken within the app, a unique feature captures the scent of the scene, so that when the user later touches different parts of the photo, the app releases different scents, connecting the picture with their memory.
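The modes above can be sketched as a simple dispatcher: the corner touch picks a mode, and each mode produces a sentence for the phone to speak aloud. This is only a sketch under my own assumptions; the mode names, the `SceneDescription` type, and the idea of receiving vision labels as plain strings are all hypothetical stand-ins for real camera, GPS, and text-to-speech calls.

```python
from dataclasses import dataclass


@dataclass
class SceneDescription:
    summary: str   # short spoken summary, e.g. "I can see a sunset."
    details: list  # extra phrases spoken on request


def describe_scene(labels):
    """Turn raw vision labels (from a hypothetical camera model)
    into one sentence the app can speak aloud."""
    if not labels:
        return SceneDescription("Nothing recognizable in view.", [])
    if len(labels) == 1:
        summary = "I can see " + labels[0]
    else:
        summary = "I can see " + ", ".join(labels[:-1]) + " and " + labels[-1]
    return SceneDescription(summary + ".", labels)


def handle_mode(mode, labels=None, destination=None):
    """Dispatch on the modes listed above: live view, navigation, photo."""
    if mode == "live":
        return describe_scene(labels or [])
    if mode == "navigate":
        # A real build would hand `destination` to the platform's GPS API.
        return SceneDescription(
            f"Starting walking directions to {destination}.", [])
    if mode == "photo":
        desc = describe_scene(labels or [])
        return SceneDescription("Photo saved. " + desc.summary, desc.details)
    raise ValueError(f"unknown mode: {mode}")
```

For example, `handle_mode("live", labels=["a beach", "two people", "a red umbrella"])` would produce the summary “I can see a beach, two people and a red umbrella.” for the phone to read out.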
All of this is going to take time to develop fully before we launch the product. It will take at least a few months just to nail down the technology going into this project, so saying we will launch in a few months wouldn’t be accurate; I see launch as at least a year out. Leading up to launch, here is what we expect to happen.
- Apply for the patent
- Start looking for partners to develop the technology for the phones
- Hire developers to start coding the app
- Develop technology
- Develop the App
- Alpha testing on both products and first testing of the new technology paired with the app
- Beta Testing
- Partner Contracts signed
- Marketing the app and new technology
- Training sales people on how it works
- Product and App Launch