The Aipoly Vision app received the Best of Innovation Award at CES 2017. Aipoly has earned this recognition because the app not only helps blind and visually impaired people to ‘see,’ but also speaks seven different languages.
Aipoly is a smartphone AI application currently available free of cost on iOS devices, including the iPhone, iPod touch, and iPad (an Android version is expected within a few months). The app uses color recognition and image recognition technology to help visually impaired and color-blind people understand what objects are around them.
Users can either identify colors with the color recognition function or identify objects with the object recognition function. When the user points the phone's camera at an object in front of them, the app speaks the name of the object or its color.
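The color recognition mode can be pictured as mapping a sampled camera pixel to the nearest named color. The following is a minimal sketch of that idea only; the palette, function name, and distance metric are illustrative assumptions, not Aipoly's actual code.

```python
# Illustrative sketch: name a sampled RGB value by finding the closest
# entry in a small palette (squared Euclidean distance in RGB space).
# The palette below is an assumption for demonstration purposes.

PALETTE = {
    "black": (0, 0, 0),
    "white": (255, 255, 255),
    "red": (255, 0, 0),
    "green": (0, 128, 0),
    "blue": (0, 0, 255),
    "yellow": (255, 255, 0),
}

def nearest_color(rgb):
    """Return the palette name closest to the sampled pixel."""
    return min(
        PALETTE,
        key=lambda name: sum((c1 - c2) ** 2 for c1, c2 in zip(rgb, PALETTE[name])),
    )

print(nearest_color((250, 10, 5)))   # a slightly off-red sample -> "red"
print(nearest_color((10, 12, 8)))    # a very dark sample -> "black"
```

A production app would sample many pixels and use a perceptual color space rather than raw RGB, but the nearest-match principle is the same.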
How does the app work?
The app uses convolutional neural networks, which can describe the input given to the device in words. The user points the phone's camera while running the application, and the recorded image is sent to a cloud platform for processing. When the vision system has identified the object the phone is pointing at, the app returns the answer in speech and text form. The system can detect multiple objects in a scene and can even specify the relationships between them.
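The pipeline just described (capture a frame, send it to a cloud service, announce the result) can be sketched as follows. The endpoint URL, response shape, and helper names are hypothetical assumptions for illustration; Aipoly's real API is not public.

```python
# Hedged sketch of a capture -> cloud -> speech/text loop.
# CLOUD_ENDPOINT and the JSON reply format are assumptions, not Aipoly's API.
import json
import urllib.request

CLOUD_ENDPOINT = "https://example.com/recognize"  # placeholder URL

def recognize_frame(jpeg_bytes: bytes) -> dict:
    """POST one camera frame to the cloud vision service and parse the reply."""
    req = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=jpeg_bytes,
        headers={"Content-Type": "image/jpeg"},
    )
    with urllib.request.urlopen(req) as resp:
        # Assumed reply shape, e.g. {"objects": ["bottle", "table"]}
        return json.load(resp)

def announce(result: dict) -> str:
    """Turn the service reply into the text that would also be spoken aloud."""
    objects = result.get("objects", [])
    return ", ".join(objects) if objects else "nothing recognized"

print(announce({"objects": ["bottle", "table"]}))  # -> "bottle, table"
```

On the phone this loop would run continuously against the camera feed, with the text handed to the platform's text-to-speech engine.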
Aipoly can currently detect hundreds of objects without any prior training by the user, and is programmed to speak the name of a recognized object within a few seconds. The user does not need to take a picture; the app continuously analyzes the scene and reports the result back to the user.
However, since the app works only with an internet connection, its performance depends heavily on the availability of network coverage. While the entire process can take as little as five seconds on a fast Wi-Fi network, it may take as long as 20 seconds when the connection is poor.
The developers said that the app was initially trained on as many as 300,000 images, and they intend to improve it further so that it can recognize street signals and other items that blind people need identified, such as walking canes and guide dogs.
It may seem that asking a fellow human being for help would feel nicer than asking an app; however, Cheng, one of the developers, argued that using the app makes its users less dependent on others.
The app cannot always accurately distinguish between genders. Although it can recognize a few distinct facial expressions, such as a smile or anger, it is not very sensitive. Recognizing human emotions is one area where the developers wish to focus their improvement efforts.
Rizzoli said that the system thinks hard before calling someone ‘man’ when it is not confident in the recognition, since it would be impolite to call a woman ‘man.’
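The caution Rizzoli describes amounts to confidence gating: commit to a sensitive label only when the classifier's score clears a high threshold, and otherwise fall back to a neutral word. The threshold value, labels, and fallback below are illustrative assumptions, not Aipoly's published behavior.

```python
# Minimal sketch of confidence-gated labeling.
# GENDER_THRESHOLD is an assumed cutoff; Aipoly's real value is not public.

GENDER_THRESHOLD = 0.9

def describe_person(scores: dict) -> str:
    """scores maps candidate labels to classifier confidences in [0, 1]."""
    label, confidence = max(scores.items(), key=lambda kv: kv[1])
    if confidence >= GENDER_THRESHOLD:
        return label
    return "person"  # neutral fallback when the model is unsure

print(describe_person({"man": 0.95, "woman": 0.05}))  # confident -> "man"
print(describe_person({"man": 0.60, "woman": 0.40}))  # unsure -> "person"
```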
The app also has trouble with blurry or low-quality images. In one close-up of an empty juice bottle, it labeled the item ‘Naked glass jar,’ which leaves room for confusion in the mind of the user.
Latency is another important issue Aipoly currently faces. Because it relies on a cloud platform, processing time greatly constrains the app's utility.
The developers are working to give visually impaired people an easier way to understand their surroundings. Any user can help Aipoly learn about new objects by describing them. Future updates will also enable the app to understand complex scenes with multiple objects: it will detect the objects and recognize the relationships they share, like ‘a man riding a bicycle’ or ‘a dog sitting near a tree.’