This startup relies on ordinary cameras, most of which are in our smartphones, to provide users with two very different services. Aipoly is an artificial intelligence startup whose technology uses simple cameras and AI to map and understand how human beings interact with the real world.
Currently, their technology is used in one of two ways. The first allows vision-impaired customers to "see" and recognize the world around them; the second is for use in stores, where autonomous store platforms built on their Vision AI track items and customers to ensure the store is stocked properly and items are appropriately paid for.
Their app, Aipoly Vision, is available for iPhone. When pointed at objects like headphones or the sky, it can accurately describe the appearance of each object and identify what it is, serving those with visual impairments.
The app was made to be simple to use. The user just downloads the app, opens it, points the rear camera of their device at an object, and taps. Aipoly then identifies the object for them. The app runs entirely on the user’s iPhone and does not require an internet connection, unlike similar apps on the market. This is done through deep learning, a machine-learning technique inspired by the human brain. It reduces images to their most basic forms, like lines, curves, and circles, and then compares them to similar objects.
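As a rough illustration of that idea (not Aipoly's actual model, which is far more sophisticated), the sketch below reduces a grayscale image to a couple of edge-strength features, the "lines and curves", and labels it by comparing against stored examples. The filter set, the pooling step, and the `classify` helper are all simplified assumptions.

```python
import numpy as np

# Two hand-written edge filters stand in for learned features.
EDGE_FILTERS = [
    np.array([[-1, 0, 1]] * 3),      # responds to vertical edges
    np.array([[-1, 0, 1]] * 3).T,    # responds to horizontal edges
]

def edge_features(image):
    """Convolve a grayscale image with each filter, then pool the
    absolute responses into one average edge-strength per filter."""
    h, w = image.shape
    feats = []
    for f in EDGE_FILTERS:
        fh, fw = f.shape
        resp = np.zeros((h - fh + 1, w - fw + 1))
        for i in range(resp.shape[0]):
            for j in range(resp.shape[1]):
                resp[i, j] = np.sum(image[i:i + fh, j:j + fw] * f)
        feats.append(np.abs(resp).mean())
    return np.array(feats)

def classify(image, prototypes):
    """Return the label whose stored feature vector is closest."""
    query = edge_features(image)
    return min(prototypes, key=lambda label: np.linalg.norm(query - prototypes[label]))
```

In practice a deep network learns thousands of such filters stacked in layers, but the shape of the computation, extract simple features and compare them to known objects, is the same.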
This is the same kind of technology that Google and Facebook use when they recognize your face and tag you in your friend’s pictures. Since they’re still working out the kinks, the number of items the app can correctly identify and the accuracy of its guesses can be limited. However, users can submit their own recommendations or corrections to make the app’s recognition of items more helpful. For users, this technology can be life-changing. Imagine hearing, after years (or a lifetime) of being unable to see it yourself, that your daughter is wearing a green dress or that the sky is blue today.
So what about the other use of their vision API? It works in the same way as their app for the blind and visually impaired, and you may have already used something similar. A virtually unstaffed store may sound strange, but many of us have already shopped in one: a store that is partially staffed and filled with self-checkout terminals during certain hours, or one that is completely free of human employees.
For businesses, the prospect is compelling: a store with no queues, no checkouts, and shelves whose products are monitored so they are always stocked and guaranteed to be available. This can help business owners cut costs and reduce frustration for customers who want to grab their products, pay, and be on their way rather than lose time to long lines and human error.
Right now they offer autonomous retail store platforms and distribution centers that don’t need to be staffed. In both settings, items that are moved or purchased, say by a customer or a warehouse worker stocking shelves, can be tracked in real time. Their autonomous stores don’t require cards, cash, or a checkout: items are charged to the customer’s account when they exit the store, which can happen immediately after selecting the items they need.
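A minimal sketch of how checkout-free billing could work, assuming the vision system emits pick-up and put-back events as shoppers handle items. The `AutonomousStore` class, its method names, and the prices are hypothetical stand-ins, not Aipoly's actual platform API.

```python
from collections import Counter

class AutonomousStore:
    """Toy model of camera-driven, checkout-free billing: events from
    the (assumed) vision system update a per-shopper basket, and the
    basket is charged to the shopper's account on exit."""

    def __init__(self, prices):
        self.prices = prices   # item name -> unit price
        self.baskets = {}      # shopper -> Counter of items held

    def enter(self, shopper):
        self.baskets[shopper] = Counter()

    def pick_up(self, shopper, item):
        # The vision AI saw the shopper take an item off a shelf.
        self.baskets[shopper][item] += 1

    def put_back(self, shopper, item):
        # The vision AI saw the item returned to the shelf.
        if self.baskets[shopper][item] > 0:
            self.baskets[shopper][item] -= 1

    def exit(self, shopper):
        # No checkout: total the basket and charge it on the way out.
        basket = self.baskets.pop(shopper)
        return sum(self.prices[item] * n for item, n in basket.items())
```

For example, a shopper who picks up milk and bread but puts the bread back is charged only for the milk when they walk out.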
In stock rooms, an automated inventory is taken every 50 milliseconds, so a full inventory scan with the most up-to-date information on stocked products is available in real time. If products expire, the market owners and shoppers can be notified. This is just another way their apps work like eyes for those who can’t see objects, whether due to visual impairment or physical distance.
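That scan cycle could be sketched like this. The 50 ms interval comes from the article; the shelf data structure, expiry check, and helper names are illustrative assumptions about what the cameras would report.

```python
import time
from datetime import date

SCAN_INTERVAL = 0.05  # 50 milliseconds between full scans

def scan_shelves(shelves, today):
    """Take one full snapshot (item count per shelf) and flag
    anything past its expiry date."""
    snapshot, expired = {}, []
    for shelf, items in shelves.items():
        snapshot[shelf] = len(items)
        for name, expiry in items:
            if expiry < today:
                expired.append((shelf, name))
    return snapshot, expired

def run_scans(shelves, today, n_scans=3):
    """Poll the shelves repeatedly, SCAN_INTERVAL apart, so the
    latest snapshot is always at most 50 ms old."""
    results = []
    for _ in range(n_scans):
        results.append(scan_shelves(shelves, today))
        time.sleep(SCAN_INTERVAL)
    return results
```

Each scan produces both the stock counts a store owner needs and the list of expired items that would trigger a notification.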
The cameras not only track the movements and interactions of human beings, but can also track and identify items of any shape and size using deep learning. If no image data is available for an object, it can be procedurally generated in a simulated environment. Store owners can also place an item in a machine for a few minutes and the AI will learn to identify it.
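As a toy illustration of that procedural generation, the sketch below renders many varied training views from a single base image (flips, brightness shifts, sensor noise) as a stand-in for a full simulated environment; the transform choices and the `generate_views` helper are assumptions for illustration, not Aipoly's pipeline.

```python
import numpy as np

def generate_views(base_image, n_views, seed=0):
    """Produce n_views randomly varied copies of a base image
    (values in [0, 1]) to serve as synthetic training data."""
    rng = np.random.default_rng(seed)
    views = []
    for _ in range(n_views):
        view = base_image.copy()
        if rng.random() < 0.5:
            view = np.fliplr(view)                      # random mirror
        view = view * rng.uniform(0.7, 1.3)             # random brightness
        view = view + rng.normal(0, 0.05, view.shape)   # sensor noise
        views.append(np.clip(view, 0.0, 1.0))
    return views
```

A real simulator would also vary pose, lighting, and background, but the principle is the same: one object, many synthetic viewpoints, enough data to train a recognizer.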
I personally like Aipoly, not only for the great prospects they offer the visually impaired to experience life like those who are sighted, but also for their ingenious demonstration of how fluidly artificial intelligence fits into everyday life.