Our old smartphones can be the ultimate IoT devices for our smart cities (Part 3).
Oh boy! This part is very interesting, as we are going to talk about the most versatile sensor in a smartphone: the camera. The smartphone camera can do many things if its versatility is used in the context of smart city sensors (see Part 1 and Part 2). The obvious applications are photo and video capture, but there is a problem: if you capture video on thousands of cameras installed across a city, there will be a huge amount of data to process, and transmitting it will consume a lot of bandwidth. However, smartphone processors have advanced tremendously. Their capability can be leveraged to process captured video on the device itself and alert a central server only when a noteworthy incident is detected, thereby reducing the amount of data that needs to be transmitted. What counts as a noteworthy incident? It can be almost anything: an accident happening in real time, a dangerous object like a weapon, a fugitive, a person having a heart attack, a piece of infrastructure failing. The definitions of a noteworthy incident are many and can be updated dynamically on the smartphones through software. Many technologies being developed around the world could be used to detect such incidents.
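To make the bandwidth argument concrete, here is a minimal sketch of the on-device idea: the phone inspects every frame locally and transmits only a small JSON alert when something noteworthy appears, instead of streaming raw video. The detector is a stub and the incident labels are hypothetical placeholders for whatever model a real deployment would run.

```python
import json

FRAME_BYTES = 1920 * 1080 * 3            # rough size of one raw 1080p RGB frame
INCIDENT_TYPES = {"weapon", "accident"}  # hypothetical labels, updatable over the air

def detect_incident(frame):
    """Stub for an on-device vision model. A real deployment would run
    inference here; we just read a pre-assigned label for illustration."""
    label = frame.get("label")
    return label if label in INCIDENT_TYPES else None

def process_stream(frames):
    """Process frames locally; only small JSON alerts leave the device."""
    alerts, bytes_sent = [], 0
    for i, frame in enumerate(frames):
        label = detect_incident(frame)
        if label:
            alert = json.dumps({"frame": i, "incident": label})
            alerts.append(alert)
            bytes_sent += len(alert)       # only the alert goes upstream
    bytes_raw = len(frames) * FRAME_BYTES  # what streaming raw video would cost
    return alerts, bytes_sent, bytes_raw

# Simulated one-second clip at 5 fps with one noteworthy frame
frames = [{"label": None}] * 4 + [{"label": "weapon"}]
alerts, sent, raw = process_stream(frames)
print(len(alerts), sent, raw)
```

Even in this toy run, the device sends a few dozen bytes instead of tens of megabytes of raw frames, which is the whole point of pushing the processing to the edge.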
Recognition of harmful objects like weapons can be one of the prime noteworthy incidents a smart city sensor detects. Several companies are developing camera software that can detect weapons. For example, Nanowatt Design, a company working in computer vision, has developed a computer vision system called Gun Detect, which uses a camera and software to detect a weapon in its field of view. This technology could be implemented on a smart city sensor to identify weapons on the streets.
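One practical wrinkle with street-level weapon detection is false alarms: a single noisy frame should not page the police. A common mitigation is to require the detector to fire on several consecutive frames before raising an alert. Gun Detect's actual API is not public here, so the per-frame confidence scores below are purely hypothetical; this sketch only illustrates the debouncing idea.

```python
from collections import deque

def confirmed_detections(scores, threshold=0.8, window=3):
    """Flag a weapon only after `window` consecutive frames score at or
    above `threshold`, suppressing one-frame false positives.
    `scores` are hypothetical per-frame detector confidences."""
    recent = deque(maxlen=window)  # rolling record of recent pass/fail results
    flagged = []
    for i, score in enumerate(scores):
        recent.append(score >= threshold)
        if len(recent) == window and all(recent):
            flagged.append(i)      # frame index at which the alert is confirmed
    return flagged

print(confirmed_detections([0.9, 0.2, 0.85, 0.9, 0.95]))  # → [4]
```

The isolated high score at frame 0 is ignored; only the sustained run ending at frame 4 triggers an alert.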
MIT is developing a couple of technologies that can be used in smart city sensors.
The first is a pulse camera. This is a rather mature technology now, first demonstrated in 2010. The pulse camera captures video of a person's face, measures the slight variations in brightness produced by blood flowing through the vessels in the face, and deduces the person's pulse rate. This technology has been integrated into an iPhone app called Cardiio. With smart city sensors, it could be used to detect medical emergencies on the streets.
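The core signal-processing idea is simple enough to sketch: average the brightness of the face region in each frame, then find the dominant frequency of that signal within the range of plausible heart rates. This is a toy illustration of the concept, not MIT's or Cardiio's actual algorithm, and the brightness trace here is synthetic.

```python
import numpy as np

def estimate_pulse_bpm(brightness, fps):
    """Estimate pulse rate from per-frame mean face brightness (in practice
    the green channel, which is most sensitive to blood volume changes)."""
    signal = brightness - np.mean(brightness)      # remove the DC offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal))
    band = (freqs >= 0.7) & (freqs <= 4.0)         # plausible pulses: 42-240 bpm
    peak = freqs[band][np.argmax(power[band])]     # strongest periodic component
    return peak * 60.0                             # Hz -> beats per minute

# Synthetic 10 s clip at 30 fps with a faint 1.2 Hz (72 bpm) brightness ripple
fps = 30
t = np.arange(0, 10, 1 / fps)
brightness = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t)
print(round(estimate_pulse_bpm(brightness, fps)))  # → 72
```

Restricting the search to the 0.7–4 Hz band keeps slow lighting drift and high-frequency sensor noise from being mistaken for a heartbeat.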
The second is the Visual Microphone, which processes video to extract the tiny vibrations of an object. These vibrations are then fed into audio recognition software to identify sounds in the vicinity of the object. The Visual Microphone can also be used to deduce a material's physical properties by studying its vibrations. There can be many other applications, such as identifying materials or monitoring infrastructure like bridges and roads for defects. Take a look at the TED talk by Abe Davis, one of the pioneers of this technology, below.
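To give a feel for how vibrations can be read out of video, here is a deliberately simple sketch: treat frame-to-frame changes in mean brightness as a proxy for the object's sub-pixel motion, then estimate the dominant vibration frequency from zero crossings. MIT's actual method is far more sophisticated (it tracks tiny motions across many spatial scales), so treat this purely as a toy model of the concept, on synthetic data.

```python
import numpy as np

def recover_vibration(frame_means, fps):
    """Recover a 1-D vibration signal from a high-speed video of an object,
    using frame-to-frame brightness changes as a stand-in for the tiny
    motions the Visual Microphone measures."""
    signal = np.diff(frame_means)                  # change between frames
    signal = signal - signal.mean()
    # Estimate the dominant frequency from zero crossings of the signal
    crossings = np.sum(np.abs(np.diff(np.sign(signal))) > 0)
    duration = len(signal) / fps
    return signal, crossings / (2 * duration)      # cycles per second (Hz)

# Object vibrating at 5 Hz, filmed at 240 fps for 2 seconds
fps = 240
t = np.arange(0, 2, 1 / fps)
frame_means = 128 + 0.01 * np.sin(2 * np.pi * 5 * t)  # sub-pixel brightness ripple
_, freq = recover_vibration(frame_means, fps)
print(round(freq))  # → 5
```

The same recovered signal could then be handed to audio or vibration-analysis software, which is how the technique extends from eavesdropping on sound to probing material properties and monitoring infrastructure.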
Apart from the few technologies discussed above, there are probably many others that can leverage the combination of a camera and a powerful processor in a smartphone (smart city sensor) to identify noteworthy incidents and alert the relevant authorities in real time, so that suitable actions can be taken, lives can be saved, and infrastructure failures prevented.