ISS uses AI to monitor Earth, flapping-wing robot can autonomously land like a bird, Bumble Bee air taxis
ISS’ new AI-powered program will enable real-time monitoring of Earth's climate
Satellites are being used to monitor natural events like wildfires and keep them from becoming uncontrollable. Metaspectral, a Vancouver-based software company, has created a platform built on real-time #ai analysis of #hyperspectralimaging satellite data. The platform can anticipate potential forest fires and designate high-risk areas, allowing firefighters and civil authorities to prevent outbreaks and focus their resources on the most affected areas.
The company has already been awarded a patent for using AI to compress hyperspectral imagery. In the recycling industry, it has made advances in better identifying and sorting hard-to-recycle materials with #machinelearning models. The technology will now be deployed on the International Space Station (ISS) and is also being adopted by the Canadian Space Agency (CSA) to analyze hyperspectral image data from the ISS and from Earth observation satellites in low Earth orbit.
Metaspectral co-founder and CEO Francis Doumet said:
"In the commercial space industry, Metaspectral is enabling space companies to maximize the operational capacity of space assets by allowing them to capture and transmit more data within the same bandwidth while also saving on operational costs by minimizing the data that needs to be transmitted to the ground."
For decades, NASA and the Canadian Space Agency (CSA) have relied on Earth Observation Satellites (EOS) to monitor wildfires using infrared imaging cameras. Now, entire regions can be analyzed by #remotesensing technology with hyperspectral sensors to quantify and characterize the level of "fuel" on the ground, including flammable materials such as dry or dead trees and plants. This allows authorities to identify the areas with the highest probability of a fire breaking out with greater accuracy than could be achieved using other methods like historical data.
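To make the idea of "fuel" mapping concrete, here is a minimal sketch of how dry vegetation can be flagged from standard spectral indices (NDVI for vegetation presence, NDMI for moisture). The band arrays, wavel­ength choices, and thresholds are illustrative assumptions, not Metaspectral's actual pipeline.

```python
# Hypothetical sketch: flag dry-vegetation "fuel" pixels from two standard
# spectral indices. Thresholds and input arrays are illustrative assumptions.
import numpy as np

def fuel_risk_mask(red, nir, swir, ndvi_min=0.2, ndmi_max=0.1):
    """Return a boolean mask of pixels that look like dry vegetation.

    red, nir, swir: 2-D reflectance arrays (0..1) for the red, near-infrared,
    and shortwave-infrared bands of the same scene.
    """
    eps = 1e-6  # avoid division by zero on dark pixels
    ndvi = (nir - red) / (nir + red + eps)    # vegetation presence
    ndmi = (nir - swir) / (nir + swir + eps)  # vegetation moisture
    # Vegetated (NDVI above threshold) but dry (NDMI below threshold)
    return (ndvi > ndvi_min) & (ndmi < ndmi_max)

# Toy 2x2 scene: only the top-left pixel is vegetated *and* dry
red  = np.array([[0.10, 0.30], [0.05, 0.20]])
nir  = np.array([[0.40, 0.35], [0.50, 0.25]])
swir = np.array([[0.38, 0.20], [0.15, 0.30]])
print(fuel_risk_mask(red, nir, swir))
```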
Advances in infrared astronomy, #machinelearning, sensor design, hardware, and software applications, as well as #datascience, are transforming the field of satellite imaging. Earth Observation Satellites (EOS) can serve as an early warning system, alerting ground forces as soon as a small fire erupts. Metaspectral Fusion analyzes hyperspectral data and streams the results in real time, while Machine Learning Operations agents continuously improve the AI models. This allows high-risk areas to be designated based on their spectral signatures, quantifying an area's healthy trees and grass. Earth observation with hyperspectral imagery can also be used to monitor environmental hazards like oil spills and even to determine plastic levels on the surface of the ocean.
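The idea of designating areas by their spectral signatures can be illustrated with a simple per-pixel matcher: each pixel's spectrum is compared against a set of reference signatures and assigned to the closest class. The class set, shapes, and cosine-similarity rule below are assumptions for illustration; Metaspectral's deep-learning models are not shown.

```python
# Minimal sketch of pixel-level classification of a hyperspectral cube by
# matching each pixel's spectrum to reference spectral signatures.
# Signatures, class names, and shapes are illustrative assumptions.
import numpy as np

def classify_pixels(cube, signatures):
    """cube: (H, W, B) reflectance cube; signatures: (C, B) reference spectra.
    Returns an (H, W) array of class indices chosen by cosine similarity."""
    h, w, b = cube.shape
    pixels = cube.reshape(-1, b)
    # Normalize so brightness differences do not dominate the match
    px = pixels / (np.linalg.norm(pixels, axis=1, keepdims=True) + 1e-9)
    sg = signatures / (np.linalg.norm(signatures, axis=1, keepdims=True) + 1e-9)
    scores = px @ sg.T            # (H*W, C) cosine similarities
    return scores.argmax(axis=1).reshape(h, w)

# Toy example: 3 classes (e.g. "healthy trees", "dry fuel", "water")
# over a 4x4 scene with 5 spectral bands
rng = np.random.default_rng(0)
signatures = rng.random((3, 5))
cube = rng.random((4, 4, 5))
print(classify_pixels(cube, signatures))
```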
Hyperspectral images and their analysis produce data at rates of gigabits per second. Metaspectral's proprietary compression and deep learning algorithms allow the platform to process and analyze this data in real time while performing pixel-by-pixel analysis. By providing access to real-time data, the platform allows businesses, governments, and other users to benefit from insights within seconds of the data being captured, instead of the days it has historically taken.
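As a rough sense of why compression matters at these data rates, the sketch below shrinks a hyperspectral cube by projecting its bands onto a few principal components and keeping only those per-pixel scores. This is a generic PCA illustration under assumed shapes and component counts, not Metaspectral's patented compression scheme.

```python
# Generic PCA-style band reduction of a hyperspectral cube (illustrative only).
import numpy as np

def compress_cube(cube, n_components=8):
    """cube: (H, W, B) reflectance cube. Returns (scores, mean, components),
    where scores has shape (H, W, n_components)."""
    h, w, b = cube.shape
    x = cube.reshape(-1, b)
    mean = x.mean(axis=0)
    xc = x - mean
    # Principal directions of the band covariance via SVD
    _, _, vt = np.linalg.svd(xc, full_matrices=False)
    components = vt[:n_components]                     # (n_components, B)
    scores = (xc @ components.T).reshape(h, w, n_components)
    return scores, mean, components

def decompress_cube(scores, mean, components):
    h, w, k = scores.shape
    x = scores.reshape(-1, k) @ components + mean
    return x.reshape(h, w, components.shape[1])

# A 64x64 cube with 100 bands stored as 8 scores per pixel: 12.5x fewer values
rng = np.random.default_rng(1)
cube = rng.random((64, 64, 100))
scores, mean, comps = compress_cube(cube)
approx = decompress_cube(scores, mean, comps)
print(scores.size / cube.size)             # fraction of values to downlink
print(np.abs(approx - cube).mean())        # mean reconstruction error
```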
Winged robot can now land like a bird on a horizontal perch
EPFL researchers have successfully invented a method that allows a flapping-wing #robot to land autonomously on a horizontal perch using a claw-like mechanism. This breakthrough has the potential to significantly expand the scope of robot-assisted tasks.
While it may look easy, perching is a highly complex maneuver that involves an extremely delicate balance of timing, high-impact forces, speed, and precision. Until now, no flapping-wing robot has been able to master it.
Raphael Zufferey, a postdoctoral fellow in the Laboratory of Intelligent Systems (LIS) and Biorobotics (BioRob) in the School of Engineering, is the first author on a recent paper describing the unique landing gear that makes this perching possible. He built and tested the model in collaboration with colleagues at the University of Seville, Spain, where the 700-gram ornithopter itself was developed as part of the European project GRIFFIN.
Zufferey said that "this is the first phase of a larger project. Once an ornithopter can master landing autonomously on a tree branch, then it has the potential to carry out specific tasks, such as unobtrusively collecting biological samples and measurements from trees. Eventually, it will be able to land on artificial structures, which could open up further areas of application."
Ornithopters, like many unmanned aerial vehicles (UAVs), have limited battery life, but they will soon be able to recharge using solar energy, making them ideal for long-range missions.
"This is a big step toward using flapping-wing robots, which as of now can really only do free flights, for manipulation tasks and other real-world applications," Zufferey says.
Landing an ornithopter on a perch without any external commands posed several engineering challenges. The ornithopter had to be able to slow down significantly as it perched, while still maintaining flight. Moreover, the claw needed to be strong enough to grasp the perch and support the weight of the robot, without being so heavy that it could not be held aloft. Lastly, the robot needed to be able to perceive its environment and the perch in front of it in relation to its own position, speed, and trajectory.
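To illustrate the timing problem this implies, here is a hypothetical trigger rule that decides when to close the claw from the robot's estimated distance and closing speed toward the perch. The numbers and the rule itself are invented for illustration and are not taken from the EPFL/Seville controller.

```python
# Hypothetical perching trigger: close the claw when the perch will be inside
# the claw's capture window once the actuation delay has elapsed.
# All parameters are illustrative assumptions.
def should_trigger_claw(distance_m, closing_speed_mps,
                        actuation_delay_s=0.08, capture_window_m=0.05):
    """Return True when the claw should start closing."""
    # Predict how far away the perch will be once the claw has actually closed
    predicted_distance = distance_m - closing_speed_mps * actuation_delay_s
    return 0.0 <= predicted_distance <= capture_window_m

# Approaching at 1.5 m/s: the trigger fires roughly 0.12-0.17 m before the perch
for d in (0.30, 0.17, 0.12, 0.05):
    print(f"{d:.2f} m -> trigger: {should_trigger_claw(d, 1.5)}")
```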
All this was achieved by equipping the ornithopter with a fully on-board computer and navigation system, which was complemented by an external motion-capture system to help it determine its position. The ornithopter's leg-claw appendage was finely calibrated to compensate for the up-and-down oscillations of flight as it attempted to home in on and grasp the perch. The claw itself was designed to absorb the robot's forward momentum upon impact, and to close quickly and firmly to support its weight. Once perched, the robot remains on the perch without expending energy.
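A back-of-the-envelope calculation gives a feel for the impact the claw has to absorb, using the 700-gram mass stated above; the approach speed and stopping distance are illustrative guesses, not figures from the paper.

```python
# Rough numbers for the perching impact (assumed speed and stopping distance).
mass_kg = 0.7                      # ornithopter mass from the article
approach_speed_mps = 2.0           # assumed forward speed at touchdown
stopping_distance_m = 0.03         # assumed travel of the shock-absorbing claw

momentum = mass_kg * approach_speed_mps                  # kg*m/s to cancel
kinetic_energy = 0.5 * mass_kg * approach_speed_mps**2   # joules to dissipate
avg_force = kinetic_energy / stopping_distance_m         # average stopping force

print(f"momentum {momentum:.2f} kg*m/s, energy {kinetic_energy:.2f} J, "
      f"average force {avg_force:.1f} N")
```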
Zufferey and his team are already working on expanding the scope of the robot to outdoor settings.
Bumble Bee Flights' air taxis are expected to be launched by April 2023
Bengaluru-based autonomous air mobility solutions provider Bumble Bee Flights has recently raised $37 million from the UK-based technology conglomerate SRAM & MRAM Technologies and Resources Limited.
The company will use the capital to build an assembly plant for manufacturing #airtaxis. The first prototype is expected to be launched by April 2023. The startup said that it would manufacture air taxis under the Bee Flights brand. The fleet of air taxis will be certified and available for production in 2024.
Bumble Bee Flights is committed to designing and manufacturing the air taxis by strategically partnering with operators. It will need to obtain certifications and licenses for markets like the US, the UK, the UAE, India, and Singapore.
Bumble Bee Flights founder Arjun Das believes that the autonomous air taxis will not only ease the burden on already strained urban road infrastructure but also contribute to #decarbonization.
“Bee1 is the first air taxi from India that will be certified in multiple countries and is intended for various uses including human transportation, air ambulance, air taxi, and in the logistics and supply chain sectors," Arjun commented.
Weighing around 300 kg, Bee1 is India’s first autonomous air mobility solution capable of human transportation; it can also be used as an air ambulance, a logistics carrier, and for defence applications. The air taxis will have the capacity to carry one person along with a suitcase and will be able to land on building rooftops without needing helipads. The #evtols currently have a range of around 20 km.