This week The Zoological Society of London (ZSL) announced Pete the Fern - a plant capable of producing electricity. Growing plants that can produce electricity opens up radical new opportunities to run AI in even the most remote locations.
“This is an exciting new development for conservationists, as they have been able to harness the natural energy created by living plants to create small amounts of electricity, allowing them to ’plug in’ to nature.” - Helena Horton, The Telegraph
Pete, the ZSL fern, is primarily a power generator. Biologists have long known that as plants grow, they release biomatter that attracts bacteria. As the bacteria break down the biomatter, they generate small but measurable amounts of energy.
To capture the energy Pete generates, the ZSL teamed with technologists to design a microbial fuel cell that continuously harvests this low-wattage energy for as long as the plant lives. In effect, this means we can now plug into the earth for the power we need.
So what can you actually plug into the earth?
The answer is: not much. Unless you have devices so energy efficient that they can operate on the tiny amounts of energy harvested from a plant, this is where the story ends.
It was this fundamental block that led ZSL to Xnor.ai.
“I was very intrigued when Alasdair Davis (leading scientist on the ZSL project) reached out and said he had an electricity-producing plant. They wanted to use the electricity for the plant to take a selfie but could not find a camera capable of energy harvesting and operating at the power levels the plant was generating.” - Saman Naderiparizi, Head of Hardware Engineering at Xnor.ai
London Zoo first heard of Xnor when we announced solar-powered AI. Through careful hardware-software co-design, Xnor was able to build an AI system so low power that it could operate entirely on energy harvested from the sun. Our device contained a camera that could see the world, running an AI algorithm that could automatically detect people and objects.
Now the ZSL team was asking whether we could push this technology even further by running it on energy produced by a plant in the soil.
"Thanks to the programmable aspect of our solar-powered AI hardware, we were able to simply update the device firmware and configure it to be used as a selfie camera for ZSL." - Colin Pate, Hardware Engineer at Xnor.ai
As the leader in ultra low-power AI, Xnor has been working at the intersection of AI and the environment.
“Working closely with conservationists is a natural fit for Xnor because we are constantly trying to reduce the power consumed in running AI algorithms. Once we proved that you could run battery-less AI, it opened up so many opportunities for us to run AI in the field with different partners” - Sophie Lebrecht, SVP of Strategy and Operations
We have used our high-accuracy object detection models to help with seal detection in the Arctic in collaboration with NOAA. Our models are small enough that they can operate in real time, at high altitude, from a plane. This has been part of an exciting effort on seal conservation led by some of the world’s leading scientists at NOAA and Microsoft.
So when ZSL asked us to complete the puzzle and provide the hardware necessary for Pete the Fern to actually take the selfie, we jumped at the opportunity.
How it works
The diagram below shows the end-to-end plant-powered AI camera system.
The plant developed by ZSL generates a low-voltage, low-current signal. This weak signal feeds into a boost harvester on the Xnor low-power AI hardware, which accumulates energy on a supercapacitor. Whenever enough energy has accumulated, the harvester enables the voltage management module, which regulates the supply and generates the multiple voltages that power the different parts of the hardware. The voltage management module remains active just long enough to capture one image frame, run an AI model, and store the frame to Flash memory.

Two LEDs on the Xnor low-power AI hardware indicate its status. Each time a full cycle completes (an image is captured, AI inference finishes, and the image is stored), the red LED blinks. When the Flash memory fills up, the red LED stays on for a noticeably longer period, indicating that the memory is full. The user can then download the images wirelessly via the low-power wireless module on the Xnor AI hardware, or connect the device to a computer with a standard USB cable and download them over USB. Once all of the images have been successfully downloaded, the green LED turns on, indicating the device is ready for another round of image capture, AI inference, and image storage.
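The duty cycle described above can be sketched as a simple state machine: sleep until the supercapacitor holds enough charge, then spend that charge on one capture-infer-store round, and signal status via the LEDs. The sketch below is illustrative only; the class name, energy threshold, and flash capacity are assumptions, not details of Xnor's actual firmware.

```python
from dataclasses import dataclass, field

# Illustrative values only; the real thresholds are not published.
CAPTURE_COST_J = 0.5   # assumed energy budget for capture + inference + store
FLASH_CAPACITY = 3     # assumed number of image slots in Flash

@dataclass
class PlantPoweredCamera:
    """Hypothetical model of the harvest-and-capture duty cycle."""
    stored_energy_j: float = 0.0
    flash: list = field(default_factory=list)
    red_led: bool = False
    green_led: bool = False

    def harvest(self, joules: float) -> None:
        # Boost harvester trickles plant energy onto the supercapacitor.
        self.stored_energy_j += joules

    def step(self) -> str:
        # Below threshold: voltage management stays off, device sleeps.
        if self.stored_energy_j < CAPTURE_COST_J:
            return "sleeping"
        # Flash full: long red LED, wait for the user to download.
        if len(self.flash) >= FLASH_CAPACITY:
            self.red_led = True
            return "flash-full"
        # Enough energy: capture a frame, run the model, store the result.
        self.stored_energy_j -= CAPTURE_COST_J
        self.flash.append("frame+inference")
        self.red_led = True  # brief blink marks one completed cycle
        return "captured"

    def download(self) -> list:
        # Offload images over wireless or USB, then signal ready.
        images, self.flash = self.flash, []
        self.red_led = False
        self.green_led = True
        return images
```

A run of this model mirrors the prose: repeated `harvest` calls slowly fill the supercapacitor, each `step` either sleeps or spends one capture's worth of energy, and `download` clears the Flash and lights the green LED for the next round.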
What does this mean for the future impact of AI?
By proving that we can harvest energy from the soil to power a camera, we open up countless opportunities for environmental monitoring and conservation.
“We were amazed that we could remove batteries and still run AI algorithms on tiny solar cells. Now we have proven that we can remove even the solar cells and simply plug into the soil! This represents a huge breakthrough for the applications of ultra-low-power AI” - Ali Farhadi, co-founder and CXO. “In the future we will be able to have low-cost, always-on remote monitoring systems that will change how we can track and monitor endangered species.”
“Innovations like this will be a critical tool for better understanding the magnitude of the current extinction crisis” - Jason Groves, Professor of Environmental Humanities, University of Washington
This technology also has the potential to go beyond conservation, supporting efforts in efficient farming and agriculture, forest fire safety, backcountry search and rescue missions, and many more.