The Irish philosopher George Berkeley, best known for his theory of immaterialism, once famously mused, “If a tree falls in a forest and no one is around to hear it, does it make a sound?”
What about AI-generated trees? They probably wouldn’t make a sound, but they are critical nonetheless for things like conservation efforts to adapt our urban forests to climate change. To that end, scientists from MIT CSAIL, Google, and Purdue University developed the novel “Tree-D Fusion” system, which merges AI and tree-growth models with Google’s Auto Arborist data to create accurate 3D models of urban trees.
The project produced the first-ever large-scale database of 600,000 environmentally aware, simulation-ready tree models across North America. This helps urban planners understand where they can build more green lungs: Toronto, for example, has about a 17.5% green canopy, while Tel Aviv has about 17%.
“We’re bridging decades of forestry science with modern AI capabilities,” says Sara Beery, MIT EECS Assistant Professor and MIT CSAIL Principal Investigator, a co-author on a new paper about Tree-D Fusion. “This allows us to not just identify trees in cities, but to predict how they’ll grow and impact their surroundings over time. We’re not ignoring the past 30 years of work in understanding how to build these 3D synthetic models, instead, we’re using AI to make this existing knowledge more useful across a broader set of individual trees in cities around North America, and eventually the globe.”
Tree-D Fusion builds on previous urban forest monitoring efforts that used Google Street View data, but branches it forward by generating complete 3D models from single images. While earlier attempts at tree modeling were limited to specific neighborhoods or struggled with accuracy at scale, Tree-D Fusion can create detailed models that include typically hidden features, such as the back sides of trees that aren’t visible in street-view photos.
AI trees and implications for making cities cooler, safer, better maintained
The technology’s practical applications extend far beyond mere observation.
City planners could use Tree-D Fusion to one day peer into the future, anticipating where growing branches might tangle with power lines, or identifying neighborhoods where strategic tree placement could maximize cooling effects and air quality improvements. They could also map how trees might respond to climate change, or how plantings could help mitigate catastrophic flooding. These predictive capabilities, the team says, could change urban forest management from reactive maintenance to proactive planning.
“This high level of specificity in tree simulation has broad applications in forestry, where species and genera vary in growth, ecological roles, and climate resilience,” says Jan Stejskal, Assistant Professor at Czech University of Life Sciences Prague, who wasn’t involved in the research. “Also, it enables city planners to simulate how urban forests affect air quality, shade, and biodiversity, helping optimize tree planting for urban cooling, carbon sequestration, and habitat creation, ultimately fostering more sustainable cities.”
A Tree Grows in Brooklyn (and many other places)
The researchers took a hybrid approach to their method, using deep learning to create a 3D envelope of each tree’s shape, then using traditional procedural models to simulate realistic branch and leaf patterns based on the tree’s genus. This combination helped the model predict how trees would grow under different environmental conditions and climate scenarios, such as varying local temperatures and differing access to groundwater.
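To make that two-stage idea concrete, here is a minimal, hypothetical sketch of how such a pipeline could be structured. It is not the authors’ actual code: the `TreeEnvelope` class, the `GENUS_PARAMS` table, and the simple heuristics standing in for the learned model are all assumptions made purely for illustration.

```python
# Illustrative sketch only: a learned "envelope" stage followed by a
# genus-conditioned procedural stage. All names and numbers are hypothetical.
import math
import random
from dataclasses import dataclass


@dataclass
class TreeEnvelope:
    """Coarse 3D bounding shape that a learned model might predict from one photo."""
    height_m: float        # estimated total tree height
    crown_radius_m: float  # estimated crown radius
    trunk_height_m: float  # height where the crown begins


def estimate_envelope_from_image(image) -> TreeEnvelope:
    """Stand-in for the deep-learning stage. In the real system a neural network
    would regress the envelope from a street-view image; here we simply return
    plausible placeholder values."""
    return TreeEnvelope(height_m=12.0, crown_radius_m=4.0, trunk_height_m=3.0)


# Hypothetical per-genus parameters for the procedural stage (illustrative only).
GENUS_PARAMS = {
    "Acer":    {"branch_angle_deg": 40, "branch_count": 30},
    "Quercus": {"branch_angle_deg": 55, "branch_count": 20},
}


def grow_branches(envelope: TreeEnvelope, genus: str, seed: int = 0) -> list:
    """Crude stand-in for a procedural growth model: shoot branches from the top
    of the trunk at the genus-typical angle and keep their tips inside the
    learned envelope."""
    rng = random.Random(seed)
    p = GENUS_PARAMS.get(genus, {"branch_angle_deg": 45, "branch_count": 25})
    angle = math.radians(p["branch_angle_deg"])
    tips = []
    for _ in range(p["branch_count"]):
        theta = rng.uniform(0, 2 * math.pi)  # direction around the trunk
        length = rng.uniform(1.0, envelope.crown_radius_m / math.sin(angle))
        x = length * math.sin(angle) * math.cos(theta)
        y = length * math.sin(angle) * math.sin(theta)
        z = min(envelope.trunk_height_m + length * math.cos(angle), envelope.height_m)
        tips.append((x, y, z))
    return tips


if __name__ == "__main__":
    env = estimate_envelope_from_image(image=None)
    tips = grow_branches(env, genus="Quercus")
    print(f"{len(tips)} branch tips generated inside a {env.height_m} m envelope")
```

The design point this sketch tries to capture is the division of labor the article describes: the learned stage only has to get the overall shape right from a single photo, while decades-old procedural growth rules fill in genus-appropriate detail within that shape.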
Now, as cities worldwide grapple with rising temperatures, the research offers a new window into the future of urban forests. In a collaboration with MIT’s Senseable City Lab, the Purdue University and Google team is embarking on a global study that reimagines trees as living climate shields. Their digital modeling system captures the intricate dance of shade patterns throughout the seasons, revealing how strategic urban forestry could transform sweltering city blocks into more naturally cooled neighborhoods.
“Every time a street mapping vehicle passes through a city now, we’re not just taking snapshots — we’re watching these urban forests evolve in real-time,” says Beery. “This continuous monitoring creates a living digital forest that mirrors its physical counterpart, offering cities a powerful lens to observe how environmental stresses shape tree health and growth patterns across their urban landscape.”
AI-based tree modeling has emerged as an ally in the quest for environmental justice: By mapping urban tree canopy in unprecedented detail, a sister project from the Google AI for Nature team has helped uncover disparities in green space access across different socioeconomic areas. “We’re not just studying urban forests — we’re trying to cultivate more equity,” says Beery. The team is now working closely with ecologists and tree health experts to refine these models, ensuring that as cities expand their green canopies, the benefits branch out to all residents equally.
While Tree-D Fusion marks some major “growth” in the field, trees can be uniquely challenging for computer vision systems. Unlike the rigid structures of buildings or vehicles that current 3D modeling techniques handle well, trees are nature’s shape-shifters — swaying in the wind, interweaving branches with neighbors, and constantly changing their form as they grow. The Tree-D Fusion models are “simulation ready” in that they can estimate the shape of the trees in the future, depending on the environmental conditions.
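As a toy illustration of what “simulation ready” could mean in practice, the sketch below queries one invented growth curve under two hypothetical climate scenarios. The logistic formula, the growth rates, and the 30-meter height cap are assumptions made here for illustration only, not values from the Tree-D Fusion paper.

```python
# Toy "what-if" query of a simulation-ready tree model. The growth curve and
# all constants below are invented for illustration, not taken from the paper.
def projected_height(current_height_m: float, years: int,
                     mean_temp_c: float, water_availability: float) -> float:
    """Project tree height with a made-up logistic curve: growth slows near an
    assumed species maximum, and heat stress or low water slows it further."""
    max_height_m = 30.0
    base_rate = 0.05  # assumed baseline yearly growth fraction
    heat_factor = max(0.2, 1.0 - abs(mean_temp_c - 15.0) / 20.0)  # penalize deviation from 15 C
    rate = base_rate * heat_factor * water_availability
    h = current_height_m
    for _ in range(years):
        h += rate * h * (1.0 - h / max_height_m)  # logistic increment toward the cap
    return h


# The same 12 m tree, 25 years out, under two hypothetical climate scenarios:
print(round(projected_height(12.0, years=25, mean_temp_c=16.0, water_availability=1.0), 1))
print(round(projected_height(12.0, years=25, mean_temp_c=22.0, water_availability=0.6), 1))
```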
“What makes this work exciting is how it pushes us to rethink fundamental assumptions in computer vision,” says Beery. “While 3D scene understanding techniques like photogrammetry or NeRF excel at capturing static objects, trees demand new approaches that can account for their dynamic nature, where even a gentle breeze can dramatically alter their structure from moment to moment.”
The team’s approach of creating rough structural envelopes that approximate each tree’s form has proven remarkably effective, but certain issues remain unsolved. Perhaps the most vexing is the “entangled tree problem”: when neighboring trees grow into each other, their intertwined branches create a puzzle that no current AI system can fully unravel.
The scientists see their dataset as a springboard for future innovations in computer vision, and they’re already exploring applications beyond street view imagery, looking to extend their approach to platforms like iNaturalist and wildlife camera traps.
“This marks just the beginning for Tree-D Fusion,” says Jae Joong Lee, a Purdue University PhD student who developed, implemented, and deployed the Tree-D Fusion algorithm. “Together with my collaborators, I envision expanding the platform’s capabilities to a planetary scale. Our goal is to use AI-driven insights in service of natural ecosystems – supporting biodiversity, promoting global sustainability, and ultimately, benefiting the health of our entire planet.”