Image of Sumedh in forest using the app on a smartphone

Thumbnail of project video: Sumedh in a forest using the app on a smartphone

WildFire Precogs: AI in the Forest

Simplifying forest sampling and carbon estimation by digitizing the most cumbersome forestry practices with AI and AR on iOS, with an emphasis on the photoload fuel sampling process

Forestry
Design
Prototyping
AI
Research
AR

STATUS

The two features designed are slated to be implemented in Ai2's Fuels Data app.
I was also selected among the top 12 at UW's 3MT (Three Minute Thesis) to present the project to a general audience.

Fuels Data App

DURATION

  • 6 months
  • Sep 2023 - Mar 2024

TEAM (x3)

  • Sumedh Supe, Product and UX
  • Ishwarya Kasu, ML Engineer
  • Anqi Pan, UI, Media and Budgeting

CONTRIBUTIONS: Product and UX

Led stakeholder management and user and technology research, and designed the interface for AI predictions.

As Product Manager, I developed and executed the feature roadmap, led technological experimentation, and aligned stakeholder needs.

As User Research Lead, I sourced participants and employed creative methods to gather actionable insights, even during foresters' off-season.


FINAL OUTCOME

Four mobile app interface screens of Wildfire PreCogs, demonstrating its functionalities. The first screen shows the app logo and tagline, the second displays a project dashboard with tasks like capturing samples, comparing fuel loads, and exporting data. The third screen features a guided image capture tool with red framing instructions, and the fourth shows a comparison mode displaying identified fuel loads in tons per acre alongside image thumbnails of different plots.

The app streamlines fuel load estimation with project management, guided image capture, and comparison tools


Through functional, iterative prototyping informed by user insights and usability tests, the team built an iOS app with two features that reduce cognitive overload and save time.

1. Automatic Guided Image Capture (ARKit): A check-scan-like feature that continuously guides users, ensuring all images of a sample are captured consistently regardless of who takes them

2. Comparison Mode (AI output interactions): A suggestion-based system that reduces the number of options to choose between, making comparisons easier


PROBLEM

Wildfires cost the US government $400-900 billion, as much as the Swiss GDP.

Wildfires cost the US govt as much as the Swiss GDP

Fuel reduction methods can prevent 60% of the damage.

Preventive methods like fuel reduction are extremely helpful


Current scientific methods are very manual and tedious, image source: FASMEE


The photoloading sampling technique is tedious, extremely manual and requires thousands of decisions and comparisons to be made

Uncontrollable wildfires cost the US government $400-900 billion every year. That's equivalent to the GDP of Switzerland. But fuel reduction methods can reduce wildfire risk by 60%.

Fuel reduction methods require accurate data from manual sampling processes that haven't seen any technological innovation in decades. With rising temperatures, it becomes even more essential to sample data frequently, and existing labor shortages exacerbate the problem, leaving foresters unable to get up-to-date data from forests.

Processes like the photoload sampling method require heavy cognitive effort, time, and manual recording of data. In this project we aim to speed up and simplify the photoload process through effective digitization and automation, ensuring timely data reaches the decision makers.

Our User

A user persona slide for a forest practitioner, showing a circular image of a smiling individual in the field. The slide includes sections labeled 'Needs,' 'Goals,' and 'Pain Points.' The 'Needs' section lists efficient sampling methods, durable tools, well-lit spaces for data collection, and simplified data entry. The 'Goals' section highlights efficient sampling, accurate categorization of fuel types, and streamlined data processes. The 'Pain Points' section includes labor-intensive manual processes, specialized training requirements, time-consuming fieldwork, limited network connectivity, and weather-influenced data.

User Persona: Forest Practitioner - A detailed overview of the needs, goals, and pain points of a forest practitioner, who actively collects data and maintains forest health in the field

After multiple subject matter expert interviews, user interviews, and desk research, we narrowed our user down to the forest practitioner: ecologists who actively or seasonally work in the forest, collecting data and running initiatives to maintain forest health.

Some striking characteristics of our users:

  • 80% of them have an iPhone issued by the US government
  • Multiple sampling processes are used across different regions, with very little standardization
  • They have to make thousands of decisions every day
  • They are in short supply, as the role requires specialized training and experience
  • They are not great photographers

Problem Statement

With all that information, and through numerous iterations, this was the problem statement we set to solve.

To address the labor-intensive and time-consuming process of fuel estimation in forests, crucial for implementing preventative wildfire measures, we are developing a mobile iOS application.

PROCESS

A diagram illustrating the Double Diamond Design Process with four phases: Discover, Define, Design, and Deliver. The timeline starts in 09/2023 with 'Discover,' focusing on research, observation, and extracting insights into the problem. The next phase, 'Define' in 11/2023, includes defining research questions, interviewing SMEs, and narrowing the focus. The 'Design' phase in 01/2024 highlights ideation, prototyping, usability test plans, and recruiting testers. The final phase, 'Deliver,' scheduled for 03/2024, involves product development, conducting evaluations, and iterating based on user feedback. The process transitions from an initial vision (problem) to a solution (launch)

We followed an iterative double diamond design approach to gather user insights in a short period of time

We started with our research questions, focused on learning more about the existing process through desk research. Gaps in the desk research were filled through primary research methods. We retained some of the participants for evaluations during later rounds.

A table titled 'Primary Research Methods' with columns for 'Information Gap,' 'Primary Research Methods,' 'Participant Details,' and 'Participant Descriptions.' Rows list various research gaps such as usage of sampling methods, cumbersome methods, and technological advancements. Corresponding methods include interviews, participatory observation, contextual inquiry, and heuristic evaluation. Participant details range from 2 to 8 total, with roles such as Fuels Specialist, Ecologist (Fire Science), Forest Fuels Lab UW Members, UW SEFS Students, Fuel's Data App team members, and the University of Iowa Forestry Department.

Filling knowledge gaps with well-rounded primary research methods

We mapped the information and insights from the foresters to come up with three commonly occurring scenarios that would provide the maximum impact when solved.

A slide titled 'Scenarios' with three scenarios described in green boxes. The first box states, 'I would like to take good pictures of the photoload sample so that I can view it later,' labeled 'Image Quality and Capture' with the number 1. The second box states, 'I would like to get fuel data loading from standardized sample plot image,' labeled 'Comparison Mode' with the number 2. The third box states, 'I would not like to carry photoload sample plot with me,' labeled 'Virtual AR Plot Frame' with the number 3. On the right, there is a mockup of a mobile app showing an AR capture frame over a forest ground, with a text bubble saying, 'Looks right! :).'

Scenarios with brainstormed ideas: if solved, they provide maximum time savings for the sampling problem

Iterative Prototyping and Evaluations

With an idea of what the solutions should look like, we began prototyping at various levels, starting with low-fidelity Figma and paper prototypes and reaching functional prototypes built in code. We conducted three user evaluation rounds on those prototypes, incrementally improving the designs. We also experimented with a number of different AR features for the virtual plot frame.

Image titled 'Evaluative Methods' presenting four usability testing approaches: Usability Tests and Heuristic Evaluations, Wizard of Oz Method, Cognitive Walkthrough, and Think-Aloud. Text descriptions accompany three photos: a group discussion in a forest, a person showing a laptop screen outdoors, and hands holding a smartphone capturing an image of a plant.

Evaluation toolkit

Once we confirmed that the features we designed were useful, we finalized them, tracked our evaluation results, and kept iterating to improve the user experience.

Table titled 'Evaluations Compared' displaying results of two evaluations across tasks for six users. The rows represent tasks related to Image Quality and Capture, Comparison Mode, and Virtual AR Frame. Cells are color-coded from green (0 assists) to red (4 assists) based on the number of assists users required. A color legend is included at the bottom of the table.

We tracked the ease of performing each task by the number of assists provided to the user; the prototypes improved significantly between rounds

Diagram titled 'User Flow' depicting five steps in the photoload process: 1) Go to sample site and organize, 2) Place square sample plot and take pictures, 3) Compare with samples, 4) Calculate, and 5) Export and report findings. Includes small images representing each step, with a background photo of individuals in an outdoor field setting working with tools and documentation.

The features designed mapped to the user flow of the foresters conducting sampling procedure

The following is a journey through our prototype iterations for both software and hardware. We realized that the foresters are quite new to technology, so we introduced guided instructions for capturing images. We also implemented a consistent design framework using Material Design. On the hardware side, the frame was covered in yellow to make computer vision easier.

Image titled 'Prototype Iteration 2' showcasing three key features of the prototype: 1) Image Quality and Capture, displaying an interface with a checkmark and a rotate icon, 2) Comparison Mode, showing sample comparisons and calculated data, and 3) Virtual AR Plot Frame, illustrating measurements overlaid on a plot frame in a natural setting. Each feature is represented in a dedicated panel with descriptive labels.

Iteration 2 after features were decided

Image titled 'Prototype Iteration 3' featuring two key prototype features: 1) Image Quality and Capture, showing a mobile interface with a 'Move Back' button and an outlined plot frame for capturing images, and 2) Comparison Mode, displaying identified fuels and their calculated ton/acre values with sample images. The background shows an outdoor setting with individuals working in the field.

Third iteration of the designs

Image titled 'Prototype Iteration 3 - Software,' detailing software features including object detection challenges, rectangle detection for angle correction, removal of the depth camera to reduce app weight, stability improvements using calibration and smoothing filters, and feedback mechanisms such as a green bounding box with OK and Cancel options. Background shows individuals working in a natural outdoor setting.

Iterations of the functional prototype

Diagram titled 'Prototype Iteration 3 - Software Architecture' showing the process from live camera feed to rectangle detection using VNDetectRectanglesRequest. Highlights the detection of rectangles, augmented graphics (red for detected, green for validated), perspective correction, and saving adjusted images to the photo library. Swift programming and Vision framework logos included, with interface and graphics layers in the workflow

Final software architecture of the functional prototype for the first automatic guided image capture feature
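As a rough illustration of this pipeline, the rectangle detection and perspective-correction steps might look like the Swift sketch below, using Vision's rectangle detection and Core Image's perspective-correction filter. The function name, thresholds, and structure are illustrative assumptions, not the actual app code:

```swift
import Vision
import CoreImage

// Illustrative sketch: detect the rectangular plot frame in a camera frame
// and warp it into a straight-on view. Names and thresholds are assumptions.
func detectAndCorrectPlotFrame(in frame: CIImage,
                               completion: @escaping (CIImage?) -> Void) {
    let request = VNDetectRectanglesRequest { request, _ in
        guard let rect = (request.results as? [VNRectangleObservation])?.first else {
            completion(nil)  // no frame detected in this camera image
            return
        }
        // Vision returns normalized (0-1) corners; scale to pixel coordinates.
        let size = frame.extent.size
        func scaled(_ p: CGPoint) -> CIVector {
            CIVector(x: p.x * size.width, y: p.y * size.height)
        }
        // Perspective-correct the detected quad so the plot appears top-down.
        let corrected = frame.applyingFilter("CIPerspectiveCorrection", parameters: [
            "inputTopLeft": scaled(rect.topLeft),
            "inputTopRight": scaled(rect.topRight),
            "inputBottomLeft": scaled(rect.bottomLeft),
            "inputBottomRight": scaled(rect.bottomRight),
        ])
        completion(corrected)
    }
    request.minimumConfidence = 0.8   // drop low-confidence detections
    request.minimumAspectRatio = 0.7  // expect a roughly square plot frame
    let handler = VNImageRequestHandler(ciImage: frame)
    try? handler.perform([request])
}
```

In a live app, detections below the confidence threshold would drive the red "detected" overlay, and validated rectangles the green one, before the corrected image is saved to the photo library.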


Changing the color of the frame allows for higher contrast in computer vision

We also performed exploratory work to determine how the virtual frame would behave with regard to the height of the phone above the ground.

Slide titled 'Exploratory Work: Camera Height iPhone' showing two bar charts and an image of a 1m² plot frame. The first chart compares the minimum distance from the ground required to capture a 1m² plot frame using different iPhone camera settings: 0.5x, 1x, and Measure Tool Default. The second chart plots error percentage against phone distance, indicating an increase in error with greater distance. On the right is a photo of a 1m² plot frame captured using a tripod, with measurement annotations.

iPhone's depth cameras combined with ARKit weren't suitable for creating the virtual frame, as they required the phone to be held well above head height
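The height requirement follows from simple lens geometry: the minimum camera height needed to fit the full 1 m² frame in view depends on the field of view. The sketch below is a back-of-the-envelope check, and the FOV values are rough assumptions, not measured values:

```swift
import Foundation

// Back-of-the-envelope estimate (assumed FOV values, not measurements):
// minimum camera height needed to fit a square plot frame fully in view.
func minimumHeight(frameSideMeters: Double, fovDegrees: Double) -> Double {
    let halfFov = fovDegrees / 2 * .pi / 180
    // Half the frame side must fall within the half-angle of view.
    return (frameSideMeters / 2) / tan(halfFov)
}

let h1x  = minimumHeight(frameSideMeters: 1.0, fovDegrees: 69)   // assumed 1x lens FOV
let h05x = minimumHeight(frameSideMeters: 1.0, fovDegrees: 106)  // assumed 0.5x lens FOV
print(h1x, h05x)  // roughly 0.73 m and 0.38 m
```

In practice the narrower vertical field of view, plus margin for guidance overlays, pushes the required height above these idealized minimums.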

Emerging business case and commercialization

Commercialization was never the goal; the goal was to create an openly accessible tool. That said, the tool can also help reforestation efforts understand carbon composition.

Diagram titled 'Emerging Business Case' showing interconnected components: Key Resources (technical team, forest management expertise, partnerships), Key Partners (non-profits, state and federal forest services), Key Activities (app development, workshops, partnerships), Customer Relationships (workshops, online support), Cost Structure (app development, marketing, training), Revenue Streams (organizational purchases, funding from non-profits), Value Propositions (cost-effective sampling for organizations, simplified process for end-users), and Customer Segments (natural resource organizations, forest practitioners). Background shows a forest area with signage and individuals.

Our major value proposition is cost and time savings; this plan discusses how we intend to onboard people to the app


RESULTS

Toward the final evaluation, we created the functional high-fidelity prototype of the Automatic Guided Image Capture, since it would support the Comparison Mode: capturing consistent images would provide labeled training data for our recommendation system. We had the following research questions:

  • Can the functional prototype help people move their camera to a uniform position?
  • Can the functional prototype collect samples that conform to the quality measures discussed, compared to a normal camera?
  • Does Material 3 Design solve the navigational issues encountered with previous prototypes?
  • Does the app provide a consistent way to practice photoload sampling?

We found that the Automatic Guided Image Capture reduced data collection time from hours to less than 2 minutes, consistently across different people.

Chart titled 'Comparing Images for Samples,' showing three lines: blue for pictures taken for samples, red for blurry images, and green for properly cropped images. The x-axis represents different user results (Expected 1 and 2, Result A1, A2, B1, and B2), and the y-axis represents the number of images. Annotations on the right indicate '90% well-cropped images' and '70% clear images,' with a background showing an outdoor setting.

Comparison of images for samples, highlighting the number of pictures taken, blurry images, and properly cropped images, with metrics indicating 90% well-cropped and 70% clear images.

Slide featuring three metrics: '<2 minutes' for sampling time with a stopwatch icon, '100%' preference for app-based images with a thumbs-up icon, and '100-110 cm' as the optimal camera distance from the plot with an upward arrow icon. The background shows individuals working outdoors with tools and a sample plot frame.

Key metrics for the sampling process, highlighting efficiency (less than 2 minutes per sample), 100% preference for app-based images, and an optimal camera distance of 100-110 cm from the plot

Slide titled 'How our work helps' featuring three icons: a stressed person labeled 'Human-Error,' a bored face labeled 'Tedious,' and an hourglass labeled 'Time-Consuming.' Corresponding solutions include consistent distance and image gathering, reduced decision-making with future AI-based comparison modes, and reduced time managing sample data for faster processes.

Our work digitizes the process and introduces automation at time-consuming junctions, addressing major forester pain points


REFLECTIONS

We had underestimated how difficult it would be to reach users, especially to observe foresters performing sampling work. The project started at a time when foresters were not performing any sampling activities, so we had to improvise and attend workshops to meet foresters and learn their craft.

This was a new space for everyone; no one had a background in forestry, and it took us quite a long time to get used to the terminology and how forest sampling worked. It was a great learning experience nonetheless. If we started again, this is where we would save the most time.

We found out first-hand that getting computer vision to work in real life is very hard, especially with images of twigs. Even major multimodal LLMs cannot reliably tell twigs apart from each other. Hopefully, well-collected and labeled data will help in the future.

This project would not have been possible without a great team and the help of some amazing mentors at GIX and Ai2's Wildlands team; I am thankful to them. I strongly believe this work will help create a world where uncontrollable wildfires cease to exist.


Status: The features and designs will be implemented in Ai2's Fuels Data app, which foresters use to sample data. I was also selected across UW to present this project to a general audience.