
5 Q’s for Adam Szymanski, Software Engineer at Argonne National Laboratory

by Hodan Omaar

The Center for Data Innovation spoke with Adam Szymanski, a software engineer at the Argonne National Laboratory who is using AI cameras to monitor how birds interact with solar infrastructure. Szymanski discussed how this data can help quantify avian mortality near solar facilities, how AI cameras can increase the types of avian data collected, and the challenges neural networks face in detecting birds.

Hodan Omaar: An Argonne study published in 2016 estimates solar facilities kill between 37,800 and 138,600 birds each year. Why is the range so large and the link between solar facilities and bird populations so unclear, and how is your team using AI to better quantify this?

Adam Szymanski: The 2016 avian-solar mortality estimate was based on the limited data available at the time and focused on data from concentrating solar power, which uses mirrors to concentrate the sun’s energy, and photovoltaic technologies, which generate electricity directly from sunlight via an electronic process that occurs naturally in semiconductors. As a result of the limited amount of available data and the wide range of reported fatalities within those datasets, our estimated mortality rates covered a conservatively wide range.

The key take-away from this paper, and others that have looked at avian-solar mortality, is that observed avian mortality at solar facilities appears to be relatively low compared to other forms of avian mortality but is still high enough to warrant investigating the nature and magnitude of the problem and ways we can minimize and mitigate it.  

The goal of this AI camera monitoring project is to provide cost-effective continuous monitoring to increase the amount, quality, and variety of data that will help researchers better understand these interactions and whether there might be specific impacts to certain bird populations.

Omaar: This project grew out of an earlier one that trained computers to detect drones flying in the sky. How did you build on those capabilities, and what new challenges do birds present to monitoring that drones do not?

Szymanski: In the previous project to detect drones, we developed algorithms and models to look for flying things and classify them, and in fact, one of the classification categories in that neural network model was birds. It seemed natural to use similar techniques and simply shift the main focus from drones to birds.

However, a few things are more challenging in detecting birds around solar panels than drones in the sky. For one, it is more difficult for the moving object detection algorithm to differentiate what is moving from what is static background. In the drone detection case, the camera faces up at the sky, creating a backdrop where generally nothing else moves apart from the object we want to classify. At a solar facility, the camera faces the solar panels, producing a background that includes many other moving objects that need to be accounted for. Another challenge is that we do not only want to detect the birds, we want to classify their activities over time too. This requires capturing and analyzing a sequence of images instead of a single frame.
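The core idea behind moving object detection here, comparing consecutive frames and flagging pixels that change, can be sketched minimally. This is hypothetical illustration, not the lab's actual algorithm (which the article does not detail), and `moving_pixels` and its threshold are invented for the example:

```python
def moving_pixels(prev_frame, frame, threshold):
    """Flag pixels whose intensity changed by more than `threshold`
    between two consecutive grayscale frames (lists of rows)."""
    return [
        [abs(a - b) > threshold for a, b in zip(row_p, row_c)]
        for row_p, row_c in zip(prev_frame, frame)
    ]

# Against a uniform sky, only the bird's pixels register as motion:
sky_prev = [[10, 10, 10], [10, 10, 10]]
sky_curr = [[10, 90, 10], [10, 10, 10]]  # bright object appears at (0, 1)
print(moving_pixels(sky_prev, sky_curr, threshold=30))
```

Against a solar facility backdrop, many background pixels (swaying vegetation, glinting panels) would also trip this test, which is why the cluttered-scene case demands more sophisticated models than simple frame differencing.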

Omaar: Who are the desired end users for this system and what sorts of questions might they be better able to answer by detecting and monitoring bird activity?

Szymanski: The target end users for this system, and the data it will generate, are the solar energy industry, which could eventually install the monitoring system; federal and state agencies responsible for protecting wildlife; and non-governmental organizations, such as wildlife and avian conservation groups, that have an interest in understanding avian-solar interactions.

Using our system to collect a large volume of accurate data on avian-solar interactions can help detect patterns and begin answering several key questions such as: Are certain types of birds more prone to strikes? Do collisions increase at certain times of the day or year? Does geographic location of the solar panels play a role in the types of interactions? Do solar energy facilities provide viable habitat for birds? Users can also monitor other wildlife using our technological framework by retraining AI with appropriate data.

Omaar: How much computing is performed on site at solar facilities and in what ways has COVID-19 impacted data collection?

Szymanski: The camera system we are developing will have an on-board GPU processor that can run our tracking and classification algorithms on the device itself, also known as computing “at the edge.” Doing this will take the burden off the solar facilities themselves to have onsite, high-performance computing or a large network bandwidth for transferring video streams. Because we are computing at the edge, the only data that will need to be sent back to operators or analysts are the bird detections and metadata about the birds’ activities, which requires much less data transfer.
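The bandwidth saving Szymanski describes comes from shipping compact detection records instead of raw video. A minimal sketch of what such a record might look like follows; the field names and activity labels are assumptions for illustration, not the project's actual data format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class BirdDetection:
    """Hypothetical per-detection record sent from the edge device."""
    timestamp: str        # when the detection occurred (ISO 8601)
    species_guess: str    # classifier output label (illustrative)
    confidence: float     # classifier confidence, 0.0 to 1.0
    activity: str         # e.g. "perching", "flying", "collision"

# Only this small JSON payload leaves the camera, not the video stream:
det = BirdDetection("2020-06-01T12:34:56Z", "swallow", 0.87, "flying")
payload = json.dumps(asdict(det))
print(payload)
```

A record like this is a few hundred bytes, versus megabytes per second for continuous video, which is what makes edge processing viable on ordinary facility network links.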

In the current phase of the program, we are collecting raw video data of birds at solar facilities that we will use to train the machine learning models. To do this, we need to visit the facilities to set up equipment and, in some cases, retrieve video. The unexpected COVID-19 crisis has required us to redesign our project activities slightly. However, the team and our facility partners have quickly adapted to the situation and we have a new, safe protocol. We have collected several hundred hours of video so far and are optimistic about continuing progress toward our goal.

Omaar: The project is still in the first year of three; what lessons have you learned and how will this impact the project going forward?

Szymanski: As we analyze the data collected from several solar sites, we have learned that there is a wide variety of background scenery. From vegetation and infrastructure to different types of wildlife and terrain, the models we are developing need to perform well in many different environments, and we need to take care to test and validate them in each.

In addition to the technical aspect, we have learned that building partnerships with solar facilities and keeping stakeholders informed on our technology development are important. Without their interest and support we will not be able to effectively accomplish our goal. Building and maintaining trusted relationships takes time and effort, and we are committed to continuing to make this a priority.
