Dr. John Adler is a geographer and remote sensing professional with a diverse career in academia, government, and ecological research. He serves as a Lead Airborne Sensor Operator for the National Ecological Observatory Network (NEON), where he conducts airborne surveys across the nation during periods of peak vegetation greenness, contributing to large-scale ecological data collection efforts. In this episode, Austin chats with John about the ins and outs of collecting airborne lidar data in concert with RGB + hyperspectral sensors over NEON sites from Alaska to Puerto Rico. They close with a discussion of what the future may hold for NEON data collection and upload.
Episode Transcript
#21 – John Adler
June 19th, 2025
{Music}
Announcer: Welcome to the LIDAR Magazine Podcast, bringing measurement, positioning and imaging technologies to light. This event was made possible thanks to the generous support of rapidlasso, producer of the LAStools software suite.
Austin Madson(00:19)
Hey everyone and welcome to the LIDAR Magazine podcast series. My name is Austin Madson and I’m an associate editor at LIDAR Magazine. Thanks for tuning in as we continue our journey exploring the many different applications of lidar remote sensing. We’re really happy to have the opportunity to talk to Dr. John Adler today, a lead airborne sensor operator for NEON, the National Science Foundation’s National Ecological Observatory Network. Dr. John Adler is a geographer and remote sensing professional with a diverse career in academia, government, and ecological research. He holds a PhD in geography from the University of Colorado Boulder, where he teaches courses on uncrewed aircraft systems and remote sensing. At the National Ecological Observatory Network, or NEON, operated by Battelle, Dr. Adler serves as the lead airborne sensor operator.
In this role, he conducts airborne surveys across the nation during periods of peak vegetation greenness, contributing to large-scale ecological data collection efforts. He’s a former NOAA and US Navy aerial navigator. Dr. Adler has conducted missions to both poles and through 85 hurricane eyewalls, reflecting a career dedicated to advancing Earth observation through cutting-edge technology and interdisciplinary efforts. So thanks for joining us today, Dr. Adler. And as I understand it, you’re fresh off the plane, so I hope you guys had a good couple of flights today.
John Adler (01:44)
Right, as you mentioned, we ended up doing what I call a “double shuttle,” because the conditions were very favorable today. So we did two full flights. We were able to finish one of our boxes.
Austin Madson(01:55)
That’s fantastic. Love to hear it. Dr. Adler, let’s start by having you talk a little bit about what NEON actually is.
John Adler (02:04)
Sure. Big picture, NEON: our mission is to advance understanding of the complex interactions between ecosystems and the environment. We provide open-access, high-quality, long-term, okay, on the order of 30 years, ecological data; that’s our goal. And other folks, scientists, researchers, and the public, can access all of our data via neonscience.org and do the actual science analysis.
It’s a National Science Foundation facility operated by Battelle, with about 81 individual sites across the US.
Austin Madson(02:43)
81. Okay, so you guys do a fair bit of work at different locations. So, NEON itself does a lot of different things from my understanding. What is your specific role there, Dr. Adler, and how long have you been a part of the program?
John Adler (02:58)
So I’ve been there 11 years now. And as you mentioned, I’m in the back of the plane. I’m one of the operators, and we’re running the different systems we have: hyperspectral, lidar and camera systems. But real quick, let me throw this in, because it’s more than just lidar. I know the audience is mostly lidar, but in order to understand our operations, you have to see what the overall NEON project is. They’ve broken the US up into these 20 ecological zones; we call them domains. Each domain has a distinct landform, vegetation, and ecosystem. Within each of these 20 or so, there’s typically three main collection sites.
Some have four, so that’s how we get up to our 81 total sites. But within these 20, there’s going to be one primary site with at least one large tower. You can imagine this tower is going above the vegetation there. So if you’re out in the Pacific Northwest, you’re going to have a very high tower, and if you’re out somewhere with lower vegetation, it’s a lower tower. Our remote sensing group is a small part of NEON, and I just want to cover that there’s a whole bunch of people on the ground collecting biological data, taking in situ, meaning on-site and in-place, sampling too. So you can kind of think of us as in between that in situ sampling, where they can get the data right where the people are walking around but it doesn’t cover a vast area, and the overhead satellite imagery. We are on the order of about a hundred square kilometers, or 10 by 10 kilometer boxes. This is our goal at these sites, and so each of the 81 or so sites has a similar kind of area of interest, that same 100 square kilometer grid exactly over that tower, which is our main collection site. And the tower is instrumented so you can get flux measurements, etcetera. So we’ll put that 100 square kilometer box over that at each of the sites. And then we have the other sites too, but the primary interest is that towered site.
Austin Madson(05:18)
Got it, yeah, 100 square kilometers. It’s not the one square kilometer I just mistakenly said.
John Adler (05:23)
Right. So it’s bigger than what anybody could cover on foot, and that’s the point. And then we fly, it’s interesting, we’re flying at 3000 feet above the ground. So fairly close from an airborne perspective, right? In science terms, that’s about a thousand meters. So it is interesting: on the aviation side, everything’s in feet; on the science side, everything’s in meters. So you’ve got to make sure we’re talking the same distances as we communicate between science and flight.
Austin Madson(05:54)
Right, yeah. Well, so how is your work as a NEON airborne sensor operator kind of spread across the year? I assume, right, there’s a certain season for flying and for planning and calibrating. And can you talk a little bit about what that looks like?
John Adler (06:09)
Sure. From an annual point of view, our field collection usually begins somewhere in mid-March. With the Twin Otter aircraft that we fly, which come from Twin Otter International, we will do probably about two weeks of calibration flights. So inside we have the lidar, we have hyperspectral and the regular camera. We do that out of our main headquarters in Boulder, Colorado. Then from there, the planes head out all across the US to these different domains we talked about, which go from basically Deadhorse or Prudhoe Bay, Alaska, way up on the North Slope, all the way down to Puerto Rico. During the summer period when the vegetation is growing, that’s our main season for data collection; that goes into about the September timeframe, and then the planes come back. We do a post-calibration, about two weeks per plane. And then over the winter months, we’re calibrating the hyperspectral sensor, and we’ll send the lidar sensors back to the manufacturers because we use COTS systems. We’ll also do flight planning for the following year and other tasks as required. So that’s what the winter’s like.
Austin Madson(07:18)
Gotcha. Yeah. So quite a few different types of work through each season. How long did you say the field collection season was?
John Adler (07:26)
Basically, I’d say the first of April to about the fourth or fifth of September.
Austin Madson(07:32)
And so this segues into my next question. Y’all have about 81 NEON field sites, right? I think 47 terrestrial and 34 freshwater aquatic sites. And they range, like you said, Dr. Adler, from Alaska to Hawaii, all the way to Puerto Rico, and all over the continental United States in between. So how do you decide, you know, which of those field sites get collected in a given year? And I guess, at what part of the year do you determine you need to go and collect that data?
John Adler (08:03)
In theory, we would want to get all the sites every year. But funding levels can become flat for a couple of years in a row and then go up, so you have to make some changes. But what’s really driving it is, remember, we have a lot of our teams on the ground doing those collections in situ. And so we try to fly at the same time they’re at that site doing the collections. So we kind of follow along with where the ground teams are going to be.
So then they will pick, let’s say, every four out of five years or three out of five years, we want to hit certain sites. So based on that distribution, we’ll do that as our primary focus, and then we’ll fill in the other sites around that.
Austin Madson(08:48)
I see. Do they select the time at which they collect in situ data at a given field site based on kind of the peak greenness for the individual field site?
John Adler (09:01)
You got it exactly. So peak greenness is the term that we use. And in order to plan for that, we look at historical satellite data; plants reflect very well in the infrared. And so we can use that satellite data to see when the vegetation in this particular ecosystem is the most vibrant, right? Giving off the most reflected infrared energy.
And that’s when the plants are growing the most, which is when we want to go fly. And so for this peak greenness period, we’ll come up with a beginning date and an end date, and we use 90% of peak greenness. So we kind of straddle the center there. Let’s say we come up with a three or four week period. So then each site would have this box that says, okay, during this two-to-four-week period or whatever it is, how do we mix that in along with those other programming priorities to get overhead of the in situ sampling?
So peak greenness is what drives it. And what’s interesting, if you think of the sun, is it’s coming up from the equinox in March, and it’s going up north in June, and then it comes back down. Our flights kind of follow that. We kind of start off in Florida and Texas. Okay, right now we’re in Florida. As summer goes on, we’ll head up north, we’ll go up to the Dakotas, we’ll go into Oregon and then we’ll go up into Alaska, and then we’ll kind of come back down south again to meet the peak greenness, and we’ll end up in Arizona, et cetera.
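As a rough illustration of that planning step, here is a minimal sketch, assuming a generic NDVI time series pulled from an archive such as Landsat or MODIS, of how a 90%-of-peak greenness window might be estimated from historical data. The function name and the toy climatology are hypothetical, not NEON’s actual tooling.

```python
import numpy as np

def peak_greenness_window(doy, ndvi, threshold=0.90):
    """Return the (start_day, end_day) span whose NDVI stays within
    `threshold` of the seasonal peak for this site."""
    doy = np.asarray(doy)
    ndvi = np.asarray(ndvi, dtype=float)
    in_window = ndvi >= threshold * ndvi.max()
    return doy[in_window].min(), doy[in_window].max()

# Toy multi-year climatology: greenness ramps up, peaks mid-summer, declines.
doy = np.arange(60, 300, 8)                              # ~8-day composites
ndvi = 0.3 + 0.5 * np.exp(-((doy - 190) / 45.0) ** 2)    # synthetic curve
start, end = peak_greenness_window(doy, ndvi)
print(f"90%-of-peak greenness window: day {start} to day {end}")
```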
Austin Madson(10:32)
Gotcha, so you’re just a regular old sun tracker, huh?
John Adler (10:36)
In a way, yeah. And of course, it’s all site-specific, but that’s kind of how we operate.
Austin Madson(10:42)
Well, so you guys have a diverse array of field sites. Some have different characteristics than others, and some people might prefer different ecosystems and landscapes than others. So how does your team, which there are, I think, a few of you, how does your team decide on who goes where? Do you really like to go to Alaska? Do you tell your coworkers, like, hey, don’t touch Alaska, Alaska’s mine?
John Adler (11:05)
Right, right. Well, we have a very small flight team, you know, and there’s low turnover. We enjoy our jobs. And it takes a special kind of person, because you are gone for quite a while, right? No weekends during the summer. A typical timeframe is you go out for three to four weeks, then you roll off and have a week off, and then somebody else comes in. But as far as that goes, you usually put in your number one site that you want to go to, and you also might put in a request, like, I’m going to my sister’s wedding or something, right? So it’s a blend of that week of time off versus what sites you’re interested in going to. And all that goes in to Cameron, who runs our schedule for us, and he does a great job with that.
Austin Madson(11:57)
Yeah, you have lots of logistics. And speaking of logistics, Dr. Adler, we talked a little bit about timing with field crews and peak greenness. Are there any other logistical challenges that you all face as you’re thinking about mission planning and trying to hit different sites?
John Adler (12:15)
Sure. So normally, if you’re flying lidar and you’re doing survey, you’re not too worried about the sun, and so you can fly, you know, any time period, basically. You are worried about clouds, right? Because the laser is not going to penetrate through clouds. But in our case, we’re worried about clouds, and we’re also worried about the sun, because we need to have a certain amount of illumination. And we also want the sun fairly high in the sky so there’s not a lot of shadowing. It’s a 40 degree sun angle.
So from when the sun rises above the 40 degree mark to when it sets back down through the 40 degree mark, that’s kind of our window in order to meet the requirements for the hyperspectral sensor. So that’s another item that gets spliced in there.
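For a feel of how tight that constraint is, here is a minimal sketch that estimates the hours when the sun sits above 40 degrees at a mid-latitude site near the solstice. It uses a textbook solar-declination approximation, not any NEON planning tool, and works in local solar time for simplicity.

```python
import numpy as np

def solar_elevation(lat_deg, doy, solar_hour):
    """Approximate solar elevation angle (degrees) at local solar time,
    using a textbook declination formula; good enough for rough
    flight-window planning, not for precise ephemeris work."""
    lat = np.radians(lat_deg)
    decl = np.radians(-23.44 * np.cos(np.radians(360.0 / 365.0 * (doy + 10))))
    hour_angle = np.radians(15.0 * (solar_hour - 12.0))   # 15 degrees/hour
    sin_elev = (np.sin(lat) * np.sin(decl)
                + np.cos(lat) * np.cos(decl) * np.cos(hour_angle))
    return np.degrees(np.arcsin(sin_elev))

# When is the sun above 40 degrees at a 40 N site near the June solstice?
hours = np.linspace(5.0, 19.0, 841)           # ~1-minute steps, solar time
elev = solar_elevation(40.0, 172, hours)
flyable = hours[elev >= 40.0]
print(f"Flyable window: ~{flyable.min():.1f} h to ~{flyable.max():.1f} h solar time")
```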
Austin Madson(12:58)
Right, because you aren’t just flying the lidar, right? You’re collecting all of these other data sets that are required for the science.
John Adler (13:05)
Right. And if I could go back to that 3000 feet, that’s kind of what drives our altitude. Other lidar surveys want to be high and want to go fast, because the higher you are, the more ground you can cover, and you need really fast lidar systems, which are constantly improving in order to meet those requirements. But for us, with that hyperspectral sensor, we want a one meter pixel on the ground for all of these 400-plus color channels.
And so that means we’ve got to be down closer. So that’s what drove our 3000 foot altitude above ground level, AGL.
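The arithmetic behind that altitude choice is simple: for a push-broom imager, ground pixel size is roughly altitude times the sensor’s instantaneous field of view. A quick sketch, where the 1 mrad IFOV is an assumed round number for illustration, not a quoted NIS specification:

```python
# Ground sample distance for a push-broom imager: GSD ~= AGL * IFOV.
# The 1 mrad IFOV is an assumed round number for illustration, not a
# quoted NIS specification.
agl_ft = 3000
agl_m = agl_ft * 0.3048        # ~914 m above ground level
ifov_rad = 1.0e-3              # assumed instantaneous field of view
gsd_m = agl_m * ifov_rad
print(f"{agl_ft} ft AGL ({agl_m:.0f} m) -> GSD ~= {gsd_m:.2f} m per pixel")
```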
Austin Madson(13:43)
I see. Well, let’s pause for a quick second and hear from our sponsor, LAStools.
—
The LIDAR Magazine Podcast is brought to you by rapidlasso. Our LAStools software suite offers the fastest and most memory efficient solution for batch-scripted multi-core lidar processing. Watch as we turn billions of lidar points into useful products at blazing speeds with impossibly low memory requirements. For seamless processing of the largest datasets, we also offer our BLAST extension. Visit rapidlasso.de for details.
—
Austin Madson(14:21)
Dr. Adler, let’s talk a little bit about some of the different systems that you and your team operate. I know you’d mentioned obviously the laser scanners and the hyperspectral. So can you talk about the instrumentation that you and your team have and some of the nuances and maybe the restrictions when it comes to planning those missions and collecting and processing that data so that the kind of downstream science team can properly process and utilize everything?
John Adler (14:47)
Sure, sure. So we have three systems, three aircraft systems, and they call our team the AOP, the Airborne Observation Platform. So we have three lidar systems. Currently we have a RIEGL, the VQ-780 II; that’s our brand new model we’re just fielding right now. Our first flights will be starting on Wednesday coming up over one of the sites, right? Yeah, out of Texas. We also have a RIEGL Q-780.
Austin Madson(15:10)
Exciting.
John Adler (15:16)
And then we also have an Optech Galaxy Prime. So each one of those lidars is mated with a specific hyperspectral sensor, and then we drop it into the planes as we need. And then of course we have a regular Phase One, about 100 megapixel, camera system with that.
Austin Madson(15:33)
I know in a previous discussion we had talked about how your team is limited to certain kinds of FBO locations and hangar types because of the restrictions that come with some of these sensors. Can you talk a little bit about that?
John Adler (15:47)
Yeah, and also what drives our speed. The optical sensor was built by JPL. We call it the NIS, the NEON Imaging Spectrometer. There are not very many of these, but they have to be cryo-cooled. It takes about two days to cool them down to about 140 Kelvin, with a very low pressure, vacuum kind of atmosphere in there at the sensor, and so therefore we need 220-volt power 24 hours a day. So we have to go to certain airfields with a hangar that has that kind of power capability. So that’s another thing that we do in the off season, you know, map out and make sure that we have the right places to go so that this thing is plugged in all the time.
Austin Madson(16:34)
Right. And what happens if the power goes out?
John Adler (16:39)
Well, we’ve got a pretty robust telemetry system. And so basically, through Wi-Fi or two different cellular connections, it’s constantly monitoring that. And so we will get a ping if it loses power, et cetera. We also, it’s kind of interesting, have an eight-minute backup battery system. So when we tow the plane out of the hangar, we have to turn off the power. It goes onto this UPS out of the hangar, and then we plug in a separate ground power unit. And so that’s how we can survive certain transient power problems that might come up. So the UPS really helps us out, and it gets us out of the hangar.
Austin Madson(17:20)
Well, it sounds a little bit stressful having to keep those sensors powered. Have you had any close calls or any interesting stories where you’re like, shoot, we need to get the hyperspectral sensor powered up ASAP?
John Adler (17:36)
Well, yes. We definitely have to tell the line crew ahead of time, because when you’re moving planes, some of the hangars, they’ve got to move other planes around or something like that. And so we’re watching the LEDs click off as the power levels drop, because we’re pulling a lot of amps, I’m talking about 80 amps or so. So it’s a lot of power coming out of these things. And so you’re watching the lights go down and just hoping that you get back in there.
Once you can plug it in again, usually, if you lose power, you can recover in a couple of hours, so sometimes at a fuel stop or something like that. But it’s still a little bit of time to recover, and you want to charge up your UPS again.
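To put those numbers together: 80 amps at 220 volts is a substantial load, and an eight-minute battery bridge implies a few kilowatt-hours of usable storage. A back-of-the-envelope sketch, treating the load as constant for simplicity:

```python
# Rough energy budget for bridging a hangar move on battery. The
# 220 V / 80 A figures come from the conversation; a constant load is
# a simplification for illustration.
volts, amps = 220.0, 80.0
power_kw = volts * amps / 1000.0              # ~17.6 kW draw
bridge_minutes = 8.0                          # quoted UPS endurance
energy_kwh = power_kw * bridge_minutes / 60.0
print(f"Load ~{power_kw:.1f} kW; an 8-minute bridge needs ~{energy_kwh:.1f} kWh")
```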
Austin Madson(18:18)
Yeah, totally. Well, what about, I know you had mentioned kind of a low flight speed. Can you talk a little bit about that and what that’s for?
John Adler (18:28)
Yeah, that’s a really interesting concept, because when we’re taking images with the hyperspectral, even though it’s a regular imaging chip, kind of the standard 600 by 480 type pixels, it’s only using one pixel across. So the swath is one pixel across, 600 of those. Right, a push-broom type sensor. And so as the plane goes forward, it images, and then it images again along that one-pixel swath. And so...
Austin Madson(18:40)
Okay.
John Adler (18:55)
It builds up that 2D picture that we’re used to, like with our iPhones. If the plane goes too fast, you can imagine you’re going to be skipping some lines in there. And so the speed that we’re set for is 97 knots; that’s our perfect spot to fly so that we’re not flying too fast and we won’t skip over lines. But yeah, it’s kind of interesting that we’ve got to build it up. Now you ask, well, why is it that way if you have this 2D chip? Well, it’s a grating. It’s kind of like a prism, but it’s called a grating. Basically the light will come up into the sensor and then gets fanned out across the color spectrum. So it’s going from the blue, maybe 320 nanometers, all the way up to 2500 nanometers. All those color channels are what’s going up the other axis of the chip. So that’s why it’s only one pixel wide and 600 or so across.
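The speed limit falls out of simple geometry: to avoid along-track gaps, the sensor’s line rate must be at least the ground speed divided by the pixel size. A minimal sketch of that calculation, using the 97 knots and 1 m pixel from the conversation:

```python
# Minimum line rate for a push-broom sensor so consecutive scan lines
# don't leave along-track gaps: line_rate >= ground_speed / pixel_size.
knots_to_ms = 0.514444
ground_speed_ms = 97 * knots_to_ms     # ~49.9 m/s at the quoted 97 knots
along_track_pixel_m = 1.0              # 1 m pixel, per the NIS requirement
min_line_rate_hz = ground_speed_ms / along_track_pixel_m
print(f"~{ground_speed_ms:.1f} m/s -> need at least {min_line_rate_hz:.0f} lines per second")
```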
Austin Madson(19:42)
Right. And you can tell that you’ve taught remote sensing classes.
John Adler (19:47)
Yeah, it is a hard concept, but it’s amazing this all works. And it’s a non-gimbaled system too, which, you know, takes a lot of extra processing effort to get everything geo-rectified.
Austin Madson(19:59)
Right, and y’all have been doing this, well, at least you said, Dr. Adler, for 11 years or so, so hopefully everything’s kind of pegged in and all your workflows are good. Have y’all had to make any changes to workflows over the last couple of years, or do y’all have it pretty dialed in?
John Adler (20:14)
It’s pretty dialed in, but when you get new systems, as we know with any kind of computer system, if you advance your technology, you’ve got to go back and do some scrubbing. Also, we’ve learned better ways to do things. An example is what’s called the BRDF, the bidirectional reflectance distribution function. That’s something new that’s being put in. And so we go back into the previous data from the hyperspectral sensor and then recompute all that, taking into account this new function,
which improves the data quality there.
Austin Madson(20:46)
Let’s talk a little bit about the lidar data that y’all’s team acquires and kind of how it’s used within NEON itself.
John Adler (20:54)
So we have two forms of lidar data. Traditionally, there’s what’s called discrete data, which is, you imagine a photon goes out and it bounces back and you get an XYZ coordinate, right? And then we also have waveform data. So imagine that the lidar recorder is kind of listening for all the photons, and it’s giving you almost like a profile view. So if you can imagine a tree, the lidar returns are coming down the side of the tree from top to bottom.
It’s kind of giving you the profile view of that tree. So that’s a waveform recorder. And there were two approaches by two different companies. The RIEGL system actually records the waveform and then samples it back out to get you the discrete data. Whereas the Galaxy system has the traditional discrete recorder, but it also has kind of an attachment that’s recording the waveform as well.
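Here is a toy sketch of the relationship between the two data forms: a simulated waveform with returns from a canopy top, a mid-canopy branch, and the ground, from which discrete ranges are pulled out by simple peak detection. Production waveform decomposition (for example, Gaussian fitting) is considerably more involved; all amplitudes and timings below are made up.

```python
import numpy as np
from scipy.signal import find_peaks

# One simulated outgoing pulse hitting a tree: returns from the canopy
# top, a mid-canopy branch, and the ground appear as peaks in the
# recorded waveform. All amplitudes and timings are made up.
t = np.arange(0.0, 200e-9, 1e-9)               # 1 ns samples, 200 ns record
def echo(t0, amp, width=3e-9):
    return amp * np.exp(-((t - t0) / width) ** 2)

waveform = echo(60e-9, 1.0) + echo(85e-9, 0.4) + echo(130e-9, 0.8)
waveform += np.random.default_rng(0).normal(0.0, 0.02, t.size)   # noise

# "Discrete" returns: each significant peak becomes a range (relative
# to the start of the record; two-way travel time divided by two).
c = 299_792_458.0
peaks, _ = find_peaks(waveform, height=0.2, distance=5)
print(np.round(c * t[peaks] / 2.0, 2))          # three ranges, in meters
```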
Austin Madson(21:55)
I see. So you all are collecting full waveform, and you can obviously get the discrete returns from that. And then, I guess, the science team georeferences the data and does some QA/QC. And then how is the lidar data itself used for science?
John Adler (22:12)
Right. So from the lidar, there are a couple of interesting products. You’re going to get, traditionally, the XYZ points that come out. And then our third sensor, the camera, we haven’t talked that much about the digital camera, right? With it you can colorize each one of these lidar points, right? Colorized point clouds. Also, you’re going to get slope and aspect and other things from the lidar data. And that will feed into the hyperspectral system, which helps in the calculation of the BRDF, because, you know, is the side of the hill facing you, facing away from you, et cetera.
The other important thing about the lidar data is we put out both a DTM, the digital terrain model, which is basically your last returns back to the lidar, right? It’s giving you the ground. And we also put out the DSM, which is kind of your first returns, the tops of the trees. And then we subtract those two and you get what’s called the canopy height model, the CHM. In addition, that one meter, basically we try to get a one meter resolution, lidar DEM then gets pushed into the hyperspectral cubes in order to georectify them, and then further downstream for things like the BRDF. Does that make sense? So we use our lidar data specifically to orthorectify our hyperspectral stuff.
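That subtraction is as simple as it sounds. A minimal sketch, with tiny hand-made grids standing in for the real 1 m rasters:

```python
import numpy as np

# Canopy height model from the two lidar surface products:
# CHM = DSM (first returns, canopy top) - DTM (last returns, bare earth).
dsm = np.array([[105.2, 118.7, 121.0],
                [104.9, 117.3, 103.8],
                [104.5, 104.6, 104.1]])   # surface elevations (m)
dtm = np.array([[104.8, 104.9, 105.0],
                [104.6, 104.7, 103.8],
                [104.4, 104.5, 104.0]])   # ground elevations (m)

chm = dsm - dtm
chm[chm < 0] = 0.0   # clamp small negative artifacts from interpolation
print(chm)           # ~0 m over bare ground, ~13-16 m over trees
```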
Austin Madson(23:50)
I see.
And I’m pretty sure that all of the data is freely available, so people can go in and download it and use it for their science applications, right?
John Adler (24:05)
That’s right. NeonScience.org is our website. And the other nice thing, we didn’t really talk about how much data we get. A typical flight is about a terabyte. So like today, we just landed, and I’ve got two terabytes of data, right? Because we did two flights, right? And each one of those flights is nine hard drives when you add the hyperspectral, the lidar, and the camera stuff. So then we have to push nine into one hard drive, which we RAID, and then send the mirrored copy via FedEx back to headquarters to put into the cloud. It would take forever to try to upload that on the road.
Austin Madson(24:43)
Right.
Well, yes, and this brings me, because I wanted to touch on this too. You’re collecting lidar and RGB and hyperspectral, and not just the lidar but the full waveform. You just said you all collected a couple of terabytes today. So you all are shipping the data back. And then what about data backups and redundancy and things? How does your team deal with that?
John Adler (25:09)
So when we go from the nine drives to the one, we mirror that. So we always keep one drive with the flight crew, and then we ship the other one out. Once it’s ingested successfully and we know that it’s in the cloud, which itself has its own backup systems, then we can go ahead and reuse that drive as a RAID drive for the next set of data. So we always have consistency there.
Austin Madson(25:36)
So y’all have three sets of instrumentation. How do you all determine which of those sets your team is going to use for a given field site?
John Adler (25:47)
Typically, we go with two planes per year, but we can do three. But again, it comes back down to where the sampling is going to be done. Let’s say you had a plane that was operating in Oregon. It wouldn’t make any sense to move that one over to Florida and then bring it back to Arizona, right? So you also have these logistics where you want to reduce transit times, and that plays into it. So it’s not so much which lidar system goes where; it’s more, overall, what’s logistically the best solution.
Austin Madson(26:21)
What site do you all have in Oregon or Washington state?
John Adler (26:25)
Oregon, we call it D16. Out of Hillsboro, in the Portland area.
Austin Madson(26:27)
Okay, so.
Gotcha. So let’s say for D16, historically the last three times y’all have collected Optech data there. Would you take that into account as you’re trying to determine your next data collect?
John Adler (26:43)
I won’t say that we don’t, but I’m not in that decision process, because there are other things too. Maybe there was something that needed to be done on one of the hyperspectrals, which brought down that particular payload. If you think about it, our output should be fairly agnostic when you put out your DTMs, your different products. So I don’t really think that’s taken that much into account.
Austin Madson(26:58)
Right, I see.
So really the main thing that the lidar data is used for is to derive the elevation model to kind of rectify the hyperspectral data.
John Adler (27:20)
Well, and also in its own right, because you can see heights of vegetation, growth patterns. So the science teams out there can see what is happening to the vegetation. So in its own right, the lidar data is used quite frequently.
Austin Madson(27:35)
Yeah, I have some colleagues who over the years have used a lot of y’all’s data. So thank you for collecting great data.
John Adler (27:41)
Yeah, and you know, there’s the breadth of the trees, and there’s just a lot of uses for the lidar data. It’s just that, on the side, which people don’t think about, it’s also used on the hyperspectral side.
Austin Madson(27:53)
Well, I know you mentioned, Dr. Adler, that you all aim for one meter spatial resolution for the elevation models. What kind of density ranges do you all collect the lidar data in?
John Adler (28:06)
So originally we had these Optech Gemini systems, and it was just approaching four points per meter squared. Now, as we move into this newer system from RIEGL, we can get 16 points per meter squared, or maybe even a little more. It kind of depends. For us, we also have this thing called the ENOHD, the enhanced...
I’ve got to look that up, I can’t remember exactly what it stands for, but basically it’s eye-safety issues. And so we have an advanced safety model, because we want to make sure that if someone is standing on a mountain with binoculars looking at the lidar, they’re not going to hurt their eyes, right? And so we can’t run the current lidar systems at their optimized design settings,
because we have to take the safety concern into account. Whereas if you’re a regular surveyor, you might fly a lot higher and, you know, get higher point densities. So that’s why we’re down around somewhere eight to 16, something like that.
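For a sense of how those densities arise, point density for a linear scanner is roughly the pulse rate divided by swath width times ground speed. A sketch with assumed round numbers (the pulse rate and scan FOV here are illustrative, not NEON’s actual instrument settings):

```python
import math

# Rough point density for a linear scanner:
# density ~= pulse_rate / (swath_width * ground_speed).
agl_m = 914.0                                  # ~3000 ft above ground
fov_deg = 38.0                                 # assumed full scan angle
swath_m = 2.0 * agl_m * math.tan(math.radians(fov_deg / 2.0))
ground_speed = 50.0                            # m/s, ~97 knots
pulse_rate = 500_000.0                         # assumed effective pulses/s

density = pulse_rate / (swath_m * ground_speed)
print(f"Swath ~{swath_m:.0f} m -> ~{density:.1f} points per square meter")
```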
Austin Madson(29:15)
Right, yeah. And that’s mostly because y’all are flying around a thousand meters, right? Back to this data discussion: y’all are collecting quite a bit of data, and you’re shipping the drives back via FedEx. So y’all are having to leave the FBO and go to FedEx on the way to the hotel, and all this logistical stuff. Do you see any changes on the horizon for how the data may be handled?
John Adler (29:19)
Exactly.
Well, yeah, actually, I remember in the old days, you could get these bonded satellite links through Inmarsat, I think it was. It took four modem channels and put them together, and you’d get higher data rates. One could envision that if you were able to bond multiple channels, like, let’s say, Starlink or a similar satellite system, you could offload directly from the plane
and then bounce it back down to the data center. So something like that would be really great. And I don’t think it’s that far from reality, but we’d have to do some analysis on that.
Austin Madson(30:20)
Yeah, so I guess you all would have to spend quite a number of weeks, you know, testing and implementing that and making sure that there are redundant systems in place and all that stuff.
John Adler (30:31)
In reality, it’s just like any kind of computerized system. You would still collect traditionally, and maybe for the whole year you’d simultaneously rebroadcast via satellite and make sure it goes well.
Austin Madson(30:48)
What do you see for the future of remotely sensed data collection for NEON? I know you all have made some changes over the years, switching to different lidar systems and things like that. So can you talk a little bit about the future of remotely sensed data acquisition for NEON?
John Adler (31:06)
Well, I think one of the great things that could happen is UAS. It seems like our project is perfect for this, because our flight lines don’t change. We want to fly over the same lines every year in order to detect these changes in the environment, right? So if you were able to put together an operational plan that the FAA would sign off on, it wouldn’t change every year. It’s not dynamic, right? And so, you know, something like that, I think, would be really exciting.
It would bring advantages, because perhaps you could augment some of the flights that we do now. If we miss a site, let’s say, maybe the UAS system could pick up that site.
Austin Madson(31:48)
I wonder how long it would take to implement some of those workflows and how it would affect your personnel and all of that. Because, right, there are a handful of sensor operators like yourself, and, okay, well, what happens if you all switch to UAS? Are you out of a job?
John Adler (32:05)
Well, uncrewed aircraft systems, you know, doesn’t mean there are no people; it’s not that they go away entirely. If anything, it might take more people. So, you know, what’s the value there? You’d be able to collect more data. And if you were able to do that, then maybe it’s worth it, right?
Austin Madson(32:13)
Right.
Totally, and, you know, just kind of switch around personnel, yeah.
John Adler (32:29)
You asked me what my prior history was, right? I was a navigator. So literally, I was in the Navy with a sextant, and then they invented this thing called GPS, right? So you’ve got to pay attention to changes in technology.
Austin Madson(32:44)
Well yeah, we’ll look forward to seeing how and if some of these changes get implemented over the next couple of years. Well, Dr. Adler, are there any exciting projects or other changes going on at NEON, or in general in the lidar space, that you want to talk about?
John Adler (33:00)
Well, one thing I do want to mention is Google Earth Engine. I’m not sure if you’re familiar with it. For those that aren’t, it’s not quite Google Earth; it’s like a codable Google Earth. And Google Earth Engine is now taking in our data. When you think of satellite data, they have VIIRS and they have Landsat and all this other satellite data in there, but that’s typically eight to 25 bands.
And we come with 400-plus bands, right? I think this will be great for the user community on the hyperspectral side, because now you’re not having to necessarily download all this data; you can manipulate it in Google Earth Engine. So it’s a super exciting concept, getting away from the whole download-the-data, process-it-with-special-software kind of thing. So that’s something to look forward to.
Austin Madson(33:56)
Do you all store the lidar data in there as well?
John Adler (34:00)
It is being put in there. Now, I’m not sure to what extent yet, because the actual stuff we just got in, not everything, but we’ve got a lot in there now. And it’s set up initially for that type of data, for tiled satellite data. But Google Earth Engine is pulling in lidar data; I’m just not sure how far they’ve gotten, and I’m not sure if it’s just point clouds or if it’s also waveform. But it’ll be something to look forward to in the future also.
Austin Madson(34:30)
And you also mentioned that people can download the full waveform data and point clouds and elevation models and things from your website. And I think some of that data is hosted on OpenTopography too. Is that right? I can’t remember.
John Adler (34:43)
Yeah, we did talk with them a little bit. I’m not sure our full set is there; I don’t think so, parts of it. But yes, neonscience.org. You should definitely go in there; it’s really nice. You can pick out data geographically, like, I want stuff in this area. There are various ways that you can pick out the data you want.
Austin Madson(35:04)
Well before we wrap up, what’s your favorite field site of these 40 or so field sites that you all go to and why?
John Adler (35:13)
Well, you know, that’s a hard one. I don’t really have a favorite one, but I do like the one in New Hampshire. We call it D1, Domain 1. You’ve got varied topography, you’re flying right up by Mount Washington, you’ve got all the trees there. Yeah, so it’s a really nice site out there, especially if you’re going there towards the end of the summer, when the leaves are also starting to turn.
Austin Madson(35:28)
Beautiful.
John Adler (35:42)
But we’re flying up at higher elevations. But, I mean, you’re in the area, so yeah.
Austin Madson(35:47)
Well, I hope you get to site D1 this year, if not next year.
Well, that’s all we have for this episode. I want to extend thanks to Dr. Adler for chatting with us today and giving us some really cool insight into NEON and all of their sensors and how they operate those sensors. So thanks again, Dr. Adler, for joining us today. And thanks to everyone for tuning in to the discussion. As always, I hope you’re able to learn something new. If you haven’t already, make sure to subscribe to receive episodes automatically via our website or Spotify or Apple Podcasts or whatever flavor you like.
John Adler (36:11)
Thanks!
Austin Madson(36:25)
Stay tuned for other exciting podcast episodes in the coming weeks and take care out there.
If you want to ask about our podcasts or make comments, don’t hesitate to contact us. Thank you for listening. Good day.
Announcer: Thanks for tuning in. Be sure to visit lidarmag.com to arrange automated notification of new podcast episodes, subscribe to newsletters or the print publication, and more. If you have a suggestion for a future episode, drop us a line. Thanks again for listening. This edition of the LIDAR Magazine podcast is brought to you by rapidlasso. Our flagship product, the LAStools software suite, is a collection of highly efficient, multi-core command line tools to classify, tile, convert, filter, raster, triangulate, contour, clip and polygonize lidar data. Visit rapidlasso.de for details.
{Music}
THE END