#19 – Ralph Dubayah

Dr. Ralph Dubayah is a Distinguished Professor of Geographical Sciences at the University of Maryland, College Park. He is the Principal Investigator for NASA’s Global Ecosystem Dynamics Investigation (GEDI), a full-waveform lidar onboard the International Space Station (ISS). In this episode we hear from Dr. Dubayah about the ins and outs of GEDI as well as an upcoming spaceborne lidar mission that’s under preparation. We wrap up with a discussion of Dr. Dubayah’s dream for airborne laser scanning (ALS) from space!

Episode Transcript

#19 – Ralph Dubayah

May 14th, 2025

{Music}

Announcer: Welcome to the LIDAR Magazine Podcast, bringing measurement, positioning and imaging technologies to light. This event was made possible thanks to the generous support of rapidlasso, producer of the LAStools software suite.

Austin Madson: Hey, everyone, and welcome to the LIDAR Magazine podcast series. My name is Austin Madson and I’m Associate Editor at LIDAR Magazine. Thanks for tuning in as we continue our journey exploring the many different applications of lidar remote sensing.

Today we are really happy to have the opportunity to chat with Dr. Ralph Dubayah, the Principal Investigator, or PI, of the Global Ecosystem Dynamics Investigation lidar, or GEDI.

Dr. Dubayah is a Distinguished Professor of Geographical Sciences at the University of Maryland at College Park. He received his bachelor’s from Cal-Berkeley and both his master’s and PhD degrees from UC Santa Barbara. And I really don’t blame him for staying all those years in Santa Barbara during graduate school; it’s a beautiful place.

His research focuses on ecosystem characterization for carbon modeling, habitat and biodiversity studies, land surface energy and water balance modeling, spatial analysis and, of course, remote sensing science.

Dr. Dubayah has played a key role in numerous NASA remote sensing projects over the years. In particular, he was the principal investigator for the Vegetation Canopy Lidar, or VCL, a NASA mission to measure the three-dimensional structure of the Earth’s forests for carbon assessments.

And more recently Dr. Dubayah serves on the science team for NASA’s upcoming NISAR mission and is the PI for NASA’s Global Ecosystem Dynamics Investigation, or GEDI, mission. GEDI is led by the University of Maryland in collaboration with NASA’s Goddard Space Flight Center just up the road, and is a multibeam lidar instrument onboard the International Space Station.

So, I could go on and on here about Dr. Dubayah’s accolades, but why don’t we dive in and talk a little bit about spaceborne lidar. So, thanks for joining us today, Dr. Dubayah.

Dr. Ralph Dubayah: Thank you very much for having me, Austin.

Austin Madson: Of course, yes. I’m excited to chat. So, why don’t we get started and have you talk a little bit about your background and how you came to be involved in the remote sensing and lidar communities.

I think everyone’s story is really fascinating and I love hearing these stories about everyone’s trajectory on where they are now and how they got here.

Dr. Ralph Dubayah: Sure. As a kid I always liked maps, and I really liked astronomy; I wanted to be an astronomer. I went to UC Berkeley as an undergrad and actually started out in physics and astronomy, and then after about three years switched over to a geography major, and that’s where I became exposed to remote sensing.

As I looked at that and all the work NASA had been doing, I said, “Oh, this is actually a pretty cool way to combine my interests,” since I had a very strong interest in environmental science and in using NASA types of data.

And so, after I graduated from Berkeley, I worked for a year at a place in Berkeley, I think it was called the John Muir Institute which did timber harvest management planning and I was doing the usual grunt work that you do as an undergrad at that job, right?

Austin Madson: {Laughter} Right.

Dr. Ralph Dubayah: Digitizing maps and things like that. But then I had become aware of Santa Barbara being this nexus of remote sensing, so I applied to grad school and went there, and at that time what I was focused on was spatial analysis in terms of topographic effects.

And this is a little bit interesting: one of the things I had always loved was being in the mountains and seeing how the sunlight played on the shaded side, the shadowed side, and there was something really compelling to me about that.

So, for my master’s, I worked with Jeff Dozier—who recently passed away—and I worked on developing a really fast algorithm to take Landsat data and drape it over digital elevation models.

In fact, that came out of a project I did in cartography at Berkeley, where my partner at the time and I took a map of Mount St. Helens and just put a grid of dots over a contour map. So we had this array of numbers, and I took that array and put it into a mainframe computer program that did three-dimensional viewing.

And so we were able to make a three-dimensional map of Mount St. Helens, and we’re talking about like 1979. So, that was pretty cool.

So, then when I got to grad school, I said, “This would be pretty cool. Why don’t we actually drape something over the elevation data now.” So, I did that and that actually got published. So, I’m not sure it’s still around…

{Crosstalk}

Dr. Ralph Dubayah: …and remote sensing. Maybe it’s still around, I don’t know.

So, then after that, I actually switched to working on neural networks. Believe it or not, we’re talking now about the mid-’80s, and the issue was that there was nobody else to work with on this.

And I remember going to UC San Diego to one of the premier psychologists working in neural networks, and he gave me a reel-to-reel tape. I took that reel-to-reel tape back with me to Santa Barbara with their programs on it and was messing with it. And after a couple of years I said, “You know, my heart’s not really in this right now, so let me go back and do natural science.”

So, then I returned to work with Jeff Dozier on topographic solar radiation modeling over the Konza Prairie and other places like that, using sophisticated DEMs.

And, again, at the time there weren’t a lot of people doing that kind of work, and I did a lot in terms of using remote sensing to look at energy balance, in addition to how topography tends to modulate the incoming solar radiation signal.

So, when I came to Maryland, I started working in this area of topographic effects mainly on surface energy and water balance and I was very interested in how to get the energy underneath the canopy, like the canopy is interfering with things here. I was used to using DEMs that had no vegetation on them.

At that time I came across J. Bryan Blair at NASA Goddard; they had put an instrument up on the space shuttle called the Shuttle Laser Altimeter, and he showed me some of this data and I thought, “Wow, this is very cool. They’re seeing through the trees with this data.”

So this is probably in the early-to-mid-’90s, and he said, “We think we can make one of these lidar instruments and put it into space as a free-flying instrument.” And I said, “Oh, that’s pretty cool.”

So, we talked about it and then I became the principal investigator of this mission called the Vegetation Canopy Lidar Mission. And it was a great mission; it never launched. The reason it never launched is they couldn’t get the lasers to work over at NASA Goddard; they kept kind of burning themselves up.

And so, we eventually got canceled, and that put us back almost 20 years, really, because VCL had been accepted as the first NASA Earth System Science Pathfinder mission, with its own spacecraft bus, and it never launched. It was almost 20 years before we finally got something up to do vegetation lidar from space, with GEDI.

So, that didn’t work, and there’s nothing worse than having a NASA mission canceled when you’re the PI, because the buck stops with you. {Laughter} So, that was a pretty big failure and I thought, “Well, my career is ruined.”

Austin Madson: Well, that certainly hasn’t been the case.

Dr. Ralph Dubayah: Well, thank you for that. So, in the time after that, NASA Goddard put effort into developing lasers, and I put effort into working with airborne lidar, in particular NASA’s LVIS system, the Land, Vegetation, and Ice Sensor, which J. Bryan Blair was the PI on.

And so we did a lot of flying, a lot of mapping. We also did some work with ALS, and the idea was to apply the data in ways the community wasn’t aware you could. This was especially true for carbon mapping, for biomass mapping, as well as for mapping habitat.

The advantage of LVIS is it has a wide swath—about a kilometer wide—and it flies on a plane that’s at high altitude so it can cover a lot of ground really fast.

So, we were able to map large areas super-fast, which was something that ALS could not do at that time (and actually still really can’t match the coverage of something like LVIS), but the tradeoff was resolution. We were talking about 10- to 20-meter footprints versus point clouds.

Yes, so we did that through the first decade of the 2000s, and then we also worked on a mission called DESDynI, to combine a radar with a lidar.

Austin Madson: Oh, I remember that, actually.

Dr. Ralph Dubayah: Yes, that was going forward and in 2010 NASA just decided they were going to cancel it because it was too expensive. So, that was pretty depressing.

So, now I was the PI of a mission that didn’t work and got canceled {laughter}, and then I put probably seven or eight years into DESDynI, and that got canceled. That was another big discouragement.

And then NASA had a new program under Earth System Science Pathfinder where PIs could propose their own missions; now there was an instrument component where you could actually propose an instrument and they would give you the ride up there.

Dr. Ralph Dubayah: So, we applied for that, the first GEDI (we’ll call that GEDI 1), in 2013. That was a great mission; it had two telescopes. It was one of three missions in the running to go forward, but we weren’t selected.

So, now I have to tell you, Austin, to put one of these proposals together is about a two-year effort and it’s like you have to notify your next of kin that you’re doing this {laughter} because it takes all your time and all your effort.

So, now I’ve struck out three times, and I’m thinking I’m ready to quit on this. And then I said, “Okay, I’ll do it one more time,” because they had another call for this thing (it’s every two years), and that was GEDI 2, and that got selected in 2015.

And I still remember very clearly when I got the call that it had been selected. I got a call from the NASA person, and I remember I was in my backyard and I said, “Hello?” and they said, “Oh, this is NASA.” And he said, “Ralph, why would you possibly want to do this again?”

And I screamed at the top of my lungs, like, “Woo hoo, we got selected.” So, that was super exciting and then the rest is where we’re at.

So, there was this arc that went really from undergrad and remote sensing, getting involved with it, starting with my love of NASA and maps and things like that.

And then, persevering, I guess, is the best way to put it, until we finally got into space what we wanted, which was a lidar capable of taking ecosystem measurements, and that’s where we are with GEDI.

So, sorry that was a long story, but that’s where it came from.

Austin Madson: No, it’s just great. Yes, it’s fantastic and I hope you’re riding the high of the last 10 years or so. {Laughter}

Dr. Ralph Dubayah: Yes, it’s been fun. Definitely been a lot of work but it’s been very fun.

Austin Madson: Let’s talk, then, about GEDI. I know governmental agencies, and NASA in particular, love acronyms. I am a big Star Wars fan, and in particular Return of the Jedi was and is my favorite of all the films, especially of the original three. So, who on the team was the big Star Wars fan here?

Dr. Ralph Dubayah: Well, I was, {laughter} and I had come up with the name. In fact, the name came out of a retreat that the Department of Geographical Sciences at Maryland had with our scientist colleagues at NASA Goddard.

And this is before we even had a space mission. We were thinking about what could we do together and I said, “Well, what if we get all the structural data that we can and all the lidar data we can and put it together in a big database?”

And I said, “Yes, we could call it something like the Global Ecosystem Dynamics Investigation,” or something very similar to that. And then when we started putting together this mission with Goddard, I said, “Oh, this is a great name. It’s lasers in space, GEDI, what’s not to love? So, let’s go with GEDI.” And we just pushed to the side the fact that we’re not “truly global” because we don’t go above 51 degrees north or south.

But that was a detail I was willing to completely ignore to get what I think is a super-cool name for a mission that actually makes sense because it’s not just that the name is GEDI but we’re using lasers, so that to me was very cool.

Austin Madson: Well, so, it’s a good segue. You were talking about your orbital inclination. So, GEDI itself is mounted on the International Space Station as a full waveform lidar. Can you talk a little bit about the instrument to better orient our listeners: things like footprint, spacing, beam pairs, inclination, coverage, all the good stuff?

Dr. Ralph Dubayah: Yes, definitely. So, technically it’s a geodetic altimeter, because it’s precise enough to give you geodetic-grade measurements, and it consists of three lasers and a 0.8-meter beryllium telescope, all self-contained in an instrument about the size of a very, very large refrigerator that’s on the International Space Station.

In particular, it’s on the Japanese Experiment Module (JEM) Exposed Facility, which is just hanging out there in space, so it’s called the JEM-EF. These are plug-and-play instruments, so you get launched.

We were launched by SpaceX and the robotic arm grabs you and plugs you straight into the JEM-EF. All those interfaces are standard and once you do that, you get power, you get coolant, you get all of that.

So, we have three lasers. Two of the lasers are full power, and by the time the laser beam spot hits the ground it’s 25 meters across; this is in the near-IR region. And then one of the lasers has its beam split into two beams, so those have essentially half the strength, half the power.

And so, at any one instant you have four laser spots hitting the surface of the Earth, and then we electro-optically dither these four beams so that every other shot, the four beams appear somewhere else across track. So the net result is that you get eight tracks of data.

And along track you have a 25-meter spot, then you skip a spot, then another 25-meter spot. So it’s not continuous along track; it’s every other shot along track because you’re dithering. And across track the tracks are about 500 meters apart.

So, you have these eight beams of data separated by about 500 meters across track, so you have about a four-kilometer range that you’re looking at there, each of them collecting data. The lasers fire at about 242 pulses per second.
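[Editor’s note: for readers who want to check the sampling geometry described here, below is a minimal back-of-the-envelope sketch in Python. The 242 Hz pulse rate, 25-meter footprints, eight tracks and ~500-meter track spacing are from the episode; the ISS ground-track speed of roughly 7.2 km/s is an assumption for illustration, not a mission specification.]

```python
# Back-of-the-envelope GEDI sampling geometry from the numbers quoted above.

PULSE_RATE_HZ = 242        # pulses per second per laser (from the episode)
GROUND_SPEED_M_S = 7200.0  # assumed ISS ground-track speed, ~7.2 km/s
NUM_TRACKS = 8             # four beams, dithered into eight ground tracks
TRACK_SPACING_M = 500.0    # across-track spacing quoted in the episode

# Distance the ground track advances between consecutive laser shots:
shot_interval_m = GROUND_SPEED_M_S / PULSE_RATE_HZ     # ~30 m

# Dithering sends every other shot to the alternate track, so each ground
# track sees footprint centers two shot intervals apart ("skip a spot"):
along_track_spacing_m = 2 * shot_interval_m            # ~60 m

# Total across-track extent spanned by the eight tracks:
swath_width_km = (NUM_TRACKS - 1) * TRACK_SPACING_M / 1000.0

print(f"shot interval:       {shot_interval_m:.0f} m")
print(f"along-track spacing: {along_track_spacing_m:.0f} m")
print(f"swath width:         {swath_width_km:.1f} km")  # ~3.5-4 km
```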

We can control only slightly where the shots are going. We can point off nadir about plus or minus five degrees. Beyond that we have no control over where the tracks are laid down. That is completely determined by the orbital parameters on the International Space Station.

So, it’s almost as if you’re randomly placing these tracks down depending on what the orbit of the International Space Station is doing, but through time you start to fill in across track.

During the first two years of its operation, we, I believe, sampled only about 4% of the Earth’s land surface. What also happens is that the ISS sometimes wants to raise its altitude so that it can hit particular landing or launching sites, especially for Roscosmos, the Russian space agency.

And if they’re too high, we get into what’s called an orbital resonance and if you’ve ever looked at global GEDI data you see a diagonal pattern and that’s because when we’re at a high altitude we tend to go over the same locations during that orbital period.

And so, you get this diagonal pattern. We negotiated all the way up, even including Roscosmos, to try to get the ISS to lower its altitude more often, so that we had a random precession rather than this orbital resonance, which doesn’t have much precession in it. The resonance provides a measure of more certainty for launches, because you get closer to the launch sites more often, and so we’ve worked with them to do that.

So, again, because it’s lidar, we don’t see through clouds or anything like that. But that’s the basic idea behind the instrument. The return waveform coming back from the ground is digitized onboard, and then the full waveform gets transmitted back to the ground.

The telescope is beryllium, but we had an issue with our telescope contractor as we were building it. They said, “You know, we need a lot more time and we need a lot more money to build your telescope.” And these are cost-capped missions, so that could have been a fatal occurrence for us.

But ICESat-2, which is a photon-counting altimeter, had a spare telescope, and so we negotiated with ICESat-2 to use their telescope as ours, with the agreement that if they broke their mirror or something bad happened, they would get their telescope back.

Fortunately, that didn’t happen, but we did lose a little bit of efficiency because the photon-counting lasers for ICESat-2 are in the visible wavelength and that’s so that they’re more sensitive to ice.

And for vegetation we wanted it to be in the near IR, so our telescope was going to be coated so that it was responsive to the near IR, and this telescope was not coated to be particularly sensitive to the near IR.

So, that’s how we got our telescope. We have three star trackers onboard, we have our own GPS as well and we do a lot of orbital calculations to try to get the best geolocation that we can.

And currently we’re at about a 1-sigma value of about 10 meters, so 66% of our data are within that, but the mean is smaller than that.
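[Editor’s note: a quick simulation of what “a 1-sigma value of about 10 meters, with about two-thirds of the data within that and a smaller mean” looks like. The simple Gaussian error model here is an illustrative assumption, not GEDI’s actual error budget.]

```python
# Illustrative geolocation-error statistics for a 1-sigma value of 10 m.
import random

SIGMA_M = 10.0
N = 100_000

# Draw error magnitudes from a zero-mean Gaussian (assumed model):
errors = [abs(random.gauss(0.0, SIGMA_M)) for _ in range(N)]

frac_within = sum(e <= SIGMA_M for e in errors) / N
mean_error = sum(errors) / N

print(f"fraction within {SIGMA_M:.0f} m: {frac_within:.2f}")  # ~0.68
print(f"mean error: {mean_error:.1f} m")                      # ~8 m, below sigma
```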

Austin Madson: Yes. And you all are like 400 kilometers in altitude?

Dr. Ralph Dubayah: Yes, between the low 400s and about 420 kilometers, and once we get above 416 or 417 is when we start getting into orbital resonance. That’s what the ISS does: it will boost up to 418 or 419 and then let itself slowly drift back down, then boost up and slowly drift back, and we’ve tried to negotiate with them to not boost up so often and to keep us at the lower altitudes.

Austin Madson: And so one of the things I always find fascinating (and maybe a lot of our listeners are drone- and airborne-lidar folks) is that we get orientation from FOG or MEMS-based IMUs and inertial navigation systems.

But for GEDI and lots of other spaceborne platforms, you all are using star trackers for your orientation, which is quite a bit different. I’ve always found that really fascinating.

Dr. Ralph Dubayah: Yes, you have to know where your optical bench is on the ISS and you have to know which direction your optical bench is pointing. That’s how you orient it so that you can see where the optical bench is relative to the center of mass of the ISS and then the ISS has its own GPS; we have our own GPS as well.

We don’t see the whole sky; we’re blocked by the truss, we’re blocked by a lot of things. We’re lucky if we see four or five satellites at once, and our star trackers get blocked as well. So, it becomes a really super-complicated precision orbit determination problem, and we have maybe the best precision orbit determination person in the US…

Austin Madson: It sounds like it.

Dr. Ralph Dubayah: …named Scott Luthcke at NASA Goddard, who also does what we call the POD, precision orbit determination, for ICESat-2.

So, when we first started this project, we thought, “Okay, this will be a kind of skunkworks project. It’s not going to have all the bells and whistles that we wanted because we have a constrained cost and let’s do the best that we can.” And it turned out we’ve done really an incredible job of making it work and so that’s been pretty gratifying.

Austin Madson: And so the mission length for GEDI, I think, was recently extended, is that right?

Dr. Ralph Dubayah: Yes, so…

Austin Madson: Or was it always up to 2030 or so?

Dr. Ralph Dubayah: Oh, no. Oh, no. Originally we were only a one-year mission and then when we got on orbit, the ISS could not guarantee us that they could give us enough power all the time. And so, we went from a one-year to a two-year mission. It turned out we could get all the power we needed, so we became a two-year mission.

That was going to be our end of life, and then we got a mission extension, I think a three-year extension, and then we got a second extension that goes through October of 2026.

Now, in between there, however, we had to be removed from our working spot on the JEM-EF and moved to a nonworking storage location, where the only thing you had was a small amount of power to drive survival heaters to keep the lasers from getting too cold.

So, we were put into hibernation in March of 2023 and this was so a Department of Defense mission could do their mission and we were supposed to be off until October of 2024 but the DOD mission finished early and we got plugged back in and were working again by April of 2024.

And our current plan is that we would go through October of 2026. Sometime in 2026 we will have a NASA senior review that will decide if they want to keep funding us.

We can stay on; there’s nothing to take our place all the way through 2030. But it’s dependent on NASA having funding and wanting to keep our extension going, which is always based on NASA reviews and the like: Do they have money? Is it threatening other missions? We’re not very expensive to keep going, as missions go, so we’re hopeful, and our impact has been enormous.

And so, given all of that, and the fact that there is no other ecosystem lidar operating and will not be at least until 2030, we’re hopeful we can keep that going.

But it’s somewhat miraculous to me that we turned back on and our lasers weren’t damaged, because we were never meant to last that long in space, and certainly the lasers were never meant to last that long on survival heaters.

The survival heaters are solely for when you come out of the Dragon trunk and you’re sitting there in the cold of space on the end of the robotic arm; it could take up to seven hours to transfer you over, and that’s what the survival heaters were meant to be doing.

Austin Madson: For seven hours of heating.

Dr. Ralph Dubayah: For seven hours, and we said, “Wait a minute. We’re going to hope these things last for a year.” And we did some calculations and we said, “Well, it’s possible, but let’s see,” and it just is a testament to how well the instrument was made.

NASA has classes of instruments, and most of these cheaper instruments are what we call Class D. With GEDI we were Class C; we built in some redundancy in case something died. So we’re built to a slightly more stringent standard, but certainly the lasers were remarkable.

And we were only thinking that the lasers had to shoot maybe two billion shots, maybe one billion, to meet our requirements. And right now each of the lasers has been fired 20 billion times, {laughter} each of the lasers. So, we’re way out there, and we’ve hardly seen any degradation. So…
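[Editor’s note: the shot counts are easy to sanity-check with only the two numbers quoted in this episode: 242 pulses per second and roughly 20 billion shots per laser.]

```python
# Sanity check: how long does it take one laser to fire 20 billion shots
# at 242 pulses per second? (Both numbers are quoted in this episode.)

PULSE_RATE_HZ = 242
SHOTS_FIRED = 20e9

seconds = SHOTS_FIRED / PULSE_RATE_HZ
years = seconds / (3600 * 24 * 365.25)

print(f"{years:.1f} years of continuous firing per laser")  # ~2.6 years
```

That figure is consistent with several years on orbit once hibernation and duty cycling are accounted for.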

Austin Madson: That’s amazing.

Dr. Ralph Dubayah: Yes, so that’s pretty cool. So, yes, so where we are right now is we’re fully operational, everything is working great, we’ll have a review before October of 2026 to see if we can continue collecting data, so that’s where we currently are.

Austin Madson: Yes, it’s fantastic. Let’s take a quick break and hear from our sponsor at LAStools and then we’ll continue this great conversation with Dr. Dubayah.

{Music}

The LIDAR Magazine Podcast is brought to you by rapidlasso. Our LAStools software suite offers the fastest and most memory efficient solution for batch-scripted multi-core lidar processing. Watch as we turn billions of lidar points into useful products at blazing speeds with impossibly low memory requirements. For seamless processing of the largest datasets, we also offer our BLAST extension. Visit rapidlasso.de for details.

Austin Madson: So, continuing to talk a little bit about GEDI, Dr. Dubayah: I know you’ve been a part of maybe a couple of other missions that are in the works, or at least one other mission. Can you talk a little bit about those, or about missions that are in planning?

Dr. Ralph Dubayah: Sure. Well, the first one is going to be launched soon, which is the NISAR mission, the NASA-ISRO SAR. That is a SAR mission, and really it’s the SAR component of what the DESDynI mission was supposed to be. So, GEDI is basically the lidar that was supposed to go along with that SAR, and that mission should be launching, I think, in summer.

And I was originally one of the science co-leads of NISAR on the ecosystem side; there is also a lead for Solid Earth and a lead for Cryospheric Science. And then once GEDI became a reality, I went off the science team, and since then I have been doing some work with NISAR, continuing at some level with the science team.

I won’t be continuing on the mission science team, however, after it’s launched. There will be, though, for anybody listening who’s interested in these kinds of things, a competition for scientists to join the science and applications team. So that’s one mission.

A new mission is the EDGE mission, which is a free-flying lidar mission; it’s part of the NASA Earth Science Explorers portfolio, and these are competitive missions as well. This is the Earth Dynamics Geodetic Explorer, EDGE, and it is led by a fantastic cryospheric scientist, Helen Fricker out of UC San Diego. And the ecosystems piece is led by my colleague here, who’s also on GEDI, Professor John Armston, who’s the deputy PI of that mission and in charge of the vegetation piece, essentially its vegetation structure piece.

So, it’s currently in Phase A. NASA selected four missions in the last competition to go forward to Phase A, which is the formulation stage: Phase A and Phase B are formulation, basically. I believe sometime in the fall they will be selecting two of these four missions to go forward.

And EDGE does basically everything ICESat-2 did in terms of measuring ice sheets, and more; it also covers all the vegetation pieces. So, it really meets the needs of both of these disciplines. It’s a waveform lidar, but its huge innovation is that it’s a swath-mapping lidar: there will be essentially 15-meter footprints that are contiguous across a 120-meter swath, and so you’ll have these swaths going across the landscape.

And, essentially, we’ve never had that before. This is almost like LVIS from space; this is one step closer to doing continuous swath mapping from space. So, it’s pretty exciting. You get direct measures of change as these swaths go on top of each other, and it’s way more data than we ever had from GEDI. There are no gaps in it the way you had with GEDI.
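[Editor’s note: a rough footprint-density comparison using the numbers quoted in this episode for EDGE (contiguous 15-meter footprints across a 120-meter swath) and for GEDI (eight tracks of 25-meter footprints). The ~60-meter GEDI along-track spacing is the estimate from the sketch earlier in this transcript, not an official figure.]

```python
# Rough footprints-per-kilometer-of-track comparison, GEDI vs. EDGE,
# from the numbers quoted in this episode (illustrative only).

# GEDI: 8 ground tracks, footprint centers ~60 m apart along track.
gedi_per_km = 8 * (1000 / 60)            # ~133 footprints per km of track

# EDGE: contiguous 15 m footprints across a 120 m swath.
edge_across = 120 // 15                  # 8 footprints across the swath
edge_per_km = edge_across * (1000 / 15)  # ~533 footprints per km, gap-free

print(f"GEDI: ~{gedi_per_km:.0f} footprints per km of track")
print(f"EDGE: ~{edge_per_km:.0f} footprints per km of track")
```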

So it’s really going to be demonstrating, for the first time, the technology we developed with GEDI, this dithering technology that lets you do swath mapping, and the idea that eventually you’ll be able to do this with much larger swaths and have more of them.

So, it’s a step along the way. As I said before, my goal was to always have ALS from space, right?

Austin Madson: Right, yes.

Dr. Ralph Dubayah: And that would be fantastic. So, this is one step along that path: let’s get smaller footprints that have a lot more capability per footprint, much better geolocation because it’s a free flyer, and let’s actually get a swath bigger than a football field across that you’re mapping all the time.

So, that’s super exciting and so we’ll have our fingers crossed that that’s going to go.

Austin Madson: Well, so, let’s talk more about this dream of yours: ALS from space. I remember reading about some of your past work; you worked with Sigma Space and Leica on single-photon lidar systems a number of years ago, you’ve worked with GEDI and a little bit on ICESat-2, and there’s this EDGE mission that will hopefully launch in the next several years.

But I assume you’ve been following some of the changes that are going on in the private sector with single-photon lidar systems in orbit. How do you see this industry moving forward? You’re on the public-sector side of things, but there are other entities really at play. So, let’s talk about this dream.

Dr. Ralph Dubayah: You mentioned Sigma Space; that eventually became part of the Hexagon group of companies, which Leica Geosystems is also part of. They had developed, primarily I think for defense purposes, ways of doing photon counting where you had a lot of photons in the air at the same time, so that you could do essentially continuous mapping from a somewhat higher altitude than you otherwise could.

And they developed a prototype of this, and as far as I know we might have been the first organization to ever use this data, because I had them map an entire county in Maryland with it just to see how well it worked.

And it worked pretty well. It was still in the visible, I believe, at the time, because you couldn’t get the sensors, the detectors, to reset fast enough if you went to the near IR. So, as I recall it was still working in the visible.

I’m not sure whether the current Leica Geosystems one is or isn’t; I think that one also might still be in the visible. But that system evolved into what Leica Geosystems now flies, and as I recall they don’t actually sell the instrument, but they will fly it for you. This is just my vague recollection of this now.

I was very excited about that because there are a lot of applications where, from my perspective, you would really like to be able to delineate canopies. And so, I thought this would be great if we could do it in an efficient and economical way, and if we could get the sensors to work in the near IR. And then I did some work with ICESat-2.

But the dream for me is this: we’re doing all of this with ALS right now, and we all know what a pain that is. Even within the United States you’ve got different companies flying different ways, you have different point counts, you’ve got to try to get the data; all of those issues together.

And if you’re interested in doing long-term change, having stability in your platform is important, even for something as simple as “I want to be able to monitor canopy cover change through time,” even optically. If you work with counties or at the state level, you realize that when they’re trying to meet their mandates for how much cover is gained or lost, it is not such an easy task, because you can’t necessarily relate what you’ve mapped today to what somebody mapped 10 years ago, even when you add the lidar data to make sure that you’re actually mapping what you think you are.

So, to be able to do this globally would be great. Now, I would take 15-meter footprints globally in a heartbeat, but the place you want to get to is, I think, having this finer resolution. So, that’s why I was very intrigued by NUVIEW and I do believe you had the CEO talk on a podcast.

Austin Madson: Last year, yes.

Dr. Ralph Dubayah: Last year. And I actually invited him to a GEDI science team meeting and had him come talk to our science team—which is across all of these institutions—just because I thought that it was something we should all be aware of and to see where the industry is going.

And that was a great talk and I was happy to have him speak. So, I’m quite excited to see that (I think it’s called Mr. SPoC, the NUVIEW Mr. SPoC bus) and I don’t know where it’s at right now, but this demonstration of what can be done, I think that’s pretty exciting.

I know that Lockheed Martin, maybe with Lincoln Labs and maybe that’s what NUVIEW is using, I’m not up to speed on all of this right now, but they had also demonstrated some capability as well. So, I think that’s very exciting to see how that’s going to move forward. That’s probably the next frontier.

EDGE is waveform, and there are very good reasons why it should be waveform and why this will still be a transformational mission, because we’ve never had anything like that before. And it is providing continuity, and it’s also providing a way of seeing the landscape in places we’ve never been able to observe before.

What comes next after EDGE? Probably, within NASA, something from the Surface Topography and Vegetation concept. Nobody knows what that’s going to be: Is it going to be lidar? If lidar, is it photon counting or is it waveform? Is it going to be radar? Is it going to be optical? What’s happening with all of that?

And the optical side is something we haven’t talked about either but that as well has a very natural marriage, I would say, with lidar data, especially with ALS but also with waveform as you consider what companies such as Maxar and Planet can do.

And with Maxar launching their Legion satellites now, they’ve developed this Precision3D product, which takes a very deep stack of optical data and gets 3D information out of it. And we’ve actually worked with Planet, and we’ve worked with Maxar, looking at what the possibilities are with this kind of data.

And NASA’s clearly very interested in commercial data and going forward without a doubt that’s going to play a major role in how they meet their science requirements and how to better integrate this data into their long-term planning.

And this is a very tricky issue because, how do you develop the licensing agreements? {Laughter} It becomes very hard because the particular company would like to sell their data to NASA, and NASA would like to make the data public. You have to find a licensing agreement that makes sense on both sides. And – go ahead.

Austin Madson: And they recently changed that license agreement within the last year or two, I don’t know. So, our access now is totally different.

Dr. Ralph Dubayah: Yes. It is. This is something that is not clear yet about how this will evolve, but I’m fairly confident that it’s going to be towards facilitating both the purchase of the data and the use of the data.

So, it’s still to be determined and I’m on the National Academies’ Committee on Earth Science and Applications from Space, CESAS we call it, and this is a topic of conversation that’s very important going forward, especially as we start to determine what the next decadal survey is going to look at.

For NASA, the decadal survey is where the National Academies provide recommendations on the major science topics and on what kinds of instrumentation and observations we should be getting.

And it’s hard to predict what’s going to be around in the commercial sector for the next decade, 10 years out, but you can’t ignore it either. And it’s not just on the optical side: if you have a company like NUVIEW now providing lidar data from space, that also is going to have to play into this whole commercial data realm as far as NASA is concerned, I would guess.

Austin Madson: Yes. Well, so, let’s wrap up. We talked a little bit about GEDI and about EDGE, an upcoming mission, and then we’re thinking a little bit about what happens after EDGE: Is it waveform? Is it photon counting?

What do you want to see after EDGE, coming with this lens of ecosystem dynamics, I guess?

Dr. Ralph Dubayah: I think from the perspective of ecosystem dynamics, what you want to be able to resolve—I would say at the 10- to 20-meter length scales—is you want to be able to resolve the topography underneath the canopy, you also want to be able to resolve the canopy structure.

I think if you can get that at those length scales everywhere on the globe, you’ll be able to do two to three orders of magnitude more transformational science.

And you have to be able to do this on a repeat basis as well and I’d be happy to have it done once every five years, but if you can do this during the year, if you have a synoptic view of how things are changing, that becomes very important.

So, past EDGE, what I would like to see is this continuation of being able to do wall-to-wall mapping from space at a relatively fine resolution. I don’t think we necessarily need to have every tree individually mapped yet {laughter}, and you can kind of get the canopies from passive optical in some areas where we have open canopies.

So, I can’t predict whether it’s going to be photon counting or waveform lidar; we don’t really know which way it’s going to play out, and I’ve become very, very hesitant to make a requirement. Back at, I think it was, a ForestSAT meeting in Scotland (and this was probably in the late ’90s or early 2000s, I don’t remember when), I gave a keynote speech and I said, “I don’t really care about ALS data all that much.” {Laughter} It was like, “Who needs all that data? It kind of gets in the way, it’s complicated, and I could do almost as well with waveforms,” and I’ve grown to regret {laughter} that comment.

So, I’m not going to speculate about where it’s going to go, but I do know the direction we do need to go which is we have to be able to resolve the canopy, we have to be able to resolve the topography below the canopy, and we have to be able to do this at a relatively fine resolution, and we have to be able to repeat these observations.

And I’m hoping whatever technology we develop gives us that capability 10 years, let’s say, from now.

Austin Madson: Well, this has been a really refreshing chat, Dr. Dubayah. We appreciate you letting us pick your brain over the last 45 minutes, however long it’s been, and that’s all we have for this episode.

So, again, I really want to extend a heartfelt thanks to Dr. Dubayah for chatting with us today and thanks to everyone for tuning in. I hope you all were able to learn something new.

If you haven’t already, make sure to subscribe to receive episodes automatically via our website or Spotify or Apple Podcasts or whatever flavor you choose.

Stay tuned for other exciting podcast episodes in the coming weeks and months and take care out there. Thanks again, Dr. Dubayah, we really appreciate your time.

Dr. Ralph Dubayah: Thank you.

{Music}

Announcer: Thanks for tuning in. Be sure to visit lidarmag.com to arrange automated notification of new podcast episodes, subscribe to our newsletters or print publication and more. If you have a suggestion for a future episode, drop us a line. Thanks again for listening.

This edition of the LIDAR Magazine podcast is brought to you by rapidlasso. Our flagship product, the LAStools software suite, is a collection of highly efficient, multi-core command line tools to classify, tile, convert, filter, restore, triangulate, contour, clip and polygonize lidar data. Visit rapidlasso.de for details.

{Music}

THE END