00:00:04:17 - 00:00:38:10 Speaker 1 My name is Craig Glennie. I'm the technology co-lead for the project. I am a professor of civil and environmental engineering at the University of Houston, and I run an NSF facility called the National Center for Airborne Laser Mapping. Yesterday we spent a lot of time on the different science areas. Today, we're going to talk a little bit about some of the measurement needs, the technologies, the platforms, and the architecture. 00:00:39:14 - 00:01:15:17 Speaker 1 And so I will start us off by talking a little bit about some of the measurement needs. What I wanted to start with is this example of science-to-measurement traceability. And, you know, don't pay too much attention to the text in the table; what's really more important is the headings. We kind of go from the left, where we have a science goal, all the way to the right, where we have mission requirements. Above the table, 00:01:16:12 - 00:01:46:21 Speaker 1 we have these arrows, these areas defined, and that's really where that information originates from. So our science goal is given to us by NASA and informed by the Decadal Survey. We then take that science goal, and hopefully our science leads can drive that down into objectives: what observables we need, what physical parameters we need to observe. 00:01:47:18 - 00:02:11:24 Speaker 1 And then once we have those measurement requirements, that's really where we start today, where we start to talk about the technology that can actually meet those science measurement requirements. We get a little way into the technology, and then you also have to involve the system architecture, because you need to make sure that you can actually build and launch it. 00:02:12:04 - 00:02:31:17 Speaker 1 That's apparently an important step: when you put a satellite up, you have to think about that a little bit. And so our flow is kind of from left to right.
And you'll see it's kind of the way we set up the workshop: we spent yesterday on the science, and today we're going to do the technology and the architecture. 00:02:31:17 - 00:02:54:09 Speaker 1 So it's important to understand that that's the direction the information flows: the science informs what we need for technology, and then the technology hopefully informs the architecture about what we have to build to accomplish those science goals. 00:02:54:09 - 00:02:55:08 Speaker 2 Because we have some. 00:02:57:14 - 00:03:22:11 Speaker 1 Additional people. You know, we have had the benefit of a smaller group at a workshop in June. I wanted to go back and remind, or inform: we talk a lot about the 2017 Decadal Survey, but I thought it would be important to talk a little bit about what the guidance is. And it's important to note that it's just guidance. 00:03:22:24 - 00:03:51:18 Speaker 1 You know, these things can change. But within the Decadal Survey, you can see up here there are five bullet points of guidance, and I've highlighted the last four because I think they're important to frame our conversation. So: identify which measurement needs can be obtained through suborbital means and which require a space-based component. 00:03:53:03 - 00:04:20:12 Speaker 1 Identify those ready to compete in Venture-class opportunities. So right away they're saying, you know, maybe it's not a full satellite mission; maybe there's an airborne component. You have to make that call. They're already kind of saying maybe it's not going to be one platform that does this whole project. Identify any proposed components that could be ready for Earth System Explorer opportunities by the midterm assessment.
00:04:21:20 - 00:04:51:04 Speaker 1 You know, the midterm assessment is going on right now, and we maybe haven't quite got to the point where we were ready for an Earth System Explorer opportunity. But that was the guidance that was given. And then the third point kind of reinforces the first one: consider the appropriate split between global measurements from space and potentially less expensive, higher-resolution airborne measurements. 00:04:52:05 - 00:05:15:09 Speaker 1 So again, maybe we don't need to do everything from space; maybe there's an opportunity to do some things airborne as well. And then the last one I think we've talked a little bit about, but I think this is important. There are a lot of commercial companies out there doing radar imagery, and there are a couple of venture startups that are even looking at lidar. 00:05:15:24 - 00:05:40:07 Speaker 1 And we want to make sure that we're not duplicating the effort of those commercial companies, that we are doing something that is in the end unique, that will answer the science goals or inform processes on the Earth's surface. And so we need to be cognizant of the fact that we're not just building it for the sake of building it. 00:05:40:07 - 00:06:08:17 Speaker 1 We're building it because it doesn't exist elsewhere and is needed to inform a science question. A little bit more guidance from the Decadal Survey; this is on the bottom there, and you can see it's on page 155. They give the science applications and value, and they also talk a little bit about the observational approach. Again, just guidance, 00:06:08:17 - 00:06:38:16 Speaker 1 but I think it's important to frame the conversation. The part on top is quite long, but basically they say that if you could do surface topography at five-meter spatial resolution and decimeter vertical accuracy, it would provide a lot of new insights.
The interesting thing here is they actually don't talk about change detection at all in that; it just talks about a snapshot of the Earth as being important. 00:06:39:19 - 00:07:08:16 Speaker 1 You know, that's not to say we shouldn't do change detection, but it goes to say that even if we were only able to capture one high-resolution snapshot, the Decadal Survey committee thought that would answer a number of fundamental science questions and was probably important enough on its own. Maybe that defines a break between space and airborne observations; 00:07:08:16 - 00:07:39:15 Speaker 1 I don't know. But I think it's important to frame our discussions with that kind of guidance. And then on the bottom there, they talk a little bit about the observational approach. They go through the fact that although there are, you know, SRTM and TanDEM-X, they felt that much higher resolution was needed. And the first time that vegetation comes in is at the bottom: deriving vegetation height from radar and optical methods. 00:07:39:15 - 00:08:15:09 Speaker 1 And they talk about the troubles in getting canopy height from those observables. So they kind of said that the vegetation is important as well, but again, nothing about repeat observations of it; it's more focused on this kind of snapshot technology. And I guess the bottom there is probably why it was an incubation study, because in that last sentence they basically talk about the three different technologies and why none of them is quite ready to do what they think needs to be done. 00:08:16:05 - 00:08:49:03 Speaker 1 So anyway, I wanted to go back to that because, you know, I know probably not everyone went and read the 750-page Decadal Survey on Sunday or Monday morning before they showed up here. And I think that's important to frame
Our measurement needs a little bit. So if we look, then, at the incubation study: this is table 3.2 from the incubation study, 00:08:49:03 - 00:09:17:07 Speaker 1 and it tries to get at the measurement needs of the community. You can see, if you look at, say, horizontal resolution and vertical accuracy, they're kind of bracketed around that requirement that came out of the Decadal Survey of five meters and ten centimeters. They've added some additional vertical resolution. 00:09:17:19 - 00:09:58:14 Speaker 1 And then we get into the repeat frequency, and we have the median and the most stringent. So if you look at the most stringent there for aspirational, for example, 0.03 months, if my math is correct, is every day, which is, I think, very aspirational in my opinion. But if you look at maybe the threshold, somewhere around a six-day repeat, okay, maybe that could be a little bit more realistic. 00:09:59:01 - 00:10:26:19 Speaker 1 And we have some accuracy specifications. The accuracy is kind of new; the horizontal accuracy really wasn't talked about in the Decadal Survey, so that's, I think, an important one we have to frame. This kind of reminds me: before I was at the University of Houston, I ran two service companies and we would do remote sensing for people, a lot of lidar 00:10:26:19 - 00:10:47:14 Speaker 1 surveys. And with the product that we would give, we would always tell people: you can be cheap, you can be fast, or you can be accurate, but you can only pick two of those. And here I look at this and I think, okay, high accuracy, high resolution, and low latency; it kind of has that same feel. 00:10:48:03 - 00:11:14:23 Speaker 1 You know, you can probably only take two of those.
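The repeat-interval arithmetic mentioned in the talk can be sanity-checked with a short sketch; the mean month length and the 0.2-month example are assumptions, with only the 0.03-month figure taken from the table discussion:

```python
# Convert repeat-frequency requirements quoted in months into days.
DAYS_PER_MONTH = 30.44  # mean calendar month length (assumption)

def months_to_days(months):
    """Repeat interval in days for a requirement given in months."""
    return months * DAYS_PER_MONTH

# Most stringent aspirational requirement: 0.03 months is roughly daily.
print(round(months_to_days(0.03), 1))  # ~0.9 days
# A hypothetical 0.2-month requirement corresponds to about a six-day repeat.
print(round(months_to_days(0.2), 1))   # ~6.1 days
```

So a 0.03-month interval really does work out to roughly one day, as stated.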
And so, you know, I don't know if I'm maybe speaking out of turn there, but I think that choice of which ones are the most important maybe will refine our measurement needs a little bit and allow us to define what can be done from a satellite and what can be done from airborne to accomplish the STV mission. 00:11:15:10 - 00:11:49:14 Speaker 1 So, just my thoughts; they're not endorsed by anyone else in the room, but hopefully at least a couple of points for discussion. And, going back to this, I would actually like to give credit to Steve DeLong: yesterday in the Solid Earth session he brought up this point that one snapshot would be really good, and after listening to that, I went back to read, and that's kind of what the Decadal Survey says as well. 00:11:49:14 - 00:11:52:13 Speaker 1 So, you know, thanks for the kick in the butt, Steve. I appreciate that. 00:11:55:03 - 00:11:55:21 Speaker 3 State of mind. 00:12:00:03 - 00:12:15:03 Speaker 2 Just a quick question on this table: between the threshold and the aspirational, why did the disciplines change? Sometimes there's more added in the aspirational rows. Yeah. 00:12:16:14 - 00:12:38:02 Speaker 3 I'll answer that, because I was on the team that made that table. Every group came up with aspirational and threshold needs, and so that may have changed, right? If their threshold met what they needed, it might not end up in their aspirational bin. So they had different scales of things they were trying to measure. That's why. 00:12:40:00 - 00:12:48:21 Speaker 2 It seems like at least everything in the threshold ought to be in the aspirational, but that's not always true either. 00:12:50:03 - 00:13:12:18 Speaker 3 That wasn't how we treated it. We took averages, basically, of aspirational and threshold measurements.
And then we asked them to make those tables themselves, and then we looked within those bins that they gave us, if that makes sense as to how the process was. I don't know if it was good or bad. Paul has something to say. 00:13:12:18 - 00:13:39:02 Speaker 2 So Craig, I think that, you know, showing the Decadal Survey, what struck me was, for the one-off map at five meters, how limited it was. There is nothing about cryo there; it was really, you know, geomorphic: informing groundwater or water-flow hydrology. 00:13:39:15 - 00:14:09:06 Speaker 2 There was stuff about, what else, trying to think: sea level, in relation to sea-level rise, or maybe storm surge, stuff like that. But a lot of things were missing. And I think what we've seen in a lot of the discussions is that a lot of the really interesting stuff, both scientifically and of course for applications, revolves around change detection and processes relating to change. 00:14:09:06 - 00:14:28:22 Speaker 2 So I think that, whether or not the Decadal Survey put it in there, didn't take that leap and go that far, that's where the real cutting edge is, and as you go to 2035, that's only going to be more essential, I think, and make this into something. It's not just like SRTM, where, 00:14:28:22 - 00:14:51:04 Speaker 2 oh, now we've got a better DEM, but it's really an evolving data set that has a lot going for it in terms of continuing and detecting things that are changing in time. Because, you know, from the InSAR side, as somebody who does deformation for most of my work, it's all about change.
00:14:51:09 - 00:15:19:16 Speaker 2 But there are things where topography is fundamentally complementary to deformation: things that are incoherent in deformation, or too big, fall naturally within the topography realm. So, it's good to base it on the Decadal Survey, but I think, as we did in this study, we should be adding, not minimizing, the time component. 00:15:19:16 - 00:15:45:12 Speaker 1 I don't think I was doing that at all, Paul. I was merely stating that they said there was a lot of value in that, and that perhaps that was a division point between satellite and airborne for process. So yeah, I was not discounting change. I was just saying that it might be a natural line where we say, okay, we can do the snapshot probably from space or, you know, vice versa, 00:15:45:12 - 00:15:52:17 Speaker 1 and then we can use the airborne or suborbital to look at process and change. Yeah. 00:15:53:00 - 00:16:14:00 Speaker 3 Then if I can just add to that: the figure we keep seeing, that Ben had in his talk, of our constellation implies that we have orbiters plus targeted measurements from aircraft. The other thing is, if you look at our coverage maps, we all want the baseline global topography map so we can see what's written in the landscape, and then we want the change in targeted areas on top of that. 00:16:14:11 - 00:16:21:20 Speaker 3 And I think implicit in that is, oh, thank you, you know, the cost, right? We have to figure out what makes sense. 00:16:22:22 - 00:16:52:00 Speaker 1 Yeah. So that's the figure that Andrea was just mentioning, on the cover there, where we show a bunch of different things. So my apologies, Paul, I did not mean to divide it like that. Yeah, yeah, understood.
But what Paul said was, we don't need to settle into something that's just doable for the purposes of completing the project. 00:16:52:00 - 00:17:39:13 Speaker 1 Yes, we need to make sure that we're informing the science with the change as well. So, for the rest of the measurement needs, I thought it might be interesting to look at an example where we have all three data sets in the same area. This is a site in Sitka, Alaska, where we have airborne lidar, a synthetic aperture radar DEM (a commercial product), and then an ArcticDEM from the WorldView satellites, all over the same area and fairly close to the same time. 00:17:39:13 - 00:18:05:06 Speaker 1 And if you look at the histograms on the bottom there, there are a lot of open areas, but there's also a lot of vegetation, and there's a significant amount of terrain variability as well. So on the right there is an overview of the surface models and the DTM for the USGS airborne IfSAR product; 00:18:05:06 - 00:18:48:09 Speaker 1 it's actually dual-band, and they provided both a DSM and a DTM from it. So, just looking at the results here, and I don't think there are any shocking revelations from this, but there are a lot of curves on the right there. Solid lines are differences with respect to the lidar digital terrain model; dotted lines are differences with respect to the lidar canopy model, or digital surface model, the first-return surface. 00:18:48:23 - 00:19:12:08 Speaker 1 And so you can see, for example, in the top, as you would expect when you look at the aerial photo, the differences are basically the tree height. With the IfSAR, the differences with the DTM get rid of some of the tree heights, but not quite all; it's still biased a little high.
00:19:12:08 - 00:19:32:09 Speaker 1 And then the interesting thing is when you compare with the DSM: the mean, at least for the lidar and the WorldView, is pretty good. The IfSAR bias is a little bit negative, so it's kind of below the tree height a little bit. And that is then encapsulated in the graph on the bottom there. 00:19:33:09 - 00:20:08:07 Speaker 1 You know, the interesting thing to me when I look at that is that the slopes with respect to vegetation height are fairly consistent. The IfSAR seems to level off at a value, which to me maybe implies that we could perhaps use that to train a machine learning model to estimate where the terrain is, based on that penetration depth of the IfSAR; of course, using the lidar as the ground truth here. 00:20:08:07 - 00:20:30:02 Speaker 1 Not to say the lidar is better, but it was captured from a helicopter and it's 25 points per square meter, so it's a nice reference surface to compare everything else against. I don't think we can get 25 points per square meter from space yet, probably. So we're going to have to figure out how we can combine those. 00:20:30:02 - 00:20:56:06 Speaker 1 But the fact that there's some signal in both the WorldView and the IfSAR that we might be able to take advantage of, I think, speaks well to perhaps doing some fusion between them. I have a table of stats here too, but I won't spend too much time on those. The standard deviations are fairly big because we have a lot of terrain. 00:20:57:15 - 00:21:20:19 Speaker 1 One thing I didn't plot on here: the error doesn't really seem to depend very much on terrain slope. It seems to be fairly consistent; it's more correlated with the vegetation height than anything else. And that's kind of a good thing.
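The DSM/DTM differencing described here can be sketched with synthetic numbers; the elevation arrays below are invented stand-ins for co-registered lidar and test-DEM grids, not the Sitka data:

```python
import numpy as np

# Synthetic 1D "rasters" standing in for co-registered elevation grids (made-up values).
lidar_dtm = np.array([10.0, 12.0, 11.0, 13.0])  # reference bare-earth surface
lidar_dsm = np.array([25.0, 12.5, 26.0, 13.2])  # first-return surface (trees on cells 0 and 2)
test_dem  = np.array([18.0, 12.2, 19.5, 13.1])  # a DEM that partially penetrates the canopy

canopy_height = lidar_dsm - lidar_dtm  # roughly tree height where vegetated

diff_vs_dtm = test_dem - lidar_dtm  # "solid line" style comparison against the terrain model
diff_vs_dsm = test_dem - lidar_dsm  # "dotted line" style comparison against the canopy top

print(diff_vs_dtm.mean())  # positive bias: the test DEM sits above the terrain
print(diff_vs_dsm.mean())  # negative bias: the test DEM sits below the canopy top
```

The sign pattern of the two means mirrors the behavior described for the IfSAR curves: above the DTM but below the DSM in vegetated cells.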
I think you don't have to worry about significant additional errors in mountainous regions. 00:21:20:19 - 00:21:56:10 Speaker 1 So what are we missing? I think probably Joe and Mark will go into this in a little more detail later on. But I think we all know these are the kind of shortcomings of some of the approaches. You know, optical stereo gets a top surface only and does not penetrate vegetation well, and because you have to correlate between the images, it sometimes tends to round out sharp edges because of that correlation window. 00:21:56:10 - 00:22:37:23 Speaker 1 With IfSAR, you have some penetration into the vegetation, and in a couple of data sets I've looked at, there seems to be something left in the signal that would perhaps allow us to model the terrain underneath. I think that needs to be fleshed out. Lidar, of course, does fairly well at vegetation penetration, but it's still a challenge to classify the data between vegetation and ground, especially when you get into heavy vegetation; there's not really an automatic process, I should say, that does it perfectly. 00:22:39:00 - 00:23:09:15 Speaker 1 And then there's also the question of how dense you can get the data, and can you actually do a global high density? I would put myself in the pessimistic camp on that one right now. So what are our needs? I don't think there would be many who would say that there's one technology that's going to answer all the questions. 00:23:09:15 - 00:23:33:14 Speaker 1 I think that came out in the Decadal Survey and in most of our discussion; we need maybe a combination of technologies. Which ones, and how, is still to be determined. I think we need to start looking at fusion algorithms a little bit.
You know, we have experts in stereo and in lidar and radar in the room, 00:23:33:14 - 00:24:10:24 Speaker 1 but we need to do some crosstalk between groups to see if we can come up with a best-in-class solution using all of them, or a combination of them, together. The last thing I would talk about is one of the things we've been discussing: the airborne campaigns. Picking targets of interest that we can go and collect using all possible technologies and, you know, perhaps also collect at different temporal spacings to look at process change in areas where things are moving quickly enough. 00:24:10:24 - 00:24:41:07 Speaker 1 So, one of the things that we'll talk about hopefully in our breakouts today is what areas of interest there are where we can do this, and can we leverage some existing data sources? And, you know, on my last slide here, I would say there are a lot of existing data sources that we can use. So this is a nice GIF of the lidar coverage in the U.S. by the USGS, as an example. 00:24:41:07 - 00:25:06:24 Speaker 1 The darker the green gets, the more recent the data sets are. You know, lidar can be a little bit expensive to acquire. NASA already has the radar and stereo imaging, for example UAVSAR. And so it might be interesting to leverage some of this existing lidar data and try to collect at the same time with those. 00:25:07:11 - 00:25:32:19 Speaker 1 And by the way, the slides will be posted; this GIF is on the USGS website. I have no special skills in GIS that allowed me to make this. So hopefully that frames the measurement needs a little bit, or maybe muddies the water more. I'm not sure which one I did; you tell me afterwards. 00:25:32:19 - 00:25:45:03 Speaker 1 Amazingly, we're right on schedule, which never happens when I talk.
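As a minimal illustration of the fusion idea, one common starting point is an inverse-variance weighted average of two elevation estimates for the same cell; the numbers and error variances below are hypothetical:

```python
def fuse_elevations(z1, var1, z2, var2):
    """Inverse-variance weighted fusion of two elevation estimates for one grid cell."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    z = (w1 * z1 + w2 * z2) / (w1 + w2)   # weighted mean, favoring the lower-variance input
    var = 1.0 / (w1 + w2)                  # fused variance is smaller than either input's
    return z, var

# Hypothetical cell: stereo DSM says 105.0 m (variance 4 m^2), IfSAR says 103.0 m (variance 1 m^2).
z, var = fuse_elevations(105.0, 4.0, 103.0, 1.0)
print(round(z, 2), round(var, 2))  # the estimate is pulled toward the more certain input
```

Real fusion work would of course need spatially varying, vegetation-dependent error models rather than fixed variances.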
So, we had one question, which will take us off schedule. 00:25:45:03 - 00:25:55:14 Speaker 3 It says quality lidar data. What does that mean? One meter? Ten centimeters? What spacing? 00:25:55:14 - 00:27:03:21 Speaker 1 So QL2 basically means four points per square meter and a 95%-confidence vertical accuracy of 19 and a half centimeters. So fairly good data, I think, to compare against, and most of the data on here is at what USGS calls quality level 2. Any questions online? Okay, if not, we'll move on to our next presenter. 00:27:03:21 - 00:27:53:20 Speaker 4 Good morning. My name is Yunling Lou. I'm the radar technology lead; the caretaker, or whatever; the person herding the cats. So today, first I'm going to introduce the radar-related findings from the incubation study. I can't believe that was done in 2021; to date, two years ago. Then I'm going to talk about current and funded spaceborne radar missions. 00:27:53:20 - 00:28:35:04 Speaker 4 So I did another sort of update, a refresh, to see what's out there since we did the study in 2020 and the report came out in 2021. Since then there have been a lot of commercial spaceborne radar constellations, or companies aspiring to deploy radar constellations in space, although so far they are all X-band. I'll also talk a little bit about airborne radar assets and the radar projects funded by the DSI and also the IIP programs, 00:28:35:23 - 00:29:11:17 Speaker 4 so all the ESTO investments towards STV, and what the measurement gaps are. Okay. So I love this picture because it shows what we're aspiring to do. I've listed the four types of radar measurements that we could potentially use for an STV observing system.
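The two QL2 accuracy numbers are linked by the usual normal-error conversion: a 10 cm vertical RMSE scales by 1.96 to roughly the 19.5 cm quoted at 95% confidence. A quick sketch, assuming normally distributed errors:

```python
# Convert a vertical RMSE to a 95%-confidence accuracy (normal-error assumption).
Z_95 = 1.96  # two-sided 95% quantile of the standard normal distribution

def accuracy_95(rmse_cm):
    """95%-confidence vertical accuracy implied by an RMSE, in the same units."""
    return Z_95 * rmse_cm

print(round(accuracy_95(10.0), 1))  # 19.6 cm, matching the ~19.5 cm quoted for QL2
```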
00:29:11:17 - 00:30:11:14 Speaker 4 So, for example: InSAR for global topo mapping, kind of like a TanDEM-X follow-on, that gives us a nice high-resolution global DEM; then perhaps some sort of TomoSAR cluster with multiple SmallSats to map vegetated areas so you can get vegetation structure; then perhaps some sort of high-altitude, long-endurance or airship-based persistent observation to look at rapidly evolving events like earthquakes or volcano eruptions or, who knows, wildfires and flooding; and then some sort of flexible airborne observation with really high resolution to do targeted 3D imaging. 00:30:12:15 - 00:30:48:09 Speaker 4 So that's our take on what radar instruments can contribute to STV. This is out of the incubation study report. Basically, in terms of radar technology investment, the vertical axis shows the benefit. So, for example, SAR and InSAR signal processing and data format improvements are a necessary evil: they don't take very much investment, and the payoff is low. 00:30:48:09 - 00:31:23:04 Speaker 4 Whereas on the other hand, things like onboard processing, sensor webs, and size, weight, and power reduction are more challenging to accomplish, and yet the payoff is high. So this is the chart we came up with, and we identified two high-payoff, high-priority investments. 00:31:23:04 - 00:31:58:15 Speaker 4 One is the multi-frequency and imaging trade study, because, depending on the application, you may want a low-frequency radar that can penetrate vegetation to get the terrain model, versus a higher frequency to get the vegetation top; or, you know, a higher-frequency radar gives you better resolution so you can see smaller features. So that's a tradeoff that we thought needed to be invested in, and also just the imaging technique.
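The frequency trade discussed here maps directly onto wavelength, since longer wavelengths penetrate canopy better while shorter ones resolve finer features. The band-center frequencies below are nominal, representative values, not tied to any specific mission:

```python
# Wavelength per radar band: lambda = c / f.
C = 299_792_458.0  # speed of light, m/s

# Nominal center frequencies in GHz (representative assumptions).
bands = {"X": 9.6, "C": 5.4, "S": 3.2, "L": 1.26, "P": 0.435}

wavelengths_cm = {band: C / (f_ghz * 1e9) * 100 for band, f_ghz in bands.items()}
for band, wl in wavelengths_cm.items():
    # Longer wavelengths (L, P) penetrate vegetation; shorter ones (X) see finer detail.
    print(f"{band}-band: {wl:.1f} cm")
```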
00:31:59:03 - 00:32:28:03 Speaker 4 But I think since then we've kind of settled on TomoSAR as perhaps the most needed maturation activity as far as that imaging trade study goes; TomoSAR or PolInSAR. And performance modeling and evaluation; those two go hand in hand. One thing that was not mentioned was cross-platform radar synchronization and calibration. That's a necessary technology 00:32:28:03 - 00:33:04:02 Speaker 4 if we want to do multi-platform formation flying for TomoSAR, and I wanted to point it out because some of the funded work actually addresses that number-three technology need. Okay, so here's a summary of the available government and commercial SAR satellites that are currently flying or planned to be launched. So you've got Sentinel-1, a C-band mission; 00:33:04:02 - 00:33:32:12 Speaker 4 from our perspective, we're probably looking at Sentinel-1 C and D. So that's the ESA mission. And then they are also funding ROSE-L A and B, which are going to be launched in '28 and '31; those are the target launch dates. And ROSE-L, although there are two spacecraft, they're not doing formation flying; 00:33:32:12 - 00:34:02:09 Speaker 4 they are just on opposite sides of the Earth so they can have twice-as-frequent coverage. And then ALOS, the Japanese L-band mission; they also plan to launch ALOS-4. I should mention that the missions shaded in color are the ones where the radar data are not freely available, so you have to purchase them, 00:34:03:09 - 00:34:34:09 Speaker 4 whereas the ones that are not shaded have open data access and global high-area-coverage rates. Okay. Another one that we could utilize is the Biomass P-band mission, which is going to be launched in 2025. The RADARSAT C-band constellation is very useful; unfortunately, it's not free and open, so you have to purchase the data.
00:34:35:18 - 00:35:17:04 Speaker 4 NISAR is going to be launched; the L-band is free and open data, and the S-band is not global; it's mostly over the India and Asia area and maybe some select U.S. areas. Tandem-L is a mission proposal by DLR; I have a question mark there because currently it's not funded. Okay, so that's the current situation. And then below that dashed line are all the commercial constellations. 00:35:17:04 - 00:35:45:12 Speaker 4 So, ICEYE: they are X-band, so great for high-resolution change detection, but perhaps not so great for vegetation, and they are also regional coverage. ICEYE's got a nice constellation and they're still continuing to add to it, and then there's Capella; I think those two were the front runners two years ago. 00:35:45:12 - 00:36:20:09 Speaker 4 But Umbra has been catching up with funding from DOD, so they have a really capable X-band SAR. Again, not global, because these are small satellites, so they are regional. And then there are other companies, probably at least five that I can think of, that are either getting ready to launch or have aspirations to develop SAR constellations. 00:36:20:09 - 00:37:18:08 Speaker 4 So the satellite business, I think, is mostly after change detection; not necessarily topography, but more just change. We call that coherent change detection: like maybe trees were cut down, or, for military purposes, troops were deployed, or ships moved, barges moved, parking lots are full, that kind of stuff. So a more commercial or insurance type of business model, although we have been talking to them. I think Capella's data is on the NASA contract for us to examine. 00:37:18:08 - 00:37:49:13 Speaker 4 Not sure about ICEYE; I don't remember. Yes, they are? Okay. What about Umbra? Next year? Okay, great.
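The coherent change detection these constellations sell can be sketched as the sample coherence between two co-registered complex SAR patches, where low coherence flags change; the patches below are random stand-ins, not real SAR data:

```python
import numpy as np

def coherence(s1, s2):
    """Sample interferometric coherence magnitude between two complex image patches."""
    num = np.abs(np.sum(s1 * np.conj(s2)))
    den = np.sqrt(np.sum(np.abs(s1) ** 2) * np.sum(np.abs(s2) ** 2))
    return num / den

rng = np.random.default_rng(0)
patch = rng.standard_normal(1000) + 1j * rng.standard_normal(1000)  # "before" scene
noise = rng.standard_normal(1000) + 1j * rng.standard_normal(1000)  # fully decorrelated "after"

print(coherence(patch, patch) > 0.99)  # unchanged scene: coherence near 1 -> True
print(coherence(patch, noise) < 0.2)   # changed (decorrelated) scene: coherence near 0 -> True
```

Operationally one would compute this over small sliding windows and threshold the resulting coherence map.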
So we're going to get our hands on some of these commercial data sets and we'll be able to see how useful they are, and also engage them and see how interested they are in contributing to this. TanDEM-X, this one I'm not going to go over, because Irena gave a very nice overview yesterday. 00:37:50:22 - 00:38:24:19 Speaker 4 So there are all these direct derivative products, which is basically what STV is after. I think the important thing is also the dates: the date on the website is 2026, but she said end of life is going to be 2029. In any case, after 2029 we do not have TanDEM-X data, and as you heard from her, it takes like four years or so to get one global mapping DEM. 00:38:25:02 - 00:39:11:23 Speaker 4 So when you think about getting this one great high-resolution global DEM: realistically, right now you can probably do it once every four years, something like that. So when we do the trade study, we're going to have to prioritize and carve out, you know, which targets really require repeat observation much more often, and how often. So you kind of have to carve out the mission objectives and figure out what platforms and vehicles you could use to achieve your measurement needs. 00:39:12:16 - 00:39:47:03 Speaker 4 Okay. So, the airborne SAR systems: I only listed the taskable ones, because there are airborne instruments that have demonstrated operation, but they're not available for tasking; they're just technology demos. Now, last time I did this there was GeoSAR, which was like the ideal: P-band InSAR and X-band InSAR, so you can get a terrain model and surface model all in one flight. 00:39:47:09 - 00:40:28:05 Speaker 4 That's what they used for the Alaska IfSAR mapping, which was great, but GeoSAR has been retired. So right now there's one commercial company, Intermap.
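The four-years-per-global-map comment implies a rough coverage-rate budget; a sketch, where the land-area figure is a rounded assumption:

```python
# Rough coverage-rate budget implied by "one global DEM every four years".
LAND_AREA_KM2 = 149e6  # Earth's land area, rounded (assumption)
YEARS_PER_MAP = 4      # roughly one global mapping cycle, per the talk

daily_rate = LAND_AREA_KM2 / (YEARS_PER_MAP * 365)
print(round(daily_rate))  # on the order of 100,000 km^2 of land mapped per day
```

Mapping more often than every four years would require scaling that daily rate up proportionally, which is where prioritized repeat targets come in.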
They have X-band InSAR and polarimetric imagery, and then there's UAVSAR, which are available for tasking. And also we are building a new instrument — well, we're going to move to a different Gulfstream, which will be modified. 00:40:28:05 - 00:41:06:17 Speaker 4 So we're also repackaging the instruments so that we can have three frequencies on the same platform at the same time, with a camera — with an optical imager. So we can at least have stereo imaging and perhaps P-band on the same flight, or L-band and Ka-band InSAR on the same flight, which is something I'm really eager to see — what benefits we can gain from having a Ka-band DEM and a stereo-imaging DEM together with L-band polarimetric imagery. 00:41:07:06 - 00:41:47:17 Speaker 4 Okay, so here are the funded radar projects. The top three are technology projects — I'm going to go into them a little bit. And then there's the Radar RC project, led by Marco Lavalle, and they are also going to look at TomoSAR and InSAR performance with smallsat formations, and I believe Robert Treuhaft also has funding from that proposal to look at algorithms for vegetation structure study. 00:41:49:07 - 00:42:38:07 Speaker 4 And then there's another very relevant proposed project, a forest information system, SDF; it's led by Sassan Saatchi, and in that project they're doing visualization, they're doing modeling of lidar and radar combined — so data fusion from the modeling perspective. And they also have a very nice visualization tool they're developing to help us visualize TomoSAR. And then there are these proposals classified as science proposals, like Pietro's proposal. 00:42:38:07 - 00:43:18:07 Speaker 4 Matthew Siegfried's — is that the one you're on, Roger? Okay. So that one. So these are all different use cases for radar measurements — radar InSAR — looking at sensitivity, accuracy, requirements, coverage.
So they're going to the measurement needs. So basically they're going to help frame — help us frame — the requirements for radar-based measurements. Okay. Here are the radar subgroup members. 00:43:18:15 - 00:43:50:15 Speaker 4 It's very nice that we have people from NASA centers, from universities, as well as from commercial companies, so it's a nice mix. And I just show this — when Marco showed it this way it's really nice, because you can see that the radar people are also interested in a number of different science disciplines, so they're very much interrelated. 00:43:50:15 - 00:44:20:05 Speaker 4 So next I'm going to go over three of the project summaries. The first one is JPL's project — I'm the PI — it's FlexDSAR, a flexible distributed synthetic aperture radar. So we want to build an aperture across multiple platforms so we don't have to build a large antenna; we can just do the assembly in space. 00:44:20:13 - 00:44:51:18 Speaker 4 The way we're developing the digital beamforming electronics, it can be applicable to any microwave frequency. You just have to plug in the front-end electronics and the proper antenna elements and then figure out how you want to deploy the spacecraft. It could be airborne as well. So we're going to demonstrate it with UAVSAR, because UAVSAR has multiple T/R modules. 00:44:51:18 - 00:45:20:01 Speaker 4 So we can treat each T/R module as a single element on a separate platform. And together with that we have a multi-frequency SAR imaging trade study. Okay. So the next two are projects from Aloft Sensing. They're developing a lightweight, small-footprint X-band radar that they want to be able to fly on an airship or a HALE platform. 00:45:20:11 - 00:46:10:00 Speaker 4 So this is an IIP; I believe they are planning on doing some flight demonstrations next year. This one is also from Aloft Sensing.
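The digital beamforming idea behind a distributed aperture can be made concrete with a minimal narrowband sketch — this is not FlexDSAR's actual electronics; the element count, half-wavelength spacing, and L-band-ish wavelength are all illustrative assumptions. Each element's signal is multiplied by a phase weight so that returns from the chosen look direction add coherently:

```python
import numpy as np

def steering_vector(pos_m, wavelength_m, theta_rad):
    """Phase ramp across elements at positions pos_m (meters along the
    baseline) for a plane wave arriving at angle theta from broadside."""
    k = 2.0 * np.pi / wavelength_m
    return np.exp(1j * k * pos_m * np.sin(theta_rad))

def beamform(x, w):
    """Digital beamformer output: weighted coherent sum y = w^H x."""
    return np.vdot(w, x)

wl = 0.24                       # ~L-band wavelength in meters (assumption)
pos = np.arange(8) * wl / 2.0   # 8 elements (T/R modules) at half-wavelength spacing
w = steering_vector(pos, wl, np.deg2rad(10.0)) / len(pos)  # steer to 10 degrees

# A wave from the steered direction sums coherently; an off-axis one does not
on_axis = abs(beamform(steering_vector(pos, wl, np.deg2rad(10.0)), w))
off_axis = abs(beamform(steering_vector(pos, wl, np.deg2rad(40.0)), w))
```

The same arithmetic applies whether the "elements" are T/R modules on one aircraft or free-flying smallsats — which is exactly why precise cross-platform timing and position knowledge becomes the hard part.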
They're developing this very small package for a positioning, navigation and timing unit, because when you do TomoSAR, especially when you go to higher frequencies, you need very precise navigation and position information. So having this small module that you can put on each platform to coordinate the timing and the position seems to be a very useful technology to demonstrate. This is funded by the DSI. 00:46:10:20 - 00:46:43:13 Speaker 4 Okay, technology gaps. So these are the technology gaps — formation flying technology needs. So cross-platform synchronization — we talked about this — and cross-platform calibration; these two will be demonstrated with FlexDSAR, and there are other people working on this as well, in other countries. Metrology — I mentioned that. Low latency — something that's interesting and important is global processing and long-endurance airborne platforms. 00:46:43:13 - 00:47:23:10 Speaker 4 So those need a lot of investment. And suborbital radar technologies — like formation flying and multi-squint observation, those things — we need airborne data to develop algorithms, and also to make the radar generally lighter and more flexible so you can deploy it on different platforms. Okay, I think this is all I have. I'm just listing our charges for this afternoon's breakout session, so I encourage you to come join us and help us figure out what the gaps are and what your hopes and dreams are. 00:47:24:00 - 00:47:29:24 Speaker 4 Thank you. 00:47:29:24 - 00:47:54:09 Speaker 1 And perfect timing. Okay. Our next speaker is Ben Smith. 00:47:54:09 - 00:48:06:04 Speaker 5 Thanks a lot. 00:48:08:09 - 00:48:33:09 Speaker 5 So I'm Ben Smith. I'm out of the University of Washington Applied Physics Lab, and I'm going to be presenting about lidar. A lot of the stuff that comes in this report comes from the lidar group, which has been meeting every couple of weeks pretty much since the meeting in Tahoe, and a lot of it comes from the 2021 STV report.
00:48:33:09 - 00:48:58:18 Speaker 5 So I'll call both of those out when I think of it, but I'm going to miss a lot of chances where I've stolen something from somebody else — I appreciate all the help that's gone into this. So for lidar, we have basically three missions on orbit that can provide data for us, potentially both for testing out algorithms and just as an example of the things that are possible. 00:48:59:17 - 00:49:35:01 Speaker 5 I think the best known of these is ICESat-2, which is one of NASA's flagship missions, designed for measuring change over ice sheets. It has near-global coverage, going to plus or minus 88 degrees, measuring with a photon-counting lidar in the green, and the photon counting sort of defines the sampling and per-pulse energy components of this mission. So it has low energy per pulse, but it transmits very often, which allows building up the photons to make detailed measurements of the surface. 00:49:36:16 - 00:49:57:24 Speaker 5 GEDI is a mission that's on the International Space Station. It's not operating right now, but it's going to go back into operation within the year, maybe next year. Since it's on the International Space Station it has somewhat limited coverage, but that's coverage over most of the world's vegetated latitudes, and over the densest parts of the world's vegetation. 00:49:58:17 - 00:50:24:01 Speaker 5 It has eight ground tracks, operating in the infrared with a full-waveform measurement. So it has more dynamic range than ICESat-2 does, with somewhat coarser along-track sampling and 25-meter footprints, which have in some cases been described as optimal for vegetation. The one that I know the least about is GaoFen-7, which is a Chinese satellite. 00:50:24:20 - 00:50:49:08 Speaker 5 And this is an interesting one for us, I think, because it's a combination of a stereo imager and a lidar.
So the lidar is there to provide control for stereo pairs taken with the two cameras, and the lidar is not meant to stand alone, I assume, because it operates only at three hertz — so it's taking one measurement every two and a half kilometers. 00:50:50:11 - 00:51:19:17 Speaker 5 But it makes up for some of that with extremely intense pulses. This is very reminiscent of ICESat-1, which used high-intensity pulses at a higher rate than this, but still at a low rate compared to GEDI and ICESat-2. For sensors that can operate from airplanes and autonomous platforms, there are an awful lot listed in the STV 2021 report. 00:51:20:16 - 00:51:46:04 Speaker 5 I copied out a selection of those, and people have pointed out that technology has improved since then — so these are a few years out of date and there are better things from each of the commercial providers. But maybe the point of this chart is that there are lots of options that could get us lots of returns per square meter at either 532 or 1064 nanometers. 00:51:46:15 - 00:52:25:11 Speaker 5 I think the 532 ones that I've listed here are typically at lower resolution and higher per-pulse energy, and that's because these are the ones that are capable of seeing through water, so they have more energy going into each measurement to try to see deeper water than would be necessary for some of the 1064 surveying instruments. So lots of good options if we need to make measurements from the air, either for study development or for covering parts of the mission that we can't do from space. 00:52:25:11 - 00:52:53:12 Speaker 5 I have four emerging technologies, and I'm hoping that people will come to the breakout group and tell me about more that I don't know about. But roughly speaking, they are CASALS, NUVIEW, and then two technologies that came sort of out of the quad charts and out of conversations with people who know about these things.
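The 3 Hz / two-and-a-half-kilometer figure is just ground-track speed divided by pulse rate, and the same arithmetic shows why a kilohertz-class photon-counting system samples at sub-meter scale. A quick check — the ~7 km/s low-Earth-orbit ground speed is a round illustrative number, not from the talk:

```python
def along_track_spacing_m(pulse_rate_hz, ground_speed_m_s=7000.0):
    """Distance between successive laser shots on the ground."""
    return ground_speed_m_s / pulse_rate_hz

gf7_like = along_track_spacing_m(3.0)         # ~2.3 km between shots
icesat2_like = along_track_spacing_m(10_000)  # ~0.7 m between shots
```

So the GaoFen-7-style altimeter is a sparse control point generator, while a 10 kHz photon counter profiles the surface nearly continuously along-track.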
Geiger-mode flash lidar, and imaging lidar based on GEDI heritage. 00:52:53:12 - 00:53:23:02 Speaker 5 So CASALS is an instrument that's under development at Goddard. It uses a wavelength-tuning technology to scan a laser beam using grating optics. It has a high-efficiency laser and high-efficiency photon-counting detectors, capable of operating both at one micron and at 532 nanometers. 00:53:23:02 - 00:53:48:15 Speaker 5 There's some stuff in the quad charts about freeform optics that would allow the use of a smallsat platform, and about using machine-learning-based real-time data analysis, which would allow better targeting of the measurements from space. So it's a new scanning system, a new way of making measurements, and it's potentially more efficient than things that have gone before. 00:53:49:19 - 00:54:23:14 Speaker 5 NUVIEW is a commercial venture — a commercial platform — that is much more vague about what exactly they're going to do. It definitely involves smallsats, and apparently it involves operating at low duty cycle. So rather than the measurements that we have now from ICESat-2 and GEDI, which just keep the lidar turned on all the time, this would turn the lidar on and off to save energy and presumably to save downlink capability. 00:54:23:14 - 00:54:45:24 Speaker 5 The third one is Geiger-mode flash lidar, which is a technology where the lidar returns are detected with a 2D array of pixels — sort of along the lines of a camera where every pixel is a lidar detector. And so this is a way of putting lots of spots on the ground and seeing the ground multiple times as the satellite passes overhead.
00:54:46:17 - 00:55:25:14 Speaker 5 So it's a powerful technique for making detailed measurements, and it's potentially a way of turning a Geiger-mode detector — a detector that can only detect one photon per pulse — into a technology that can see through canopies and make detailed measurements over vegetation. And then the last one is an imaging lidar based on GEDI heritage. So the people who brought you GEDI have been working on their instruments since then, and they have more efficient one-micron lasers and detectors that allow maybe at least a tenfold improvement in coverage over the current GEDI. 00:55:25:14 - 00:56:02:09 Speaker 5 And these are high-dynamic-range measurements, which are good for both vegetation and ice, and they use a solid-state scanning system based on crystals that change their refractive index depending on the voltage applied to them, which can give really repeatable and stable beam locations for scanning lasers. For mapping lidar, the needs-to-measurements exercise sort of defines lidar measurement parameters, and gives a general idea of how they would map to lidar instrument parameters and mission parameters. 00:56:02:11 - 00:56:33:10 Speaker 5 So for measurement parameters, you might specify the sampling resolution, the per-sample resolution, the repeat frequency, the per-measurement precision, canopy penetration, water penetration. And these are going to need to map to instrument parameters like the geolocation precision, the scanning strategy, the wavelength, and the other ones I've listed here — and also to mission parameters like how long the mission needs to be up, what the orbit is going to be, the number of platforms, and the supplementary data sources that could fill in between lidar gaps. 00:56:33:20 - 00:57:20:10 Speaker 5 So these are the sorts of things that we're going to need to define as the study goes forward.
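To illustrate the "building up photons" idea behind photon-counting and Geiger-mode systems, here is a toy retrieval: pool the photon heights from many low-energy shots, histogram them, and take the densest bin as the surface. Real processing (for example the ICESat-2 ground-finding algorithms) is far more sophisticated; the numbers below are synthetic:

```python
import numpy as np

def surface_from_photons(heights_m, bin_m=0.25):
    """Toy photon-counting retrieval: histogram photon heights and take
    the densest bin as the surface estimate."""
    bins = np.arange(heights_m.min(), heights_m.max() + bin_m, bin_m)
    counts, edges = np.histogram(heights_m, bins=bins)
    i = int(np.argmax(counts))
    return 0.5 * (edges[i] + edges[i + 1])  # center of densest bin

rng = np.random.default_rng(0)
# 200 signal photons near a 12.0 m surface, buried in 1000 solar
# background photons spread over a 500 m range window
signal = rng.normal(12.0, 0.1, 200)
noise = rng.uniform(0.0, 500.0, 1000)
est = surface_from_photons(np.concatenate([signal, noise]))
```

Even with five times as many background photons as signal photons, the signal piles up in a few height bins while the background spreads thinly over the whole window, so the surface pops out.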
I pulled out some of the requirements from some of the disciplines, and I was mostly looking for what is stringent in the data tables that were in the report. Generally, most of the disciplines were somewhat agnostic about footprint size, but coastal processes and solid earth both seemed to be enthusiastic about really fine-scale measurements that might be difficult to make with a large footprint. For horizontal sampling, it was possible to find quite small sampling needs in each of the disciplines. 00:57:21:15 - 00:57:50:24 Speaker 5 Land ice mostly kept things in the 10-to-30-meter range; hydrology was looking for one-meter measurements. For repeat intervals, everybody at some point in their table listed one day as their desired repeat interval, but there were also more relaxed intervals in some of the disciplines. For vertical precision, a range of numbers between centimeters and meters. 00:57:50:24 - 00:58:16:12 Speaker 5 And then there are some requirements that you can sort of divide up by thinking about how people need to make their measurements. Vegetation requires a fair amount of signal strength for canopy penetration; hydrology and coastal processes both need to see the bottom, which can be difficult through turbid water, so that requires more signal strength. For wavelength, visible is critical for seeing through water — it's difficult to do that in the near-infrared. 00:58:16:13 - 00:58:41:16 Speaker 5 For sea ice and land ice, they would both probably prefer near-infrared, because there are fewer problems with subsurface scattering, but obviously they can do just fine with green. And then the needs that people have are probably different from place to place — there are not closed-canopy trees everywhere. 00:58:42:02 - 00:59:14:20 Speaker 5 So it's possible that we might need to make some vegetation measurements from space and some from airborne platforms.
So lidar has a lot of potential to make high-precision measurements, and to do a lot where it makes the measurements — but to make dense lidar measurements everywhere, we would need to improve the efficiency of the system so we can put more power onto the ground than I think is currently feasible. 00:59:14:20 - 00:59:42:02 Speaker 5 So improvements in transmitter, detector, and platform efficiency would all be required to make lidar measurements all the time, everywhere. There are also different things that we might put on the table to advance, including being able to make measurements from smallsats or multiple platforms, or being able to work at different energies or with different configurations at different times, to get the measurements we need where we need them. 00:59:42:02 - 01:00:07:20 Speaker 5 But lidar also has a lot of potential as a sampling component of a multi-sensor system. So to advance that technology, we need more modeling studies and mission design studies that would allow us to use the lidar where we have it optimally, in conjunction with other sorts of measurements. So as I said before, lidar has the strengths of high precision and mission continuity. 01:00:07:20 - 01:00:36:06 Speaker 5 So lidar missions can be directly compared against each other in relatively simple ways — a lidar mission over ice can be compared directly with ICESat-1 and ICESat-2. Lidar can have fine spatial resolution, can recover vertical profiles of canopy density directly, and can likewise make bathymetric measurements directly. For weaknesses: it doesn't see through clouds very well. 01:00:36:06 - 01:01:09:06 Speaker 5 So to actually see the surface, you might need to operate for a longer time, or to have wider swaths that give you more chances of seeing a given piece of surface from multiple orbits, and coverage is somewhat limited by the power requirements and scanning hardware.
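The efficiency argument can be made concrete with a first-order lidar link budget: the number of detected signal photons per shot scales with pulse energy and receiver area, and falls off as one over range squared. This is a rough sketch under stated assumptions — every parameter value below is illustrative, not any specific instrument's:

```python
import math

def expected_signal_photons(pulse_energy_j, wavelength_m, range_m,
                            rx_area_m2, surface_reflectance=0.3,
                            system_eff=0.05, atm_transmission=0.8):
    """First-order link budget for a diffuse (Lambertian) surface:
    photons transmitted, times round-trip atmospheric loss and surface
    reflectance, times the receiver's capture fraction and efficiency."""
    photon_energy_j = 6.626e-34 * 3.0e8 / wavelength_m   # h * c / lambda
    n_transmitted = pulse_energy_j / photon_energy_j
    capture = rx_area_m2 / (math.pi * range_m ** 2)       # Lambertian return
    return (n_transmitted * atm_transmission ** 2
            * surface_reflectance * capture * system_eff)

# e.g. 40 microjoules at 532 nm from 500 km with a ~0.5 m^2 telescope:
# on the order of one detected photon per shot
p = expected_signal_photons(40e-6, 532e-9, 500e3, 0.5)
```

The order-one photon count per shot is exactly why the trade is between more transmitter power, bigger or more efficient receivers, and accepting sparse sampling.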
So it might be important to be able to make adaptable measurements, to use the limited power and footprint number where we can. 01:01:09:06 - 01:01:57:21 Speaker 5 I have a slide for lidar synergies, because I think there's a lot of potential for lidar's strengths to complement measurements from other technologies. Right now there are a lot of studies coming out that use lidar where we have it, and imagery where we don't, to map vegetation and ecosystems globally — and this is typically done by using instruments on separate platforms, having broad-area images and then calibrating those where we have measurements from ICESat-2 and GEDI. For lidar combinations with stereo optical and InSAR: dense lidar makes very unbiased and precise measurements, typically, again, over limited areas. 01:01:57:21 - 01:02:43:03 Speaker 5 And then we can fill in the gaps with either optical or InSAR measurements that provide more complete coverage between the lidar beams. This can be done with either a single platform or separate platforms. For stereo optical, there are probably some geometric advantages to doing everything with a single platform, and these are studies that will be interesting to do, to decide what geometry works best and how best to do this. Over vegetation — doing stereo over vegetation, I think, is a developing technology, and it'll be interesting to see how those algorithms might get developed in future STV activities. 01:02:44:10 - 01:03:16:10 Speaker 5 And then lidar and radar measurements over vegetation have similar advantages to some of the other pairings: lidar can provide direct measurements of canopy structure, while radar provides broader-swath measurements, perhaps without the same vertical resolution as the lidar. For these, I think most of the radar measurements over vegetation are envisaged as InSAR, so that's a side-looking geometry.
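The "lidar where we have it, imagery where we don't" workflow described above is essentially a calibration regression: fit a model relating an image-derived metric to lidar height at the footprints, then apply it wall-to-wall. A toy sketch with synthetic data — the linear relationship, the invented `texture` metric, and the 10% lidar sampling fraction are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
texture = rng.uniform(0, 1, n)                            # hypothetical image metric
canopy_h = 2.0 + 25.0 * texture + rng.normal(0, 1.0, n)   # synthetic "true" heights

has_lidar = rng.uniform(size=n) < 0.1      # lidar footprints sample ~10% of pixels
A = np.vstack([np.ones(has_lidar.sum()), texture[has_lidar]]).T
coef, *_ = np.linalg.lstsq(A, canopy_h[has_lidar], rcond=None)  # fit at footprints
pred = coef[0] + coef[1] * texture          # wall-to-wall prediction from imagery
rmse = float(np.sqrt(np.mean((pred - canopy_h) ** 2)))
```

Real fusion products replace the linear fit with machine-learning regressions over many spectral and textural predictors, but the structure — calibrate at the sparse lidar samples, extrapolate through the continuous imagery — is the same.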
01:03:16:10 - 01:03:52:20 Speaker 5 So it likely requires multiple platforms to do this effectively. I have a very general set of advancements needed to achieve STV. Most of these involve studies to constrain laser parameters, or to determine under what circumstances we would need to use airborne measurements in combination with satellite measurements. There are some model campaigns that have flown multiple sensors and provide datasets that we should be able to work with. 01:03:53:01 - 01:04:26:15 Speaker 5 So, like NEON, which is doing vegetation with hyperspectral and lidar; ABoVE, which has an archive of measurements; SnowEx, which has been doing both radar and lidar and photogrammetry over seasonal snow; IceBridge, which has a multi-sensor archive over land and sea ice with lots of repeat measurements. I don't know as much about ARCSIX and BioSCape, and then there is a CASALS airborne demonstration that should happen sometime in the next year. 01:04:28:08 - 01:04:43:17 Speaker 5 So my summary slides are fairly generic, but I would like people to join me as much as they can for the breakout group. 01:04:43:17 - 01:05:11:05 Speaker 1 Excellent. Thanks, Ben. Just a reminder for those in the room: for lunch, if you would like to order lunch in advance, to make sure you can actually eat lunch during lunch, there are some menus up beside Andrea — there are some menus around; just turn in your choice to the desk out front before the end of the break. 01:05:11:05 - 01:05:21:16 Speaker 1 So 10:15 or so — and yeah, where you got your name tag. Okay. Next speaker is Curtis Padgett. 01:05:23:05 - 01:05:27:13 Speaker 2 Yes, I'll be doing it remotely — stymied by the traffic. 01:05:27:19 - 01:05:35:04 Speaker 1 So, Curtis, do you want me to share the slides from here, or are you going to do it from where you are? 01:05:35:04 - 01:05:43:00 Speaker 2 Please go ahead. Okay. Next, I guess. 01:05:44:16 - 01:05:57:24 Speaker 1 Yeah, just trying to get it up here. Stand by.
01:05:57:24 - 01:06:51:02 Speaker 2 Okay. So Mel is actually the lead on the stereo imaging breakout session. Unfortunately — or fortunately — she's on leave, and I will try my best to replace her. And this is stereo imaging, or stereo photogrammetry, whatever we want to call it. Next — the current capabilities. These are commercial satellite providers that can be used for stereo imaging. 01:06:51:02 - 01:07:36:15 Speaker 2 There's a wide variety of them. I'm not going to go into detail on any particular one, other than: for the money that we have, you can buy typical imagery, you can task some of the satellites, and cost there would be a factor. But they provide a wide variety of wavelengths — multi- and hyperspectral — and different resolutions on the ground, all the way down to 0.3 meters on some of these. 01:07:37:03 - 01:08:21:08 Speaker 2 Most of them, of course, aren't going to be that good, but you can get very good coverage of most of the Earth with these systems. Next. There are a lot of government satellites that are still available to us — these are mostly the U.S. ones. I didn't cover other countries, but I would assume that some of this imagery can be obtained and used. 01:08:22:15 - 01:09:47:20 Speaker 2 And once again, they cover most of the globe at varying resolutions. Next. So I just have two emerging capabilities. The first, developed at JPL for NASA, is basically a structure-from-motion sensor that flies on airborne platforms and currently generates two swaths of 12 by 3 kilometers. It can be used for multi-view stereo imagery, and depending on the platform and what your needs are, you can of course fly closer or farther away, depending on whether you want area coverage or higher precision. And the other one, I believe, is scheduled to launch either next year — it might have been delayed a year — or late this year.
01:09:48:17 - 01:10:28:07 Speaker 2 The second, I guess, is what I would call a structure-from-motion-style system that can generate that kind of quality measurement covering most of the globe as well. And I think it actually has a higher resolution for other applications, but for science applications I believe it would start at 0.5 meters. Next. 01:10:28:07 - 01:11:24:08 Speaker 2 So this one is a very generic slide. I think stereo imaging can provide decent measurements for a variety of the science applications — solid earth and vegetation structure; both of those it can handle, with some limitations. It's typically well utilized for understanding surface structure and the canopy — at least the top of it. For water surfaces and bathymetry, stereo has more difficulty. 01:11:24:08 - 01:12:06:02 Speaker 2 It's not typically great on snow or water, and bathymetry is sort of shallow-water bathymetry, if the water is clear. There are possible avenues for use of stereo imaging there, but it would certainly not be universal. And the reason for this is that it has a limited view through canopy vegetation, and typically, if you want to understand structure, you would need to pair it with other sensors that can actually see the surface. 01:12:07:20 - 01:12:36:22 Speaker 2 And then it has, of course, availability issues: lighting and visibility limit its applicability in cloudy regions, and, depending on the orbital dynamics for orbital platforms, atmospheric issues might also cause difficulties with precise measurements. 01:12:38:23 - 01:13:49:12 Speaker 2 Next. So, performance: a lot of the satellites available — the commercial satellites, and even most of the government systems — are designed to take imagery, not necessarily stereo imagery. So using them, or multiple platforms, to try to recover ground structure has its difficulties.
And you need to be able to take into account the different spatial resolutions and non-ideal geometries, and there are certainly difficulties in understanding the orientations of the particular images. 01:13:51:05 - 01:15:04:20 Speaker 2 And a related phenomenon is that there's a great deal of uncertainty as to how precise the measurements are: given that multiple platforms might be imaging the same ground with differing resolutions, it's quite hard to work out what the actual accuracy of the final result is. Visibility is also a driver here. Most of the stereo imaging, of course, is going to be focused on the sunlit side, and vegetation and clouds can limit the utility of the measurements and make it very difficult to understand what the structure is. 01:15:04:20 - 01:15:56:07 Speaker 2 You can see through haze, etc., but your measurements might degrade, and your accuracy then is less well understood. And then you do have atmospheric effects as well that reduce the precision of your measurements. And then finally, I think we need better publicly available algorithms to meet accuracy requirements, and to understand what the accuracy is for the various imagery that's used to recover structure. 01:15:56:07 - 01:16:59:16 Speaker 2 And there needs to be a more consistent way to understand the geo-registration errors as well. So those are advances that could greatly improve the utility of stereo imaging measurements. Next — strengths. I think one of the most significant strengths is the long historical record of imagery. Stereo imaging has been used for at least 100 years to understand structure, and it has the longest historical background in terms of data availability. 01:16:59:16 - 01:17:42:07 Speaker 2 So you can do longer studies on changes. A second strength is that you can do high-precision measurements relatively cheaply. Airborne platforms like quadrotors allow you to do high-precision local measurements down to two centimeters or even better.
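The accuracy question for stereo has a standard first-order answer: vertical precision scales as the height-to-baseline ratio, times the ground sample distance, times the image-matching precision in pixels. A rule-of-thumb calculator — the example numbers are illustrative, not any specific satellite's:

```python
def stereo_height_sigma_m(altitude_m, baseline_m, gsd_m, match_sigma_px):
    """First-order stereo vertical precision:
    sigma_h ~ (H / B) * GSD * matching precision (pixels)."""
    return (altitude_m / baseline_m) * gsd_m * match_sigma_px

# e.g. 500 km altitude, 300 km baseline (B/H ~ 0.6), 0.5 m GSD, and
# 0.3-pixel matching: expect roughly 0.25 m vertical precision
sigma_h = stereo_height_sigma_m(500e3, 300e3, 0.5, 0.3)
```

The same formula shows why a low-flying quadrotor with a centimeter-scale GSD reaches centimeter vertical precision, and why mixing images with different GSDs and uncharacterized geometries makes the final error budget hard to state.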
And you can do that for as little as, you know, $1,000, to go out and measure some local phenomenon. 01:17:42:07 - 01:18:24:02 Speaker 2 Higher-altitude platforms can cover more area, and imagery is typically easy for humans to understand; it provides great context for any other modality you might use to recover surface structure. And there are a lot of algorithms, publicly available or for purchase in the commercial domain, that will help you generate 3D structure from imagery. 01:18:24:07 - 01:19:16:11 Speaker 2 And they do a reliably good job. What they don't necessarily do is provide you an accuracy, and it's not clear that most of them actually achieve the best achievable accuracy with multiple images — multiple views of the same scene. Weaknesses: as already mentioned, it's difficult to see through heavy clouds and foliage to the ground, which makes it difficult to measure the structure. 01:19:16:11 - 01:20:18:01 Speaker 2 There is also difficulty if the platform is far from the target: the farther away you go, the greater the atmospheric influence on the light, and that reduces your precision and the accuracy of the measurement. For most of the wavelengths we're interested in for stereo imaging, it's very difficult to use on the dark side, and snow, rain, ice, and water are not really good candidates for stereo imaging, at least from orbital platforms — airborne platforms that are specially designed can deal with that. 01:20:18:01 - 01:21:11:01 Speaker 2 But, you know, that's a more local solution. And I think there's significantly more processing involved in getting to structure from imagery than for the other products, radar and lidar. So understanding how to go from collecting your images to actually getting a surface structure takes a lot of work, and most likely involves supercomputers and extensive processing.
01:21:11:01 - 01:22:07:24 Speaker 2 Next — synergies. I think overall stereo imaging is a great candidate for use with any of the other sensors that we might use for structure. It provides easy-to-understand context, and there are other things you can do with the images themselves besides generating a point cloud or a DEM: they can be analyzed locally and segmented to understand more about the scene, in ways that can't typically be achieved with either lidar or radar measurements. 01:22:09:15 - 01:23:05:05 Speaker 2 And I think one of their strengths is a long pedigree in space, so there are all sorts of options in terms of sensors for a mission concept. Next. So, the needed advancements — I think the bigger ones are what happens when conditions are unfavorable for the imaging sensors, either the night side or dense clouds; it's very hard to actually utilize or generate measurements from the images. 01:23:05:21 - 01:24:06:20 Speaker 2 And so there would be gaps there. There's also — and I think this goes for all the sensors — how do you establish precise geo-registration? And then finally, I think there is a significant gap in how you do multi-image, multi-view stereo imaging. Not all the algorithms that are available can handle it; certainly some of the commercial products that are typically used don't reliably utilize all the information to generate the most precise measurement. Some things that could be considered: 01:24:06:20 - 01:24:52:02 Speaker 2 pair the measurements with other sensors; use other wavelengths to try to see through some of the visibility reducers. And of course you can always wait until conditions improve, but that does impact your availability. And then finally, you can provide a database of global landmarks to improve your ability to geo-register your data. Next — needed experiments. I didn't really go into this.
01:24:53:06 - 01:25:33:00 Speaker 2 There are a number of things you could do to validate any of the techniques, but I think that's more of a workshop exercise. And I don't have any particular techniques that I think should be followed — I think there are a number of different ways we could do this, and there's a lot of flexibility in terms of generating a mission design to accomplish it. 01:25:33:00 - 01:26:23:15 Speaker 2 So I left that sort of unpopulated right now. And next — just to sum up: stereo imaging provides wide-area coverage at high resolution from orbital and airborne assets. It has the most extensive data in terms of time and coverage, and sort of the one thing that is missing is understanding, when we do use stereo imaging, what sort of accuracy we get, and how we determine that. 01:26:25:02 - 01:26:33:21 Speaker 2 And that is all I have right now; I look forward to the workshop. 01:26:35:10 - 01:26:50:23 Speaker 1 Okay. Thank you, Curtis. So that brings us to our break. But before you stand up and walk away, Andrea has a couple of announcements. 01:26:50:23 - 01:27:06:17 Speaker 3 So the first is that Curtis couldn't make it all the way here because of traffic, but he will be here for the breakout session. So just so you know, there will be a real person in the room running it, and we'll see and meet Curtis soon. The second is that we're going to do a group photo. 01:27:07:15 - 01:27:30:18 Speaker 3 So right before lunch, we're going to go scope out and see if the stairs over here are busy with a meeting or not, because that would be a nice place to do it — and it's not raining, hopefully. So don't go to lunch then, and make sure you turn your menus in. And I want to try, for the online people, to have you all turn on your cameras and do some screenshots, so we can add that into the group photo. 01:27:32:03 - 01:27:35:02 Speaker 3 Anyway, we'll see how that works.
That will all be just before lunch. 01:27:36:15 - 01:27:59:20 Speaker 1 Okay. So we will start back up in 20 minutes, at 10:20. Try to beat Bill Dietrich to the coffee station or there'll be nothing left, and then we're going to have a panel after that. So our first presenter is Matt Fladeland. Matt, please go ahead. 01:28:01:07 - 01:28:32:13 Speaker 6 Thanks very much. So I'm going to be talking about the platform group, and I think this figure is really indicative of how important platforms are for STV. I guess the first chart I'm going to show is kind of a historical chart that came out of ESTO in the early 2000s, and the idea was, hey, we really have to look at how we integrate measurements across all of these vantage points and recognize the value of each of these vantage points for the different science goals that we have. 01:28:32:13 - 01:28:53:16 Speaker 6 So I think this one's really important for this group to look at. This is a great summary of the different types of platforms that Earth scientists use and where they're deployed, and the right-hand side kind of shows what their various roles are. I'm going to focus on aircraft, mostly because that's what I know best; we're not really to the point in this community where we can say, hey, this is the satellite bus we need. 01:28:54:08 - 01:29:12:04 Speaker 6 I'm also not going to focus on ground stuff so much, just because of the limited coverage, but I'm not going to completely ignore it. So let me go on now. This is a fictional hangar that has all of our aircraft to scale. These are the science aircraft that NASA uses, and there are a few historical ones on here. 01:29:12:04 - 01:29:31:15 Speaker 6 You'll see the Global Hawk on there. SIERRA is the small UAV that we operate out of Ames. On the left you'll see the ER-2; that's the high flier that goes to 70,000 feet. And on the right you have the big jumbo jet, the DC-8.
The other reason I'm going to bring that up is because the DC-8, by the time STV is up, is going to be long gone. 01:29:32:00 - 01:29:55:08 Speaker 6 The DC-8 is retiring next year, and we're currently bringing up a 777 that we hope will begin operations in '25. So the 777 is going to be essentially a world-class resource that's going to have global coverage. And so something I think this community should definitely be thinking about is how we load that plane up with lidars and radars and imagers and do in-air comparisons on the same platform. 01:29:56:04 - 01:30:13:05 Speaker 6 Besides that, the other kind of trend in our fleet is that we're moving towards Gulfstreams. The business jets have long legs; they can stay up a long time. Right now we have two nadir ports, on our G-III and G-V, and then we're also working on this G-IV that I'll show in a second. 01:30:13:23 - 01:30:38:19 Speaker 6 So just to kind of demonstrate the breadth of the geographic coverage, this is a map showing UAVSAR flights since 2008. You'll see California as a major target; the faults here we fly repeatedly. But it also, I guess, demonstrates over the last 15 or so years that even when you're flying 500 hours a year, there's limited coverage from aircraft. 01:30:38:22 - 01:31:02:00 Speaker 6 And so it's really critical when you're talking about airplanes to focus your measurements on the places that aircraft are well suited to, as opposed to more systematic global measurements. I just added this one quickly because it was brought up earlier by Yunling. This is the G-IV that's currently at Langley Research Center. It's being modded by a team out of Armstrong, and then it will come back out West. 01:31:02:24 - 01:31:32:11 Speaker 6 This is still notional, but this is likely what it's going to look like.
The other reason I bring it up is that, in addition to the ability to carry multiple radars, what we're hoping we can do is also put a lidar port in here so that we could do coincident lidar and radar collects. So that may be something this community suggests to my boss, Bruce Tagg: that for STV we would really like to have a multi-mission, next-gen UAVSAR platform that allows for more than just radar. 01:31:33:15 - 01:31:58:24 Speaker 6 So let me just switch gears here. This is another way to look at our planes: an endurance-and-altitude matrix that also overlays range and payload. I include this mainly as a tool as the teams are thinking about, hey, what's the right aircraft for the work that I'm doing with the payload that I have; this is a way to kind of home in on the capabilities. 01:31:58:24 - 01:32:23:05 Speaker 6 It also shows how we try to maintain access to a fleet with a variety of capabilities: low, medium, and high altitude, and longer endurance. The other thing, I guess, that I find really exciting about STV is that because of its long time frame, it really is going to open up the architecture to some new technologies that are only now being matured. 01:32:23:16 - 01:32:55:11 Speaker 6 So note this is altitude and endurance, but the endurance axis spans a large range. And this kind of demonstrates the really interesting new capabilities of, if you want to call them, HALE UAVs, high-altitude long-endurance UAVs, or high-altitude platform systems or pseudo-satellites. I think people are converging on HAPS as the name for these. But the thing to note here is that a lot of the operational HAPS have a very, very tiny payload mass, about 10 pounds or so. 01:32:55:11 - 01:33:19:23 Speaker 6 There are exceptions, and I'll talk about those. But this is kind of a way to see the really vast spread in capabilities across uncrewed systems.
So these are some of the HAPS platforms that I'm tracking. That's one of my jobs in the Airborne Science Program, kind of being the uncrewed-systems lead, and I also have a couple of different projects that are looking at demonstrating these capabilities for science. 01:33:20:18 - 01:33:41:05 Speaker 6 At the top is probably one of the larger ones; that's the HAPSMobile Sunglider. HAPSMobile is a partnership between AeroVironment and SoftBank, and the Sunglider resulted from their first five-year partnership. They just inked a new deal for another five years to build a second version of this, so we're all very anxious to see what that's going to look like. 01:33:41:24 - 01:34:00:20 Speaker 6 But this did get into the stratosphere, so you can consider it a TRL 7 or 8 platform. Sceye is a really interesting new airship that's operating out of New Mexico. This one is another TRL 7 or 8; it can take off autonomously, go all the way to the stratosphere, and come back down. It has significant payload capacity. 01:34:01:15 - 01:34:20:21 Speaker 6 So I think this is one that people should be considering. We just flew our imaging spectrometer on it last week; that was a USGS payload that we made arrangements for. It also has a partnership with New Mexico to look at methane leaks. Then the Aerostar Thunderhead balloon. You may have heard of the Google Loon project. 01:34:21:09 - 01:34:39:14 Speaker 6 Google partnered with Aerostar to develop these station-seeking balloons. These are balloons that have an AI-driven autopilot that models winds at different altitudes, and it changes altitude to find the winds that'll take it back to where it wants to go. So it never really stays over the spot you want, but it'll stay around the spot you want. 01:34:40:14 - 01:35:03:06 Speaker 6 The Zephyr is the most mature HAPS system out there. It's been flying since the early 2000s.
European countries have been exploiting it, but again, it has a very, very small payload. The slide says 10 to 20 pounds; that's not true, I'm sorry about that, it's ten or less. Prismatic's system, that one's gone a little bit dark; I haven't heard from them in several years. 01:35:03:06 - 01:35:22:08 Speaker 6 I know they're flying, but they're probably flying for the other side of the fence. It's a very capable platform, and as soon as it's available for civilian use, we're going to jump on that one. And then the Swift is one that I'm involved in; that's an SBIR-funded effort to develop a small-scale prototype that can carry about 10 pounds. 01:35:22:20 - 01:35:44:13 Speaker 6 Here is a cool picture of Sceye launching, just from last week. The thing about this airship that makes it unique is the material, which is able to actually capture that helium and not let it leak. Most airships leak helium, and that's a very expensive proposition that also limits your endurance. So this is one to watch. The other one just popped up; 01:35:44:13 - 01:36:03:11 Speaker 6 I actually just talked to them this morning. That's Kea Aerospace, the Atmos. This is their large-scale version that they haven't built yet, but they are flying a small-scale version. They've got about ten flights, and they're looking to get into the stratosphere very soon. This will again carry about 10 pounds in that pod. 01:36:03:11 - 01:36:25:05 Speaker 6 I'll just talk briefly about a couple. So this is the Swift Engineering mission. The Forest Service has come in to fund a Phase III, a follow-on from the prototype development, to carry an IR sensor. The goal is to launch out of the spaceport, fly over the Gila forest, and just sit over there for a week or more and watch fires as they progress from ignition through their spread. 01:36:26:01 - 01:36:46:00 Speaker 6 So think of it as a sit-and-stare capability. It can just continually map and provide data as you need it.
And the goal here is to provide about a five-meter pixel for fire detection; there's a sensor system from Sensory Labs that's under development there. The other one that we just funded is the Electra. 01:36:46:19 - 01:37:09:20 Speaker 6 This is a company out of Manassas, or Falls Church, and they're developing a solar-electric plane, fixed wing. You'll see they all kind of look the same and they all kind of have the same capabilities, and that's just because of the physics, in terms of how much power density the batteries have and how efficient the solar panels can be. They will hopefully be getting into the stratosphere with this vehicle. 01:37:09:20 - 01:37:29:12 Speaker 6 We are also using the Aerostar balloon. We're going to be doing a demonstration with the Forest Service this next year to carry an LTE payload, and the goal there is to provide LTE connectivity to the fire camps, but also to test out the capability of this platform for doing the types of science that we want to do. 01:37:29:12 - 01:38:07:09 Speaker 6 I didn't mention the payload capacity on this, but the gondola is about 125 pounds after you take away the structure and wires, so fairly significant. I should also note that they're not constrained by volume: a large-aperture radar that falls within 80 pounds, they could still fly. So that's one really neat capability of the Aerostar; they're not volume limited. Okay. The other thing I wanted to mention related to HAPS: it's very exciting, but we're really early, both in the maturity of the technology and in the policies and procedures that are needed to make use of these vehicles. 01:38:07:21 - 01:38:27:09 Speaker 6 So the fixed-wing UAS, I think, are going to be able to pull this off over an area.
Unfortunately, the FAA has not come up with a great way to get those into our airspace yet, and we're asking all the time. There is a part of NASA that's working on upper Class E traffic management systems 01:38:29:09 - 01:38:50:02 Speaker 6 for that, but that's not mature yet; it will mature over the next four or five years. That's also why balloons and airships are valuable right now: as long as they collapse while in flight and come straight down, they fly under balloon rules, and they can get access to the airspace that our other systems can't. 01:38:51:03 - 01:39:05:19 Speaker 6 I should have put this with the other one, but this is the Strabo project overview that shows the backhaul, so that you can then provide high-bandwidth satcom. And then we're looking at maybe adding Starlink, so we don't have to worry about spectrum approvals and tethering to a line-of-sight station. 01:39:06:18 - 01:39:30:04 Speaker 6 The other thing I'll mention before I get into the meat of this, just for background, is sUAS; there hasn't been a lot of discussion of that. But this is a system we recently developed through SBIR and proved, and it is now in operations with the USGS. This is a catapult-launched small UAS that can carry about five or six pounds; we've outfitted it with thermal and IR cameras as well as gas-sensing instruments. 01:39:30:04 - 01:39:51:12 Speaker 6 The picture you see on the left is a mosaic of Makushin Volcano showing the IR collects overlain on the visible imagery. And for this one, the picture on the bottom shows the airspace: we actually launched from Dutch Harbor Airport, flew beyond visual line of sight to the volcano, and came back, about a 70-minute mission.
01:39:52:04 - 01:40:11:04 Speaker 6 But I think that should be one thing in the tool kit: looking at how we use all these different platforms for specific needs. All right. So I guess now to the teaser for the breakout later. These are some of the questions that I think need to be addressed for our platform group to be successful. 01:40:11:04 - 01:40:32:12 Speaker 6 So obviously: what are the payloads, and how do you want to fly them? What kind of mass, power, data storage, and telemetry requirements are there? Onboard our aircraft we have lots of good computers and we have satcom, but maybe it's not enough. Maybe we need GPU clusters; maybe we need laser comm. Those are the kinds of things that we want to know, because then we can inform the Airborne Science Program as to what equipment is going to be needed to 01:40:32:12 - 01:41:09:16 Speaker 6 support your science. Also latency, which gets into the telemetry and satcom, and then the interoperability between these different systems: do we need to fly multiple systems, and how do we want to order them and stack them? So some of the sub-goals are really looking throughout the whole lifecycle, from now until operations, trying to understand where aircraft come in, whether it's instrument testing, cal/val, and then ultimately algorithm development. And I guess what's unique about STV is that the aircraft may actually be part of the mission, as opposed to just a satellite with the aircraft helping. 01:41:10:02 - 01:41:25:10 Speaker 6 I think that's what I find very exciting about this mission: we're looking at aircraft as being part of a constellation. In terms of things that our group is going to do, I already have a catalog of all the aircraft, both within NASA and across the federal agencies, that we can get access to. 01:41:25:10 - 01:41:47:22 Speaker 6 So hopefully you'll see this group as a resource for that.
But then also helping to support the development of the flight campaigns, both to look at science goals and technology maturation. Here are some of the capability gaps that we would consider, or that we can discuss; I don't think I need to walk through all of these, but of course: what are the different roles of these systems? 01:41:48:14 - 01:42:07:24 Speaker 6 Are ground sensors going to be part of this? I haven't heard too much about that, but I know within the disaster and fire community more and more people are linking up video cameras that have really interesting capabilities, whether fixed or scanning, and I assume networked ground sensors are going to become more ubiquitous. 01:42:07:24 - 01:42:32:14 Speaker 6 So that may be something to consider. And then, from the standpoint of OSSEs, most OSSEs really only take into account satellite observations; I haven't seen a lot of options for including aircraft observations. So that might be an interesting area where the platform group works with the OSSE and architecture groups to try to do a better job of capturing the pros and cons of adding aircraft to a constellation. 01:42:33:21 - 01:42:55:15 Speaker 6 And then, I mentioned this before, but data processing and telemetry: we really want to understand what those requirements are and whether they are sufficient for what this group wants to do. We've talked about airborne campaigns, and I think that's going to be a really important step over the next five years, as is really understanding the capabilities and limitations of these different systems, ideally by flying them together. 01:42:55:15 - 01:43:09:24 Speaker 6 As I mentioned, I think our G-V and G-III are really good for that, and the 777 will be good for that in terms of having multiple locations to set up these instruments and do in-air comparisons. 01:43:10:12 - 01:43:11:16 Speaker 6 I guess the other input that we need,
01:43:12:06 - 01:43:34:07 Speaker 6 And this was discussed yesterday, I think in the coastal area: really understanding what are these areas where we need that high repeat and high resolution, where it makes sense to think about HAPS as having a role. Volcanoes change really quickly, on a daily basis; that would be an obvious one. 01:43:34:07 - 01:44:00:09 Speaker 6 Looking at deforestation, and pre- and post-hurricane landslides, would be another one. But, you know, thinking about what those special areas are that really need more attention. And I guess Operation IceBridge was brought up yesterday; I was involved in the early formulation of that project, and that was really the thrust of our effort when we looked at airborne scoping. 01:44:00:09 - 01:44:18:13 Speaker 6 So there was a smallsat group that was looking at, can we do a smallsat in between ICESat and ICESat-2, and then there was the airborne one. We knew our aircraft could not possibly do global coverage. So it really became a question of, based on the instruments and aircraft that we have and the range and the basing locations, what are the critical things for which we absolutely need 01:44:18:13 - 01:44:57:06 Speaker 6 time-series measurements? That included sea ice. That included these mountain glaciers that were mentioned yesterday, which ICESat isn't good at. And so I think this community needs to do something similar in terms of focusing on the specific phenomena or regions that are better suited to these aircraft-type measurements. So in summary, I'm really looking forward to the workshop breakout this afternoon to get more information, see where the interests are, and hopefully develop a plan for how we can mature some of these platforms to serve this community.
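The point that even 500 flight hours a year buys limited coverage can be made concrete with a back-of-envelope calculation. The 500 hours/year figure is from the talk; the cruise speed, swath width, and on-survey fraction below are illustrative assumptions for a lidar-equipped jet, not program numbers.

```python
# Back-of-envelope airborne mapping coverage with illustrative numbers.

def annual_coverage_km2(hours, speed_kmh, swath_km, survey_fraction=0.7):
    """Area mapped per year: on-survey time x ground speed x swath width.
    survey_fraction discounts transit to and from the survey area."""
    return hours * survey_fraction * speed_kmh * swath_km

# Assumed: 700 km/h cruise, 2 km lidar swath, 70% of hours on-survey.
aircraft = annual_coverage_km2(hours=500, speed_kmh=700, swath_km=2)
land_km2 = 149e6  # Earth's land area, roughly 149 million km^2
print(f"~{aircraft:,.0f} km^2/yr, {100 * aircraft / land_km2:.2f}% of global land")
```

Under these assumptions the aircraft maps well under one percent of global land per year, which is the quantitative reason for reserving aircraft for targeted, high-repeat regions rather than systematic global coverage.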
01:44:57:06 - 01:45:21:18 Speaker 6 Again, these game-changing technologies, I think, are something that a lot of missions aren't able to include in their plan. So I think it puts us at a really interesting advantage to do the types of measurements that really haven't been acquired before. And I'll end with something that helps, I think, to get a sense for the scale differences between orbital measurements and, say, 01:45:21:18 - 01:45:44:01 Speaker 6 a HAPS. This would be the view if you were looking out the window of ICESat-2 at us right now, and that little blue box is what you would see if you were at 60,000 feet. So that's a pretty big scale mismatch. Well, not a mismatch, but the HAPS systems, 01:45:44:01 - 01:45:56:22 Speaker 6 while they are very high altitude, are still very low altitude compared to spacecraft. So they're going to have a much smaller swath and coverage area. So again, I think the important element there is just to make sure that they're targeted in the right spot. 01:45:58:04 - 01:45:59:09 Speaker 2 All right. That's it. 01:45:59:17 - 01:46:05:02 Speaker 6 Thanks so much. 01:46:05:02 - 01:46:31:15 Speaker 1 Thanks. That's a lot of toys; makes me want to come work in your group, maybe. So our next presentation is going to be a panel. We're going to have Joe and Mark kind of set up the panel, and then, so we don't blind our panel participants, the display will go off. 01:46:33:13 - 01:46:58:24 Speaker 2 All right. Thanks for sharing with us. So for our panel, we have a distinguished group here: Keith Krause, Sassan Saatchi, David Shean, Lori Magruder, and Robert Treuhaft. And if I could ask them to come up here, just so that when we transition, we go quickly. So this panel is about the separability of vegetation from the surface topography.
01:46:58:24 - 01:47:23:18 Speaker 2 This is one of the challenges we recognize for STV. I have a couple of examples just to prime the pipe here. This is from QUAKES-I: stereo imaging of Mammoth Mountain. In a very small region of interest there's a terrain model and, of course, the color imagery, and then there's an estimated surface below it. 01:47:24:05 - 01:47:59:20 Speaker 2 And they have had some success with measuring vegetation height against areas that don't have vegetation, for reasons like fire scars. For lidar, of course, as we discussed in some of the breakouts yesterday, there are vegetation signatures to be had that separate from the ground. And for radar, you know, there are Robert Treuhaft's studies of long-term changes of forest structure using InSAR. 01:47:59:20 - 01:48:26:19 Speaker 2 But we recognize there are trades in these technologies: trades in field of view, trades with coverage; clouds interfere; you need good lighting for stereo, so day versus night for imaging is important in general. You have to think about the spatial resolution of sampling versus the height resolution, and then the success of penetrating vegetation. The cross-track field of view of your sensor: 01:48:26:23 - 01:48:52:24 Speaker 2 is it a kilometer, or can you get dozens to hundreds of kilometers in a swath? Because that cuts at the global coverage needs. How sensitive are you to the observational conditions, and when does it fail? The automatability of the approaches we might consider for doing the separation in a robust, reliable, quantified way. And some of the technologies have other considerations: 01:48:52:24 - 01:49:25:17 Speaker 2 imaging is passive, which is nice in that sense, but others, of course, may suffer from one-over-R-squared effects, which drives you to lower orbits.
And there are sometimes considerations for cross-compatibility with other themes of the whole program, such as bathymetry versus vegetation: in lidar that drives you to one wavelength versus another. You really don't want to be at 532, 01:49:26:01 - 01:49:53:23 Speaker 2 ironically the green laser, for green vegetation; you want to be at 1064, but then you're not going to have a good time with bathymetry. And, as was described, there's the whole world of multimodality and data fusion: how do we blend these techniques? So a couple of the panelists gave me a few slides, and I just want to show them; I think this helps sharpen the pencil a little bit for our panel discussion. 01:49:53:23 - 01:50:23:16 Speaker 2 The idea is that the vegetation structure has a low correlation to the underlying surface topography. So just doing interpolation may not always be a successful approach for inferring the missing terrain underneath the vegetation, or the vegetation height. So that's a consideration. I won't have these slides up there, but we can return to these questions 01:50:24:15 - 01:50:44:05 Speaker 2 on the laptop during the panel. Keith and Jeff also brought up some material here, but let's transition to the panel, because I worry about our time. So we'll turn off the projector, because I noticed they 01:50:44:05 - 01:50:47:15 Speaker 3 didn't work properly. 01:50:49:10 - 01:51:56:14 Speaker 2 All right. Thank you, Skip. May the panelists please join us here? So why don't we have each of you introduce yourself very briefly and provide some initial thoughts about this topic of vegetation separation. Hi everyone. I'm Keith Krause with Battelle Memorial Institute.
Back in 2010, I joined the airborne remote sensing team at NEON, and I mostly focused on waveform lidar for vegetation structure. 01:51:57:10 - 01:52:25:12 Speaker 2 As part of the STV-funded project we have right now, we're evaluating existing data and also doing some simulations to look at this problem a little bit more: what are the instrument and collection needs as far as capturing canopy detail and understory detail, and penetrating down to the ground? In general, some of these concepts have been brought up in earlier talks, things like canopy penetration but also classification of vegetation versus ground. 01:52:25:21 - 01:52:56:21 Speaker 2 But I think it's also a question of the spatial sampling. So for instance, how many points on the ground do you actually need in order to properly interpolate a surface model at some resolution? I think these are still questions that haven't really been fully vetted. A lot of times, especially with NEON, you'll just go ahead and run automated processing, and it'll interpolate over pretty large distances, and nobody actually goes back to look at how well that's doing or where the errors are. 01:52:57:07 - 01:53:24:15 Speaker 3 Lori Magruder from the University of Texas at Austin. I'm a lidar enthusiast, I guess, and I worked on ICESat and ICESat-2 in the realm of calibration and proving out the quality of the data. My team at UT has also worked a lot on airborne systems, and on scaling up what we learn from airborne systems, lidar systems from waveform to photon counting, to possible space-based opportunities. 01:53:24:15 - 01:53:51:13 Speaker 3 But what I've also been really enthusiastic about is the fusion of data, in the sense of things like taking the TanDEM-X-derived DEMs that we've heard about, which are biased in the canopy, and using ICESat-2 as a calibration factor correlated to that environment.
Then through machine learning we can predict and correct those areas where TanDEM-X is not measuring the ground. 01:53:51:13 - 01:54:18:02 Speaker 3 So you're kind of getting the wider coverage of TanDEM-X with a way to make it a better product. And then for bathymetry we do something similar: if we want the wide spatial coverage of imagery, we use ICESat-2 measurements as ground control points to get the absolute depth. So I think that's really a strong thing to consider in this, to combine the advantages of each type of measurement. 01:54:18:02 - 01:54:44:21 Speaker 7 Sassan Saatchi from JPL. I've been working on this problem of vegetation structure and ground for a while now, trying to see how we can separate those. I put three questions in my slide, so I'm going to repeat those questions. I think when you look at the literature and what has been done, especially in the vegetation community, you see two things. 01:54:44:21 - 01:55:08:15 Speaker 7 One, the more we look at the land and the forest and where trees are, the more we find the problem is more complex than can be easily solved, either analytically or with one instrument. And at the same time, I think as we look at our instruments, we find that we have some solutions. 01:55:09:18 - 01:55:47:07 Speaker 7 They might be simplistic, but they have solved some of the problems; SRTM was the example that was mentioned, where we were able to do some level of topography globally. So there are three questions that come to my mind that I would like us to address. One is: with our existing systems, a SAR system that does wall-to-wall mapping globally, tomographic measurements that are now being implemented in space, and the lidar systems we have now, which mostly do sampling globally, can we really solve this problem of separating vegetation?
The second question is on complexity. We really cannot solve this problem just in the old-fashioned analytical way, because there are too many types of algorithms. But can AI and machine learning help us tackle this complexity by bringing the data together and letting a different kind of system learn and solve this problem over large scales? 01:56:28:05 - 01:56:53:12 Speaker 7 The third question, Andrea brought it up: we've been looking a lot at the static problem, the state of the vegetation structure and therefore the surface topography. Can a system that does a really good job of mapping the state of the system, such as SRTM, or a system that does enough sampling, really solve the problem? 01:56:54:01 - 01:57:12:06 Speaker 7 Can the same system do the dynamics? Is there a way to build an architecture that really addresses the dynamics at the same time that it does a good job of mapping, for the first time, the state of the system? Thanks. 01:57:12:06 - 01:57:41:00 Speaker 5 Thanks, everybody. I'm David Shean at the University of Washington, and I guess I am responsible for stereo imaging and thinking about vegetation-ground classification. And my job is pretty easy: we all know that stereo can't give us ground returns, right? I see people smiling. I mean, how many people have used some commercial structure-from-motion software, like Metashape or Pix4Dmapper, 01:57:41:00 - 01:57:52:07 Speaker 5 to look at some vegetated areas? I'm just curious. Johannes showed an example of that. 01:57:52:07 - 01:57:58:21 Speaker 5 How many people have tried the classification? How many people are happy with the classification? 01:57:59:13 - 01:58:04:00 Speaker 2 Okay, a few people.
Okay. So it is possible to do this. 01:58:04:00 - 01:58:14:15 Speaker 5 There are commercial capabilities out there that are doing this right now, and your choice of classification routine makes all the difference. So they're using the RGB 01:58:14:16 - 01:58:29:04 Speaker 5 values from the images as well as the three-dimensional point information and a number of other assumptions. Okay, so I could say more about that, but I just wanted to add that some people are aware of the problems; for those who aren't, it is possible to do this. 01:58:29:04 - 01:58:47:03 Speaker 2 Yeah, just a caveat there: I'm happy with the classifications given my understanding of what the limitations are. I think they do pretty well, given what you're actually using to do it with. 01:58:47:17 - 01:58:54:14 Speaker 5 So yeah, there are a lot of caveats. Certain types of vegetation work better, so there's more work that needs to be done. Right. 01:58:55:20 - 01:59:10:19 Speaker 5 I don't want to spend too much time, but I just wanted to get a quick poll. So I'm mostly a cryosphere scientist thinking about exposed areas, so vegetation has always been my noise. But in the past couple of years, as part of this project, I've been working with the folks at Goddard, 01:59:11:03 - 01:59:19:22 Speaker 5 thinking a lot more about vegetation recovery and actually looking at our ability to do satellite stereo imaging over vegetation, for canopy height and return of
01:59:20:12 - 01:59:26:04 Speaker 5 the canopy surface, as well as thinking about how big a gap we need to actually see the ground, 01:59:26:04 - 01:59:35:23 Speaker 5 and what imaging acquisition parameters we need to actually do this. It is possible, right? And there's a lot we need to do, as others have mentioned, with deep learning. 01:59:35:23 - 01:59:47:10 Speaker 5 So we're developing routines and training models on airborne lidar data, where we have very good ground returns, and then we can do this from satellite imagery. 01:59:48:12 - 01:59:53:04 Speaker 5 And I think fusion is really the answer; we've heard that over and over again today. I can talk more about it. 01:59:57:04 - 02:00:16:14 Speaker 5 If you give me a couple of sparse wide beams, and you give me a handful of sparse ground returns, I'm confident that we can actually do some intelligent interpolation and extrapolation from those, using what we can get from the canopy and some sparse surface returns from stereo imaging. 02:00:17:04 - 02:00:21:20 Speaker 5 So I'll leave it at that. I have other thoughts, and I can tell you about some of the things Maxar and other companies are doing; they're selling these products. I'll pass on the mic for now. Thank you. 02:00:22:03 - 02:00:53:17 Speaker 2 My name is Robert Treuhaft. I started in radar remote sensing about 20 years ago. Before that I was doing very long baseline interferometry, looking up at quasars, so I have an interferometric background. I've been at JPL for 40 years and have had several different positions during that time.
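The approach described above, classifying points using image RGB values together with three-dimensional point geometry, can be illustrated with a toy example. This is only a sketch on synthetic data with a simple nearest-centroid rule; the features, numbers, and the classifier itself are assumptions for illustration, not any commercial vendor's actual routine.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-point features: [height above local minimum (m), R, G, B]
ground = np.column_stack([rng.normal(0.1, 0.05, n), rng.normal(120, 15, n),
                          rng.normal(100, 15, n), rng.normal(80, 15, n)])
canopy = np.column_stack([rng.normal(15, 5, n), rng.normal(60, 15, n),
                          rng.normal(130, 15, n), rng.normal(60, 15, n)])

X = np.vstack([ground, canopy])
y = np.array([0] * n + [1] * n)          # 0 = ground, 1 = vegetation

# Standardize so height (meters) and 8-bit color get comparable weight,
# then classify each point by its nearest class centroid.
mu, sd = X.mean(0), X.std(0)
Z = (X - mu) / sd
centroids = np.array([Z[y == k].mean(0) for k in (0, 1)])

def classify(points):
    z = (np.atleast_2d(points) - mu) / sd
    d = np.linalg.norm(z[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# A low, brownish point vs. a tall, green-dominant point
print(classify([[0.05, 115, 95, 75], [12.0, 55, 140, 65]]))  # -> [0 1]
```

The point of the sketch is that color alone or geometry alone separates the classes poorly, while the combined feature vector separates them well, which mirrors the panel's observation that the choice of inputs to the classification makes all the difference.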
02:00:53:17 - 02:01:26:12 Speaker 2 So I feel like I haven't been in one place. As far as the question of whether we can find the ground: if you meet me at my poster, I can show you a ground bump. Interferometric SAR has been my specialty, and there are a couple of ways to find the ground. One is with multi-baseline imaging, which is what we have out there with UAVSAR. 02:01:28:02 - 02:02:03:03 Speaker 2 Another way to do it is to histogram the phases of returns that come from different spots on the ground, different looks, as we call them in radar. And there are a few other ways. But my main interest in remote sensing is being able to attribute some biophysical feature of this plot of 50 meters' worth of trees, their interactions with each other, the biology, from the remote sensing. This is coming from someone who doesn't really know what a protein is. 02:02:03:23 - 02:02:32:24 Speaker 2 But I think the really exciting thing is to get to the biology, both for its own sake and to help guide the remote sensing. So that's where my push is, and I'd be very happy to discuss the poster with you, because it goes into more detail about my interests. Thank you, everybody. We're going to open up for questions, but I'm going to kick off a question. 02:02:34:00 - 02:03:11:04 Speaker 2 So, a lot of what we just heard, at least what I heard, was that there are different kinds of inference being invoked, versus measurement, for inferring the ground. There's lidar providing a yardstick for seeing points, and then you're using imaging to fill in where the features are similar, with different intelligent interpolations, perhaps machine learning approaches. 02:03:11:20 - 02:03:37:16 Speaker 2 These are more inference than measurement. So where do we go from a science perspective?
When it's inference, do we have the ability to do a quantification of the uncertainty of that inference? When do we lose validation and trust in those measurements? David, you mentioned it works on some vegetation types and not others. 02:03:37:21 - 02:04:18:24 Speaker 2 Obviously, this is an early practice that has to evolve and mature. So, does anyone want to take on the idea of inference versus measurement and what's needed here? If I could ask: can we change the word inference to estimation? Because estimation is totally legitimate. Inference has this stigma of "we waved our hands over it," but with estimation of the type that's done with lots of different types of interferometry, you're using a physical model which has integrity, and that's not inference, that's estimation. 02:04:19:09 - 02:04:42:11 Speaker 2 So maybe I misunderstood what you meant by inference. If I could just clarify: Joe was asking for a confidence level on however we decide what that intermediate point is. So maybe it's not a measurement of that exact space, but you're interpolating. 02:04:42:15 - 02:05:14:06 Speaker 2 So I think there are lots of ways to get at it. If it's machine learning, it may be a little mysterious how we arrived at something, but is there a way to attach a confidence level to it? Can people tell the difference between one point that was measured and, let's say, tied down to the lidar, versus another one that was interpolated between spots? At the end, when we have a data product, what does the data product look like, how do people use it, and how much confidence should they have in it? 02:05:15:01 - 02:05:39:23 Speaker 2 Well, errors are key. And they come out of either physical or statistical models.
Statistical, I'd say, is what they are; maybe we don't know the underlying physics or biophysics. But then there's another set of models which are based on physical principles that are well accepted, and almost all of them carry an error analysis. It's an important focus. 02:05:40:13 - 02:05:43:18 Speaker 2 So anyway, I didn't want to focus on this. 02:05:44:13 - 02:06:13:03 Speaker 7 I think the reality is this: there are a lot of models, especially physically based models, that are extremely complex. If you come to the optics, radiative transfer, and you come to the radar, the electromagnetic models have a lot of parameters that cannot really be inverted. So you make a lot of assumptions in that estimation, and the key problem becomes uncertainty analysis. 02:06:13:14 - 02:06:41:06 Speaker 7 When we look at the uncertainty analysis, we notice that some models work. Most of the literature is actually on flat terrain, trying to retrieve height, and when you go to complex terrain, things become difficult. Some of these estimations have large uncertainty: you can estimate, but establishing the accuracy of that estimation becomes difficult. 02:06:41:07 - 02:07:07:24 Speaker 7 The same thing goes for lidar. If you look at the GEDI dataset, you notice that across the tropics, the low-power lasers hardly reach the ground to give you an estimation of the ground or the actual height. So in the tropics we only look at the high-power lasers, and even then, in areas of steep topography, a large number of those get filtered out, sometimes almost 80%. 02:07:07:24 - 02:07:30:11 Speaker 7 With GEDI we're lucky that it has billions of samples. So that's another issue. And then stereo is another thing. So I think the solution we are thinking about is really thinking out of the box: not that one model would do everything, but how do we really bring all of these things together?
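Treuhaft's phase-histogram idea from earlier, where the ground appears as a distinct "bump" below the canopy peak when per-look interferometric phases are histogrammed, can be sketched numerically. The vertical wavenumber, noise levels, and peak-picking rule below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
kz = 0.1   # assumed vertical wavenumber (rad/m), converts height to phase

# Simulate phases from many looks: most returns scatter within the canopy
# (around 25 m), but a fraction penetrate to the ground (around 0 m).
heights = np.concatenate([rng.normal(25, 4, 4000), rng.normal(0, 0.5, 1000)])
phases = kz * heights + rng.normal(0, 0.05, heights.size)   # add phase noise

# Histogram the phases; the ground shows up as a secondary "bump".
counts, edges = np.histogram(phases, bins=120)
centers = 0.5 * (edges[:-1] + edges[1:])

# Keep prominent local peaks; take the lowest as the ground estimate.
interior = counts[1:-1]
is_peak = ((interior > counts[:-2]) & (interior >= counts[2:])
           & (interior > 0.1 * counts.max()))
peaks_m = centers[1:-1][is_peak] / kz
ground_estimate = peaks_m.min()
canopy_estimate = peaks_m.max()
print(round(ground_estimate, 1), round(canopy_estimate, 1))
```

In real multi-look InSAR data the phase statistics are far messier than this two-Gaussian mixture, but the sketch shows why enough looks make a weak ground return separable from the canopy response.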
02:07:30:21 - 02:08:06:08 Speaker 7 Data fusion is going to be the key thing. And solving the problem with data fusion really requires AI systems, for the first time in our approach. Even though we all come from the analytical background, this is something that we may not be able to solve analytically. So we need to think about AI, bringing all of these systems together. I think STV would benefit from thinking about these multiple assets that we have and how to use them. 02:08:06:08 - 02:08:35:12 Speaker 3 I feel like you can get a global solution, and that's full of inference and measurement combined. And if you can identify enough locations where you have, say, high-resolution lidar as a reference, with enough diversity in environment (canopy cover, or terrain slope, or all the things that cause uncertainties), then your AI gets to model the uncertainties based on these feature predictors in the models. 02:08:35:12 - 02:08:55:11 Speaker 3 And it seems to work pretty well, and you don't actually have to have a DEM: you can pull in SRTM, even though it's a heritage product, and use that slope, just because the model learns what the estimated uncertainty is. So that seems like a good way to start understanding the quality of your data. 02:08:55:11 - 02:09:17:05 Speaker 2 I would just add, I think one of the challenges today is that a lot of the data products out of the box might have some quality filters, but it's not totally clear, if they don't flag something as bad for some unknown reason like cloud cover or signal levels, how accurate that ground detection point is.
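The uncertainty-modeling idea just described, learning how error grows with feature predictors such as canopy cover and slope at sites where reference lidar exists, can be sketched as a simple regression on synthetic data. The predictors, coefficients, and error model are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Hypothetical predictors at reference-lidar sites
canopy_cover = rng.uniform(0, 1, n)      # fraction
slope = rng.uniform(0, 40, n)            # degrees (e.g. from a heritage DEM)

# Simulated elevation error vs. reference: grows with cover and slope
true_sigma = 0.5 + 3.0 * canopy_cover + 0.05 * slope
error = rng.normal(0, true_sigma)

# Fit |error| ~ a + b*cover + c*slope as a minimal uncertainty model
A = np.column_stack([np.ones(n), canopy_cover, slope])
coef, *_ = np.linalg.lstsq(A, np.abs(error), rcond=None)

def predicted_sigma(cover, slope_deg):
    # E|N(0, s)| = s * sqrt(2/pi), so rescale mean absolute error to sigma
    return (coef @ [1.0, cover, slope_deg]) * np.sqrt(np.pi / 2)

print(predicted_sigma(0.9, 30) > predicted_sigma(0.1, 5))  # True: denser, steeper -> larger sigma
```

A production version would of course use a nonlinear learner and held-out validation sites, but even this linear sketch recovers a usable per-pixel sigma from nothing but feature predictors and a reference dataset.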
02:09:17:16 - 02:09:38:01 Speaker 2 And so I think there's more work to be done there, especially, say, in the case of GEDI, if you're looking at individual shots. The GEDI team has talked about now starting to look at regional groups of shots and trying to be a little more intelligent about marking points that they think might not be estimating the ground accurately. 02:09:38:15 - 02:10:04:12 Speaker 2 And if we could have that higher level of quality assurance, then I think we'd know which measurements are the good measurements and what areas are missing, and then how you might combine that with the estimation or inference that would fill in the gaps between the good measurements that you're highly confident actually saw the ground. 02:10:04:15 - 02:10:05:17 Speaker 2 Andrea has a question. 02:10:07:02 - 02:10:28:03 Speaker 3 How much field validation do you need to do when you show the vegetation model? By doing field work, we're getting a pretty good handle on what we're missing at the tops of the narrow trees. But I'm curious what experience you have, or what you find, on the necessity of that. 02:10:28:03 - 02:11:02:01 Speaker 7 It's another tough question. Technically, we have been more successful getting vegetation height from lidar than in the field. Field measurements of vegetation height are extremely error-prone, unless you find single trees, you can do it with a laser, and you have two people there. So lidar, airborne lidar and drones and such, has really helped us redo this problem, especially, again, when you go to complex areas and try to find, you know, a tall tree, say 60 meters, someplace and measure it on the ground. 02:11:02:01 - 02:11:29:12 Speaker 7 You definitely have 5-to-10-meter errors, depending on who measures. The second thing is ground topography. There are techniques that people have used in open spaces, with lasers or other instruments, to derive a slope; that has been the traditional way of doing it.
But when it comes to under the forest, all of our techniques for really getting elevations and mapping ground topography become very difficult. 02:11:29:22 - 02:12:06:03 Speaker 7 Even terrestrial lasers haven't been very successful under the forest for that. They do the vegetation structure very well, but not the topography underneath. So even though we do a lot of field work on the ground for some of our datasets, for this question of how we get ground topography or vegetation height, it would be difficult. But terrestrial lasers are pretty good on the vegetation structure from the ground, and they're becoming more and more available. 02:12:08:01 - 02:12:15:24 Speaker 5 Yeah, well, I think it's a great question. My philosophy is, you know, Craig showed the 3DEP map, right? 02:12:16:13 - 02:12:49:02 Speaker 5 In this country, in Europe, and in other parts of the world, we have an incredible archive of airborne laser scanning data available to us. And each one of those surveys has a vendor metadata report where they've done detailed evaluation and validation of their returns, both in the open and underneath the canopy. So that's vetted, we know the uncertainty of those, and that's our training data, and we have it over time. In Seattle, for example, we have four airborne lidar surveys done in the last 20 years. 02:12:50:01 - 02:13:02:23 Speaker 5 So we have repeat measurements from airborne, and something from 2003 is very different from something collected last year in terms of the quality of the data and the validation that's done. 02:13:02:23 - 02:13:15:14 Speaker 5 It brings up another question, which is: how long can you use an airborne lidar
dataset over ground that's actually changing? Some of us are thinking about permafrost, where this is a huge issue: you can fly airborne lidar, and a year or two later it's no longer usable as ground truth. Right? So that's an important question that I don't think we've really thought much about. We kind of say airborne lidar is the gold standard, the truth, the training data that we can use. 02:13:30:24 - 02:13:40:05 Speaker 5 But, as I think Stephen showed yesterday, there are 20-to-30-centimeter offsets between swaths in some of these airborne products. Right? 02:13:40:05 - 02:13:43:02 Speaker 2 So I think there are a lot of things we need to be thinking about there. 02:13:44:11 - 02:13:56:04 Speaker 5 And I could say more about that. I think with some of the kriging, and understanding the spatial correlation of errors in these truth datasets, we can answer these uncertainty questions, or at least start 02:13:56:17 - 02:14:40:13 Speaker 2 to model them a little better. I can only relate my experience with fieldwork, which is in the tropics and in boreal forest. In both cases, someone who is experienced in the dense tropical forest, who lives there, can do height measurements about as well as a laser. And so we have 1-to-2-meter accuracy on height, or, let's say, agreement between interferometric SAR and the fieldwork. 02:14:40:20 - 02:15:15:06 Speaker 2 What we measure, usually for some hundred trees or so per plot, is the height, the height to the bottom of the canopy, and the DBH. Then we make a model profile out of that, and we compare those to the profiles from UAVSAR. And qualitatively, and this is only a qualitative conclusion, it looks pretty good as far as heights go.
02:15:15:06 - 02:15:42:17 Speaker 2 Studies come out of DLR showing 1-to-2-meter-type accuracies, agreement with fieldwork. So I guess the short answer is: various heights and DBHs, and that gets you good meter-level agreement with remote sensing. Yeah. 02:15:47:07 - 02:16:06:07 Speaker 3 And that's partly where I was going: that is your training set, but you need to do some parameterization, so you're going to account for what you're missing in your remotely sensed observations, right? That's got to be part of your model. You have ground truth, and we know that when we get far away and the resolution degrades, we miss things. 02:16:06:07 - 02:16:17:13 Speaker 3 And you kind of alluded to that: we need to extrapolate and have sophisticated models that can account for it. 02:16:17:13 - 02:16:39:11 Speaker 2 Any questions from the floor? I like the somewhat philosophical discussion of inference, estimation, and measurement. AI and machine learning were mentioned a few times, which to me is like a black box: it doesn't tell you how or why things work. I would classify that as inference, which Robert is trying to stay away from, to explain biology, for example. So I'm curious what the opinions and feelings are from you all on these different aspects. 02:16:41:01 - 02:17:15:00 Speaker 2 Well, interesting to put it that way. So it is philosophical. My personal view, and this is totally not for public consumption... oh, it is recorded. 02:17:15:08 - 02:17:37:21 Speaker 2 Never mind. My view is that I'm very interested in AI, and I'm not dancing around it. But my opening line is always: the best thing AI can do is teach us how to do it analytically. Show us the solution, and then we go and find out why. 02:17:37:21 - 02:18:07:14 Speaker 7 I think it's unfortunate that our community doesn't know so much about AI.
One of the first AI techniques came from Shannon, a signal-processing guy, who basically tried to infer the distribution, the histogram, of your signal when you have only sampled the signal in a few places. That's Shannon theory, where the maximum-entropy approach comes in. So we maximize the entropy: 02:18:07:23 - 02:18:35:19 Speaker 7 you can reconstruct your distributions from some samples of those distributions. It's a mathematical model; most of AI is mathematics, and it works basically on these statistical approaches. Random forest is a good example of it. It also comes from mathematics, from optimization theory, to really get the distribution. 02:18:36:08 - 02:19:06:07 Speaker 7 So it is a statistical approach. But the interesting thing is how it learns from large and varied datasets, where analytical models cannot easily learn, because the analytical models are complex. Unless you go to linear regression, which is how most of our models work: we start with the tough electromagnetic models, and then, when we want to estimate, we drop to the simplest relationship to really do it. 02:19:06:16 - 02:19:35:22 Speaker 7 So my thinking is that data fusion and AI are becoming more and more widespread. New techniques are coming out: the generative AI techniques, and recent work on self-supervised learning, are helping a lot by bringing in samples from different parts of the world to learn how to solve your problems, instead of trying to write an electromagnetic equation for it, which is difficult to do. 02:19:36:05 - 02:20:03:19 Speaker 7 There are also now ways to combine analytical models and AI to improve the estimation. So I think there is room here. We know the one-model approach doesn't work, because we start with a complex physical model, and then, when we come to estimation, everybody drops to simple linear equations or a very empirical approach.
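The maximum-entropy reconstruction mentioned above can be shown in a few lines. Among all distributions on a fixed set of bins with a given mean, the entropy-maximizing one has the exponential form p_i proportional to exp(-lam * i), and the single parameter lam can be found by bisection. The bins and "observations" here are invented for illustration:

```python
import numpy as np

# Among all distributions on bins 0..K with a given mean, entropy is
# maximized by p_i proportional to exp(-lam * i). Solve for lam by
# bisection so the model mean matches the sample mean.
bins = np.arange(50)
samples = np.array([3, 5, 2, 8, 4, 6, 3, 7])   # sparse observations (hypothetical)
target_mean = samples.mean()

def model_mean(lam):
    w = np.exp(-lam * bins)
    return (bins * w).sum() / w.sum()

lo, hi = -1.0, 5.0        # bracket: model mean decreases as lam increases
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if model_mean(mid) > target_mean:
        lo = mid
    else:
        hi = mid

lam = 0.5 * (lo + hi)
p = np.exp(-lam * bins)
p /= p.sum()
print(round((bins * p).sum(), 2))   # prints 4.75, the sample mean
```

This is the simplest one-constraint case; adding a variance constraint would give a truncated-Gaussian-like family, and that same "match the constraints, maximize the entropy" logic is what the speaker is pointing to as the statistical root of these methods.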
02:20:04:05 - 02:20:27:04 Speaker 7 And those techniques really cannot work widely, because when you go to another area you have to change your model and come up with different ways to do it. So I would be open to AI helping us with data fusion. That doesn't mean we have to give up the measurements; the measurements have to be made anyway. 02:20:27:04 - 02:20:49:11 Speaker 3 I don't know if I'm going to answer your question, but I was at a workshop recently on trusted AI: if you're using AI, do you really believe what it's telling you? Because I think it's a really powerful thing. And they were saying, in terms of autonomy, using AI for autonomous vehicles as an example, that the reason it's not trusted is that humans don't trust it. 02:20:49:11 - 02:21:06:12 Speaker 3 Like when you have adaptive cruise control on your car: people say it never works, and they keep slamming on the brakes themselves because they don't trust the car to stop on its own. And I think we need to get to a point where we do use AI, because it's really powerful in a lot of ways. 02:21:06:12 - 02:21:24:13 Speaker 3 We're doing feature extraction and uncertainty modeling globally, but we have to start figuring out a way to trust it. And I think it's going to be the way we do a lot of the separation of vegetation from terrain. 02:21:24:13 - 02:21:36:20 Speaker 5 So, Mark, it's a really good question. And I would argue, I think as scientists and engineers we want to know why something works, but does it really matter, 02:21:36:20 - 02:21:38:06 Speaker 2 I guess, on some level? 02:21:38:16 - 02:21:57:15 Speaker 5 If we're getting, well, we'll have validated that we're getting, the right answer: I know what my inputs are, I know what the model sees, and I know what information it's been given. I don't need to know exactly what it's doing.
I'd like to, and if we want to understand the physical processes, that's a different story. 02:21:57:15 - 02:22:20:07 Speaker 5 But I'm talking about fundamental measurements. I'm talking about an accurate point cloud and an accurate elevation model, not a higher-level, Level 3 kind of measurement. So I think it's our job to design the machine learning training workflows and the architectures with the appropriate validation, so that we can trust these things. 02:22:20:07 - 02:22:27:20 Speaker 5 And I think that one model is not going to work. I don't know how many people here have actually trained a convolutional neural network or done some kind of machine learning. 02:22:28:10 - 02:22:36:11 Speaker 5 Okay. Ask that question in five years and there are going to be even more. And maybe some of us are managers, but someone in your group is doing this, right? 02:22:37:05 - 02:22:51:01 Speaker 5 It's got to be. Yeah, every student, every postdoc is doing this right now. Okay. So, I could say a lot of things. One model is not going to work for this problem. 02:22:51:10 - 02:23:06:08 Speaker 5 We're going to need separate models for different regions, different climate regions, different vegetation types, because transferability and generalizability are always going to be an issue for us. So I think that's going to be one of the challenges. It's not just going to be the magic AI that we feed 02:23:08:06 - 02:23:28:18 Speaker 5 stereo, lidar, and radar from all over the planet and get the right answer everywhere. So I think that's a really big gap and a key question. We're looking at these little study sites right now, but we need to be training on continent-scale datasets and trying to figure out how to split this up.
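One standard way to combine separate regional or sensor-specific models like those just described is a weighted ensemble. Below is a toy sketch of inverse-variance weighting, the optimal linear combination when member errors are unbiased and uncorrelated; all error levels are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
truth = rng.uniform(0, 30, 5000)        # e.g. canopy height at 5000 sites

# Three hypothetical models with independent errors of different quality
sigmas = np.array([2.0, 3.0, 4.0])
preds = truth + rng.normal(0, sigmas[:, None], (3, truth.size))

# Inverse-variance weights: best members count most, but all contribute
w = 1 / sigmas**2
w /= w.sum()
combined = w @ preds

rmse = lambda x: np.sqrt(np.mean((x - truth) ** 2))
print(rmse(combined) < min(rmse(p) for p in preds))  # True: ensemble beats best member
```

The combined error variance is 1 / sum(1/sigma_i^2), about 1.54 m here versus 2.0 m for the best single model, which is the "greater than the sum of its parts" effect; if the member errors were strongly correlated, the gain would largely disappear, which is why uncorrelated (or at least diverse) members matter.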
02:23:28:18 - 02:23:47:24 Speaker 3 I'll just be real quick. I wanted to say, to your point, because that's very true: we're developing an along-track bathymetry product for ICESat-2, which is a really hard problem. It's similar to separating vegetation from ground, just turned upside down, because we need the water surface and the seafloor, right? 02:23:47:24 - 02:24:08:03 Speaker 3 And one model does great in the Bahamas but fails in other locations. So we have to start thinking about ensemble learning: you take all the models, and as long as the errors are uncorrelated, you model the output of the models and you get something greater than the sum of its parts. It makes it a little more robust. 02:24:08:03 - 02:24:14:20 Speaker 3 So I think that's something to think about moving forward. 02:24:14:20 - 02:24:26:19 Speaker 6 So I just wanted to ask a question, to shift gears a little bit... oh, I'm so sorry. 02:24:26:19 - 02:24:29:23 Speaker 3 So, John Rundle, you wanted to respond to this, and then Matt will move on. Sorry. 02:24:31:11 - 02:25:01:23 Speaker 2 Go ahead. Yeah, so I put a link to my AGU talk in the chat if anybody wants to look at it. We've been developing machine learning models here based on neural networks and random forests, and we're building a model called QuakeGPT, which you'll be hearing more about. GPT stands for Generative Pre-trained Transformer; it's the same technology that ChatGPT is built on, but we're doing time-series prediction with it. 02:25:02:10 - 02:25:31:08 Speaker 2 So that's a very general thing, and we're applying it specifically to earthquake nowcasting. But the basic code, which I'm happy to give anybody, is basically time-series prediction code. So we're finding it's quite versatile.
Many of our codes, both the neural network codes and the machine learning codes, have actually been written with help from GPT. You can't just put in a prompt and get something back; you have to debug it. 02:25:31:19 - 02:25:55:23 Speaker 2 But I will say that GPT saved me many months: I was able to get a code running in about one day that would normally have taken me probably six months. So I'm finding it extremely useful. I will also add that OpenAI came out with something called the GPT Store, which you may have heard about, where with fairly simple prompts 02:25:55:23 - 02:26:20:18 Speaker 2 you can build GPTs and put them in a store, and other people can download and use them, like the App Store. That's probably going to be extremely important going forward. There are so many things you can do with this, and it really has helped us quite a lot in our progress. In fact, I feel like I've got an assistant with me all the time now, helping me write code. Anyway, that's my take on it. 02:26:21:19 - 02:26:25:02 Speaker 2 Thank you very much. Next question. Thanks. 02:26:25:02 - 02:26:50:19 Speaker 6 I guess I was moving ahead to think about what sorts of experiments could be done in the future to help with this problem. When I look at, say, the Pacific Northwest on Google Earth, you see a kind of checkerboard pattern of big clear cuts right next to a beautiful old-growth stand. It would seem to me potentially worth partnering with some of these forest landholders to do before-and-after imaging, or even just imaging what's currently there, 02:26:50:19 - 02:27:07:02 Speaker 6 where you have one area of bare earth that was maybe logged just last year, right next to a forest. Would that be useful? Or potentially working with them to say, hey, I want to scan a spot before you log it, and then you go back after logging and see how good your ground returns were.
02:27:07:05 - 02:27:08:12 Speaker 6 Is that something to consider? 02:27:08:17 - 02:27:30:22 Speaker 2 Yeah, I think definitely, together with learning to even monitor deformation. It's like the geothermal studies, where somebody took a more recent collection over a clear cut, so they have more confidence in the terrain, and then went back to a previously collected 02:27:30:22 - 02:27:59:03 Speaker 2 dataset to see what kind of accuracy you could get. For instance, with some of the NEON sites, like Abby Road, that would be possible. But I don't think I've ever seen a study where somebody actually went and did that. So I think coordinating those efforts would be a good thing: making sure that you have that "before" acquisition, closer in time and with the same instrument, to pair with the clear cut. I think that would make a better dataset. 02:27:59:03 - 02:28:24:17 Speaker 2 Hey, yeah, hello. So, speaking of AI and test sites and how we can capitalize on this, I don't want to sound vintage here, but, you know, there's a quote from Lord Kelvin: if you cannot measure it, you cannot improve it. When we talk about AI and GPT models, we want to make sure they don't have hallucinations, 02:28:24:18 - 02:29:05:08 Speaker 2 a term that came up recently with GPT-3-era models. So we should acquire these multi-dimensional datasets, acquired with multiple sensors at the same time: lidar, radar, photogrammetric, and whatever else we can do at this time. And I think within this STV project we should make those data available, like the annotated datasets of vehicles or annotated satellite images that people and teams use to improve their AI models. 02:29:05:22 - 02:29:24:15 Speaker 2 The same thing could happen for STV. So you can build experiments where you have a multitude of datasets,
most of the techniques acquired at the same time, and then release it, see what happens, and make those open access. So that's my takeaway. Thank you. 02:29:24:23 - 02:29:33:01 Speaker 5 Just following up on that, and on that question earlier: we need systematic collections, not just airborne lidar over here and then InSAR every 02:29:37:04 - 02:30:03:08 Speaker 5 year. They need to be contemporaneous, ideally within hours, but preferably within a few days, and defining those criteria is going to depend on how the surface is changing. So we've tried to do these tasking campaigns with ICESat-2 off-nadir observations, with stereo observations using all of the commercial resources up there, using available commercial SAR, 02:30:03:21 - 02:30:21:24 Speaker 5 and then flying UAV lidar. So I think if we can coordinate those for a subset of sites, and I think that's one of the things we may be able to get out of this workshop, finding sites like the Harvard Forest, or the SERC site, or someplace where there's validation and a lot of study has been done, 02:30:22:05 - 02:30:28:02 Speaker 5 and coordinate these systematic tasking campaigns and airborne campaigns, it gives us something to move forward with, 02:30:28:02 - 02:30:38:03 Speaker 5 as well as some of the simulations that Keith has been talking about. So I've started mining the archives of the 3DEP lidar, WorldView stereo, ICESat-2, and GEDI collections
02:30:39:20 - 02:30:49:20 Speaker 5 to try to find coincident measurements across the United States, and there are dozens of them within a few days. But there's this key question of how long I can go 02:30:49:20 - 02:30:56:22 Speaker 5 between my radar measurement, my lidar measurement, and my stereo measurement before the ground, or whatever I'm trying to measure, has actually changed. 02:30:57:08 - 02:31:01:06 Speaker 5 For sea ice, it's seconds. That's a real problem. 02:31:02:17 - 02:31:28:06 Speaker 5 For the middle of the desert, it's potentially years. Okay. Vegetation is one that I'm not as familiar with, and I'd love to learn what people think is acceptable, but just watching trees blow around in the wind, maybe it's also seconds in terms of actually getting the same measurement of the same object contemporaneously. So please think about that. 02:31:28:06 - 02:31:32:11 Speaker 5 If you have ideas, let me know. Let's talk. 02:31:33:04 - 02:31:33:19 Speaker 6 Thank you, everybody. 02:31:34:05 - 02:32:04:11 Speaker 2 So I have another question about AI and machine learning. Any AI or machine learning model requires very carefully curated training datasets. The key issue is the complexity, as was said, and it's a wholly open question: the complexity associated with the vegetation structure, all the dynamic processes across space and time, and also having a very nice, uniform archive of data distribution. 02:32:05:01 - 02:32:30:10 Speaker 2 I was wondering if any of you have thought about how we overcome that problem when we apply AI and machine learning to an operational data product and its uncertainty quantification. The second question is that we are always going to face insufficient training data, although of course, with more data coming, that situation will improve.
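The archive-mining exercise described above, finding acquisitions from different sensors that fall within a few days of each other, reduces to a simple windowed join across date lists. A sketch with hypothetical acquisition dates:

```python
from datetime import date, timedelta
from itertools import product

# Hypothetical acquisition dates for three archives over one site
lidar  = [date(2023, 5, 2), date(2023, 9, 18)]
stereo = [date(2023, 5, 4), date(2023, 7, 1), date(2023, 9, 30)]
sar    = [date(2023, 5, 3), date(2023, 8, 20)]

def coincident(archives, max_sep=timedelta(days=5)):
    """Return date tuples (one per archive) whose total spread <= max_sep."""
    hits = []
    for combo in product(*archives):
        if max(combo) - min(combo) <= max_sep:
            hits.append(combo)
    return hits

print(coincident([lidar, stereo, sar]))   # the early-May trio is the only hit
```

The `max_sep` threshold is exactly the quantity the speaker says is surface-dependent: seconds for sea ice, years for bare desert; a real search would also sweep it per land-cover class rather than fix it at a few days.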
02:32:31:02 - 02:32:55:17 Speaker 2 But if you have insufficient data to train your machine-learning model, regardless of what kind of model you use, you will always have those false negative and false positive estimations. Have you thought about how we overcome that? I hope there is some kind of iterative evolution process. 02:32:55:17 - 02:33:23:07 Speaker 7 Just briefly: I think the problem you describe is real. That's the problem with AI, but it's the problem with any model. If you don't have training data, you cannot really build the model coefficients, regardless of whether it's a regression or a complex electromagnetic model. So having that training data is important. 02:33:23:07 - 02:33:55:08 Speaker 7 So the question for us in STV is: do we have enough lidar sampling globally? GEDI really was a revolution in sampling. Even if you throw away 80% of the data, the remaining 20% gives you billions of shots with really high-quality vegetation structure globally. I think it's amazing that we can do this. Then we have more and more global datasets coming: NISAR is going to be launched, and it's going to create 100 terabytes of data per day for the retrievals and science to come. 02:33:55:11 - 02:34:22:21 Speaker 7 So I think we really need to think, when we are designing a study, are there areas where one instrument trains another instrument? The SAR community has been using data fusion with lidar to help resolve this problem. 02:34:22:21 - 02:34:52:24 Speaker 7 So if you want to do a great job with wall-to-wall radar measurements globally, in tomographic mode or PolInSAR mode, do you have enough lidar samples to do this? I think that question is part of our STV study. So I'm going to go back to my first question: does the community, regardless of AI
or any other type of model, think that the instruments we have now can do the job? 02:34:52:24 - 02:35:21:00 Speaker 7 Do we just need to think about how to put it in space, how we really design that architecture? Or do we need to come up with different sensors to do this? If the answer is yes, we know how to do it, we know how to build the sensors, then the question becomes architecture: how do we design the system to get that global solution, regardless of whether we use AI or not. But that, I think, is the way we need to go forward. 02:35:33:24 - 02:35:59:22 Speaker 2 Can I? I wanted to ask this last question about data quality for AI, but I might frame it in a different way. In the early morning we were talking about the trade-off between doing one really accurate snapshot of the landscape, or going for a solution focused on change, which will give you maybe less accurate measurements but will track that change. 02:36:00:15 - 02:36:28:11 Speaker 2 I was wondering if the AI solutions and the biophysical modeling would be more relevant for the one snapshot, because it would give us way more accurate data for developing those models. Or do you see benefits in trying both solutions, tracking change as well as having this one snapshot in time? 02:36:28:11 - 02:36:54:11 Speaker 7 To tell you the truth, I don't know. I think it comes down to the complexity of the problem. If you don't know how things are changing globally, because you really don't have data to show how things are changing, you just have the sensor measurements, then would it not also be appropriate to train models that can detect changes? 02:36:55:02 - 02:37:21:04 Speaker 7 Because the reality is not static.
I mean, I showed the picture of a forest in China with those rock outcrops and trees growing on top of them; try to model that complexity analytically. You want to be able to have global solutions that are accurate globally, and that's where I think AI can help you. 02:37:21:04 - 02:37:42:12 Speaker 7 It doesn't mean that you shouldn't have the measurements. You need to have the measurements, you need to have good-quality training data, and you also need to know the physics of the problem. And on the physics of the problem, I think we know we have limitations to really getting this done accurately globally. 02:37:42:12 - 02:37:56:08 Speaker 7 So that's why I suggest AI as an option. But it doesn't have to be: if you really can solve the global problem with analytical models, why not? 02:37:56:08 - 02:38:15:24 Speaker 5 So I think I'm hearing the same kinds of questions come up: how can we trust the AI, and is the training dataset good enough? Because that's really the key. The volume, but also the quality and content, of the training data determines the prediction quality of your model. 02:38:17:04 - 02:38:37:06 Speaker 5 But as I keep saying, we have huge archives and we can collect new datasets, and the other tool we have is to create purely synthetic datasets, and Keith can potentially comment on this. We create some synthetic or simulated surface, we put some trees on there, and then we can simulate any of the losses Marco was talking about; we can simulate any of the configurations, the different techniques. That's what you need to train these models, right?
And you have the ability to alter things, introduce noise, whatever, and then you know the answer: you have a perfect validation dataset at that point. And there's no limit; you can generate your own training data. So I think we need to be doing more with that, and also bringing in as many of the observations as we have from airborne lidar and so on. So Keith, do you want to say anything about that? 02:39:16:17 - 02:39:39:12 Speaker 2 Yeah, I'll just add that one of the challenges with simulation is that it's pretty labor-intensive to build these complex scenes, and to build them large enough to think about spaceborne measurements. That being said, there are different models: there's DIRSIG from the Rochester Institute of Technology, there's DART out of Europe, and there are a few other ray-tracing programs too. 02:39:39:22 - 02:39:59:04 Speaker 2 You could potentially go and build these forests in complex terrain, and then put in the optical and scattering properties to cover visible through shortwave infrared, thermal, and even radar. We haven't quite done the radar side of things with our forests; we're mostly in the visible through shortwave. 02:39:59:12 - 02:40:32:01 Speaker 2 But these are all possible, and as David was saying, that does give you a truth dataset, assuming you believe that the tree structures and ground you've built are reasonable compared to what nature is actually doing.
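The synthetic-scene idea above can be sketched in miniature. This is not DIRSIG or DART; it is a toy one-dimensional stand-in (all surface shapes, tree counts, and noise levels invented) showing why simulation yields a perfect validation dataset: the truth is known by construction, so instrument error is directly computable.

```python
import numpy as np

rng = np.random.default_rng(42)

def make_synthetic_scene(n=1000):
    """Toy 1-D 'surface with trees': smooth ground plus Gaussian canopy
    bumps. Both returned arrays are exact truth by construction."""
    x = np.linspace(0.0, 100.0, n)
    ground = 5.0 * np.sin(x / 15.0)                  # smooth terrain
    canopy = np.zeros(n)
    for _ in range(20):                              # drop in 20 "trees"
        c = rng.uniform(0.0, 100.0)
        canopy += 20.0 * np.exp(-((x - c) ** 2) / 2.0)
    return ground, ground + canopy                   # bare earth, canopy top

def simulate_lidar(surface, noise_sigma=0.1):
    """Simulated instrument: truth plus Gaussian ranging noise."""
    return surface + rng.normal(0.0, noise_sigma, surface.shape)

ground, top = make_synthetic_scene()
observed = simulate_lidar(top)
# Truth is known exactly, so validation error is directly computable:
rmse = float(np.sqrt(np.mean((observed - top) ** 2)))
```

The same pattern scales to altering configurations or noise models and regenerating unlimited labeled training data, which is the point made in the discussion.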
But then, yeah, we could go in and start to simulate different mission concepts and look not only at the accuracy of data products but also at the spatial sampling, and then use that to train, whether it be AI or physical models, and evaluate how they're doing in a more quantitative fashion. 02:40:32:01 - 02:40:56:23 Speaker 2 On the question of scale that Antonio was asking: I think there's still a mode where you do an experiment on a couple of tens of hectares, see if you can understand those results, and then make your scale bigger and bigger. You gain as much understanding as you can along the way, eventually incorporating AI; by the way, I'm not saying don't use AI. My point is that there is a component of the community which does smaller experiments, and there is still a point to that. There may even be a point in STV for starting small and building complexity. 02:41:29:07 - 02:41:48:01 Speaker 2 So one thing I might just ask what you think about: as I showed yesterday, it's going to take a long time to get global coverage. So one thing I've been looking at is, what if you under-sample and use machine learning to try to fill in the missing spots? You could try different sample patterns, investigate whether there are optimal sample patterns, and ask how much you can under-sample and still recover the data. 02:41:49:14 - 02:42:15:22 Speaker 2 But I think one worry I'm hearing is, what if we make a mistake? Maybe we just build that into the requirements, so that we have some kind of allowance for false positives or false negatives, or an error bar for an under-sampled region, versus what we would consider a direct measurement where we know the validation.
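The under-sampling question raised above ("how much can you under-sample and still recover the data?") can be posed as a simple numerical experiment. This sketch uses plain linear interpolation as a stand-in for a learned infill model, and an invented toy terrain profile; the point is only the shape of the experiment, not the numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

def recovery_error(signal, keep_fraction):
    """Randomly drop (1 - keep_fraction) of the samples, fill the gaps by
    linear interpolation (a stand-in for a learned infill model), and
    return the RMSE of the reconstruction against the full signal."""
    n = len(signal)
    keep = max(2, int(n * keep_fraction))
    idx = np.sort(rng.choice(n, size=keep, replace=False))
    filled = np.interp(np.arange(n), idx, signal[idx])
    return float(np.sqrt(np.mean((filled - signal) ** 2)))

x = np.linspace(0.0, 10.0, 500)
terrain = np.sin(x) + 0.3 * np.sin(5.0 * x)   # invented terrain profile
errors = {f: recovery_error(terrain, f) for f in (0.5, 0.2, 0.05)}
# Error grows as sampling gets sparser, which quantifies the trade-off
# between coverage time and reconstruction fidelity.
```

Swapping the interpolator for a trained model, and the toy profile for simulated scenes, would let different sample patterns be compared against the same truth.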
02:42:15:22 - 02:42:36:04 Speaker 2 For some things, a little up here and a little down there might sort of wash out. But if you're trying to predict where water is going to flow and you make a real mistake, the water is going to go one way versus another. 02:42:36:04 - 02:43:03:12 Speaker 2 So is there a way to be model-agnostic and just say that part of our requirement needs to be a certain level of accuracy of our, I don't know, not inferences, but our interpolations or our modeled results, so that we can verify it? If you build an AI model, you usually have some kind of confidence level. 02:43:04:14 - 02:43:25:04 Speaker 2 Can we just build that into a requirement? Are you saying that when you start with your AI model, you already have some qualitative idea of what's going on, and it had better give you that? Because if this stream is flowing to the right or to the left, you kind of know before you do the experiment which way it's going. 02:43:25:24 - 02:43:44:16 Speaker 2 Well, it could be; maybe that was a bad example. I'm just saying, what are the consequences of making a mistake? If you predict there's a boulder somewhere and it's not there, does anyone care? A tree falls in the forest. But some things are going to be more consequential than others, or we wouldn't be making the measurements. 02:43:44:16 - 02:44:04:21 Speaker 2 So is there a way to build in the consequences of making mistakes, and just acknowledge that we're going to make some mistakes if we're trying to measure everything? Simulate the mistakes and see how they show up in the data.
02:44:05:20 - 02:44:40:17 Speaker 7 I think we can do a lot of things like this to make sure we find the areas where we make errors. But I want to go back to the measurement part of this. There are so many products now, globally, on vegetation height and underlying topography. Airbus, with DLR, even uses TanDEM-X to create a bare-earth topography product at roughly two-meter accuracy. 02:44:40:21 - 02:45:12:08 Speaker 7 But when you look at those datasets, in California or anywhere, you notice that they don't match reality at all: the height is not accurate, and neither is the ground topography. So going beyond AI, I'd like to see, on the measurement side of STV, how we really want to approach this, because AI, or any retrieval, is the next step after the measurement part. 02:45:12:22 - 02:45:44:16 Speaker 7 What do we need to do to make sure we have enough information? If we don't, the problem cannot be solved. That's why my first question comes up: you can see a lot of data out there. There is a ten-meter global height measurement, and there is a five- to six-meter global bare-earth topography from Airbus, which they sell, and all of these things exist. 02:45:44:16 - 02:46:03:01 Speaker 7 So I want to make sure that, as a community working on such a study, we really want to tackle the problem of the measurements first, and then also think about how to retrieve or estimate those values. 02:46:03:01 - 02:46:05:07 Speaker 2 We have two online questions. 02:46:05:14 - 02:46:12:12 Speaker 3 Yes, and just to respond about the boulders: if you're looking for glacial erratics, you care where those are. 02:46:12:12 - 02:46:13:24 Speaker 2 But we're down to just a few minutes. 02:46:14:06 - 02:46:23:22 Speaker 3 So yes, we only have three minutes, but George McFadden and Brian Huberty, you've been patiently waiting, so we'll let George go first. If you could keep it concise, we'd appreciate it.
02:46:23:23 - 02:46:51:11 Speaker 2 Going back to measuring the ground location: here in western Oregon, we have two datasets that should be of interest. We have a little watershed of 5,700 acres where 13 sets of lidar were flown over a ten-year period, and we have checked the difference between the ground locations on all 13: they're within two centimeters every time it's been flown. 02:46:51:11 - 02:47:15:00 Speaker 2 So it seems to work very well, and that dataset is freely available. The other dataset we have out here is the Douglas Complex fire, which covers about 25,000 acres. Lidar was flown two years prior to the fire, the year of the fire, two years post, and four years post. 02:47:15:18 - 02:47:38:18 Speaker 2 So when someone was saying, look at all these checkerboards, that's probably BLM ownership and management in western Oregon, and this could be some of that area. We've done some analysis on that, and it seems to work very well. So anyway, those types of datasets are out there and freely available if anybody wants them. 02:47:38:18 - 02:47:43:03 Speaker 3 Thank you. And I think Brian maybe has a similar type of comment, but go ahead, Brian. 02:47:43:04 - 02:48:13:02 Speaker 2 Yes, just additional sources of information that people may not be aware of, two in particular. We have about a fifth of the state of Minnesota as level-zero lidar data, and one of the primary reasons is forestry applications, because, as you know, trees grow. The second is the other project I worked on with the Great Lakes, Glass Dawg, which is all WorldView imagery at two-meter resolution across the entire Great Lakes Basin. 02:48:13:23 - 02:48:37:24 Speaker 2 So you may want to look at those two sources. I also represent another group; putting that hat on, I'm also the president of the Minnesota Forestry Association.
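The two-centimeter ground repeatability George describes across 13 repeat lidar flights can be checked with a simple stack comparison of co-registered bare-earth DEMs. This is a sketch with invented numbers chosen only to mimic centimeter-level repeatability, not the actual Oregon data.

```python
import numpy as np

def worst_flight_bias(dems):
    """Given a stack of co-registered bare-earth DEMs (one per lidar
    flight), compute each flight's median vertical offset from the
    per-cell median surface and return the largest absolute offset."""
    stack = np.stack(dems)                    # shape: (flights, rows, cols)
    reference = np.median(stack, axis=0)      # robust per-cell reference
    offsets = np.median(stack - reference, axis=(1, 2))
    return float(np.max(np.abs(offsets)))

# Toy stand-in for 13 repeat flights over one watershed: a single truth
# surface plus a small per-flight bias and per-cell ranging noise, all in
# metres and all invented for illustration.
rng = np.random.default_rng(1)
truth = rng.uniform(100.0, 200.0, size=(50, 50))
flights = [truth + rng.normal(0.0, 0.003)
                 + rng.normal(0.0, 0.01, truth.shape)
           for _ in range(13)]
bias = worst_flight_bias(flights)             # expected under 0.02 m here
```

Using the per-cell median as the reference keeps a single outlier flight from dragging the comparison, which matters when flights span different sensors and seasons.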
I'm also a remote sensor, and that's 200,000 woodland owners just in Minnesota who would love to help out with this global project, because forests are one of the few things that actually store carbon, and we just have to manage them better for the future, I think. Thank you very much. We're going to take one final question and then we'll wrap up the panel. 02:48:47:15 - 02:49:18:21 Speaker 4 I want to make an observation. Everybody says that lidar is the gold standard. From all the comments, it seems that is true for the ground, for bare earth, but it is not necessarily true for the top of the canopy. If we have different lidar instruments flying over the same forest, the top of the canopy will be totally different. 02:49:19:03 - 02:49:58:13 Speaker 4 And your correlations actually will be pretty dismal; they might be between 50 and 70 percent. Is this enough? If you want to feed all those measurements into AI, how good are they? If you do a correlation between discrete-return lidar canopy height and full-waveform lidar canopy height, I guarantee that if you get 70 percent correlation between the measurements, you will be extremely happy. 02:49:59:03 - 02:50:40:00 Speaker 4 And that is not with a big time gap between the two collects; they can be very close, so the vegetation didn't change enough to really explain it. So we think we understand our measurements, but do we really understand our measurements? In my experience, I always ask the question of what we actually see when we measure vegetation with lidar. I don't contest bare earth, but I have some questions about vegetation. 02:50:40:23 - 02:50:58:20 Speaker 2 Can I comment on that one real quick? One quick clarification. Well, I think we're really out of time. I want to thank our panelists and the audience; excellent discussions all around. 02:50:58:20 - 02:51:20:18 Speaker 1 Okay.
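The discrete-return versus full-waveform comparison raised in the final question is usually quantified as a Pearson correlation over paired plots. This sketch uses invented numbers (one true canopy seen through two noisy instruments) purely to show the computation; real cross-instrument comparisons also carry systematic differences that this toy ignores.

```python
import numpy as np

def height_correlation(h_a, h_b):
    """Pearson correlation between canopy-height estimates of the same
    plots from two different lidar instruments or processing chains."""
    return float(np.corrcoef(h_a, h_b)[0, 1])

# Invented example: 200 plots, two instruments with independent noise.
rng = np.random.default_rng(7)
true_h = rng.uniform(5.0, 40.0, size=200)            # canopy heights, m
discrete = true_h + rng.normal(0.0, 3.0, size=200)   # discrete-return noise
waveform = true_h + rng.normal(0.0, 3.0, size=200)   # full-waveform noise
r = height_correlation(discrete, waveform)
```

Even with independent random noise alone the correlation is attenuated below 1; adding the systematic canopy-top differences discussed above is what pushes real-world values toward the 50 to 70 percent range mentioned.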
So before we break for lunch: we'll come back at 1:00 for the breakout rooms. Before that, at 12:45, right now, we're going to do a group photo. So before you go for lunch, just head right outside here.