0:02 Hi, my name is Becky Johnson and I'm a
0:04 senior systems engineer at AGI. Today I'm
0:07 going to show you a quick demonstration
0:08 that highlights STK EOIR and its ability
0:12 to model the detection, tracking, and
0:14 imaging performance of sensors for a space
0:16 situational awareness application. The
0:19 scenario that I have here is an STK
0:21 scenario with EOIR installed, so we're
0:24 going to start and look at three
0:26 different examples. The first thing that
0:28 we want to look at is a LEO satellite
0:31 looking at a GEO satellite. There are a couple
0:34 of things we want to note in our 3D
0:35 window right off the bat: we have the
0:37 LEO satellite that is transiting the
0:39 earth, imaging the GEO satellite. We want
0:42 to take a look at what that sensor can
0:44 see in terms of the geometry and the
0:47 lighting conditions, and how we're able
0:48 to resolve that image using an EOIR
0:51 sensor. When you have EOIR installed,
0:55 you have the ability to model the sensor
0:58 characteristics. The way that's done: if I
1:00 open the properties for one of my
1:02 sensors, in the Basic Definition page
1:04 what you'll see is a series of pages,
1:08 tabs if you will, for the spatial,
1:10 spectral, optical, and radiometric
1:13 properties of the EOIR sensor. You can
1:15 model multiple bands; here you see two
1:17 that have been modeled. In the spatial
1:19 setup we're modeling the horizontal and
1:22 vertical field of view of the sensor as
1:24 well as the number of pixels. In the
1:26 spectral setup we're modeling the upper
1:28 and lower band edges, depending on what
1:30 wavelengths it is that you're considering
1:32 here, as well as the number of intervals.
1:34 In the optical setup you're looking at
1:37 things like the f-number, the
1:39 longitudinal defocus, and the focal length,
1:41 as well as the image quality. So we can
1:44 do anything from diffraction-limited, so
1:46 a perfect situation, through optical
1:49 aberrations, and also up to custom data
1:53 files that the user has the ability to
1:55 import for image quality. In the
1:58 radiometric setup of the sensor, we're
2:00 looking at the radiance or irradiance for
2:02 the sensor, either at a high level, or you
2:05 can also switch it to a low level, and
2:07 that provides a higher level of fidelity
2:09 for the customer to bring in information
2:12 for the radiometric portion of the
2:14 sensor. So once
2:16 that's set up (in this case the one that
2:18 we are looking at is a 1.5
2:20 degree horizontal and vertical half
2:22 angle for the sensor), we're going to be
2:24 able to see how well we're able to
2:26 resolve that GEO satellite. The way
2:29 that's done in STK is through a
2:31 synthetic scene generator. With EOIR
2:34 installed you have a plugin which gives
2:37 you the option to create that synthetic
2:39 scene by making use of the little
2:42 telescope button provided by the plugin. So if
2:45 I highlight the sensor that I'd like to
2:47 generate that synthetic scene for and click
2:49 that little telescope button, it'll take
2:51 a couple of seconds, but then it'll
2:53 generate that synthetic scene. Now what
2:55 you're looking at here is an image where
2:58 the object isn't well lit, so we might
3:01 want to consider changing the geometry
3:03 such that the lighting conditions will
3:05 allow for a better synthetic scene. You
3:09 can see a couple of objects in here
3:12 that you may be able to resolve,
3:15 such as a bright star down here, but
3:18 let's go ahead and change our geometry
3:20 so that our resident space object is
3:22 better lit by the Sun, and then let's
3:25 regenerate that synthetic scene. As I do
3:28 this, we're now able to see that we have
3:31 more objects within the synthetic scene
3:34 that we're able to take a look at. If I
3:37 were to animate this scenario, so I'm
3:39 just going to step forward in time, this is
3:41 one way that the user can figure out
3:43 what it is that might be in their
3:46 synthetic scene, depending on if you're
3:47 looking at, in this case, a star
3:49 background or an image that might be
3:51 intertwined with the stars. So in this
3:54 case the image of the GEO satellite that
3:56 I'm looking at is right here; it's
3:58 remaining stationary while the stars
4:01 move around it. If I right-click in the
4:05 synthetic scene I can bring up a details
4:07 window. This allows the user to click on
4:10 the pixels and get information about
4:13 objects in the synthetic scene. For
4:15 instance, I just clicked on a star and
4:17 it's given me information about the
4:19 catalog number, the right ascension and
4:22 declination, as well as the temperature
4:24 and material for that star. All the stars
4:27 in EOIR are modeled as blackbody point
4:29 sources.
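Since each star is treated as a blackbody point source, the catalog temperature fixes its spectrum through Planck's law. Here is a minimal sketch of that kind of band-integrated radiance calculation; the band edges, interval count, and the 5800 K temperature are illustrative assumptions, not values from this scenario, and this is not STK EOIR's actual implementation:

```python
import math

# Physical constants (SI)
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
K_B = 1.380649e-23    # Boltzmann constant, J/K

def planck_spectral_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance B(lambda, T), in W / (m^2 * sr * m)."""
    num = 2.0 * H * C ** 2 / wavelength_m ** 5
    return num / (math.exp(H * C / (wavelength_m * K_B * temp_k)) - 1.0)

def in_band_radiance(lower_m, upper_m, temp_k, intervals=100):
    """Trapezoidal integration of Planck radiance between the band edges,
    echoing the band-edge and interval-count inputs on the spectral tab."""
    step = (upper_m - lower_m) / intervals
    total = 0.0
    for i in range(intervals):
        a = lower_m + i * step
        total += 0.5 * (planck_spectral_radiance(a, temp_k)
                        + planck_spectral_radiance(a + step, temp_k)) * step
    return total

# A Sun-like (~5800 K) star over an assumed 0.4-0.7 micron visible band
print(in_band_radiance(0.4e-6, 0.7e-6, 5800.0))  # W / (m^2 * sr)
```

A hotter catalog temperature yields more in-band radiance, which is why brighter, hotter stars stand out in the synthetic scene.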
4:29 That information is all coming from the
4:31 star catalog. You can also change the color
4:34 map, the scene detail, as well as the contrast
4:38 and brightness of the scene. For instance,
4:41 now if I click on the object in the
4:45 center, which happens to be our satellite,
4:46 you'll notice that it's giving us the
4:48 name "geo"; this is an object that's
4:50 represented in our scenario called geo.
4:52 So it's also giving us information about
4:54 the distance to that object, the azimuth and
4:57 elevation, and the temperature and
4:59 material properties for that object.
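As a rough sanity check on how well a sensor like this can resolve a GEO target, you can compare the per-pixel instantaneous field of view against the target's apparent angular size. Only the 1.5 degree half angle comes from the scenario; the 512-pixel detector width, 6 m target diameter, and GEO range below are illustrative assumptions:

```python
import math

def pixel_ifov_rad(half_angle_deg, pixels_across):
    """Per-pixel instantaneous field of view for a square detector."""
    return math.radians(2.0 * half_angle_deg) / pixels_across

def apparent_size_rad(target_diameter_m, range_m):
    """Small-angle apparent size of a target at a given range."""
    return target_diameter_m / range_m

ifov = pixel_ifov_rad(1.5, 512)          # 1.5 deg half angle; 512 px assumed
target = apparent_size_rad(6.0, 3.6e7)   # ~6 m object near GEO range (assumed)
print(ifov, target, target / ifov)
```

With these numbers the target subtends only a small fraction of one pixel, consistent with the GEO satellite appearing as a point-like object in the synthetic scene.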
5:01 Oftentimes the 3D visualization is great,
5:05 and we want to be able to see that
5:06 synthetic scene, but being able to look
5:09 at the analytics and everything that's
5:10 being calculated under the hood often
5:12 goes a long way to give a nice
5:14 complement to the 3D window. So we can
5:17 generate the signal-to-noise ratio
5:19 versus time in a graph that will give us
5:22 information about what's going on with
5:24 these two objects, the geometry, and the
5:26 lighting conditions, such that we can
5:29 then take a look at them both together.
5:31 So if I minimize my graph a little bit,
5:33 place it up in the upper right-hand
5:35 corner such that I can still see the 3D
5:38 window and my graph, and then hit play,
5:42 what we're able to see is that
5:44 signal-to-noise ratio vastly changing, as
5:47 well as the dropouts that we're seeing
5:49 in here. The reason for that in this
5:51 case is that as the LEO satellite travels
5:53 around the earth, when it's on the
5:55 backside and it doesn't see the GEO
5:56 satellite, we're seeing those vast
5:58 dropouts of the signal-to-noise ratio.
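The dropouts occur whenever the Earth blocks the line of sight from the LEO sensor to the GEO target. That geometry can be sketched with a simple segment-versus-sphere intersection test; the coordinates below are illustrative, and this is not STK's access algorithm:

```python
import math

def has_line_of_sight(p1, p2, earth_radius_m=6.371e6):
    """Return True if the segment p1->p2 (Earth-centered coords, meters)
    clears a spherical Earth centered at the origin."""
    d = [b - a for a, b in zip(p1, p2)]   # segment direction
    f = p1                                # from Earth's center to p1
    a = sum(x * x for x in d)
    b = 2.0 * sum(x * y for x, y in zip(f, d))
    c = sum(x * x for x in f) - earth_radius_m ** 2
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return True  # the segment's line misses the Earth entirely
    sq = math.sqrt(disc)
    t1 = (-b - sq) / (2.0 * a)
    t2 = (-b + sq) / (2.0 * a)
    # Blocked only if an intersection lies strictly between the endpoints
    return not (0.0 < t1 < 1.0 or 0.0 < t2 < 1.0)

leo_front = (7.0e6, 0.0, 0.0)    # LEO on the GEO-facing side of Earth
leo_back = (-7.0e6, 0.0, 0.0)    # LEO behind the Earth
geo = (4.2e7, 0.0, 0.0)          # GEO target
print(has_line_of_sight(leo_front, geo))  # True: clear view
print(has_line_of_sight(leo_back, geo))   # False: Earth in the way
```

When the test fails, there is no signal at all, which is what the flat dropouts in the signal-to-noise graph represent.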
6:00 But this is a really good representation,
6:02 analytically, of being able to look
6:04 at the graph as well as what's going on
6:07 in the 3D window to give us that
6:09 indication of how and when we're going to be
6:11 able to resolve that image for
6:13 these two objects. Now let's move into
6:16 another example: let's take a look at a
6:19 LEO satellite imaging another LEO
6:21 satellite. In this case you'll see that
6:24 there is a yellow line indicating that
6:27 the second LEO satellite's orbit is
6:29 illuminated by the Sun. This is
6:31 important because we want to be able to
6:33 see, with different backgrounds, how well
6:35 we're able to resolve this image, and the
6:38 first one that we're going to start with
6:40 is the star background.
6:43 So once again it's going to be similar
6:45 to what we've seen before, in the sense
6:47 that we have a number of different stars
6:49 in our synthetic scene. We may not know
6:52 what our object of interest is that
6:53 we're tracking, but we can animate
6:56 forward to give us a better indication,
6:58 and then if we look in our synthetic
7:00 scene we can see this object is
7:02 remaining stationary as the other
7:04 objects, the stars, are passing through
7:06 the synthetic scene. One thing the user
7:08 can also do is turn on motion blur for
7:10 the stars; in this case the stars
7:12 would appear to blur as they move
7:14 through the scene, and the object of
7:16 interest that we're focused on would
7:17 remain stationary in the center. If we
7:19 want to take a look at an earth daylight
7:22 background, we can generate that
7:24 synthetic scene to get more information
7:26 about what that would look like, and
7:28 whether or not we'd be able to resolve
7:30 that image. So in this case you'll see
7:32 that we have the center field of view
7:35 represented here again. In this case the
7:38 satellite's orbit
7:41 track is once again illuminated by the
7:43 Sun, but so is the Earth's background. So
7:46 in this case we're not really able to
7:48 resolve an image against this
7:50 background, because we have daylight
7:54 on the ground as well as on the
7:56 satellite itself. Finally, we can look at
7:58 an earth night background. In this
8:01 case, once again, the object is being
8:03 illuminated by the Sun and the earth
8:06 behind it is dark, so we're really able
8:08 to see quite well what that object would
8:11 look like against that dark background.
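The reason the daylight background is so much harder than the night background can be sketched with a basic shot-noise detection model: the same target signal sits on a vastly brighter background, and the background's noise swamps it. All of the electron counts below are made-up illustrative values, not outputs from this scenario:

```python
import math

def snr(signal_e, background_e, read_noise_e=10.0):
    """Shot-noise-limited SNR for a point target on a background,
    in electrons per integration (illustrative detection model)."""
    return signal_e / math.sqrt(signal_e + background_e + read_noise_e ** 2)

target_e = 2.0e3                 # electrons from the satellite (assumed)
print(snr(target_e, 5.0e6))      # bright daytime Earth background: SNR << 1
print(snr(target_e, 1.0e2))      # dark nighttime background: SNR >> 1
```

The same target that is buried below the noise floor against a sunlit Earth becomes easily detectable against a dark one.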
8:13 One more thing that we can do is
8:15 generate a synthetic scene graph that
8:19 gives us an indication of what this is
8:22 going to look like for this background.
8:23 This graph gives you an indication of
8:26 how the signal-to-noise ratio is going
8:28 to drop off, as well as the target
8:29 irradiance, as the satellite moves from a
8:33 Sun-illuminated orbit into eclipse.
8:36 You'll notice we have the line right
8:37 here indicating that that's happening:
8:39 the satellite is getting ready to move into
8:41 the eclipsed part of its orbit, and so
8:43 both the signal-to-noise ratio as well
8:45 as the target irradiance drop off,
8:47 and then they'll come back up again as
8:48 it continues around in its orbit.
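The eclipse entry seen on the graph can be approximated with a simple cylindrical-shadow test: the satellite is eclipsed when it sits on the anti-Sun side of the Earth and within one Earth radius of the shadow axis. This is a simplification (no penumbra) and the positions below are illustrative:

```python
def in_umbra(sat_pos, sun_dir, earth_radius_m=6.371e6):
    """Cylindrical-shadow eclipse test. sat_pos is Earth-centered (meters);
    sun_dir is a unit vector pointing toward the Sun."""
    # Projection of the satellite position onto the Sun direction
    along = sum(p * s for p, s in zip(sat_pos, sun_dir))
    if along >= 0.0:
        return False  # on the sunlit side of the Earth
    # Squared distance from the satellite to the Sun-Earth shadow axis
    perp2 = sum(p * p for p in sat_pos) - along * along
    return perp2 <= earth_radius_m ** 2

sun = (1.0, 0.0, 0.0)                       # unit vector toward the Sun
print(in_umbra((7.0e6, 0.0, 0.0), sun))     # sunlit side -> False
print(in_umbra((-7.0e6, 0.0, 0.0), sun))    # behind Earth, on axis -> True
print(in_umbra((-7.0e6, 7.0e6, 0.0), sun))  # behind but off axis -> False
```

While the test returns True, the target receives no direct sunlight, which is when both the target irradiance and the signal-to-noise ratio bottom out on the graph.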
8:51 Now finally we can look at a ground-to-space
8:55 application.
8:56 If we move to looking at a sensor
9:00 that's located in Socorro, New Mexico,
9:02 looking straight up, we want to be able
9:04 to see once again what that sensor would
9:06 be able to resolve from a synthetic
9:08 scene perspective. Now in this case
9:11 you'll notice that this satellite is
9:13 going to track through the field of view
9:15 from the bottom left to the top
9:18 right, and we want to be able to see what
9:20 that looks like. So if we generate a
9:22 synthetic scene, in this case one of the
9:25 things that we can do here again is
9:27 animate through the scenario to see how
9:30 the satellite tracks through the scene.
9:32 It's a very faint object here that
9:35 you'll see in the synthetic scene, and it
9:38 can track through that space. Oftentimes,
9:41 though, folks would like to output the
9:44 information that's coming from the
9:45 synthetic scene and then post-process it
9:48 after the fact, which is something that
9:51 you can definitely do with EOIR, and
9:53 this is a case where we've taken those
9:54 output images and created a video. If
9:57 you'll notice, the object is actually tracking
10:00 through the scene here in the video,
10:02 where we've put together the different
10:04 frames of the output for the object
10:06 itself. One other item to note is that
10:10 for the target objects, you have a couple of
10:13 different ways of modeling each of the
10:16 target objects. In this case our GEO
10:18 satellite was represented as a sphere, so
10:21 if you go to the Basic EOIR Shape
10:24 page, you'll notice it was represented
10:26 as a sphere with a three-meter radius,
10:29 and a single material and temperature
10:32 were applied. For the
10:34 LEO imaging example, we selected a more
10:38 composite shape, a LEO imaging satellite
10:41 that was created for us. Once again,
10:43 we can assign a single material
10:46 to that satellite, or we can
10:48 assign different material properties to
10:50 each part of the satellite. We can also
10:52 either define a static temperature, one
10:57 temperature that remains the same, or we
10:59 can bring in a time-dynamic thermal
11:01 profile. This could be done using
11:03 something called STK SEET, the Space
11:06 Environment and Effects Tool. It could also be
11:08 another application that you may
11:10 have for thermal modeling, and if you want
11:12 to bring that in externally, that's
11:14 another thing that can be done if you
11:15 have your own thermal models. One other
11:18 item to note is the custom
11:22 mesh option. We do ship with a number of OBJ
11:26 files that have been created in Blender;
11:28 these are about a thousand polygons or
11:31 less, covering multiple objects, that you can
11:34 load in as target objects in EOIR,
11:36 or you can create your own, and this
11:38 is the way that you would select
11:39 one as the target object as well. So you
11:42 have the ability and the flexibility to
11:43 model both the
11:47 sensor side as well as the target
11:49 objects to create that synthetic scene
11:52 in EOIR. Thanks for watching, and if you
11:56 have any more questions about EOIR, feel
11:58 free to go to our website at www.agi.com