Phased Array Radar Isosurface

This is all-around cool stuff ... it brings together a lot of the different research my group is doing. The biggest research effort at the lab right now is modifying a Navy SPY-1 phased array radar so that it can be used for weather detection. The fact that we could collect this data is itself the product of a lot of new and innovative stuff.

Then ... hand-editing functionality to dealias the velocity data from the phased array radar, because automated methods don't work well on it just yet. We may eventually use the human-edited data to develop a good automated dealiasing technique.
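For readers unfamiliar with the problem: Doppler velocities fold into the interval [-Vn, +Vn] (the Nyquist velocity), so winds stronger than Vn "wrap around" and have to be unfolded. Here's a minimal sketch of the idea -- a toy gate-to-gate unfolder along one radial, not the lab's actual tool:

```python
# Toy gate-to-gate dealiasing along a single radial (illustrative only).
# Measured velocities fold into [-v_nyquist, +v_nyquist]; when consecutive
# gates jump by more than v_nyquist, we assume a fold occurred and shift
# the value by multiples of 2*v_nyquist until it is continuous again.

def dealias_radial(velocities, v_nyquist):
    """Unfold one radial of aliased Doppler velocities (m/s)."""
    unfolded = [velocities[0]]
    for v in velocities[1:]:
        # Shift v by 2*Vn until it sits within Vn of the previous gate.
        while v - unfolded[-1] > v_nyquist:
            v -= 2 * v_nyquist
        while v - unfolded[-1] < -v_nyquist:
            v += 2 * v_nyquist
        unfolded.append(v)
    return unfolded
```

Real data is far messier (noise, gaps, isolated folded regions), which is exactly why the hand editing is still needed.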

Then, an automated algorithm (called LLSD: local least squares derivative) to identify areas of rotation based on the velocity data.
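The gist of an LLSD-style estimate (my own sketch of the general technique, not the lab's implementation): around each gate, fit a plane to the velocities in a small neighborhood by least squares, and take the derivative along the azimuthal direction as the shear -- large azimuthal shear flags rotation.

```python
# Sketch of an LLSD-style azimuthal shear estimate (illustrative, not the
# operational algorithm). For each gate we fit v ~ a + b*dr + c*ds over a
# 3x3 neighborhood, where dr is the radial offset and ds the azimuthal arc
# length; the coefficient c is the azimuthal shear (1/s).

import numpy as np

def llsd_azimuthal_shear(v, ranges_m, azimuths_rad):
    """v: 2D array [azimuth, range] of dealiased velocities (m/s)."""
    n_az, n_rng = v.shape
    shear = np.full(v.shape, np.nan)
    for i in range(1, n_az - 1):
        for j in range(1, n_rng - 1):
            rows, cols = np.mgrid[i - 1:i + 2, j - 1:j + 2]
            dr = ranges_m[cols] - ranges_m[j]                          # radial offset (m)
            ds = (azimuths_rad[rows] - azimuths_rad[i]) * ranges_m[j]  # azimuthal arc (m)
            A = np.column_stack([np.ones(9), dr.ravel(), ds.ravel()])
            coeffs, *_ = np.linalg.lstsq(A, v[rows, cols].ravel(), rcond=None)
            shear[i, j] = coeffs[2]  # dv/ds: azimuthal shear
    return shear
```

The least-squares fit is what makes the estimate robust to single-gate noise, compared with simple finite differencing between adjacent radials.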

And finally (the newest thing in this list): a multi-product, multi-source, interactive isosurfacing capability.

Put all of this work together, and you get this great picture:
The red is the hail core of the storm. The blue/teal shows where the air is rotating (i.e. the tornado), and the grey/black shows the entire thunderstorm structure.

What's my role in all of this work? Only on the periphery of all of this ... I developed statistical methods to verify the calibration of the radar, and techniques to correctly assemble the data as it is adaptively scanned. Also, I designed the system -- WDSS-II -- within which much of this work is done. But nothing sexy, like the isosurface picture!


  1. This thing truly swept me off my feet (no pun intended)

  2. Very exciting stuff. Makes me miss being in Norman. However, I don't envy the hand editing to dealias the velocity data. In my early days as a student, I did plenty of that for 88D data. Maybe it's easier and faster now with better tools, but you've got lots more data, right?

  3. Slightly better tools, but it's still tedious work. Good thing we have enthusiastic students, even if they will complain about it 10 years from now!