Tuesday, December 29, 2009

Measuring Systems Part 2 - Lasers and Cameras

Background

Continuing with our series on Baker's Mobile LiDAR system (See Part 1), we introduce you to the lasers and cameras.  We will provide a basic overview and the specifications of these sensors.  After all, the information captured with these four sensors is what you really want to see and utilize.

The images below depict an image captured with one of the onboard cameras and a colorized point cloud derived from the laser data and the picture (See our Bedford Springs post).



Lasers

Frequency & Returns
Our vehicle utilizes two Optech lasers - their LYNX Mobile Mapper™ system. The system allows us to collect at different rates (50 kHz, 100 kHz or 200 kHz) depending upon specific project requirements.
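
For the curious, here's a rough feel for what those pulse rates mean on the ground.  This is only a back-of-the-envelope sketch in Python; the 45 mph speed and single-laser view are illustrative assumptions, not project specifications:

    # Rough along-track density: pulses emitted per meter of travel by one laser,
    # ignoring the scanner geometry.  Numbers are illustrative only.
    MPH_TO_M_PER_S = 0.44704

    def pulses_per_meter(pulse_rate_hz, speed_mph):
        speed_m_s = speed_mph * MPH_TO_M_PER_S
        return pulse_rate_hz / speed_m_s

    for rate_hz in (50_000, 100_000, 200_000):
        print(f"{rate_hz // 1000} kHz at 45 mph: ~{pulses_per_meter(rate_hz, 45):,.0f} pulses per meter of travel")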

Building upon the technology developed for aerial systems and the need to penetrate tree canopies, the system is also capable of measuring multiple returns of the laser (up to 4). This ability allows for the capture of information behind hedges and through other brush. From the multiple returns, we're able to derive more complete "bare earth" elevation models.
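
To illustrate the idea, here's a minimal sketch of how multiple returns might be filtered for a bare-earth surface.  It simply keeps the last return of each pulse as a ground candidate; the point layout and values are hypothetical, and this is not our production classification workflow:

    # Keep only last-return points as bare-earth candidates.
    # Each point is (x, y, z, return_number, number_of_returns) - a hypothetical layout.
    points = [
        (10.0, 5.0, 102.4, 1, 3),   # first return: top of hedge
        (10.0, 5.0, 101.1, 2, 3),   # intermediate return: inside the brush
        (10.0, 5.0,  99.8, 3, 3),   # last return: likely ground
        (12.0, 5.0, 100.1, 1, 1),   # single return: open pavement
    ]

    ground_candidates = [p for p in points if p[3] == p[4]]   # last return of each pulse
    print(ground_candidates)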

Accuracy & Precision
Basically, precision is a measure of the repeatability of a measurement. The lasers have a defined precision of 7 mm.  Surveying total stations, by comparison, vary from 2 to 7 mm depending on manufacturer, type and target (reflectorless total stations are generally less precise when used that way).

Oftentimes, the term "accuracy" is thrown around indiscriminately, misinterpreted, misrepresented and, quite truthfully, exaggerated.  There are a number of ways to measure accuracy and a number of processes that either improve or reduce it.  In the most basic sense, we are realizing raw accuracies from the system itself of between 3 and 8 cm.  Using ground control and point cloud constraint measures, we are realizing accuracies in the neighborhood of 1.5 to 4 cm. *There are a number of dependencies, and each project is designed and performed to specifications to meet accuracy requirements.
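
For readers who like to see the distinction in numbers, precision boils down to a repeatability statistic while accuracy is an error statistic against independently surveyed control.  A minimal sketch with made-up values:

    import statistics, math

    # Precision: spread of repeated measurements of the same target (repeatability).
    repeated_z = [101.512, 101.518, 101.509, 101.515, 101.511]   # meters, made-up
    precision = statistics.stdev(repeated_z)

    # Accuracy: how close measurements come to independently surveyed control.
    measured = [101.514, 98.202, 105.871]
    control  = [101.500, 98.190, 105.850]
    rmse = math.sqrt(sum((m - c) ** 2 for m, c in zip(measured, control)) / len(measured))

    print(f"precision (std dev): {precision * 1000:.1f} mm")
    print(f"accuracy (RMSE):     {rmse * 1000:.1f} mm")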

Accuracy will be greatly expanded upon in a future blog post.  We will review what dilutes accuracy, what procedures we employ and how we establish ground control.

Safety
The light produced by the lasers is not visible and is eye safe at nadir.  Therefore, drivers in surrounding vehicles and bystanders are safe from both injury and distraction.

Cameras

We have two 5 mega-pixel cameras on our platform.  They can capture up to 3 frames per second depending, again, on project specifications and requirements.  We can position them forward of or behind the lasers.  They are also mounted in brackets that allow for a full range of orientation.  For a sign inventory, the passenger-side camera may be positioned forward of the laser facing in the direction of travel, while the driver-side camera may be positioned behind the laser facing backward.  For assessing pavement, the cameras would be located in the rear and pointed down.

From the images, we're able to develop a range of products, from colorized point clouds to movies and attributes.
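
Conceptually, colorizing a point cloud is a projection problem: each laser point is projected into the nearest-in-time photograph and picks up the RGB value of the pixel it lands on.  The sketch below shows the idea with a simple pinhole camera model; the intrinsics, pose and image size are placeholders, not our actual calibration:

    import numpy as np

    # Pinhole-camera colorization sketch.  K (intrinsics), R and t (pose) are placeholders.
    K = np.array([[2000.0,    0.0, 1296.0],
                  [   0.0, 2000.0,  972.0],
                  [   0.0,    0.0,    1.0]])
    R = np.eye(3)
    t = np.zeros(3)

    def colorize(points_xyz, image_rgb):
        """Return an RGB value per 3-D point by projecting it into the image."""
        colors = []
        height, width, _ = image_rgb.shape
        for p in points_xyz:
            cam = R @ (p - t)                     # world -> camera frame
            if cam[2] <= 0:                       # point is behind the camera
                colors.append((0, 0, 0))
                continue
            u, v, w = K @ cam
            col, row = int(u / w), int(v / w)
            if 0 <= row < height and 0 <= col < width:
                colors.append(tuple(image_rgb[row, col]))
            else:
                colors.append((0, 0, 0))          # point falls outside the frame
        return colors

    demo_points = [np.array([0.5, 0.2, 8.0])]                  # one made-up point
    demo_image = np.zeros((1944, 2592, 3), dtype=np.uint8)     # blank 5 MP frame
    print(colorize(demo_points, demo_image))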

The rapid reconfiguration of the sensors (both lasers and cameras) allows our operators to tailor a collection to a specific set of goals and products.




Thursday, December 17, 2009

Mississippi River Levee

Recently, we surveyed a few miles of the Mississippi River Levee south of Baton Rouge, LA using the Mobile LiDAR vehicle.  Going in, there were two things we wanted to learn from the collection.  Our primary concern was the ability to capture the toe of slope of the levee with the system - the reasoning behind raising the sensor platform higher than the luggage rack.  As you can see from the screen captures below, we didn't have any trouble with that test.



The second phase of testing dealt with camera settings. Since the sun angle changes rapidly with the vehicle in constant motion, our crew tested different collection techniques and settings in an attempt to achieve consistent brightness.  Our goal is to limit the patchwork effect that is slightly visible in the roadway of the image above.  With new software we've employed, we're then able to perform basic color matching to smooth out those regions.



Now that we have this information, we have elected to perform an assessment of "soft targets".  We are performing field surveying in support of this assessment - the results of which will be shared upon completion.

Monday, November 23, 2009

Measuring Systems Part 1 - Positioning

Background
In traditional Surveying & Mapping, points are determined by measuring an angle and distance to an object from an instrument over a known point (X, Y, Z or elevation).  We have all seen, at one point or another, a person at a total station locating a rod held by another field crew member - often while we zoom past at highway speeds. 

Now, with Mobile LiDAR, we are Surveying & Mapping at highway speeds.

In order to better understand how accurate measurements are collected from our vehicle, it is important to know the instruments onboard and the function of each.  As individual sensors, they do not provide an adequate solution.  But, when used as part of a system, they provide the foundation for accurate information. 

The purpose of this post is to provide a basic overview of our system.  Additional parts will cover the lasers, cameras and other components and processes which provide a complete solution.  If you have questions, please leave a comment.



Global Positioning System (GPS)
The primary method of location for the vehicle is GPS.  We utilize 2 onboard units to provide the position of the vehicle as well as the heading (having a known baseline, distance and direction, between antennas provides a measure of the direction of the vehicle).  The fundamental issue with GPS as a sole (hence I used primary) source of positioning is that we measure a position one time per second.  For those of you breaking out your calculator, that equates to 88 feet of travel between positions when moving at 60 mph.  In that second, our lasers could have measured 400,000 points (covered in Part 2, stay tuned).  Therefore, we rely on our second measurement instrument.
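
For those double-checking the math, here's the quick arithmetic behind those numbers:

    # Distance traveled between 1 Hz GPS fixes, and how many laser pulses fit in that gap.
    speed_mph = 60
    ft_per_s = speed_mph * 5280 / 3600      # 88 ft/s at 60 mph
    pulses_per_s = 2 * 200_000              # two lasers firing at 200 kHz each

    print(f"Gap between GPS fixes: {ft_per_s:.0f} ft")
    print(f"Laser measurements in that gap: {pulses_per_s:,}")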

Inertial Measurement Unit (IMU)
The IMU measures the attitude (roll, pitch and yaw) of the vehicle, much like an aircraft, at a rate of 200 times per second.  By measuring those changes in direction about the X, Y and Z axes, we are able to calculate the vehicle position at those increments.  Using interpolation, we are able to further refine intermediate positions.
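
As a toy illustration of the concept (the real navigation solution is a tightly coupled filter blending GPS, IMU and DMI data, not simple linear interpolation), here's how intermediate positions could be estimated between two 1 Hz GPS fixes at the IMU's 200 Hz epochs.  The coordinates are made up:

    # Toy linear interpolation of position between two 1 Hz GPS fixes,
    # evaluated at the IMU's 200 Hz epochs.  Coordinates are made up.
    gps_t0, pos_t0 = 0.0, (0.0,  0.0, 10.0)    # meters
    gps_t1, pos_t1 = 1.0, (26.8, 0.3, 10.1)    # about one second of travel at 60 mph

    def interpolate(t):
        f = (t - gps_t0) / (gps_t1 - gps_t0)
        return tuple(a + f * (b - a) for a, b in zip(pos_t0, pos_t1))

    imu_epochs = [i / 200 for i in range(201)]       # 200 Hz between the two fixes
    positions = [interpolate(t) for t in imu_epochs]
    print(positions[100])                            # position halfway between fixes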

Distance Measurement Instrument (DMI)
The often-overlooked member of the measurement family is the DMI.  Barely noticeable and not mounted on the "LiDAR Wing", the DMI has two distinct purposes: determining the distance traveled by measuring the revolutions of the wheel 1,024 times per second, and telling the system when the vehicle is stopped.  Since there is drift in GPS and the IMU, the DMI basically determines when the wheel stops revolving.  By measuring the diameter of the wheel and calculating circumference, we also know the distance traveled per partial revolution.
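
In code, that distance bookkeeping is as simple as it sounds.  The wheel diameter and revolution count below are made-up values for illustration:

    import math

    # Distance from wheel rotation: circumference times (whole + partial) revolutions.
    wheel_diameter_m = 0.80                     # measured at rest - made-up value
    circumference_m = math.pi * wheel_diameter_m

    revolutions = 1234.57                       # whole plus partial revolutions counted
    distance_m = revolutions * circumference_m
    print(f"Distance traveled: {distance_m:,.1f} m")
    # "Vehicle stopped" is simply the revolution count holding still between samples.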

For NASCAR fans:  since we measure the diameter of the wheel at rest, the circumference of the wheel is recalibrated while we're driving, whenever GPS provides a good solution.

For non-NASCAR fans: as we drive, the tire will begin to heat and build pressure, thereby increasing the tire circumference and impacting the distances measured by the DMI.
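
The recalibration itself is essentially a ratio: while GPS is tracking well, the GPS-derived distance over a stretch of road divided by the revolutions counted over that same stretch gives the effective, warm-tire circumference.  A sketch with made-up numbers:

    # Recalibrating the effective wheel circumference from a GPS-derived distance.
    gps_distance_m = 3105.4      # distance over a stretch with good GPS - made-up
    revolutions = 1230.0         # DMI revolutions counted over the same stretch

    calibrated_circumference_m = gps_distance_m / revolutions
    print(f"Effective circumference: {calibrated_circumference_m:.3f} m")
    # Slightly larger than the at-rest value once the tire heats up and builds pressure.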

In a nutshell... one system helps calibrate another system.  In a later post I'll cover what happens when we lose GPS!!! 

Comments are moderated, so if you have a question, comment or suggestion please let me know.

Thanks for following!

Target of Opportunity

Great Bridge Bridge
Virginia

Some time ago, we were in Chesapeake, Virginia performing a collection and demonstration.  While driving, we saw a target of opportunity - the Great Bridge Bridge - and collected roughly 10 seconds of data.  There was no purpose for the collection, but after processing I thought it would make an excellent example for showing point cloud samples - given the joint in the split drawbridge.



Above, I've rendered the point cloud two different ways.  On the right-hand side are points colorized by height/elevation only. There is nothing to differentiate one point from another as far as composition.  On the left-hand side I've included the intensity of the return in the colorization.  From the intensity, you can begin to differentiate the striping, road and materials. Just as in traditional surveying where measured points will have an associated description, we can begin to see how intensity is used in point classification - more to come on that subject.
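
For those curious how the two renderings differ under the hood, here's a minimal sketch of the two coloring rules as simple grayscale ramps.  The point fields and values are made up:

    # Two simple coloring rules for the same point cloud: by elevation vs. by intensity.
    # Each point is (x, y, z, intensity); the values are made up.
    points = [(0.0, 0.0, 4.2, 12), (1.0, 0.0, 4.3, 200), (2.0, 0.0, 9.8, 35)]

    z_vals = [p[2] for p in points]
    z_min, z_max = min(z_vals), max(z_vals)

    def color_by_height(p):
        """Map elevation onto a 0-255 ramp (shows shape, says nothing about material)."""
        return int(255 * (p[2] - z_min) / (z_max - z_min))

    def color_by_intensity(p):
        """Use the return intensity directly (paint striping pops out of the asphalt)."""
        return min(255, p[3])

    print([color_by_height(p) for p in points])
    print([color_by_intensity(p) for p in points])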

In addition, I've compiled a brief video of the bridge. Feel free to post comments and suggestions.


Tuesday, November 17, 2009

Subscribe to our Blog

First, I would like to welcome you to the new and improved blog.  I figured that after spending the money to invest in a high-tech piece of equipment, our blog should have a look and feel similar to our vehicle - not a default template.

I've been asked by several people how to go about following the blog without having to check the site daily.  Therefore, here are two simple ways to get updated postings.  Perhaps the easiest is to submit your email address in the upper right-hand corner of the home page. You will need to follow a few steps to receive emailed blog posts.

Alternatively, you can consume the blog posts directly in Microsoft Outlook.  First, right-click RSS Feed in the Mail Items; then click "Add a New RSS Feed"; finally, type in http://feeds.feedburner.com/bakermobilelidar and click Add.  The blog posts will appear similar to emails.

If you have any comments or recommendations for blog posts, please leave a comment.


Thursday, November 5, 2009

2009 Texas GIS Forum

The Presentations from the 2009 Texas GIS Forum are now available to download.  Check out Mobile LiDAR: Surveys at the Speed of Business presented by Stephen Clancy.  (Link to all other presentations)


(*Edited to remove dead hyperlinks)

Pipe Ladder Animation

A few weeks ago, our Mobile LiDAR crew collected several roads around a heavy industrial area - including refineries and a railyard.  Adjacent to and above our survey area was a complex labyrinth of pipes, fencing, railroad tracks and utilities. Making sense of all of the data could be quite daunting if looking at the entire collection. Utilizing animation, I've been able to illustrate how clipping the information can yield a new perspective. The video is a simple animation showing a pipe ladder that we drove under. All of the information depicted was captured in seconds. The photograph shows the pipe ladder as taken with one of the onboard cameras.
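
Clipping, in this context, just means keeping the points that fall inside a box of interest and hiding everything else.  A toy sketch with made-up points and box limits:

    # Keep only the points inside an axis-aligned box around the feature of interest.
    points = [(float(x), float(y), float(z)) for x in range(20) for y in range(20) for z in range(10)]
    box_min, box_max = (5.0, 5.0, 0.0), (10.0, 10.0, 8.0)

    clipped = [p for p in points
               if all(lo <= v <= hi for v, lo, hi in zip(p, box_min, box_max))]
    print(len(points), "->", len(clipped), "points")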

Note:  If you're not able to view YouTube videos and would like a copy,  leave a comment with your contact information.  Comments are moderated and your information will not be published to the board.

Wednesday, October 28, 2009

To Debunk or Not To Debunk - That is the Question

I sit here writing this blog today facing a difficult decision.  It's not a decision that I take lightly, and it's one that I would gladly take your advice on.

Shortly after installation of our system, we captured an area in Bedford County, Pennsylvania, little known to the outside world - especially to those who no longer take the time to get off the interstate and enjoy such roadside attractions as the Corn Palace, Wall Drug Store or Weeki Wachee Springs.  I am referring to Gravity Hill - a place described as:
"Cars roll uphill and water flows the wrong way. It's a place where gravity has gone haywire."
Since that day, I have avoided processing the collected data to keep from being the one to perhaps disprove what many have come to understand as a natural phenomenon which occurs in an obscure, but beautiful, part of the Pennsylvania hills.  Having grown up in Florida and traveled the state quite extensively, I am all too familiar with Spook Hill.  I can vaguely remember sitting in the back of my grandparents' Cadillac when my grandfather put the car in neutral at the base of the hill and we proceeded to roll backwards up the incline.  To say that I was a little spooked would be an understatement.

As a father, I look forward to these "less exciting" experiences and may allow the data to go unprocessed.  Still, there is a part of me that wants to develop a bare-earth model and see that water really does flow uphill.  I leave it to you, the readers: should this data be processed?

To find a "Gravity Hill" in your area, check out this list.  Enjoy!

Friday, October 9, 2009

Baton Rouge - No Rest for the Crew

Huey P. Long Bridge
Following our East Coast Road Show, the Mobile LiDAR vehicle arrived "home" in Baton Rouge.  Although we managed to put over 3,000 miles on the vehicle in 3 weeks, our crew made their way back out into the field for collection of the Huey P. Long Bridge over the Mississippi River.
The bridge carries 4 lanes of US 190 (Airline Highway) and one Kansas City Southern rail line.  Our Mobile LiDAR crew made two passes in each direction and various approaches to capture the underlying support structures, levee, railway and roadways.
LA DOTD Offices and I-110
After the collection of Huey P. Long Bridge, our crew collected the Louisiana Department of Transportation and Development office along I-110.  They made several passes of the office building and surrounding approaches to generate dense coverage.  After processing the trajectory, I created a Google Map of the drive.  If you click on the location point, you can watch a YouTube video of the collection.

Monday, October 5, 2009

Atlanta - Spaghetti Interchange

In order to minimize traffic in the collection, our crew worked through the early morning hours to capture almost 20 different approaches to the I-85/I-285 interchange in Northeast Atlanta.  Called the "Spaghetti Interchange" by locals, the data capture for the trajectory shown to the right took approximately 2.5 hours.  The Smoothed Best Estimate of Trajectory (SBET) provides a continuous position and orientation of the vehicle to process with the laser range data to calculate X, Y and Z coordinates of all measured points (point cloud) - the minimum type of processing performed.
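
In its simplest terms, that minimum processing step is direct georeferencing: for every pulse, look up the vehicle's position and orientation in the SBET at the pulse time and push the measured range out along the beam direction.  The sketch below shows the bare-bones geometry only; the coordinates, rotation and lever arm are placeholders, and the real computation includes boresight calibration and other corrections:

    import numpy as np

    # Bare-bones direct georeferencing: SBET pose + laser range -> ground coordinate.
    def georeference(sbet_position, sbet_rotation, beam_direction_sensor, range_m,
                     lever_arm=np.zeros(3)):
        """sbet_position: vehicle XYZ; sbet_rotation: 3x3 body-to-world rotation;
        beam_direction_sensor: unit vector of the pulse in the sensor frame."""
        point_sensor = beam_direction_sensor * range_m        # pulse endpoint, sensor frame
        return sbet_position + sbet_rotation @ (lever_arm + point_sensor)

    pose_xyz = np.array([740123.4, 3750456.7, 285.2])         # made-up coordinates
    pose_rot = np.eye(3)                                      # level vehicle, no rotation
    beam = np.array([0.0, 0.7071, -0.7071])                   # pulse aimed 45 degrees down
    print(georeference(pose_xyz, pose_rot, beam, 12.5))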


The image above shows intensity merged with colorization by height (blues are lower than reds).  The support structures, decking, retaining walls, barrier walls, and other roadway features were accurately captured with our vehicle traveling at 50 - 60 mph.

Friday, September 18, 2009

Pittsburgh Airport Data Collection



Just days before the G20 Summit, Baker gained access to Pittsburgh Airport for an hour and a half.  Using an escort vehicle, which maintained constant communication with Air Traffic Control, we surveyed two runways, several taxiways, approach lighting and the entire terminal.  Our collection was highly successful.  Due to the unobstructed sky, our GPS positioning yielded consistent accuracies of 8 - 12 mm for our trajectory utilizing a single base station.

To the left is an image showing intensity of runway 28L.  There are two interesting components of this image.  First, the bright dots to the left and right are field lights - their colored plastic lenses yield higher intensity returns than their metal cases.  Secondly, the rubber laid down by landing aircraft is clearly visible due, once again, to the intensity difference relative to the surrounding asphalt and white, reflective striping.

Wednesday, September 16, 2009

First Assignment - Bedford Springs



Shortly after the installation and initial testing, we presented the vehicle to Baker's management at their Fall Manager's Meeting.  Shown above is a colorized point cloud of data captured in under 20 minutes.  Colorizing a point cloud assigns a red, green and blue value to each measured point from the pictures taken with the onboard cameras.  The image shows a decimated point cloud - where only 50% of the points are rendered.
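
Decimation is as straightforward as it sounds - render only a subset of the points.  A toy sketch of a 50% decimation:

    # 50% decimation: render every other point of a toy point cloud.
    full_point_cloud = [(i * 0.1, 0.0, 100.0 + i * 0.01) for i in range(10)]
    decimated = full_point_cloud[::2]
    print(len(full_point_cloud), "->", len(decimated), "points rendered")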



During collection, we also captured portions of US 220.  The image above is not a black and white photograph.  It is an image composed of millions of points rendered by the intensity of the light returned from the laser.  The red lines denote a cross-section and the graph shows the profile.  Note the superelevation of the roadway as the highway curves to the right.  Nothing special was done to derive these images.
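
Extracting a profile like that amounts to grabbing the points within a thin strip along the cross-section line and ordering them by offset.  A toy sketch with made-up points:

    # Toy cross-section: keep points within +/- 0.25 m of a section line at y = 50,
    # then sort by offset (x) to form an elevation profile.  Points are made up.
    points = [(x * 0.5, 49.9 + (x % 3) * 0.1, 100.0 + 0.02 * x) for x in range(40)]

    section_y, half_width = 50.0, 0.25
    strip = [p for p in points if abs(p[1] - section_y) <= half_width]
    profile = sorted((p[0], p[2]) for p in strip)     # (offset along section, elevation)
    print(profile[:5])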

Saturday, September 12, 2009

Baker Mobile LiDAR - Day 1 System Install

We decided on a Chevy Suburban 2500 (3/4 ton - 4x4) for our first Mobile LiDAR vehicle. A support structure was built to raise the sensors an additional 1.5' for better line of sight. We had custom drawers fabricated for our GPS and survey equipment. The electrical system of the vehicle was also upgraded to provide consistent, reliable power to the system.

The installation took place at our Beaver, PA office on September 12th. We had 4 Baker staff assist with the install: Aaron Morris - Program Manager, Stephen Clancy - Vehicle Manager, Justin Thornton - Vehicle Operator and Mark Anderson - Processing Technician. The team made quick work of the installation and stocking of the vehicle with supplies. Following the installation, we performed calibration drives and began testing.