RT-Range systems are used to make relative measurements between an RT inertial navigation system and a number of other Targets.
The RT of concern is referred to as the Hunter (Vehicle Under Test) and all other points of concern are called Targets.
Targets can be designated as Mobile Targets (other RT systems), Fixed Points (stationary Targets) or a set of Feature Points (lanes, traffic signs, etc.).
The Range system calculates the distance (range) to the Target and resolves this into forward and lateral measurements in the Hunter vehicle's coordinate frame.
Let's take a look at an example with a Hunter and a mobile Target. The Hunter has had its sensor point set to the front of the vehicle and the bull's eye has been configured at the rear of the Target.
In this example we are measuring between the sensor point on the Hunter and the bull's eye on the Target. As all of the measurements are resolved into the Hunter's frame, alterations to the orientation of the Target will not affect the measurements, provided the relative positions of the bull's eye and sensor point do not change.
A change in the orientation of the Hunter, however, will affect all of the measurements reported by the Range system, as shown below.
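The decomposition into forward and lateral components can be sketched as a simple rotation of the Hunter-to-Target offset by the Hunter's heading. The following is a minimal illustration only, assuming a north/east offset and a heading measured clockwise from north; the function name and sign conventions are assumptions, not the RT-Range implementation:

```python
import math

def decompose_range(d_north: float, d_east: float, heading_deg: float):
    """Resolve a north/east offset from Hunter to Target into the
    Hunter's frame. heading_deg is the Hunter's heading, assumed to be
    measured clockwise from north (an illustrative convention)."""
    psi = math.radians(heading_deg)
    forward = d_north * math.cos(psi) + d_east * math.sin(psi)
    # Lateral is positive to the Hunter's right under this convention.
    lateral = -d_north * math.sin(psi) + d_east * math.cos(psi)
    return forward, lateral
```

With this convention a Target directly ahead gives a purely forward range, and rotating the Hunter shifts range between the forward and lateral components even though the absolute positions have not moved.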
Measurements made by the Range system are made in the level frame, under the assumption that vertical measurements are negligible. For more information on coordinate frames see this article. Because this assumption is used for measurements, it can also be very useful to define a local coordinate system for the test area. This can be thought of as a plane set tangent to the Earth at a user-defined origin point; the user also defines the orientation of the x-axis. The diagrams below show similar scenarios to the previous three, with the inclusion of a local coordinate system and some of the measurements available. Note that in all three scenarios the local coordinate positions of the sensor point and bull's eye do not change.
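As an illustration of the tangent-plane idea, the sketch below projects a latitude/longitude onto a local x/y frame about a user-defined origin, with the x-axis rotated to a user-chosen heading. This is a small-area flat-earth approximation for illustration only; the constant and conventions are assumptions, not the Range system's actual projection:

```python
import math

R_EARTH = 6378137.0  # WGS-84 semi-major axis, metres

def to_local(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg,
             x_axis_heading_deg=0.0):
    """Project a lat/lon onto a plane tangent at a user-defined origin,
    then rotate so the x-axis points along x_axis_heading_deg
    (assumed clockwise from north). Flat-earth approximation,
    valid only over a small test area."""
    d_north = math.radians(lat_deg - origin_lat_deg) * R_EARTH
    d_east = (math.radians(lon_deg - origin_lon_deg) * R_EARTH
              * math.cos(math.radians(origin_lat_deg)))
    psi = math.radians(x_axis_heading_deg)
    x = d_north * math.cos(psi) + d_east * math.sin(psi)
    y = -d_north * math.sin(psi) + d_east * math.cos(psi)
    return x, y
```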
So far we have looked at basic range (distance) measurements. Relative velocities and accelerations are calculated in the same way: the Range system looks at the two closest points and then decomposes the result into the Hunter's coordinate frame.
As measurements are resolved into the Hunter's frame, it is of great importance that the heading measurement in the Hunter vehicle is as accurate and robust as possible. Please see this guide on optimising the accuracy of the RT. The diagrams below show how a misalignment in the heading of the RT in the Hunter vehicle can cause perceived measurement errors in both lateral and longitudinal measurements.
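The size of the perceived error can be estimated with a little trigonometry: a heading error rotates the true range vector, so a purely forward range r appears to have a lateral component of roughly r·sin(error) and to read short in the forward direction by r·(1 − cos(error)). A hypothetical sketch:

```python
import math

def heading_error_effect(range_m, heading_error_deg):
    """Perceived measurement errors caused by a heading misalignment
    in the Hunter's RT, for a Target that is truly straight ahead.
    Illustrative geometry only, not the RT-Range error model."""
    e = math.radians(heading_error_deg)
    lateral_error = range_m * math.sin(e)          # spurious lateral offset
    forward_error = range_m * (1.0 - math.cos(e))  # forward reads short by this
    return forward_error, lateral_error
```

Even a 0.5 degree heading error at a 50 m range produces a lateral error of roughly 0.44 m, which is why heading accuracy matters so much here.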
Range systems can use measurement origins other than simple sensor points and bull's eyes. It is possible to define polygons around the Hunter, mobile Targets and fixed points. If a polygon is designated, the measurement is taken along the shortest distance between the point and the polygon's edge, or between the edges of the two polygons. The example below shows a Hunter vehicle configured with a sensor point and a Target vehicle with a simple polygon defined. The resultant range (red arrow) is calculated and resolved into forward and lateral measurements based on the Hunter's heading, as shown earlier.
As the Hunter approaches the Target, the measurement is taken at a different position along the rear edge of the Target's polygon.
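The shortest point-to-polygon distance described above can be sketched by checking every edge of the polygon and keeping the minimum. The following is a simplified 2D illustration, not the RT-Range implementation:

```python
import math

def point_segment_dist(px, py, ax, ay, bx, by):
    """Shortest distance from point P to the segment AB."""
    abx, aby = bx - ax, by - ay
    apx, apy = px - ax, py - ay
    ab2 = abx * abx + aby * aby
    # Clamp the projection of P onto AB to the segment's extent.
    t = 0.0 if ab2 == 0 else max(0.0, min(1.0, (apx * abx + apy * aby) / ab2))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def point_polygon_dist(px, py, polygon):
    """Shortest distance from a sensor point to any edge of a closed
    polygon given as a list of (x, y) vertices."""
    n = len(polygon)
    return min(
        point_segment_dist(px, py, *polygon[i], *polygon[(i + 1) % n])
        for i in range(n)
    )
```

As the sensor point moves, the closest edge (and the closest position along it) changes automatically, which matches the behaviour described above.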
Let's take this idea further into a scenario with multiple polygons. The image below shows a typical park assist test with three Fixed Points and a Hunter, all of which have had simple polygons defined. FP1 and FP2 represent two parked vehicles, whereas FP3 represents a kerb. Note that in reality these could be actual vehicles and a kerb, or just cones, other markings, or nothing there at all.
As the Hunter vehicle progresses through the manoeuvre, the shortest measurement to each polygon changes and is updated appropriately. All of these are then decomposed into forward and lateral components in the Hunter vehicle's frame as before.
Polygons are defined using a series of up to twenty-four points; all of the examples above show only simple four-point polygons. The screen capture below shows the parking assist scenario as viewed in real time in the RT-Range software. Note the complexity of the various polygons.
How closely the polygons represent the real vehicles under test depends on how the user configures them. It is possible to configure a polygon in such a way that part of the actual vehicle perimeter lies outside of the polygon perimeter, so caution should be taken when configuring polygons.
If multiple Targets are set up with polygons, the Range system can calculate Target visibility. This can be very useful for testing with emerging vehicles and pedestrians. In the diagram below, T1 and T2 are both mobile Targets with defined polygons, and the Hunter has a sensor point set to the front of the vehicle. Initially the Range system will report the visibility of both Targets as 100%.
As T2 moves across to obscure T1, the visibility of T1 will change accordingly while T2's visibility remains 100%. In the diagram below, T1's visibility will be reported as 40% and T2's as 100%.
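One way to picture the visibility calculation is to sample points along the obscured Target's facing edge and test whether the line of sight from the sensor point crosses the occluding polygon. The sketch below is a simplified 2D ray-sampling illustration; the method actually used by the Range system may differ:

```python
def _ccw(a, b, c):
    """Signed area test: positive if a, b, c turn counter-clockwise."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, q1, q2):
    """True if segments p1-p2 and q1-q2 strictly cross."""
    d1, d2 = _ccw(q1, q2, p1), _ccw(q1, q2, p2)
    d3, d4 = _ccw(p1, p2, q1), _ccw(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def visibility(sensor, target_edge, occluder_poly, samples=100):
    """Percentage of sample points along target_edge (a pair of (x, y)
    endpoints) visible from sensor, i.e. whose line of sight does not
    cross any edge of occluder_poly."""
    (ax, ay), (bx, by) = target_edge
    n = len(occluder_poly)
    visible = 0
    for i in range(samples):
        t = i / (samples - 1)
        p = (ax + t * (bx - ax), ay + t * (by - ay))
        blocked = any(
            segments_intersect(sensor, p,
                               occluder_poly[j], occluder_poly[(j + 1) % n])
            for j in range(n)
        )
        if not blocked:
            visible += 1
    return 100.0 * visible / samples
```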
Time to Collision
The Range system also reports the time to collision (TTC) when the Hunter is closing on a Target and there is a valid solution, i.e. a collision would eventually occur. Before getting into how this is calculated, it is worthwhile defining some nomenclature.
As shown earlier, Range calculations are carried out in the Hunter vehicle’s frame of reference, i.e. forward and lateral with reference to the Hunter, regardless of the orientation of the Target. The resultant distance between the Hunter vehicle and Target vehicle can be calculated from their instantaneous positions. This can then be decomposed into a lateral and forward component relative to the Hunter.
When calculating the relative velocities of the Hunter and Target vehicle we look at the lateral and forward velocity differences between the Target and the Hunter, arranged such that when the Hunter is gaining on the Target these values are negative. The resultant velocity is given by the change in distance divided by the change in time for a given time step.
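Under this sign convention, a finite-difference sketch of the relative velocity (illustrative only) is simply:

```python
def relative_velocity(d_prev: float, d_curr: float, dt: float) -> float:
    """Finite-difference relative velocity from two successive range
    components (forward or lateral) over time step dt. Negative when
    the Hunter is gaining on the Target, as the gap is shrinking."""
    return (d_curr - d_prev) / dt
```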
The basic formulae for calculating TTC are provided below. These deal with each direction, forward and lateral, independently. For most use cases the final TTC used will be the maximum of the forward and lateral TTC values, although in some cases users may only be concerned with the forward TTC.
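With negligible relative acceleration the gap closes linearly, so the TTC in each direction is the distance divided by the magnitude of the (negative) closing velocity. A minimal sketch, assuming the sign convention above:

```python
def ttc_no_accel(distance: float, rel_velocity: float):
    """TTC with negligible relative acceleration. rel_velocity is
    negative when the Hunter is closing on the Target, so a valid
    solution requires rel_velocity < 0."""
    if rel_velocity >= 0:
        return None  # not closing: no collision predicted
    return -distance / rel_velocity
```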
If either the forward or lateral relative acceleration is greater than 0.5 m/s², its effect ought to be considered. The equations below provide the solution for both lateral and forward cases. As with the TTC calculations without acceleration, the final value used will likely be the maximum of the two for most users. Please note that it may only be necessary to use one of these solution methods alongside the without-acceleration equation for the other direction, e.g. the forward solution with acceleration together with the lateral solution without, or vice versa.
As these equations are quadratic they have a range of real and imaginary solutions; however, we are only concerned with certain real solutions. The general form of the equation is provided below.
If the discriminant (b² − 4ac) is less than zero there are no real solutions for TTC.
If the discriminant equals zero then there is only one (repeated) solution for TTC.
If the discriminant is greater than zero then there are two solutions for TTC (t1 and t2).
If t1 > 0 and t2 > 0 then the smaller of the two is the valid solution; if only t1 > 0 then t1 is the valid solution; if only t2 > 0 then t2 is the valid solution; if neither root is positive then there is no valid solution and no TTC is reported.
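Putting the pieces together, a sketch of the with-acceleration case solves 0.5·a·t² + v·t + d = 0 for the time at which the gap closes and then applies root selection, discarding non-positive roots and taking the earlier collision when both roots are positive. The 0.5 m/s² threshold and the fallback to the linear solution follow the text; everything else is an illustrative assumption:

```python
import math

def ttc_with_accel(distance: float, rel_velocity: float, rel_accel: float):
    """TTC accounting for relative acceleration. Solves the quadratic
    0.5*a*t^2 + v*t + d = 0 and keeps the smallest positive root."""
    if abs(rel_accel) < 0.5:
        # Acceleration negligible: fall back to the linear solution.
        return -distance / rel_velocity if rel_velocity < 0 else None
    a, b, c = 0.5 * rel_accel, rel_velocity, distance
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # no real solution: no collision predicted
    sq = math.sqrt(disc)
    t1, t2 = (-b - sq) / (2 * a), (-b + sq) / (2 * a)
    positive = [t for t in (t1, t2) if t > 0]
    return min(positive) if positive else None
```

For example, a Hunter 10 m behind a Target and closing at 5 m/s, but decelerating relative to it at 2 m/s², gives a negative discriminant: the closing speed reaches zero before contact and no TTC is reported.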
This article has touched on working with mobile Targets and fixed points, using polygons, sensor points and bull's eyes for measurements. The Range system can also target a series of Feature Points. These are configured using a Feature Point file: a list of data points, each with a latitude, longitude, altitude and heading. A field of view can be set up at the Hunter; this is separate from the sensor point field of view mentioned above. As the Hunter navigates through the test area, the Range system will report the range to the closest of the Feature Points defined within the file. The diagrams below show the Hunter moving forward through the points, and the highlighted (circled in red) point is the one for which relative measurements are reported.
Feature Point files are currently limited to 65,535 points, with a maximum range of 500 metres. As up to four Targets can be designated in a Range system, the user could load the same Feature Point file into all four Targets and have the measurements to the four closest points reported, rather than just the closest one.
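The idea of reporting the closest Feature Point, or the N closest by loading the same file into several Targets, can be sketched as a simple nearest-neighbour search in the local frame. Names and parameters here are illustrative assumptions:

```python
import math

def closest_feature_points(hunter_xy, feature_points,
                           max_range=500.0, count=1):
    """Return up to `count` feature points (as local x, y pairs) within
    max_range of the Hunter, nearest first. Loading the same file into
    several Targets corresponds to raising `count`."""
    hx, hy = hunter_xy
    in_range = [
        (math.hypot(x - hx, y - hy), (x, y))
        for x, y in feature_points
        if math.hypot(x - hx, y - hy) <= max_range
    ]
    in_range.sort(key=lambda item: item[0])
    return [p for _, p in in_range[:count]]
```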
Lanes can be defined in a very similar way to Feature Points, using a series of data points, each with a latitude, longitude, altitude and heading. Individual lanes can be combined into a map file containing several lanes. Three defining points, A, B and C, are specified at the Hunter, and measurements from all of these points are available to the lanes surrounding the Hunter.