Introduction

The goal of this file is to provide an example of rotation estimation using the singular value decomposition (SVD) on a raw point-cloud, i.e., without any feature extraction.

In this case we will deal with fully simulated data, so that the ground truth of each experiment is known exactly. For this reason, as will be shown, rooms are modelled as sets of closed polygons; the point-cloud is generated by a custom-built ray-cast function that checks the collision of a ray with a given wall of the room. Later we will show how to build the virtual environments to simulate and how to set the properties of the robot (position and uncertainties).

Finally, some numerical examples will be carried out using the SVD as attitude estimator and an exhaustive comparison will be presented.

library(matlib)
## Warning: package 'matlib' was built under R version 4.2.2
library(data.table)

Preliminary function definitions and coding

Raycasting

As a first thing, we would like to see whether a cast ray intersects a wall and, if so, at which distance. We consider a wall as characterized by just its two extremal points \(P_1 = (x_1, y_1)\) and \(P_2 = (x_2, y_2)\). Assuming that the ray starts from a point \(O = (O_x, O_y)\) with direction \(v = (v_x, v_y)\), the intersection between the wall and the cast ray is described by the following equation \[ P_1 + t(P_2 -P_1) = O + d v \] leading to the set of linear equations \[ \begin{cases} x_1 + t x_2 - t x_1 = O_x + d v_x \\ y_1 + t y_2 - t y_1 = O_y + d v_y \end{cases} \] that can be written in the following matrix form in the unknowns \(t, d\): \[ \begin{bmatrix} x_2 - x_1 & - v_x \\ y_2 - y_1 & -v_y \end{bmatrix} \begin{pmatrix} t \\ d \end{pmatrix} = \begin{pmatrix} O_x - x_1 \\ O_y - y_1 \end{pmatrix} \] Unless the system is singular (i.e., the cast ray is parallel to the wall), we have a solution \(t^\star, d^\star\) that is physically meaningful if and only if \(t^\star \in [0,1]\) and \(d^\star \geq 0\): only in this case the intersection lies between \(P_1\) and \(P_2\) and in front of the ray origin. In all other cases the function returns \(-1\) by default (no intersection occurred).

ray_distance <- function(O, v, P1, P2){

  A   <- cbind(P2-P1, -v)
  b   <- O - P1
  
  if ( abs(det(A)) <= 1e-6 ) # singular case
    return(-1)  
  
  res <- solve(A, b)
  t   <- res[1]
  d   <- res[2]

  if ( (t<0) | (t>1) | (d<0) ) # if intersection is outside of the wall or is "behind"
    return(-1)

  return(d)
}
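
As a quick sanity check, the snippet below (with illustrative values, not taken from the original document) casts a ray from the origin along the x axis against a vertical wall placed at x = 3:

# Quick check of ray_distance with made-up values
O  <- c(0, 0)
v  <- c(1, 0)
P1 <- c(3, -1)
P2 <- c(3,  1)
ray_distance(O, v, P1, P2)    # expected: 3 (the wall is hit 3 units away)
ray_distance(O, -v, P1, P2)   # expected: -1 (the wall is behind the ray origin)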

Definition of robot and environment

We characterize a robot structure as an object containing the characteristics of the LIDAR measurement system, namely:

  • angle_uncertainty: standard deviation of the bearing at which each ray is cast; rays are in fact not sent at exactly the nominal angle, since this is not feasible in practice;
  • lidar_fov: the field of view of the measurement system;
  • lidar_N: number of samples taken equispaced in the whole LIDAR’s field of view;
  • lidar_sigma: standard deviation of the range (polar) measurement.

Concerning the state of the robot, it is required to set:

  • heading: orientation of the robot (in degrees) w.r.t. the \(x\) axis;
  • pos: a 2D vector containing the cartesian position of the robot.
robot                     <- list()                    
robot$angle_uncertainty  <- 0.5       # in degrees
robot$lidar_fov          <- 360       # in degrees
robot$lidar_N            <- 100       # number of scanned points
robot$lidar_sigma        <- 0.1       # standard deviation of lidar measurement
robot$heading            <- 10        # heading of the robot w.r.t ground, in degrees
robot$pos                <- c(0, 0)   # position of the robot w.r.t. ground

# For drawing purposes, the following function is used to plot the robot in the cartesian space
draw_robot <- function(robot, fillcol='White'){
  main_dir <- direction_vector(rad(robot$heading))
  sec_dir  <- direction_vector(rad(robot$heading + 90))
  PP       <- cbind(
    2/3*main_dir, 
    -1/3*main_dir+1/4*sec_dir, 
    -1/3*main_dir-1/4*sec_dir
    )
  return(polygon(robot$pos[1] + PP[1,], robot$pos[2] + PP[2,], col = fillcol))
}
robot_ref         <- list()   # This robot is the reference one to plot the scanned pointcloud
robot_ref$pos     <- c(0, 0)
robot_ref$heading <- 0

An environment, instead, contains just the positions of the walls, which are represented as a list of closed polygons.

environment <- list()           # information of the environment
poly_1 <- matrix(c(             # first polygon
    -4,   -4,
    -4,   4,
    4,    4,
    4,    -4,
    -4,   -4                    # Note: the last element is equal to the first to close the polygon
  ), ncol = 2, byrow = TRUE)
poly_2 <- matrix(c(             # second polygon
    1,    1,
    1,    3,
    3,    3,
    3,    1,
    1,    1
  ), ncol = 2, byrow = TRUE)
environment$poly_list <- list(poly_1, poly_2)

# This function is used to automatically render the true environment layout 
draw_room <- function(env){

  p = env$poly_list[[1]]
  plot(p[,1], p[,2], type='l', asp=1, xlab='', ylab='', main='room layout')

  if(length(env$poly_list) > 1){
    for(i in 2:length(env$poly_list)){

      p = env$poly_list[[i]]
      lines(p[,1], p[,2])

    }
  }
}

Computation of the point-cloud

Firstly, we define some functions that will be useful throughout the whole file.

# Compute the unit vector with heading alpha (in radians) with respect to the x axis
direction_vector <- function(alpha){
  return(c(cos(alpha), sin(alpha)))
}

# Function that converts from degrees to radians
rad <- function(angle){
  return(angle * pi / 180)
}


# Function that converts from radians to degrees
deg <- function(angle){
  return(angle * 180 / pi)
}
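
As a quick check (illustrative, not part of the original pipeline), the helpers behave as expected:

rad(180)                     # pi
deg(pi / 2)                  # 90
direction_vector(rad(90))    # approximately c(0, 1)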

With all that has been defined so far, we are now able to generate the pointcloud given a robot and an environment object against which the laser rays are cast.

compute_pointcloud <- function(robot, env){
  
  PC <- list()  # pointcloud object

  # angles at which the robot casts its rays
  # the two cases are dealt with separately: a "partial" fov scan vs. a full 360° view
  # in the latter case we must avoid that the last ray overlaps with the first one
  if(robot$lidar_fov < 360) 
    PC$angles_true <- seq(
      -rad(robot$lidar_fov / 2), rad(robot$lidar_fov / 2), 
      length.out = robot$lidar_N
      )
  else 
    PC$angles_true <- seq(
      -rad(robot$lidar_fov)/2, rad(robot$lidar_fov)/2, 
      length.out = robot$lidar_N + 1
      )[1:robot$lidar_N]

  # add some noise to the true angles
  PC$angles_noisy <- PC$angles_true +
    rnorm(robot$lidar_N, sd = rad(robot$angle_uncertainty))

  PC$polar_true <- rep(1000, robot$lidar_N) # initialize memory for polar distance measurements
  PC$polar_meas <- rep(1000, robot$lidar_N)

  # computation of the true pointcloud
  for(poly in env$poly_list){ # for each polygon in the environment
    
    for(i_angle in 1:robot$lidar_N){

      angle <- PC$angles_true[i_angle]                      # abs. angle of the ray
      v_dir <- direction_vector(angle + rad(robot$heading)) # and respective unitary vector headed w.r.t. the robot
      
      for(i_edge in 1:(dim(poly)[1]-1)){ # for each segment in the polygon
        
        # extract the point from the polygon matrix and compute the distance
        P1 <- unlist(as.list(poly[i_edge,  ]))
        P2 <- unlist(as.list(poly[i_edge+1,]))
        d  <- ray_distance(
          robot$pos, v_dir,
          P1, P2  
          )
        
        if(d>0 & d<PC$polar_true[i_angle]) # replace distance only if we see a closer wall
          PC$polar_true[i_angle] <- d

      }
    }
  }
  
  # computation of the noisy pointcloud
  for(poly in env$poly_list){

    for(i_angle in 1:robot$lidar_N){

      angle <- PC$angles_noisy[i_angle]
      v_dir <- direction_vector(angle + rad(robot$heading))

      for(i_edge in 1:(dim(poly)[1]-1)){
        
        P1 <- unlist(as.list(poly[i_edge,  ]))
        P2 <- unlist(as.list(poly[i_edge+1,]))
        d  <- ray_distance(
          robot$pos, v_dir,
          P1, P2  
          )
        
        if(d>0 & d<PC$polar_meas[i_angle])
          PC$polar_meas[i_angle] <- d

      }
    }
  }
  # add some noise on the polar measurements
  PC$polar_meas <- PC$polar_meas + rnorm(robot$lidar_N, sd = robot$lidar_sigma)

  # conversion of the pointcloud into cartesian coordinates with respect to the robot:
  # we generate a matrix with 2 rows, one for the "x" components and one for the "y" components
  PC$X_true <- matrix(rep(0, 2*robot$lidar_N), nrow = 2)
  PC$X_meas <- matrix(rep(0, 2*robot$lidar_N), nrow = 2)
  
  for(i in 1:robot$lidar_N){ # conversion to cartesian coordinates (nominal angles paired with measured ranges)
    v             <- direction_vector(PC$angles_true[i])
    PC$X_true[,i] <- v * PC$polar_true[i]
    PC$X_meas[,i] <- v * PC$polar_meas[i]
  }

  # extraction of the single components
  PC$x_true <- PC$X_true[1,]
  PC$y_true <- PC$X_true[2,]
  PC$x_meas <- PC$X_meas[1,]
  PC$y_meas <- PC$X_meas[2,]

  return(PC)
}

The following code shows the result of the functions defined so far:

# generation of the pointcloud
pointcloud <- compute_pointcloud(robot, environment)

par(mfrow=c(1,3))
# draw the room and the robot
draw_room(environment)          
draw_robot(robot_ref, 'Green')

# draw the true laserscan
plot(pointcloud$x_true, pointcloud$y_true, asp=1, pch = 20, xlab='', ylab='', main='true p.c.')
draw_robot(robot_ref, 'white')

# draw the measured laserscan
plot(pointcloud$x_meas, pointcloud$y_meas, asp=1, pch = 20, xlab='', ylab='', main='meas. p.c.')
draw_robot(robot_ref, 'white')

First case study: simple room

For the first case, we consider a simple environment made of a single open room, scanned with a 360° field of view. Given just two configurations of the robot, we want to discuss how to estimate its motion.

environment1 <- list() # build the environment
poly_1 <- matrix(c(     
    -2,   -6,
    3,    -6,
    3,    -2,
    2,    -1,
    3,    1,
    3,    3,
    2,    4,
    -1,   5,
    -4,   4,
    -4,   1,
    -2,   -2,
    -3,   -4,
    -2,   -6
  ), ncol = 2, byrow = TRUE)
environment1$poly_list <- list(poly_1)

robot1 <- list()          # set main features of the robot model
robot1$angle_uncertainty  <- 0.1    
robot1$lidar_fov          <- 360    
robot1$lidar_N            <- 300    
robot1$lidar_sigma        <- 0.1    
robot2 <- robot1

Attitude estimation with no noise

As a first step we discuss the estimation of the attitude alone, in the absence of noise. For this purpose the position of the robot is fixed, while we consider a change in attitude of \(30^\circ\) clockwise, as shown here:

# set robot configuration
robot1$heading            <- 90     
robot1$pos                <- c(-1,-1) 

robot2$heading            <- 60
robot2$pos                <- c(-1, -1)

# computation of the point clouds
pc1 <- compute_pointcloud(robot1, environment1)
pc2 <- compute_pointcloud(robot2, environment1)

# drawing
par(mfrow=c(1,2))
draw_room(environment1)
draw_robot(robot1, 'Green')
draw_robot(robot2, 'Orange')

plot(pc1$x_true, pc1$y_true, col = 'Green', pch = 20, cex = 0.6, 
  asp = 1, main = 'laser scans', xlab = 'x', ylab = 'y', 
  xlim = c(-7,7), ylim = c(-6,6)
  )
points(pc2$x_true, pc2$y_true, asp = 1, col = 'Orange', pch = 20, cex = 0.6)
draw_robot(robot_ref, 'white')

As suggested by theory, to estimate the attitude variation we have to detrend the measurements by subtracting the mean value of each row:

x1_mean <- mean(pc1$x_true)
y1_mean <- mean(pc1$y_true)
x2_mean <- mean(pc2$x_true)
y2_mean <- mean(pc2$y_true)
map1    <- rbind(pc1$x_true - x1_mean, pc1$y_true - y1_mean)
map2    <- rbind(pc2$x_true - x2_mean, pc2$y_true - y2_mean)

The retrieved mean values are \(\mu_1 = \begin{pmatrix} 0.28& -0.61\end{pmatrix}\) and \(\mu_2 = \begin{pmatrix} 0.55& -0.39\end{pmatrix}\); their difference can mainly be attributed to the fact that the same scene is expressed in two differently rotated reference frames: indeed, since the position is unchanged, \(\mu_2 \approx R_1^2 \, \mu_1\).

At this point the rotation can be estimated by means of the singular value decomposition of the detrended measurements, in particular by looking at the values of the matrices \(V_i\):

svd1 <- svd(t(map1))
svd2 <- svd(t(map2))
V1   <- svd1$v
V2   <- svd2$v

that numerically evaluates to: \[ V_1 = \begin{bmatrix} 0.98& 0.17\\ 0.17& -0.98\end{bmatrix} \qquad V_2 = \begin{bmatrix} 0.77& 0.64\\ 0.64& -0.77\end{bmatrix} \]

Theory suggests that the estimated rotation matrix \(R_2^1\), which transforms the second reference frame into the initial one, is

R21       <- V1 %*% t(V2) 
R12       <- t(R21)
alpha_hat <- atan2(R12[2,1], R12[1,1]) * 180/pi

\[ R_2^1 = V_1V_2^T = \begin{bmatrix} 0.87& 0.50\\ -0.50& 0.87\end{bmatrix} \qquad R_1^2 = \big(R_2^1\big)^T = \begin{bmatrix} 0.87& -0.50\\ 0.50& 0.87\end{bmatrix}\] Knowing that the true rotation is \(30^\circ\), the estimated angle can be computed from the first column of the rotation matrix \(R_1^2\) as \[ \hat \alpha = \arctan \left( \frac{0.5}{0.8660254} \right) = 30^\circ \] The absence of noise makes everything easier; still, the result shows the power of the singular value decomposition.

Code for comparing the pointcloud

The procedure just described can be summarized in the following function that, given two pointclouds, computes the singular value decomposition of the detrended measurements and returns the rotation matrices as well as the angle estimates. Furthermore, it performs some checks to warn about degenerate cases in which the singular values are close together (within the same laserscan) or very different between the two pointclouds.

compare_pointclouds <- function(pc1, pc2){

  res       <- list()
  res$mu1   <- c(mean(pc1$x_meas), mean(pc1$y_meas) )
  res$mu2   <- c(mean(pc2$x_meas), mean(pc2$y_meas) )

  X1        <- rbind( pc1$x_meas - res$mu1[1],
                      pc1$y_meas - res$mu1[2] )
  X2        <- rbind( pc2$x_meas - res$mu2[1],
                      pc2$y_meas - res$mu2[2] )
  svd1      <- svd( t(X1) )
  svd2      <- svd( t(X2) )
  res$V1    <- svd1$v
  res$V2    <- svd2$v
  res$sv1   <- svd1$d 
  res$sv2   <- svd2$d 

  sigma     <- svd1$d 
  if(abs(abs(sigma[1]) - abs(sigma[2])) <= 1)
    print('Warning: the two singular values are close together!')  

  if(norm(res$sv1 - res$sv2, type='2') > 1) 
    print('Warning: singular values between laserscans are not close together')

  res$R12     <- res$V2 %*% t(res$V1)
  res$R21     <- res$V1 %*% t(res$V2)
  res$alpha21 <- deg(atan(res$R12[2,1] / res$R12[1,1]))
  res$alpha12 <- deg(atan(res$R21[2,1] / res$R21[1,1]))

  return(res)

}

Attitude estimation with noise

Let's stick with the same scenario, but now consider the more realistic case in which the measurements are affected by uncertainty. Computing the singular value decomposition gives \[ V_1 = \begin{bmatrix} 0.98& 0.17\\ 0.17& -0.98\end{bmatrix} \qquad V_2 = \begin{bmatrix} 0.77& 0.64\\ 0.64& -0.77\end{bmatrix} \] which leads to the rotations \[ R_2^1 = \begin{bmatrix} 0.87& 0.50\\ -0.50& 0.87\end{bmatrix} \qquad R_1^2 = \begin{bmatrix} 0.87& -0.50\\ 0.50& 0.87\end{bmatrix}\] In this case the estimated angle is \(30.0311107^\circ\): a remarkable result, as it is very close to the ground truth of \(30^\circ\).
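
The chunk that produced these numbers is not echoed; a minimal sketch of how they could be reproduced, assuming it simply applies compare_pointclouds to the noisy point clouds computed above, is the following:

# Possible reconstruction of the non-echoed chunk: pc1 and pc2 already contain
# the noisy fields x_meas / y_meas used by compare_pointclouds
res_noisy <- compare_pointclouds(pc1, pc2)
res_noisy$V1        # V matrix of the first scan
res_noisy$R21       # estimated rotation matrix
res_noisy$alpha21   # angle estimate in degrees
res_noisy$alpha12   # same estimate with the opposite sign convention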

Attitude estimation with noise and translation

Let us now see what happens when, on top of the noise and the rotation, we also add some displacement to the robot. Computing the singular value decomposition gives \[ V_1 = \begin{bmatrix} 0.98& 0.17\\ 0.17& -0.98\end{bmatrix} \qquad V_2 = \begin{bmatrix} 0.77& 0.64\\ 0.64& -0.77\end{bmatrix} \] which leads to the rotations \[ R_2^1 = \begin{bmatrix} 0.87& 0.49\\ -0.49& 0.87\end{bmatrix} \qquad R_1^2 = \begin{bmatrix} 0.87& -0.49\\ 0.49& 0.87\end{bmatrix}\] In this case the estimated angle is \(29.3683165^\circ\): still a remarkable result, as the estimate is not far from the ground truth of \(30^\circ\).
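
Again the corresponding chunk is hidden; a sketch under the assumption that only the position of the second robot is changed (the displacement below is made up for illustration) could be:

# Sketch of the translated case: the displacement c(1, 2) is an assumed value,
# the document does not report the one actually used
robot2_t     <- robot2
robot2_t$pos <- robot1$pos + c(1, 2)
pc2_t        <- compute_pointcloud(robot2_t, environment1)
res_t        <- compare_pointclouds(pc1, pc2_t)
res_t$alpha21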

Error comparison

Let us further improve our analysis by comparing the estimation outcome against the ground truth that we generated. The following function takes as inputs an environment env, a first robot rob1 that stands still inside the environment, and a second robot rob2 that is allowed to rotate, possibly at a different position. The heading of the second robot is obtained by adding each angle variation in the vector delta_seq to the heading of the first one. Once the pointclouds have been computed and compared for every heading of the second robot, we display the room with the two robot positions, the laserscan in the initial configuration, and the estimation error.

compare_error <- function(env, rob1, rob2, delta_seq = seq(-90, 90, length.out = 41)){

  alpha0    <- rob1$heading
  alpha12   <- rep(0, length(delta_seq))
  alpha21   <- rep(0, length(delta_seq))
  pc1       <- compute_pointcloud(rob1, env)
  
  for(i in 1:length(delta_seq)){
    rob2$heading  <- alpha0 + delta_seq[i]
    pc2           <- compute_pointcloud(rob2, env)
    pc_com        <- compare_pointclouds(pc1, pc2)
    alpha12[i]    <- pc_com$alpha12
    alpha21[i]    <- pc_com$alpha21
  }

  rob2$heading <- rob1$heading 

  par(mfrow=c(1,3))
  draw_room(env)
  draw_robot(rob1, 'Green')
  draw_robot(rob2, 'Orange')
  plot(pc1$x_true, pc1$y_true, col = 'Grey', pch = 4, cex = 0.6, 
    asp = 1, main = 'laser scans', xlab = 'x', ylab = 'y'
    )
  points(pc1$x_meas, pc1$y_meas, col = 'Green', pch = 20, cex = 0.6)
  draw_robot(robot_ref, 'white')

  plot(delta_seq, abs(alpha21 + delta_seq), col='Blue', pch=4, main='Estimation error', xlab='delta [°]', ylab='error [°]')
  points(delta_seq, abs(alpha12 - delta_seq), col='Red', pch=3)

}

Applied to our case

compare_error(environment1, robot1, robot2)

We see here that the estimation error follows a strange behaviour: depending on the attitude variation, the estimate can be good or very bad, and the outcome even depends on which rotation matrix we choose (the red points are the angles estimated with the \(R_1^2\) matrix, the blue points those obtained from \(R_2^1\)). This can mainly be related to numerical issues and instabilities of the algorithms in the SVD computation.

Still, if we assume that the angle variation is not too extreme, estimation results are quite good:

compare_error(environment1, robot1, robot2, seq(-10, 10, length.out = 31))

Second case study: more complex room

Let us now deal with a more complex scenario where, instead of a simple open room, the environment contains many obstacles. Let's first build the environment.

environment2 <- list()        
poly_1 <- matrix(c(            
    4,7,    4,8,    10,5,   10,1,   7,1,
    7,-1,   8,-1,   8,-3,   6,-3,   6,-6,
    -7,-6,  -7,-1,  -5,-1,
    -5,1,   -6,1,   -6,5,   -4,5,   -4,8,   -1,8,   -1,7,   4,7                      
  ), ncol = 2, byrow = TRUE)
poly_2 <- matrix(c(            
    -6,-5,  -6,-3,  -5,-2,  -5,-3,  -4,-4,  -5,-5,  -6,-5                 
  ), ncol = 2, byrow = TRUE)
poly_3 <- matrix(c(            
    -3.5,-3,  -3.5,1, -3,1, -3,-3,  -3.5,-3                 
  ), ncol = 2, byrow = TRUE)
poly_4 <- matrix(c(            
    -3,2,   -5,4,   -4,4,   -3,2                
  ), ncol = 2, byrow = TRUE)
poly_5 <- matrix(c(            
    -2,4,   -2,6,   4,6,    3,4,    2,5,    0,5,    -1,4,   -2,4                
  ), ncol = 2, byrow = TRUE)
poly_6 <- matrix(c(            
    5,4,    5,5,    7,5,    8,4,    6,3,    6,4,    5,4                 
  ), ncol = 2, byrow = TRUE)
poly_7 <- matrix(c(            
    4,2,    6,2,    6,-2,   4,2                
  ), ncol = 2, byrow = TRUE)
poly_8 <- matrix(c(            
    1,-1,   3,-1,   3,-2,   1,-2,   1,-1               
  ), ncol = 2, byrow = TRUE)
poly_9 <- matrix(c(            
    -2,-5,  -1,-4,  1,-4,   4,-3,   5,-4, 4,-5, -2,-5                
  ), ncol = 2, byrow = TRUE)
poly_10 <- matrix(c(            
    1,2,    1,2.5,  1.5,2.5,  1.5,2,  1,2                
  ), ncol = 2, byrow = TRUE)

environment2$poly_list <- list( poly_1, poly_2, poly_3, poly_4, poly_5,
                                poly_6, poly_7, poly_8, poly_9, poly_10 )

robot2$heading <- 85
pc1            <- compute_pointcloud(robot1, environment2)
pc2            <- compute_pointcloud(robot2, environment2)
pc_comp        <- compare_pointclouds(pc1, pc2)

par(mfrow=c(1,2))
draw_room(environment2)
draw_robot(robot1, 'Green')
draw_robot(robot2, 'Orange')

plot(pc1$x_true, pc1$y_true, col = 'Green', pch = 20, cex = 0.6, 
  asp = 1, main = 'laser scans', xlab = 'x', ylab = 'y', 
  xlim = c(-7,7), ylim = c(-6,6)
  )
points(pc2$x_true, pc2$y_true, asp = 1, col = 'Orange', pch = 20, cex = 0.6)
draw_robot(robot_ref, 'white')

In this case the second robot is rotated clockwise by \(5^\circ\) and the algorithm estimates a value of \(4.6816638^\circ\). We can analyse the more general behaviour of this algorithm; in this case we see that the error increases, mainly due to the disparity between the laserscans.
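
The error sweep mentioned above is not echoed in the document; assuming it simply reuses compare_error on the new environment, a minimal sketch is:

# Possible reconstruction of the non-echoed error sweep over environment2
compare_error(environment2, robot1, robot2)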

This issue is further amplified if we also add some translation to the robot:
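
The corresponding chunk is also not shown; a hedged sketch of what it might do, moving the second robot by an assumed displacement before running the sweep, is:

# Hypothetical reconstruction: the displacement c(2, 1) is an assumed value,
# not reported in the document; the warnings below are the console output of
# the singular-value checks inside compare_pointclouds
robot2_m     <- robot2
robot2_m$pos <- robot1$pos + c(2, 1)
compare_error(environment2, robot1, robot2_m)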

## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"

Note: the previous outputs come from the checks on the singular values performed while comparing the pointclouds, indicating that the algorithm may have some trouble in estimating the rotation, as shown in the graphs.

One way to cope with this issue is to increase the laserscan sampling frequency, so that consecutive scans are separated by a smaller translation that needs to be compensated for:
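
Also this chunk is not echoed; a sketch under the assumption that a higher scan rate is emulated by a smaller displacement between the two poses (the value below is made up) could be:

# Hypothetical reconstruction: consecutive scans separated by a smaller
# displacement; the value c(0.5, 0.3) is an assumption
robot2_s     <- robot2
robot2_s$pos <- robot1$pos + c(0.5, 0.3)
compare_error(environment2, robot1, robot2_s)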

## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"

Third case study: reduced field of view

Up to now we have considered a LIDAR system able to scan at \(360^\circ\), but in some situations this might not be possible. Reducing the field of view of the sensor of course negatively impacts the ability of the system to estimate the rotation. Let us consider the following example:

# Robot with reduced field of view
robot5            <- robot1 
robot5$pos        <- c(0, -3)
robot5$lidar_fov  <- 180
robot5$lidar_N    <- 181
robot6            <- robot5 
robot6$heading    <- 85

par(mfrow=c(1,2))
draw_room(environment1)
draw_robot(robot5, 'Green')
draw_robot(robot6, 'Orange')

pc1 <- compute_pointcloud(robot5, environment1)
pc2 <- compute_pointcloud(robot6, environment1)

plot(pc1$x_meas, pc1$y_meas, col = 'Green', pch = 20, cex = 0.6, 
  asp = 1, main = 'laser scans', xlab = 'x', ylab = 'y', 
  xlim = c(-7,7), ylim = c(-6,6)
  )
points(pc2$x_meas, pc2$y_meas, asp = 1, col = 'Orange', pch = 20, cex = 0.6)
draw_robot(robot_ref, 'white')

pc_comp   <- compare_pointclouds(pc1, pc2)
## [1] "Warning: singular values between laserscans are not close together"
alpha_hat <- pc_comp$alpha21

Even though the second robot is rotated clockwise by just \(5^\circ\), the algorithm provides a bad estimate of \(9.4872357^\circ\). We can further compare the estimation error for different angles, and we see how the singular value decomposition fails in this case:
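
The sweep itself is not echoed; assuming it reuses compare_error with the two reduced-field-of-view robots, a minimal sketch is:

# Possible reconstruction of the non-echoed error sweep with a reduced field of view
compare_error(environment1, robot5, robot6)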

## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"
## [1] "Warning: singular values between laserscans are not close together"

One way to overcome this limitation is to use algorithms that extract features from the environment: in this case we might be able to retrieve better information. In fact, changing the heading of the robot changes what it is able to see, and the resulting mismatch between the scans can be significant.