I am trying to calibrate my LiDAR with my camera. I initially tried the MATLAB LiDAR-Camera Calibrator app, but when I projected the LiDAR points back onto the image, they did not line up.
Assuming the initial calibration was incorrect, I manually selected corresponding calibration points in both the image and the 3D point cloud, as shown in the images, and ran:
bool calibration = cv::solvePnP(ptCldObjectPtsPOI,  // 3D LiDAR points
                                imgObjectPtsPOI,    // corresponding 2D image points
                                K,                  // camera intrinsic matrix
                                d,                  // distortion coefficients
                                R,                  // output rotation vector (Rodrigues form, not a 3x3 matrix)
                                T,                  // output translation vector
                                true,               // useExtrinsicGuess (only honored by SOLVEPNP_ITERATIVE)
                                cv::SOLVEPNP_EPNP);
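One thing worth double-checking here: cv::solvePnP writes the rotation out as a Rodrigues rotation *vector*, not a 3×3 matrix, so it has to be converted (e.g. with cv::Rodrigues) before being placed into a 4×4 transform. A NumPy-only sketch of that conversion, equivalent to what cv::Rodrigues computes; `rvec_to_matrix` and `build_transform` are illustrative helper names, not part of my actual code:

```python
import numpy as np

def rvec_to_matrix(rvec):
    """Rodrigues formula: rotation vector -> 3x3 rotation matrix
    (the same conversion cv::Rodrigues performs)."""
    rvec = np.asarray(rvec, dtype=float).reshape(3)
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    # skew-symmetric cross-product matrix of the unit axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def build_transform(rvec, tvec):
    """Assemble a 4x4 homogeneous transform from solvePnP's rvec/tvec outputs."""
    T = np.eye(4)
    T[:3, :3] = rvec_to_matrix(rvec)
    T[:3, 3] = np.asarray(tvec, dtype=float).reshape(3)
    return T
```

If R is used directly as if it were a matrix, the projected points will land in the wrong place even when the PnP solution itself is good.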
I am able to get a transformation, but when I use it to project the LiDAR points back onto the image, the points don't overlap correctly.
I use the following code to apply the transform to the 3D points:
def applyTransformation(points, transformation):
    # transformation: 4x4 homogeneous matrix; points: 3xN
    assert transformation.shape[0] == 4
    assert transformation.shape[1] == 4
    assert points.shape[0] == 3
    # rotate, then add the translation component to every point
    points = np.dot(transformation[:3, :3], points)
    for i in range(3):
        points[i, :] = points[i, :] + transformation[i, 3]
    return points
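As a quick sanity check on a transform helper like the one above (a minimal sketch assuming NumPy; `apply_transformation` here is a vectorized equivalent written for the test, not my actual function): applying a pure translation should shift every column by the same offset.

```python
import numpy as np

def apply_transformation(points, transformation):
    """Vectorized equivalent of applyTransformation:
    rotate the 3xN points, then broadcast-add the translation column."""
    assert transformation.shape == (4, 4)
    assert points.shape[0] == 3
    return transformation[:3, :3] @ points + transformation[:3, 3:4]

# Pure translation by (1, 2, 3): every column of the output should equal it.
T = np.eye(4)
T[:3, 3] = [1.0, 2.0, 3.0]
pts = np.zeros((3, 5))
out = apply_transformation(pts, T)
```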
and use the following code to project the 3D points into the image frame:
def projectPointsToCamFrame(points, K):
    # K: intrinsic matrix (up to 4x4); points: 3xN in the camera frame
    assert K.shape[0] <= 4
    assert K.shape[1] <= 4
    assert points.shape[0] == 3
    # embed K in a 4x4 projection matrix
    P = np.eye(4)
    P[0:K.shape[0], 0:K.shape[1]] = K
    # concatenate a row of 1s to make the points homogeneous (4xN)
    points = np.concatenate((points, np.ones((1, points.shape[1]))))
    points = np.dot(P, points)
    points = points[:3, :]
    # normalize by depth (the third row) to get pixel coordinates
    points = points / points[2:3, :]
    return points
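For what it's worth, with a 3×3 K the projection above reduces to p = K·X followed by division by the third row. A minimal standalone sketch (the intrinsics below are made-up example values, not my calibration): a point on the optical axis should project exactly to the principal point.

```python
import numpy as np

def project_points(points, K):
    """Minimal pinhole projection, assuming K is 3x3 and points are
    3xN already expressed in the camera frame."""
    assert points.shape[0] == 3
    p = K @ points
    return p / p[2:3, :]  # divide by depth -> (u, v, 1) per column

# Example intrinsics: focal length 800 px, principal point (640, 360).
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
X = np.array([[0.0], [0.0], [5.0]])  # a point on the optical axis, 5 m out
uv = project_points(X, K)  # expected to land at the principal point
```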
I'm not sure where I am going wrong. Any help is appreciated.


