
Calibration Tool for Line Structured Light Vision Sensors / Line Laser Depth Sensors

Line structured light vision systems have a simple structure, are flexible to use, and are relatively insensitive to ambient lighting, so they are widely used in practice. Calibration is an unavoidable step when applying this technique. Calibrating a line structured light system can roughly be divided into two parts: camera calibration and calibration of the line structured light itself. Camera calibration is by now a mature technology; in particular, Zhang Zhengyou's planar calibration method is widely used and well recognized. For calibrating the line structured light, several methods are also in practical use.

In the course of my studies and work I have researched and implemented the calibration of line structured light vision systems, and have completed a line structured light calibration program based on MATLAB 2015a. The program needs only a single checkerboard calibration board to calibrate the whole line structured light vision system. The concrete usage of the calibration software is described below.

First select the folder containing the calibration images, then click the calibration button. The software automatically performs camera calibration and line structured light calibration, and saves the results to txt files.
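The post does not detail how the light-plane calibration step works internally. A minimal sketch of one common approach is to fit a plane by least squares to 3D laser points gathered from several checkerboard poses; all variable names below are illustrative, not the toolkit's actual code:

```matlab
% Sketch (assumption): fit a plane a*X + b*Y + c*Z = d to laser points
% expressed in camera coordinates (millimeters), collected from the
% stripe images at several checkerboard poses.
% Synthetic example points lying near the plane Z = 100:
pts = [0 0 100; 10 0 99; 0 10 101; 10 10 100; 5 5 100];
c0 = mean(pts, 1);                              % centroid of the points
% Total least squares via SVD; bsxfun keeps this R2015a-compatible.
[~, ~, V] = svd(bsxfun(@minus, pts, c0), 0);
n = V(:, 3);                                    % unit normal [a; b; c]
d = n' * c0';                                   % plane offset
```

The smallest right singular vector of the centered point cloud is the direction of least variance, i.e. the plane normal; this is the standard total-least-squares plane fit.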

Camera parameters

[456.835948    0           319.163600
 0             455.503272  224.297494
 0             0           1]

Distortion coefficients

[-0.428655  0.240668  0.000222  -0.000840  -0.074862]

Mean reprojection error

0.148146 pixel

Line structured light parameters

[0.996652  0.009722  0.081176  135.203206]
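The four line structured light parameters describe the laser light plane in the camera frame. As a minimal sketch of how they can be used, assuming the convention a·X + b·Y + c·Z = d with d in millimeters (the toolkit's exact convention is not documented in this post, so treat this as an assumption), a 3D point is recovered from an undistorted pixel by intersecting its viewing ray with the plane:

```matlab
% Assumed convention: light plane a*X + b*Y + c*Z = d in camera frame.
K = [456.835948 0 319.163600; 0 455.503272 224.297494; 0 0 1];
plane = [0.996652 0.009722 0.081176 135.203206];   % [a b c d]
uv = [400; 240];                     % example pixel on the laser stripe
ray = K \ [uv; 1];                   % viewing ray; third component is 1
Z = plane(4) / (plane(1:3) * ray);   % scale at which the ray meets the plane
P = Z * ray;                         % [X; Y; Z] in camera coordinates (mm)
```

Because the third component of `ray` is 1, the scale factor equals the depth Z directly.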

Source code (.p file) download address:

http://download.csdn.net/detail/j10527/9694703

Partial code:

1. Camera calibration button code

% --- Executes on button press in camera_calibration.
function camera_calibration_Callback(hObject, eventdata, handles)
% hObject    handle to camera_calibration (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

disp('Camera calibration begin...');
images = imageSet(get(handles.dir_cam, 'String'));
imageFileNames = images.ImageLocation;
% Convert to grayscale and enhance contrast. Note: this overwrites the
% original calibration images in place.
for k = 1:length(imageFileNames)
    im = imread(imageFileNames{k});
    if size(im,3) == 3
        im = rgb2gray(im);
    end
    im = imadjust(im);
    imwrite(im, imageFileNames{k});
end
% Detect calibration pattern.
[imagePoints, boardSize] = detectCheckerboardPoints(imageFileNames);

% Generate world coordinates of the corners of the squares.
squareSize = str2double(get(handles.square_size, 'String')); % millimeters
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

% Calibrate the camera.
[params, ~, ~] = estimateCameraParameters(imagePoints, worldPoints, ...
    'EstimateTangentialDistortion', true, 'NumRadialDistortionCoefficients', 3);

if get(handles.show, 'value')
    figure;subplot(2,1,1);
    showReprojectionErrors(params);
end
% Per-image mean reprojection error.
err = zeros(1, size(imagePoints,3));
for k = 1:size(imagePoints,3)
    err(k) = norm(mean(abs(params.ReprojectionErrors(:,:,k)),1));
end

% Discard the worst 30% of images, then recalibrate with the rest.
for k = 1:round(size(imagePoints,3)*0.3)
    [~,ind] = max(err);
    err(ind) = [];
    imagePoints(:,:,ind) = [];
end

% [imagePoints, boardSize] = detectCheckerboardPoints(imageFileNames);

% Calibrate the camera.
[params, ~, ~] = estimateCameraParameters(imagePoints, worldPoints, ...
    'EstimateTangentialDistortion', true, 'NumRadialDistortionCoefficients', 3);
if get(handles.show, 'value')
    subplot(2,1,2);
    showReprojectionErrors(params);
    figure;
    showExtrinsics(params, 'CameraCentric');
end
dir_str = get(handles.dir_str, 'String');
images = imageSet(dir_str);
% Create (or clear) the output folder for the undistorted stripe images.
if ~exist('undistorted_stripe', 'dir')
    mkdir('undistorted_stripe');
end
delete undistorted_stripe/*
for k = 1:images.Count
    I = imread(images.ImageLocation{k});
    [J,newOrigin] = undistortImage(I,params,'OutputView', 'same');
%     if get(handles.show, 'value')
%     figure;
%     imshow(J);
%     end
    imwrite(J,['undistorted_stripe/im_stripe' num2str(k) '.jpg']);
end


fd = fopen('cam_paras.txt', 'w+');

% MATLAB stores IntrinsicMatrix transposed, so column-major linear
% indices 1, 3, 5, 6 pick out fx, cx, fy, cy respectively.
fprintf(fd, '%f %f %f %f\n', params.IntrinsicMatrix(1), params.IntrinsicMatrix(3), ...
    params.IntrinsicMatrix(5), params.IntrinsicMatrix(6));
fprintf(fd, '%f %f %f %f %f\n', params.RadialDistortion(1), params.RadialDistortion(2),...
    params.TangentialDistortion(1), params.TangentialDistortion(2), params.RadialDistortion(3));
fprintf(fd, '%f\n', params.MeanReprojectionError(1));
fclose(fd);
disp('Camera calibration done.');
disp('Camera calibration results were saved in cam_paras.txt.');


The stripe_paras.txt file is saved in an analogous format.
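The saving code itself does not appear in the post. A hypothetical sketch, assuming the four light-plane parameters printed above are held in a 1x4 vector `plane` and written in the same style as cam_paras.txt:

```matlab
% Hypothetical sketch: persist the light-plane parameters to a text file.
% 'plane' is assumed to be the 1x4 vector shown in the results above.
plane = [0.996652 0.009722 0.081176 135.203206];
fd = fopen('stripe_paras.txt', 'w+');
fprintf(fd, '%f %f %f %f\n', plane(1), plane(2), plane(3), plane(4));
fclose(fd);
```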


A newer executable version based on Qt5 and OpenCV 3 is available for download at:

https://github.com/jah10527/laserLineToolkit



Only the Z coordinate is displayed above; X and Y are not shown on the interface.