{"id":335108,"date":"2016-12-12T11:52:40","date_gmt":"2016-12-12T19:52:40","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&#038;p=335108"},"modified":"2018-07-26T08:21:45","modified_gmt":"2018-07-26T15:21:45","slug":"a-flexible-new-technique-for-camera-calibration-2","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/a-flexible-new-technique-for-camera-calibration-2\/","title":{"rendered":"A Flexible New Technique for Camera Calibration"},"content":{"rendered":"<p>We propose a flexible new technique to easily calibrate a camera. It is well suited for use without specialized knowledge of 3D geometry or computer vision. The technique only requires the camera to observe a planar pattern shown at a few (at least two) different orientations. Either the camera or the planar pattern can be freely moved. The motion need not be known. Radial lens distortion is modeled. The proposed procedure consists of a closed-form solution, followed by a nonlinear refinement based on the maximum likelihood criterion. Both computer simulation and real data have been used to test the proposed technique, and very good results have been obtained. Compared with classical techniques, which use expensive equipment such as two or three orthogonal planes, the proposed technique is easy to use and flexible. 
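For points on the model plane (taken as Z=0), the pinhole projection collapses to a 3x3 homography H = A[r1 r2 t], which is the relation the closed-form step exploits. The sketch below illustrates this projection model in plain Python; the intrinsics and pose are hypothetical, not values from the paper:

```python
# Sketch of the planar projection model underlying the calibration.
# All numeric values here are illustrative assumptions.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Intrinsic matrix: focal lengths a, b, skew c, principal point (u0, v0).
a, b, c, u0, v0 = 800.0, 800.0, 0.0, 320.0, 240.0
A = [[a, c, u0],
     [0, b, v0],
     [0, 0,  1]]

# Extrinsics (assumed): identity rotation, camera 10 units from the plane.
r1, r2 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
t = [0.0, 0.0, 10.0]

# For model-plane points (X, Y, 0), projection is the homography
# H = A [r1 r2 t] applied to the homogeneous point (X, Y, 1).
Rt = [[r1[i], r2[i], t[i]] for i in range(3)]
H = matmul(A, Rt)

def project(X, Y):
    """Project a model-plane point to pixel coordinates via H."""
    u = H[0][0] * X + H[0][1] * Y + H[0][2]
    v = H[1][0] * X + H[1][1] * Y + H[1][2]
    w = H[2][0] * X + H[2][1] * Y + H[2][2]
    return u / w, v / w

print(project(0.0, 0.0))  # plane origin lands on the principal point
```

With the fronto-parallel pose assumed here, the plane origin projects exactly to (u0, v0); observing H from two or more orientations is what makes the intrinsics recoverable in closed form.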
It advances 3D computer vision one step from laboratory environments to real world use.<\/p>\n\t<div data-wp-context='{\"items\":[]}' data-wp-interactive=\"msr\/accordion\">\n\t\t\t\t\t<div class=\"clearfix\">\n\t\t\t\t<div\n\t\t\t\t\tclass=\"btn-group align-items-center mb-g float-sm-right\"\n\t\t\t\t\tdata-bi-aN=\"accordion-collapse-controls\"\n\t\t\t\t>\n\t\t\t\t\t<button\n\t\t\t\t\t\tclass=\"btn btn-link m-0\"\n\t\t\t\t\t\tdata-bi-cN=\"Expand all\"\n\t\t\t\t\t\tdata-wp-bind--aria-controls=\"state.ariaControls\"\n\t\t\t\t\t\tdata-wp-bind--aria-expanded=\"state.ariaExpanded\"\n\t\t\t\t\t\tdata-wp-bind--disabled=\"state.isAllExpanded\"\n\t\t\t\t\t\tdata-wp-class--inactive=\"state.isAllExpanded\"\n\t\t\t\t\t\tdata-wp-on--click=\"actions.onExpandAll\"\n\t\t\t\t\t\ttype=\"button\"\n\t\t\t\t\t>\n\t\t\t\t\t\tExpand all\t\t\t\t\t<\/button>\n\t\t\t\t\t<span aria-hidden=\"true\"> | <\/span>\n\t\t\t\t\t<button\n\t\t\t\t\t\tclass=\"btn btn-link m-0\"\n\t\t\t\t\t\tdata-bi-cN=\"Collapse all\"\n\t\t\t\t\t\tdata-wp-bind--aria-controls=\"state.ariaControls\"\n\t\t\t\t\t\tdata-wp-bind--aria-expanded=\"state.ariaExpanded\"\n\t\t\t\t\t\tdata-wp-bind--disabled=\"state.isAllCollapsed\"\n\t\t\t\t\t\tdata-wp-class--inactive=\"state.isAllCollapsed\"\n\t\t\t\t\t\tdata-wp-on--click=\"actions.onCollapseAll\"\n\t\t\t\t\t\ttype=\"button\"\n\t\t\t\t\t>\n\t\t\t\t\t\tCollapse all\t\t\t\t\t<\/button>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t\t\t<ul class=\"msr-accordion\">\n\t\t\t\t\t\t\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-2\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-2\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-1\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tExperimental data and result for camera 
calibration\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-1\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-2\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p>The model plane contains a pattern of 8&#215;8 squares, so there are 256 corners. The size of the pattern is 17 cm &#215; 17 cm. The 2D coordinates (in inches) of these points are available here. (We assume the plane is at Z=0.)<\/p>\n<p>We took five images of the model plane with an off-the-shelf PULNiX CCD camera with a 6 mm lens. The image resolution is 640&#215;480. The five images are available here: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/image1.gif\">Image 1<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/image2.gif\">Image 2<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/image3.gif\">Image 3<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/image4.gif\">Image 4<\/a> and <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/image5.gif\">Image 5<\/a>. The first two are shown below. We can observe a significant lens distortion in the images.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-335183 alignnone\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/image1-300x225.gif\" alt=\"image1\" width=\"300\" height=\"225\" \/>\u00a0\u00a0\u00a0\u00a0 <img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-335186 alignnone\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/image2-300x225.gif\" alt=\"image2\" width=\"300\" height=\"225\" \/><\/p>\n<p>The corners were detected as the intersection of straight lines fitted to each square. 
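Extracting a corner as the intersection of two fitted edge lines amounts to solving a 2x2 linear system. A minimal sketch, with hypothetical line coefficients standing in for lines fitted to square edges:

```python
# Corner = intersection of two fitted edge lines a*x + b*y = c.
# The coefficients below are illustrative, not fitted from real images.

def intersect(l1, l2):
    """Intersection of two lines given as (a, b, c) triples for a*x + b*y = c."""
    (a1, b1, c1), (a2, b2, c2) = l1, l2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("lines are (nearly) parallel")
    # Cramer's rule for the 2x2 system.
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Two hypothetical square edges meeting at a corner.
horizontal = (0.0, 1.0, 120.0)   # the line y = 120
vertical   = (1.0, 0.0, 200.0)   # the line x = 200
print(intersect(horizontal, vertical))  # -> (200.0, 120.0)
```

Fitting each line to many edge pixels before intersecting averages out localization noise, which is why line intersection tends to beat detecting the corner pixel directly.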
The image coordinates are available here: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/imagepointsone.txt\">image points 1<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/imagepointstwo.txt\">image points 2<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/imagepointsthree.txt\">image points 3<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/imagepointsfour.txt\">image points 4<\/a> and <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/imagepointsfive.txt\">image points 5<\/a>. Below we show the first two images with detected points indicated as +.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-335156 alignnone\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/marked1-3-300x225.gif\" alt=\"marked1-3\" width=\"300\" height=\"225\" \/>\u00a0\u00a0\u00a0\u00a0 <img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-335159 alignnone\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/marked2-4-300x225.gif\" alt=\"marked2-4\" width=\"300\" height=\"225\" \/><\/p>\n<p>(other images with marked points are available here: <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/marked3.gif\">marked 3<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/marked4.gif\">marked 4<\/a>, and <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/marked5.gif\">marked 5<\/a>.)<\/p>\n<p>And here is what the calibration tells us about the camera: The pixel is square (aspect ratio = 1); the focal length = 832.5 pixels; the image center is at (303.959, 206.585); there is a significant radial distortion: k1 = -0.228601, k2 = 0.190353. 
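The reported k1 and k2 belong to a second-order radial distortion model: an ideal pixel is displaced radially by a factor of k1*r^2 + k2*r^4, where r is the radial distance in normalized image coordinates. A small sketch using the numbers above (the distortion-corrected images shown later invert this mapping numerically):

```python
# Radial distortion model with the calibration results reported above.
fx = fy = 832.5              # square pixels (aspect ratio = 1)
u0, v0 = 303.959, 206.585    # image center
k1, k2 = -0.228601, 0.190353 # radial distortion coefficients

def distort(u, v):
    """Map an ideal (distortion-free) pixel to its observed, distorted position."""
    # Normalized image coordinates of the ideal point.
    x, y = (u - u0) / fx, (v - v0) / fy
    r2 = x * x + y * y
    factor = k1 * r2 + k2 * r2 * r2
    # Displacement is radial about the image center.
    return u + (u - u0) * factor, v + (v - v0) * factor

print(distort(u0, v0))       # the center is unaffected
print(distort(600.0, 400.0)) # points near the border shift noticeably
```

With the negative k1 dominating at moderate radii, points are pulled toward the center, which matches the barrel distortion visible in the images above.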
The complete calibration result is available <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/completecalibration.txt\">here<\/a>. (The format of the calibration file is: a, c, b, u0, v0, k1, k2, then the rotation matrix and translation vector for the first image, the rotation matrix and translation vector for the second image, etc.)<\/p>\n<p>The estimated distortion parameters allow us to correct the distortion in the original images. Below, we show the first two such distortion-corrected images, which should be compared with the first two images shown above. We see clearly that the curved pattern in the original images is straightened.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-335177 alignnone\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/UndistortIm1-5-300x225.gif\" alt=\"undistortim1-5\" width=\"300\" height=\"225\" \/>\u00a0\u00a0\u00a0\u00a0 <img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-335180 alignnone\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/UndistortIm2-6-300x225.gif\" alt=\"undistortim2-6\" width=\"300\" height=\"225\" \/><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-4\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-4\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-3\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tApplication to image-based modeling\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-3\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-4\"\n\t\t>\n\t\t\t<div 
class=\"msr-accordion__body\">\n\t\t\t\t<p>Two images of a tea tin (see below) were taken by the same camera as used above for calibration.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-335171 alignnone\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/TeaBox1-7-300x225.gif\" alt=\"teabox1-7\" width=\"300\" height=\"225\" \/>\u00a0\u00a0\u00a0\u00a0 <img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-335174 alignnone\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/TeaBox2-8-300x225.gif\" alt=\"teabox2-8\" width=\"300\" height=\"225\" \/><\/p>\n<p>Mainly two sides are visible. After correcting the radial distortion (<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/teatin1.gif\">tea tin 1<\/a>, <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/teatin2.gif\">tea tin 2<\/a>), we manually picked 8 point matches on each side. The matches are <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/image1text.txt\">here for Image 1<\/a> and <a href=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2016\/12\/image2text.txt\">here for Image 2<\/a>.<\/p>\n<p>The <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"http:\/\/www-sop.inria.fr\/robotvis\/personnel\/zzhang\/software-SFM.html\">structure-from-motion software we developed earlier<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> was run on these 16 point matches to build a partial model of the tea tin. The reconstructed model is in VRML. The reconstructed points on each side are indeed coplanar, and we computed the angle between the two reconstructed planes, which is 94.7 degrees. 
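As an illustration of this orthogonality check, the angle between two planes can be computed from their unit normals. The sketch below uses synthetic, exactly coplanar points standing in for the reconstructed sides (not the actual reconstruction):

```python
import math

def normal(p, q, r):
    """Unit normal of the plane through three points, each an (x, y, z) tuple."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    # Cross product u x v gives a vector perpendicular to the plane.
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = math.sqrt(sum(c * c for c in n))
    return [c / length for c in n]

def plane_angle_deg(pts_a, pts_b):
    """Angle in degrees between two planes, each given by three points."""
    na, nb = normal(*pts_a), normal(*pts_b)
    # Take |cos| so normal orientation does not matter.
    cos = abs(sum(na[i] * nb[i] for i in range(3)))
    return math.degrees(math.acos(min(1.0, cos)))

# Two synthetic, exactly orthogonal box sides (illustrative data).
side1 = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]   # the plane z = 0
side2 = [(0, 0, 0), (0, 1, 0), (0, 0, 1)]   # the plane x = 0
print(plane_angle_deg(side1, side2))        # -> 90.0
```

With noisy reconstructed points one would fit each plane to all eight points (e.g. by least squares) rather than pick three, which is presumably how a value like 94.7 degrees arises instead of exactly 90.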
Although we do not have the ground truth, the two sides of the tea tin are indeed almost orthogonal to each other.<\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-6\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-6\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-5\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tTranslations\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-5\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-6\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<ul>\n<li><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"http:\/\/softdroid.net\/novye-tehniki-kalibrovki-kamery\">Russian translation<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/li>\n<li><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"http:\/\/13motors.com\/myblog\/calib\/\">Kazakh translation<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/li>\n<li><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"http:\/\/animalia-life.com\/articles\/calib.html\">Polish translation<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/li>\n<li><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"http:\/\/elelur.com\/calibration\/\">Slovak translation<span class=\"sr-only\"> (opens in 
new tab)<\/span><\/a><\/li>\n<li><a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" rel=\"noopener noreferrer\" target=\"_blank\" href=\"https:\/\/www.chameleonjohn.com\/translations\/Calib-Indonesian\">Indonesian translation<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><\/li>\n<\/ul>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t\t\t\t\t<\/ul>\n\t<\/div>\n\t\n","protected":false},"excerpt":{"rendered":"<p>We propose a flexible new technique to easily calibrate a camera. It is well suited for use without specialized knowledge of 3D geometry or computer vision. The technique only requires the camera to observe a planar pattern shown at a few (at least two) different orientations. Either the camera or the planar pattern can be [&hellip;]<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[13562],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-335108","msr-project","type-msr-project","status-publish","hentry","msr-research-area-computer-vision","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"1999-12-02","related-publications":[145334,320642,326138],"related-downloads":[],"related-videos":[],"related-groups":[],"related-events":[],"related-opportunities":[],"related-posts":[],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[],"msr_research_lab":[199565],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/335108","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":2,"hre
f":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/335108\/revisions"}],"predecessor-version":[{"id":497486,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/335108\/revisions\/497486"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=335108"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=335108"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=335108"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=335108"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=335108"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}