What if the images are at an angle to each other?

Until now, we have been looking at images that were on the same plane. Stitching those images was straightforward, and we didn't have to deal with any artifacts. In real life, however, you cannot capture multiple images on exactly the same plane. When you are capturing multiple images of the same scene, you are bound to tilt your camera and change the plane. So the question is, will our algorithm work in that scenario? As it turns out, it can handle those cases as well.

Let's consider the following image:

[Figure: the first image of the scene]

Now, let's consider another image of the same scene. It's at an angle with respect to the first image, and it's partially overlapping as well:

[Figure: a second image of the same scene, at an angle to the first and partially overlapping it]

Let's consider the first image as our reference. If we stitch these images using our algorithm, it will look something like this:

[Figure: the stitched result, with the first image as the reference]

If we keep the second image as our reference, it will look something like this:

[Figure: the stitched result, with the second image as the reference]

Why does it look stretched?

If you observe the output image, the portion corresponding to the query image looks stretched. This is because the query image is transformed and adjusted to fit into our frame of reference. The stretching comes from the following lines in our code:

# Estimate the homography between the matched keypoints using RANSAC
M, mask = cv2.findHomography(src_pts, dst_pts, cv2.RANSAC, 5.0)

# Warp the query image and stitch it onto the reference image
result = warpImages(img2, img1, M)

Since the images are at an angle with respect to each other, the query image must undergo a perspective transformation in order to fit into the frame of reference. So, we transform the query image first, and then stitch it onto our main image to form the panoramic image.
