Equirectangular to Stereographic Projections (Little Planets) in MATLAB

The camera app included in Google’s Android mobile OS has a feature called “Photo Sphere” that allows you to take a series of photos and create a full spherical panorama. The Photo Sphere feature is included on Google Play Edition (GPE) phones, that is, phones that run Google’s stock, unmodified version of Android, including my Nexus 5. When you take a Photo Sphere, the camera seamlessly stitches the individual photos into an equirectangular panorama. For example, here is a panorama I took of the rapeseed fields in central Germany:

Generated using Android's Photo Sphere function

There is a bit of distortion (see Tissot’s indicatrix), especially at the top and bottom of the image, but this is inherent to projecting a sphere onto a plane. On the Nexus 5 (and other GPE phones), the Gallery application lets you either view the resulting Photo Spheres as spherical panoramas or create “Little Planets”/“Tiny Planets”, which are in fact stereographic projections of the spherical panorama. I found the effect really neat, so I wanted to see if I could recreate the projection in MATLAB.
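To give a rough idea of how such a projection can be computed, here is a minimal MATLAB sketch of the standard inverse-mapping approach (not necessarily the exact implementation described after the break): each output pixel is mapped back through an inverse stereographic projection to a longitude/latitude pair and then sampled from the equirectangular image. The file name, output size, and zoom factor are placeholder values.

```matlab
% A minimal sketch, assuming an equirectangular panorama saved as 'pano.jpg'
% (a hypothetical file name). Each output pixel is mapped through an inverse
% stereographic projection to longitude/latitude and sampled with interp2.
pano = im2double(imread('pano.jpg'));
[H, W, ~] = size(pano);

outSize = 1024;                                 % square output image
scale   = 1;                                    % controls the "zoom" of the little planet
[X, Y]  = meshgrid(linspace(-scale, scale, outSize));

rho = sqrt(X.^2 + Y.^2);                        % radius in the projection plane
c   = 2 * atan(rho);                            % angular distance from the projection centre
lat = c - pi/2;                                 % the nadir (the ground) sits at the centre of the planet
lon = atan2(Y, X);                              % angle around the centre

% longitude/latitude -> equirectangular pixel coordinates (seam handling ignored)
u = (lon + pi) / (2*pi) * (W - 1) + 1;
v = (pi/2 - lat) / pi * (H - 1) + 1;

planet = zeros(outSize, outSize, 3);
for ch = 1:3
    planet(:,:,ch) = interp2(pano(:,:,ch), u, v, 'linear', 0);
end
imshow(planet);
```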

As a teaser, here’s the output for my code:

MATLAB Little Planet of German Rapeseed Field

Click through to get more information on the MATLAB implementation.
Read More

GoPro SuperView-like adaptive aspect ratio

For Christmas, my parents got me a fantastic gift for photographers and outdoor enthusiasts: a GoPro Hero3+ Silver Edition digital camera (Amazon). If you are not familiar with GoPros, they are small action cameras that have a very wide-angle lens and come with a water-resistant case. It’s a capable little camera that packs a lot of punch for such a compact package.

There are a number of GoPro editions, but the newest ones are the Hero3+ Silver and Black. The Silver Edition is very similar to the GoPro Hero3+ Black Edition (Amazon) but omits a few features, including some very high-resolution video recording modes and a feature that GoPro calls “SuperView”. This post describes how I attempted to emulate the SuperView mode in MATLAB by putting together an adaptive aspect ratio function that changes an image’s aspect ratio while maintaining “safe regions” with minimal distortion.
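To give a sense of the kind of remapping involved, here is a minimal MATLAB sketch; the cubic remapping curve is my own assumption, not necessarily the one used in the implementation described after the break. The centre of the frame is resampled at roughly 1:1 scale, and the extra width is absorbed by progressively stretching the columns towards the edges.

```matlab
% A minimal sketch, assuming a 4:3 frame saved as 'frame.jpg' (a hypothetical
% file name). The cubic remapping curve below is my own choice of adaptive
% stretch, not necessarily the one used in the post.
img = im2double(imread('frame.jpg'));
[H, Win, ~] = size(img);
Wout = round(H * 16 / 9);                     % target width for a 16:9 output

xo = linspace(-1, 1, Wout);                   % normalised output column positions
a  = Wout / Win;                              % slope at the centre: keeps central columns at ~1:1 scale
xi = a .* xo + (1 - a) .* xo.^3;              % cubic term pushes the stretch to the edges; xi(±1) = ±1
                                              % (monotonic for modest aspect-ratio changes, a < 1.5)

cols   = (xi + 1) / 2 * (Win - 1) + 1;        % corresponding input columns
[U, V] = meshgrid(cols, 1:H);

out = zeros(H, Wout, 3);
for ch = 1:3
    out(:,:,ch) = interp2(img(:,:,ch), U, V, 'linear');
end
imshow(out);
```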

This function allowed me to resize 4:3 images and video, like this one:

To a wider aspect ratio, for example 16:9:

Click through for info on how I implemented the code.

Read More

Magic Lantern HDR video to tonemapped video with MATLAB scripts

I have a Canon T3i with a Canon EF 50mm f/1.4 lens that I use for the vast majority of my day-to-day photography. I’ve been using a custom firmware for the camera called Magic Lantern that provides some interesting (and useful!) functions. One of them is HDR video. Here’s a beautiful example of what can be done:

http://www.youtube.com/watch?v=bLxYTT_0GEI

I tried my hand at processing the HDR video output and was able to get a reasonably nice tone-mapped video:

After the break, you’ll see how I processed the initial Magic Lantern video using MATLAB and exiftool and tone-mapped the output using Luminance HDR.
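As one example of what the MATLAB side of such a pipeline can look like, here is a minimal sketch that de-interleaves the alternating-exposure frames from a Magic Lantern HDR clip. The file name, and the assumption that odd-numbered frames are the darker (low-ISO) exposures, are hypothetical; the actual processing described after the break may differ.

```matlab
% A minimal sketch of one step of a possible pipeline: split a Magic Lantern
% HDR clip into its alternating dark/bright exposures. 'MVI_0001.MOV' is a
% hypothetical file name, and which parity is the darker frame is assumed.
vid = VideoReader('MVI_0001.MOV');
k = 0;
while hasFrame(vid)
    k = k + 1;
    frame = readFrame(vid);
    if mod(k, 2) == 1
        imwrite(frame, sprintf('dark_%04d.tif',   ceil(k/2)));   % low-ISO (darker) frame
    else
        imwrite(frame, sprintf('bright_%04d.tif', k/2));         % high-ISO (brighter) frame
    end
end
% Each dark/bright pair can then be merged into an HDR image and tone-mapped
% (for example, in Luminance HDR).
```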

Read More

Effect of Environment Map Blur on Perceived Surface Properties

Here are a couple of quick demos illustrating how blurring a cubic environment map can change the perceived roughness of the surfaces of rendered 3D objects.



I created a series of HDR cube maps using NVIDIA’s CubeMapGen (currently hosted on Google Code). Starting with the Debevec light probes, I applied Gaussian blurs of increasing angular size (10°, 20°, 30°, 40°, and 50°), giving six cube maps per probe: the unblurred original plus one for each blur level. In the videos, the cube maps have increasing blur from left to right, top to bottom. Note that I did not tone-map or account for changes in overall exposure, so the specular reflections can appear blown out, especially at the higher blur levels. After the break, you can see the effect using different light probes (and different shapes).
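For readers without CubeMapGen, a per-face Gaussian blur in MATLAB gives a rough approximation of the idea; it ignores the seam handling and angular filtering that CubeMapGen performs properly, and the face file names and pixel sigmas below are placeholders standing in for the angular kernel sizes above.

```matlab
% A rough per-face approximation: blur each cube face with a 2-D Gaussian.
% Face file names and pixel sigmas are placeholders; CubeMapGen's proper
% seam-aware, angular filtering is not reproduced here.
faces  = {'posx', 'negx', 'posy', 'negy', 'posz', 'negz'};
sigmas = [4 8 16 32 64];                       % blur widths in pixels (stand-ins for 10°–50°)

for s = sigmas
    h = fspecial('gaussian', 6*s + 1, s);      % Gaussian kernel covering roughly ±3 sigma
    for f = 1:numel(faces)
        face    = hdrread(sprintf('%s.hdr', faces{f}));
        blurred = imfilter(face, h, 'replicate');
        hdrwrite(blurred, sprintf('%s_sigma%02d.hdr', faces{f}, s));
    end
end
```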

Read More

Misperceived axis of rotation for objects with specular reflections

Katja Dörschner visited JLU last week and talked about her work investigating structure from motion with specular reflections and textures (for more information, see her recent paper: Doerschner, Fleming, Yilmaz, Schrater, Hartung, & Kersten, 2011). She showed an interesting situation in which the axis of rotation of a 3D teapot was misperceived because of the motion of the specular reflections across its surface (see Yilmaz, Kucukoglu, Fleming, & Doerschner, 2011), an effect first demonstrated by Hartung and Kersten (2002).

Using the OpenGL/Psychtoolbox framework I have previously described, I replicated this interesting effect. In the following movie, a 3D sphere with sinusoidal perturbations rotates. Note the perceived axis of rotation when the object has specular reflections (first half of the movie) versus when the environment map is “painted” onto the surface as a texture (second half of the movie).



The physical motion of the object is the same in both cases: it rotates around the vertical axis. When the object has only specular reflections, it appears to rotate around an oblique 45° axis, but when textured it appears to rotate around the vertical axis, as it physically does. After the break, I show similar effects when the object is rotated around the horizontal axis or a 45° axis, and when the spatial frequency of the perturbation is manipulated.
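For reference, the stimulus geometry is straightforward to reproduce: a unit sphere whose radius is modulated sinusoidally in azimuth and elevation. The amplitude and spatial frequency below are placeholder values, not the ones used in the actual stimuli.

```matlab
% A minimal sketch of the stimulus geometry: a unit sphere whose radius is
% modulated sinusoidally. Amplitude and frequency are placeholder values.
n    = 200;                                        % mesh resolution
freq = 6;                                          % cycles of perturbation around the sphere
amp  = 0.05;                                       % perturbation amplitude

[az, el] = meshgrid(linspace(0, 2*pi, n), linspace(-pi/2, pi/2, n));
r = 1 + amp * sin(freq * az) .* cos(freq * el);    % perturbed radius

[x, y, z] = sph2cart(az, el, r);
surf(x, y, z, 'EdgeColor', 'none');
axis equal vis3d; camlight; lighting gouraud;
```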

Read More