I am now testing some new images to further my experiments with camera mapping and the UV Project modifier in Blender. The first two images are what I am testing with; the other attached images are screenshots from Blender showing the camera calibration, the modeling, the camera projections, each model’s matte and alpha channels, and a test animation render.
I am getting closer to a final decision on what I need to do for my “Night Cap: Crimes of Passion.” My only bottleneck is that I need to find a way to collect survey data for the background photos I take at the locations where I can’t bring a crew. That information will have to be placed in Blender, and then I will film my scenes with a real live camera. Now, this camera mapping and UV Project workflow that I am displaying here is perfect for taking a still image and animating it as if you actually had a video camera moving through the scene. Compositing live action with it, and matching the lens and focal lengths, is another story. Before going in and taking pictures, I would have to do an animatic to see what focal length and orientation will work best for the shot. One way I see is to film the locations and the green screens with the same HDDSLR, the same lens (including its barrel distortion), and the same camera lens height. In other words, survey data and metadata will be the heroes for those coveted shots.
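On the lens-matching point above: the field of view a real lens covers follows from its focal length and the camera's sensor width, and Blender's camera can be set to the same values. Here is a minimal sketch of that pinhole-camera relation in plain Python (not Blender's API); the 36 mm default sensor width is my assumption for a full-frame body, so swap in your HDDSLR's actual sensor size:

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view from focal length and sensor width.

    Standard pinhole relation: fov = 2 * atan(sensor / (2 * focal)).
    The 36 mm default assumes a full-frame sensor; a crop-sensor
    HDDSLR (e.g. APS-C, roughly 23 mm wide) sees a narrower view
    at the same focal length, so use the real sensor width.
    """
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# A 35 mm lens on a full-frame sensor covers roughly a 54-degree horizontal view.
print(round(horizontal_fov_deg(35.0), 1))
```

Entering the same focal length and sensor width in Blender's camera settings as on the physical camera (before barrel distortion correction) keeps the projected backgrounds and the live footage in agreement.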
You can see the animation here.