Eyesis going for the three-sixty
With the first Eyesis content available, I started prototyping the panoramic workflow. We would like a reference design to be available so that the camera can be used without custom development. For practical reasons I started with the relatively low-quality JPEG output. As with all current Elphel cameras, JPEG, JP4 and JP46 modes exist. For frame rate and quality, the JP4 mode is the optimal sweet spot: it gives a theoretical maximum of 5.31 fps (panorama), with host-side debayering.
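To give an idea of what the host-side debayer step involves, here is a deliberately naive nearest-neighbour demosaic of an RGGB Bayer mosaic. This is only an illustrative sketch: real JP4 decoding also has to undo the camera's block reordering, and production pipelines use bilinear or better interpolation.

```python
def debayer_nn(raw):
    """Naive nearest-neighbour demosaic of an RGGB Bayer mosaic.

    `raw` is a 2D list of ints with even dimensions; each 2x2 cell is
    R G / G B. Every pixel of the cell gets the same (r, g, b) value,
    with the two greens averaged. Illustrative only.
    """
    h, w = len(raw), len(raw[0])
    rgb = [[(0, 0, 0)] * w for _ in range(h)]
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) // 2
            b = raw[y + 1][x + 1]
            for dy in (0, 1):
                for dx in (0, 1):
                    rgb[y + dy][x + dx] = (r, g, b)
    return rgb
```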
With a total of nine cameras producing three video streams of data, our first task is to export the content from the streams and align it. The EXIF information per frame contains sub-second timestamps, which is enough to match the correct images with each other. Because all sensors are synced, the timestamp values should be, and indeed are, equal. For each timestamp, three composite frames are available, one per stream; each composite is taken from three different sensors, and we chop it into three individual images, resulting in nine images per timestamp.
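The grouping and chopping described above can be sketched in a few lines of Python. The frame structure here is hypothetical (a `(stream_id, timestamp, image)` tuple, with the composite modelled as a vertically stacked list of pixel rows); the actual tooling reads the timestamps from EXIF.

```python
from collections import defaultdict


def group_by_timestamp(frames):
    """Group frames from the three streams by their sub-second timestamp.

    `frames` is an iterable of (stream_id, timestamp, image) tuples --
    a hypothetical structure standing in for decoded stream output.
    Only timestamps for which all three streams delivered a frame are kept.
    """
    groups = defaultdict(dict)
    for stream_id, ts, image in frames:
        groups[ts][stream_id] = image
    return {ts: g for ts, g in groups.items() if len(g) == 3}


def split_composite(image, parts=3):
    """Chop one vertically stacked composite frame into `parts` sub-images."""
    h = len(image)
    assert h % parts == 0, "composite height must divide evenly"
    step = h // parts
    return [image[i * step:(i + 1) * step] for i in range(parts)]
```

Applied per timestamp, `split_composite` on each of the three grouped composites yields the nine individual images.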
In order to blend the different pictures taken at a single moment into a panoramic photo, free software is already available; an integrated environment can be found in the Hugin project. First of all, it is good to know where we stand regarding lens alignment, and why invest more time in software development if something functional already exists?
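For reference, a typical stitch with the Hugin command-line tools looks roughly like the pipeline below. Exact flags vary between Hugin versions, so treat this as a sketch rather than the precise workflow used here.

```shell
# Build a project file from the nine images of one timestamp
pto_gen -o pano.pto img_*.jpg
# Find control points by matching shared visual features
cpfind --multirow -o pano.pto pano.pto
# Optimise image positions and lens parameters
autooptimiser -a -l -s -o pano.pto pano.pto
# Set a cylindrical projection and auto canvas/crop
pano_modify --projection=1 --canvas=AUTO --crop=AUTO -o pano.pto pano.pto
# Remap the images and blend the resulting tiles
nona -o tile pano.pto
enblend -o panorama.tif tile*.tif
```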
The following image shows the overlap between the individual images. I used a reference frame and a cylindrical projection to get a stretched view. The camera was not calibrated, so all alignment was done using shared visual information.
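The cylindrical projection mentioned above maps each pinhole-image pixel onto a cylinder whose radius equals the focal length. A minimal sketch of the standard mapping (not necessarily the exact variant Hugin applies), with an assumed principal point `(cx, cy)` and focal length `f` in pixels:

```python
import math


def cylindrical_xy(x, y, cx, cy, f):
    """Project pixel (x, y) of a pinhole image onto a cylinder of radius f.

    (cx, cy) is the principal point, f the focal length in pixels.
    Returns the pixel's coordinates in the cylindrical image.
    """
    theta = math.atan2(x - cx, f)          # angle around the cylinder axis
    h = (y - cy) / math.hypot(x - cx, f)   # normalized height on the cylinder
    return f * theta + cx, f * h + cy
```

Pixels near the principal point are almost unchanged, while pixels toward the left and right edges are pulled inward, which produces the stretched look of the cylindrical view.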
The current output needs some work, but it already shows what is possible, even with slightly shifted lenses and without a shared white balance. Click on the image for a 12 MB full-resolution version!