- Target board: Elphel 10393 (Xilinx Zynq 7Z030) with 1GB NAND flash
- U-Boot final image files (both support NAND flash commands):
- Build environment and dependencies (for details see this article):
Running OSLO’s optimization showed that defining a single operand is probably not enough. During an optimization run the program computes the derivative matrix for the operands and solves the least-squares normal equations. The iterations are repeated with various values of the damping factor in order to determine its optimal value.
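The damped iteration described above can be sketched as follows. This is a minimal illustration of a single damped-least-squares step, not OSLO's actual implementation; `J`, `r`, and the damping values are hypothetical stand-ins for the derivative matrix and operand residuals:

```python
import numpy as np

# One damped-least-squares step: solve (J^T J + d*I) dx = -J^T r,
# where J is the derivative (Jacobian) matrix of the operands and
# r is the vector of operand residuals. Illustrative only.
def dls_step(J, r, damping):
    JtJ = J.T @ J
    n = JtJ.shape[0]
    return np.linalg.solve(JtJ + damping * np.eye(n), -J.T @ r)

# The optimizer would repeat this for several damping factors and
# keep the one that most reduces the merit function.
```

With zero damping this reduces to an ordinary Gauss-Newton step; increasing the damping factor shortens the step and makes it behave more like gradient descent.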
So extra operands were added to split the initial error function: each new operand’s value is a contribution to the spot size (blurring), calculated for each color, each aberration, and several image heights. See Fig. 1 for the formulas.
The error function is the 4th root of the average of the 4th powers of the spot sizes, taken over several angles of the field of view.
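As a sketch, that merit value can be computed like this (the spot-size values are hypothetical; this is not OSLO code):

```python
# 4th root of the mean 4th power of spot sizes sampled at several
# field angles; large spots are penalized more than in an RMS metric.
def error_function(spot_sizes):
    return (sum(s ** 4 for s in spot_sizes) / len(spot_sizes)) ** 0.25

print(error_function([1.0, 1.0, 1.0, 1.0]))  # -> 1.0
```

Raising the spot sizes to the 4th power (rather than squaring, as in an RMS metric) weights the worst field angles more heavily, which suits the uniform-resolution requirement of a panoramic system.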
Elphel has embarked on a new project, somewhat different from our main field of designing digital cameras, but closely related to camera applications and aimed at further improving the image quality of the Eyesis4π camera. Eyesis4π is a high-resolution full-sphere panoramic and stereophotogrammetric camera. It is a tiled multi-sensor system with a single-sensor format of 1/2.5″. The specific requirement of such a system is uniform angular resolution, since a panoramic image has no center.
1. Get X Virtual FrameBuffer
sudo apt-get install xvfb
2. Start a virtual display:
Xvfb :15 &
3. Launch ImageJ (“cd” to the ij.jar directory):
DISPLAY=:15 java -Xmx12288m -jar ij.jar -run "TestIJ Plugin"
Links that helped:
It is available for download here.
We have finally received the parts for the Elphel-Eyesis-4π camera and started assembling them, hoping that everything will fit together as we planned. And for the most part it does, which seems a bit like magic to us: you design the camera on the computer in a 3D CAD program, make a long list of the parts it will consist of, and then a couple of months later it all turns into a physical object, not just a virtual 3D design.
Of course some of the parts will need minor modifications: some because of mistakes made by us, and some because of manufacturing problems. But none of them were significant enough to prevent us from assembling the first 3 prototypes, which will be 100% operational spherical panorama cameras. Elphel-Eyesis-4π is the second-generation panoramic imaging system by Elphel Inc. It is able to capture high-resolution images over the full 360 degrees and create 4π-steradian spherical panoramas at a high frame rate. The actual recording device consists of a weatherproof camera head that contains the image sensor front ends and lenses, distributed spherically to cover the entire 360 degree area. The rest of the electronic components, as well as the SSDs for data storage, are contained inside the camera pole.
Elphel-Eyesis-4π covers a full sphere. 24 sensors (8 in a horizontal array, 8 pointing at +30 to +90 degrees (zenith), and 8 pointing at -30 to -90 degrees (nadir)) ensure a uniform high-resolution distribution over the entire covered area. A new Inertial Measurement Unit (IMU) mounted at the top of the camera pole provides a high-resolution 3D position and orientation of the camera.
We are proud to add a new product and camera kit to the Elphel portfolio. See the price list.
The NC353L-369-IMU/GPS is a new camera configuration with an Inertial Measurement Unit (IMU) and an optional GPS receiver. In addition to storing geographical coordinates with each captured image in a video stream, this also allows saving the camera’s 3D orientation (yaw, pitch and roll) and 3D acceleration (six-degrees-of-freedom inertial sensor) at the moment of capture, at a high rate (2400 samples/second). A detailed description can be found in the previous post.
This April we attached the Eyesis camera to a backpack and took it to Southern Utah. Unfortunately the IMU hardware was not finished yet, so we could rely only on GPS for tagging the imagery. GPS alone can work when the camera is mounted on a car, but with a camera moving at pedestrian speed (images were taken 1-1.5 meters apart) it was insufficient even in open areas with a clear view of the sky. Additionally, the camera orientation was changing much more than when it is attached to a car, which (normally) moves in the direction the wheels can roll. Before moving forward with the IMU data processing we decided to try to orient and place some of the imagery manually: by looking through the panoramas and adjusting the camera heading, tilt and roll to make them look level and oriented so that the next camera location matches the map. Tweaking the KML file in a text editor seemed impractical for adjusting hundreds of panoramic images, so we decided to add more functionality to our simple WebGL panorama viewer, making it suitable both for walking through the panorama sets and for editing the orientations and locations of the camera spots.