This April we attached the Eyesis camera to a backpack and took it to Southern Utah. Unfortunately, I had not finished the IMU hardware by then, so we could rely only on GPS for tagging the imagery. GPS alone can work when the camera is on a car, but with a camera moving at pedestrian speed (images were taken 1-1.5 meters apart) it was insufficient even in open areas with a clear view of the sky. Additionally, the camera orientation changed much more than when it is attached to a car, which (normally) moves in the direction the wheels can roll. Before moving forward with the IMU data processing we decided to try orienting/placing some of the imagery manually – by looking through the panoramas and adjusting the camera heading/tilt/roll until each one looked level and was oriented so that the next camera location matched the map. Tweaking the KML file in a text editor seemed impractical for adjusting hundreds of panoramic images, so we decided to add more functionality to our simple WebGL panorama viewer, making it suitable both for walking through the panorama sets and for editing the orientations and locations of the camera spots. (more…)
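For readers curious why hand-editing the KML was at least partly automatable: the per-panorama orientation lives in the standard KML 2.2 `<Camera>` element as `<heading>`, `<tilt>`, and `<roll>` values in degrees. A minimal sketch of batch-adjusting those values (the function name and the sample file content are made up for illustration; the element names follow the KML 2.2 specification):

```python
# Sketch: add a constant offset to every <Camera> heading in a KML file,
# wrapping the result into [0, 360). Element names follow KML 2.2.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", KML_NS)  # keep KML as the default namespace on output

def adjust_headings(kml_text, delta_deg):
    """Return kml_text with delta_deg added to every Camera heading."""
    root = ET.fromstring(kml_text)
    for heading in root.iter("{%s}heading" % KML_NS):
        heading.text = "%.2f" % ((float(heading.text) + delta_deg) % 360.0)
    return ET.tostring(root, encoding="unicode")

# Hypothetical one-placemark file, just to show the shape of the data:
sample = """<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark><Camera>
    <heading>350.00</heading><tilt>2.00</tilt><roll>0.00</roll>
  </Camera></Placemark>
</kml>"""
print(adjust_headings(sample, 15.0))  # heading becomes 5.00
```

A per-panorama interactive tweak (what the viewer does) still beats any constant offset, but bulk corrections like this are a useful first pass.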
June 15, 2011
May 19, 2011
For almost 3 years we have had the ability to geo-tag images and video using an external GPS and an optional accelerometer/compass module that can be mounted inside the camera. Information from both sensors is included in the Exif headers of the images and video frames. The raw magnetometer and accelerometer data stored at the image frame rate has limited value: it needs to be sampled and processed at a high rate to be useful for orientation tracking, and for position tracking it has to be combined with the GPS measurements.
We developed software to receive positional data from either a Garmin GPS18x (which can be attached directly to the USB port of the camera) or a standard NMEA 0183 compatible device through a USB-to-serial adapter. In the latter case the device may need a separate power supply, or a special (or modified) USB adapter that can power the GPS unit from the USB bus. (more…)
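The position itself arrives in plain-text NMEA 0183 sentences. As a minimal sketch (not our camera code), this is roughly what decoding the latitude/longitude from a standard "GGA" fix sentence looks like; NMEA packs coordinates as degrees and decimal minutes (`ddmm.mmmm`), which must be converted to signed decimal degrees:

```python
# Minimal sketch of decoding position from an NMEA 0183 "GGA" sentence.
def parse_gga(sentence):
    """Return (lat, lon) in signed decimal degrees from a $GPGGA sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def to_degrees(value, hemisphere):
        # NMEA packs position as ddmm.mmmm (dddmm.mmmm for longitude)
        head, minutes = divmod(float(value), 100.0)
        degrees = head + minutes / 60.0
        return -degrees if hemisphere in ("S", "W") else degrees

    lat = to_degrees(fields[2], fields[3])
    lon = to_degrees(fields[4], fields[5])
    return lat, lon

# Example sentence (coordinates roughly in the Salt Lake City area):
lat, lon = parse_gga(
    "$GPGGA,123519,4045.00,N,11153.00,W,1,08,0.9,1300.0,M,,M,,*47")
```

A real receiver stream also needs checksum verification and handling of empty fields when there is no fix, omitted here for brevity.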
April 26, 2011
Taking high-resolution panoramic images in remote places that can only be reached on foot is an option now available with the Elphel-Eyesis 360 degree panorama camera. The camera’s size (1.3 x 0.3 meters) and relatively light weight (10 kg) allow it to be mounted on a backpack frame and carried by a person.
There had been multiple requests for a backpack option from our customers since the development of the Elphel-Eyesis camera, but other projects were of higher priority until this spring, when we finally decided it was time to take Eyesis hiking. After all, we have worked hard on this project for many months, so we ought to have some fun with it too, and take panorama images of the places we knew and enjoyed for their scenery.
On April 19th, 2011, we took the Elphel mobile office to camp for 5 days in Southern Utah near Goblin Valley State Park to try out Eyesis in beautiful places not yet available in continuous panoramic imagery, mainly because they are inaccessible by car, ATV, or even a tricycle.
Hiking with Eyesis in the backpack (30 lbs/14 kg total weight with the battery pack) allows us to capture a continuous stream of GPS-geotagged 360° panoramic images. With the current battery pack (just a regular UPS with a lead-acid battery) we can take up to one hour and forty minutes of footage at a rate of 5 frames per second.
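As a back-of-the-envelope check of those numbers (the frame count is our arithmetic, not stated in the post), one battery charge works out to:

```python
# 1 h 40 min of shooting at 5 frames per second:
runtime_s = (1 * 60 + 40) * 60   # 100 minutes expressed in seconds
fps = 5
frames = runtime_s * fps
print(frames)                    # 30000 panoramas per battery charge
```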
March 30, 2011
UPD: Google has updated their Salt Lake City images in their StreetView – so, the comparison is better now.
Works in Firefox 4.0 and Chrome. Does NOT work in Firefox 3.6.X.
The above sample shows how custom panoramas (in our case: a panorama we shot with our Elphel Eyesis) can be integrated in a custom StreetView.
The Elphel Eyesis images are in the top-left window. The top-right window shows the current (lower resolution) official Google Street View in Salt Lake City, supposedly made by the 2nd or 3rd generation of GSV cameras.
The OpenStreetMap is used for switching between the available panorama points, while the Google Map with a pegman fills the otherwise unused space and, at the same time, generously shows the view direction.
March 14, 2011
Last week our phone system was broken into, and we got a phone bill for some five hundred dollars for calls to Gambia. The expense itself was not terrible, but that amount is usually enough for many months of phone service for our small company – in the VoIP era, international phone rates (at least for the destinations we use) are really low. The scary thing was that the attack lasted only a very short time – minutes, not hours – so the damage could have been significantly higher. (more…)
March 11, 2011
Both the Google Maps API and the OpenLayers API are quite simple, though it can take some time to find a perfect example. The maps have been added to the WebGL panorama view test page (read “Experimenting with WebGL panoramas”).
March 3, 2011
As of today 2 new accessory parts have been added to the official Elphel price list. These are 45 degree and 90 degree angle pieces for the sensor front end. They can also be combined to create a 135 degree angle piece.
February 28, 2011
Current state of the Eyesis project:
what worked and what did not – or did not work as well as we would have liked
We spent most of the last year developing the Eyesis panoramic cameras: designing and then assembling the hardware, and working on the software for image acquisition and processing. It was, and continues to be, a very interesting project; we had to confront multiple technical challenges and come up with solutions we had never tried before at Elphel – many of these efforts are documented in this blog.
We built and shipped several Eyesis cameras to customers, leaving one for ourselves so we could use it for developing new code and for testing the performance of the camera as a whole and of its individual components. Most things worked as we wanted them to, but after building and operating the first revision of Eyesis we understood that some parts should be made differently.
January 3, 2011
December 21, 2010
This is a quick update to “Zoom in. Now… enhance.” – a practical implementation of aberration measurement and correction in a digital camera – published last month. That post had many illustrations of the image post-processing steps, but lacked the most important part: real-life examples of processed images. At the time we simply did not have such images; we also had to find a way to acquire calibration images at a distance that can be considered “infinity” for the lenses – the first images used a shorter distance of just 2.25 m between the camera and the target, with the target size limited by the size of our office wall. Since then we have improved the software that combines the partial calibration images, converted it to multi-threaded operation to increase performance (using all 8 threads of the 4-core Intel i7 CPU resulted in approximately 5.5 times faster processing), and calibrated the two actual Elphel Eyesis cameras (only the 8 lenses around; the top fisheye is not done yet). We were then able to apply the recent calibration data (here is a set of calibration files for one of the 8 channels) to images we had acquired before the software was finished. (more…)
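The ~5.5× speedup on 8 threads quoted above also hints at how much of the pipeline remained serial. A rough estimate via Amdahl’s law (our back-of-the-envelope calculation, not a measurement from the post – and note the 8 threads are hyperthreads on 4 physical cores, so Amdahl only gives a crude bound here):

```python
# Estimate the serial fraction s implied by Amdahl's law:
#   speedup = 1 / (s + (1 - s) / n_threads)
def amdahl_serial_fraction(speedup, n_threads):
    """Solve Amdahl's law for the serial fraction s."""
    return (n_threads / speedup - 1) / (n_threads - 1)

s = amdahl_serial_fraction(5.5, 8)
print("implied serial fraction ~ %.1f%%" % (s * 100))  # about 6.5%
```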