
Shooting Pictures With The Assistance Of A Lighting Drone

July 12, 2014
Image Caption: In the researchers' experiments, the robot helicopter was equipped with a continuous-light source, a photographic flash, and a laser rangefinder. Courtesy of the researchers

Alan McStravick for redOrbit.com – Your Universe Online

The Massachusetts Institute of Technology (MIT) and Cornell University announced a new advancement on Friday that will surely have the photographic assistants of the world crying into their beers this weekend. When a photographer arrives at a photo shoot location, lighting is crucial to the art of photography, and it is the assistants who scope out the area and figure out the best way to light it. That assistance, apparently, is no longer needed.

The team of researchers seems perfectly content to put the assistant out of work with its new creation: a squadron of small, light-equipped autonomous robots that automatically assume the positions necessary to produce lighting effects specified through a simple, intuitive, camera-mounted interface.

The researchers plan to present their findings this August at the International Symposium on Computational Aesthetics in Graphics, Visualization and Imaging. There they will demonstrate a prototype system in which an autonomous helicopter produces a difficult lighting effect known as “rim lighting,” which highlights only the outer edge of the photographer’s subject.

This particularly difficult lighting effect was chosen deliberately, precisely because of the challenge it posed, according to Manohar Srikanth, who worked on the system first as a graduate student and postdoc at MIT and is now a senior researcher at Nokia. The other team members are MIT professor of computer science and engineering Frédo Durand and Kavita Bala of Cornell University.

“It’s very sensitive to the position of the light,” said Srikanth in a recent MIT statement. “If you move the light, say, by a foot, your appearance changes dramatically.”

The system gives the photographer intuitive control: he or she indicates the direction from which the rim lighting should come, and the drone helicopter flies to the position that produces it. Once the lighting is established, the photographer can tell the drone to adjust the width of the rim, expressed as a percentage of its initial value, repeating the process until the desired effect is achieved.
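The specification itself is deliberately simple: a direction for the light and, after the first exposure, a target rim width given relative to its initial value. The sketch below is purely illustrative of that interface; the names RimCommand and target_width are hypothetical, not the researchers’ actual API.

```python
from dataclasses import dataclass

@dataclass
class RimCommand:
    """Hypothetical photographer-side specification for the rim light."""
    direction_deg: float  # direction the rim light should come from, relative to the camera
    width_scale: float    # desired rim width as a fraction of its initial measured value

def target_width(initial_width_px: float, command: RimCommand) -> float:
    """Convert the relative width specification into an absolute pixel target."""
    return initial_width_px * command.width_scale

# Example: keep the light at 110 degrees and request a rim 80 percent as wide as in the first shot.
cmd = RimCommand(direction_deg=110.0, width_scale=0.8)
print(target_width(initial_width_px=12.0, command=cmd))  # 9.6
```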

“If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he’s looking 90 degrees away from you, then he’s exposing his chest to the light, which means that you’ll see a much thicker rim light,” Srikanth explained. “So in order to compensate for the change in the body, the light has to change its position quite dramatically.” The drone handles this automatically: once the parameters are set, it repositions itself to hold the rim-lighting effect steady.

Just as the system adapts to movements of the subject, the lighted helicopter also compensates for movements of the photographer. Whoever changes perspective, it is the camera that supplies the control signal to the hovering light source: the images it captures are not simply stored on the device but are transmitted to a computer running the researchers’ control algorithm, which evaluates the rim width in each frame and adjusts the robot’s position accordingly.
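Conceptually, this is a feedback loop: each frame yields a rim-width estimate, the estimate is compared with the photographer’s target, and the drone is nudged around the subject until the error shrinks. The sketch below is a guess at that loop using simple proportional control; the callables passed in stand for camera, estimator and flight-control details the article does not specify.

```python
import time

def rim_lighting_loop(capture_frame, estimate_rim_width, move_drone,
                      target_width_px, gain=0.02, tolerance_px=0.5, period_s=0.05):
    """Hypothetical closed loop: keep adjusting the drone's position around the
    subject until the measured rim width matches the photographer's target,
    then keep correcting as the subject or photographer moves."""
    while True:
        frame = capture_frame()            # image streamed from the camera to the computer
        width = estimate_rim_width(frame)  # e.g. the estimator sketched further below
        error = target_width_px - width
        if abs(error) > tolerance_px:
            # A wider rim roughly means the light has swung too far toward the
            # camera's side, so command a small angular correction proportional
            # to the error (the sign convention here is illustrative only).
            move_drone(delta_angle_deg=gain * error)
        time.sleep(period_s)
```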

“The challenge was the manipulation of the very difficult dynamics of the UAV [unmanned aerial vehicle] and the feedback from the lighting estimation,” Durand said. “That’s where we put a lot of our efforts, to make sure that the control of the drone could work at the very high speed that’s needed just to keep the thing flying and deal with the information from the lidar [the UAV’s laser rangefinder] and the rim-lighting estimation.”

The team concedes that its first approach to the problem was probably flawed. Srikanth explains, “When we first started looking at it, we thought we’d come up with a very fancy algorithm that looks at the whole silhouette of the object and tries to figure out the morphological properties, the curve of the edge, and so on and so forth, but it turns out that those calculations are really time-consuming.”

The team settled on a far simpler algorithm. It looks for the most dramatic gradations in light intensity across the whole image and measures their widths. With a rim-lit subject, most of those measurements congregate around the same value, which the algorithm takes to be the width of the rim.
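A minimal sketch of that idea follows, assuming a grayscale image in which the rim shows up as thin bright runs against a darker background; the brightness threshold stands in for the gradient test the team describes, and the median stands in for the value most measurements congregate around.

```python
import numpy as np

def estimate_rim_width(image, bright=0.6):
    """Estimate rim width (in pixels) from a 2-D array of intensities in [0, 1].

    Each row is scanned for contiguous runs of bright pixels (sharp rises
    followed by sharp falls in intensity). With a rim-lit subject most run
    lengths cluster around one value, which is returned as the rim width."""
    widths = []
    for row in image:
        mask = (row > bright).astype(int)
        # Pad with zeros so every bright run has both a start and an end edge.
        edges = np.diff(np.concatenate(([0], mask, [0])))
        starts = np.flatnonzero(edges == 1)
        ends = np.flatnonzero(edges == -1)
        widths.extend(ends - starts)
    if not widths:
        return 0.0
    return float(np.median(widths))
```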

Through their design and experimentation process, the team noted that this quick approximation was able to keep up with the motions of both the subject and the photographer while maintaining a consistent rim width.

To test their prototype, the researchers worked in a motion-capture studio, where a bank of high-speed cameras could measure the position of specially designed light-reflecting tags to within a millimeter. The team affixed several of these tags to the drone-copter.

As Srikanth explained, the tests were conducted primarily to evaluate the control algorithm, and the team was satisfied with how the drone-copter performed. Algorithms that gauge a robot’s location using only measurements from onboard sensors are currently receiving a lot of attention in robotics, and the team says its system could work with any of those advances. Although the motion-capture studio measured position to within a millimeter, Srikanth notes that such precision is beyond what the task requires. “We only need a resolution of 2 or 3 centimeters,” he stated.

“Rim lighting is a particularly interesting effect, because you want to precisely position the lighting to bring out silhouettes,” said Ravi Ramamoorthi, a professor of computer science and engineering at the University of California, San Diego. “Other effects are in some sense easier — one doesn’t need as precise positioning for frontal lighting. So the technique would probably generalize to other light effects. But at the same time, as-precise control and manipulation may not be needed. Manual static positioning might be adequate.

“Clearly, taking the UAV system out of the lab and into the real world, and making it robust enough to be practical is a challenge,” Ramamoorthi added, “but also something that should be doable given the rapid advancement of all of these technologies.”
