
Drone lighting

Autonomous vehicles could automatically assume the right positions for photographic lighting.
Images: Courtesy of the researchers

In the researchers' experiments, the robot helicopter was equipped with a continuous-light source, a photographic flash, and a laser rangefinder.

The helicopter automatically adjusts its position to maintain the same lighting effect as the subject moves.

Although the experiments took place in a motion-capture studio, the only measurement provided by the motion-capture system was the helicopter's horizontal position, which onboard sensors should be able to approximate adequately.

Lighting is crucial to the art of photography. But lights are cumbersome and time-consuming to set up, and outside the studio, it can be prohibitively difficult to position them where, ideally, they ought to go.

Researchers at MIT and Cornell University hope to change that by providing photographers with squadrons of small, light-equipped autonomous robots that automatically assume the positions necessary to produce lighting effects specified through a simple, intuitive, camera-mounted interface.

At the International Symposium on Computational Aesthetics in Graphics, Visualization, and Imaging in August, they take the first step toward realizing this vision, presenting a prototype system that uses an autonomous helicopter to produce a difficult effect called “rim lighting,” in which only the edge of the photographer’s subject is strongly lit.

According to Manohar Srikanth, who worked on the system as a graduate student and postdoc at MIT and is now a senior researcher at Nokia, he and his coauthors — MIT professor of computer science and engineering Frédo Durand and Cornell’s Kavita Bala, who also did her PhD at MIT — chose rim lighting for their initial experiments precisely because it’s a difficult effect.

“It’s very sensitive to the position of the light,” Srikanth says. “If you move the light, say, by a foot, your appearance changes dramatically.”

Intuitive control

With the new system, the photographer indicates the direction from which the rim light should come, and the miniature helicopter flies to that side of the subject. The photographer then specifies the width of the rim as a percentage of its initial value, repeating that process until the desired effect is achieved.

Thereafter, the robot automatically maintains the specified rim width. “If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he’s looking 90 degrees away from you, then he’s exposing his chest to the light, which means that you’ll see a much thicker rim light,” Srikanth says. “So in order to compensate for the change in the body, the light has to change its position quite dramatically.”

In the same way, Srikanth says, the system can compensate for the photographer’s movements. In both cases, the camera itself supplies the control signal. Roughly 20 times a second, the camera produces an image that is not stored on its own memory card but transmitted to a computer running the researchers’ control algorithm. The algorithm evaluates the rim width and adjusts the robot’s position accordingly.
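In control terms, this is a camera-in-the-loop feedback cycle: estimate the rim width from the latest frame, compare it against the target, and command a corrective motion roughly 20 times a second. The Python sketch below illustrates only that structure; the class, function, and gain names are hypothetical stand-ins, not the researchers' software.

import time

TARGET_WIDTH = 0.05   # desired rim width, as a fraction of image width (assumed)
GAIN = 0.5            # proportional gain on the width error (assumed)
PERIOD = 1 / 20       # the camera supplies roughly 20 frames per second

class Drone:
    """Stand-in for the light-carrying UAV; tracks an orbit angle around the subject."""
    def __init__(self):
        self.angle = 0.0

    def orbit_subject(self, delta):
        # A real system would send this correction to the flight controller.
        self.angle += delta

def estimate_rim_width(frame):
    # Placeholder; a gradient-based estimator is sketched further below.
    return 0.08

def control_loop(drone, grab_frame, steps):
    for _ in range(steps):
        width = estimate_rim_width(grab_frame())  # frame is analyzed, not stored
        error = TARGET_WIDTH - width
        # A wider-than-target rim means too much of the subject faces the light;
        # orbiting the drone in the corrective direction thins the rim again.
        drone.orbit_subject(GAIN * error)
        time.sleep(PERIOD)

control_loop(Drone(), grab_frame=lambda: None, steps=5)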

“The challenge was the manipulation of the very difficult dynamics of the UAV [unmanned aerial vehicle] and the feedback from the lighting estimation,” Durand says. “That’s where we put a lot of our efforts, to make sure that the control of the drone could work at the very high speed that’s needed just to keep the thing flying and deal with the information from the lidar [the UAV’s laser rangefinder] and the rim-lighting estimation.”

Quick study

As Srikanth explains, that required some algorithmic streamlining. “When we first started looking at it, we thought we’d come up with a very fancy algorithm that looks at the whole silhouette of the subject and tries to figure out the morphological properties, the curve of the edge, and so on and so forth, but it turns out that those calculations are really time-consuming,” Srikanth says.

Video: An aerial robot equipped with a portable light source lights various subjects, while a new algorithm processes the images from the photographer's camera. The algorithm provides motion commands to the robot to achieve the desired photographic effect.

Instead, the algorithm simply looks for the most dramatic gradations in light intensity across the whole image and measures their width. With a rim-lit subject, most of those measurements will congregate around the same value, which the algorithm takes to be the width of the rim.
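As a rough illustration of that idea, the following sketch (an assumption-laden reconstruction of the description above, not the published algorithm) scans each image row for the steepest intensity transitions, measures the width of each bright band they bound, and reports the most common width as the rim estimate.

import numpy as np

def estimate_rim_width(image, grad_thresh=0.25):
    """Estimate rim width in pixels from a grayscale image with values in [0, 1].

    Assumes, as the article describes, that in a rim-lit photograph the
    strongest intensity gradients cluster around the rim highlight.
    """
    widths = []
    for row in image:
        grad = np.diff(row)
        rising = np.flatnonzero(grad > grad_thresh)    # dark-to-bright edges
        falling = np.flatnonzero(grad < -grad_thresh)  # bright-to-dark edges
        for r in rising:
            ends = falling[falling > r]
            if ends.size:
                widths.append(ends[0] - r)             # width of one bright band
    if not widths:
        return 0.0
    # Most measurements congregate around the true rim width; take the mode.
    values, counts = np.unique(widths, return_counts=True)
    return float(values[np.argmax(counts)])

# Example: a dark frame with a 5-pixel-wide bright stripe yields a width of 5.
frame = np.zeros((64, 64))
frame[:, 30:35] = 1.0
print(estimate_rim_width(frame))  # -> 5.0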

In experiments, this quick approximation was able to keep up with the motions of both the subject and the photographer while maintaining a consistent rim width.

The researchers tested their prototype in a motion-capture studio, which uses a bank of high-speed cameras to measure the position of specially designed light-reflecting tags with millimeter accuracy; several such tags were affixed to the helicopter.

But, Srikanth explains, the purpose of the tests was to evaluate the control algorithm, which performed well. Algorithms that gauge robots’ location based only on measurements from onboard sensors are a major area of research in robotics, and the new system could work with any of them. Even rim lighting, Srikanth says, doesn’t require the millimeter accuracy of the motion-capture studio. “We only need a resolution of 2 or 3 centimeters,” he says.

“Rim lighting is a particularly interesting effect, because you want to precisely position the lighting to bring out silhouettes,” says Ravi Ramamoorthi, a professor of computer science and engineering at the University of California, San Diego. “Other effects are in some sense easier — one doesn’t need as precise positioning for frontal lighting. So the technique would probably generalize to other light effects. But at the same time, as-precise control and manipulation may not be needed. Manual static positioning might be adequate.”

“Clearly, taking the UAV system out of the lab and into the real world, and making it robust enough to be practical is a challenge,” Ramamoorthi adds, “but also something that should be doable given the rapid advancement of all of these technologies.”

Press Mentions

Wired

Wired reporter Margaret Rhodes writes about a new system developed by MIT researchers that uses drones as lighting assistants for photographs. The system operates by examining “how much light is hitting the subject, and where the drone needs to move to adjust that light.”

Gizmag

Ben Coxworth of Gizmag writes about the new system developed by MIT researchers that allows photographers to achieve rim lighting during photo shoots. “Their system not only does away with light stands, but the light-equipped aircraft automatically moves to compensate for movements of the model or photographer,” writes Coxworth.

Fortune/CNN

In a piece for Fortune, Benjamin Snyder writes about how MIT researchers have developed a new system to help achieve the perfect lighting for photo shoots. Flying robots are programmed to produce rim lighting, which illuminates the edge of the subject in a photograph. 
