A Robotic Spray Can That Paints Murals
Take a can of ordinary black spray paint, then attach a nozzle and two lightweight cubes covered in QR codes. Now, start sweeping your hand back and forth across a wall. The spray can automatically releases bursts of paint depending on your hand's position.
An hour later, you’re looking at a panda that covers 32 square feet: a mural-sized reproduction of a black-and-white photo, with accurate proportions and shading.
This “smart” spray can system uses computer-aided painting and robotics to help non-artists quickly and accurately reproduce photographs as large-scale murals. This technology has a number of applications, according to a Dartmouth College press release, including digital fabrication, artistic stylization and digital and visual arts.
The system was invented by a team of researchers led by Wojciech Jarosz, an assistant professor of computer science at Dartmouth College. The project was a collaboration between Disney Research Zurich (where Jarosz was previously a senior research scientist), ETH Zurich, Dartmouth College and Columbia University.
Spray by Numbers
Creating murals on walls and other large surfaces poses logistical and technical challenges even for skilled artists—obstacles that the system seeks to remove.
“Our assistive approach is like a modern take on ‘paint by numbers’ for spray painting,” says Jarosz. “Most importantly, we wanted to maintain the aesthetic aspects of physical spray painting and the tactile experience of holding and waving a physical spray can while enabling unskilled users to create a physical piece of art.”
The “smart” system combines spray paint’s low cost and ease of use with computer-aided painting technology, which originated in the early 1960s.
While computationally assisted painting methods are typically restricted to the digital realm, combining computer graphics and computer vision techniques allows the researchers to “bring such assistance technology to the physical world even for this very traditional painting medium, creating a somewhat unconventional form of digital fabrication,” according to Jarosz.
How It Works
The system uses two webcams set up near the wall or canvas, which track the can's position relative to the surface via the QR-coded cubes attached to the can. A small actuation device is attached to the nozzle with a 3D-printed mount; the servo motor that turns the nozzle on and off is controlled by a radio transmitter.
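The article doesn't detail the tracking math, but a two-camera setup watching fiducial markers typically implies stereo triangulation: each webcam reports where it sees a marker in its image, and the 3D position follows from intersecting the two viewing rays. A minimal sketch of linear (DLT) triangulation, assuming calibrated cameras with known projection matrices `P1` and `P2` (details not given in the source):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices (intrinsics * extrinsics)
    x1, x2 : (u, v) pixel coordinates of the same point in each view
    Returns the estimated 3D point in world coordinates.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous 3D point X, stacked into a 4x4 system A X = 0.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest
    # singular value (the least-squares null space of A).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

In practice a system like this would run marker detection (e.g. with a fiducial-tracking library) to get the pixel coordinates, then triangulate each tracked corner to recover the can's pose; the sketch above covers only the geometric core.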
Running in real time on a nearby computer, an algorithm determines how much paint of the current color to spray at the can's current location. Together, these elements automatically operate the spray nozzle, reproducing a pre-programmed image as a mural.
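The researchers' actual optimization isn't reproduced in the article, but the control loop it describes can be sketched greedily: at each tick, simulate what a burst of paint would do at the can's current position, and open the nozzle only if that burst would bring the canvas closer to the target image. The Gaussian spray footprint, the compositing model, and the `opacity` parameter below are all illustrative assumptions, not details from the source:

```python
import numpy as np

def spray_footprint(size, sigma):
    """Gaussian approximation of the paint density one burst deposits."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return g / g.max()

def should_spray(target, canvas, cx, cy, footprint, opacity=0.3):
    """Greedy decision: spray only if a burst at (cx, cy) reduces
    the squared error between the simulated canvas and the target.

    target, canvas : 2D arrays in [0, 1], where 1 = full paint coverage.
    Assumes the footprint fits entirely inside the canvas at (cx, cy).
    """
    r = footprint.shape[0] // 2
    ys, xs = slice(cy - r, cy + r + 1), slice(cx - r, cx + r + 1)
    patch = canvas[ys, xs]
    # New paint composites over whatever is already on the wall.
    after = patch + opacity * footprint * (1 - patch)
    err_before = np.sum((target[ys, xs] - patch) ** 2)
    err_after = np.sum((target[ys, xs] - after) ** 2)
    return err_after < err_before

def step(target, canvas, cx, cy, footprint, opacity=0.3):
    """One control tick: spray (updating the simulated canvas) or pass."""
    if should_spray(target, canvas, cx, cy, footprint, opacity):
        r = footprint.shape[0] // 2
        ys, xs = slice(cy - r, cy + r + 1), slice(cx - r, cx + r + 1)
        canvas[ys, xs] += opacity * footprint * (1 - canvas[ys, xs])
        return True
    return False
```

The same simulated canvas doubles as the feedback view: the difference between `target` and `canvas` is exactly the "areas that still need to be painted" map shown to the user.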
As the person moves the “smart” spray can around, the painting gradually reveals itself. While the system offers feedback to the user—a screen showing areas that still need to be painted—the user doesn’t necessarily need to know what image they’re creating.
Although the system currently supports painting only on flat surfaces, the researchers say the technique could be extended to more complex, curved surfaces. That's just one of the directions the development team sees for its invention.
As the report concludes, “Our system enables researchers to explore multiple future directions (e.g., automation, user interaction, quality, training) either by completely replacing the user by a robot with full control over the trajectory, or by leveraging the user’s creativity with more complex tools such as stencils, or even by training the user by developing a guidance system.”
Greater control over color mixing models and more sophisticated tracking systems could also open intriguing avenues for future work, according to the team.