Remove All the Things: Using modern software to erase pesky objects
This post is by Articles: Digital Photography Review (dpreview.com)
Does this sound familiar? After paying meticulous attention to composing and capturing a scene, you open the image later and your eye is drawn immediately to… a discarded wrapper on the ground, or a tourist walking through the scene. Or profanity-inducing sensor dust that appears in every. Single. Photo.
In days past you’d grab the Clone Stamp tool and wear out your mouse-clicking finger sampling nearby pixels to hide the offending object. Now, we have plenty of methods to remove distractions: healing tools, content-aware fill, and apps that do it all for us. You’ve no doubt seen examples where a quick swipe on a screen cleanly erases photo-bombers or exes.
But do these approaches really work? That depends on what you want to remove, so let’s start small and build up to ever more annoying intrusions, looking at which tools work in those situations.
The most common problems, and usually the easiest to eradicate, are smudges or ghosted areas caused by dust on the camera’s sensor or lens. They often appear in clear skies and may not even be noticeable until you cycle through multiple images and spy the same darkened spot in each one. Nearly every image editor has some variation of a healing brush or spot healing tool: you click or paint over the spot, and the software copies nearby pixels to replace it, feathering the area so it blends with the surrounding pixels. Here’s how it works in Lightroom Classic:
|Lightroom Classic includes a Visualize Spots feature that helps you identify areas, like the smudge at the top left, using a high-contrast view of the image.|
|In Lightroom Classic, a blurry bird in the sky is distracting.|
|The Heal tool removes the bird by copying pixels from a nearby area.|
This basic Heal tool works well when the spot is not too large and the surrounding area is fairly uniform. In more complicated situations, because it’s copying and pasting pixels from elsewhere in the image, there’s always the danger that identifiable features get duplicated in a way that makes the repair obvious. Large areas, or objects set against noticeable patterns, are handled less successfully by this technique.
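The copy-and-feather idea behind these heal tools can be sketched in a few lines. Below is a minimal, illustrative Python version, assuming a grayscale image stored as a 2D list of 0–255 values; the function name and the fixed sampling offset are inventions for the example, since real tools choose the source region automatically and sample far more intelligently.

```python
# Minimal sketch of a spot-healing brush: copy a patch from a fixed
# offset and feather it with a linear falloff toward the spot's edge.

def heal_spot(img, cx, cy, radius, src_dx, src_dy):
    """Replace a circular spot centered at (cx, cy) with pixels copied
    from (cx + src_dx, cy + src_dy), feathered so the edge blends."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(cy - radius, cy + radius + 1):
        for x in range(cx - radius, cx + radius + 1):
            d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
            if d > radius:
                continue
            sy, sx = y + src_dy, x + src_dx
            if not (0 <= y < h and 0 <= x < w and 0 <= sy < h and 0 <= sx < w):
                continue
            # Feather: full replacement at the center, fading to the
            # original pixel at the circle's edge.
            alpha = 1.0 - d / radius
            out[y][x] = round(alpha * img[sy][sx] + (1 - alpha) * img[y][x])
    return out

# A flat 'sky' with a dark dust spot centered at (5, 5):
sky = [[200] * 12 for _ in range(12)]
for yy in range(4, 7):
    for xx in range(4, 7):
        sky[yy][xx] = 80

healed = heal_spot(sky, 5, 5, 4, 5, 0)  # sample from 5 px to the right
print(healed[5][5])  # → 200: the center is fully replaced by clean sky
```

This also shows why the technique stumbles on patterned backgrounds: whatever texture sits at the source offset gets stamped into the repair.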
When Adobe previewed Content-Aware Fill in Photoshop CS5, it was magic. A single click or swipe over an area removed the offending object and filled the gap in a smarter way than the other clone or healing tools.
Now similar technology is found in most editing software. The workflow is the same: you click or drag over the item to remove, and the tool intelligently determines how to fill in the area. In this case it’s not directly lifting patches of pixels from nearby, but instead constructing a replacement based on the colors and tones in that section and elsewhere in the image.
Lightroom Classic 12.3 introduced a new Content-Aware Remove mode to its Heal tools, which often does a better job of fixing an area than the Heal tool. In ON1 Photo RAW the tool is called the Perfect Eraser. Photoshop includes a Content-Aware option in the toolbar for its healing tools. Luminar Neo offers an Erase tool. Capture One calls it the Magic Eraser. You’ll also find the same capabilities in standalone apps such as TouchRetouch. You get the idea.
The challenge is that the tool is still sampling pixels from within the image to invent its background, because it doesn’t have any other reference. So, again, large areas to be erased are more difficult, and it can stumble when there are patterns in the filled area.
|Removing people in front of these old stairs is a challenging task for the Content-Aware Remove tool in Lightroom Classic.|
|Immediately, it’s clear that Content-Aware Remove didn’t even line up the steps, leaving an obvious blemish on the image.|
|Photoshop’s Content-Aware technology has done a better job, but still includes artifacts like a big patch of ground in the middle of the stairs.|
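To make the “constructing a replacement from the surroundings” idea concrete, here’s a deliberately crude diffusion-style fill in Python. This is not what any of these products actually ship (they use far more sophisticated methods, such as patch-based synthesis), but it illustrates how plausible pixels can be invented from the boundary alone — and why the approach breaks down when the missing area contains structure the boundary doesn’t describe, like the stair edges above.

```python
# Crude diffusion fill: each unknown pixel is repeatedly averaged with
# its neighbors until values settle, so the hole inherits the smooth
# color trends of its surroundings (but none of their texture).

def diffuse_fill(img, mask, iterations=50):
    """img: 2D list of floats; mask: 2D list, True where pixels must be
    invented. Averages each masked pixel with its 4-neighbors."""
    h, w = len(img), len(img[0])
    out = [[0.0 if mask[y][x] else img[y][x] for x in range(w)]
           for y in range(h)]
    for _ in range(iterations):
        for y in range(h):
            for x in range(w):
                if not mask[y][x]:
                    continue
                nbrs = [out[y + dy][x + dx]
                        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                        if 0 <= y + dy < h and 0 <= x + dx < w]
                out[y][x] = sum(nbrs) / len(nbrs)
    return out

# A horizontal gradient with a hole punched in the middle:
grad = [[float(x * 25) for x in range(9)] for _ in range(9)]
hole = [[3 <= x <= 5 and 3 <= y <= 5 for x in range(9)] for y in range(9)]
filled = diffuse_fill(grad, hole)
print(round(filled[4][4]))  # → 100, matching the original gradient value
```

A smooth gradient reconstructs almost perfectly; a brick wall or staircase would come back as a featureless smear, which is exactly the failure mode seen in the stairs example.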
And now here comes AI to swoop in and make the magic of Content-Aware Fill look like an old parlor trick. Tools based on machine learning (ML) technologies have the potential to erase larger areas and fill in the gaps based not just on the surrounding image, but on what they know about thousands of similar scenes. They’re also geared toward identifying objects to be removed, speeding up the process of making a fix.
For example, the Magic Eraser feature on the Google Pixel 6 and Pixel 7 phones mostly lives up to its name. When you activate the tool in the Google Photos app, it scans the image for objects you likely want to erase, such as people in the background, and highlights them. You can also drag over things manually, but I find it does a good job of identifying possibilities. (Google recently made the feature available to Google One subscribers in the Google Photos app on other devices.)
|The Photos app on a Google Pixel 6 identified the tourists as likely subjects for erasure in this image.|
|Tapping Erase All removed them, mostly.|
Going back to Photoshop, Adobe recently released a public beta that includes a new Remove tool that takes erasing a step further than the healing tools. It samples pixels from the rest of the image, yes, but it’s also smarter about replacing the missing area. Leaning on Adobe Sensei (cloud-based AI) technology, it detects patterns and lines in background objects and renders them correctly more often (as usual, with the caveat that it depends on what that background is).
|The Remove tool in the public beta of Photoshop 24.5 kept the stairs intact when removing the people.|
Similarly, Inpixio is able to guess which subjects you may want to remove and erase them with a single click or a few swipes.
|Inpixio’s AI removal found the subjects…|
|…and removed them rather cleanly.|
That “few swipes” is key in all of this. Although the AI-assisted tools generally do a good job, you’ll still find yourself cleaning up areas.
The main constraint on all of these tools is the finite supply of available pixels. When we look at a scene, we know what’s behind a person or object, but the software doesn’t. You may have other photos where that area is clear, and there are methods of compositing or blending images to do the trick.
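One classic compositing trick, when you do have several frames of the same scene, is a per-pixel median stack: anything that moves between exposures, like a passing tourist, is simply voted out by the frames where that spot is clear. A minimal sketch, assuming already-aligned grayscale frames stored as nested lists (real workflows align the frames first and work on full-color images):

```python
import statistics

def median_stack(frames):
    """frames: list of same-sized 2D grayscale images (nested lists).
    Returns the per-pixel median: a subject present in only a minority
    of frames disappears from the result."""
    h, w = len(frames[0]), len(frames[0][0])
    return [[statistics.median(f[y][x] for f in frames) for x in range(w)]
            for y in range(h)]

# Three frames of the same scene; a 'tourist' (value 30) wanders
# through a different spot in each one:
base = [[200] * 5 for _ in range(3)]
frames = []
for spot in (0, 2, 4):
    f = [row[:] for row in base]
    f[1][spot] = 30
    frames.append(f)

clean = median_stack(frames)
print(clean[1][2])  # → 200: the tourist is gone from every position
```

Because each position is dark in at most one of the three frames, the median at every pixel is the clean background value.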
What if the software could invent brand-new pixels instead of pulling them from other areas? That’s the promise of incorporating generative-AI technologies into the editing process. At Google’s recent I/O conference, the company previewed an AI-enhanced Magic Eraser feature that will rebuild areas after moving elements in a scene. No release date has been announced other than “later this year,” but you can bet that we’ll see such announcements from other companies as the year progresses.
For a current example, DALL-E 2 offers an ‘inpainting’ feature that will replace areas of an image you upload. It’s a somewhat convoluted process: in an image editor you mask out the item or person you want to remove, save a copy at 1024 by 1024 pixels, upload that to DALL-E 2 and describe the full image as you’d like it to appear. Then you generate options, download the best one and blend it with the original in Photoshop or another layer-based editor (likely upscaling the low-resolution generated version and compositing it with the higher-resolution original). (To view this process in action, watch this PiXimperfect video.)
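The masking step of that workflow can be mimicked in code. The sketch below assumes the common convention (used by DALL-E’s edit endpoint, among others) that fully transparent pixels mark the region the generator is free to regenerate; the helper name is invented for the example, and a real pipeline would use a library such as Pillow to produce an actual 1024×1024 PNG rather than nested lists of tuples.

```python
# Hedged sketch of preparing an inpainting mask. Images here are nested
# lists of (R, G, B, A) tuples; transparency (A == 0) marks the area
# the generative model should replace.

def punch_hole(pixels, x0, y0, x1, y1):
    """Zero out the alpha channel inside the given rectangle, telling
    the generator which pixels it is free to invent."""
    return [
        [(r, g, b, 0) if x0 <= x <= x1 and y0 <= y <= y1 else (r, g, b, a)
         for x, (r, g, b, a) in enumerate(row)]
        for y, row in enumerate(pixels)
    ]

# A tiny opaque gray 'photo' with a tourist occupying columns 2-4:
photo = [[(128, 128, 128, 255)] * 8 for _ in range(6)]
masked = punch_hole(photo, 2, 1, 4, 4)
print(masked[2][3][3], masked[0][0][3])  # → 0 255 (hole vs. untouched)
```

You’d then upload the masked copy along with a text prompt describing the scene as you want it to look, and the model fills in only the transparent region.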
No doubt this capability will arrive for most, if not all, of the aforementioned photo editing apps as the companies incorporate the technology over the coming months.
Despite the rapid pace of technological improvement in removing ever larger or more difficult objects from photos, we still often find ourselves retouching the results manually. So, don’t be surprised when you need to break out that Clone Stamp tool again to clean up what’s left behind in order to sell the illusion.