The holidays are when families get together and take more photos than usual. And although you can ask everyone to smile and look at your iPhone's camera, you can't control the distractions in the background. But once you've taken the pictures, you can use the Apple Intelligence-powered Clean Up tool in the Photos app in iOS or macOS to remove the people or objects that attract unwanted attention.
Clean Up analyzes the image, suggests elements you might want to remove, such as people or vehicles in the background, and then fills in the removed area. Sometimes the fix is invisible to most viewers, and sometimes the results are laughably bad. Having worked through many types of photos with this tool, I've developed some general guidelines to help you get the most cleaned-up images possible.
The Clean Up tool can remove distractions.
Surprisingly, Photos on the iPhone and iPad has never had a tool like Clean Up for removing minor distractions. The Mac version includes a basic Retouch tool that can repair some areas, and it's replaced by the Clean Up tool on compatible Macs.
But it's important to remember that Clean Up is an Apple Intelligence feature, so you'll see it only if you're using a compatible device. That includes iPhones that support Apple Intelligence (iPhone 15 Pro and later) running iOS 18.1 or later, iPads with M-series chips (and the iPad mini with the A17 Pro chip) running iPadOS 18.1 or later, and Macs with M-series chips running macOS Sequoia 15.1 or later.
For more information about Apple Intelligence, see my thoughts on which features you'll likely use most often and where its notification summaries need improvement.
How is Clean Up different from other retouching tools?
The repair tools in most photo editing apps work by copying nearby or similar pixels to fill the space where you make the correction. For example, they are great for removing glare or dust spots from the sky.
The Clean Up tool uses generative AI that analyzes the entire scene and makes suggestions about what should fill the area you select. If you want to remove a dog standing in front of a tree, for example, the generative AI creates a replacement based on what it knows about the texture of the tree and the foliage in the background, taking into account the level and direction of the lighting in the photo.
The "generative" part of generative AI refers to how it creates an image. The pixels that fill an area appear, quite literally, out of thin air: the software starts with a random set of points and rapidly iterates to create what it thinks should appear in that space.
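That iterative process can be sketched with a toy example. This is only a conceptual illustration, not how Photos actually works internally: real generative inpainting uses a trained diffusion model to decide what each step should look like, while this stand-in `refine_from_noise` function cheats by nudging random noise toward a known target.

```python
import random

def refine_from_noise(target, steps=50, rate=0.2):
    """Toy sketch of iterative refinement: start from pure random
    noise and nudge the values toward a plausible result each step.
    In a real diffusion model the 'target' isn't known in advance;
    a trained network predicts it fresh at every iteration."""
    random.seed(0)
    # Begin with random noise -- pixels "out of thin air."
    pixels = [random.random() for _ in target]
    for _ in range(steps):
        # Each iteration moves the noisy pixels a fraction of the
        # way toward the current best guess (here, the target).
        pixels = [p + rate * (t - p) for p, t in zip(pixels, target)]
    return pixels

# After enough steps, the noise has converged onto the target values.
result = refine_from_noise([0.1, 0.5, 0.9])
```

The point of the sketch is only the shape of the process: random start, many small refinement steps, a coherent result at the end.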
Keep in mind that retouching tools that use generative AI are never perfect; YMMV, as in "your mileage may vary." I've had good results in complex compositions and terrible results in areas I thought the app would handle easily.
How to Remove Distractions Using Apple's Clean Up Tool
The Clean Up tool uses two approaches to retouching photos. Using machine learning, it suggests objects, such as people or vehicles in the background, as candidates for removal. Or you can circle or brush over what you want to remove and direct Photos to work on that area. The process breaks down as follows:
1. Open a photo and tap the Edit button. (On macOS, click the Edit button or press the Return key.)
2. Tap Clean Up. The first time you use the tool, Photos needs to download the Clean Up resources, which takes a minute or so depending on your internet connection. Photos then analyzes the image and highlights any objects it thinks can be removed with a translucent shimmer.
Open the photo editing interface and click Clean Up. Photos suggests what to delete.
3. To remove a suggested item, tap it. Or draw a circle around any object that isn't highlighted.
4. Don't be surprised if the area doesn't clean up completely on the first try; you may have to go over the remaining spots to remove more. If you're not happy with the fix, tap the Undo button.
If Clean Up doesn't capture everything (note that the person's legs in the image on the left aren't selected), use the tool again to continue cleaning the area.
5. When you're finished, tap Done. As with all edits in Photos, you can revert to the original if you want to start over: tap More (…) and choose Revert to Original.
An Unexpected and Cool Feature: A Privacy Filter
You'll primarily use the Clean Up tool to get rid of distracting elements in a scene, but it has another trick: you can hide the identity of someone in a photo.
Draw a circle around their face. You don't need to fill it in; a quick loop is enough. Photos places a block mosaic pattern over the person's face to obscure it.
The privacy filter is a clever use of the Clean Up tool.
Where You'll See the Most Success with Clean Up
Some scenes and areas work better with Clean Up than others, so it's good to know where to focus your efforts.
In testing, I had the most success in the following general categories of fixes:
- Small distractions. Items such as debris on the ground or dust and lint on people's clothing almost always come out well.
- Background textures. Areas such as tree leaves, grass or rock are reproduced convincingly.
- Lens flare, which is caused by light reflecting between the camera's lens elements, as long as it isn't too large.
- Passersby or vehicles in the background that don't take up much space.
- Areas of sparse detail or plain background.
Sometimes Clean Up works well; originals are at the top, edited versions at the bottom.
In general, when selecting an area, be sure to capture any reflections or shadows cast by the element you want to remove. Luckily, Photos often notices these and includes them in its selection.
Be sure to select shadows and reflections (left). Clean Up determines what to remove based on a broad selection (center). A small reflection remains (right), but it can be removed with another pass of the tool.
Areas to Avoid When Using Clean Up
Some targets may frustrate you if you try to remove them. For example:
- Very large areas. If the selection is too big, Photos gives up and prompts you to mark a smaller area, or the result comes out mangled. It also struggles to generate something that would plausibly appear in such a large space.
- Busy areas with distinct features. Tree leaves in the distance usually work well, but not when there are recognizable structures or objects. For example, removing a prominent leaf from a pile of leaves, or removing people in front of recognizable landmarks, doesn't go well.
Removing large objects in the frame produces a jumble of pixels.
Where Clean Up Needs Extra Work
Keep in mind that Clean Up and the other Apple Intelligence features are technically still in beta, although they're available to anyone with a compatible device who signs up for them. (I have some thoughts about installing beta software in general.)
And while you may get good results, there are still a few areas I'd like Apple to improve in future releases. Chief among them is the quality of the replaced areas, which is sometimes poor, looking more like the smeared output of non-AI repair tools. I expected Apple's algorithms to do a better job of identifying what's in a scene and creating replacement regions.
From a user experience perspective, if you don't like what Clean Up generates for a deletion, your only option is to undo or reset the edit. And if you undo the operation and then try again, you get the same result as before. Adobe Lightroom, by contrast, offers three options for each correction, with the ability to generate another set if you don't like how they turn out.
Lightroom (the iPhone app is shown here) offers three options for each removed area.
Clean Up, like other similar AI-powered removal tools, also suffers from raised expectations: we've seen that it can do great things, which lifts the bar for what we think every edit should look like. So when the tool gets confused and produces a jumble of disjointed pixels, we expect it to do better. Perhaps in future releases it will.
To learn more about what Apple Intelligence can do on your Apple devices, take a look at the Visual Intelligence feature.