Home Assistant Finally Lets You Undo and Redo in Automations

Home Assistant has officially rolled out its October release. Home Assistant 2025.10 brings some huge improvements to the automation editor, along with smarter dashboards and even the ability for your connected AI model to generate images.

If you've ever built a complex automation, you know the pain of making a mistake only to realize the only way to fix it is to close the editor and start over from scratch. That nightmare is finally over, because this release introduces much-needed undo and redo functionality. You can now step back through a whopping 75 changes in your editing history, and yes, the standard Ctrl+Z and Ctrl+Y shortcuts work exactly as you'd expect.

Another major headache has been solved too, because pasting with Ctrl+V is now dead simple. If you've copied an automation block (say, a trigger or an action), you can now select any other block and press Ctrl+V to paste the copied block directly beneath it. It's a small but hugely welcome change.

The side panel introduced in the last release can now, thankfully, be collapsed. The team also apparently noticed that the "Repeat" building block in automations was trying to do too much, cramming four use cases into one complicated block. To simplify things, Home Assistant has split it into four smaller, lighter blocks with clearer descriptions.

The new blocks are "Repeat a number of times," "Repeat while," "Repeat until," and "Repeat for each." It's a great step toward making loop-based automations far more approachable, without changing the underlying structure for advanced users. Finally, the overflow menu has returned to the main section of the editor, which puts important actions, such as testing conditions, back within easy reach.
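As for those four repeat variants, under the hood they all appear to map onto the existing `repeat` action in YAML, which already supports `count`, `while`, `until`, and `for_each`; the split is about how the visual editor presents them. As a rough sketch (the entity IDs and notify target below are made up for illustration), the four variants look something like this inside an automation's actions:

```yaml
actions:
  # Repeat a number of times: flash a hallway light three times
  - repeat:
      count: 3
      sequence:
        - action: light.toggle
          target:
            entity_id: light.hallway                # hypothetical entity
        - delay: "00:00:01"

  # Repeat while: keep nagging while the window is still open
  - repeat:
      while:
        - condition: state
          entity_id: binary_sensor.kitchen_window   # hypothetical entity
          state: "on"
      sequence:
        - action: notify.mobile_app_phone           # hypothetical notify target
          data:
            message: "The kitchen window is still open"
        - delay: "00:05:00"

  # Repeat until: keep waiting until the vacuum reports it is docked
  - repeat:
      until:
        - condition: state
          entity_id: vacuum.downstairs              # hypothetical entity
          state: "docked"
      sequence:
        - delay: "00:00:30"

  # Repeat for each: turn off a list of lights one by one
  - repeat:
      for_each:
        - light.kitchen
        - light.living_room
      sequence:
        - action: light.turn_off
          target:
            entity_id: "{{ repeat.item }}"
```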

Home Assistant 2025.8 gave us the ability to generate data using an LLM, but now that AI gets to be much more creative, because it can generate images. The example the Home Assistant team showed off was that every time your doorbell is pressed, you can get a notification with an animated version of the snapshot. It's pretty cute and opens up a bunch of possibilities. I'm genuinely curious to see what wild and useful image-generation automations the community comes up with.
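If you want to experiment with something like that doorbell idea, it might look roughly like the sketch below. Treat this as a sketch only, assuming an AI Task image action: the `ai_task.generate_image` action name, the entity IDs, the attachment fields, and the shape of the response variable are all assumptions made for illustration rather than details from the release notes, so check the AI Task documentation for the exact fields your setup expects.

```yaml
# Hypothetical sketch of the doorbell idea. The ai_task.generate_image action
# name, entity IDs, attachment fields, and response field are assumptions;
# verify them against your own AI Task setup before relying on this.
automation:
  - alias: "Doorbell snapshot, but animated"
    triggers:
      - trigger: state
        entity_id: binary_sensor.front_doorbell    # hypothetical doorbell sensor
        to: "on"
    actions:
      - action: ai_task.generate_image             # assumed action name
        data:
          task_name: doorbell_art
          instructions: "Redraw this doorbell snapshot as a friendly cartoon."
          attachments:
            media_content_id: media-source://camera/camera.front_door   # hypothetical camera
            media_content_type: image/jpeg
        response_variable: generated
      - action: notify.mobile_app_phone            # hypothetical notify target
        data:
          message: "Someone is at the front door"
          data:
            image: "{{ generated.url }}"           # assumed response field
```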

The dashboard is also getting smarter, introducing suggested entities. The underlying algorithm tracks the entities you interact with the most and surfaces the relevant controls depending on the time of day. Essentially, it lets your home show you what you need to see, when you need to see it. Better still, you can add these suggested entities directly to any of your own custom dashboards.

For bilingual households, or for anyone who wants separate local and cloud assistants, Home Assistant is finally unlocking multiple wake words for ESPHome-based voice assistants. You can now define two wake words, and two assistants, for each voice satellite in your home. That means you could set "Okay Nabu" for a French cloud-based assistant and "Hey Jarvis" for a local English one.

Even better, Assist is becoming a little less chatty. If you issue a voice command and all of the actions happen in the same area as the satellite device (for example, turning on a light in the same room), Assist will now play a simple confirmation beep instead of a full spoken response. That's genuinely useful, because you've already seen the light turn on, so you don't need a full sentence confirming it.

Source: Home Assistant