Critics scoff after Microsoft warns AI feature can infect machines and pilfer data

The goals are reasonable, but ultimately they depend on users reading dialog boxes that warn of the risks and require approval before proceeding. That, in turn, reduces the value of the protection for many users.

“The usual caution applies to mechanisms that rely on users clicking on a permission request,” Earlence Fernandes, a professor at the University of California, San Diego, who specializes in artificial intelligence security, told Ars. “Sometimes these users don't fully understand what's going on, or they just get used to it and click 'yes' all the time. At that point, the security boundary isn't really a boundary.”

As a series of “ClickFix” attacks has demonstrated, many users can be tricked into following extremely dangerous instructions. While more experienced users (including a fair number of Ars commenters) blame the victims who fall for these scams, such incidents are inevitable for a variety of reasons. In some cases, even careful users become tired or emotionally stressed and end up making a mistake. Others simply lack the knowledge to make informed decisions.

Microsoft's warning, one critic said, is nothing more than CYA (short for “cover your ass”), a legal maneuver that attempts to shield a party from liability.

“Microsoft (and the rest of the industry) has no idea how to stop prompt injections or hallucinations, making it fundamentally unsuitable for almost anything serious,” critic Reed Miedecke said. “The solution? Put the onus on the user. Just like every LLM chatbot carries the disclaimer 'oh, by the way, if you're using it for something important, be sure to check the answers,' never mind that you wouldn't need a chatbot at all if you already knew the answer.”

As Miedecke noted, much of the criticism applies to the AI offerings that other companies, including Apple, Google, and Meta, integrate into their products. Often these integrations start out as optional features and end up enabled by default, whether users want them or not.
