It’s not my software, so take my opinions with a grain of salt…
It’s always a tough question: how much should you prevent users from making mistakes? Which mistakes? How should you prevent them?
Warnings are a little like road signs. If you put up too many, most users learn that they’re rarely useful, and after a while they all get ignored. And when they all get ignored, users miss the super important ones too – the ones where they might lose data or make a permanent change.
In other words, too many warnings can be just as harmful as too few.
Most developers save warnings for only the most important things. For harmless, impermanent mistakes it’s often best to let the user discover them naturally. That’s what Preview mode is for, after all.
A bit in a manual? Perhaps. But almost every setting and checkbox has some nonsensical combination. Is it helpful to describe each way you can misconfigure things? Again, my preference would be to describe how something works and, so long as the nonsensical stuff isn’t harmful or permanent, let the user deduce the rest naturally.
Everyone has their own feelings on this stuff – but you can find most of the guidelines I’ve laid out here in the Apple Human Interface Guidelines, the guide to building consistent, user-friendly software that Apple gave developers 35 years ago. They really are pretty solid.
“Warn people when they initiate a task that can cause an unexpected and irreversible loss of data. Such warnings are important, but like other alerts, they lose their impact if they appear too often. Don’t warn users when data loss is the expected result. For example, the Finder doesn’t warn users every time they throw away a file because getting rid of the file is the expected result.”