> Is there anything wrong with using HDR mode all the time for monitor longevity?

Nope. If you always ran it at max brightness then yes, it would start to age the backlight sooner, but you'd also burn your eyes out. If you just run it in HDR mode and set the desktop brightness as normal, it's no different than running in SDR mode. The only downsides are:
1) You can't do 144Hz, or at least not without a tradeoff: 144Hz 8-bit RGB means dithering, and 144Hz 10-bit 4:2:2 means chroma subsampling. If you want neither, you have to drop to 120Hz 10-bit RGB (see the rough bandwidth math after this list). Not that the last 24Hz are a big deal; I'd guess the difference is totally unnoticeable, even if you had a game you could actually run that fast.
2) You have to use local dimming; you can't turn it off. I personally don't find that to be an issue, since I basically leave my monitor in Mode 0 all the time for both SDR and HDR content, but it's a limitation if you want a static backlight.
3) You have to adjust brightness in Windows. Not a huge deal, but a little annoying.
4) There may be some games that get pissy with it. FF14 in borderless windowed mode seems to go to dogshit performance-wise, with lots of FPS stuttering, when running in HDR mode. I haven't messed with it since I normally leave my monitor in SDR mode anyway (I like the increased saturation), but you may have some things that have issues.
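To put some numbers on point 1: it's purely a link-bandwidth limit. Here's a rough sketch of the math in Python; the 2560x1440 resolution, ~5% blanking overhead, and link data rates are my own illustrative assumptions, not specs pulled from this particular monitor:

```python
# Rough estimate of the video bandwidth each mode needs vs. what the link carries.
# Assumptions (illustrative): 2560x1440 panel, ~5% blanking overhead, and usable
# data rates after 8b/10b encoding of ~25.92 Gbit/s for DP 1.4 (HBR3 x4 lanes)
# and ~14.4 Gbit/s for HDMI 2.0.

LINKS_GBPS = {"DP 1.4": 25.92, "HDMI 2.0": 14.4}

def required_gbps(width, height, hz, bits_per_channel, chroma="4:4:4", blanking=1.05):
    """Approximate data rate for a video mode, in Gbit/s."""
    # 4:2:2 halves the two chroma channels: 3 full channels -> effectively 2.
    channels = 2 if chroma == "4:2:2" else 3
    return width * height * hz * blanking * bits_per_channel * channels / 1e9

modes = [
    ("144Hz 10-bit RGB  ", (2560, 1440, 144, 10, "4:4:4")),
    ("144Hz  8-bit RGB  ", (2560, 1440, 144, 8,  "4:4:4")),
    ("144Hz 10-bit 4:2:2", (2560, 1440, 144, 10, "4:2:2")),
    ("120Hz 10-bit RGB  ", (2560, 1440, 120, 10, "4:4:4")),
]

for name, args in modes:
    need = required_gbps(*args)
    verdicts = ", ".join(f"{link}: {'ok' if need <= cap else 'NO'}"
                         for link, cap in LINKS_GBPS.items())
    print(f"{name} needs {need:5.1f} Gbit/s  ({verdicts})")
```

On an HDMI 2.0-class link, that reproduces the tradeoff above: 144Hz 10-bit RGB doesn't fit, so you either drop to 8-bit (dithering), subsample chroma (4:2:2), or drop to 120Hz.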
But if you find it works well for you, there's nothing at all wrong with it, and indeed that's how MS intends you to do it. The idea is that you always run HDR monitors with HDR signaling and let the OS deal with the rest.
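For a sense of what "the OS deals with it" means: with HDR on, Windows composites the desktop onto an scRGB-style canvas where 1.0 equals 80 nits, and the SDR content brightness slider scales SDR pixels onto that canvas. A conceptual sketch, not actual OS code; the 240-nit slider setting is just an example value:

```python
# Conceptual sketch of how SDR content lands on the HDR desktop canvas.
# In Windows Advanced Color the canvas is scRGB-like, with 1.0 == 80 nits;
# the "SDR content brightness" slider picks the SDR white level.

def srgb_to_linear(c8):
    """Undo the sRGB transfer curve for one 8-bit channel value."""
    c = c8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def sdr_to_scrgb(c8, sdr_white_nits=240.0):
    """Place an SDR channel on the HDR canvas; scRGB 1.0 == 80 nits."""
    return srgb_to_linear(c8) * (sdr_white_nits / 80.0)

# Full SDR white (255) at a 240-nit slider setting lands at 3.0 in scRGB,
# leaving everything above it for true HDR highlights.
print(sdr_to_scrgb(255))   # -> 3.0
print(sdr_to_scrgb(128))   # mid grey, scaled the same way
```

That's also why brightness moves into Windows (point 3): the slider is what decides where SDR white lands on the HDR canvas, not the monitor's own brightness control.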