Most websites don’t need DDoS protection.
Many websites use Cloudflare just to block basic bot vulnerability scanning. You could block this type of traffic with other methods: JA3/JA4 fingerprinting, IP-to-ASN lookups with ASN filtering, etc.
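To make the ASN-filtering idea concrete, here is a rough Python sketch. It assumes the geoip2 package and a local GeoLite2-ASN database; the database path and the blocked ASNs are placeholders, not a recommendation of what to block.

```python
# Minimal ASN-based request filter sketch (not production code).
# Assumes the `geoip2` package and a local GeoLite2-ASN database file.
import geoip2.database
import geoip2.errors

BLOCKED_ASNS = {64496, 64511}  # placeholder ASNs (reserved for documentation)

reader = geoip2.database.Reader("GeoLite2-ASN.mmdb")  # path is an assumption

def is_blocked(client_ip: str) -> bool:
    """Return True if the client IP's ASN is on the blocklist."""
    try:
        asn = reader.asn(client_ip).autonomous_system_number
    except geoip2.errors.AddressNotFoundError:
        return False  # unknown address: let other layers decide
    return asn in BLOCKED_ASNS

if __name__ == "__main__":
    print(is_blocked("203.0.113.7"))  # TEST-NET address, prints False
```

In practice you'd likely enforce this at the edge (firewall, WAF, or reverse proxy) rather than in application code, but the lookup logic is the same.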
While it may not impact your site, it does impact your hosting provider. As their costs go up, your costs go up. Anything on the Internet at this point needs DDoS/scraping protection. It may not take down your service, but your ISP or upstreams may blackhole your route.
The "old web" (current web) was largely based on an open exchange of information.
The "new web", post AI bot scraping, is taking its place. Websites are getting paywalls. Advertising revenue is plummeting. Hosting providers are getting decimated by the massive shift in bandwidth demand and impact to systems scraped by the bots.
I guess my products fall into a niche that doesn’t seem to attract AI crawlers. I’ve seen only a few and they haven’t been too aggressive. I mean they ignore typical crawl rate limits defined in robots.txt but account for maybe only 1-2% of my overall traffic.
DDoS and AI are mostly unrelated. Sure, AI companies are running low-quality scrapers, but they don't cause nearly as much traffic as a DDoS. They might cause as much CPU load as a DDoS, which is an application-level problem.