One use case I've needed cloud services for is web scraping: sites that IP-ban scrapers will often still allow requests from major cloud provider IP space.
How are sites detecting and banning once-an-hour users? Sure, if you spider an entire website continuously you can get IP banned, but if you're making 24 (or even 96) requests a day to visit one site and check some data, your traffic is indistinguishable from baseline.
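To make the point concrete, here's a minimal sketch of the kind of low-frequency polling being described: spread the day's checks out with some jitter so the requests don't fire at machine-precise intervals. The function name and parameters are my own illustration, not from any particular scraper.

```python
import random

def jittered_schedule(requests_per_day=24, jitter_frac=0.25, seed=None):
    """Return gaps (in seconds) between checks, spread over one day.

    Each gap is the mean interval plus/minus up to jitter_frac of it,
    so the traffic pattern isn't an exact cron-like heartbeat.
    """
    rng = random.Random(seed)
    base = 86400 / requests_per_day  # mean gap between requests
    return [base + rng.uniform(-jitter_frac, jitter_frac) * base
            for _ in range(requests_per_day)]

gaps = jittered_schedule(seed=42)
print(len(gaps))  # → 24
```

At 24 requests a day that's a request roughly every hour, give or take fifteen minutes, which is well inside what ordinary human browsing looks like.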
I've seen bigger sites block requests from IP ranges that belong to hosting providers. Don't know about the big cloud providers specifically, but doing it from home with a residential IP address has the highest chance of success. You might still sometimes end up on the Cloudflare verification page, though.
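The range-based blocking described above amounts to a CIDR membership check against published datacenter address space. A rough sketch of how a site might do it, using Python's stdlib `ipaddress` module; the CIDRs below are RFC 5737 documentation ranges standing in for real hosting-provider blocks, which sites would pull from published provider lists or ASN data:

```python
import ipaddress

# Placeholder ranges (TEST-NET blocks), not real hosting-provider space.
DATACENTER_RANGES = [ipaddress.ip_network(c) for c in (
    "203.0.113.0/24",   # stand-in for a hosting provider's block
    "198.51.100.0/24",  # another stand-in block
)]

def looks_like_datacenter(ip: str) -> bool:
    """Flag an IP if it falls inside any known datacenter range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in DATACENTER_RANGES)

print(looks_like_datacenter("203.0.113.7"))  # → True
print(looks_like_datacenter("192.0.2.1"))    # → False
```

This is why a residential IP sails through while a VPS in the same region gets challenged: the check is on where the address is registered, not on how the traffic behaves.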