Siri's performance and quality seem to depend a lot on the on-board ML cores since it switched to on-device processing. It was basically unusable on my 6S Plus with its early ML cores, and now it's great on the 14 Pro Max I replaced it with. It seems like they ship a version of Siri matched to the device's capability.
I was under the impression that Siri could only recognize "Hey Siri" locally, and after that it would offload the task to Apple's cloud. If it's fully offline now, that would be great, but I don't see how the ML cores would help. Speech-to-text is practically solved for most devices; after that you're interacting with a regular chat bot.
All I know is what I experienced: it got less reliable with the switch and stopped handling requests it had handled perfectly before, then got better again with a newer phone.