It's an oversimplification. The idea is that laser beams are extremely collimated, with divergences well under 1 degree in many cases, so over 200 meters the beam might only expand by a tiny amount. But not to worry, the beam still spreads: in the far field the spot area grows as the square of the distance, and intensity is inversely proportional to that area. At sufficiently long distances the inverse square falloff becomes apparent.
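A minimal sketch of that arithmetic, with made-up numbers (a 1 mW beam, 1 mm initial radius, 0.5 mrad half-angle divergence; none of these come from the thread). Close in, the fixed initial radius dominates and the falloff looks slower than inverse square; once the divergence term dominates, intensity scales as 1/d²:

```python
import math

# Assumed, illustrative numbers -- not from the thread.
P = 1e-3        # total beam power, W
w0 = 1e-3       # initial beam radius at the aperture, m
theta = 0.5e-3  # half-angle divergence, rad (~0.03 degrees)

for d in [1, 10, 100, 200, 1000, 10000]:
    w = w0 + d * math.tan(theta)  # spot radius grows linearly with distance
    area = math.pi * w**2         # so spot area grows as d^2 in the far field
    intensity = P / area          # W/m^2 -> approaches 1/d^2 scaling
    print(f"d = {d:>6} m  spot radius = {w*1000:8.1f} mm  "
          f"intensity = {intensity:10.3e} W/m^2")
```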
It seems to be a perennial point of confusion, but as far as I can make out from googling, it's not strictly true that lasers follow the inverse square law. Why is the parent being downvoted?
The point being made there is also incorrect: it's only true for a divergence of exactly zero, which doesn't exist in practice.
Also, the claim that "if I go 10x further away then my received power won't be 100x less" is misleading, since the inverse square law is about intensity (power per unit area), not the total power collected by a receiver.
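That distinction matters because a receiver with a fixed aperture can capture essentially the whole beam while the spot is smaller than the aperture; only once the beam overfills it does received power track intensity and fall off roughly as 1/d². A toy top-hat beam model, again with assumed numbers (25 mm receiver radius, same hypothetical beam as above):

```python
import math

# Assumed numbers for illustration only.
P = 1e-3        # beam power, W
w0 = 1e-3       # initial beam radius, m
theta = 0.5e-3  # half-angle divergence, rad
r_rx = 25e-3    # receiver aperture radius, m

def received_power(d):
    w = w0 + d * math.tan(theta)      # beam spot radius at distance d
    a_spot = math.pi * w**2
    a_rx = math.pi * r_rx**2
    # Crude top-hat model: collect the whole beam until the spot
    # overfills the aperture, then only the overlapping fraction.
    return P * min(1.0, a_rx / a_spot)

for d in [1, 10, 50, 100, 1000]:
    print(f"d = {d:>5} m  received power = {received_power(d)*1e3:8.4f} mW")
```

With these numbers the received power is flat out to tens of meters, then between 100 m and 1000 m it drops by close to a factor of 100.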
Last time I looked at phased array technology, it seemed like you need a large number of elements to get real gain. At the same time, you either need phase control and a PA for each element, which is expensive, or high-power phase control, which is inefficient and still not cheap.
One gets dubious real quick, considering the incentive there is to develop affordable beam steering technology and the lack of examples outside of a few niche areas.
You need lots of antennas to get high directivity with phased arrays.
A millimeter-wave (very small lambda) wafer-scale (giant chip = super expensive) phased array integrated onto a chip achieves a half-power beamwidth of around 6 degrees.
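A back-of-the-envelope check using the standard uniform-array rule of thumb, HPBW ≈ 0.886·λ/L radians for aperture length L. The comment gives no frequency, so 60 GHz and half-wavelength element spacing are my assumptions; each of those elements typically needs its own phase shifter, which is where the cost mentioned above comes from:

```python
import math

# Assumptions: 60 GHz carrier, half-wavelength spacing, uniform array.
f = 60e9                    # Hz
lam = 3e8 / f               # wavelength, ~5 mm
hpbw_rad = math.radians(6)  # the ~6 degree beamwidth from the comment

L = 0.886 * lam / hpbw_rad  # required aperture length per dimension
d = lam / 2                 # typical element spacing
n_per_side = math.ceil(L / d)
print(f"aperture ~ {L*100:.1f} cm per side, "
      f"~{n_per_side} elements per side, "
      f"~{n_per_side**2} elements for a square planar array")
```

That works out to roughly a 4 cm aperture and a few hundred elements, consistent with "wafer scale" and "lots of antennas".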
Directional waves are still subject to the inverse square law, just like lasers.