From the article: "When running the traffic video recognition demo, it consumed just 63 milliwatts of power. Server chips with similar numbers of transistors consume tens of watts of power," and "laptop that had been programmed to do the same task processed the footage 100 times slower than real time, and it consumed 100,000 times as much power as the IBM chip." If those statements are true, I would say the chip is roughly 10,000 to 100,000+ times more energy efficient. That is a rather large claim, so we would need to see more proof...
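Rough back-of-envelope on those figures (this is my own arithmetic and my reading of the article's numbers, not something it states directly, and the laptop figure is ambiguous between power and total energy):

```python
# Back-of-envelope check of the article's claims. Figures quoted:
# 63 mW for the IBM chip, "tens of watts" for a comparable server chip,
# and a laptop that ran 100x slower while drawing 100,000x the power.

chip_power_w = 0.063           # 63 milliwatts
server_power_w = (10, 100)     # "tens of watts", taken here as 10-100 W

# Server-chip comparison: raw power ratio
server_ratio = tuple(p / chip_power_w for p in server_power_w)
print(f"server chip draws ~{server_ratio[0]:.0f}x to ~{server_ratio[1]:.0f}x the power")

# Laptop comparison: if "100,000x the power" means instantaneous power,
# the energy gap for the same task also picks up the 100x longer runtime.
laptop_power_ratio = 100_000
laptop_time_ratio = 100
print(f"laptop energy per task: ~{laptop_power_ratio * laptop_time_ratio:,}x "
      "if the figure is power, or ~100,000x if it already means total energy")
```

Either way you read it, the implied energy-efficiency gap is somewhere in the 100,000x-and-up range for the laptop comparison, which is why I'd want independent measurements before taking it at face value.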