Qualcomm has finally revealed its distribution policy regarding QNN: https://www.qualcomm.com/support/so...82ec93b7-601d-4b88-a87e-edea93e00241#overview
It is only available to members of "verified companies". It's not clear yet which companies will be verified, but it's obvious that free and hobbyist developers are ruled out. Since most Android apps originate from such developers, this will be very detrimental to mobile AI. Moreover, it means that you pay a lot of money for a powerful NPU and are then not allowed to use it (particularly since Qualcomm has apparently stopped supporting NNAPI on the NPU in the 8gen2 chipset, see https://browser.geekbench.com/search?k=ml_inference&q=kalama , where NNAPI shows roughly the same performance as the CPU and is much worse than on the 8gen1 chipset).
For your benchmark this means that results achieved with QNN reflect a level of performance that will only be available to selected apps, so they are somewhat misleading. Therefore I think it might be better not to use QNN in AI Benchmark. Considering the impact your benchmark has, this might lead to Qualcomm rethinking its current strategy, which would be very beneficial to mobile AI.