Yes, the challenge is still open. The proposed task is probably not very familiar to the majority of participants, but you still have almost two months to develop an efficient solution for this problem.
Yes, this should be fine - only the NHWC format is properly supported by TensorFlow Lite, which is why the ONNX-TensorFlow converter inserts the transpose layer. You can submit this model as long as it runs fine with the AI Benchmark using the NNAPI acceleration option.
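To illustrate the layout difference, here is a minimal NumPy sketch of the NCHW-to-NHWC permutation that such an inserted transpose layer performs (this is only an illustration of the axis reordering, not the actual converter code):

```python
import numpy as np

# Tensor in NCHW layout (batch, channels, height, width), as ONNX models use.
x_nchw = np.arange(2 * 3 * 4 * 5, dtype=np.float32).reshape(2, 3, 4, 5)

# Reorder axes to NHWC (batch, height, width, channels), the layout
# TensorFlow Lite expects - the same permutation the inserted transpose applies.
x_nhwc = np.transpose(x_nchw, (0, 2, 3, 1))

print(x_nhwc.shape)  # (2, 4, 5, 3)
```

The extra transpose adds a small runtime cost on device, which is normal for converted ONNX models.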
It was mentioned on Codalab that if the tflite model is zipped together with the predictions and then submitted, the model will be evaluated on a Huawei P40 Pro device. However, the link to view the corresponding runtime has not been updated yet. Is this accessible only during the testing phase?