Hi folks, it's me again😁
I can't get a runtime for my model from lightspeed.
The website keeps sending the same request over and over with no response:
curl 'http://lightspeed.difficu.lt:60002/status/null' \
-H 'Accept: */*' \
-H 'Referer: http://lightspeed.difficu.lt:60002/' \
-H 'DNT: 1' \
-H...
Hi folks,
Which version of TensorFlowLite for iOS do you use for the lightspeed service and for the final evaluation?
Could you publish the code of the iOS app you use to test the models?
I'm having an issue submitting my model to lightspeed:
Status: Error
Inference time (sec): Unknown
email -...
Regarding this requirement:
- fully-quantized INT8 model (with no FP32 / FP16 / INT16 ops);
Is it OK to use TensorFlow's integer quantization with float fallback option?
Or does the model need to be strictly INT8, converted with tf.lite.OpsSet.TFLITE_BUILTINS_INT8?
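For reference, here is roughly what a strict full-integer conversion looks like with the TFLite converter — a sketch, assuming `model` is your trained Keras model and `calibration_images` is a small hypothetical sample of real inputs for calibration:

```python
import numpy as np
import tensorflow as tf

# Hypothetical: `calibration_images` is a small sample of real inputs.
def representative_dataset():
    for image in calibration_images:
        # Yield one float32 batch at a time, matching the model's input shape.
        yield [np.expand_dims(image.astype(np.float32), axis=0)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset

# Strict INT8: conversion fails if any op cannot be quantized,
# instead of silently falling back to float.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
```

With float fallback you would instead leave `target_spec.supported_ops` at its default (or include `tf.lite.OpsSet.TFLITE_BUILTINS`), which is exactly the ambiguity I'd like clarified.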
Also, which Apple Bionic SoC will be used for the evaluation?
There is also the question of how the iOS app with the TFLite model will be built - as debug or as release?
I guess it doesn't matter as long as all contestants are evaluated on the same platform/device/build settings, right?
Thanks!
Hi
I'm trying to get an inference time from http://lightspeed.difficu.lt:60002/ for a TFLite EfficientNet model converted with TF 2.4,
and I get: Status: Error
Is there a way to get more detail on the error?
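In the meantime, one way to get a more concrete error locally is to load the converted file with the TFLite Python interpreter and run a single dummy inference — structural problems (unsupported ops, shape mismatches) usually surface here with a readable message. A sketch, assuming the converted file is named `model.tflite`:

```python
import numpy as np
import tensorflow as tf

# Loading and allocating tensors alone catches many conversion problems
# (unsupported ops, bad shapes) before you ever submit the model.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
print("input:", input_details[0]["shape"], input_details[0]["dtype"])

# Run one inference with random data of the right shape and dtype.
dummy = np.random.randint(
    -128, 127, size=input_details[0]["shape"]
).astype(input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print("output:", interpreter.get_tensor(output_details[0]["index"]).shape)
```

If this runs cleanly, the Error status is more likely on the lightspeed side, which would still need clarification from the organizers.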
Also, how will the final inference time be measured?
On real device?
Using lightspeed...