Deploying a trained model onto an edge device requires careful setup. The configuration of the various subsystems for IoT Edge is outlined below. In this setup, an Android phone serves as the IoT device, which works well for small models. Deploying a large model onto an Android phone, however, poses challenges and is unlikely to perform well unless the model is compressed to fit the device's resources.

The CNN-based inference engine is written in C++ and C. Building a library from the inference engine code requires installing the Android NDK in Android Studio (detailed instructions for installing Android Studio and the NDK on an Ubuntu 18.04 x86 machine are given below).
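As a minimal sketch of how such a C/C++ engine is typically packaged for Android, the NDK toolchain can build the sources into a shared library via CMake. The project name and source file names below are illustrative assumptions, not the actual file layout of this project:

```cmake
# CMakeLists.txt — builds the C/C++ inference engine sources into a
# shared library (libinference_engine.so) loadable from an Android app.
# Source file names here are placeholders for the real engine sources.
cmake_minimum_required(VERSION 3.10)
project(inference_engine)

add_library(inference_engine SHARED
    inference_engine.cpp   # C++ portion of the engine (assumed name)
    cnn_ops.c)             # C portion of the engine (assumed name)

# Link against the Android log library so the engine can emit logcat output.
find_library(log-lib log)
target_link_libraries(inference_engine ${log-lib})
```

In an Android Studio project, this file would be referenced from the module's `build.gradle` through the `externalNativeBuild { cmake { path "CMakeLists.txt" } }` block, after which Gradle invokes the NDK toolchain automatically during the build.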