New Release Optimizes Machine Learning Applications for Mobile Devices
Google has announced the release of a developer preview of a new mobile-centric version of its popular machine-learning software library TensorFlow, signaling a shift toward on-device AI development.
The forthcoming version of TensorFlow, TensorFlow Lite, is described by Google as “a lightweight solution for mobile and embedded devices” which prioritizes speed, small size, and cross-platform support to provide “low-latency inference of on-device machine learning models.” In other words, TensorFlow Lite offers mobile developers a way of fully leveraging pre-trained TensorFlow models in a new architecture designed to work within the constraints of the comparatively limited computing resources of today’s mobile phones and IoT devices, while also taking advantage of hardware acceleration and new AI chipsets.
Machine learning models still need to be trained on vast and expensive farms of specialized GPUs and other machine-learning-optimized chips (such as Google’s own TPU), or on computing power rented from cloud providers of these specialized resources. TensorFlow Lite, however, will enable mobile devices to quickly run those pre-trained models on the device itself. Mobile phones will not anytime soon be able to provide the computational horsepower that lets neural networks train on data and hone themselves—the “learning” in machine learning—but with software such as TensorFlow Lite, phones and other small devices will still be able to put a pre-trained network to practical use.
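The split described above—train in the cloud, infer on the device—can be sketched with TensorFlow Lite’s Python API. This is a minimal illustration, not production code: the tiny Keras model here is a stand-in for any pre-trained model, and in a real deployment the converted flat-buffer would be shipped to the phone and loaded by the on-device interpreter.

```python
import numpy as np
import tensorflow as tf

# "Cloud" side: a tiny stand-in model (assumption: any trained
# Keras model goes through the same conversion step).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Convert the trained model to the compact TensorFlow Lite format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# "Device" side: load the converted model into the lightweight
# interpreter and run low-latency inference locally.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

sample = np.random.rand(1, 4).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])
```

No network connection is involved in the inference step—once the flat-buffer is on the device, the interpreter runs entirely locally.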
With such unique and optimized architectures designed to squeeze the most performance from mobile devices, TensorFlow Lite will not only offer speed and efficiency increases in running new data through models locally right on a user’s device, but may also lessen or obviate the need for Internet connectivity in apps employing machine learning models. As TechCrunch reports,
Unfortunately, training is still too computationally intensive to be performed on smartphones. But even ignoring training, pre-trained models can still be a slog to deal with. If models can run on device, at the edge, they can avoid the cloud and internet altogether. This enables more reliable performance in any environment.
This renewed focus on mobile users (Google already offers TensorFlow Mobile, which it currently recommends to developers but plans to eventually replace with TensorFlow Lite) reflects a larger industry-wide shift toward phones with more AI capabilities and specialized chips. It also hints at revolutionary applications to come: one study, for example, used mobile machine-learning-based image recognition models to quickly identify sick crops.
TensorFlow Lite supports Android development, with iOS support planned as well, and is guaranteed “out of the box” to run three of Google’s most popular machine learning models: MobileNets, “a family of mobile-first computer vision models for TensorFlow”; Google’s famous Inception-v3 image recognition model; and On Device Smart Reply, an “on-device model which provides one-touch replies for an incoming text message.”