Apple Unveils 20 Core ML Models and 4 Datasets on HuggingFace: An Open Source Gift to Developers
Apple recently launched 20 new Core ML models and 4 datasets on the well-known AI model hosting platform, HuggingFace. These models are all open-sourced under the Apache 2.0 license, making them freely available for developers to use.
The new models, built on the Core ML framework, are designed to run AI tasks locally on devices, addressing potential privacy concerns by eliminating the need to send data to the cloud.
For instance, developers can create an app for image classification that, with user permission, accesses the photo library and processes images locally. Similarly, an app could quickly remove the background from an image without needing to upload the picture, thus safeguarding user privacy.
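For illustration, here is a minimal Swift sketch of the image-classification case, assuming the app bundles a compiled Core ML classifier (named "SomeClassifier" here purely as a placeholder, for example a model downloaded from Apple's HuggingFace page) and runs it through the Vision framework; the photo is processed entirely on the device.

```swift
import CoreML
import Vision
import UIKit

// Minimal sketch of on-device image classification with a Core ML model.
// "SomeClassifier" is a placeholder for any compiled image-classification
// model (.mlmodelc) bundled with the app; it is not a specific model from
// Apple's HuggingFace page.
func classify(_ image: UIImage,
              completion: @escaping ([VNClassificationObservation]) -> Void) {
    guard let cgImage = image.cgImage,
          let modelURL = Bundle.main.url(forResource: "SomeClassifier",
                                         withExtension: "mlmodelc"),
          let coreMLModel = try? MLModel(contentsOf: modelURL),
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion([])
        return
    }

    // Inference runs entirely on the device; the image never leaves it.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        completion(request.results as? [VNClassificationObservation] ?? [])
    }

    let handler = VNImageRequestHandler(cgImage: cgImage)
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```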
Another advantage of running models locally is response speed. Cloud-based processing depends on powerful servers to handle concurrent requests and adds network round-trips; local processing removes that network latency, offering a more responsive experience.
The primary challenge with running AI models on-device is chip performance. Apple's own on-device AI features, for example, currently require an A17 Pro or Apple M-series chip, so users with older chips must rely on third-party apps, which may come with performance limitations.
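On the hardware question, the snippet below is a hedged sketch of how an app can steer where a Core ML model executes: the compute-unit preference asks Core ML for the Neural Engine where one is present and falls back to the CPU on older chips. The model file name is again a placeholder, and the .cpuAndNeuralEngine option requires iOS 16 / macOS 13 or later.

```swift
import Foundation
import CoreML

// Minimal sketch: load a bundled Core ML model with an explicit compute-unit
// preference. "SomeClassifier.mlmodelc" is a placeholder file name.
func loadModel() throws -> MLModel {
    guard let url = Bundle.main.url(forResource: "SomeClassifier",
                                    withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let config = MLModelConfiguration()
    // Use the Neural Engine where available; Core ML falls back to the CPU
    // on chips without one (.cpuAndNeuralEngine needs iOS 16 / macOS 13+).
    config.computeUnits = .cpuAndNeuralEngine
    return try MLModel(contentsOf: url, configuration: config)
}
```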
HuggingFace's founder described the release as a significant update, noting that Apple had uploaded many Core ML models to its HuggingFace repository. Because these models run strictly on-device without an internet connection, developers' applications can stay "lightning-fast" while keeping user data private.
Developers interested in these models can visit Apple's page on HuggingFace at https://huggingface.co/apple. Apple has also linked academic papers for some of the models, giving developers a quick way to understand their capabilities.