
Google details how Private Compute Core on Android works

source link: https://toptech.news/google-details-how-private-compute-core-on-android-works/



The Private Compute Core (PCC) was announced with Android 12 to let ML-backed features operate in a privacy-conscious manner independent of other parts of the OS, including third-party apps. Google recently published a technical whitepaper detailing the Private Compute Core Architecture.

“Backed by open-source code in the Android Framework,” the Private Compute Core is a “secure, isolated environment within [Android]” — specifically, a system-level, virtual sandbox — that can “host sophisticated ML features.”

It’s one of the technologies in Google’s “Protected Computing” toolkit, alongside cloud enclaves, edge processing, and end-to-end encryption.

PCC collects ambient data like audio (from microphones), images (from cameras), and location (from GPS), as well as OS-level data:

[Diagram: Private Compute Core architecture]

The inferences drawn from ambient and OS-level data are retained inside Android System Intelligence, within the Private Compute Core. All communication with external servers happens through Private Compute Services (PCS), a set of privacy-preserving, open-source technologies.
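
As a rough, hypothetical illustration of that flow, the Kotlin sketch below models a sandbox in which sensor signals are turned into inferences that stay on the device, and the only path off the device is a PCS-style interface. The names (AmbientSignal, PrivateComputeServices, SystemIntelligenceSandbox) are invented for illustration and are not Google’s actual classes.

```kotlin
// Conceptual model of the data flow described above; not Google's code.
data class AmbientSignal(val kind: String, val payload: ByteArray)
data class Inference(val feature: String, val label: String)

// Stand-in for Private Compute Services: the only network path out of the sandbox.
interface PrivateComputeServices {
    fun federatedUpdate(modelDelta: ByteArray)     // aggregate learning, no raw data
    fun downloadResource(name: String): ByteArray  // e.g. an updated ML model
}

// Stand-in for Android System Intelligence running inside the PCC sandbox.
class SystemIntelligenceSandbox(private val pcs: PrivateComputeServices) {
    private val retainedInferences = mutableListOf<Inference>()  // never leaves the device

    fun process(signal: AmbientSignal): Inference {
        // Run an on-device model over the raw signal; only the inference is kept.
        val inference = Inference(feature = "example", label = "label-for-${signal.kind}")
        retainedInferences += inference
        return inference
    }

    fun contributeTrainingDelta(delta: ByteArray) {
        // Anything that must reach a server goes through PCS, never a direct connection.
        pcs.federatedUpdate(delta)
    }
}
```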

Updated via the Play Store and recently open-sourced, PCS is responsible for updating ML models and supports:

  • Private Information Retrieval: Enables downloading slices of a dataset without revealing to the server which slice was requested (a minimal sketch of the idea follows this list).
  • Federated compute: Enables privacy-preserving aggregate machine learning and analytics across many devices, without any raw data leaving the device.
  • HTTP download: Enables access to static resources like updated ML models.
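
To make the private-retrieval idea concrete, here is a minimal Kotlin sketch of the simplest possible scheme: the client downloads every slice and keeps only the one it needs, so the server’s view of the query is identical no matter which slice was wanted. The SliceServer interface and TrivialPirClient class are hypothetical stand-ins, not the real PCS API, which reaches the same privacy goal far more efficiently with cryptographic PIR.

```kotlin
// Hypothetical illustration of Private Information Retrieval, not the real PCS API.
// Trivial PIR: request every slice and select the wanted one locally, so the
// server cannot tell which slice the client actually needed.
interface SliceServer {
    fun sliceIds(): List<String>       // public catalogue of available slices
    fun fetch(id: String): ByteArray   // raw bytes of a single slice
}

class TrivialPirClient(private val server: SliceServer) {
    // Returns the wanted slice without revealing its identity to the server.
    fun privateFetch(wantedId: String): ByteArray? {
        var result: ByteArray? = null
        for (id in server.sliceIds()) {
            val bytes = server.fetch(id)        // every slice is downloaded regardless
            if (id == wantedId) result = bytes  // selection happens only on-device
        }
        return result
    }
}
```

Production PIR schemes avoid downloading the whole dataset, typically by sending an encrypted query that the server can answer without learning what it contains.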

Meanwhile, Android System Intelligence (ASI) is what actually hosts features like Now Playing (always-on music identification), Smart Reply (suggested replies in notifications), Live Caption (on-device speech recognition that captions playing media), and Screen Attention (the display stays on while the front-facing camera detects that you’re looking at it).

The architecture whitepaper provides per-feature breakdowns of how data moves through the Private Compute Core, including:

  • Live Caption
  • Screen Attention




