Google confirms Gemini can analyse photos to infer people, places and behaviour if users opt in
Google has confirmed a sweeping upgrade to its Gemini artificial intelligence platform, a move that fundamentally changes how the company’s AI interacts with personal data across its ecosystem. The update links multiple Google services together to create what the company calls a “truly personal AI,” but the announcement has reignited concerns about privacy, surveillance and user consent.
The upgrade initially launches for Google’s AI subscribers in the United States before expanding globally. Google has also confirmed that some version of the feature will eventually be available for free. Once activated, Gemini can connect data from services such as Gmail and Google Photos to personalise responses and recommendations.
While Gmail integration has already sparked controversy, Google Photos has emerged as the most sensitive component of the upgrade. According to Google, when users enable the feature, Gemini can use photos to infer interests, relationships with people appearing in images, and locations visited. This includes associating faces with timestamps and location data embedded in photo files.
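To make concrete the kind of inference being described, here is a loose sketch in Python of how per-photo metadata can be aggregated into conclusions about relationships and places. The data, field names and functions are hypothetical illustrations, not Google's actual pipeline; the point is only that timestamps, coordinates and recognised faces, once linked, reveal patterns no single photo does.

```python
from collections import Counter

# Hypothetical per-photo metadata of the kind embedded in image files:
# a timestamp, GPS coordinates, and faces recognised in the shot.
photos = [
    {"time": "2023-07-01 10:12", "place": (44.83, -0.58), "faces": ["Alice", "Bob"]},
    {"time": "2023-07-02 18:40", "place": (44.83, -0.58), "faces": ["Alice"]},
    {"time": "2024-06-30 09:05", "place": (44.83, -0.58), "faces": ["Alice", "Bob"]},
]

def co_appearances(photos):
    """Count how often each pair of faces appears in the same photo —
    a crude proxy for inferring relationships between people."""
    pairs = Counter()
    for p in photos:
        faces = sorted(p["faces"])
        for i in range(len(faces)):
            for j in range(i + 1, len(faces)):
                pairs[(faces[i], faces[j])] += 1
    return pairs

def visited_places(photos):
    """Count photos per location — a crude proxy for 'places visited'."""
    return Counter(p["place"] for p in photos)
```

Even in this toy form, `co_appearances` concludes that Alice and Bob photograph together repeatedly, and `visited_places` shows the same spot recurring a year apart; scaled to years of a real photo library, this is the "depth of analysis" the examples below illustrate.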
Google has openly highlighted these capabilities in example use cases. One scenario shows Gemini recommending car tyres by analysing family road trips stored in Google Photos. Another demonstrates the AI extracting a vehicle’s licence plate number directly from an image. These examples underline the depth of analysis Gemini can perform when allowed access to visual data.
By linking multiple data sources, the upgrade represents a significant shift in how AI systems operate. Instead of responding to isolated prompts, Gemini can draw conclusions from years of stored information, building a more detailed picture of a user’s life. Google says this enables more relevant recommendations for travel, entertainment, shopping and planning.
The company insists there are safeguards in place. Google states that Gemini does not directly train on users’ Gmail inboxes or Google Photos libraries. Instead, training relies on limited information, such as user prompts and the AI’s own responses, to improve functionality over time. However, Gemini does access connected data to answer requests once users opt in.
Google emphasises that participation is voluntary. App connections are switched off by default, and users must actively choose which services to link. The company says users can disconnect apps or disable the feature entirely at any time. Google also argues that because the data already resides within its systems, users do not need to send sensitive information to third-party platforms to receive personalised AI features.
Despite these assurances, privacy experts warn the implications are far-reaching. Analysing emails is one thing, but using personal photos to infer identities, movements and relationships marks a new level of insight into users’ private lives. Even if consent is required, the scope of the data involved raises questions about long-term consequences and how such systems may evolve.
Once enabled, Gemini operates in web browsers, on Android and iOS devices, and across all available Gemini models. This means a single opt-in decision affects how the AI interacts with personal data across multiple platforms and devices.
The timing of the upgrade adds further complexity. Other major technology companies are also pursuing hybrid approaches to AI, combining cloud-based processing with privacy-focused safeguards. It remains unclear how future alternatives will compare, or whether users will be offered more granular controls as competition intensifies.
For now, Google frames the update as a powerful tool that users can choose to embrace or ignore. The company presents it as a leap forward in convenience and intelligence. Critics, however, argue it represents another step towards normalising deep analysis of personal data.
As Gemini expands worldwide, users face a clear decision. They can opt in and gain an AI assistant that understands their lives in unprecedented detail, or they can step back, preserving distance between their memories and machine inference. The choice may be optional, but the implications are anything but small.