Google I/O 2024: all the news from the developer conference

Highlights

  • Android 15 logo, a green upside-down triangle with a rocket launching into the stars and an Android character looking up

    Image: Google

    Alongside Google’s ongoing developer-focused I/O conference comes the latest release of Android 15, which is now in its second beta. It’s got some cool new features, like the ability to hide a collection of apps inside a “private space,” customizable vibrations so you can tell different types of notifications apart by feel, and richer widget previews.

    The new private space function is the most interesting of the bunch: it tucks apps you don’t want others seeing into a biometric- or PIN-protected container in the app drawer. It’s one of several new security features coming to Android.

    Read Article >

  • Android 15 will hide one-time passwords in notifications.

    In response to malware and social engineering attacks that work by snooping on notifications or activating screen sharing, Google says Android 15 will hide notifications that contain one-time passwords (with some exceptions, like wearable companion apps).

    Those notifications are also automatically hidden during screen sharing, and developers can have their apps check whether Google Play Protect is active or whether another app might be capturing the screen while they’re in use. (A rough Kotlin sketch of the screen-capture check follows this item.)

    Simulated Android screenshot showing a bank app demo and a notification for a one-time passcode that doesn’t display the code, in order to keep it secure from malware that may try to steal it.

    Image: Google
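
    Here is a minimal Kotlin sketch of that developer-facing check, assuming Android 15’s new screen-recording detection callback on WindowManager and its DETECT_SCREEN_RECORDING permission behave as described in the beta documentation (the Google Play Protect status check is not shown):

    import android.app.Activity
    import android.view.WindowManager
    import java.util.function.Consumer

    class OtpActivity : Activity() {

        // Invoked whenever this app starts or stops being visible inside an
        // active screen recording (Android 15 / API 35 and up).
        private val recordingCallback = Consumer<Int> { state ->
            if (state == WindowManager.SCREEN_RECORDING_STATE_VISIBLE) {
                hideSensitiveContent() // e.g., blank out the one-time passcode
            }
        }

        override fun onStart() {
            super.onStart()
            // Requires the DETECT_SCREEN_RECORDING permission in the manifest.
            windowManager.addScreenRecordingCallback(mainExecutor, recordingCallback)
        }

        override fun onStop() {
            windowManager.removeScreenRecordingCallback(recordingCallback)
            super.onStop()
        }

        private fun hideSensitiveContent() {
            // App-specific: hide OTP fields, show a placeholder view, etc.
        }
    }
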
  • Google’s new glasses are just a prototype for now.

    The blink-and-you-missed-it AR glasses at Google I/O? “The glasses shown are a functional research prototype from our AR team at Google. We do not have any launch plans to share,” Google spokesperson Jane Park tells The Verge.

    However: “Looking further ahead, we anticipate that capabilities demonstrated with Project Astra could be utilized through wearables and other next-generation devices.”

    Image: Google
  • Lincoln Nautilus with Cisco’s Webex

    Image: Ford

    Google is adding several new apps to its in-car infotainment platforms, while also making it easier for developers to get their apps approved faster and with fewer complications.

    Google said that two new streaming services, Max and Peacock, are coming to cars with Google built-in. These include models from companies like Renault, Polestar, and Volvo. (Other brands with Google built-in, like Ford, Acura, and Honda, do not support video streaming while parked yet.)

    Read Article >

  • A driver using Mercedes-Benz’s Level 3 Drive Pilot system

    Image: Daniel Golson

    Google is lowering the barriers for new apps to be added to Android Auto and cars with Google software built-in, making it easier for developers of gaming and streaming apps to get them added to those platforms. It is also releasing new guidelines for developing apps for various screen sizes and shapes.

    Google is launching a new program for car-ready apps, essentially expediting the process for developers to get their apps approved for in-car platforms. As part of this program, Google says it will “proactively” review mobile apps that are already compatible with the increasingly large screens found in modern vehicles.

    Read Article >

  • Two Pixel Watch 2s side by side.

    Wear OS 5 is on the way.
    Photo by Vjeran Pavic / The Verge

    Wear OS 5 is on its way, and with it, Google says Android smartwatch users ought to see even better battery life. Running a marathon, for example, will purportedly consume 20 percent less battery than on Wear OS 4.

    This emphasis on battery life is similar to last year’s Wear OS 4 announcement — and for good reason. Wear OS 4 helped the Pixel Watch 2 last an entire day, something the original struggled to do. That improved battery life has seemingly bought some goodwill. Google says that, in the last year, the Wear OS user base grew by an impressive 40 percent across 160 countries and regions.

    Read Article >

  • Android logo on a green and blue background

    More security features on more Android phones.
    Illustration by Alex Castro / The Verge

    Google is announcing an array of new security features as it releases its second Android 15 beta, including a feature that can detect the moment your phone is swiped from your hands. Some of these updates will be included with Android 15 when it arrives this fall, but theft detection and a number of other features will be available to phones with much older OS versions, too — bringing them to many more people.

    Theft Detection Lock works by recognizing the unusual motion that indicates someone has yanked your phone out of your hand or off a table in front of you. When that happens, the screen automatically locks to keep a thief from getting at the information on your device. The system looks for other signals of foul play, too, and can lock the screen if someone tries to take the phone off the network to block remote access.

    Read Article >

  • ADT’s new smart security system includes the option to use the facial recognition feature of Google Nest cameras to allow a “trusted neighbor” temporary access to your home when you’re away.

    Image: Google / ADT

    ADT has confirmed to The Verge that it’s rolling out a big upgrade to its ADT Plus home security system. The all-new hardware and software platform pairs new ADT hardware with deeper Google Nest integration and the ability to automatically disarm using facial recognition to let trusted neighbors into your home when you’re away.

    I first reported on the new system last October, but until now, ADT had declined to comment despite publishing multiple support pages about it on its site. This week, ADT spokesperson Ben Tamblyn confirmed to The Verge that the new ADT Plus system has started rolling out in some states and will be available nationwide in the coming months. The system can be self-installed or professionally installed and can work with ADT’s professional monitoring service.

    Read Article >

  • A white smart plug on a blue background

    Eve, makers of smart home devices such as the Eve Energy smart plug, will soon launch an Android app.
    Photo by Amelia Holowaty Krales / The Verge

    After announcing that it would be bringing an app to Android way back in 2022, Eve is finally close to launching its Android app, possibly by this fall. Android users will be able to control Eve’s smart home products natively — including smart plugs, smart lights, and smart shades. They will be able to access features such as energy management that are not yet available in the Matter platforms they work with (including Google Home and Samsung SmartThings). Prior to Matter, Eve devices only worked with Apple HomeKit and were only controllable on iOS devices.

    “The highly anticipated app will allow Matter devices to be added, controlled and automated directly and without any proprietary connection mechanism or fragile cloud-to-cloud integrations,” the company said in a press release. “For the growing range of Matter-enabled Eve devices, Eve for Android will provide advanced functionality, such as measurement of energy consumption and generation for Eve Energy solutions, autonomous heating schedules for the smart thermostat Eve Thermo or Adaptive Shading for roller blinds in the Eve Blinds Collection.”

    Read Article >

  • A Thunderbolt 4 cable render against a starry sky with a rocket and a trail of light

    Image: Intel

    Why can’t you just plug a USB cable between two PCs, drag your mouse cursor between their screens, and drop files between them, as if they were a single machine? Well, you can and have for years — but Intel may be about to turbocharge that idea with Thunderbolt Share.

    It’s a proprietary app that Intel will be licensing to laptop, desktop, and accessory manufacturers to bundle with new hardware. Install it on two Thunderbolt 4 or 5 computers, connect them with a Thunderbolt cable, and you should be able to share your mouse, keyboard, screens, storage, and other USB peripherals; drag and drop files at Thunderbolt speeds; and sync data between them. It won’t let you share an internet connection, though.

    Read Article >

  • Google Chromecast with remote

    Chromecast with Google TV will soon become a Google Home hub to control local and cloud-based smart home devices.
    Photo: Chris Welch / The Verge

    Google has announced that it’s opening up API access to its Google Home smart home platform. This means that any app maker — smart home-related or not — can access the over 600 million devices connected to Google Home and tap into the Google Home automation engine to create smart solutions for their users inside their own app.

    The Home APIs can access any Matter device or Works with Google Home device, and they allow developers to build their own experiences using Google Home devices and automations into their apps on both iOS and Android. (A purely illustrative sketch of the idea follows this item.) This is a significant move for Google in opening up its smart home platform, after shutting down its Works with Nest program back in 2019.

    Read Article >
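
    To make the idea concrete, here is a purely hypothetical Kotlin sketch; every type and method name below is invented for illustration and is not the actual Home APIs surface. It only shows the shape of what the announcement describes: an app enumerating a user’s devices and registering an automation through the platform’s automation engine.

    // All names here are hypothetical, not the real Home APIs.
    data class Device(val id: String, val name: String)

    data class DeviceEvent(val device: Device, val event: String)
    data class TurnOn(val device: Device)

    interface HomeClient {
        suspend fun listDevices(): List<Device>
        suspend fun createAutomation(starter: DeviceEvent, action: TurnOn)
    }

    // Example: when the front door unlocks, turn on the porch light.
    suspend fun registerPorchLightAutomation(home: HomeClient) {
        val devices = home.listDevices()
        val frontDoor = devices.first { it.name == "Front door lock" }
        val porchLight = devices.first { it.name == "Porch light" }

        home.createAutomation(
            starter = DeviceEvent(frontDoor, event = "unlocked"),
            action = TurnOn(porchLight),
        )
    }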

  • The best parts of Google’s I/O 2024 keynote in 17 minutes.

    We cut down the nearly two-hour presentation just for you, ICYMI. You can also read about everything that was announced if you prefer words. Happy Wednesday!

  • Vector collage showing different aspects of using AI tools.

    Image: The Verge

    Google I/O introduced an AI assistant that can see and hear the world, while OpenAI put its version of a Her-like chatbot into an iPhone. Next week, Microsoft will be hosting Build, where it’s sure to have some version of Copilot or Cortana that understands pivot tables. Then, a few weeks after that, Apple will host its own developer conference, and if the buzz is anything to go by, it’ll be talking about artificial intelligence, too. (Unclear if Siri will be mentioned.)

    AI is here! It’s no longer conceptual. It’s taking jobs, making a few new ones, and helping millions of students avoid doing their homework. According to most of the major tech companies investing in AI, we appear to be at the start of one of those rare, monumental shifts in technology. Think the Industrial Revolution or the creation of the internet or the personal computer. All of Silicon Valley — of Big Tech — is focused on taking large language models and other forms of artificial intelligence and moving them from the laptops of researchers into the phones and computers of average people. Ideally, they will make a lot of money in the process.

    Read Article >

  • An illustration of Google’s multicolor “G” logo

    Illustration: The Verge

    This is not a joke: Google will now let you perform a “web” search. The filter is rolling out now, and in my early tests on desktop, it looks like it could be an incredibly popular change to Google’s search engine.

    The optional setting filters out almost all the other blocks of content that Google crams into a search results page, leaving you with links and text — and Google confirms to The Verge that it will block the company’s new AI Overviews as well.

    Read Article >

  • Google logo with colorful shapes

    Illustration: The Verge

    Google announced on Tuesday that the code for Project Gameface, its hands-free gaming “mouse” that you control by making faces, is now available open source to Android developers.

    Developers can now integrate the accessibility feature into their apps, allowing users to control the cursor with facial gestures or by moving their heads. For example, they can open their mouth to move the cursor or raise their eyebrows to click and drag. (A rough sketch of that mapping follows this item.)

    Read Article >
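
    Here is a rough Kotlin sketch of that mapping, not Gameface’s actual code: it assumes you already get per-frame facial “blendshape” scores (for example, from a face-landmark model like the one Gameface builds on) and turns them into cursor movement and drag state. The gesture names and thresholds are illustrative assumptions.

    data class CursorCommand(val dx: Float, val dy: Float, val dragging: Boolean)

    class FaceCursorMapper(
        private val mouthOpenThreshold: Float = 0.4f,
        private val browRaiseThreshold: Float = 0.5f,
        private val speed: Float = 12f,
    ) {
        // `scores` maps gesture names (e.g., "jawOpen", "browInnerUp",
        // "headYaw", "headPitch") to values in roughly 0..1.
        fun map(scores: Map<String, Float>): CursorCommand {
            val mouthOpen = (scores["jawOpen"] ?: 0f) > mouthOpenThreshold
            val browsRaised = (scores["browInnerUp"] ?: 0f) > browRaiseThreshold

            // Move the cursor only while the mouth is open, steering with head pose.
            val dx = if (mouthOpen) (scores["headYaw"] ?: 0f) * speed else 0f
            val dy = if (mouthOpen) (scores["headPitch"] ?: 0f) * speed else 0f

            // Raised eyebrows hold a click-and-drag.
            return CursorCommand(dx, dy, dragging = browsRaised)
        }
    }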

  • How to care for your AI.

    Google is distributing these little handbooks for prompting AI, which is kind of adorable? Each one has color-coded highlights breaking down the basic components of a prompt. There’s an early internet “How to use a search engine” vibe about it — I’m gonna hang on to this one for posterity.

  • Here’s Sergey Brin holding court with reporters at Google I/O.

    Sergey posted up outside the area where Google was giving demos of Project Astra’s multimodal chats. He said he thinks Sundar is doing a good job making hard decisions as CEO, said he mostly uses AI for coding tasks, and politely declined to answer a question from Bloomberg’s Shirin Ghaffary about Larry Page accusing Elon Musk of being a “speciesist.”

    Sergey Brin at Google I/O 2024

    Sergey, Brinning.
  • Illustration of Google’s wordmark, written in red and pink on a dark blue background.

    Illustration: The Verge

    Google says its new AI model, LearnLM, will help students with their homework.

    LearnLM, a family of AI models based on Google’s other large language model, Gemini, was built to act as an expert in various subjects, find and present examples in different formats like photos or videos, coach students while they study, and, in Google’s words, “inspire engagement.”

    Read Article >

  • Android will use Gemini Nano AI for TalkBack image descriptions.

    At Google I/O 2024 today, Google announced a multimodal version of Gemini Nano, allowing the on-device AI model to recognize images, sounds, and spoken language in addition to text.

    Those multimodal capabilities are also coming to the Android accessibility feature TalkBack, using AI to fill in missing information about unlabeled images, without requiring a connection to the internet.

    Animation showing Google TalkBack powered by Gemini Nano AI recognizing an image and describing it for a user as “A close-up of a black and white gingham dress. The dress is short with a collar and long sleeves. It is tied at the waist with a big bow.”

    Google notes that “Description of images may vary.”
    Image: Google
  • Google Gemini video search showing someone asking why a film camera lever isn’t moving all the way.


    Google made a lot of noise about its Gemini AI taking over search at its I/O conference today, but one of its flashiest demos was once again marked by the ever-present fatal flaw of every large language model to date: confidently making up the wrong answer.

    During a sizzle reel for “Search in the Gemini era,” Google demoed video search, which allows you to search by speaking over a video clip. The example is a video of a stuck film advance lever on a film camera, paired with the query “why is the lever not moving all the way,” which Gemini recognizes and answers with some suggested fixes. Very impressive!

    Read Article >

  • But we may have written off Google’s Glasses too soon — because Google just revealed a new prototype pair in a blink-and-you-missed-it moment at Google I/O.

    Read Article >

  • Marc Rebillet kicking off Google’s I/O event in a rainbow-colored bath robe

    Well, that’s one way to kick off a developer-focused tech event.
    Image: Google

    Developer conferences aren’t exactly known for having an energetic, party-like atmosphere, but thankfully, that didn’t stop Google’s latest hype man. The company’s I/O event this year was kicked off by Marc Rebillet — an artist known in online spaces for pairing improvised electronic tracks with amusing (and typically loud) vocals. He also wears a lot of robes.

    “If you have no idea who I am, I would expect that,” said Rebillet. He introduced himself as an improvisational musician who “makes stuff up.”

    Read Article >

  • How many times did Google say AI today?

    Obviously, someone noticed our video that clipped every single AI mention at I/O 2023 last year. Sundar Pichai closed the 2024 keynote by showing how AI can save us some work, using it to keep count this time. At that point, the tally was up to 121 AI mentions.

    …by the time they were finished, it was probably more like 124.

    Google CEO Sundar Pichai in front of a sign showing how many times “AI” was said during the I/O 2024 keynote (at that point, it was 121).

    Image: Google
  • Google I/O just ended — and it was packed with AI announcements. As expected, the event focused heavily on Google’s Gemini AI models, along with the ways they’re being integrated into apps like Workspace and Chrome.

    If you didn’t get to tune in to the event live, you can catch up with all the latest from Google in the roundup below.

    Read Article >

  • Quick, go play around with AI Studio.

    Head over to Google’s Vertex AI Studio site and click “Try it in console” to goof around with some of the AI tools Google talked about at I/O today. The site is meant for developers who want to test the company’s models out while deciding what works best for their software, but anyone can play with it. (If you’d rather call the models from code, there’s a rough sketch after this item.)

    Sample screen using a Gemini Gem.

    Image: Google
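
    For those who’d rather poke at the models from code than from a console, the same Gemini family can be called from an app. A minimal Kotlin sketch, assuming Google’s generative AI client SDK for Android and an API key from AI Studio; treat the exact class and model names as assumptions rather than a guaranteed surface:

    import com.google.ai.client.generativeai.GenerativeModel

    // Send one prompt to a Gemini model and print the reply.
    suspend fun askGemini(apiKey: String) {
        val model = GenerativeModel(
            modelName = "gemini-1.5-flash", // one of the models shown off at I/O 2024
            apiKey = apiKey,
        )

        val response = model.generateContent("Summarize the I/O 2024 keynote in one sentence.")
        println(response.text)
    }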
