Google rolled out a bunch of new features with Android 16 on Tuesday, but the company appears to be saving its big Material 3 Expressive redesign for a future update. The update doesn’t feature the design language’s revamped elements, and a source tells Android Authority’s Mishaal Rahman that Google is planning to launch the new look on September 3rd, 2025, instead.
With Android 16, Google is starting to roll out support for Live Updates with progress-centric notifications and enhanced settings for users with hearing aids. The updates are coming to Pixel devices first, but according to Google, Android users will have to wait for another update to see Live Updates “fully realized.”
Google officially took the wraps off Material 3 Expressive following a leak last month. The design language features updated icon shapes, type styles, and color palettes, with “more natural, springy animations” across the Android interface. You can already check out some Material 3 Expressive updates in the Android 16 QPR1 beta that’s available now, but Rahman notes that Google plans to launch more design updates in the upcoming Android 16 QPR1 Beta 2.
Google is expected to include Android’s desktop mode in a September launch as well. The new mode, which builds on Samsung’s DeX platform, optimizes apps and content for large-screen devices. It will allow you to resize multiple app windows across your screens, as well as connect phones and tablets to external displays for a desktop-like experience. Users with a Pixel 8 and up can try out these features in the Android 16 beta, but the rest of us will likely have to wait a few more months.
Google is starting to offer buyouts to US-based employees in its sprawling Search organization, along with other divisions like marketing, research, and core engineering, according to multiple employees familiar with the matter.
The buyouts, which Google is referring to as a "voluntary exit program," are currently not being offered to employees in DeepMind, Google Cloud, YouTube, or Google's central ad sales organization. Employees in Google's platforms and services group, which includes Android and the Pixel line of devices, were offered buyouts earlier this year before the company enacted layoffs. It's unclear if more layoffs will follow this week's buyout announcement. Employees in some orgs are being offered a minimum of 14 weeks' pay with a July 1st enrollment deadline.
Other parts of Google, including YouTube, are also requiring US employees within a 50-mile radius of an office to return to work at least three days a week by September, or be laid off with severance.
In an internal memo I obtained, Nick Fox, the head of Google's wider "Knowledge and Information" group that includes Search, called the buyout program a "supportive exit path for those of you who don't feel a …
Google has its own internal AI tools to help engineers be more productive.
Google CEO Sundar Pichai said the company is tracking how AI makes its engineers more productive.
During the "Lex Fridman Podcast," Pichai estimated a 10% increase in engineering capacity.
Separately, Google and Microsoft have publicly shared how much of their code is being generated by AI.
Google is tracking how AI is making its engineers more productive — and has developed a specific way to measure it.
Speaking on an episode of the "Lex Fridman Podcast" that aired last week, Google CEO Sundar Pichai said that the company was looking closely at how artificial intelligence was boosting productivity among its software developers.
"The most important metric, and we carefully measure it, is how much has our engineering velocity increased as a company due to AI?" he said. The company estimates that it's so far seen a 10% boost, Pichai said.
A Google spokesperson clarified to Business Insider that the company tracks this by measuring the increase in engineering capacity created, in hours per week, from the use of AI-powered tools.
Put simply, it's a measurement of how much extra time engineers are getting back thanks to AI.
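To make that metric concrete, here is a back-of-envelope sketch of what "engineering capacity created, in hours per week" works out to. The head count, work week, and boost figure below are purely illustrative assumptions, not Google's actual numbers.

```python
# Illustrative only: every number here is an assumption for the sake of
# the arithmetic, not a figure reported by Google.
def weekly_hours_reclaimed(engineers: int, hours_per_week: float, boost: float) -> float:
    """Extra engineering hours created per week by an AI productivity boost."""
    return engineers * hours_per_week * boost

# A hypothetical 1,000-engineer org on 40-hour weeks with a 10% boost
print(weekly_hours_reclaimed(1_000, 40, 0.10))  # → 4000.0
```

In other words, at a 10% boost, every thousand engineers would effectively gain the output of a hundred more.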
Whether Google expects that 10% number to keep increasing, Pichai didn't say. However, he said he expects agentic capabilities — where AI can take actions and make decisions more autonomously — will unlock the "next big wave".
Google has its own internal tools to help engineers code. Last year, the company launched an internal coding copilot named "Goose," trained on 25 years of Google's technical history, Business Insider previously reported.
Even as AI boosts productivity, Pichai said during the podcast that Google plans to hire more engineers next year. "The opportunity space of what we can do is expanding too," he said, adding that he hopes AI removes some of the grunt work and frees up time for more enjoyable aspects of engineering.
Separately, the company is tracking the amount of code that is being generated by AI within Google's walls — a number that is apparently increasing.
Google isn't the only one. Speaking at London Tech Week on Monday, Microsoft UK CEO Darren Hardman said its GitHub Copilot coding assistant is now writing 40% of code at the company, "enabling us to launch more products in the last 12 months than we did in the previous three years."
He added: "It isn't just about speed."
In April, Meta CEO Mark Zuckerberg predicted AI could handle half of Meta's developer work within a year.
As new questions arise about how AI will communicate with humans — and with other AI — new protocols are emerging.
AI protocols are evolving to address interactions between humans and AI, and among AI systems.
New AI protocols aim to manage non-deterministic behavior, crucial for future AI integration.
"I think we will see a lot of new protocols in the age of AI," an executive at World told BI.
The tech industry, much like everything else in the world, abides by certain rules.
With the boom in personal computing came USB, a standard for transferring data between devices. With the rise of the internet came IP addresses, numerical labels that identify every device online. With the advent of email came SMTP, a framework for routing email across the internet.
These are protocols — the invisible scaffolding of the digital realm — and with every technological shift, new ones emerge to govern how things communicate, interact, and operate.
As the world enters an era shaped by AI, it will need to draw up new ones. But AI goes beyond the usual parameters of screens and code. It forces developers to rethink fundamental questions about how technological systems interact across the virtual and physical worlds.
How will humans and AI coexist? How will AI systems engage with each other? And how will we define the protocols that manage a new age of intelligent systems?
Across the industry, startups and tech giants alike are busy developing protocols to answer these questions. Some govern the present in which humans still largely control AI models. Others are building for a future in which AI has taken over a significant share of human labor.
"Protocols are going to be this kind of standardized way of processing non-deterministic information," Antoni Gmitruk, the chief technology officer of Golf, which helps clients deploy remote servers aligned with Anthropic's Model Context Protocol, told BI. Agents, and AI in general, are "inherently non-deterministic in terms of what they do and how they behave."
When AI behavior is difficult to predict, the best response is to imagine possibilities and test them through hypothetical scenarios.
Here are a few that call for clear protocols.
Scenario 1: Humans and AI, a dialogue of equals
Games are one way to determine which protocols strike the right balance of power between AI and humans.
In late 2024, a group of young cryptography experts launched Freysa, an AI agent that invites human users to manipulate it. The rules are unconventional: Make Freysa fall in love with you or agree to concede its funds, and the prize is yours. The prize pool grows with each failed attempt in a standoff between human intuition and machine logic.
Freysa has caught the attention of big names in the tech industry, from Elon Musk, who called one of its games "interesting," to veteran venture capitalist Marc Andreessen.
"The core technical thing we've done is enabled her to have her own private keys inside a trusted enclave," said one of the architects of Freysa, who spoke to BI in a January interview on the condition of anonymity.
Secure enclaves are not new in the tech industry. They're used by companies from AWS to Microsoft as an extra layer of security to isolate sensitive data.
In Freysa's case, the architect said they represent the first step toward creating a "sovereign agent." He defined that as an agent that can control its own private keys, access money, and evolve autonomously — the type of agent that will likely become ubiquitous.
"Why are we doing it at this time? We're entering a phase where AI is getting just good enough that you can see the future, which is AI basically replacing your work, my work, all our work, and becoming economically productive as autonomous entities," the architect said.
In this phase, they said Freysa helps answer a core question: "What does human involvement look like? And how do you have human co-governance over agents at scale?"
In May, The Block, a crypto news site, revealed that the company behind Freysa is Eternis AI, which describes itself as an "applied AI lab focused on enabling digital twins for everyone, multi-agent coordination, and sovereign agent systems." The company has raised $30 million from investors, including Coinbase Ventures. Its co-founders are Srikar Varadaraj, Pratyush Ranjan Tiwari, Ken Li, and Augustinas Malinauskas.
Scenario 2: To the current architects of intelligence
Freysa establishes protocols in anticipation of a hypothetical future when humans and AI agents interact with similar levels of autonomy. The world, however, also needs to set rules for the present, where AI remains a product of human design and intention.
AI typically runs on the web and builds on existing protocols developed long before it, explained Davi Ottenheimer, a cybersecurity strategist who studies the intersection of technology, ethics, and human behavior, and is president of security consultancy flyingpenguin. "But it adds in this new element of intelligence, which is reasoning," he said, and we don't yet have protocols for reasoning.
"I'm seeing this sort of hinted at in all of the news. Oh, they scanned every book that's ever been written and never asked if they could. Well, there was no protocol that said you can't scan that, right?" he said.
There might not be protocols, but there are laws.
OpenAI is facing a copyright lawsuit from the Authors Guild for training its models on data from "more than 100,000 published books" and then deleting the datasets. Meta considered buying the publishing house Simon & Schuster outright to gain access to published books. Tech giants have also resorted to tapping almost all of the consumer data available online from the content of public Google Docs and the relics of social media sites like Myspace and Friendster to train their AI models.
Ottenheimer compared the current dash for data to the creation of ImageNet — the visual database that propelled computer vision, built by Mechanical Turk workers who scoured the internet for content.
"They did a bunch of stuff that a protocol would have eliminated," he said.
Scenario 3: How to talk to each other
As we move closer to a future where artificial general intelligence is a reality, we'll need protocols for how intelligent systems — from foundation models to agents — communicate with each other and the broader world.
The leading AI companies have already launched new ones to pave the way. Anthropic, the maker of Claude, launched the Model Context Protocol, or MCP, in November 2024. It describes it as a "universal, open standard for connecting AI systems with data sources, replacing fragmented integrations with a single protocol."
In April, Google launched Agent2Agent, a protocol that will "allow AI agents to communicate with each other, securely exchange information, and coordinate actions on top of various enterprise platforms or applications."
These build on existing AI protocols, but address new challenges of scaling and interoperability that have become critical to AI adoption.
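For a concrete picture of what such protocol traffic looks like: MCP messages travel as JSON-RPC 2.0 requests. The sketch below builds one such envelope; the method name is illustrative and shouldn't be read as the authoritative MCP surface.

```python
import json

# Hedged sketch: constructs a JSON-RPC 2.0 request of the kind MCP-style
# agent protocols exchange. The "tools/list" method name is used here
# only as an example of the request/response shape.
def make_request(req_id: int, method: str, params: dict) -> str:
    """Serialize a JSON-RPC 2.0 request envelope."""
    return json.dumps({
        "jsonrpc": "2.0",   # protocol version, fixed by the JSON-RPC spec
        "id": req_id,       # lets the caller match the eventual response
        "method": method,   # the operation being requested
        "params": params,   # structured arguments for that operation
    })

print(make_request(1, "tools/list", {}))
```

The appeal of building on an existing wire format like JSON-RPC is exactly the point these companies make: a single, well-understood envelope replaces a tangle of bespoke integrations.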
Managing agents' behavior, then, is the "middle step before we unleash the full power of AGI and let them run around the world freely," Gmitruk said. When we arrive at that point, he said, agents will no longer communicate through APIs but in natural language. They'll have unique identities, jobs even, and need to be verified.
"How do we enable agents to communicate between each other, and not just being computer programs running somewhere on the server, but actually being some sort of existing entity that has its history, that has its kind of goals," Gmitruk said.
It's still early to set standards for agent-to-agent communication, Gmitruk said. Earlier this year he and his team initially launched a company focused on building an authentication protocol for agents, but pivoted.
"It was too early for agent-to-agent authentication," he told BI over LinkedIn. "Our overall vision is still the same -> there needs to be agent-native access to the conventional internet, but we just doubled down on MCP as this is more relevant at the stage of agents we're at."
Does everything need a protocol?
Definitely not. The AI boom marks a turning point, reviving debates over how knowledge is shared and monetized.
McKinsey & Company calls it an "inflection point" in the fourth industrial revolution — a wave of change that it says began in the mid-2010s and spans the current era of "connectivity, advanced analytics, automation, and advanced-manufacturing technology."
Moments like this raise a key question: How much innovation belongs to the public and how much to the market? Nowhere is that clearer than in the AI world's debate between the value of open-source and closed models.
"I think we will see a lot of new protocols in the age of AI," said Tiago Sada, the chief product officer at Tools for Humanity, the company building the technology behind Sam Altman's World. However, "I don't think everything should be a protocol."
World is a protocol designed for a future in which humans will need to verify their identity at every turn. Sada said the goal of any protocol "should be like this open thing, like this open infrastructure that anyone can use," one that is free from censorship or influence.
At the same time, "one of the downsides of protocols is that they're sometimes slower to move," he said. "When's the last time email got a new feature? Or the internet? Protocols are open and inclusive, but they can be harder to monetize and innovate on," he said. "So in AI, yes — we'll see some things built as protocols, but a lot will still just be products."
It seems like hardly a day goes by anymore without a new version of Google's Gemini AI landing, and sure enough, Google is rolling out a major update to its most powerful 2.5 Pro model. This release is aimed at fixing some problems that cropped up in an earlier Gemini Pro update, and the word is, this version will become a stable release that comes to the Gemini app for everyone to use.
The previous Gemini 2.5 Pro release, known as the I/O Edition, or simply 05-06, was focused on coding upgrades. Google claims the new version is even better at generating code, with a new high score of 82.2 percent in the Aider Polyglot test. That beats the best from OpenAI, Anthropic, and DeepSeek by a comfortable margin.
While the general-purpose Gemini 2.5 Flash has left preview, the Pro version is lagging behind. In fact, the last several updates have attracted some valid criticism of 2.5 Pro's performance outside of coding tasks since the big 03-25 update. Google's Logan Kilpatrick says the team has taken that feedback to heart and that the new model "closes [the] gap on 03-25 regressions." For example, users will supposedly see more creativity with better formatting of responses.
Google said the newest version of Gemini 2.5 Pro, now in preview, gives faster and more creative responses while performing better than OpenAI's o3.
Google’s AI-powered notetaking app, NotebookLM, now lets you share your notebooks with classmates, coworkers, or students using a public link. Though viewers can’t edit what’s in your notebook, they can still use it to ask questions and interact with AI-generated content like audio overviews, briefings, and FAQs.
The steps to making your notebook available publicly are pretty similar to the way you share something in Google Drive, Docs, Sheets, and Slides. You just select the Share button in the top-right corner of the notebook, and then change the access to “Anyone with a link.” From there, hit the “Copy link” button and then paste the notebook link into a text, email, or even on social media if you want more people to interact with the information.
Google also lets you share your notebooks with others by entering their email address. Unlike with public link-sharing, you can give individual users the ability to edit your notebook. You can share audio overviews from within the Gemini app as well.
It has become a common refrain during Google's antitrust saga: What happened to "don't be evil?" Google's unofficial motto has haunted it as it has grown ever larger, but a shareholder lawsuit sought to rein in some of the company's excesses. And it might be working. The plaintiffs in the case have reached a settlement with Google parent company Alphabet, which will spend a boatload of cash on "comprehensive" reforms. The goal is to steer Google away from the kind of anticompetitive practices that got it in hot water.
Under the terms of the settlement, obtained by Bloomberg Law, Alphabet will spend $500 million over the next 10 years on systematic reforms. The company will have to form a board-level committee devoted to overseeing the company's regulatory compliance and antitrust risk, a rarity for US firms. This group will report directly to CEO Sundar Pichai. There will also be reforms at other levels of the company that allow employees to identify potential legal pitfalls before they affect the company. Google has also agreed to preserve communications. Google's propensity to use auto-deleting chats drew condemnation from several judges overseeing its antitrust cases.
The agreement still needs approval from US District Judge Rita Lin in San Francisco, but that's mainly a formality at this point. Naturally, Alphabet does not admit to any wrongdoing under the terms of the settlement, but it may have to pay tens of millions in legal fees on top of the promised $500 million investment.
Every smartphone maker is racing to find a way to put AI in your pocket, but no one has cracked the code yet. Samsung was an early supporter of Google's Gemini AI, which has largely supplanted its little-used Bixby assistant. However, a new report claims Samsung is planning a big AI shakeup by partnering with Perplexity on the Galaxy S26.
Perplexity pitches itself as an AI-powered search service, running on the same generative AI technology behind ChatGPT, Gemini, and all the others. However, it cites its sources around the web more prominently than a pure chatbot. Perplexity made waves during the Google search antitrust trial when executive Dmitry Shevelenko testified that Google blocked Motorola from using Perplexity on its 2024 phones. The company got its wish this year, though, with Perplexity finding a place on 2025 Razr phones.
A report from Bloomberg says Samsung will be the next to leverage Perplexity's AI. The companies are apparently close to signing a deal that will make this AI model a core part of the Galaxy S26 lineup. Motorola uses Perplexity for search functionality inside its Moto AI system, but the Samsung deal would be more comprehensive.
Google quietly launched AI Edge Gallery, an experimental Android app that runs AI models offline without internet, bringing Hugging Face models directly to smartphones with enhanced privacy.
There's a running theory in tech circles that says, basically, AI is the new UI. Not long from now, some people argue, you simply won't need a homescreen full of app icons or a traditional web browser or really anything other than an interface to an AI assistant and agent that accomplishes everything on your behalf. Is that the actual future, absurd AI boosterism, or something in between? Who knows! But the ranks of the AI believers seem to grow every day.
Apple, however, appears poised to go… a different way. On this episode of The Vergecast, Nilay and David discuss some of the rumors surrounding WWDC, including the possibility of a huge redesign and a new naming scheme for all of Apple's software. It's all eminently reasonable, if slightly confusing. But is it a coat of paint on an old idea, when what Apple actually needs to do is ship the better Siri it has promised for so long? We have many thoughts. (Oh, and a party speaker update.)
On August 5th, 2024, Judge Amit Mehta ruled in the case of United States of America v. Google, saying, “…the court reaches the following conclusion: Google is a monopolist, and it has acted as one to maintain its monopoly. It has violated Section 2 of the Sherman Act.”
That ended the biggest tech antitrust trial since the US took on Microsoft in the 1990s — possibly aside from the government’s antitrust case targeting Google’s ad business — but it’s also just the start of the process. Now, lawyers for Google and the Department of Justice are arguing over the ruling, as well as what to do about the company and its products.
The DOJ argued that Google struck anticompetitive deals with Apple and other companies for prime placement of its search engine. Google maintains that its dominant market share is the result of a superior product. The DOJ says options to resolve the situation include breaking up Google to separate products like Chrome, Search, and Android, but it may be a while until we hear about their full plan.
Read on below for all of the updates and notes from the case.
Google’s AI assistant, Gemini, is gaining a more prominent place in your inbox with the launch of email summary cards, which will appear at the top of your emails. The company announced Thursday that users would no longer have to tap an option to summarize an email with AI. Instead, the AI will now automatically […]
Finding the best wireless earbuds can make a huge difference in your day-to-day life, whether you’re commuting, working out, traveling or just zoning out with your favorite playlist. Today’s earbuds aren’t just about cutting the cord — they’re smarter, more comfortable and packed with features like active noise cancellation, customizable sound profiles and even spatial audio support.
With so many options out there, there’s truly a pair of wireless earbuds for everyone. Whether you want something that's perfect for running, built for all-day wear at the office or tuned for serious audiophiles, the choices have never been better. Some models prioritize battery life, others deliver premium sound quality and a few somehow manage to do it all without blowing your budget. No matter what you’re looking for, we’re here to help you find the right set of wireless earbuds to match your lifestyle.
When it comes to shopping for earphones, the first thing to consider is design or wear style. Do you prefer a semi-open fit like AirPods, or do you want something that completely closes off your ears? If you’re shopping for earbuds with active noise cancellation, you'll want the latter, but a case can be made for the former if you want to wear them all day or frequent places where you need to be tuned in to the ambient sounds. The overall shape of earbuds can determine whether you get a comfortable fit, as can their size and weight, so you’ll want to consider all of that before deciding. And remember: audio companies aren’t perfect, so despite lots of research, the earbud shape they decided on may not fit you well. Don’t be afraid to return ill-fitting earbuds for something that’s more comfortable.
As wireless earbuds have become the norm, they’re now more reliable for basic things like consistent Bluetooth connectivity. Companies are still in a race to pack as much as they can into increasingly smaller designs. This typically means a longer list of features on the more premium sets of earbuds with basic functionality on the cheapest models. Carefully consider what you can’t live without when selecting your next earbuds, and make sure key items like automatic pausing and multipoint connectivity are on the spec sheet. You’ll also want to investigate the volume and touch controls as you’ll often have to sacrifice access to something else to make that adjustment via on-board taps or swipes. Some earbuds even offer app settings to tweak the audio profiles or firmware updates to improve performance over time.
For those in the Apple ecosystem, features like auto-pairing with devices, especially with AirPods Pro 2, can be an added advantage, while Android users may want to look for models that offer similar cross-device functionality.
When it comes to battery life, the average set of earbuds lasts about five hours on a single charge. You can find sets that last longer, but this is likely enough to get you through a work day if you’re docking the buds during lunch or the occasional meeting. You’ll want to check on how many extra charges are available via the case and if it supports wireless charging.
Companies will also make lofty claims about call quality on wireless earbuds. Despite lots of promises, the reality is most earbuds still leave you sounding like you’re on speakerphone. There are some sets that deliver, but don’t get your hopes up unless reviews confirm the claims.
Sound can be subjective, so we recommend trying before you buy if at all possible. This is especially true if you're an audiophile. We understand this isn’t easy when most of us do a lot of shopping online, but trying on a set of earbuds and listening to them for a few minutes can save you from an expensive case of buyer's remorse. If a store doesn’t allow a quick demo, most retailers have return policies that will let you take back earbuds you don’t like. Of course, you have to be willing to temporarily part with funds in order to do this.
We also recommend paying attention to things like Spatial Audio, Dolby Atmos, 360 Reality Audio and other immersive formats. Not all earbuds support them, so you’ll want to make sure a prospective pair does if that sort of thing excites you, especially if you plan to use them for playback of high-quality audio.
How we test wireless Bluetooth earbuds
The primary way we test earbuds is to wear them as much as possible. We prefer to do this over a one- to two-week period, but sometimes embargoes don’t allow it. During this time, we listen to a mix of music and podcasts, while also using the earbuds to take both voice and video calls. Since battery life for earbuds is typically less than a full day, we drain the battery with looping music and the volume set at a comfortable level (usually around 75 percent).
To judge audio quality, we listen to a range of genres, noting any differences in the sound profile across the styles. We also test at both low and high volumes to check for consistency in the tuning. To assess call quality, we’ll record audio samples with the earbuds’ microphones as well as have third parties call us.
When it comes to features, we do a thorough review of companion apps, testing each feature as we work through the software. Any holdovers from previous models are double checked for improvements or regression. If the earbuds we’re testing are an updated version of a previous model, we’ll spend time getting reacquainted with the older buds. Ditto for the closest competition for each new set of earbuds that we review.
Other wireless Bluetooth earbuds we tested
Beats Powerbeats Pro 2
The newest version of the Powerbeats Pro has an improved, comfortable design, balanced bass, new H2 chips and a heart rate sensor inside. But heart rate support is currently limited on iOS, and there's a possibility those capabilities make it onto the next AirPods Pro models.
Samsung Galaxy Buds 3
The Galaxy Buds 3 combine ANC with an open-type design, which renders the noise-blocking abilities of the earbuds mostly useless. Still, there’s great low-end tone with ample bass when a track demands it. There are also lots of handy features, most of which require a Samsung phone. But at this price, there are better options from Google, Beats and Sony.
Sennheiser Momentum Sport
I really like the overall shape of the Momentum Sport earbuds. They’re more comfortable than the Momentum True Wireless 4 and fit in my ears better. What’s more, the body temperature and heart rate sensors work well, sending those stats to a variety of apps. However, that sport-tracking feature works best with Polar’s app and devices, so there’s that consideration. Also, the audio quality and ANC performance isn’t as good as the MTW4, and these earbuds are pricey.
Beats Solo Buds
There’s a lot to like about the Solo Buds for $80. For me, the primary perk is they’re very comfortable to wear for long periods of time thanks to some thoughtful design considerations. You only get the basics here in terms of features and, as expected, the overall sound quality isn’t as good as the pricier models in the Beats lineup. You will get 18 hours of battery life though, since the company nixed the battery in the case and beefed up the listening time in the buds themselves.
Bose Ultra Open Earbuds
Bose created something unique with this set of earbuds, which allows you to stay in tune with the world while listening to audio content. The clip-on design is very comfortable, but sound quality suffers due to the open-type fit, especially when it comes to bass and spatial audio.
Audio-Technica ATH-TWX7
These stick buds have a compact design that’s comfortable to wear and the warm sound profile is great at times. However, overall audio performance is inconsistent and there’s no automatic pausing.
Master & Dynamic MW09
Retooled audio, better ambient sound mode and reliable multipoint Bluetooth are the best things the MW09 has to offer. They’re expensive though, and you can find better ANC performance elsewhere.
Wireless earbud FAQs
What is considered good battery life for true wireless earbuds?
Most wireless earbuds will last five hours on a single charge, at the least. You can find some pairs that have even better battery life, lasting between six and eight hours before they need more juice. All of the best wireless earbuds come with a charging case, which will provide additional hours of battery life — but you'll have to return each bud to the case in order to charge them up.
Is sound quality better on headphones or earbuds?
Comparing sound quality on earbuds and headphones is a bit like comparing apples and oranges. There are a lot of variables to consider and the differences in components make a direct comparison difficult. Personally, I prefer the audio quality from over-ear headphones, but I can tell you the sound from earbuds like Sennheiser’s Momentum True Wireless 3 is also outstanding.
Which wireless earbuds have the longest battery life?
With new models coming out all the time, battery life for each one can be difficult to keep tabs on. The longest-lasting earbuds we’ve reviewed are Audio-Technica’s ATH-CKS5TW. The company states they last 15 hours, but the app was still showing 40 percent at that mark during our tests. The only downside is these earbuds debuted in 2019, and both technology and features have improved since. In terms of current models, Master & Dynamic’s MW08 offers 12 hours of use on a charge with ANC off (10 with ANC on), and JBL has multiple options with 10-hour batteries.
What wireless earbuds are waterproof?
There are plenty of options these days when it comes to increased water resistance. To determine the level of protection, you’ll want to look for an IP (ingress protection) rating. The first number indicates intrusion protection from things like dust. The second number is the level of moisture protection and you’ll want to make sure that figure is 7 or higher. At this water-resistance rating, earbuds can withstand full immersion for up to 30 minutes in depths up to one meter (3.28 feet). If either of the IP numbers is an X, that means it doesn’t have any special protection. For example, a pair of wireless earbuds that are IPX7 wouldn’t be built to avoid dust intrusion, but they would be ok if you dropped them in shallow water.
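The two-digit scheme above is mechanical enough to sketch in code. This is an illustrative helper for reading an IP code as described, not an implementation of the full standard.

```python
# Sketch of reading an IP (ingress protection) code as described above.
# An "X" in either position means no rating was assigned for that kind
# of protection.
def decode_ip(code: str) -> dict:
    """Split a rating like 'IPX7' into its dust and water digits."""
    dust, water = code[2], code[3]
    return {
        "dust_rating": None if dust == "X" else int(dust),
        "water_rating": None if water == "X" else int(water),
        # 7 or higher means the buds survive full immersion
        # (30 minutes at up to one meter for a 7)
        "submersible": water != "X" and int(water) >= 7,
    }

print(decode_ip("IPX7"))  # no dust rating, but safe in shallow water
```

So an IPX7 pair clears the immersion bar despite carrying no dust rating, while something like IP54 is splash-resistant at best.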
Which earbuds stay in ears the best?
A secure fit can vary wildly from person to person. All of our ears are different, so audio companies are designing their products to fit the most people they can with a single shape. This is why AirPods will easily fall out for some but stay put for others. Design touches like wing tips or fins typically come on fitness models and those elements can help keep things in place. You’ll likely just have to try earbuds on, and if they don’t fit well return them.
What wireless earbuds work with PS5?
PlayStation 5 doesn’t support Bluetooth audio without an adapter or dongle. Even Sony’s own gaming headsets come with a transmitter that connects to the console. There are universal options that allow you to use any headphones, headset or earbuds with a PS5. Once you have one, plug it into a USB port on the console and pair your earbuds with it.
Recent updates
May 2025: Updated to ensure top picks and buying advice remain accurate.
March 2025: Updated the top pick for the best sounding wireless earbuds - runner up.
January 2025: Updated the top pick for best sounding wireless earbuds.
July 2024: Updated our list to include the Samsung Galaxy Buds 3 Pro.
This article originally appeared on Engadget at https://www.engadget.com/audio/headphones/best-wireless-earbuds-120058222.html?src=rss
The current incarnation of Google Photos was not Google's first image management platform, but it's been a big success. Ten years on, Google Photos remains one of Google's most popular products, and it's getting a couple of new features to celebrate its 10th year in operation. You'll be able to share albums a bit more easily, and editing tools are getting a boost with, you guessed it, AI.
Google Photos made a splash in 2015 when it broke free of the spiraling Google+ social network, offering people supposedly unlimited free storage for compressed images. Of course, that was too good to last. In 2021, Google began limiting photo uploads to 15GB for free users, sharing the default account level storage with other services like Gmail and Drive. Today, Google encourages everyone to pay for a Google One subscription to get more space, which is a bit of a bummer. Regardless, people still use Google Photos extensively.
According to the company, Photos has more than 1.5 billion monthly users, and it stores more than 9 trillion photos and videos. When using the Photos app on a phone, you are prompted to automatically upload your camera roll, which makes it easy to keep all your memories backed up (and edge ever closer to the free storage limit). Photos has also long offered almost magical search capabilities, allowing you to search for the content of images to find them. That may seem less impressive now, but it was revolutionary a decade ago. Google says users perform over 370 million searches in Photos each month.