
Google's tool for virtually trying on clothes is now available in the US

24 July 2025 at 14:43

At I/O 2025 in May, Google previewed a new AI-powered feature the company said would simplify online shopping. The tool allows you to upload a single, full-body photo of yourself to "try on" different pieces of clothing you find online. Following a limited preview, Google has begun rolling out the feature to users in the US. You can start trying on clothing for yourself by tapping the "try on" icon on any product listing on Google or any apparel product result you find on Google Images.

Powering the experience is an image generation model Google trained to take into account how different materials fold, stretch and drape across different human bodies. According to Google, the model supports billions of clothing items found across the company's Shopping Graph, meaning there may be some outfits the AI will have a hard time parsing. However, most clothing items from popular retailers should be supported out of the gate.

With today's release, Google has also enhanced the price-tracking functionality built into the feature. Naturally, you can specify the color and size you want, but Google also allows you to set the price you want to pay for the item. It's possible to configure the price watch so you're only alerted when the product you want dips below a specific price. "The Shopping Graph has products and prices from all across the web, so we'll let you know when there's an offer that meets your criteria," says Google. "No more constantly checking to see if that bag you're eyeing is finally at the right price for you or forgetting to come back to a product you loved."

Later this year, Google plans to bring additional shopping features to AI Mode, the dedicated AI tab the company began rolling out to everyone in the US this past May. Come this fall, you'll be able to explore outfit and decor ideas, and buy what suits your fancy, directly from the chatbot.

This article originally appeared on Engadget at https://www.engadget.com/ai/googles-tool-for-virtually-trying-on-clothes-is-now-available-in-the-us-144342056.html?src=rss

Β©

Β© Google

Google's latest AI-powered tool allows you to virtually try on clothes.

Trump's AI Action Plan targets state regulation and 'ideological bias'

23 July 2025 at 16:32

At the start of the year, President Trump announced his AI Action Plan, an initiative he said would eventually enact policy that would "enhance America's position as an AI powerhouse." Now, after months of consultation with industry players like Google and OpenAI, the administration has finally shared the specific actions it plans to take.

Notably, the framework seeks to limit state regulation of AI companies by instructing the Office of Science and Technology Policy (OSTP) and other federal agencies to consider a state's existing AI laws before awarding AI-related funding. "The Federal government should not allow AI-related Federal funding to be directed to those states with burdensome AI regulations that waste these funds," the document states. As you may recall, Trump's "Big Beautiful Bill" was supposed to include a 10-year qualified moratorium on state AI regulation before that amendment was ultimately removed in a 99-1 vote by the US Senate.

Elsewhere, the AI Action Plan targets AI systems the White House says promote "social engineering agendas." To that end, Trump plans to direct the National Institute of Standards and Technology, through the Department of Commerce, to revise its AI Risk Management Framework to remove any mentions of "misinformation, Diversity, Equity, and Inclusion, and climate change." Furthermore, he's calling for an update to the federal government's procurement guidelines to ensure the government only contracts model providers that can definitively say their AI systems are "free from top-down ideological bias." Just how companies like OpenAI, Google and others are expected to do this is unclear from the document.

Separately, Trump says he plans to remove regulatory hurdles that slow the construction of AI data centers. "America's environmental permitting system and other regulations make it almost impossible to build this infrastructure in the United States with the speed that is required," the document states. Specifically, the president plans to make federal lands available for the construction of data centers and power generation facilities. Under the Action Plan, the federal government will also expand efforts to use AI to carry out environmental reviews.

The president plans to sign a handful of executive orders today to start the wheels turning on his action plan. Trump began his second term by rescinding President Biden's October 2023 AI guidelines. Biden's executive order outlined a plan to establish protections for the general public with regard to artificial intelligence. Specifically, the EO sought new standards for safety and security in addition to protocols for AI watermarking and both civil rights and consumer protections.

This article originally appeared on Engadget at https://www.engadget.com/ai/trumps-ai-action-plan-targets-state-regulation-and-ideological-bias-163247225.html?src=rss

Β©

Β© Reuters / Reuters

U.S. President Donald Trump stands after delivering remarks on AI infrastructure at the Roosevelt room at White House in Washington, U.S., January 21, 2025. REUTERS/Carlos Barria/File Photo

Proton's privacy-focused Lumo chatbot encrypts all your conversations

23 July 2025 at 14:45

What's another AI chatbot in an already crowded field? That's the question Proton is trying to answer today with the release of its new Lumo assistant. And as with its best-known service, Proton Mail, the company says Lumo is for those who want a private alternative to what big tech is offering.

Proton says every conversation with Lumo is secured with zero-access encryption, meaning only your device can unlock your content. In the context of an AI chatbot, that has several implications. Most notably, it means not even Proton can view your chats. As a result, the company can't share your data with governments, advertisers or, for that matter, any other company, and it can't use your data to train future AI models. "By using Lumo, you can enjoy the benefits of an advanced AI assistant without the risk of your data being misused," says Proton.
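
In practice, zero-access means ciphertext is created and opened only on your device. Here's a minimal sketch of the idea in Python, assuming a key derived from a user secret; it illustrates zero-access encryption generally and is not Proton's actual scheme, which would at minimum use a hardened, salted key-derivation function.

```python
import base64
import hashlib

from cryptography.fernet import Fernet  # pip install cryptography

def derive_key(user_secret: str) -> bytes:
    # Derive a 32-byte key from a secret that never leaves the device.
    # A production scheme would use a salted KDF like scrypt or Argon2.
    digest = hashlib.sha256(user_secret.encode()).digest()
    return base64.urlsafe_b64encode(digest)

key = derive_key("correct horse battery staple")  # stays client-side
token = Fernet(key).encrypt(b"my conversation with the chatbot")

# Only `token` (the ciphertext) would ever be uploaded. Without the key,
# the server can't read it, which is the "zero-access" property.
print(Fernet(key).decrypt(token).decode())
```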

I briefly tried Lumo. It's a bit slow to generate a response, but you can broadly expect an experience similar to what you'd find using ChatGPT or Claude for free. Lumo can search the web to answer questions beyond its knowledge cut-off date, but by default that feature is turned off to further protect user privacy. You can also upload files to Lumo. Here again, Proton says the chatbot won't save any information.

Proton isn't touting the performance of Lumo's large language models, but if you're curious about this sort of thing, it's powered by a handful of open-source systems, including Mistral NeMo and Mistral Small 3. Proton told The Verge that Lumo will filter requests through the model best suited for the task. For example, it will use the OpenHands system for coding requests.

Lumo is free to use, with a weekly query limit. You don't need a Proton account to begin a conversation with the chatbot. In addition to being available on the web, Proton offers both Android and iOS apps. A $13 per month Plus plan offers unlimited usage, alongside perks like larger uploads, access to more advanced AI models, priority support and more.

This article originally appeared on Engadget at https://www.engadget.com/ai/protons-privacy-focused-lumo-chatbot-encrypts-all-your-conversations-144551345.html?src=rss

Β©

Β© Proton

Proton's Lumo chatbot has a purple cat for a mascot.

DuckDuckGo now lets you customize the responses of its Duck.ai chatbots

22 July 2025 at 15:15

Since last June, when DuckDuckGo introduced AI Chat, you've been able to use chatbots like Claude directly through the browser. Now the company is making it easier to tweak the system prompts of those AI models while retaining your privacy. For the uninitiated, system prompts are a set of instructions given to a chatbot at the start of a conversation to guide things along. Often they'll set the tone of the dialogue, and they can sometimes cause a chatbot to be overly sycophantic, as was the case with GPT-4o this past April.

Both Anthropic and OpenAI give users a way to customize the responses of their respective chat bots, but if you don't know where to look for those settings, they can be tricky to find. DuckDuckGo's new system setting is available directly through Duck.ai's prompt bar and works a bit differently. Whatever customization you add is appended to the default system prompt for each model you chat with, meaning you don't need to set them independently of one another. Moreover, your tweaks are stored locally on your device, with no data being sent to Anthropic, OpenAI or any other model provider. It's a small addition, but if you use Duck.ai to compare the responses between different models, now you'll get more consistency in tone.
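
To make that mechanic concrete, here's a minimal sketch of the append-to-default behavior described above. The model names and default prompts are hypothetical placeholders, not Duck.ai's actual values.

```python
# Hypothetical defaults; the real prompts and model IDs aren't public.
DEFAULT_PROMPTS = {
    "gpt-4o-mini": "You are a helpful assistant.",
    "claude-3-haiku": "You are Claude, a helpful assistant.",
}

def build_system_prompt(model: str, customization: str) -> str:
    # The same locally stored customization is appended to every model's
    # default prompt, so there's nothing to configure per model.
    return f"{DEFAULT_PROMPTS[model]}\n\nUser preferences: {customization}"

print(build_system_prompt("gpt-4o-mini", "Keep answers short and cite sources."))
```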

This article originally appeared on Engadget at https://www.engadget.com/ai/duckduckgo-now-lets-you-customize-the-responses-of-its-duckai-chatbots-151521930.html?src=rss

Β©

Β© DuckDuckGo

DuckDuckGo's new AI customization bar allows you to alter the system prompts for different chat bots.

DuckDuckGo now allows you to filter out AI images in search results

18 July 2025 at 14:43

DuckDuckGo is making it easier to wade through some of the AI slop that has taken over the internet in recent months. This week, the company introduced a new filter for removing AI-generated images from search results. The next time you use the browser, you'll see a new dropdown menu titled "AI images." From there, you can set whether you want to see AI content or not.

New setting: hide AI-generated images in DuckDuckGo

Our philosophy about AI features is β€œprivate, useful, and optional.” Our goal is to help you find what you’re looking for. You should decide for yourself how much AI you want in your life – or if you want any at all. (1/4) pic.twitter.com/pTolmsEQlQ

β€” DuckDuckGo (@DuckDuckGo) July 14, 2025

The filter relies on manually curated open-source block lists maintained by uBlock Origin and others. According to DuckDuckGo, the filter won't catch every AI-generated image on the internet, but it will greatly reduce how many you see. The company says it's working on additional filters.

You'll notice the example DuckDuckGo uses to demo the feature in the GIF it provided involves a search for images of a "baby peacock." That's not by accident. People first started noticing how much Google Search results had been overrun by AI slop about a year ago, and one of the worst examples was any query involving the showy birds. Google has since addressed the situation somewhat, but AI slop in search results remains a problem on the platform. So it's good to see DuckDuckGo adopt a simple but effective solution to the issue.

This article originally appeared on Engadget at https://www.engadget.com/ai/duckduckgo-now-allows-you-to-filter-out-ai-images-in-search-results-144326213.html?src=rss

Β©

Β© DuckDuckGo

DuckDuckGo's new filter allows you to remove most AI images from your search results.

One of my favorite Steam early access games is now available on Switch and PS5

17 July 2025 at 17:45

After five years of development, one of Steam's coziest games is leaving early access and making the jump to consoles. Starting today, you can purchase The Wandering Village on PC, Nintendo Switch and PlayStation 5. On Steam, the game's developer, Stray Fawn Studio, is offering a 35 percent discount until July 31. In the US, that means you can get the game for just under $20. Switch owners, meanwhile, can get a 10 percent launch discount until August 7.

I've been playing The Wandering Village on and off since it entered early access in 2022. It's a lovely game that combines two very different influences. Most obviously, the game wears on its sleeve Stray Fawn's love of Hayao Miyazaki's seminal Nausicaä of the Valley of the Wind. The manga, and the later film, are set in a desolate, post-apocalyptic world ravaged by nuclear war.

The Wandering Village's other major influence is the titles of Impressions Games. In the late '90s and early 2000s, the now-defunct studio went on a hot streak, releasing three games (Caesar III, Pharaoh and Zeus: Master of Olympus) that, to this day, define much of the city-building genre.

The Wandering Village marries those influences in a novel way. Rather than building your city on solid ground, you build it on the back of a giant creature called the Onbu. As you can probably guess, the Onbu doesn't stay still. And while there are ways you can influence its behavior, sometimes it has a mind of its own. All of that leads to some interesting gameplay interactions. For example, the Onbu might wander into a biome that is toxic to your villagers. As of the writing of this article, the game has a "very positive" rating on Steam across nearly 6,000 reviews, with recent reviews tilting toward "overwhelmingly positive."

If you want to grab a physical copy of the game for Switch or PS5, Stray Fawn has partnered with Serenity Forge to offer collector's and premium editions of the game. Pre-orders will ship early next year. Despite the game leaving early access, Stray Fawn has promised to keep working on The Wandering Village.

This article originally appeared on Engadget at https://www.engadget.com/gaming/one-of-my-favorite-steam-early-access-games-is-now-available-on-switch-and-ps5-174539016.html?src=rss

Β©

Β© Stray Fawn Studios

In The Wandering Village, players build a city on top of a giant creature called the Onbu.

Trump's defunding of NASA would be catastrophic

17 July 2025 at 15:30

"This is probably the most uncertain future NASA has faced, maybe since the end of Apollo," Casey Dreier tells me over the phone. Dreier is the chief of space policy at The Planetary Society, a nonprofit that advocates for the exploration and study of space.

On July 10, the Senate Appropriations Committee met to discuss the proposed federal Commerce, Justice and Science budget for 2026. While on average, funding for NASA has accounted for about 0.3 percent of total yearly spending by the federal government since the start of the 2010s, President Trump has called for a 24 percent cut year over year to the agency's operating allowance. By any metric, his plan would be devastating.

Adjusted for inflation, it would leave NASA with the smallest operating budget it has had since Russian cosmonaut Yuri Gagarin became the first human to travel to space in 1961. In the process, it would eviscerate the agency's science budget by nearly half, resulting in the termination of 55 ongoing or planned missions. It would also leave NASA with its smallest workforce in 70 years. All this, at a time when the agency has been tasked with returning to the Moon and bringing the first humans to Mars.

"There's no historical precedent to this level of single year, functionally indiscriminate and dramatic cuts. You lose, in one year, a third of all active science projects. [The Trump administration is] proposing to turn off missions that are performing not just good science, but unique and irreplaceable science. This isn't so they can reinvest the money in some radical new science efforts. No, the money is gone," said Dreier. "It's almost certainly the greatest threat to NASA science activities in the history of the space agency."

Dreier isn't exaggerating when he says some missions would be impossible to replace. One of the casualties of Trump's cuts would be the New Horizons probe. In 2015, New Horizons gave us our best look at Pluto ever. Four years later, it performed the farthest flyby in human history. As things stand, it's the only active spacecraft in the Kuiper belt, a region of our solar system that is not well-understood by scientists. Even if NASA were to start working on a replacement today, it would take a generation for that vehicle to reach where New Horizons is right now. It costs NASA about $14.7 million per year to continue operating the probe, a fraction of the $29.9 billion in additional funding Congress allocated to fund ICE enforcement and detainment operations in the president's recently passed tax bill.

OSIRIS-APEX probe visiting the Apophis asteroid
Heather Roper

Another mission that would be impossible to replace is OSIRIS-APEX. If the name sounds familiar, it's because OSIRIS-APEX is a continuation of NASA's incredibly successful OSIRIS-REx flight. In 2020, the spacecraft visited 101955 Bennu, an ancient asteroid about the size of the Empire State Building, and collected a sample of regolith (rocks and dirt) from its surface using a never-before-tried technique.

After OSIRIS-REx successfully returned the sample to Earth, NASA decided to extend the spacecraft's mission and fly to another asteroid, 99942 Apophis. In 2029, Apophis will pass about 19,600 miles from Earth. It will be the closest approach of any known asteroid of its size. NASA said the extension would add $200 million to a mission that had already cost it an estimated $1.16 billion.

"This project is a pennies on the dollar repurposing of an existing spacecraft. It's the only American spacecraft that will be at Apophis for a once in a generation opportunity to study an asteroid that will just barely miss us," said Dreier. "That seems important to know."

At a time when nearly every facet of American life is being upturned, the potential cancellation of dozens of NASA missions might seem a distant concern, but the gutting of the agency's science budget would have a ripple effect on communities across the US.

"NASA is an engine for jobs in the country, and for every NASA job, there are many more that are created in the private workforce," said Bethany Ehlmann, Professor of Planetary Science at the California Institute of Technology. She also serves on the board of directors for The Planetary Society.

Professor Ehlmann's claim is supported by NASA's own data. In 2023, the agency employed 17,823 full-time civil servants nationwide. With NASA's private sector support factored in, that year the agency's missions were responsible for sustaining 304,803 jobs across all 50 states and the District of Columbia. Put another way, for every full-time equivalent job at a NASA facility, NASA supports at least 16 private sector jobs. "Space science has been broadly supported and impacts roughly three quarters of every congressional district in the country," said Dreier. "It's not just a red or blue state thing."

Following last week's Senate meeting, policymakers from both parties said they would push back on President Trump's NASA budget cuts. On Tuesday, the House Appropriations Committee's Subcommittee on Commerce, Justice, Science and Related Agencies passed a funding bill that would provide NASA with a total budget of $24.8 billion for 2026, or the same amount it was allocated this year. The week before, the corresponding subcommittee in the Senate passed its own NASA funding bill.

The two versions differ on one critical detail. The Senate legislation maintains the agency's science budget at $7.3 billion, while the House version seeks to reduce it by 18 percent to $6 billion. Separately, the House is calling for a 23 percent cut to the National Science Foundation's budget. NSF funds much of the nation's astronomy research.

"What I'm hearing from lawmakers is that they understand how important NASA is to industry. They understand how important NASA is to universities in terms of training, and providing grants that train the next generation of the space workforce," said Professor Ehlmann, who was on Capitol Hill last week. The House and Senate will need to come to an agreement for the bill to move forward.

Even with many lawmakers in favor of maintaining NASA's budget, a flat budget is still a funding cut when accounting for inflation. Moreover, NASA has already been negatively affected by the Trump administration's efforts to trim the federal workforce.

According to reporting Politico published on July 9, 2,694 NASA employees have agreed to leave the agency through either early retirement, a buyout or a deferred resignation. Of those individuals, 2,145 are workers in senior positions and 1,818 are staff serving in mission areas like human spaceflight and science. "Once the workforce is gone, they're gone. You lose a ton of institutional knowledge," said Dreier. The employees who have agreed to leave represent about 15 percent of NASA's 2023 workforce of 17,823. With the July 25 deadline for early retirement, voluntary separation and deferred resignations quickly approaching, that number is likely to grow. NASA's shifting priorities under the Trump administration have also created uncertainty among the agency's contractors.

According to former NASA employee and NASA Watch creator Keith Cowing, the workforce cuts are already affecting employees. "In the 40 years I've been involved with NASA in one way or another, I've never seen morale so bad," he said. "Is NASA bloated? Yeah, but the way you deal with bloat is to go in with a scalpel and you cut carefully. And yet you have people [like Elon Musk] standing on stage with chainsaws. That is not the way to run government, and it's certainly not the way to create the machinery needed to explore the universe."

Whatever happens next, Dreier worries there's the potential for an erosion of public support for NASA. He points to a survey published by Pew Research. In 2023, the organization found that monitoring for asteroids that could hit Earth and tracking changes to the planet's climate were the two activities Americans wanted NASA to prioritize over other mandates. By contrast, sending human astronauts to the Moon and Mars were the least important priorities for the public.

NASA's next-generation moon rocket, the Space Launch System (SLS) rocket with the Orion crew capsule, is readied for launch on pad 39-B, for the unmanned Artemis 1 mission to the Moon, at Cape Canaveral, Florida, U.S. November 15, 2022. REUTERS/Joe Skipper
Reuters

The House version of NASA's 2026 budget would boost the agency's exploration budget by 25 percent to $9.7 billion. In Trump's tax bill, Senator Ted Cruz (R-TX) included language that provided NASA with $4.1 billion for the fourth and fifth flights of the Space Launch System (SLS) rocket, the vehicle intended to carry the first NASA astronauts back to the Moon before private sector alternatives like SpaceX's Starship are ready to fly.

With both the Trump administration and House pushing Moon and Mars missions as priorities, Dreier says they're "ironically doubling down on the activities that the private sector is already doing (SpaceX says it's going to send humans to Mars) and abandoning the things that only NASA does. There's no private sector company doing space science."

In effect, a NASA budget that sacrifices scientific research in favor of Moon and Mars missions would be one that invests in the things the public says are least important to it.

"I worry that they're moving away from what the public expects their space agency to do, and that as a consequence, it will undermine public investment in NASA," he said. "NASA is usually tied for the number one or two most popular federal agency. People wear NASA t-shirts. No one wears a Department of the Interior t-shirt walking out of the GAP. It's a rare and precious thing to have, and they're risking it. It's not just the future of the agency that's at risk, but the future of the public's relationship with it."

When asked for comment on this story, Bethany Stevens, NASA's press secretary, pointed Engadget to a letter from Acting Administrator Janet Petro that NASA shared in a technical supplement published alongside the president's budget request.

"We must continue to be responsible stewards of taxpayer dollars. That means making strategic decisions β€” including scaling back or discontinuing ineffective efforts not aligned with our Moon and Mars exploration priorities" Petro wrote.

The final NASA budget for 2026 is still months away from being finalized. After Tuesday's vote, the two funding bills will move to the full Senate and House appropriations committees for a vote and further revisions. Only after that will every member of each chamber get a chance to vote on the matter. Congress has until September 30 to complete the appropriations process before 2025 funding runs out. President Trump could also decide to veto the bill if it doesn't align with his priorities.

Have a tip for Igor? You can reach him by email, on Bluesky or send a message to @Kodachrome.72 to chat confidentially on Signal.

This article originally appeared on Engadget at https://www.engadget.com/science/space/trumps-defunding-of-nasa-would-be-catastrophic-153053020.html?src=rss

Β©

Β© REUTERS / Reuters

NASA's next-generation moon rocket, the Space Launch System (SLS) rocket with the Orion crew capsule, lifts off from launch complex 39-B on the unmanned Artemis 1 mission to the moon, seen from Sebastian, Florida, U.S. November 16, 2022. REUTERS/Joe Rimkus Jr.

Adobe Firefly can now generate sound effects from your audio cues

17 July 2025 at 13:00

Since rolling out the redesign of its Firefly app in April, Adobe has been releasing major updates for the generative AI hub at a near monthly clip. Today, the company is introducing a handful of new features to assist those who use Firefly's video capabilities.

To start, Adobe is making it easier to add sound effects to AI-generated clips. Right now, the majority of video models create footage without any accompanying audio. Adobe is addressing this with a nifty little feature that allows users to first describe the sound effect they want to generate and then record themselves making it. The second part isn't so Adobe's model can mimic the sound. Rather, it's so the system can get a better idea of the intensity and timing the user wants from the effect.

In the demo Adobe showed me, one of the company's employees used the feature to add the sound of a zipper being unzipped. They made a "zzzztttt" sound, which Adobe's model faithfully used to reproduce the effect at the intended volume. The translation was less convincing when the employee used the tool to add the sound of footsteps on concrete, though if you're using the feature for ideation as Adobe intended, that may not matter. When adding sound effects, there's a timeline editor along the bottom of the interface to make it easy to time the audio properly.

With Firefly's June update, users can upload images or videos to guide their video generation.
Adobe

The other new features Adobe is adding today are called Composition Reference, Keyframe Cropping and Video Presets. The first of those allows you to upload a video or image you captured to guide the generation process. In combination with Video Presets, you can define the style of the final output. Some of the options Adobe is offering at launch allow you to create clips with anime, black and white or vector art styles. Lastly, with Keyframe Cropping you can upload the first and final frame of a video and select an aspect ratio. Firefly will then generate a video that stays within your desired format.

In June, Adobe added support for additional third-party models, and this month it's doing the same. Most notable is the inclusion of Veo 3, which Google premiered at its I/O 2025 conference in May. At the moment, Veo 3 is one of the only AI models that can generate video with sound. Like with all the other partner models Adobe offers in Firefly, Google has agreed not to use data from Adobe users for training future models. Every image and video people create through Firefly is digitally signed with the model that was used to create it. That is one of the safeguards Adobe includes so that Firefly customers don't accidentally ship an asset that infringes on copyrighted material.

According to Zeke Koch, vice president of product management for Adobe Firefly, users can expect the fast pace of updates to continue. "We're relentlessly shipping stuff almost as quickly as we can," he said. Koch added that Adobe will continue to integrate more third-party models, as long as their providers agree to the company's data privacy terms.

This article originally appeared on Engadget at https://www.engadget.com/ai/adobe-firefly-can-now-generate-sound-effects-from-your-audio-cues-130008172.html?src=rss

Β©

Β© Adobe

With Adobe's June update, Firefly users can generate audio effects.

The next Made By Google event (better known as the Pixel launch) is set for August 20

16 July 2025 at 17:50

Google will host its next Made by Google event on August 20, the company announced today. In a media invite, it promised the event would feature new Pixel phones, watches, buds "and more." It's hard to imagine what other product types might be covered by those last two words, but for those who watch the industry closely, this event is likely to see the launch of the Pixel 10 flagship phones, along with a Pixel Watch 4 and new Pixel Buds.

It's easy to make that deduction, especially going by previous Made By Google events. At last year's hardware launch, Google announced the Pixel 9, Pixel 9 Pro, Pixel 9 Pro XL, Pixel 9 Pro Fold, Pixel Watch 3 and Pixel Buds Pro 2.

Between that and the company's invite, we can expect a refresh of nearly the entire Pixel line. As for what the "and more" bit could entail, recent rumors suggest Google is working on a proper response to Apple's MagSafe tech, dubbed Pixelsnap. Android manufacturers have been slow to adopt the Qi2 wireless charging standard, but with the upcoming Pixel 10, it appears the company is working on a host of magnetic Qi2 accessories, including a new charging stand. As always, be sure to visit Engadget on the day of the event as we'll have a liveblog of the entire proceedings.

Update, July 16 2025, 1:50PM ET: This story has been updated to include a list of devices we expect Google to unveil on August 20.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/the-next-made-by-google-event-better-known-as-the-pixel-launch-is-set-for-august-20-162832319.html?src=rss

Β©

Β© Photo by Sam Rutherford / Engadget

All the hardware announced during Google's annual Pixel hardware event is arranged on a white table and look quite shiny and new in black and pastel hues.

xAI starts offering Grok to US government agencies

14 July 2025 at 16:29

Just days after apologizing for Grok's recent hard turn toward antisemitism, xAI has announced a suite of AI products for government use. Grok for Government brings together the company's latest commercial products, including Grok 4 and Deep Search, with special considerations given to the needs of federal, state and local agencies.

To that end, xAI says it will design custom models for specific national security and research customers. It will also develop specialized AI applications for use in healthcare, fundamental science and national defense, as well as offer models that can safely be used in classified and restricted environments.

Announcing Grok for Government - a suite of products that make our frontier models available to United States Government customers

We are especially excited about two new partnerships for our US Government partners

1) a new contract from the US Department of Defense
2) our…

β€” xAI (@xai) July 14, 2025

Despite President Trump threatening to cut Elon Musk's companies off from government subsidies over their recent public feud, xAI says it already has a contract with the US Department of Defense. The company's products are also available to purchase through the General Services Administration schedule, which means every federal government department, agency or office can potentially access its models. OpenAI, which Musk helped fund in its early days as a research lab, launched ChatGPT Gov at the start of the year.

This article originally appeared on Engadget at https://www.engadget.com/ai/xai-starts-offering-grok-to-us-government-agencies-162952893.html?src=rss

Β©

Β© Igor Bonifacic for Engadget

A closeup of the Grok icon on iOS.

Discord's virtual Orbs currency is now available for everyone to earn

14 July 2025 at 13:00

Discord has begun rolling out its in-app Orbs currency to everyone. In conjunction with the platform's Quest system, users can earn Orbs by watching ads on Discord. You can then use the currency to purchase exclusive drip for your profile, including badges, effects and avatars. It's also possible to exchange Orbs for three-day Nitro credits and anything else you can buy on the Discord Shop.

Sometimes developers will also offer the currency in exchange for simply trying out their game, or completing a specific gameplay task. Other rewards include exclusive profile decorations you can only earn by completing an associated Quest. The fastest way to start earning Orbs is by tapping the "Discover" icon in Discord's sidebar and then selecting "Quests." There you will see any promotions Discord is currently running, along with recently completed ones. If you're keen on earning Orbs, be sure to check back often as Discord frequently rotates new Quests in and out.

An overview explaining how Discord's Orbs currency works.
Discord

The online response to Orbs has been about what you would expect. When Discord first announced the currency, most people on the Discord subreddit were either lukewarm on the idea or outright hostile to it. However, the company says users are broadly in favor of it, pointing to a survey it conducted in September 2024, before it began rolling out Orbs to beta testers this past May. The company found 82 percent of users it surveyed said they would like to earn a virtual currency on the platform, with nearly half of respondents saying a virtual currency would improve their overall experience.

In June, Discord CTO Stanislav Vishnevskiy told Engadget the company sees Orbs as a way to give players something in return for their time and attention while aiding game studios with discoverability. In my testing, I've found the system easy enough to ignore if you don't care about customizing your profile, and Orbs aren't necessary to access any of Discord's core functionality.

This article originally appeared on Engadget at https://www.engadget.com/apps/discords-virtual-orbs-currency-is-now-available-for-everyone-to-earn-130043599.html?src=rss

Β©

Β© Igor Bonifacic for Engadget

Discord app icon

How exactly did Grok go full 'MechaHitler?'

10 July 2025 at 15:10

Earlier this week, Grok, X's built-in chatbot, took a hard turn toward antisemitism following a recent update. Amid unprompted, hateful rhetoric against Jews, it even began referring to itself as MechaHitler, a reference to 1992's Wolfenstein 3D. X has been working to delete the chatbot's offensive posts. But it's safe to say many are left wondering how this sort of thing can even happen.

I spoke to Solomon Messing, a research professor at New York University's Center for Social Media and Politics, to get a sense of what may have gone wrong with Grok. Before his current stint in academia, Messing worked in the tech industry, including at Twitter, where he founded the company's data science research team. He was also there for Elon Musk's takeover.

The first thing to understand about how chatbots like Grok work is that they're built on large language models (LLMs) designed to mimic natural language. LLMs are pretrained on giant swaths of text, including books, academic papers and, yes, even social media posts. The training process allows AI models to generate coherent text through a predictive algorithm. However, those predictive capabilities are only as good as the numerical values or "weights" that an AI algorithm learns to assign to the signals it's later asked to interpret. Through a process known as post-training, AI researchers can fine-tune the weights their models assign to input data, thereby changing the outputs they generate.
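
As a toy illustration of that pipeline, imagine a "model" whose weights are just counts of which word follows which in its training text. Real LLMs use neural networks rather than lookup tables, but the sketch below shows the same basic property: predictions mirror the training data.

```python
from collections import Counter, defaultdict

# "Pretraining": learn weights (here, simple bigram counts) from a corpus.
corpus = "the cat sat on the mat and the cat ate".split()
weights = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    weights[prev][nxt] += 1

def predict_next(token: str) -> str:
    # Predict whatever followed `token` most often during training. If the
    # corpus contained hateful text, the predictions would mirror it.
    return weights[token].most_common(1)[0][0]

print(predict_next("the"))  # -> "cat"
```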

"If a model has seen content like this during pretraining, there's the potential for the model to mimic the style and substance of the worst offenders on the internet," said Messing.

In short, the pre-training data is where everything starts. If an AI model hasn't seen hateful, antisemitic content, it won't be aware of the sorts of patterns that inform that kind of speech (including phrases such as "Heil Hitler") and, as a result, it probably won't regurgitate them to the user.

In the statement X shared after the episode, the company admitted there were areas where Grok's training could be improved. "We are aware of recent posts made by Grok and are actively working to remove the inappropriate posts. Since being made aware of the content, xAI has taken action to ban hate speech before Grok posts on X," the company said. "xAI is training only truth-seeking and thanks to the millions of users on X, we are able to quickly identify and update the model where training could be improved."

Screenshots via X

As I saw people post screenshots of Grok's responses, one thought I had was that we were watching a reflection of X's changing userbase. It's no secret xAI has been using data from X to train Grok; easier access to the platform's trove of information is part of the reason Musk said he was merging the two companies in March. What's more, X's userbase has become more right-wing under Musk's ownership of the site. In effect, there may have been a poisoning of the well that is Grok's training data. Messing isn't so certain.

"Could the pre-training data for Grok be getting more hateful over time? Sure, if you remove content moderation over time, the userbase might get more and more oriented toward people who are tolerant of hateful speech [...] thus the pre-training data drifts in a more hateful direction," Messing said. "But without knowing what's in the training data, it's hard to say for sure."

It also wouldn't explain how Grok became so antisemitic after just a single update. On social media, there has been speculation that a rogue system prompt may explain what happened. System prompts are a set of instructions AI model developers give to their chatbots before the start of a conversation. They give the model a set of guidelines to adhere to, and define the tools it can turn to for help in answering a prompt.
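
For a concrete sense of where a system prompt sits, here's a minimal sketch using the OpenAI-style chat API; the model name and instructions are illustrative placeholders, not Grok's actual configuration.

```python
from openai import OpenAI  # pip install openai

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model, not Grok
    messages=[
        # The system prompt: developer guidelines sent before any user input.
        {"role": "system", "content": "You are a helpful assistant. "
                                      "Refuse to produce hateful content."},
        {"role": "user", "content": "Summarize today's tech news."},
    ],
)
print(response.choices[0].message.content)
```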

In May, xAI blamed "an unauthorized modification" to Grok's prompt on X for the chatbot's brief obsession with "white genocide" in South Africa. The fact that the change was made at 3:15AM PT led many to suspect Elon Musk had made the tweak himself. Following the incident, xAI open-sourced Grok's system prompts, allowing people to view them publicly on GitHub. After Tuesday's episode, people noticed xAI had deleted a recently added system prompt that told Grok its responses should "not shy away from making claims which are politically incorrect, as long as they are well substantiated."

Messing also doesn't believe the deleted system prompt is the smoking gun some online believe it to be.

"If I were trying to ensure a model didn't respond in hateful/racist ways I would try to do that during post-training, not as a simple system prompt. Or at the very least, I would have a hate speech detection model running that would censor or provide negative feedback to model generations that were clearly hateful," he said. "So it's hard to say for sure, but if that one system prompt was all that was keeping xAI from going off the rails with Nazi rhetoric, well that would be like attaching the wings to a plane with duct tape."

He added: "I would definitely say a shift in training, like a new training approach or having a different pre-training or post-training setup would more likely explain this than a system prompt, particularly when that system prompt doesn’t explicitly say, 'Do not say things that Nazis would say.'"

On Wednesday, Musk suggested Grok was effectively baited into being hateful. "Grok was too compliant to user prompts," he said. "Too eager to please and be manipulated, essentially. That is being addressed." According to Messing, there is some validity to that argument, but it doesn't provide the full picture. "Musk isn’t necessarily wrong," he said, "There’s a whole art to 'jailbreaking' an LLM, and it’s tough to fully guard against in post-training. But I don’t think that fully explains the set of instances of pro-Nazi text generations from Grok that we saw."

If there's one takeaway from this episode, it's that one of the issues with foundational AI models is just how little we know about their inner workings. As Messing points out, even with Meta's open-weight Llama models, we don't really know what ingredients are going into the mix. "And that's one of the fundamental problems when we're trying to understand what's happening in any foundational model," he said, "we don't know what the pre-training data is."

In the specific case of Grok, we don't have enough information right now to know for sure what went wrong. It could have been a single trigger like an errant system prompt, or, more likely, a confluence of factors that includes the system's training data. However, Messing suspects we may see another incident just like it in the future.

"[AI models] are not the easiest things to control and align," he said. "And if you're moving fast and not putting in the proper guardrails, then you're privileging progress over a sort of care. Then, you know, things like this are not surprising."

This article originally appeared on Engadget at https://www.engadget.com/ai/how-exactly-did-grok-go-full-mechahitler-151020144.html?src=rss

Β©

Β© Igor Bonifacic for Engadget

A closeup of the Grok icon on iOS.

Google's Gemini app can now generate videos from static images

10 July 2025 at 15:00

Starting today, Google is bringing image-to-video generation to the Gemini app. The feature comes courtesy of the company's Veo 3 model, which Google began rolling out more broadly to AI Pro users last week after it was initially only available to AI Ultra subscribers.

To start using Gemini's image-to-video generation, click the "tools" option in the prompt bar and then select "video." Google is currently limiting Veo 3 to producing eight-second clips at 720p. Gemini will output your request in a 16:9 landscape format, so the resulting clips won't be great for sharing on social media β€” unlike those generated by TikTok's AI Alive feature, for example. However, Veo 3 is currently one of the only AI models capable of generating synced audio alongside the video it creates.

You can also use Veo 3's image-to-video generation feature in Flow, Google's AI filmmaking app. As of today, the program is available in 75 additional countries. Over in the Gemini app, image-to-video generation is rolling out on the web today. Google expects most mobile users will have access by the end of the week. A $20 per month Google AI Pro or $250 per month AI Ultra subscription is required to use the new feature.

This article originally appeared on Engadget at https://www.engadget.com/ai/googles-gemini-app-can-now-generate-videos-from-static-images-150052396.html?src=rss

Β©

Β© Google

Google's Veo 3 model transforms an image of a cardboard box into a video where that box erupts with colorful confetti.

Crunchyroll blames third-party vendor for AI subtitle mess

3 July 2025 at 19:57

At the start of last year, Crunchyroll President Rahul Purini told The Verge the company was "very focused on testing" generative AI tools for subtitling and captioning speech to text. The comment came just months after the streamer temporarily took down the debut episode of one of its newest shows, The Yuzuki Family's Four Sons, after people complained about poor subtitles.

Much of the translation was nonsensical, with missing punctuation in many sentences. At the time, some fans speculated the company had used AI to translate the episode. Earlier this week, fresh accusations of AI use came up when an episode of a new anime showed evidence that ChatGPT was used to write the subtitles.

The German subtitles for Necronomico and the Cosmic Horror Show read, "ChatGPT said..."
Igor Bonifacic for Engadget

On July 1, Bluesky user Pixel spotted an issue with the German subtitles for Necronomico and the Cosmic Horror Show, one of the new series Crunchyroll is streaming this anime season. Beyond a general sloppiness, one line began with the words "ChatGPT said..." during a pivotal scene in the show's debut episode. Engadget was able to independently verify the episode contains the AI-generated translation. If you're curious, the English subtitles aren't much better, as seen in the screenshots above and below.

"We were made aware that AI-generated subtitles were employed by a third-party vendor, which is in violation of our agreement," a Crunchyroll spokesperson told Engadget. "We are investigating the matter and are working to rectify the error."

People were understandably upset about the subtitles. Crunchyroll subscriptions start at $8 per month, and since its acquisition by Sony, the service has been the dominant player in the anime streaming market outside of Japan. "This is not acceptable. How can we be expected to pay for a service that clearly doesn't care about the quality of its products?" wrote Pixel in their original post. As of the writing of this article, their post has been quoted more than 300 times and reposted by thousands of other people. Many fans say they're turning to torrented fansubs, calling the official AI-generated translations "unwatchable." People on Reddit have expressed similar frustrations.

Crunchyroll

Ironically, when Purini revealed Crunchyroll was testing generative AI tools for subtitles, he said part of the motivation was to prevent piracy. He reasoned the tech would allow the company to start streaming new, translated anime episodes as close to their original Japanese release as possible, adding that the lag before official releases was sometimes what pushed fans to torrent shows.

Update 3:58PM ET: Added comment from Crunchyroll.

Have a tip for Igor? You can reach him by email, on Bluesky or send a message to @Kodachrome.72 to chat confidentially on Signal.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/streaming/crunchyroll-blames-third-party-vendor-for-ai-subtitle-mess-145621606.html?src=rss

Β©

Β© Crunchyroll

A translation that reads "Those two have seen them in a video before."

Perplexity joins Anthropic and OpenAI in offering a $200 per month subscription

2 July 2025 at 19:17

You can add Perplexity to the growing list of AI companies offering $200+ per month subscription plans to users who want unlimited access to their most advanced products and tools. As of today, Perplexity Max is available on iOS and the web.

The subscription comes with unlimited monthly usage of Labs, the agentic creation tool Perplexity released this past May. People can use Labs to generate spreadsheets, presentations, web applications and more. Perplexity is also promising early access to new features, including Comet, a new web browser the company claims will be a "powerful thought partner for everything you do on the web." The company adds that Max subscribers will receive priority customer support, as well as access to top frontier models from partners like Anthropic and OpenAI.

Perplexity will continue to offer its existing Pro plan, which remains $20 per month. Admittedly, the company is courting a small demographic with the new subscription, noting it's primarily designed for content designers, business strategists and academic researchers.

OpenAI was the first to open the floodgates of very expensive AI subscriptions when it began offering its ChatGPT Pro plan at the end of last year. Since then, Anthropic and Google have followed suit.

This article originally appeared on Engadget at https://www.engadget.com/ai/perplexity-joins-anthropic-and-openai-in-offering-a-200-per-month-subscription-191715149.html?src=rss

Β©

Β© Perplexity

Perplexity Max art showing a stylized picture of a man running.

How to buy a GPU in 2025

2 July 2025 at 16:01

One of the trickiest parts of any new computer build or upgrade is finding the right video card. In a gaming PC, the GPU is easily the most important component, and you can hamstring your experience by buying the wrong model. The buying process can be frustrating, with many manufacturers selling their models above their suggested retail price. In this guide, we'll help you navigate the market and find the right GPU for your needs.

It's all about the games

The first question to ask yourself is what kind of games you want to play. Competitive shooters like Valorant, Overwatch and Marvel Rivals were designed to run on older hardware. As such, even entry-level GPUs like the GeForce RTX 5060 can push those games at 120 frames per second and above at 1080p (more on why that's important in a moment).

By contrast, if you want to play modern, single-player games with ray tracing and other graphical extras, you'll need a more powerful GPU. Just how much more powerful will depend on the resolution of your monitor.

A 1440p monitor has 78 percent more pixels than a 1080p screen, and a 4K display has more than twice as many pixels as a QHD panel. In short, running a game at 4K, especially at anything above 60 frames per second, is demanding, and most GPUs will need to use upscaling techniques like NVIDIA's Deep Learning Super Sampling (DLSS) and AMD's FidelityFX Super Resolution (FSR) to push new games at high refresh rates.
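
If you want to check that math, the pixel counts work out like this:

```python
fhd = 1920 * 1080  # 2,073,600 pixels (1080p)
qhd = 2560 * 1440  # 3,686,400 pixels (1440p/QHD)
uhd = 3840 * 2160  # 8,294,400 pixels (4K/UHD)

print(f"{qhd / fhd - 1:.0%}")  # ~78% more pixels going 1080p -> 1440p
print(f"{uhd / qhd:.2f}x")     # 2.25x the pixels going 1440p -> 4K
```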

While we're on the subject of resolution, it doesn't make sense to spend a lot of money on a 4K monitor only to pair it with an inexpensive GPU. That's a recipe for a bad experience. As you're shopping for a new video card, you should think about the resolution and frame rate you want to play your games. If you're in the market for both a GPU and display, be sure to check out our guide to the best gaming monitors.

If your budget allows, a good bet is to buy a midrange card that can comfortably render all but the most demanding games at 1440p and at least 144 frames per second. Put another way, you want a GPU that can saturate a monitor at its native resolution and refresh rate in as many games as possible. That will give you the smoothest possible experience in terms of motion clarity, and allow you to dabble in both competitive shooters and the latest single-player games as the mood strikes you.

NVIDIA vs AMD and Intel

Intel Arc B580 label view
Photo by Devindra Hardawar/Engadget

One of the confusing aspects of the GPU industry is the number of players involved. What you need to know is that there are three main companies: AMD, Intel and NVIDIA. They design the cards you can buy, but delegate the manufacturing of them to so-called add-in board (AIB) partners like ASUS, XFX, Gigabyte and others.

As you can probably imagine, this creates some headaches. The most annoying is that AMD, Intel and NVIDIA will often set recommended prices for their graphics cards, only for their partners to sell their versions of those GPUs above the manufacturer's suggested retail price (MSRP). For example, NVIDIA's website lists the RTX 5070 with a starting price of $549. On Newegg, there are no 5070s listed at that price. The only models anywhere close to $549 are open box specials. If you want one that comes sealed, it will cost you at least $600.

As for which company you should buy your new GPU from, before 2025, NVIDIA was the undisputed king of the market. Specific GeForce cards may not have offered the best rasterization performance in their price range, but between their performance in games with ray tracing and the fact NVIDIA was ahead on features like DLSS, an RTX GPU was a safe bet.

However, with this year's RTX 50 series release, other than models like the RTX 5080 and 5090 where there's no competition, it's safe to say NVIDIA missed the mark this generation. If you're in the market for an entry- or mid-level GPU, AMD and Intel offer better value, with cards that come with enough VRAM for now and into the future. That said, there are still a few reasons you might consider an NVIDIA GPU, starting with ray tracing.

Ray tracing

For decades, developers have used rasterization techniques to approximate how light behaves in the real world, and the results have been commendable. But if you know what to look for, it's easy to see where the illusion falls apart. For that reason, real-time ray tracing has been a goal of the industry for years, and in 2018 it became a reality with NVIDIA's first RTX cards.

In some games, effects like ray-traced reflections and global illumination are transformational. Unfortunately, those features are expensive to run, often incurring a significant frame-rate drop without upscaling. Since ray tracing was optional in many games before 2025, you could save money by buying an AMD GPU. For example, even if the RX 7800 XT was worse at ray tracing than the RTX 4070, the former was often cheaper to buy, had more onboard VRAM and offered as good or better rasterization performance in many games.

However, you can't ignore ray tracing performance anymore. We're starting to see releases like Doom: The Dark Ages where the tech is an integral part of a game's rendering pipeline, and more are likely to follow in the future. Thankfully, AMD's newest cards are much better in that regard, though you'll still get an edge running an NVIDIA model. For that reason, if ray tracing is important to you, NVIDIA cards are still the way to go.

Refresh rates and frame rates

If you're new to the world of PC gaming, it can be tricky to wrap your head around refresh rates. In short, the higher the refresh rate of a monitor, the more times it can update the image it displays on screen every second, thereby producing a smoother moving picture.

For example, moving elements on a monitor with a 240Hz refresh rate will look better than on one with a 120Hz refresh rate. However, that's all contingent on your GPU being able to consistently render a game at the appropriate frame rates. In the case of a 120Hz monitor, you want a GPU with enough headroom to drive most games at 120 fps. Realistically, most video cards won't be able to achieve that in every game, but it's a good baseline to aim for when shopping for a new GPU.
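
The arithmetic behind that baseline is simple: the refresh rate sets the time budget your GPU has to finish each frame.

```python
# Each refresh gives the GPU a fixed window to deliver the next frame.
for hz in (60, 120, 144, 240):
    print(f"{hz}Hz -> {1000 / hz:.2f} ms per frame")
# 60Hz -> 16.67 ms, 120Hz -> 8.33 ms, 144Hz -> 6.94 ms, 240Hz -> 4.17 ms
```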

Upscaling and latency

I've mentioned DLSS a few times already. Alongside FSR and Intel XeSS, DLSS is an example of what's known as an image reconstruction technology. More and more, native rendering is going out of fashion in game design. With ray tracing and other modern effects enabled, even the most powerful GPUs can struggle to render a game at 1440p or 4K at a playable frame rate. That's why many developers will turn to DLSS, FSR or XeSS to eke out additional performance by upscaling a lower resolution image to QHD or UHD.

Upscaling in games is nothing new. For example, the PS4 Pro used a checkerboard technique to output games in 4K. What is different now is how modern GPUs go about it. With DLSS, NVIDIA pioneered an approach that uses machine learning to recreate an image at a higher resolution, and in the process, addressed some of the pitfalls of past upscaling methods. If you're sensitive to these sorts of things, there's still blur and shimmer with DLSS, FSR and XeSS, but it's much less pronounced and can lead to significant performance gains.
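
To see why reconstruction helps so much, compare how many pixels are actually shaded at common upscaling ratios. The per-axis scale factors below are the ones commonly cited for "Quality" (2/3) and "Performance" (1/2) modes; treat them as illustrative rather than exact for any one vendor.

```python
target = 3840 * 2160  # native 4K output

for mode, scale in (("Quality", 2 / 3), ("Performance", 1 / 2)):
    rendered = int(3840 * scale) * int(2160 * scale)
    print(f"{mode}: shades {rendered / target:.0%} of a native 4K frame")
# Quality: 44%; Performance: 25% -> the reconstruction model fills the rest
```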

To DLSS, NVIDIA later added single and multi-frame generation. DLSS is only available on NVIDIA cards, and following the recent release of DLSS 4, it is widely considered to offer the best image quality. That's another reason why you might choose an NVIDIA card over one of its competitors. However, if you decide to go with an AMD GPU, don't feel like you're missing out. The company recently released FSR 4. While it's not quite on par with DLSS 4 in terms of support and image quality, it's a major leap over FSR 3 and FSR 2.

While on the subject of DLSS, I'll also mention NVIDIA Reflex. It's a latency-reducing technology NVIDIA introduced in 2020. AMD has its own version called Radeon Anti-Lag, but here again Team Green has a slight edge thanks to the recent release of Reflex 2. If you're serious about competitive games, Reflex 2 can significantly reduce input lag, which will make it easier to nail your shots in Counter-Strike 2, Valorant and other shooters.

Driver support

Previously, one of the reasons to pick an NVIDIA GPU over the competition was the company's solid track record of driver support. With one of its video cards, you were less likely to run into stability issues and games failing to launch. In 2025, however, NVIDIA's drivers have been abysmal, with users reporting frequent bugs and stability problems. So if you care about stability, AMD has a slight edge right now.

VRAM

As you're comparing different GPUs, especially those in the same tier, pay close attention to the amount of VRAM they offer. Modern games will eat up as much VRAM as a GPU can offer, and if your card has a low amount, such as 8GB, you're likely to run into a performance bottleneck.

If your budget allows for it, always go for the model with more VRAM. Consider, for instance, the difference between the $299 RTX 5060 and the $429 RTX 5060 Ti. I know spending an extra $130 — close to 50 percent more — on the 5060 Ti is going to be a lot for some people, but it's the difference between a card that is barely adequate for recent releases and one that will last you a few years, and it largely comes down to the amount of VRAM in each: 8GB on the 5060 versus 16GB on the 5060 Ti. Simply put, more is better.

A slight caveat to this is when comparing models with different memory bandwidths. A card that can read from its memory faster can sometimes outperform one with a larger but slower pool of VRAM. Here, you'll want to read reviews of the models you're comparing to see how they perform in different games.

Size and power draw

Modern GPUs are big. Most new cards will take up at least two PCI slots on the back of your motherboard. They can also vary dramatically in length, depending on the number of fans the add-in board (AIB) partner has used to cool the PCB. To be safe, check the length of the card you want to buy against the maximum clearance listed by your case manufacturer. If you have a radiator at the front of your case, you will also need to factor its size into your measurements. The last thing you want is to buy a card that doesn't fit.
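
If you want to be methodical about it, the check is simple arithmetic. The numbers below are placeholders for illustration, not specs for any particular case, radiator or card:

```python
# Will the card fit? All lengths in millimeters; values are examples only.
max_gpu_clearance = 360  # from your case's spec sheet
front_radiator = 30      # radiator + fans eating into that clearance
card_length = 304        # from the GPU's product page

if card_length <= max_gpu_clearance - front_radiator:
    print("Fits, with room to spare.")
else:
    print("Too long. Pick a shorter card or relocate the radiator.")
```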

Lastly, be sure to check the recommended power supply for the card you want. As a rule of thumb, unless you know what you're doing, it's best to just stick with the manufacturer's recommendation. For instance, NVIDIA suggests pairing the RTX 5070 with a 750 watt PSU. So if you're currently running a 650 watt unit, you'll need to factor in the price of a PSU upgrade with your new GPU.
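
For the curious, here's roughly how a recommendation like that shakes out. The 250W figure is NVIDIA's published total graphics power for the RTX 5070; the CPU and "everything else" numbers are my own ballpark assumptions:

```python
# Ballpark PSU sizing. Only the GPU's 250W TGP is an official spec;
# the other draws are rough assumptions for a typical gaming build.
gpu_tgp_w = 250        # RTX 5070 total graphics power
cpu_w = 150            # midrange CPU under gaming load (assumption)
rest_of_system_w = 75  # motherboard, RAM, drives, fans (assumption)

steady_state = gpu_tgp_w + cpu_w + rest_of_system_w
# GPUs spike well above TGP for milliseconds at a time, so build in headroom.
recommended = steady_state * 1.5
print(f"~{steady_state}W sustained -> shop for a ~{recommended:.0f}W PSU")
# ~475W sustained -> shop for a ~712W PSU (in line with NVIDIA's 750W advice)
```

The exact multiplier is debatable, but the takeaway isn't: a PSU that just barely covers your sustained draw leaves no room for transient spikes.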

Should you buy a used GPU?


It depends. If you can find a deal on an old RTX 40 series GPU, then yes. NVIDIA's RTX 50 series cards don't offer greatly improved performance over their predecessors, and with most models selling for more than their suggested retail price, it's not a great time to buy a new NVIDIA card.

That said, I suspect finding a good deal on a used GPU will be difficult. Most people will know the value of what they have, and considering the current market, will probably try to get as much as they can for their old card.

You may find better deals on older AMD and Intel GPUs, but I think you're better off spending more now on a new model from one of those companies since the generational gains offered by their latest cards are much more impressive. Simply put, the 9070 XT and B580 are two of the best cards you can buy right now.

Anything older than a card from NVIDIA's 40 series or AMD's RX 6000 family is not worth considering. Unless your budget is extremely tight or you mostly play older games, you're much better off spending more to buy a new card that will last you longer.

When is a good time to buy a new GPU?

If you've read up to this point, you're probably wondering if it's even worth buying a GPU right now. The answer is (unsurprisingly) complicated. There are a handful of great cards, like the Intel B580 and Radeon 9070 XT, that are absolutely worth buying. The problem is that finding any GPU at a price approaching those set by AMD, Intel or NVIDIA is really tough. To make things worse, uncertainty around President Trump's tariff policies is likely to push prices even higher. If you own a relatively recent GPU, you're probably best off trying to hold onto your current card until things settle down.

However, if your GPU isn't cutting it anymore, you face a difficult decision: overpay now, or wait and potentially pay even more later. As much as I'm reluctant to recommend a prebuilt PC, if you're already planning to build a new computer, it's worth exploring your options there since you might end up saving money on a video card when it's bundled together with all the other components you need.

The best GPUs for 2025: Engadget recommendations

Entry-level (1080p) GPUs

As we mentioned above, if you're only aiming to play basic competitive shooters like Valorant and Overwatch 2 in 1080p, an entry-level GPU may be all you need. While 1080p isn't an ideal resolution when it comes to sharpness, many gamers prefer it since it's easier to reach higher framerates. And it also helps that 1080p gaming monitors, like the AOC 24G15N 24-inch we recommend, tend to offer speedy refresh rates for between $100 and $200. When you're zipping through matches, you likely won't have time to take a breath and appreciate the detail from higher resolutions.

Here are our recommendations for entry-level video cards.

Midrange (1440p) GPUs

While entry-level cards can dabble with 1440p gaming, it's worth stepping up to something a bit more powerful if you actually want to achieve higher refresh rates. For most gamers, 1440p is the best balance between sharpness and high framerates. It looks noticeably better than 1080p, and doesn't require the horsepower overhead of 4K. (And there's a good chance you won't really see a visual difference with the jump to 4K.)
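
The arithmetic backs that up. Here's a quick look (my own sketch) at how many pixels your GPU has to shade per frame at each resolution:

```python
# Pixels your GPU has to shade per frame, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f}M pixels, {w * h / base:.2f}x the work of 1080p")
# 1080p: 2.1M (1.00x) | 1440p: 3.7M (1.78x) | 4K: 8.3M (4.00x)
```

That 4x jump from 1080p to 4K is the "horsepower overhead" in concrete terms, while 1440p asks for less than twice the work.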

Here are our recommendations for midrange GPUs.

High-end (4K) GPUs

If you want to experience the best of what modern PC games have to offer, including 4K resolution and all of the benefits of ray tracing, be ready to spend big bucks on a high-end GPU. If you're going this route, though, be sure you're also gaming on a high-end monitor that befits these powerful GPUs.

Here are our recommendations for premium GPUs.

Super high-end/Money isn't real GPUs

Listen, there's only one choice here and it's NVIDIA's enormously powerful and fantastically expensive RTX 5090. It's an absolute beast, with 32GB of VRAM and the most hardware NVIDIA has ever stuffed into a consumer GeForce GPU. The RTX 5090 doesn't make sense for 99 percent of gamers β€” especially since it's now going for $3,000, up from its $2,000 launch price β€” but if you have the cash to spare, it'll certainly earn you bragging rights. (Check out our NVIDIA RTX 5090 review.)

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/how-to-buy-a-gpu-160100017.html?src=rss

©

© Devindra Hardawar for Engadget

NVIDIA RTX 5070 Ti

NVIDIA's RTX 5050 arrives early in laptops from Acer, MSI and more

24 June 2025 at 14:33

NVIDIA's add-in board partners won't start selling the GeForce RTX 5050 until mid-July, but it looks like the company has given the early go-ahead to OEMs to start announcing laptops with the new entry-level GPU. Wccftech and Videocardz report that 5050-equipped laptops are available to order in China as of this morning from domestic manufacturers like Mechrevo.

Over in the US, companies like MSI and Acer have begun announcing their own RTX 5050 laptops. The former, for instance, will sell the Katana 15 for $999 through Walmart. Alongside the 5050, it features a Core i7-14650HX processor, 16GB of RAM and a 144Hz display. We've reached out to NVIDIA for more information on global availability, and we'll update this article once we learn more.

In the meantime, the Chinese listings give us a good idea of what to expect from the new GPU. It features 2,560 CUDA cores, 8GB of GDDR7 VRAM and a TDP of 115W. The memory spec is interesting. Before today's announcement, the desktop variant of the 5050 was rumored to include GDDR6 memory. The fact that the laptop version has GDDR7 VRAM suggests its sibling will get it as well, since it wouldn't make much sense for NVIDIA to hobble the desktop card in that way. With a 128-bit interface, the RTX 5050 should have a memory bandwidth of 384 GB/s, putting it on par with the 5060 mobile in that department.
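
For those keeping score, that bandwidth figure is easy to sanity-check: peak memory bandwidth is just the bus width in bytes multiplied by the per-pin data rate. Note that 24 Gbps GDDR7 modules are our inference from the 384 GB/s figure, not a confirmed NVIDIA spec:

```python
# Peak memory bandwidth = bus width (in bytes) x per-pin data rate.
# The 24 Gbps module speed is inferred from the listings, not official.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gbps

print(peak_bandwidth_gbs(128, 24))  # 384.0 GB/s for the laptop RTX 5050
```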

As for performance, the 5050 laptop should land somewhere between the 4050 and 5060, with decent generational gains on offer but nothing too exciting. This being an entry-level card, the fact that it only comes with 8GB of VRAM is more understandable, and it fits the bill for a GPU most people will only use for occasional gaming.

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/nvidias-rtx-5050-arrives-early-in-laptops-from-acer-msi-and-more-143309816.html?src=rss

©

© NVIDIA

NVIDIA Blackwell GPU core.

The Tacx Alpine is a $1,100 gradient simulator for your Garmin smart bike trainer

24 June 2025 at 11:00

Cycling season may have only just started, but that’s not stopping Garmin from looking ahead to when all the roadies need to take their bikes indoors. On Tuesday, the company announced the Tacx Alpine, an indoor gradient simulator for its family of NEO smart trainers. The accessory can replicate inclines of up to 25 percent and declines of as much as 10 percent, with adjustments made either manually through a built-in control panel or automatically when using the device with compatible apps like Zwift and Garmin’s own Tacx Training software.

In those same apps, the Tacx Alpine also allows for real-time virtual steering adjustments. Naturally, Garmin Connect support is also included for stat tracking and more. In short, the Tacx Alpine is designed for those who want to spice up their off-season training, since pedaling a road bike on an indoor trainer is about the most boring thing ever.

Garmin says mounting the front of your bike on the device is easy. Inside the box, you’ll find adapters for both quick-release skewers (9 x 100) and thru-axles (12 x 100, 15 x 100 and 15 x 110), so the Tacx Alpine will work with most modern road bikes. However, trainer compatibility is limited to Garmin’s NEO 2T and NEO 3M models. Those cost $1,400 and $2,000 new, respectively, and the accessory itself will set you back $1,100, with Garmin not planning to offer bundles at launch.

That might seem like a lot to pay for an accessory designed to make your indoor rides less monotonous, but it’s broadly comparable with the rest of the industry. Wahoo, for instance, sells its Kickr Climb simulator for $750, but it doesn’t come with a steering feature. Meanwhile, the Elite Rizer, which offers both steering and gradient simulation, costs $1,000. Either way, if you’re looking at one of these, chances are you already spent a pretty penny on a fancy carbon road bike, and the thought of dropping another $1,000 on your hobby doesn’t faze you.

This article originally appeared on Engadget at https://www.engadget.com/wearables/the-tacx-alpine-is-a-1100-gradient-simulator-for-your-garmin-smart-bike-trainer-110041344.html?src=rss

©

© Garmin

Garmin Tacx Alpine