
How exactly did Grok go full 'MechaHitler?'

10 July 2025 at 15:10

Earlier this week, Grok, X's built-in chatbot, took a hard turn toward antisemitism following a recent update. Amid unprompted, hateful rhetoric against Jews, it even began referring to itself as MechaHitler, a reference to 1992's Wolfenstein 3D. X has been working to delete the chatbot's offensive posts. But it's safe to say many are left wondering how this sort of thing can even happen.

I spoke to Solomon Messing, a research professor at New York University's Center for Social Media and Politics, to get a sense of what may have gone wrong with Grok. Before his current stint in academia, Messing worked in the tech industry, including at Twitter, where he founded the company's data science research team. He was also there for Elon Musk's takeover.

The first thing to understand about how chatbots like Grok work is that they're built on large language models (LLMs) designed to mimic natural language. LLMs are pretrained on giant swaths of text, including books, academic papers and, yes, even social media posts. The training process allows AI models to generate coherent text through a predictive algorithm. However, those predictive capabilities are only as good as the numerical values or "weights" that an AI algorithm learns to assign to the signals it's later asked to interpret. Through a process known as post-training, AI researchers can fine-tune the weights their models assign to input data, thereby changing the outputs they generate.
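To make that prediction step concrete, here's a minimal sketch of how a model turns learned scores into a next-word choice; the vocabulary and numbers are invented for illustration.

```python
import numpy as np

# Toy vocabulary and raw scores ("logits") a model might produce for the
# next token. Both are invented for illustration.
vocab = ["the", "cat", "sat", "slur"]
logits = np.array([2.0, 1.5, 0.5, -3.0])

# Softmax turns the scores into a probability distribution over the next token.
probs = np.exp(logits) / np.exp(logits).sum()

for token, p in zip(vocab, probs):
    print(f"{token}: {p:.3f}")

# Post-training nudges the weights so that undesirable tokens score lower,
# and therefore almost never get sampled.
```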

"If a model has seen content like this during pretraining, there's the potential for the model to mimic the style and substance of the worst offenders on the internet," said Messing.

In short, the pre-training data is where everything starts. If an AI model hasn’t seen hateful, antisemitic content, it won’t be aware of the sorts of patterns that inform that kind of speech — including phrases such as "Heil Hitler" — and, as a result, it probably won't regurgitate them to the user.

In the statement X shared after the episode, the company admitted there were areas where Grok's training could be improved. "We are aware of recent posts made by Grok and are actively working to remove the inappropriate posts. Since being made aware of the content, xAI has taken action to ban hate speech before Grok posts on X," the company said. "xAI is training only truth-seeking and thanks to the millions of users on X, we are able to quickly identify and update the model where training could be improved."

[Image: Screenshots via X]

As I saw people post screenshots of Grok's responses, one thought I had was that what we were watching was a reflection of X's changing userbase. It's no secret xAI has been using data from X to train Grok; easier access to the platform's trove of information is part of the reason Musk said he was merging the two companies in March. What's more, X's userbase has become more right wing under Musk's ownership of the site. In effect, there may have been a poisoning of the well that is Grok's training data. Messing isn't so certain.

"Could the pre-training data for Grok be getting more hateful over time? Sure, if you remove content moderation over time, the userbase might get more and more oriented toward people who are tolerant of hateful speech [...] thus the pre-training data drifts in a more hateful direction," Messing said. "But without knowing what's in the training data, it's hard to say for sure."

It also wouldn't explain how Grok became so antisemitic after just a single update. On social media, there has been speculation that a rogue system prompt may explain what happened. System prompts are a set of instructions AI model developers give to their chatbots before the start of a conversation. They give the model a set of guidelines to adhere to, and define the tools it can turn to for help in answering a prompt.
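For a sense of what that looks like in practice, here's a simplified, hypothetical example of the message structure many chatbot APIs use; the wording is invented and is not Grok's actual prompt.

```python
# A simplified, hypothetical example of the message format many chatbot
# APIs use. The wording below is invented; it is not Grok's actual prompt.
conversation = [
    {
        "role": "system",
        "content": "You are a helpful assistant. Be direct and concise. "
                   "Refuse to produce hateful or harassing content.",
    },
    {"role": "user", "content": "Summarize today's top tech story."},
]

# The system message is prepended to every exchange, so changing a single
# instruction here shifts the tone of all of the model's responses.
```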

In May xAI blamed "an unauthorized modification" to Grok's prompt on X for the chatbot's brief obsession with "white genocide" in South Africa. The fact that the change was made at 3:15AM PT made many suspect Elon Musk had made the tweak himself. Following the incident, xAI open sourced Grok's system prompts, allowing people to view them publicly on GitHub. After Tuesday's episode, people noticed xAI had deleted a recently added system prompt that told Grok its responses should "not shy away from making claims which are politically incorrect, as long as they are well substantiated."

Messing also doesn't believe the deleted system prompt is the smoking gun some online believe it to be.

"If I were trying to ensure a model didn't respond in hateful/racist ways I would try to do that during post-training, not as a simple system prompt. Or at the very least, I would have a hate speech detection model running that would censor or provide negative feedback to model generations that were clearly hateful," he said. "So it's hard to say for sure, but if that one system prompt was all that was keeping xAI from going off the rails with Nazi rhetoric, well that would be like attaching the wings to a plane with duct tape."

He added: "I would definitely say a shift in training, like a new training approach or having a different pre-training or post-training setup would more likely explain this than a system prompt, particularly when that system prompt doesn’t explicitly say, 'Do not say things that Nazis would say.'"

On Wednesday, Musk suggested Grok was effectively baited into being hateful. "Grok was too compliant to user prompts," he said. "Too eager to please and be manipulated, essentially. That is being addressed." According to Messing, there is some validity to that argument, but it doesn't provide the full picture. "Musk isn’t necessarily wrong," he said. "There’s a whole art to 'jailbreaking' an LLM, and it’s tough to fully guard against in post-training. But I don’t think that fully explains the set of instances of pro-Nazi text generations from Grok that we saw."

If there's one takeaway from this episode, it's that one of the issues with foundational AI models is just how little we know about their inner workings. As Messing points out, even with Meta's open-weight Llama models, we don't really know what ingredients are going into the mix. "And that's one of the fundamental problems when we're trying to understand what's happening in any foundational model," he said, "we don't know what the pre-training data is."

In the specific case of Grok, we don't have enough information right now to know for sure what went wrong. It could have been a single trigger like an errant system prompt, or, more likely, a confluence of factors that includes the system's training data. However, Messing suspects we may see another incident just like it in the future.

"[AI models] are not the easiest things to control and align," he said. "And if you're moving fast and not putting in the proper guardrails, then you're privileging progress over a sort of care. Then, you know, things like this are not surprising."

This article originally appeared on Engadget at https://www.engadget.com/ai/how-exactly-did-grok-go-full-mechahitler-151020144.html?src=rss

[Image: A closeup of the Grok icon on iOS. © Igor Bonifacic for Engadget]

Google's Gemini app can now generate videos from static images

10 July 2025 at 15:00

Starting today, Google is bringing image-to-video generation to the Gemini app. The feature comes courtesy of the company's Veo 3 model, which Google began rolling out more broadly to AI Pro users last week after it was initially only available to AI Ultra subscribers.

To start using Gemini's image-to-video generation, click the "tools" option in the prompt bar and then select "video." Google is currently limiting Veo 3 to producing eight-second clips at 720p. Gemini will output your request in a 16:9 landscape format, so the resulting clips won't be great for sharing on social media — unlike those generated by TikTok's AI Alive feature, for example. However, Veo 3 is currently one of the only AI models capable of generating synced audio alongside the video it creates.

You can also use Veo 3's image-to-video generation feature in Flow, Google's AI filmmaking app. As of today, the program is available in 75 additional countries. Over in the Gemini app, image-to-video generation is rolling out on the web today. Google expects most mobile users will have access by the end of the week. A $20 per month Google AI Pro or $250 per month AI Ultra subscription is required to use the new feature.

This article originally appeared on Engadget at https://www.engadget.com/ai/googles-gemini-app-can-now-generate-videos-from-static-images-150052396.html?src=rss

[Image: Google's Veo 3 model transforms an image of a cardboard box into a video where that box erupts with colorful confetti. © Google]

Crunchyroll blames third-party vendor for AI subtitle mess

3 July 2025 at 19:57

At the start of last year, Crunchyroll President Rahul Purini told The Verge the company was "very focused on testing" generative AI tools for subtitling and captioning speech to text. The comment came just months after the streamer temporarily took down the debut episode of one of its newest shows, The Yuzuki Family's Four Sons, after people complained about poor subtitles. 

Much of the translation was nonsensical, with missing punctuation in many sentences. At the time, some fans speculated the company had used AI to translate the episode. Earlier this week, fresh accusations of AI use came up when an episode of a new anime showed evidence that ChatGPT was used to write the subtitles.

[Image: A German subtitle for Necronomico and the Cosmic Horror Show reading, "ChatGPT said..." Igor Bonifacic for Engadget]

On July 1, Bluesky user Pixel spotted an issue with the German subtitles for Necronomico and the Cosmic Horror Show, one of the new series Crunchyroll is streaming this anime season. Beyond a general sloppiness, one line began with the words "ChatGPT said..." during a pivotal scene in the show's debut episode. Engadget was able to independently verify the episode contains the AI-generated translation. If you're curious, the English subtitles aren't much better, as seen in the screenshots above and below.

"We were made aware that AI-generated subtitles were employed by a third-party vendor, which is in violation of our agreement," a Crunchyroll spokesperson told Engadget. "We are investigating the matter and are working to rectify the error."

People were understandably upset about the subtitles. Crunchyroll subscriptions start at $8 per month, and since its acquisition by Sony, the service has been the dominant player in the anime streaming market outside of Japan. "This is not acceptable. How can we be expected to pay for a service that clearly doesn't care about the quality of its products?" wrote Pixel in their original post. As of this writing, their post has been quoted more than 300 times and reposted by thousands of other people. Many fans say they're turning to torrented fansubs, calling the official AI-generated translations "unwatchable." People on Reddit have expressed similar frustrations.

[Image: A translation that reads "Those two have seen them in a video before." Crunchyroll]

Ironically, when Purini revealed Crunchyroll was testing generative AI tools for subtitles, he said part of the motivation was to prevent piracy. He reasoned the tech would allow the company to start streaming new, translated anime episodes as close to their original Japanese release as possible, adding that the lag between official releases was sometimes what pushed fans to torrent shows.

Update 3:58PM ET: Added comment from Crunchyroll.  

Have a tip for Igor? You can reach him by email, on Bluesky or send a message to @Kodachrome.72 to chat confidentially on Signal.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/streaming/crunchyroll-blames-third-party-vendor-for-ai-subtitle-mess-145621606.html?src=rss

[Image: A translation that reads "Those two have seen them in a video before." © Crunchyroll]

Perplexity joins Anthropic and OpenAI in offering a $200 per month subscription

2 July 2025 at 19:17

You can add Perplexity to the growing list of AI companies offering $200+ per month subscription plans to users who want unlimited access to their most advanced products and tools. As of today, Perplexity Max is available on iOS and the web.

The subscription comes with unlimited monthly usage of Labs, the agentic creation tool Perplexity released this past May. People can use Labs to generate spreadsheets, presentations, web applications and more. Perplexity is also promising early access to new features, including Comet, a new web browser the company claims will be a "powerful thought partner for everything you do on the web." The company adds that Max subscribers will receive priority customer support, as well as access to top frontier models from partners like Anthropic and OpenAI.

Perplexity will continue to offer its existing Pro plan, which remains $20 per month. Admittedly, the company is courting a small demographic with the new subscription, noting it's primarily designed for content designers, business strategists and academic researchers.

OpenAI was the first to open the floodgates of very expensive AI subscriptions when it began offering its ChatGPT Pro plan at the end of last year. Since then, Anthropic and Google have followed suit.

This article originally appeared on Engadget at https://www.engadget.com/ai/perplexity-joins-anthropic-and-openai-in-offering-a-200-per-month-subscription-191715149.html?src=rss

[Image: Perplexity Max art showing a stylized picture of a man running. © Perplexity]

How to buy a GPU in 2025

2 July 2025 at 16:01

One of the trickiest parts of any new computer build or upgrade is finding the right video card. In a gaming PC, the GPU is easily the most important component, and you can hamstring your experience by buying the wrong model. The buying process can be frustrating, with many manufacturers selling their models above their suggested retail price. In this guide, we'll help you navigate the market and find the right GPU for your needs.

It's all about the games

The first question to ask yourself is what kind of games you want to play. Competitive shooters like Valorant, Overwatch and Marvel Rivals were designed to run on older hardware. As such, even entry-level GPUs like the GeForce RTX 5060 can push those games at 120 frames per second and above at 1080p (more on why that's important in a moment).

By contrast, if you want to play modern, single-player games with ray tracing and other graphical extras, you'll need a more powerful GPU. Just how much more powerful will depend on the resolution of your monitor.

A 1440p monitor has 78 percent more pixels than a 1080p screen, and a 4K display has more than twice as many pixels as a QHD panel. In short, running a game at 4K, especially at anything above 60 frames per second, is demanding, and most GPUs will need to use upscaling techniques like NVIDIA's Deep Learning Super Sampling (DLSS) and AMD's FidelityFX Super Resolution (FSR) to push new games at high refresh rates.
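If you want to check that math yourself, the arithmetic is simple:

```python
# Verifying the pixel math behind those comparisons.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["1440p"] / pixels["1080p"])  # 1.78, i.e. ~78 percent more pixels
print(pixels["4K"] / pixels["1440p"])     # 2.25, i.e. more than twice as many
```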

While we're on the subject of resolution, it doesn't make sense to spend a lot of money on a 4K monitor only to pair it with an inexpensive GPU. That's a recipe for a bad experience. As you're shopping for a new video card, you should think about the resolution and frame rate you want to play your games at. If you're in the market for both a GPU and display, be sure to check out our guide to the best gaming monitors.

If your budget allows, a good bet is to buy a midrange card that can comfortably render all but the most demanding games at 1440p and at least 144 frames per second. Put another way, you want a GPU that can saturate a monitor at its native resolution and refresh rate in as many games as possible. That will give you the smoothest possible experience in terms of motion clarity, and allow you to dabble in both competitive shooters and the latest single-player games as the mood strikes you.

NVIDIA vs AMD and Intel

[Image: Intel Arc B580 label view. Photo by Devindra Hardawar/Engadget]

One of the confusing aspects of the GPU industry is all the players involved. What you need to know is that there are three main players: AMD, Intel and NVIDIA. They design the cards you can buy, but delegate the manufacturing of them to so-called add-in board (AIB) partners like ASUS, XFX, Gigabyte and others.

As you can probably imagine, this creates some headaches, the most annoying of which is that AMD, Intel and NVIDIA will often set recommended prices for their graphics cards, only for their partners to sell their versions of those GPUs above the manufacturer's suggested retail price (MSRP). For example, NVIDIA's website lists the RTX 5070 with a starting price of $549. On Newegg, there are no 5070s listed at that price. The only models anywhere close to $549 are open box specials. If you want one that comes sealed, that will cost you at least $600.

As for what company you should buy your new GPU from, before 2025, NVIDIA was the undisputed king of the market. Specific GeForce cards may not have offered the best rasterization performance in their price range, but between their performance in games with ray tracing and the fact NVIDIA was ahead on features like DLSS, an RTX GPU was a safe bet.

However, with this year's RTX 50 series release, other than models like the RTX 5080 and 5090 where there's no competition, it's safe to say NVIDIA missed the mark this generation. If you're in the market for an entry- or mid-level GPU, AMD and Intel offer better value, with cards that come with enough VRAM for now and into the future. That said, there are still a few reasons you might consider an NVIDIA GPU, starting with ray tracing.

Ray tracing

For decades, developers have used rasterization techniques to approximate how light behaves in the real world, and the results have been commendable. But if you know what to look for, it's easy to see where the illusion falls apart. For that reason, real-time ray tracing has been a goal of the industry for years, and in 2018 it became a reality with NVIDIA's first RTX cards.

In some games, effects like ray-traced reflections and global illumination are transformational. Unfortunately, those features are expensive to run, often coming at a significant frame-rate drop without upscaling. Since ray tracing was optional in many games before 2025, you could save money by buying an AMD GPU. For example, even if the RX 7800 XT was worse at ray tracing than the RTX 4070, the former was often cheaper to buy, had more onboard VRAM and offered as good or better rasterization performance in many games.

However, you can't ignore ray tracing performance anymore. We're starting to see releases like Doom: The Dark Ages where the tech is an integral part of a game's rendering pipeline, and more are likely to follow in the future. Thankfully, AMD's newest cards are much better in that regard, though you'll still get an edge running an NVIDIA model. For that reason, if ray tracing is important to you, NVIDIA cards are still the way to go.

Refresh rates and frame rates

If you're new to the world of PC gaming, it can be tricky to wrap your head around refresh rates. In short, the higher the refresh rate of a monitor, the more times it can update the image it displays on screen every second, thereby producing a smoother moving picture.

For example, moving elements on a monitor with a 240Hz refresh rate will look better than on one with a 120Hz refresh rate. However, that's all contingent on your GPU being able to consistently render a game at the appropriate frame rates. In the case of a 120Hz monitor, you want a GPU with enough headroom to drive most games at 120 fps. Realistically, most video cards won't be able to achieve that in every game, but it's a good baseline to aim for when shopping for a new GPU.
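One way to think about that target is as a frame-time budget: doubling the refresh rate halves the time the GPU has to render each frame. A quick sketch:

```python
# Frame-time budget: the time the GPU has to render each frame in order to
# keep up with a given refresh rate.
for hz in (60, 120, 144, 240):
    print(f"{hz}Hz -> {1000 / hz:.2f} ms per frame")

# 60Hz -> 16.67 ms, 120Hz -> 8.33 ms, 144Hz -> 6.94 ms, 240Hz -> 4.17 ms
```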

Upscaling and latency

I've mentioned DLSS a few times already. Alongside FSR and Intel XeSS, DLSS is an example of what's known as an image reconstruction technology. More and more, native rendering is going out of fashion in game design. With ray tracing and other modern effects enabled, even the most powerful GPUs can struggle to render a game at 1440p or 4K and a playable framerate. That’s why many developers will turn to DLSS, FSR or XeSS to eke out additional performance by upscaling a lower resolution image to QHD or UHD.

Upscaling in games is nothing new. For example, the PS4 Pro used a checkerboard technique to output games in 4K. What is different now is how modern GPUs go about it. With DLSS, NVIDIA pioneered an approach that uses machine learning to recreate an image at a higher resolution, and in the process, addressed some of the pitfalls of past upscaling methods. If you're sensitive to these sorts of things, there's still blur and shimmer with DLSS, FSR and XeSS, but it's much less pronounced and can lead to significant performance gains.
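To see why reconstruction is such a big lever, consider the pixel counts involved. The scale factors in this sketch are the commonly cited ones for typical upscaling presets and should be treated as approximations:

```python
# Rough math on why upscaling helps: render cost roughly tracks pixel count.
# The per-axis scale factors below are the commonly cited ones for typical
# quality/balanced/performance modes -- treat them as approximations.
target_w, target_h = 3840, 2160  # 4K output

for mode, scale in [("quality", 0.667), ("balanced", 0.58), ("performance", 0.5)]:
    w, h = int(target_w * scale), int(target_h * scale)
    saved = 1 - (w * h) / (target_w * target_h)
    print(f"{mode}: renders {w}x{h}, ~{saved:.0%} fewer pixels to shade")
```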

NVIDIA later added single- and multi-frame generation to DLSS. DLSS is only available on NVIDIA cards, and following the recent release of DLSS 4, it is widely considered to offer the best image quality. That's another reason why you might choose an NVIDIA card over one of its competitors. However, if you decide to go with an AMD GPU, don't feel like you're missing out. The company recently released FSR 4. While it's not quite on par with DLSS 4 in terms of support and image quality, it's a major leap over FSR 3 and FSR 2.

While on the subject of DLSS, I'll also mention NVIDIA Reflex. It's a latency-reducing technology NVIDIA introduced in 2020. AMD has its own version called Radeon Anti-Lag, but here again Team Green has a slight edge thanks to the recent release of Reflex 2. If you're serious about competitive games, Reflex 2 can significantly reduce input lag, which will make it easier to nail your shots in Counter-Strike 2, Valorant and other shooters.

Driver support

Previously, one of the reasons to pick an NVIDIA GPU over the competition was the company's solid track record of driver support. With one of the company's video cards, you were less likely to run into stability issues and games failing to launch. In 2025, NVIDIA's drivers have been abysmal, with people reporting frequent issues and bugs. So if you care about stability, AMD has a slight edge right now.

VRAM

As you're comparing different GPUs, especially those in the same tier, pay close attention to the amount of VRAM they offer. Modern games will eat up as much VRAM as a GPU can offer, and if your card has a low amount, such as 8GB, you're likely to run into a performance bottleneck.

If your budget allows for it, always go for the model with more VRAM. Consider, for instance, the difference between the $299 RTX 5060 and $429 RTX 5060 Ti. I know spending an extra $130 — more than 40 percent more — on the 5060 Ti is going to be a lot for some people, but it's the difference between a card that is barely adequate for any recent release and one that will last you for a few years, and it all comes down to the amount of VRAM offered in each. Simply put, more is better.

A slight caveat to this is when comparing models that have different memory bandwidths. A GPU that can access its memory faster can outperform one with more memory, even if it has less of it outright. Here, you'll want to read reviews of the models you're comparing to see how they perform in different games.

Size and power draw

Modern GPUs are big. Most new cards will take up at least two PCI slots on the back of your motherboard. They can also vary dramatically in length, depending on the number of fans the AIB has added to cool the PCB. To be safe, be sure to check the length of the card you want to buy against the maximum clearance listed by your case manufacturer. If you have a radiator at the front of your case, you will also need to factor the size of that into your measurements. The last thing you want is to buy a card that doesn't fit in your case.

Lastly, be sure to check the recommended power supply for the card you want. As a rule of thumb, unless you know what you're doing, it's best to just stick with the manufacturer's recommendation. For instance, NVIDIA suggests pairing the RTX 5070 with a 750 watt PSU. So if you're currently running a 650 watt unit, you'll need to factor in the price of a PSU upgrade with your new GPU.
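If you're curious where figures like that come from, a common rule of thumb adds up component power draw and applies a generous margin. The numbers in this sketch are illustrative, not a substitute for the manufacturer's recommendation:

```python
# A back-of-the-envelope PSU estimate. All numbers here are illustrative;
# the manufacturer's recommendation (like NVIDIA's 750W figure for the
# RTX 5070) already bakes in headroom, so treat it as the floor.
gpu_tdp = 250        # watts, a hypothetical midrange card
cpu_tdp = 125        # watts, a typical desktop CPU under load
rest_of_system = 75  # watts for drives, fans, RAM and USB devices
headroom = 1.5       # ~50 percent margin for transient spikes

recommended = (gpu_tdp + cpu_tdp + rest_of_system) * headroom
print(f"Suggested PSU: ~{recommended:.0f}W")  # ~675W for these numbers
```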

Should you buy a used GPU?

[Image: NVIDIA RTX 5060 Ti. Devindra Hardawar for Engadget]

It depends. If you can find a deal on an old RTX 40 series GPU, then yes. NVIDIA's RTX 50 series cards don't offer greatly improved performance over their predecessors, and with most models selling for more than their suggested retail price, it's not a great time to buy a new NVIDIA card.

That said, I suspect finding a good deal on a used GPU will be difficult. Most people will know the value of what they have, and considering the current market, will probably try to get as much as they can for their old card.

You may find better deals on older AMD and Intel GPUs, but I think you're better off spending more now on a new model from one of those companies since the generational gains offered by their latest cards are much more impressive. Simply put, the 9070 XT and B580 are two of the best cards you can buy right now.

Anything older than a card from NVIDIA's 40 series or AMD's RX 6000 family is not worth considering. Unless your budget is extremely tight or you mostly play older games, you're much better off spending more to buy a new card that will last you longer.

When is a good time to buy a new GPU?

If you've read up to this point, you're probably wondering if it's even worth buying a GPU right now. The answer is (unsurprisingly) complicated. There are a handful of great cards like the Intel B580 and Radeon 9070 XT that are absolutely worth buying. The problem is finding any GPU at prices approaching those set by AMD, Intel or NVIDIA is really tough. To make things worse, uncertainty around President Trump's tariff policies is likely to push prices even higher. If you own a relatively recent GPU, you're probably best off trying to hold onto your current card until things settle down.

However, if your GPU isn't cutting it anymore, you face a difficult decision: overpay now, or wait and potentially pay even more later. As much as I'm reluctant to recommend a prebuilt PC, if you're already planning to build a new computer, it's worth exploring your options there since you might end up saving money on a video card when it's bundled together with all the other components you need.

The best GPUs for 2025: Engadget recommendations

Entry-level (1080p) GPUs

As we mentioned above, if you're only aiming to play basic competitive shooters like Valorant and Overwatch 2 in 1080p, an entry-level GPU may be all you need. While 1080p isn't an ideal resolution when it comes to sharpness, many gamers prefer it since it's easier to reach higher framerates. And it also helps that 1080p gaming monitors, like the AOC 24G15N 24-inch we recommend, tend to offer speedy refresh rates for between $100 and $200. When you're zipping through matches, you likely won't have time to take a breath and appreciate the detail from higher resolutions.

Here are our recommendations for entry-level video cards.

Midrange (1440p) GPUs

While entry-level cards can dabble with 1440p gaming, it's worth stepping up to something a bit more powerful if you actually want to achieve higher refresh rates. For most gamers, 1440p is the best balance between sharpness and high framerates. It looks noticeably better than 1080p, and doesn't require the horsepower overhead of 4K. (And there's a good chance you won't really see a visual difference with the jump to 4K.)

Here are our recommendations for midrange GPUs.

High-end (4K) GPUs

If you want the most of what modern PC games have to offer, including 4K and all of the benefits of ray tracing, then be ready to spend big bucks on a high-end GPU. If you're going this route, though, be sure you're also gaming on a high-end monitor that befits these powerful GPUs.

Here are our recommendations for premium GPUs.

Super high-end/Money isn't real GPUs

Listen, there's only one choice here and it's NVIDIA's enormously powerful and fantastically expensive RTX 5090. It's an absolute beast, with 32GB of VRAM and the most hardware NVIDIA has ever stuffed into a consumer GeForce GPU. The RTX 5090 doesn't make sense for 99 percent of gamers — especially since it's now going for $3,000, up from its $2,000 launch price — but if you have the cash to spare, it'll certainly earn you bragging rights. (Check out our NVIDIA RTX 5090 review.)

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/how-to-buy-a-gpu-160100017.html?src=rss

[Image: NVIDIA RTX 5070 Ti. © Devindra Hardawar for Engadget]

NVIDIA's RTX 5050 arrives early in laptops from Acer, MSI and more

24 June 2025 at 14:33

NVIDIA's add-in board partners won't start selling the GeForce RTX 5050 until mid-July, but it looks like the company has given the early go-ahead to OEMs to start announcing laptops with the new entry-level GPU. Wccftech and Videocardz report that 5050-equipped laptops are available to order in China as of this morning from domestic manufacturers like Mechrevo. 

Over in the US, companies like MSI and Acer have begun announcing their own RTX 5050 laptops. The former, for instance, will sell the Katana 15 for $999 through Walmart. Alongside the 5050, it features a Core i7-14650HX processor, 16GB of RAM and a 144Hz display. We've reached out to NVIDIA for more information on global availability, and we'll update this article once we learn more. 

In the meantime, the Chinese listings give us a good idea of what to expect from the new GPU. It features 2,560 CUDA cores, 8GB of GDDR7 VRAM and a TDP of 115W. The memory spec is interesting. Before today's announcement, the desktop variant of the 5050 was rumored to include GDDR6 memory. The fact the laptop version has GDDR7 VRAM suggests its sibling will as well, since it wouldn't make much sense for NVIDIA to hobble the desktop card in that way. With a 128-bit interface, the RTX 5050 should have a memory bandwidth of 384 GB/s, putting it on par with the 5060 mobile in that department.
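That bandwidth figure follows directly from the bus width and the memory's effective data rate; here's the arithmetic, with the 24 Gbps per-pin rate inferred from the reported numbers rather than confirmed by NVIDIA:

```python
# Memory bandwidth falls out of the bus width and the memory's effective
# per-pin data rate: (bus width in bits / 8) * data rate in Gbps = GB/s.
# The 24 Gbps figure is inferred from the reported specs, not confirmed.
bus_width_bits = 128
data_rate_gbps = 24

bandwidth_gbs = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gbs:.0f} GB/s")  # 384 GB/s, matching the listings
```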

As for performance, the 5050 laptop should land somewhere between the 4050 and 5060, with decent generational gains on offer but nothing too exciting. This being an entry-level card, the fact it only comes with 8GB of VRAM is more understandable, and it fits the bill for a GPU most people will only use for occasional gaming.

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/nvidias-rtx-5050-arrives-early-in-laptops-from-acer-msi-and-more-143309816.html?src=rss

[Image: NVIDIA Blackwell GPU core. © NVIDIA]

The Tacx Alpine is a $1,100 gradient simulator for your Garmin smart bike trainer

24 June 2025 at 11:00

Cycling season may have only just started, but that’s not stopping Garmin from looking ahead to when all the roadies need to take their bikes indoors. On Tuesday, the company announced the Tacx Alpine, an indoor gradient simulator for its family of NEO smart trainers. The accessory can replicate inclines of up to 25 percent and declines of up to -10 percent, with adjustments made either manually through a built-in control panel or automatically when using the device with compatible apps like Zwift and Garmin’s own Tacx Training software.

In those same apps, the Tacx Alpine also allows for real-time virtual steering adjustments. Naturally, Garmin Connect support is also included for stat tracking and more. In short, the Tacx Alpine is designed for those who want to spice up their off-season training since pedaling a road bike on an indoor trainer is about the most boring thing ever.

Garmin says mounting the front of your bike is easy. Inside the box, you’ll find adapters for both quick-release skewers (9 x 100) and thru-axles (12 x 100, 15 x 100 and 15 x 110), so the Tacx Alpine will work with most modern road bikes. However, trainer compatibility is limited to Garmin’s NEO 2T and NEO 3M models. Those cost $1,400 and $2,000 new, respectively, and the accessory itself will set you back $1,100, with Garmin not planning to offer bundles at launch.

That might seem like a lot to pay for an accessory designed to make your indoor rides less monotonous, but it’s broadly comparable with the rest of the industry. Wahoo, for instance, sells its Kickr Climb simulator for $750, but it doesn’t come with a steering feature. Meanwhile, the Elite Rizer, which offers both steering and gradient simulation, costs $1,000. Either way, if you’re looking at one of these, chances are you already spent a pretty penny on a fancy carbon road bike and the thought of dropping another $1,000 on your hobby doesn’t faze you.

This article originally appeared on Engadget at https://www.engadget.com/wearables/the-tacx-alpine-is-a-1100-gradient-simulator-for-your-garmin-smart-bike-trainer-110041344.html?src=rss

[Image: Garmin Tacx Alpine. © Garmin]