It's easy to start feeling the pressure in the lead up to Father's Day. Finding a gift that expresses your love and gratitude to your beloved dad can feel nearly impossible, but that doesn't have to be the case. There are plenty of great Father's Day gifts out there for every type of dad, whether he's a tech lover, a gamer, a runner or a combination of many things. Here, we've compiled a list of the best gifts you can get your dad to show your appreciation for all of the parenting he's done, and likely still does.
This article originally appeared on Engadget at https://www.engadget.com/the-best-fathers-day-gifts-for-any-dad-in-2025-131504054.html?src=rss
It’s been a while since a company has thrown out a truly silly number of megapixels for a new phone. After all, the double-digit megapixel counts found on most flagship handsets are mostly used to pixel-bin images down to a smaller size without harming the quality. Rejoice, then, when I tell you Honor’s new midrange 400 series is shipping with a 200-megapixel sensor working hand in glove with AI to make use of all that data. 200 megapixels, in this economy? Apparently so.
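For readers unfamiliar with pixel binning, the arithmetic is simple: groups of neighboring photosites are merged into one larger effective pixel, trading resolution for light sensitivity. A minimal sketch of the math (the function name and the 4x4 bin factor here are illustrative assumptions, not details Honor has confirmed):

```python
# Illustrative sketch of pixel-binning arithmetic, not any phone's actual
# imaging pipeline. A bin factor of 4 merges each 4x4 group of photosites
# into a single output pixel, dividing the pixel count by 16.
def binned_megapixels(sensor_mp: float, bin_factor: int) -> float:
    """Return the output resolution after bin_factor x bin_factor binning."""
    return sensor_mp / (bin_factor ** 2)

print(binned_megapixels(200, 4))  # a 200MP sensor binned 16-to-1 yields 12.5MP
```

At a smaller 2x2 bin, the same sensor would produce 50MP images; manufacturers typically pick the factor per shooting mode rather than exposing it to the user.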
The 400 series is the latest in Honor’s not-at-all-confusingly-named “N” series of midrange handsets which bear numbers. Naturally, while there’s also a low end version of the 400 to buy, the company’s focus here (as always) was on the 400 Pro 5G and the regular 400 5G. Both models get that 200-megapixel primary camera tied to a Samsung-made 1/1.4-inch sensor with both optical and electronic image stabilization. Both are also equipped with a 12-megapixel macro/wide camera, plus a 50-megapixel front-facer. The Pro, however, also gets an additional 50-megapixel telephoto that the company claims will produce some impressive digital zoom.
Of course, these handsets are less about the raw numbers and more about what they can do when the images are run through the AI. Honor says the phones will capture and enhance portraits, erase passers-by, create videos from still images and even remove reflective glare when taking pictures through panes of glass. Plus, on-device generative expand will extend the edges of an image if you feel the original was too closely cropped when you shot it. And Honor says the phone will use AI to create film simulation models to annoy all your Fuji-owning frenemies. Honor hasn’t yet been clear about how many of these AI features will be built into the phone and how many will require an extra subscription.
As for the rest of the phone, the 400 Pro’s spec list is no slouch: It’s got a Snapdragon 8 Gen 3 processor, 16GB RAM and a 5,300mAh silicon-carbon battery. Up front, you’ll be staring into a 6.7-inch 2,800 x 1,280, 120Hz AMOLED display with a peak brightness of 5,000 nits. If you opt for the regular 400, then you’ll get a Snapdragon 7 Gen 3, 8GB RAM and a 6.55-inch, 120Hz AMOLED with a similarly beefy peak brightness. Both handsets will get Honor’s often-ballyhooed AI thread optimization for better sustained performance under load, such as if you’re gaming on the go. And the company has tweaked the graphics engine to better handle people’s massive photo libraries without stuttering.
The Honor 400 series is available to buy in Europe and the UK from today, with the Pro 5G setting you back €800 / £700. The regular 400 5G can be snapped up for €500 / £400 if you want 256GB storage and €550 / £450 if you want 512GB instead. Naturally, if you’re looking for a cheaper alternative, the “Lite” version can be picked up for €300, but the company didn’t share any specs for that particular handset. As usual, there's no word on whether these handsets will come to the US, so you'll have to import one yourself.
Honor has been eager to point out that it has committed to providing six years of Android support for these handsets. That means buyers should expect at least that many years of OS and security updates, with Android 16 coming to the handsets by the end of the year.
This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/honors-midrange-400-series-pairs-a-200-megapixel-camera-with-the-usual-ai-tools-150018371.html?src=rss
You might know the story by now: Framework makes repairable, modular laptops where you can sub in new components for old or broken ones. It’s been two years since the company debuted an AMD mainboard for the Laptop 13 and so it’s time for the first replacement to arrive. The new model gets AMD’s Ryzen AI 300, a series of power-efficient chips for notebooks that can do all the Copilot+ AI nonsense the industry insists people need and want.
Framework sent me the new AMD mainboard to slot into the existing chassis, along with a new bezel and input cover. Rather than the usual solid colors, the company is now offering translucent plastic versions for all those late ‘90s kids who owned Game Boy Colors. I prefer the tinted translucent version over the clear transparent one, but you can judge for yourself in the pictures.
Mainboard with AMD Ryzen AI 300
Daniel Cooper for Engadget
Would-be buyers have three options: They can buy a new pre-built laptop with Ryzen AI 300, a DIY edition to assemble or just get the mainboard on its own. Whichever you choose, you get the pick of a Ryzen AI 5 340, Ryzen AI 7 350 or the flagship Ryzen AI 9 HX 370, which is capable of harnessing up to 96GB of RAM. Naturally, the price will start to climb the greater your technological ambitions.
I was supplied with the midrange Ryzen 7 350, which took me about 11 minutes or so to swap into the existing AMD model. It’s amusing to think it took me the better part of an hour the first time I did this but once you’re fluent, it becomes effortless. If you, like me, lost the muscle memory to swap components at the turn of the millennium, don’t feel like this is beyond you.
The Ryzen 7 350’s performance is fine for the sort of tasks you’d expect to do with a 13-inch notebook. I didn’t find there was a clear, epoch-shifting leap between what I got out of this and the 7840U it replaced. If you’re using it for the usual stuff — work, browsing and watching videos — then it’ll handle that all with aplomb.
As usual, the best reason to recommend the AMD model over its Intel equivalent is for its greater strength in gaming. After all, you can easily get 50 to 60 fps in titles like Fortnite and Grand Theft Auto V, making this an effective all-rounder.
There are two downsides to Framework’s modular approach, one of which is that the hardware will always look more functional than its rivals. The second, and more critical, is that all of the CPU cooling has to be integrated onto the mainboard itself. Whereas a lot of machines can be designed around thermal management, this one can’t because everything is modular. So the fan on top of the mainboard has to do all of the work with one hand tied behind its back. Consequently, both the Intel and AMD versions of Framework’s laptops are noisy in ways rival machines aren’t.
Framework says it addressed the noise issue by redesigning the heatpipe, improving the fan algorithm and switching to Honeywell’s PTM7958 thermal paste. Sadly, as many hours as the company may have put in here, you’re still going to have to deal with plenty of fan noise and heat under heavy load. And while AMD promised the Ryzen AI 300 was designed to be more power-efficient, the new processor further dents this thing’s battery life. I didn’t get anything close to a full day on a charge here, but that seems to be the way with so many laptops these days.
Input cover (Second generation)
Daniel Cooper for Engadget
I’ve always commended Framework for its commitment to improving every hardware component as it goes. This time around, the company worked with manufacturing partner Lite-On to give its keyboard a makeover. It focused on hard-to-spot improvements like printing the caps in a slightly thinner weight, replacing the fingerprint sensor and redesigning the Shift and Enter keys.
Thankfully, what hasn’t changed is the 1.5mm key travel, and founder Nirav Patel told me years ago that he had no interest in trying to fix what wasn’t broken. This means the keyboard itself remains as easy to use as it was before, albeit with a slightly quieter typing action. The keys aren’t as loose in their housing as they were on the older model, which is another sign of higher quality. Users can also pick from keyboards with a dedicated Windows Copilot key or the Framework key, depending on their operating system loyalties.
More importantly, the company addressed feedback that the wide keys would rattle when the speaker played at high volumes. There’s a new scaffold supporting the Shift and spacebar to reduce vibration when the sound gets loud. I think the original problem was overstated, but perhaps my audio picks aren’t as bass-heavy as some other folks’. Regardless, the changes here are welcome, and when I watched explosion-heavy scenes, I found little to no rattle at all.
The compromises
Framework
Swapping an Intel mainboard for an AMD one isn’t quite so simple, as there are hardware compatibility issues to navigate. Similarly, the AMD boards support different USB standards on different expansion card slots, as you can see in the picture. This is true for AMD boards no matter the manufacturer, but it’s something to remember before you make your purchase.
In summary
Daniel Cooper for Engadget
If you are already inside Framework’s ecosystem then feel free to sleep on this upgrade. Yes, the AI 300 is better than the chip it replaces but not to the extent I’d suggest you shell out several hundred dollars on one. If you aren’t, then you’ll probably be swayed by Framework’s broader pitch rather than this specific update. My suspicion is that the company’s maturing ecosystem is ideally placed to take advantage of the current geopolitical brouhaha. After all, if the cost of every notebook is at risk of leaping through the roof, being able to keep one machine running for longer is compelling. And, if you’re looking to leap in, you should grab one of the discounted Ryzen 7040 machines before they go. After all, if you find the performance a little slow in a few years’ time, you’ll be well-placed to take advantage of the next generation upgrade when it arrives.
I know there are some who feel Framework’s underlying platform is getting a little stale, which I do understand. Tech consumers are constantly clamoring for a newer, flashier doodad, and when the Laptop 13 first launched, it already looked a generation behind rival 13-inch notebooks in the same class. Five years down the line, it won’t beat any machines from Dell, Acer or ASUS in look or feel.
But while it may not have the razzle or dazzle, it does have the staying power, and that’s going to be a bigger asset in the next few years. If you’re the sort of person who would buy a Lenovo Thinkpad and run it until it falls apart, then this is a better option. After all, with a Framework, you won’t even have to worry about it falling apart.
This article originally appeared on Engadget at https://www.engadget.com/computing/laptops/framework-laptop-13-2025-with-amd-ryzen-ai-300-review-the-usual-iterative-upgrade-172031005.html?src=rss
Sometimes, in fiction, you don’t need to say a Very Important Thing in a Very Important Way to make a good point; you can simply ask how a thing would work if it played out in the real world. This week’s episode of Doctor Who, “The Well,” does exactly that, and brilliantly.
Picking up straight after “Lux,” the Doctor and Belinda, still in their ‘50s outfits, are trying to get the TARDIS to work. Belinda helps with the controls, but the vessel still refuses to land on May 24, 2025, which panics the nurse even more. If the TARDIS isn’t broken, she assumes that the date or the Earth itself could be broken, and frets about her parents. The Doctor shares her concerns, but promises that she will be reunited with her family.
The Doctor persists with his plan to land in a few more spots with the Vindicator (the gadget he built last week) to orient the TARDIS. This time, it’s 500,000 years in the future, and Belinda asks if humanity even exists by now. He assures her it does, as humans spread to the stars and wormed themselves into every corner of the universe. The pair head to the TARDIS wardrobe to get into some appropriate clothes before heading out.
They step out onto the gantry of a spaceship where an advance party of marines are leaping into the void. With no choice but to join them, they land on the planet below, enabling the Doctor to take the Vindicator reading. But, alas, the planet’s heavy radiation means the ship (and by extension, the TARDIS) has to glide down slowly over the next five hours. So they tag along with the mission, the Psychic Paper enabling the Doctor and Belinda to insinuate themselves with the team.
The planet is inhospitable, occupied only by a small mining colony that has dug down into the world to extract its last remaining useful resources. The colony went silent a few days before and, before you can say “Oh, is this going to be an(other) Aliens riff?” one of the marines suggests it would have been wiser to “nuke the site from orbit.”
All of the colonists are dead, half from gunfire, half from injuries that look like they fell and broke every bone in their body. The mirrors are all smashed and the systems are offline, the records of what went on inaccessible. But there is one survivor, the colony’s chef, Aliss Bethick (Rose Ayling-Ellis) who, like the actress who portrays her, is deaf. Aliss has been waiting in the middle of a large cargo turntable (which reads on camera as a big circle) for days.
Aliss is isolated, both physically in the staging and because of her hearing loss, and while she can lipread, it’s still a barrier between her and the soldiers. The Doctor can communicate with Aliss in sign, and the soldiers all have their own captioning screens on their lapels. Much of the second act is taken up with the interrogation of Aliss as the marines work through the logistics of how to communicate with her. For instance, getting her attention by casting to another soldier’s screen in her eye-line to get her to turn around. Belinda enters the circle to treat Aliss’ injuries but keeps seeing something lurking behind her new patient.
It isn’t long before the Doctor learns that the desolate planet they stand on was once covered in diamonds. This is the planet Midnight from the series four episode of the same name, in which the Doctor, trapped in a shuttle, tries and ultimately fails to defeat a sinister entity that possessed one of the passengers. Like then, the Doctor’s pleas for calm fail. Two of the soldiers mutiny and attempt to lure the entity out and kill it. They do not survive.
It’s Belinda who works out and explains the rules: If you imagine the host — Aliss — at the center of a clock, then whoever stands directly behind her is attacked by the unseen monster. If you stand at six o’clock then you’re fine, but “you’ll die at midnight.” Quite literally, as whoever is in the entity’s way gets thrown around like a ragdoll — half the crew shooting each other to kill the entity, the other half getting minced by the alien.
The Doctor approaches Aliss to speak to the monster but since it’s time for the third act to start wrapping up, he just stares for a bit before working out the solution. In order to mine the diamonds the colonists would dump down mercury, using a pipe which is conveniently running behind Aliss’ head. Shooting the pipe will cause a river of mercury to cascade down, creating a mirror that should be enough to banish the monster.
They make their escape, but the Doctor can’t help but wait behind to see the monster, giving it a chance to latch onto Belinda. The captain of the marines shoots Belinda enough that the entity thinks she’s about to die and switches hosts, after which point they leap into the mineshaft. Belinda wakes up in the TARDIS in the Doctor’s care, ready for the next adventure. Meanwhile, the marines debrief their boss — Mrs. Flood, who knows all about the Vindicator, too — before revealing the alien did make it on board their spaceship after all.
One of the threads in the episode is that Belinda keeps mentioning human terms and superstitions, only to get shrugs from everyone around her. It’s something that’s got both her and the Doctor puzzled, as there seems to be something very wrong with all of reality.
You die at midnight...
James Pardon / BBC Studios / Disney / Bad Wolf
Showrunner Russell T. Davies was asked about bad faith criticisms that the show had somehow gone woke. “Someone always brings up matters of diversity and there are online warriors accusing us of diversity and wokeness and involving messaging and issues and I have no time for this,” he said. “What you might call ‘diversity’ I just call an open door,” he added, “it’s cold and it’s bracing and there’s a world in front of you! There’s a blue sky, there’s clouds and there’s noise, there’s birdsong, there’s people arguing.”
What’s notable about this is that Davies’ open-minded (and open-hearted) approach to making the show creates storytelling possibilities. For instance, the last time an episode of Doctor Who featured a deaf character (2015’s “Under The Lake”), she relied upon a colleague to interpret on her behalf. And her ability to lipread wound up being part of the solution to the episode’s problem — reducing her to little more than a plot mechanism.
Here, while Aliss’ deafness is a core part of the plot, it doesn’t feel as if she’s defined by that one facet. Effort has been made to flesh out her character, and her presence becomes a venue to explore how technology and communication intersect with different accessibility needs, especially as co-writers Sharma Angel-Walfall and Russell T. Davies made the effort to think through how this would work.
BBC Studios / Disney / Bad Wolf
Whenever I’m watching an episode of nü-nü-Who, in the back of my mind I’m mulling what the injection of Disney money changed. “Midnight,” the episode “The Well” is a sequel to, was produced as a “double banked” episode — splitting the leads to shoot two episodes at a time. “Midnight” was also intended as a cheap story, with the bulk of the script taking place in a single room. If we’re being honest, “The Well” could have worked just as well given the bulk of the action takes place in a handful of rooms.
That’s not to say the extra cash lavished upon this episode is wasted: “The Well” feels almost indulgent by Doctor Who standards for the sheer breadth and depth of its sets. I can’t help but recall the Aliens riff Strange New Worlds produced in its first season, which re-used the series’ standing sets for the wreck of the USS Peregrine. It sounds weird to say that Doctor Who is luxuriating in the fact it can afford to show a trashed bunkroom for all of a minute, but it is.
Perhaps part of the reason it does feel indulgent is that this is an episode relatively low on incident and high on character. Belinda gets a real showcase here, asserting herself on the narrative at several points but also being rebuked for doing so. She tries to take charge to help the injured Aliss, but the medical kit is so advanced she’s not able to use it. She’s smart enough to work out the rules of the alien, but it also gets the better of her in the end.
Whereas the first two episodes this season felt overstuffed and rushed, the smaller story and focus on character lets everything breathe. That an accessibility tool is a key focus of the plot and used as a venue for storytelling and character development is marvelous.
Look, I’m as bored saying it as you are reading it, but once again I can’t help but point out the influence of Steven Moffat on this season. One of the inspirations for monsters like the Weeping Angels and the Silence was the idea of them being easy to turn into a schoolyard game. The unnamed entity here, with the mechanic that if you stand directly behind the host you will die, seems perfectly in that tradition.
But “The Well” also offers instances where Davies is in conversation with the rest of this season and his earlier work. In both “Midnight” and “The Well,” the Doctor is at risk of losing his grip on the situation because the threat of the unknown makes people paranoid and jumpy. A streak of deeply dark pessimism runs through all of this work and while it’s also on show here, there’s a little more hope than there was before.
It’s also interesting how Davies, who has always structured his seasons in a fairly rigid manner, seems to be deliberately repeating motifs and beats. The parallels between this season and the last feel almost like they’re trying to draw attention to themselves. “Space Babies” and “The Robot Revolution,” “The Devil’s Chord” and “Lux,” and now “Boom” and “The Well” feel like episodes vying for the same space in different realities. Not to mention the repetition of moments from episode to episode, like the TARDIS wardrobe sequence and the repeated hand injuries. If next week's "Lucky Day" predominantly focuses on Ruby Sunday without the Doctor and revolves around physical distance and / or the supernatural, then perhaps we might assume this is more than coincidence.
Mrs. Flood Corner
I’ve always hated “The End… or is it?” fake-outs that often undermine the drama of whatever denouement they’re tacked on to. Sure, it can be effective if you want to cheapen the sacrifices your characters made to vanquish the villain, but often it comes across as hacky. Not to mention that people with poor media literacy will assume that it’s actually a teaser for a cliffhanger to be resolved the following week.
Here, eh, it’s essentially a way to shoehorn Mrs. Flood in as the soldiers' boss taking the debrief after the Doctor and Belinda depart. She knows about the Doctor’s use of the Vindicator, and has now seen it in action thanks to the soldier’s recording. But there’s no breaking the fourth wall, which means she’s operating here in the same manner as Susan Twist did last year. Which is, uh, interesting.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/doctor-who-the-well-review-signing-makes-you-feel-heard-200528202.html?src=rss
The start of any season of Doctor Who is important, doubly so when there’s a new co-star to introduce. “The Robot Revolution” has to get us to fall in love with Belinda Chandra (Varada Sethu), ensnare new fans and keep existing ones hooked. Especially since it’s the second of two series that Disney paid for, meaning it’s got to do well enough to keep the money flowing.
We open “17 years ago” with Belinda Chandra staring at the stars next to her boyfriend, Alan Budd (Jonny Green). It’s an awkward teenage date, with Alan clearly trying to win the heart of his date by buying her one of those star adoption certificates. In 2025, Belinda is now a nurse at a busy London hospital where, in the background, the Doctor is searching for her.
Belinda goes home to bed, and we see that she’s got the star ownership certificate framed on her wall. But she’s rudely awoken by a squad of retro-futuristic ‘50s robots in a Tintin rocket who have come to abduct both her and the certificate. The Doctor reaches her home just in time to see the rocket take off, and cue the opening credits.
The certificate wasn’t a gag present, and she is actually queen of the planet BelindaChandra, populated by BelindaChandrians (I’m calling them humans from now on). The Doctor gives chase in the TARDIS but both the rocket and TARDIS get caught in a vaguely-defined time fissure. When Belinda arrives, she’s greeted by the human Sasha55, who tells her the robots are in charge, having overthrown and subjugated the people in a bloody uprising a decade prior.
“Oh, this is a bit like Jupiter Ascending,” I wrote in my notes.
Belinda is taken to a throne room where she’s told that she must merge with the planet’s evil ruling supercomputer, the AI Generator. AI Generator, all skull shapes and Tesla coils, intends to bond with Belinda. She is shown an animated demonstration of her fate, as she is wrapped in machine parts and made into an unthinking cyborg.
“Oh, that's like the scary bit from Superman III,” I wrote in my notes.
Who’s been lurking in the background of the scene all along? The Doctor, who has adopted the title of Planetary Historian. (Thanks to the time fissure, he arrived here six months ahead of the rocket, the robots seized the TARDIS and he’s been working with the rebellion. He’s even got a new companion, Sasha55, who he’s promised to take to the stars when this is all over.) He tells Belinda the robots can’t, for some reason, hear every ninth word spoken, and gives her a coded message telling her he, and the rebellion, are here to rescue her. In the ensuing fight, Sasha55 is vaporized, much to the Doctor’s admittedly brief horror and grief.
The surviving rebels, along with a little Roomba bot assigned to clean Queen Belinda’s pathway…
“Oh, like the floor-cleaning robot M-O from Wall-E,” I wrote in my notes.
… escape to a teleporter, after which the Doctor disables the Roomba to ensure the robots can’t track them down, then kisses the ‘bot by way of apology. Then it’s time for the episode to stop while we see the Doctor and Belinda interact properly for the first time. The Doctor was told about Belinda’s plight by someone from their future, and he can’t say more lest he muddle the timelines.
Alistair Heap / BBC Studios / Disney / Bad Wolf
The time fracture both vessels passed through has caused plenty of time-bending issues, like the fact the robots have their own copy of Belinda’s star certificate. But it’s not a copy, it’s the same object from another point in time, and nobody knows how or why they have it. Belinda, like Ruby Sunday before her, is trope-aware enough to know that two of the same object from different points in time cannot occupy the same space, lest it cause an explosion.
“Oh, like in Timecop!,” I wrote in my notes.
There are wounded at the base, and Belinda instantly kicks into nurse mode, grabbing IVs and treating patients. She’s quick to take charge and has no patience for nonsense, defending herself from any hint of condescension when the Doctor suggests something “timey-wimey” is going on. She refuses to allow anyone to fight her battles for her and is determined to grab the narrative and shape it her way, no matter the cost. So, she sneaks off, reactivates the Roomba and offers herself to the robots in exchange for them sparing the lives of the rebels.
Belinda and the Doctor are taken to meet the AI Generator which turns out to be… the AL Generator. When Belinda was kidnapped by the robots, she mentioned her ex Alan had bought the certificate, and so they went to kidnap him as well. But the time fracture meant Alan arrived a decade earlier, fused with the machine (becoming a creepy cyborg) and started the robot uprising.
Even so, Belinda’s happy to sacrifice herself to him until she spots Alan holding his copy of the star certificate. She opts to Timecop the two pieces of paper together, causing a big timey-wimey explosion that only the Doctor can pull her out of. Belinda is safe, but the Doctor mentions that he’s now intertwined with Belinda’s timestream. Alan, meanwhile, has been regressed to a sperm on the floor that the Roomba bot quickly mops away.
Reunited with the TARDIS, the Doctor scans Belinda and reveals he’s already met her descendant — Mundy Flynn (also Varada Sethu) from last season’s “Boom.” Belinda may be curious as to how someone that far removed from her may be identical, but she’s not embracing the mystery. She’s angry with the Doctor for scanning her without consent and that he’s treating her like a puzzle to be solved.
Having seen Sasha55 die, she knows trekking around with the Doctor is dangerous, and wants to get back to May 24, 2025. But the TARDIS won’t land on present-day Earth, and even the Cloister Bell begins ringing a warning. They open the TARDIS doors to see empty space before the Doctor decides to take her back home “the long way round.”
Once the ship disappears, a series of objects start to float in front of the camera: A smashed up black cab, the twisted wreckage of the Eiffel Tower, Belinda’s star adoption certificate and a calendar with all the days in May but the 25th ticked off. Uh-oh.
It's a lot to get through in such a short episode
Alistair Heap / BBC Studios / Bad Wolf
Like a lot of Disney-era Who, “The Robot Revolution” feels overstuffed to the point of bursting. On one hand, nothing overstays its welcome. On the other, it feels like the show is burning through a movie’s worth of plot on fast-forward. It’s hard to get a tangible sense of the stakes given how rushed everything is, and there’s a lot of telling, rather than showing. We’re told the planet is under the brutal thumb of an evil overlord but it plays out as little red ships firing at buildings in the digital matte paintings. We’re told Alan is a creep but we never really get any sense of that until after he’s revealed as the villain. We’re told the Doctor is operating on instructions from a figure from his own future, but it’d be nice if some of this was depicted.
Davies was pivotal in reviving Doctor Who and building the cultural juggernaut it became under his leadership. His role in the show’s history is secure but, even so, his Disney-era series seem to be in thrall to the work of his own successor, Steven Moffat. “The Robot Revolution” features a macguffin found inside a mundane trinket, a split narrative and time-bending shenanigans. It’s not that Moffat owns these ideas but you can almost feel Davies trying to bend his less formal, more character-driven style into something else. A cynic might suggest Davies is reacting to the slight of not having a single credited episode in Doctor Who Magazine’s most recent poll of the series’ greatest, while Moffat has five.
In fact, I wouldn’t be surprised if the slightly frantic, gappy nature of this script is a deliberate ploy to lay the groundwork for the rest of the season. But, even so, you can feel a degree of straining for a storytelling model that doesn’t quite work.
If the script is the weakest part of the episode, then the production design has to take the crown for strongest. The retro-futuristic robots call to mind a bright red Ford Thunderbird or Chevy Bel Air while the cleaning robot is clearly styled on a VW Beetle. It’s a rather humanistic design I wish the robovac makers of today would emulate.
Behind the scenes
James Pardon / BBC Studios / Disney / Bad Wolf
Doctor Who is a regular source of gossip, especially given the permanently tenuous nature of the star role. It’s easy to say the lead is about to quit and for that to sound true, given they leave after three or four years in the role anyway. There are a number of recent reports suggesting Ncuti Gatwa has already quit the show, or is about to. Many of them also suggest the BBC and Disney are refusing to greenlight new episodes until they see how successful this season is. In addition, the BBC says funding cuts and inflation have seen its budget fall by £1 billion (around $1.3 billion) in real terms since 2010. It doesn't help that, when asked directly about the future of the series in an interview with (the BBC's youth-orientated news show) Newsround, Russell T. Davies opted to equivocate in a way that suggests the show is about to go back on ice.
I mention this because of the sequence where Belinda defeats Alan with the certificate, and the Doctor pulls her out. He says she needed a Time Lord to absorb the enormous amount of energy kicked out when she touched the papers together. The Doctor then clutches at his back as if he's in a lot of pain, but shrugs it off and is fine for the rest of the episode. Fans with long memories, however, know that absorbing a lot of energy from the time vortex is what killed Christopher Eccleston’s Doctor back in 2005. Well, that and Eccleston’s decision to leave.
Mrs. Flood Corner
Lara Cornell / BBC Studios / Disney / Bad Wolf
It seems Mrs. Flood enjoys moving in next door to whoever is winding up as this year’s companion. While being abducted, she shouts to her neighbor to call the police and tell her parents she loves them. As the rocket lifts off, she tells the audience that we haven’t seen her, and goes back indoors to avoid encountering the Doctor, who sprints out in pursuit.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/doctor-who-the-robot-revolution-review-meet-belinda-chandra-190054697.html?src=rss
I appreciate devices that don’t try to do too much. There are too many products throwing too many features at the consumer in the hope one or two stick. I’m reminded of the recently revived Pebble, which offers a pared down way to check your phone’s notifications from your wrist, and little else. That’s the best way to describe Even Realities’ G1 smart glasses, which put a second screen for your smartphone on your face.
G1 is almost aggressively low-tech, putting in your line of sight a dot matrix display that’ll leap into life when required. You’ll be able to see the time, phone notifications, calendar, stock and news updates from a handful of chosen publishers. Tap the temple tips (also known as earpieces) and you’ll be able to speak to an AI to answer questions or dictate a note without touching your phone at all. Open the app to activate heads-up turn-by-turn directions, access real-time translation and use the glasses as a portable teleprompter.
On devices like this, the limited feature set goes hand-in-hand with lowered expectations: If it promises the basics and delivers them, all good, right? Problems arise when it fails to do them well, or if it’s obvious to anyone there are features that would work here but have been omitted. The G1 doesn't stumble on the essentials, but I do find myself flip-flopping about how much praise they should get. They’re useful right now, but I’d love to see how much better they’ll get in a year or two.
Hardware
Daniel Cooper for Engadget
Until the laws of physics change quite dramatically, smart glasses will always be noticeably bigger and heavier than their conventional siblings. But the G1 is hardly an embarrassment in those stakes, and while chunky, they’re not comically oversized. The frame is built from magnesium and titanium alloy and weighs in at 44 grams. That’s more than the 26 grams my regular glasses weigh, but not to such an extent that it’s burdensome. It’s no surprise to learn some of the company’s leadership team came from the glasses industry, including stints at high-end eyewear brands Lindberg and Mykita.
Each lens houses a rectangular waveguide with a 25-degree field of view displaying a 640 x 200, 20Hz dot matrix green monochrome screen. It brings to mind the old workstation terminals from the early 1990s. This isn’t shameless hipsterism: the human eye is most sensitive to green, and it also reduces the display’s power consumption, which is useful since the maximum brightness is 1,000 nits and you’ll need all of it to see your screen on a bright day.
Much of the hardware is housed in the temple tips — the bits sitting behind your ears, if you’re not au fait with glasses terminology. These chunky boxes include the wireless charging gear, 160mAh battery, Bluetooth LE 5.2 radio and the touch control sensors. As for the rest, the projector and microphones are housed in the end pieces (the hinges on the frame), and that’s it. There’s no camera, speakers, headphones, bone conduction audio or any touch surface along the arms themselves, because the makers expect you to have your own camera and audio gear.
In the box, you’ll get the glasses and USB-C charging case, the latter of which has its own 2,000mAh battery which the company says will charge your glasses two and a half times before you need to head back to a socket. You can also get a pair of clip-on sunglass lenses for your specs to ensure you can wear them outdoors. I had initially presumed, given the heft of the original investment, that the clip-on lenses were thrown in, but no, they’ll set you back an additional $100 in the US, €100 in Europe or £85 in the UK.
Glasses need to be up to the challenge of withstanding the same conditions the rest of your head is exposed to. Even Realities says the G1 can resist a “splash” or “light rain,” but doesn’t include a specific IP rating, so you may need to baby them a little.
There is only one size of the G1 available, and the company says it’ll suit anyone with a pupillary distance between 54 and 80mm. The temples can be adjusted to go looser or tighter, depending on the unique topography of your skull. The frames are fairly stylish, too: the review unit uses the rounded “Panto” style, but if you prefer a squarer frame, you can opt for the G1B model instead.
Naturally, glasses aren’t a one-and-done deal, with most folks’ prescriptions changing every year or two. Even Realities says you should be able to send your glasses back to HQ when your eyesight changes, along with your updated prescription, for new lenses. In the EU that’ll cost €230, or €240 ($260) if you’re further abroad.
Random musing about Smart Glasses vs AR
I’ve deliberately not referred to the G1 as a pair of augmented reality glasses because I don’t think they fit the description. The dictionary says AR is anything that puts a digital view in front of the real world, but that’s too broad a definition. After all, if I held a paper map in my line of sight I wouldn’t be augmenting my reality as the map doesn’t engage with its environment. And it’s the same thing here — just because the display in question is transparent doesn’t, for me, move the needle enough for it to be classified as AR.
In-use
Daniel Cooper for Engadget
Charge the glasses, download the app and you’ll walk through the brief setup and tutorial process. Software updates take around five minutes each time, which is surprisingly long given the low-tech nature of the system. Put the glasses on and you’ll notice the waveguide prisms add a slight tint to whatever you’re looking at. For instance, when staring at a white page on my computer, the prism adds a hint of gold that’s noticeable compared to the white around it.
The glasses are deactivated by default, behaving like any regular pair of specs as you go about your day. If you want to look at the dashboard, which is your home screen, you’ll need to tilt your head up. This is the first thing you’ll define in the settings: how far you need to move your head up to trigger the dashboard. I found I had to tweak it regularly, given my head naturally drifts upwards more often when I'm, say, relaxing on the sofa versus standing at my desk.
Daniel Cooper for Engadget
The dashboard is the first sign the G1 has some limitations. You only have three layout options for what you can see, one of which is the Status Pane, which is always on. Status shows you the time, date, temperature, a notification indicator and (when required) low battery warnings. You can add one or two widgets, but if you pick two, one of them will default to your calendar. The other widgets include News, Stocks, QuickNotes or a Map.
News, Stocks and Maps feel like the default options you get with every gadget while its makers work out what its users really want and need. The news feed pulls from a handful of outlets and you can select which categories (Politics, Science, Sports and the like) you want to see. The map view gives you a little peek at your location to the nearest 25, 50 or 100 meters, which I’m sure is helpful for some folks, like delivery drivers. But I’m not sure there are many folks who want or need to have this information so immediately at hand.
Notifications
You can gatekeep which apps are allowed through to your glasses, and when you get a message, a speech bubble will pop up on your otherwise blank display. When it appears, you can flick your head up to see the message, or tap the temple tip while in the dashboard.
It’s the same limited message you would get with any basic wearable connected to your iPhone. The obvious benefit is removing the need for you to get your phone out of your pocket or look at your wrist.
On a Saturday jaunt into the city, several of my group chats sprang into life and I was able to keep abreast of the conversation without having to look at my phone. That was a real benefit, as I’d have been frustrated if I’d had to check my pocket on the regular for messages that didn’t entirely concern me. The only downside is that you can’t do anything with the messages, like respond to them, unless you do finally succumb to your phone.
Teleprompt, Transcribe and QuickNote
Even Realities
Even if you’re comfortable speaking off the cuff, using the G1 as a teleprompter is a staggeringly good idea. As soon as I started using it, I was thinking they would be great for shooting review videos, as well as giving presentations and acting. Hell, I’ve recited the Gettysburg Address a few too many times in the last week.
The glasses also offer a way to turn what they hear into text, either with the dedicated Transcription setup or the QuickNote action. For the latter, all you need to do is touch a temple tip and speak, with the system picking up your words and turning them into text. You’ll then be able to read the note and play back the audio recording in the app, although you can only share the text of what you’ve said. It’s perhaps more pertinent to journalists than other folks, but having such easy access to a tool like this is exciting.
Translation
The G1s presently support real-time translation of 24 languages, including the major European languages, Arabic, Cantonese, Hindi, Japanese, Korean and Turkish. After you open the app, select the language in question and activate the feature, you’ll get a translation two or three seconds later when someone talks to you. I’ve tested the feature with native French and Turkish speakers and while the translations did at times miss a word, the overall sentiment was well conveyed.
Without a doubt, this is one of the glasses’ most eye-catching and useful features, since it subtitles the real world. But while the idea and implementation are all there, it’s not as sci-fi perfect as it could or should be, and that’s a problem. For a start, there’s the obvious pause while the system translates what your counterpart has said. Then there’s the fact it’ll translate anything it can hear, so when I asked my Turkish friend to speak to me and then read out the translation in English, the glasses dutifully tried to translate my English reply as well.
Everything’s far more reliant on the app than it could be — you can’t activate the feature or swap languages without having the phone in your hand. If you were able to switch the options around with a tap on the touch sensor, you could theoretically have a conversation just with the glasses. But as soon as you need the phone in your hand, it’s easier to just open up Google Translate and harness the power of conversation mode.
Navigation
The ability to project basic information in your line of sight is enormously helpful when it comes to navigation. After all, if you’re wandering around unfamiliar streets, then you probably don’t want to look like you’re lost. Certainly, the spate of phone thefts where well-prepared thieves snatch devices from people’s hands is a sign of that. Much like every other feature, you’ll start by… opening the app, activating the navigation pane and setting your destination.
You can pick walking or cycling directions, and you’ll get a turn-by-turn layout on the phone as well as in the glasses. Once the route has been calculated, which will take a second, you can put the phone in your pocket and start moving around. On the left, you’ll get the road name, an arrow for your direction and the distance in meters before the next move. In the middle, you’ll get the projected journey time and distance, and on the right a mini-map showing you the route. Look up (triggering the Dashboard) and you’ll get a full-sized route map showing your progress as well as an indication of your speed. I’d be lying if I said I didn’t adore this feature; it’s supremely effortless, and I’d love to test it out while roaming an unfamiliar city.
Even AI
Rounding out the spec list is Even AI which, at the tap of the left temple tip, will be available for you to ask questions. Even AI is essentially just an interaction layer for either Perplexity, which is the default AI client, or ChatGPT. Press the button and you’ll be able to ask it questions, the answers to which will then be displayed in your field of view.
If you have a beginner’s knowledge of AI, by which I mean a fundamental distrust of anything it says, then this might be useful. Defining words, answering basic questions like “Who is Florence Nightingale?” and looking up facts like the price of Bitcoin are all easily done. But that’s all, I think, I’d trust any AI to do, given how generally incapable of providing useful information it is.
Controls
There are two buttons, one on each temple tip, which will let you engage Even AI or QuickNotes and scroll through notifications. Two buttons, however, even with support for multiple taps, offer too few inputs for a device this sophisticated. I keep thinking about the ways you can control true wireless headphones with all of that rhythmic tapping, and that’s just for audio playback.
It means you’ll be relying on your phone a lot more than you may like, and while it’s not a deal breaker, it is an issue. After all, if these glasses offer a way of spending more time engaging with the world around me, then I don’t want to be constantly snatching up my phone. I imagine this is another area where, as the software develops and more commands are incorporated into the buttons, things will get easier. But it is, for now, a fairly significant frustration.
Battery Life
I’d consider myself a fairly heavy user, and I would regularly get a day and a half’s worth of life from the G1 glasses before needing a recharge. It’s vexing in the extreme that the glasses don’t have an off switch, so they’ll be draining an admittedly small amount of power when not in use. I suspect, if I was living with these full time, I’d get into the habit of keeping them in their charging cradle on the nightstand while in bed to avoid any inadvertent losses of power while out and about.
Price
Even Realities’ G1 is available in two different frame styles: The G1A with the “panto” round-rim style and the G1B, with a rectangular frame. If I’m honest, I’d have preferred to test the G1B, which is more in keeping with my regular glasses preference, but c’est la vie. The glasses on their own cost $599, with corrective lenses costing you an extra $150 and the sunglass clip an additional $100. It puts these glasses in the same sort of territory as the highest-end designer frames you can get at LensCrafters.
I’m not sure there’s a mainstream competitor sitting in exactly the same category as the G1. There are similar headsets, like TCL’s RayNeo, but that has a far higher resolution display since it promises real AR. The Frame by Brilliant Labs, perhaps, but that only has a display in one lens and relies far more upon AI to operate. Captify’s glasses use binocular vision but are only designed to offer real-time captioning for users with hearing loss. Vuzix’s Z100 only has the display in one lens and, as far as I understand it, Meizu’s Myvu glasses are only available in Asian markets. Which means, for now, Even Realities is your one stop for a product like this.
And while they’re not in the same category at all, it feels negligent to not even mention Meta and Ray-Ban’s Wayfarers. The retail price may be cheaper but, once you’ve added prescription lenses, they’ll set you back around $600, putting them close to the G1. But they’re obviously a very different product, with no heads-up display and a greater emphasis on AI and photography.
Wrap-Up
Daniel Cooper for Engadget
I really like Even Realities’ G1 for what they can do right now, but I’m also hopeful that they’ll get far more useful in the future. It seems to me there are so many things that could be tweaked, primped and plumped to make these far more appealing.
I’d love to be able to switch the translation mode with a press of the temple tip, so I could get a translation of what’s said, flip it to translate my English to the other language and then say it back to them so we could actually have a(n admittedly stilted) conversation. Adding reminders and other options to the dashboard would make it a lot more desirable to use. Hell, imagine a future dashboard update that pulls your step count from your phone so you can see how well you’re moving. Not to mention the ability to offer some form of real-time captioning for users who may have hearing issues.
I’m not going to judge the G1 on its potential but for what it offers now, and what it offers now is plenty good enough. The biggest obstacle is the price, but what can you expect for a first-generation product in a niche category? When speaking to friends about them, many said if the price wasn’t that much more than a regular pair of glasses, they’d struggle to say no to what’s on offer here. And I agree: once you’ve had a taste of the functionality on show here, it’s hard to go back to normal.
This article originally appeared on Engadget at https://www.engadget.com/wearables/even-realities-g1-review-limited-but-effective-smart-glasses-140059586.html?src=rss
Anker’s lifestyle brand Eufy has already swallowed a big chunk of the robot vacuum market and now it’s set its sights on your yard. The company has been sharing details of its first two robot mowers since the start of the year, and now they’re ready to start selling them. Eufy’s E15 and E18 are designed to automate one of the most tedious jobs around the home — if you’re able to pay. I’ve been testing an E15 for the last few weeks ahead of their retail debut today, and I’m fairly impressed.
Early robot mowers needed a boundary wire to tell them where they were allowed to mow. But digging a trench around your lawn is time consuming, costly and less than ideal if you eventually move. It prompted companies to pivot to other methods, such as GPS or RTK (real-time kinematics) to navigate. Eufy, however, is harnessing its computer vision know-how to trim your lawn with even less fuss, calling its technology “visual full self-driving,” or vFSD. Yes, I know. Anker says there are plenty of benefits in using cameras over GPS, like more reliable mowing and better obstacle avoidance.
Daniel Cooper for Engadget
The E15 is capable of covering lawns up to 800 square meters while the E18 will conquer lands as broad as 1,200 square meters. If you assumed, like I did, that the difference between the two is battery size, you’d be mistaken — both have the same 4,200mAh battery, but the E18 has more on-board memory to accommodate a bigger map size. Otherwise, they are the same machine, with an adjustable cutting height between 25 and 75mm, a maximum climb of 18 degrees and a combined GPS / 4G anti-theft system. One feature I’m very partial to is that the garage (the mower’s charging station) comes with a rain cover, meaning fewer worries if you’re out and the weather suddenly gets a bit intense. Not that it’s necessary, since the hardware is rated IPX6 — enough to withstand being cleaned with a hose.
Setting up the E15 is painless so long as your lawn is nicely mown, with the grass no taller than 3.5 inches. All you’ll need to do is fix the garage in place with some hefty ground screws, hook it up to power and connect it to your home’s Wi-Fi. Then you send it out for one or two mapping runs so it can get a sense of your space.
Daniel Cooper for Engadget
My lawn is cut into a hill, with a sunken pathway and a 1.5 meter drop at one end, which is a problem. Since it maps visually, I opted to babysit the mower during the process to make sure it didn’t hurl itself into the chasm. I also have a small lean-to wood shed with a green roof (at the bottom of the chasm) that I reckoned a computer vision system could easily mistake for grass, so I wanted to keep an eye on it. Once it had made a few too many furtive advances toward that roof, I paused the mapping, sent the E15 back to its garage and set up a keep out zone in the app before finishing the job.
Once that was done, however, the E15 very easily staked out the rest of the space and made sure it could get nicely close to the path without going over. From there, you’re doing everything of note within the app. You can set the cutting height as low as 25mm or as high as 75mm, and can also set the unit’s movement and cutting speed — letting you use more power if you’re pressed for time. Plus, you can schedule mows, and if the device detects rain or too much moisture in the grass, it’ll head back to base until things have dried out.
One feature I’m a big fan of is that it’s the first such machine I’ve encountered that lets you set a cutting direction for stripes. It’s not that I have an issue with most robomowers’ chaotic mowing per se, but I’ve always seen striped lawns as desirable. While the unit isn’t going to give you the sort of over-manicured, inch-perfect stripes you’d find at a tennis club, you can at least see the contrast.
Daniel Cooper for Engadget
An additional benefit of the app connection is that if the mower runs into an issue while you’re not at home, you can activate a remote control mode. Not only can you access the camera feed, but you can use on-screen controls to steer it out of any tricky spots it might have wound up in.
It's funny, but something I didn't notice until my in-laws visited was how shockingly quiet the E15 is. When I set the hardware running to satisfy their curiosity, they were baffled that the thing was scuttling around the lawn making almost no noise whatsoever. It's certainly a perk, especially if you choose to set this thing off for a scheduled trim in the early morning: it's quiet enough that even the ants probably won't complain.
All in all, I like the package Eufy is offering, and it even allayed some of my misgivings about its computer vision system. If I have gripes, they’re not really about the E-series at all and more about this category of product generally. For a start, robot mowers may not get every square inch of your lawn, especially if some of your edges neighbor deep crevasses, like mine. That means you’ll still need to go out there every once in a while with a weed whacker to trim the borders of your turf.
And I’d still love nothing more than to be able to exert more control over the initial mapping phase to eliminate some of the trial and error. I wish for a system that would let me use my phone as a tool to trace the outside edge of a space myself, to set some basic expectations. Sure, the hardware would still have to scuttle around making sure it can get where I need it to go, but it’d save some of the busywork for both of us.
The Eufy E15 (800 sqm) and E18 (1,200 sqm) are available to order today from Eufy and Amazon. The E15 will set you back $1,599, while the E18 is priced at $1,999.
This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/eufys-new-robot-mowers-use-smart-vision-to-trim-your-grass-130008542.html?src=rss
Nintendo is today laying out the goodies for the Switch 2, including the third-party titles available at launch. The roster may be small, but it includes a number of notable titles from the current generation, trimmed and polished to work on the new hardware. That includes Cyberpunk 2077, Elden Ring Tarnished Edition, Street Fighter 6, Hitman: World of Assassination and Split Fiction. EA has also committed to putting whatever the hell we pretend to call FIFA these days (EA Sports FC) on the console, as well as this year’s Madden. There’s also a Bravely Default: Flying Fairy HD remaster and Yakuza 0: Director’s Cut, plus a tweaked version of Hogwarts Legacy. Fortnite will also be available on the console on release day.
The breadth and depth of titles available on day one is a testament both to the fact that there are plenty of good titles around right now and that the Switch 2 must be fairly easy for developers to work with.
This article originally appeared on Engadget at https://www.engadget.com/gaming/nintendo/cyberpunk-2077-and-split-fiction-are-third-party-launch-titles-for-nintendo-switch-2-135648661.html?src=rss
If there’s one thing Nintendo has always understood, it’s that everyone may want to play together, but might not all own the same game. With the Switch 2, the company is launching GameShare, enabling local multiplayer across multiple consoles with just one copy of a title. Yes, friends, this is the wireless multiplayer feature from the Nintendo DS or, depending on your era, the modern-day Game Boy Link Cable.
With the first Switch, multiplayer was limited to sharing Joy-Cons on the same console hardware. But for the successor, if two people each have a Switch 2 but one copy of a compatible game, they’ll be able to play wirelessly on their own hardware. And that’s not all: you’ll also be able to do this with up to four consoles at a time, including original Switch and Switch Lite models.
Unfortunately, for now, the list of games compatible with GameShare is pretty thin, but Nintendo says more will be coming in the future. At launch, it'll work with Captain Toad: Treasure Tracker, Super Mario 3D World / Bowser's Fury, Clubhouse Games (pictured above), Super Mario Odyssey and Big Brain Academy.
This article originally appeared on Engadget at https://www.engadget.com/gaming/nintendo/nintendo-lets-switch-2-players-share-their-games-132431186.html?src=rss