We knew the Nothing Phone 3 was coming sometime this ‘summer’, but now the makers have got a bit more specific, saying that the phone will land in July.
This was revealed in a teaser on Nothing’s X account, which… doesn’t tell us much else. Beneath text confirming the July launch window, the number ‘3’ simply flashes on the screen several times, followed by the words “it’s a magic number”.
That ‘3’ is created from a series of white blocks that are reminiscent of the glyph lighting system on the back of Nothing’s phones, so that will probably be making a return here, but that was always expected.
Phone (3). It's a magic number. Coming July 2025. pic.twitter.com/WEQ7Vcf72H (May 20, 2025)
A top-end chipset and a reworked camera
Still, while this teaser doesn’t tell us much else, previous teasers and leaks do give us some idea of what to expect.
Nothing itself has previously said that the Nothing Phone 3 will be the company’s “first true flagship”, and that it will have a price to match, coming in at around £800 (roughly $1,060 / AU$1,640). That price will apparently be justified through “premium materials, major performance upgrades, and software that really levels things up.”
Beyond that, a recent rumor pointed to the Nothing Phone 3 having a “flagship Snapdragon chipset”, which might mean the Snapdragon 8 Elite, also found in the likes of the Samsung Galaxy S25 series.
The same tip also pointed to a significantly reworked triple-lens camera, complete with a larger primary sensor than the Nothing Phone 2 and a periscope telephoto lens, suggesting this phone could offer long-distance optical zoom.
The battery could be in for a boost too, with this said to possibly exceed 5,000mAh – up from 4,700mAh in the Nothing Phone 2.
So, the Nothing Phone 3 could be quite an exciting handset, and if you’ve liked the look of Nothing’s phones but wanted something higher end, this could finally be the device for you. We’ll find out in July.
In the ever-evolving landscape of financial markets, the introduction of artificial intelligence (AI) has been a game-changer in the fight against market manipulation. As stock trading practices diversify, globalization expands, and competition intensifies with new businesses entering the market every day, the complexity of monitoring and maintaining fair play across markets has increased exponentially.
However, as global exchanges have invested in adopting and developing AI tools, so too have their criminal counterparts. Market manipulators have become more sophisticated in their tactics, employing highly advanced pump and dump and spoof trading strategies to influence market conditions to their advantage.
In the effort to get ahead of illicit activity, the human immune system has emerged as an unlikely source of inspiration for enhancing AI-powered detection tools.
Detecting and Preventing Market Manipulation
AI's role in financial markets is akin to a vigilant sentinel, tirelessly scanning vast amounts of data for signs of manipulation. By leveraging machine learning algorithms and complex pattern recognition, AI systems can identify irregularities and potential manipulative behaviors that would be nearly impossible for humans to spot due to the sheer volume and speed of high-frequency stock market trading.
These AI systems are trained on historical data, learning from past instances of market manipulation to recognize the subtle signals that may indicate foul play. They can monitor multiple markets simultaneously, track the behavior of individual traders, and correlate seemingly unrelated events to uncover hidden patterns. This comprehensive monitoring capability is crucial in a landscape where a single manipulated trade can have far-reaching consequences.
Despite its potential, applying AI to market surveillance has many challenges. Financial markets are complex, dynamic systems with a multitude of variables at play. The bespoke nature of AI models required for each unique scenario means that there is no one-size-fits-all solution. AI systems must be tailored to the specific characteristics of each market and the types of manipulation that may occur within them.
Moreover, the AI must be capable of adapting to new strategies employed by market manipulators. Just as viruses evolve to bypass the immune system, so do manipulative tactics to evade detection. This necessitates AI systems that can learn and adapt in real-time, a feat that requires significant computational power and advanced algorithms.
Learning from the Human Immune System
The human immune system is a marvel of natural engineering, capable of identifying and neutralizing a vast array of pathogens. It is this remarkable adaptability that has inspired the development of AI systems for market surveillance. The immune system's ability to remember past infections and recognize new ones that share similar characteristics is mirrored in the way AI can learn from historical market data and adjust to new forms of manipulation.
Just as the immune system has different mechanisms to deal with various threats, AI systems can employ a range of strategies to tackle different types of market manipulation. The umbrella term for such mechanisms is Artificial Immune Systems (AIS): computational intelligence methods modelled after the immune system. These systems develop a set of pattern detectors by learning from normal data, incorporating an inductive bias that applies exclusively to this baseline data, which may shift over time due to its non-stationary nature.
The Dendritic Cell Algorithm (DCA), a biologically inspired subset of AIS, mirrors the human immune response by monitoring, adapting, and identifying potential threats. From statistical analysis to behavioral analytics, AI leverages this adaptive framework to help preserve the integrity of financial markets.
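To make the idea more concrete, here is a minimal, illustrative sketch of a DCA-style anomaly scorer in Python. It is not the published model discussed below: the signal definitions, the migration threshold, and the toy spoofing example are hypothetical choices made purely for clarity.

```python
import random
from collections import defaultdict

# Illustrative Dendritic Cell Algorithm (DCA) sketch. Signal names and
# thresholds are hypothetical examples, not taken from the published research.

def dca_anomaly_scores(events, num_cells=10, migration_threshold=5.0):
    """Score each entity by how often it was presented in an anomalous context.

    events: list of dicts with keys:
        'id'     - the trader/order stream being monitored
        'danger' - signal suggesting abnormal behaviour (e.g. rapid cancellations)
        'safe'   - signal suggesting normal behaviour (e.g. trades at usual volume)
    """
    # Each dendritic cell accumulates signals and samples the entities it sees.
    cells = [{"danger": 0.0, "safe": 0.0, "sampled": []} for _ in range(num_cells)]
    presented = defaultdict(lambda: {"mature": 0, "total": 0})

    for event in events:
        cell = random.choice(cells)            # each event is sampled by one cell
        cell["danger"] += event["danger"]
        cell["safe"] += event["safe"]
        cell["sampled"].append(event["id"])

        # When enough signal has accumulated, the cell 'migrates' and presents
        # everything it sampled in a mature (anomalous) or semi-mature context.
        if cell["danger"] + cell["safe"] >= migration_threshold:
            mature = cell["danger"] > cell["safe"]
            for entity_id in cell["sampled"]:
                presented[entity_id]["total"] += 1
                presented[entity_id]["mature"] += int(mature)
            cell["danger"], cell["safe"], cell["sampled"] = 0.0, 0.0, []

    # MCAV: fraction of presentations made in a mature context; higher = more suspicious.
    return {eid: c["mature"] / c["total"] for eid, c in presented.items() if c["total"]}


# Toy example: trader 'T2' repeatedly emits high-danger events (spoof-like behaviour),
# so its score should land near 1.0, while 'T1' stays near 0.0.
events = [{"id": "T1", "danger": 0.1, "safe": 0.9} for _ in range(50)] + \
         [{"id": "T2", "danger": 0.9, "safe": 0.1} for _ in range(50)]
random.shuffle(events)
print(dca_anomaly_scores(events))
```

In a real surveillance setting, the danger and safe signals would be derived from market features such as order-to-trade ratios or sudden price moves, and the scores would feed into human review rather than automatic enforcement.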
In recently published research, we explored how DCA can identify market manipulation patterns. The model performs anomaly detection on a selective set of outputs obtained from DCA while examining multiple types of manipulative patterns. The uniqueness of this approach lies in reducing the dimensions of the input dataset and avoiding the inconsistency involved in selecting thresholds for the parameters.
It is also unbiased towards specific types of manipulation, as no knowledge about the injected anomalies is provided to the model a priori. The distinctiveness of the results is visible when compared with existing models across a variety of evaluation metrics, from area under the ROC curve to false alarm rate.
The Balance Between Human Oversight and AI Empowerment
While AI can process and analyze data at speeds and volumes beyond human capability, it is not infallible, as it lacks the human ability to understand nuance. The balance between human oversight and AI empowerment is critical in stock exchange surveillance. Human expertise is essential for interpreting the findings of AI, providing context, and making judgement calls on whether identified patterns truly constitute manipulation.
Humans can also provide the ethical and regulatory framework within which AI operates, ensuring that surveillance practices remain fair and just. As financial markets continue to grow in complexity, the need for sophisticated surveillance tools becomes ever more pressing.
AI, with its ability to learn from the past and adapt to new threats, offers a powerful solution to this challenge. However, it is the combination of AI's analytical prowess and human expertise that will ultimately ensure the fairness and integrity of financial markets. As technology continues to advance, this partnership will only become stronger, safeguarding the financial ecosystem against those who seek to undermine it.
We list the best monitors for trading.
This article was produced as part of TechRadar Pro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
Gaming accessory brand Logitech G has announced the Logitech G522 Lightspeed, a new wireless gaming headset intended to supersede the popular Logitech G733 Lightspeed.
The G522 Lightspeed features redesigned earcups, with a wider shape and an added layer of memory foam for enhanced comfort. It has a lightweight, adjustable fabric headband, which now rests flatter than its predecessor and has built-in ridges for better cooling.
The exterior of each ear cup features four eye-catching customizable RGB lighting zones, which can be tweaked to the color of your choice in the Logitech G Hub desktop software. It's also compatible with the Logitech G mobile app.
Under the hood, the headset is packing Logitech G's highest-fidelity 40mm Pro-G drivers with 24-bit / 48kHz signal processing for enhanced audio clarity and detail.
The headset comes bundled with a removable omnidirectional microphone, which records at an impressive 16-bit / 48kHz. It's the same microphone found in the excellent, but much more expensive, Astro A50 X, which impressed with its crystal clear recordings in my hands-on testing.
On the Logitech G522 Lightspeed, the microphone has the added benefit of a built-in red LED indicator that illuminates when it's muted.
As its name would suggest, the headset can connect to PC or PlayStation 5 via Logitech's Lightspeed wireless dongle (which is included in the box), but also supports traditional Bluetooth for the aforementioned platforms in addition to Nintendo Switch and mobile. There's also the option for wired play via its USB Type-C connector.
Logitech claims up to 40 hours of battery life with the default lighting on, or up to 90 hours with it disabled, which is a pretty impressive figure. It's not quite the up to 200 hours promised by the competing HyperX Cloud III S, but it's still more than enough juice for a few weeks' worth of intense gaming sessions.
The Logitech G522 Lightspeed hits shelves on June 16 in white or black colorways. It costs $179 / £139.99 / AU$299.95, putting it in the midrange price bracket.
Its expansive feature set seems very promising, but only time will tell whether it becomes one of the best PC gaming headsets or best PS5 headsets around.
While hybrid work models have helped teams collaborate across locations, persistent challenges remain, with teams still wrestling with misalignment and communication gaps that slow progress and delay notable outcomes.
To build more adaptive, high-performing teams—regardless of where they work—organizations are turning to Agile practices. Agile's emphasis on continuous feedback, quick adjustments, and strong collaboration makes it an ideal framework for bridging the gaps that often arise in hybrid work environments.
But embracing Agile isn’t a one-and-done fix. As work evolves, so should the way we apply these methods. The real opportunity isn’t just about keeping up, it’s about using these changes as a launchpad for better ways of working.
Breaking free from inefficiencies
According to a recent survey by Lucid Software, nearly half of UK businesses report that teams can take up to three hours to decide how to move forward on business goals, highlighting that meetings may drag on and clear next steps often don’t follow.
The survey also revealed miscommunication and poor planning are significant barriers to productivity, with 41% of respondents citing unclear project requirements, scope changes and miscommunication with colleagues as the top reasons for redoing work. These issues not only demand extra time and effort but also leave 1 in 5 workers feeling that their team’s plans rarely align with the company’s strategic goals.
While 45% of workers believe that adopting new collaboration tools could significantly cut decision-making time, tools alone won’t solve the problem. To truly address communication challenges, a shift in mindset is crucial.
Agile frameworks offer exactly that. By breaking work into smaller, manageable increments and fostering regular feedback cycles, Agile enables teams to adapt quickly to change, clarify goals, and align efforts more effectively across stakeholders. This approach reduces wasted time, minimizes costly misalignments, and accelerates progress towards strategic objectives.
Agile in motion
Agile practices have been gaining popularity, with 51% of respondents indicating their organizations actively use Agile to organize and deliver work. Yet, despite its growing presence, only 49% of UK businesses have adopted Agile and even among those that have, the benefits of Agile aren’t consistently felt across teams. One big reason? Resistance to change.
Much of that resistance often stems from middle management. Middle managers are often caught between evolving expectations from leadership and long-standing habits rooted in traditional management practices. The shift to Agile requires more than just new skills, it’s about evolving how we perceive, interpret, and respond to the complexities of work and leadership.
This resistance is often driven by fear of losing control or uncertainty about how to navigate this shift, making it crucial to provide middle managers with the right tools and support to embrace the new Agile mindset.
This is where mindset matters. Adopting agility requires both horizontal development (e.g. learning a new topic or tool) and vertical development (e.g. holding a new perspective). The concept of vertical development, popularized by researchers like Robert Kegan and Lisa Lahey, expands a person’s ability to lead amidst complexity. It enables them to interpret shifting conditions, not just follow a fixed playbook. For agile to stick, organizations must invest in both forms of development for those involved.
To enhance the effectiveness of Agile, leaders should work to create buy-in from all team members and ensure that Agile practices are consistently applied across the organization with meaningful training and solutions that facilitate successful implementation. This can start by identifying key change agents within teams who can help model and reinforce Agile principles, while also setting up regular feedback loops to accelerate progress and address any obstacles. When done right, Agile isn’t just a framework—it’s a foundation for better, faster, more human ways of working.
The power of a common visual framework
Too often, traditional methods persist simply because ‘it’s the way it’s always been done.’ But as work grows more complex and distributed, those default approaches, especially meetings, aren’t enough to keep everyone aligned.
Team meetings remain the go-to method for tracking progress, with 74% of respondents relying on them. However, this approach doesn’t work equally well for all roles. Only 53% of entry-level employees report having high visibility into their work, indicating that even regular stand-ups may not provide everyone with the clarity they need. This highlights a critical need for more effective approaches to decision-making and alignment — ones that don’t depend on everyone being in the same room.
That’s where visual collaboration solutions come in. Agile teams are already ahead of the curve here — 69% report using visual tools as opposed to only 41% of general knowledge workers. Visual collaboration supports Agile by providing a shared, always-on workspace that enables teams to track tasks in real-time, visualize workflows and adjust priorities as needed.
What excites me most is seeing how these tools are transforming team dynamics. Team members who might stay quiet during video conferencing calls now actively shape ideas and decisions through visual contributions, creating a stronger sense of ownership and alignment. This visual engagement fosters a more collaborative and responsive environment, key principles of Agile practices.
Forging ahead with a united workforce
Even if teams interpret and apply Agile practices differently, the underlying principles can still guide better ways of working. Leaders may feel confident in their team’s direction, but when newer employees don’t understand the direction or feel misaligned with the company’s values, that misalignment can ripple across the organization. In fact, what those employees experience often reveals how well Agile is truly being lived—not just implemented.
For example, if a team struggles to prioritize or frequently misses deadlines, it may signal that Agile practices aren’t being fully integrated, even if they’re technically in place. For any organization, bridging these gaps is essential. Leaders should lean on shared tools and frameworks that promote clarity, build skills and foster better communication. A visual roadmap, for instance, can make abstract goals clearer by laying out specific, achievable steps, showing progress, and aligning team efforts.
Addressing these challenges early helps prevent problems like misalignment and employee burnout, ultimately enabling teams to accelerate work and drive efficient outcomes.
Start here: a low-barrier entry point to agility
Not every organization is ready for a full agile transformation. That’s okay. You don’t have to adopt every practice to benefit from agile thinking. Start small by using a shared visual board to clarify weekly priorities. You can also replace a long meeting with asynchronous feedback using sticky notes or comments. Most importantly, ask your team what’s blocking progress and listen.
Agility isn’t the goal. Value is. But agility is how you get there, consistently, sustainably, and together. Instead of trying to replicate the office in a hybrid model, it’s time to rethink how work can happen more intentionally and effectively. The future belongs to those who can align quickly, learn continuously, and move forward with shared purpose. That’s how agile teams stay aligned, fast, and focused.
We compiled a list of the best Microsoft Teams alternatives.
This article was produced as part of TechRadar Pro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
The seven-year wait for fans of The Librarians franchise is finally over, as The Librarians: The Next Chapter starts a new page. Originally intended to be shown on The CW, US viewers can tune into The Librarians: The Next Chapter on TNT or via Sling TV. Read on for how to watch The Librarians: The Next Chapter online from anywhere with a VPN.
Premiere date: Sunday, May 25
US TV channel: TNT
Stream: Sling TV (US) | CTV (CA) | Foxtel (AU)
Use NordVPN to watch any stream
21 years after the first The Librarian film hit the small screen, this latest spin-off series sees a new time-traveling addition to the popular franchise. Callum McGowan (Marie Antoinette) leads the fantasy fun as a historical librarian who finds himself in the modern day.
TNT says that his character, Vikram Chamberlain, 'inadvertently releases magic across the continent' before being given a new team to help him 'clean up the mess he made'. As the official trailer suggests (which you can watch further down this page), his team of Librarians will be challenged by a plague of lost souls, demons and other evil forces as Chamberlain seeks a way back to his own time.
We’ve got all the info on where to watch The Librarians: The Next Chapter online and stream episodes from anywhere.
How to watch The Librarians: The Next Chapter online in the US
US viewers can watch The Librarians: The Next Chapter on TNT. It kicks off with a '2 Night Series Premiere' on Sunday, May 25 and Monday, May 26 — both at 11.30pm ET / 8.30pm PT. After that, the remainder of the 12-episode run will go out weekly on Mondays at 9pm ET/PT. You can see a full schedule at the bottom of this article.
Don’t have cable? You can also watch TNT via Sling TV with your choice of its Blue or Orange plans. Both cost from $46/month, with your first month half price.
Away from the US? Use a VPN to watch The Librarians: The Next Chapter on your usual streaming service from abroad.
How to watch The Librarians: The Next Chapter online anywhere
If you’re traveling abroad when The Librarians: The Next Chapter airs, you’ll be unable to watch the show like you normally would due to annoying regional restrictions. Luckily, there’s an easy solution.
Downloading a VPN will allow you to stream online, no matter where you are. It's a simple bit of software that changes your IP address, meaning that you can access on-demand content or live TV just as if you were at home.
Use a VPN to watch The Librarians: The Next Chapter from anywhere.
Editor's Choice
NordVPN – get the world's best VPN
We regularly review all the biggest and best VPN providers and NordVPN is our #1 choice. It unblocked every streaming service in testing and it's very straightforward to use. Speed, security and 24/7 support available if you need – it's got it all.
The best value plan is the two-year deal which sets the price at $3.09 per month, and includes an extra 3 months absolutely FREE. There's also an all-important 30-day no-quibble refund if you decide it's not for you.
- Try NordVPN 100% risk-free for 30 days
How to watch The Librarians: The Next Chapter online in Canada
Canadian viewers can watch The Librarians: The Next Chapter on CTV's Sci-Fi channel on Mondays at 9pm ET/PT.
Rather stream the show? Use the CTV.ca website or app. You'll need to enter your cable provider details.
US viewer currently traveling in Canada? Download a VPN to connect to your streaming service back home and watch The Librarians: The Next Chapter no matter where you are.
Can I watch The Librarians: The Next Chapter in the UK?
At the time of writing, no broadcaster has been announced for The Librarians: The Next Chapter in the UK.
If you're a resident of somewhere that does have The Librarians: The Next Chapter streaming but are currently in the UK, you can use a VPN to watch your regular service.
How to watch The Librarians: The Next Chapter online in Australia
Foxtel subscribers can watch The Librarians: The Next Chapter on FOX8 in Australia. Episodes will go out on Thursdays at 9pm AEST from May 29.
Foxtel Now's pricing begins from AU$35 per month.
Foxtel shows usually land on the Binge streaming service as well. However, at the time of writing, this has not been confirmed for The Librarians: The Next Chapter.
If you’re visiting Australia from abroad and want to watch on your home service, simply download a VPN to stream The Librarians: The Next Chapter just as you would back home.
What you need to know about The Librarians: The Next Chapter
The Librarians: The Next Chapter trailer
Can I watch The Librarians: The Next Chapter for free?
The Librarians: The Next Chapter isn't listed to watch on any free-to-air channels or streaming services.
The Librarians: The Next Chapter cast
The Librarians: The Next Chapter is set for a 12-episode run from Sunday, May 25 to Monday, August 4.
Google and Samsung’s Project Moohan Android XR headset isn’t entirely new – my colleague Lance Ulanoff already broke down what we knew about it back in December 2024. But until now, no one at TechRadar had the chance to try it out.
That changed shortly after Sundar Pichai stepped off the Google I/O 2025 stage. I had a brief but revealing seven-minute demo with the headset.
After my prescription was scanned and matched with a compatible set of lenses from Google, the lenses were inserted into the Project Moohan headset, and I was quickly immersed in a fast-paced demonstration.
It wasn’t a full experience – more a quick taste of what Google’s Android XR platform is shaping up to be, and very much on the opposite end of the spectrum compared to the polished demo of the Apple Vision Pro I experienced at WWDC 2023.
Project Moohan itself feels similar to the Vision Pro in many ways, though it’s clearly a bit less premium. But one aspect stood out above all: the integration of Google Gemini.
“Hey Gemini, what tree am I looking at?”
(Image credit: Future)
Just like Gemini Live on an Android phone such as the Pixel 9, Google’s AI assistant takes center stage in Project Moohan. The launcher includes two rows of core Google apps – Photos, Chrome, YouTube, Maps, Gmail, and more – with a dedicated icon for Gemini at the top.
You select icons by pressing your thumb and forefinger together, mimicking the Apple Vision Pro’s main control. Once activated, the familiar Gemini Live bottom bar appears. Thanks to the headset’s built-in cameras, Gemini can see what you’re seeing.
In the press lounge at the Shoreline Amphitheater, I looked at a nearby tree and asked, “Hey Gemini, what tree is this?” It quickly identified a type of sycamore and provided a few facts. The whole interaction felt smooth and surprisingly natural.
You can also grant Gemini access to what’s on your screen, turning it into a hands-free controller for the XR experience. I asked it to pull up a map of Asbury Park, New Jersey, then launched into immersive view – effectively dropping into a full 3D rendering akin to Google Earth. Lowering my head gave me a clear view below, and pinching and dragging helped me navigate around.
I jumped to a restaurant in Manhattan, asked Gemini to show interior photos, and followed up by requesting reviews. Gemini responded with relevant YouTube videos of the eatery. It was a compelling multi-step AI demo – and it worked impressively well.
That’s not to say everything was flawless. There were a few slowdowns, but Gemini was easily the highlight of the experience. I came away wanting more time with it.
Hardware impressions
(Image credit: Google)
Though I only wore the headset briefly, it was evident that while it shares some design cues with the Vision Pro, Project Moohan is noticeably lighter – though not as high-end in feel.
After inserting the lenses, I put the headset on like a visor—the screen in front, and the back strap over my head. A dial at the rear let me tighten the fit easily. Pressing the power button on top adjusted the lenses to my eyes automatically, with an internal mechanism that subtly repositioned them within seconds.
From there, I used the main control gesture – rotating my hand and tapping thumb to forefinger – to bring up the launcher. That gesture seems to be the primary interface for now.
Google mentioned eye tracking will be supported, but I didn’t get to try it during this demo. Instead, I used hand tracking to navigate, which, as someone familiar with the Vision Pro, felt slightly unintuitive. I’m glad eye tracking is on the roadmap.
Google also showed off a depth effect for YouTube videos that gave motion elements – like camels running or grass blowing in the wind – a slight 3D feel. However, some visual layering (like mountain peaks floating oddly ahead of clouds) didn’t quite land. The same effect was applied to still images in Google Photos, but these lacked emotional weight unless the photos were personal.
Where Project Moohan stands out
The standout feature so far is the tight Gemini integration. It’s not just a tool for control – it’s an AI-powered lens on the world around you, which makes the device feel genuinely useful and exciting.
Importantly, Project Moohan didn’t feel burdensome to wear. While neither Google nor Samsung has confirmed its weight – and yes, there’s a corded power pack I slipped into my coat pocket – it remained comfortable during my short time with it.
There’s still a lot we need to learn about the final headset. Project Moohan is expected to launch by the end of 2025, but for now, it remains a prototype. Still, if Google gets the pricing right and ensures a strong lineup of apps, games, and content, this could be a compelling debut in the XR space.
Unlike Google’s earlier Android XR glasses prototype, Project Moohan feels far more tangible, with an actual launch window in sight.
I briefly tried those earlier glasses, but they were more like Gemini-on-your-face in a prototype form. Project Moohan feels like it has legs. Let’s just hope it lands at the right price point.
Beaten and restrained by Taitra security guards, I'm hauled back to the MSI booth from whence I came, the laptop I'd tried to spirit away handed back to MSI while members of the North American PR team look at me in stony silence. I lift up my head and meet their eyes, one by one.
"It belongs in a museum!" I yell over the clamor and din of the Computex 2025 showfloor.
One of the reps that I've known for years shouts to be heard: "John, what the hell, man? Have you lost your mind?"
"It belongs in a museum!"
(Image credit: Future / John Loeffler)
OK, so things didn't actually play out anything like that yesterday when I first set eyes on the MSI Prestige 13+ AI Ukiyo-e Edition laptop, but it damn well could have. All that I needed was a means of escape through the packed crowd at the MSI booth, all of whom gawked along with me at what is undoubtedly the most beautiful laptop any of us has ever seen.
The MSI Prestige 13+ AI is already one of the best laptops MSI's put out in recent years, but the one on display at Computex was something entirely different. Splashed across the lid is a hand-lacquered reproduction of The Great Wave off Kanagawa by the Japanese artist and printmaker Hokusai, a master of the ukiyo-e art style that dominated Japan from the 17th to 19th centuries.
(Image credit: Future / John Loeffler)
I'm not as into Japanese art and culture as many of my friends are, a few of whom speak varying degrees of Japanese as a second language and all of whom own pretty much every manga that has been released in the United States (as well as many that they've had to pay extra to order directly from Japanese shops), but I do love ukiyo-e.
I grew up in New York City and spent a lot of time going to the Metropolitan Museum of Art throughout my childhood, and the Met has a rather impressive collection of ukiyo-e prints, including an original print of The Great Wave, first produced in 1831.
Something about the bourgeois market scenes, manor intrigues, and quaint personal moments between friends and lovers that defined the ukiyo-e style resonates with me to this day.
But it was always the depictions of vulnerable humanity in the presence of unassailable natural forces that spoke most strongly to me. And no work of art captures that as well as The Great Wave, with its unstoppable water cresting over a pair of fishing boats, the owners of which are nowhere to be seen. The only proof of their existence is the boats left behind, pilotless and at the mercy of nature.
(Image credit: Future / John Loeffler)
The Prestige 13+ AI Ukiyo-e Edition reproduces this masterful scene thanks to the work of OKADAYA, a Japanese company renowned for its lacquerwork on fine chinaware and pottery.
Similar to how ukiyo-e prints were made in steps and layers back in the day, OKADAYA's process for creating The Great Wave on the Prestige 13+ AI lid involves applying eight thin layers of lacquer by hand, incrementally building up the coloring and texture of the scene before polishing it to a smooth, resilient finish.
(Image credit: Future / John Loeffler)
The process isn't limited to just the lid, either. The keys of the keyboard have also been stepped up to a polished, piano-key-like finish with gold-colored key labels to match the MSI logo on the inside of the device and on the lid, as well as the labels for the device's ports.
(Image credit: Future / John Loeffler)
While the artwork on the device steals the show (and by show, I mean Computex, as the Prestige 13+ AI Ukiyo-e Edition won Computex's Best Choice Award this year), the underlying laptop is still impressive as well, with up to an Intel Lunar Lake SoC, up to 32GB LPDDR5x memory, 1TB PCIe 4.0 SSD storage, and a 13.3-inch 2.8K OLED display.
(Image credit: Future / John Loeffler)
As an Artisan Collection product, the new laptop will have a limited run of 1,000 units, with each getting its production number laser-etched onto the bottom of the device. Given the handcrafting that's gone into these laptops, you can imagine that they won't be cheap, and I wouldn't be surprised if the majority of them have already been purchased before they even made their debut at this year's show.
Still, even if it's not possible to own one yourself (unless you get very lucky), maybe one of the buyers could do their good deed for the year and donate one of these masterpieces to a museum somewhere so we can all enjoy the artistry that's gone into this device.
Having seen it up close and held it myself, I can tell you it wouldn't be out of place among the finest ukiyo-e prints on display at the Met, and it's something I'd happily take the time to go see whenever I'm there.
Headphone maker Skullcandy holds a special place in my heart. It was my go-to brand for wired earbuds when I was a teenager (this was long before the best wireless earbuds dominated the audio market) because they were affordable, available in a range of funky colors and, at the time at least, I thought they sounded great.
When I entered the world of tech journalism in my early 20s, I was exposed to a plethora of new brands and I started earning adult money. This meant I found myself drifting away from the company that I’d long considered ‘budget’ and ‘the headphones to get if you don’t mind them getting damaged’ – instead investing in more premium offerings from the likes of Sennheiser and Bose.
So when Skullcandy came back onto my radar with the announcement of the Method 360 ANC earbuds, proudly stating they’d been designed and tuned in collaboration with none other than Bose, I was hit with a wave of nostalgia – a 20-year gap can count as nostalgic, right?
Not only that, but I also wondered if the maker of my first go-to headphones could reclaim its title and knock my current favorite LG ToneFree T90S – which themselves replaced the Apple AirPods Pro 2 – out of my ears.
The quick answer? They’ve come awfully close, but their large charging case has made my final decision much trickier than I expected.
Sounding sweet
From a sound quality and ANC perspective, Skullcandy’s collaboration with Bose is an absolute hit. Bose has long been a front runner when it comes to audio performance, but is arguably best known for making some of the best noise-cancelling headphones in recent memory.
I’m inclined to believe that Bose has had free rein with the audio and ANC smarts for the Method 360 ANC because, from the moment I put the buds into my ears in the office, everything around me was silenced. Colleagues talking to each other near my desk, the office speaker blaring out questionable songs – it all disappeared.
(Image credit: Future)
My trusted LG earbuds perform similarly, but they require the volume to be increased a little further for similar noise-cancelling effects. And when nothing is playing, the Skullcandy earbuds do a better job of keeping external sounds to a minimum.
It took me a little longer to formulate a definitive opinion on the sound quality, partly because of the design (more on that later) and partly because I’d become so accustomed to the sound of the LG ToneFree T90S.
After inserting and removing both pairs from my ears more times than I can remember, I settled on the notion that Skullcandy’s latest effort sounds more engaging, a little clearer on the vocals and just simply fun.
The increasing intensity of the violins at the start of Massive Attack’s Unfinished Sympathy reveals the buds to be dynamically adept, and they show off their rhythmic talents when playing the deliciously upbeat Bread by Sofi Tukker.
Plus, despite not supporting spatial audio (I have to agree with my colleague Matt Bolton when he says the company “blew the perfectly good name it could've given the next-gen version where it did add this feature”), the earbuds do give songs some sense of space. I was able to confidently place the hi-hat sounds, hums and drum beats in the opening of Hayley Williams’ Simmer around my head, for example.
Overall, I’m very impressed with the sound performance of the Skullcandy buds, especially considering their $129.99 / £99 / AU$189.99 price tag, which places them firmly in affordable territory.
It’s not unreasonable to expect limitations where sound or features are concerned at certain price points, but I think the Method 360 ANC deliver a sound that belies their price tag.
You'll be able to read our full thoughts on the Skullcandy Method 360 ANC in our full review, which is on its way.
The peculiar case of the peculiar case
“If you like how they sound, how come the Skullcandy Method 360 ANC aren’t your new daily pair of earbuds?” I hear you ask. Well, it’s predominantly because of their case, but also a little to do with a design choice inherited from Bose.
When I first saw images of the case following their announcement, I was a little perplexed. All of the wireless earbuds I’m aware of come with a case that can easily be slipped into a pocket – apart from the Beats Powerbeats Pro 2 that is – yet the one supplied by Skullcandy looked enormous.
(Image credit: Future)
Now I’ve unboxed my own pair, I can confirm the case is pretty damn big. Not heavy, just big.
It’s an interesting choice, especially since the earbuds themselves don’t take up that much space. I’m also not sold on the fact you have to slide the internal section of the case out to access the buds.
What’s more, I feel the most logical way to hold the case is horizontally when sliding the earbuds out, with the carabiner clip on the right as it feels more weighted and natural in my right hand.
Doing so reveals the earbud for the right ear, with the left earbud on the other side. That means I have to pick out the right earbud with my left hand, then flip the case over and do the opposite for the left bud.
And what’s even more confusing is the earbuds appear to fit into their charging spots upside down. So not only do I have to pass each bud from the ‘wrong’ hand to the right one, but I also have to flip them around the right way. There are just too many steps involved for what has always been a seamless and convenient process with other earbuds.
What’s also interesting is that, since the Method’s launch, I’ve noticed a second, more affordable pair of earbuds appear on Skullcandy’s website called the Dime Evo. They employ a similar sliding-case design, but both earbuds are on the same side, which I can only assume will make the removal process that little bit easier.
(Image credit: Future)
Based on Skullcandy’s imagery for the Method 360 ANC, it’s targeting a young, cool demographic who walk around with sling bags over their shoulder, upon which they can attach the earbuds via the integrated carabiner clip.
As much as I would love to say I fall into that group, the fact is I don’t – well, not anymore. And because I’m not someone who wants to clip their headphones onto anything, even the belt loop of my pants, the case design is completely lost on me.
I would further argue that the target audience is a little niche, too, which is a shame considering how good I think the earbuds sound. I’m saddened for Skullcandy that not enough pairs of ears are going to get to hear them.
Hey, I did say it was predominantly the case I had issues with.
(Image credit: Future)
As for the aforementioned inherited design trait – that would be the Stability Bands found on the Bose QuietComfort Ultra Earbuds and QuietComfort Earbuds II.
Despite their intention to provide a more stable and secure fit, I initially lacked confidence in their ability to do so. I often found myself wanting to readjust them in my ears to make sure they were locked in, which also meant I pressed the on-ear controls at the same time and paused my music.
It’s not just me who’s had an issue with them – my colleague and self-confessed Bose fangirl, Sharmishta Sarkar, has previously written about her issues with the design too.
I eventually settled on the largest size of Stability Band (which I could only determine by sight, as there’s no indication of which size is which on the included book of spares) and so far, so good. They definitely feel more secure in my ears compared to when I tried other sizes, and passive noise cancellation has also been improved.
However, the design choice has confirmed I get along best with earbud designs that insert further into my ear canal.
Awarding cool points
(Image credit: Future)
I like the Skullcandy Method 360 ANC earbuds. I can’t say I like the design of the case, nor do I like their mouthful of a name (Skullcandy Method would have been just fine in my opinion), but considering the biggest selling point for a pair of earbuds is how they sound, I can find little to fault.
I will most likely use them whenever I’m in the office, as I can leave the case on the desk with the skull logo facing me directly. While I might not feel cool enough to clip the case to my person, that logo alone takes me back to my teenage years. For me, that’s cool enough.
Chuwi, a company better known for budget devices than flagship powerhouses, has unveiled its latest effort to break into the high-performance segment: the GameBook 9955HX.
Promoted as a laptop for coders, gamers, and professional creators, this new model is powered by the AMD Ryzen 9 9955HX processor, a Zen 5-based chip featuring 16 cores and 32 threads, with a boost frequency of up to 5.4GHz. It also includes a large 64MB L3 cache and a configurable TDP that can peak around 55W.
At the time of writing, the device's price remains undisclosed. However, given Chuwi’s history of undercutting bigger brands, it’s reasonable to expect this model to be priced lower than similar offerings from MSI or Asus.
Chuwi GameBook 9955HX
For graphics, the GameBook 9955HX integrates the Nvidia GeForce RTX 5070 Ti Laptop GPU, based on the latest Blackwell RTX architecture, making it well-suited for video editing and graphics-intensive tasks.
The GPU offers 12GB of GDDR7 VRAM, a 140W TGP, and supports features such as full ray tracing, DLSS 4, and Multi Frame Generation.
Chuwi says this setup can deliver up to 191 FPS in 1440p gaming with ray tracing enabled, and 149 FPS at 4K, placing it firmly in the performance laptop category.
For creators working with AI-accelerated tools, advanced 3D rendering, or video post-production, this could prove to be a top contender, provided its cooling system and thermal management are up to the task.
The display is a 16-inch 2.5K IPS panel with a 300Hz refresh rate, 100% sRGB color coverage, and a 16:10 aspect ratio. Peak brightness reaches 500 nits, though claims regarding color accuracy have yet to be verified through independent calibration tests.
Internally, the GameBook comes equipped with 32GB of DDR5 RAM at 5600MHz, upgradeable to 64GB, and a 1TB PCIe 4.0 SSD. Storage expansion is supported via two M.2 slots, one of which supports PCIe 5.0, offering a level of future-proofing not typically seen in Chuwi’s lineup.
Connectivity includes Wi-Fi 6E, Bluetooth 5.2, a 2.5Gb Ethernet port, two USB-C ports (supporting 100W and 140W power delivery), three USB-A 3.2 Gen 1 ports, HDMI 2.1, and Mini DisplayPort 2.1a. There's also a 3.5mm audio jack, DC-in, and a Kensington lock slot.
Other features include a full-sized RGB-backlit keyboard, a 2MP IR webcam with a privacy shutter, a 77.77Wh battery, and stereo speakers. The laptop measures just over 21mm thick and weighs 2.3kg.
The CEO of Dell Technologies has told TechRadar Pro that AI offers a great opportunity for organizations to re-evaluate themselves to positive effect.
Speaking at a media Q&A session at Dell Technologies World 2025, Michael Dell looked to reassure us that AI will never fully replace human workers, and in fact may offer them a whole new outlook.
In a wide-ranging discussion, Dell also laid out his views on political instability affecting the technology industry, and some of his key leadership principles.
"Always some change"“The way I think about this is that if you look at every progress, that’s for any technology, you always have some change that goes on,” Dell said in response to our question about AI affecting or even replacing human workers.
“My way of thinking is there’s probably a 10 percent effect for that - but I think 90 percent of that is actually growth and expansion and opportunity, and ultimately what I think you’re going to see is more opportunities, more economic growth.”
“There are a lot of things that we don’t do, that we used to do, because we have the tools, and we’re more effective as a species because of that - (using AI) is just another example of that.”
“One of the keys beyond productivity and efficiency I think for organizations, is to reimagine themselves, and say, alright, what is the trajectory of these capabilities, where is it going, and what should our activity look like in three years, five years time, given this capability.”
“You know, a lot of roles today just didn’t exist 10, 20, 30 years ago - and no-one was forecasting that.”
(Image credit: Future / Mike Moore)
Having spoken with Nvidia CEO Jensen Huang in his opening keynote, Dell was also asked if the two shared any overarching leadership principles.
“I think anytime there’s a new technology, you have to leap ahead (and think), what is the likely impact of this, and how do we need to change? And if we don’t have a passion around that, or there isn’t a crisis in your organization - make one! We think it can make us a better company.”
Dell was also asked about how changing global economic and political situations might affect the company’s future outlook.
“We agree that those are issues and challenges,” he said, “in my general view, the importance of this technology is greater than all those problems - and I heard somebody say recently, tokens are bigger than tariffs - and that would sort of summarize our view of it.”
“Are all those things helpful to our business? No, they’re not - but there’s a limit of what we can do about that, right? We can certainly do the things we’re supposed to do, and focus on the things we can control - we’re seeing plenty of companies that are dealing with all those challenges just as we are, and powering ahead in any case.”