Another rumor about Nvidia’s RTX 5080 Super has been aired and we’ve got a look at what are supposedly the full specs of this GPU.
As VideoCardz pointed out, leaker Kopite7kimi has posted the claimed specs for the rumored graphics card on X, and that may mean Nvidia has just provided said details to its graphics card making partners (and they leaked from there). Or, it might mean precisely nothing, because as ever, rumors, much like demons, need considerable salting.
GeForce RTX 5080 Super (leaked specs, May 20, 2025): PG147-SKU35 board, GB203-450-A1 GPU, 10752 FP32 (CUDA) cores, 256-bit 24GB GDDR7 at 32Gbps, 400W+.
The key parts of the specifications are that the RTX 5080 Super will supposedly use the same GPU as the RTX 5080, which is the GB203 chip. As the RTX 5080 has already maxed out the cores on that chip, the core count will be the same with the Super version of this graphics card – there’s no room to maneuver to increase it.
The big upgrade comes from the leap from 16GB to 24GB of video RAM (VRAM), and as well as that 50% uplift, the leaker believes Nvidia is going to use faster memory modules here (32Gbps rather than 30Gbps).
We’re also told that the TDP of the RTX 5080 Super is going to sit at 400W, or it might use even more power than that.
Analysis: Crunching the specs and not forgetting about clocks
Looking at those specs, you might think: how is the RTX 5080 Super going to be a tempting upgrade on the vanilla version of the GPU? It has the same CUDA core count and somewhat faster video memory, but only around 7% more VRAM bandwidth than the RTX 5080. So, what gives?
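For the curious, here's the back-of-the-envelope math behind that roughly 7% figure – a minimal sketch assuming the leaked 256-bit bus width and the 30Gbps vs 32Gbps module speeds (rumored numbers, not Nvidia-confirmed):

```python
# Back-of-the-envelope VRAM bandwidth math (leaked/rumored figures, not official).
def vram_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_5080 = vram_bandwidth_gb_s(256, 30)        # 960 GB/s
rtx_5080_super = vram_bandwidth_gb_s(256, 32)  # 1024 GB/s
print(f"RTX 5080:       {rtx_5080:.0f} GB/s")
print(f"RTX 5080 Super: {rtx_5080_super:.0f} GB/s")
print(f"Uplift:         {(rtx_5080_super / rtx_5080 - 1) * 100:.1f}%")  # ~6.7%
```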
Well, don’t forget that added to that VRAM boost, the RTX 5080 Super is expected to have considerably faster clock speeds. Pushing those clocks faster is why this incoming GPU is going to chug more than 400W (perhaps a fair bit more) compared to 360W for the plain RTX 5080.
So, if you’re worried that the RTX 5080 Super may represent an underwhelming prospect in terms of an upgrade over the RTX 5080, don’t be. (Although you may have concerns about your PC’s power supply instead). All this is in line with previous speculation that we’ll see something like a 10% performance boost with the RTX 5080 Super versus the basic version of the GPU, or maybe even slightly more (up towards 15%, even).
Plus, that much bigger 24GB allocation of VRAM is going to make a difference in scenarios where 4K gaming coupled with very high graphics settings gets especially demanding in certain games. (A situation that's only going to get worse as time rolls on, if you're thinking about future-proofing, which should always be something of a consideration).
On top of this is the fact that Nvidia is falling out of favor in the consumer GPU world, with AMD’s RDNA 4 graphics cards making a seriously positive impact on Team Red’s chances – and sales. The latest RX 9060 XT reveal has pretty much gone down a treat, too, so I don’t think Nvidia can risk damaging its standing with PC gamers any further, frankly, by pushing out subpar Super refreshes.
Speaking of refreshes – with the emphasis on the plural – previous rumors have also theorized an RTX 5070 Super graphics card with 18GB of VRAM is on the boil, but that’s notably absent from Kopite7kimi’s post here. That doesn’t mean it isn’t happening, but it could be read as a sign that the RTX 5080 Super is going to arrive first.
Again, previous spinning from the rumor mill indicates a very broad 2025 release timeframe for the RTX 5080 Super, but if the specs really are decided on at this stage – and it’s a huge if – that suggests Nvidia intends to deploy this GPU sooner, rather than later, this year.
Earlier this month we reported that the incoming pair of affordable Samsung earbuds – possibly called the Samsung Galaxy Buds Core and the likely successor to the Galaxy Buds FE – could deliver a much-needed battery boost, with significantly enhanced battery capacity in both of the buds and in the case too. That information came via leaked regulatory filings, and now another leak adds more confirmation.
This time the leaks are from Samsung. As Sammobile reports, support pages for the imminent earbuds are now live on Samsung's portals including the ones in Russia, Turkey and the UAE.
And, by sheer coincidence, the Samsung Galaxy Buds FE appear to be out of stock in most of those markets.
There's some speculation that the new earbuds will more closely resemble the Galaxy Buds 3.
Samsung Galaxy Buds Core: what we know so far
It looks like the battery capacity is up from 60mAh per bud to 100mAh, and from 479mAh to 500mAh for the case. Factor in the expected chipset improvements from newer hardware, and that could mean a significant boost to the buds' playback time. The current Buds FE deliver about six hours with ANC on and nine with it off.
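For context, a quick bit of arithmetic on those leaked capacities (rumored figures, not confirmed by Samsung) shows the per-bud increase is far larger than the case's:

```python
# Percentage increases implied by the leaked capacities (rumored, not official).
def pct_increase(old_mah: float, new_mah: float) -> float:
    return (new_mah - old_mah) / old_mah * 100

print(f"Per bud: +{pct_increase(60, 100):.0f}%")   # ~+67%
print(f"Case:    +{pct_increase(479, 500):.1f}%")  # ~+4.4%
```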
The new model number is SM-R410 (the Galaxy Buds FE were SM-R400) and there is speculation that we'll see a new design, possibly closer visually to the Samsung Galaxy Buds 3; that would make room for those bigger batteries.
Samsung hasn't announced these buds yet, so we don't know pricing or availability, but clearly if support pages are going up then a product launch can't be too far away.
We'd expect the new buds to be priced similarly to the Galaxy Buds FE, subject to tariff-related hikes: those launched at $99 / £99 / AU$149 in 2023.
NPR asked researchers, advocates, tax experts, a parent and a public school leader for their thoughts on this first-of-its-kind national voucher plan. Here's what they said.
We've been ready and waiting for the Samsung tri-fold phone for months now – remember it was officially teased back in January – and as its launch gets closer, there's a new leak hinting at a high price for the foldable.
This comes from well-respected tipster Yogesh Brar, who says we can expect a price tag of around $3,000-$3,500. With a straight currency conversion at today's rates (which Samsung won't use), that's £2,225-£2,595 or AU$4,650-AU$5,425.
However, if you live in a country using any of those currencies, it sounds like you're not going to be able to spend your cash on this device. Brar reckons the handset is launching in "limited quantities", and only in South Korea and China (as previously rumored).
Samsung has previous form for this, because last year's Samsung Galaxy Z Fold Special Edition was also limited to South Korea and China. Perhaps the company isn't sure what the demand for these very expensive foldables would be like globally.
"One more fold. Galaxy Tri-fold all set to launch in Q3 this year. Samsung is only launching it in 2 markets: South Korea & China. Limited quantities with a price between $3,000 - $3,500." – Yogesh Brar, May 21, 2025
That high price isn't much of a surprise of course. As our Samsung Galaxy Z Fold 6 review will tell you, that phone launched at a starting price of $1,899.99 / £1,799 / AU$2,749, and the new model will come with a bigger screen and an extra hinge.
Then there's the Huawei Mate XT, which costs 19,999 yuan in China. That's roughly $2,775 / £2,060 / AU$4,305 at today's conversion rates. These are clearly expensive and difficult to make, and that means high prices and limited production runs.
Since rumors of a Samsung tri-fold first started swirling, we've heard that the handset could be called the Galaxy G Fold, and that it'll share the same hinge technology expected to appear in the upcoming Samsung Galaxy Z Fold 7.
The tri-fold, the Galaxy Z Fold 7, and the Galaxy Z Flip 7 are all expected to be announced at an Unpacked event sometime in July, though on-sale dates may vary. At the same launch, we should also see the new Samsung Galaxy Watch 8.
UK broadband network and infrastructure giant Openreach has committed to rolling out full fibre broadband across the UK more quickly after acknowledging that only 37% of customers are connected to the network.
The news coincides with an undisclosed "increased investment" from BT Group – Openreach's owner.
According to the company, more than 18 million homes and businesses nationwide have benefitted from new infrastructure, including four million in the past year, but with an extra cash injection from BT, it hopes to extend that reach even further and faster.
Openreach wants more homes and businesses to have full fibre
Openreach "now expects to accelerate towards its target of reaching 25 million premises by December 2026," a press release reads, noting how the company's build rate is expected to increase by 20%.
The BT-owned network and infrastructure firm says it's seen record demand over the past year, connecting one customer to its full fibre network every 17 seconds.
"We’re bringing life changing connectivity to all corners of the country, and we’re determined to go further and faster, so we’re proud of the confidence being shown in us through this investment," Openreach CEO Clive Selley said.
That growth is expected to continue into the end of the decade. Openreach envisions 30 million properties being connected to its full fibre network by 2030, adding a further five million after its December 2026 target.
BT recently confirmed a £9.8 million contract to extend its full fibre network to 1,800 "hard-to-reach sites" in Pembrokeshire, Swansea, Neath Port Talbot and Carmarthenshire.
BT Group CEO Allison Kirkby added: "Our new network is helping to grow the economy, create jobs, delight customers and deliver value to our shareholders."
The Nintendo Switch 2 Pro Controller has a bit of an ace up its sleeve, and it relates to the remappable GL/GR buttons found on the rear of the pad.
A spotlight for the Nintendo Switch 2 Pro Controller was featured on the Nintendo Today mobile app (spotted by GamesRadar), showcasing some of the functionality of these extra buttons.
It confirmed that the GL/GR buttons have a couple of fantastic quality-of-life features that are sorely missing from the likes of the DualSense Edge and Xbox Elite Wireless Controller Series 2 - two premium gamepads that also house additional remappable buttons.
With the Nintendo Switch 2 Pro Controller, the major difference is that the GL/GR remappable buttons can be assigned (and reassigned) without backing out of your current play session.
By holding down the Home button, you'll gain access to a 'quick settings' menu, within which you can assign the GL/GR buttons instantaneously. Furthermore, the controller will remember which inputs have been assigned to these buttons on a per-game basis.
This differs greatly from, for example, the DualSense Edge. While Sony's controller has a pair of exceptionally handy Function switches that let you swap button profiles on the fly, said profiles still need setting up in a separate menu on your PS5's dashboard.
For Nintendo Switch 2 games, this makes it incredibly easy not only to quickly assign a secondary input to the GL/GR buttons, but also to test it out immediately to see how it feels in-game.
Quick remappable button assignment, in and of itself, is nothing new. Plenty of the best Nintendo Switch controllers feature button combination macros that let you remap on the fly. The downside here, though, is that this can be quite cumbersome, and you'll often need to dig into a controller's instruction manual to figure out what these macros are.
We're now less than a couple of weeks away from the Nintendo Switch 2's launch on June 5. Be sure to check out TechRadar Gaming around that time, as we'll have plenty of coverage on the console, its hardware, and games in the months to come.
Marvel has delayed the release of Avengers: Doomsday and its sequel.
In a move that won't come as a surprise to many, the comic titan has pushed back the launch dates for Doomsday and its follow-up Avengers: Secret Wars.
The pair had been slated to land in theaters on May 1, 2026 and May 7, 2027. Now, you can expect to see Doomsday release in theaters worldwide seven months later than planned, with Avengers 5 now set to arrive on December 18, 2026 and Secret Wars' launch pushed to December 17, 2027.
The next two Avengers movies are set to be the biggest undertakings in Marvel Studios' history. Per Deadline, sources close to the production of both films say they're among the most ambitious projects that parent company Disney has ever produced, too. To quote Thanos, then, it was inevitable that Marvel would need more time to make both flicks.
Why Avengers 5 and 6's release-date delays are so significant
Marvel hasn't said what impact Doomsday's delayed release will have on its other projects.
Make no mistake, Disney and Marvel have made the right call to delay the release of Doomsday and Secret Wars. The overall response to Marvel Cinematic Universe (MCU) projects since 2019's Avengers: Endgame has been mixed. While some films and Disney+ shows have been critical and commercial successes, others haven't been greeted as enthusiastically or made as much money as Marvel would have hoped.
Disney and Marvel can't afford to fumble the proverbial bag with Doomsday and Secret Wars, especially given the amount of money it'll collectively cost to make them. Add in the talent behind and in front of the camera – Avengers: Doomsday's initial cast alone is 27-deep – and the pressure to deliver two more top-tier Avengers movies is most certainly on.
The release of Spider-Man's next MCU adventure could be pushed back, too.
Their release date postponements also raise other potential issues.
For starters, Doomsday and Secret Wars' delay could have a significant impact on Spider-Man: Brand New Day. The webslinger's next big-screen adventure was set to arrive between the pair, with its initial launch date penciled in for July 24, 2026. Spider-Man 4 suffered its own release setback in February, but its launch was only delayed by a week to July 31, 2026.
The big question now is whether Brand New Day will swing into cinemas on that revised date. Depending on which online rumors you believe, Spider-Man 4 will either be a multiverse-style movie like Spider-Man: No Way Home was, or a more grounded, street-level flick.
If it's the former, and if Brand New Day's plot is dependent on events that occur in, or run parallel to, Avengers: Doomsday, the next Spider-Man movie's launch date will likely have to be pushed back again.
Should Brand New Day be moved into 2027, we could see a repeat of 2024, when only one MCU film – Deadpool & Wolverine – landed in theaters, with 2026's sole Marvel movie being Doomsday. That's on the basis that Avengers 5, aka the second Marvel Phase 6 film, doesn't suffer another release date setback.
Will Marvel decide to move some of its 2025 Disney+ offerings into early 2026?
These delays could have a huge knock-on effect for Marvel's small-screen offerings, too.
If Brand New Day keeps its mid-2026 launch date, a whole year will have passed between the final MCU film of 2025 – The Fantastic Four: First Steps, which arrives on July 25 – and Tom Holland's next outing as Peter Parker's superhero alias. That's not necessarily a bad thing, but it means MCU devotees will look to Disney+, aka one of the world's best streaming services, for their Marvel fix.
Fortunately, Marvel has plenty of TV-based MCU content in the pipeline. From Ironheart's release in late June to Daredevil: Born Again season 2's launch next March, there are currently five live-action and animated series set to debut on Disney's primary streamer.
In light of Doomsday's delay, though, will Marvel tweak its Disney+ lineup and further spread out its small-screen content to fill the void?
Right now, Born Again's second season is the only series confirmed to arrive in 2026. There are other shows in the works that are expected to debut next year, but they aren't likely to be ready until mid- to late 2026. To offset a potentially months-long barren spell in the MCU that Doomsday's delayed release has caused, Marvel might opt to push animated series Eyes of Wakanda or Wonder Man, the final live-action MCU TV show of 2025, into early 2026.
I guess we'll find out more about any further release-schedule changes when Marvel takes to the Hall H stage for its now-annual presentation at San Diego Comic-Con, the 2025 edition of which runs from July 24-27.
Grilling usually involves burning fossil fuel. But some manufacturers are offering electric grills and citing climate change and convenience as reasons to switch.
The U.S. has officially accepted a luxury jetliner from Qatar as a gift, and slated it to become a new Air Force One. Experts say that overhaul could take years and cost hundreds of millions.
Tush pushes, prison breaks, luxury jets and orange cats: This week's quiz is the usual potpourri of the silly and sublime. Actually, not the latter.
Nina Badzin, host of a friendship podcast, explains why staying friends with people from our past matters — and how to nurture relationships with old friends across time and distance.
Loving Day, the landmark case that overturned U.S. state laws against interracial marriage, is on June 12. NPR wants to hear from people who celebrate this day.
Today, artificial intelligence is revolutionizing virtually every industry, but its rapid adoption also comes with a significant challenge: energy consumption.
Data centers are racing to accommodate the surge in AI-driven demand and are consuming significant amounts of electricity to support High-Performance Computing, cloud computing services, and the many digital products and services we rely on every day.
Why are we seeing such a spike in energy use? One reason is heavy reliance on graphics processing unit (GPU) chips, which are much faster and more effective at these processing tasks than standard CPUs. More than just an advantage, this efficiency has now made GPUs the new standard for training and running AI models and workloads.
Yet it also comes at a high cost: soaring energy consumption. Each GPU now requires up to four times more electricity than a standard CPU, a steep increase that is quickly – and dramatically – changing demands for energy in the data center.
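To make that concrete, here's a purely illustrative sketch of how swapping CPUs for GPUs changes a rack's power budget, using assumed wattages in line with the "up to four times" claim above (not vendor specifications):

```python
# Illustrative only: assumed wattages, not vendor specs.
CPU_WATTS = 350                 # assumed per-socket server CPU draw
GPU_WATTS = 4 * CPU_WATTS       # "up to four times more electricity" per GPU

def rack_power_kw(cpu_count: int, gpu_count: int) -> float:
    return (cpu_count * CPU_WATTS + gpu_count * GPU_WATTS) / 1000

print(f"CPU-only rack (40 CPUs):    {rack_power_kw(40, 0):.1f} kW")  # 14.0 kW
print(f"AI rack (8 CPUs, 32 GPUs):  {rack_power_kw(8, 32):.1f} kW")  # 47.6 kW
```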
For example, consider these recent findings:
The New York Times recently described how OpenAI hopes to build five new data centers that would consume more electricity than the three million households in Massachusetts.
According to the Center on Global Energy Policy, GPUs and their servers could make up as much as 27 percent of the planned new generation capacity for 2027 and 14 percent of total commercial energy needs that year.
A Forbes article predicted that Nvidia’s Blackwell chipset will boost power consumption even further – a 300% increase in power consumption across one generation of GPUs, with AI systems increasing power consumption at an even higher rate.
These findings raise important power-related questions: Is AI growth outpacing the ability of utilities to supply the required energy? Are there other energy options data centers should consider? And maybe most importantly, what will data centers’ energy use look like in both the short- and long-term future?
Navigating Power Supply and Demand in the AI Era
Despite growing concerns, AI has not yet surpassed the grid’s capabilities. In fact, some advancements suggest that AI energy consumption could even decrease. Many AI companies expended vast amounts of processing power to train their initial models, but newer players like DeepSeek now claim that their systems operate far more efficiently, requiring less computing power and energy.
However, AI’s sudden rise is only one factor in a perfect storm of energy demands. For example, the larger electrification movement, which has introduced millions of electric vehicles to the grid, and the reshoring of manufacturing to the U.S. are also straining resources. AI adds another layer to this complex equation, raising urgent questions about whether existing utilities can keep pace with demand.
Data centers, as commercial real estate, are also subject to the age-old adage, “location, location, location.” Many power generation sites – especially those harnessing solar and wind – are located in rural parts of the United States, but transmission bottlenecks make it difficult to move that power to urban centers where demand is highest. Thus far, geodiversity and urban demand have not yet driven data centers to these remote areas.
This could soon change. Hyperscalers have already demonstrated their willingness and agility in building data centers in the Arctic Circle to take advantage of natural cooling to reduce energy use and costs. A similar shift may take hold in the U.S., with data center operators eyeing locations in New Mexico, rural Texas, Wyoming, and other rural markets to capitalize on similar benefits.
Exploring Alternative Energy Solutions
As strain on the grid intensifies, alternative energy solutions are gaining traction as a means of ensuring a stable and sustainable power supply.
One promising development is the evolution of battery technology. Aluminum-ion batteries, for example, offer several advantages over lithium-based alternatives. Aluminum is more abundant, sourced from conflict-free regions, and free from the geopolitical challenges associated with lithium and cobalt mining. These batteries also boast a solid-state design, reducing flammability risks, and their higher energy density enables more efficient storage, which helps smooth out fluctuations in energy supply and demand – often visualized as the daily “duck curve.”
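To illustrate the smoothing idea, here's a toy dispatch sketch with made-up hourly figures – charge storage during the midday solar dip, discharge it into the evening ramp. It isn't tied to any real grid data or battery product:

```python
# Toy duck-curve smoothing: hourly net load in MW (made-up numbers), one battery.
hourly_net_load = [60, 55, 50, 45, 40, 45, 55, 70,   # overnight into morning
                   60, 40, 25, 15, 10, 12, 20, 35,   # midday solar dip
                   60, 90, 100, 95, 85, 75, 70, 65]  # evening ramp and peak

stored_mwh, capacity_mwh, max_rate_mw = 0.0, 80.0, 20.0
smoothed = []
for load in hourly_net_load:
    if load < 40 and stored_mwh < capacity_mwh:    # surplus hours: charge
        charge = min(max_rate_mw, capacity_mwh - stored_mwh)
        stored_mwh += charge
        smoothed.append(load + charge)
    elif load > 70 and stored_mwh > 0:             # peak hours: discharge
        discharge = min(max_rate_mw, stored_mwh)
        stored_mwh -= discharge
        smoothed.append(load - discharge)
    else:
        smoothed.append(load)

print("Peak before:", max(hourly_net_load), "MW")  # 100 MW
print("Peak after: ", max(smoothed), "MW")         # 80 MW
```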
Nuclear energy is also re-emerging as a viable solution for long-term, reliable power generation. Advanced small modular reactors (SMRs) offer a scalable, low-carbon alternative that can provide consistent energy without the intermittency of renewables.
However, while test sites are under development, SMRs have yet to begin generating power and may still be five or more years away from large-scale deployment. Public perception remains a key challenge, as strict regulations often require plants to be situated far from populated areas, and the long-term management of nuclear waste continues to be a concern.
Additionally, virtual power plants (VPPs) are revolutionizing the energy landscape by connecting and coordinating thousands of decentralized batteries to function as a unified power source. By optimizing the generation, storage, and distribution of renewable energy, VPPs enhance grid stability and efficiency. Unlike traditional power plants, VPPs do not rely on a single energy source or location, making them inherently more flexible and resilient.
Securing a Sustainable Power Future for AI and Data Centers
While it’s hard to predict what lies ahead for AI and how much more demand we’ll see, the pressure is on to secure reliable, sustainable power, now and into the future.
As the adoption of AI tools accelerates, data centers must proactively seek sustainable and resilient energy solutions. Embracing alternative power sources, modernizing grid infrastructure, and leveraging cutting-edge innovations will be critical in ensuring that the power needs of AI-driven industries can be met – now and in the years to come.
People blame gun violence on different things depending on their political leanings. But Jens Ludwig, an economist at the University of Chicago, has found a different reason behind it. Today, we bring you a story on solutions to gun violence.
Today's StoryCorps is about a love that lasted through the seasons. Patrice Hudson was apprehensive about online dating until she met Byron Ball, a high school science teacher who, like her, was a single parent and had been married before.
For SaaS businesses eyeing a successful exit, particularly when engaging with sophisticated Private Equity (PE) and tech investors, the era of simply showcasing impressive top-line growth is over.
Today, data reigns supreme. It's the bedrock upon which compelling value stories are built, the lens through which operational efficiency and scalability are scrutinized, and ultimately, the key to unlocking those coveted higher valuation multiples.
A robust data strategy, coupled with the ability to extract meaningful insights, is no longer a ‘nice-to-have’ but a fundamental requirement for securing a lucrative exit in today’s competitive landscape.
What investors are looking for
So, what exactly are these discerning investors looking for in the data of a prospective SaaS acquisition? The foundation, without a doubt, remains the ARR bridge, or what can be referred to as the ‘revenue snowball’. This isn't just about presenting a static ARR figure; it’s about demonstrating how that recurring revenue has evolved over time. Investors will dissect this data from every angle – group-wide, segmented by product, customer cohort, and geography.
They want to see the trajectory, understand the drivers of growth and churn, and identify any potential vulnerabilities. Therefore, your ARR bridge needs to be more than just a spreadsheet; it needs to be a dynamic, drillable, and rigorously stress-tested tool that can withstand the intense scrutiny of due diligence.
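As a minimal sketch of what that looks like in practice – with hypothetical figures and field names, not a prescribed schema – an ARR bridge simply reconciles opening ARR to closing ARR through new business, expansion, contraction and churn:

```python
# Hypothetical ARR bridge for one period:
# closing = opening + new business + expansion - contraction - churn.
from dataclasses import dataclass

@dataclass
class ArrBridge:
    opening: float
    new_business: float
    expansion: float
    contraction: float
    churn: float

    @property
    def closing(self) -> float:
        return self.opening + self.new_business + self.expansion - self.contraction - self.churn

    @property
    def net_retention(self) -> float:
        """Net revenue retention of the existing base (excludes new business)."""
        return (self.opening + self.expansion - self.contraction - self.churn) / self.opening

q1 = ArrBridge(opening=10_000_000, new_business=1_200_000,
               expansion=600_000, contraction=150_000, churn=450_000)
print(f"Closing ARR: ${q1.closing:,.0f}")      # $11,200,000
print(f"NRR:         {q1.net_retention:.1%}")  # 100.0%
```

The same structure can then be grouped by product, customer cohort or geography to support the drill-downs investors expect.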
Beyond the ARR bridge, several other key insights are paramount. Sales pipeline reporting provides a crucial forward-looking perspective. Investors want to see a healthy, well-managed pipeline with clearly defined stages, realistic conversion rates, and accurate forecasting. This demonstrates the predictability and sustainability of future revenue growth. Similarly, classic FP&A reports remain essential, offering a historical view of financial performance, profitability trends, and cost management.
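A weighted pipeline is one common way to evidence that predictability; the stage names and win probabilities below are hypothetical:

```python
# Hypothetical weighted pipeline: expected new ARR = sum(deal value * stage win probability).
stage_probability = {"qualified": 0.10, "demo": 0.25, "proposal": 0.50, "negotiation": 0.75}

pipeline = [
    ("Acme Corp", "proposal", 120_000),
    ("Globex", "demo", 80_000),
    ("Initech", "negotiation", 60_000),
    ("Umbrella", "qualified", 200_000),
]

unweighted = sum(value for _, _, value in pipeline)
weighted = sum(value * stage_probability[stage] for _, stage, value in pipeline)
print(f"Unweighted pipeline: ${unweighted:,.0f}")  # $460,000
print(f"Weighted forecast:   ${weighted:,.0f}")    # $145,000
```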
However, some SaaS firms are now also looking to leverage product usage insights to a greater extent than ever before. Understanding how customers are interacting with the platform, identifying power users, and tracking feature adoption provides invaluable insights into customer stickiness, potential for upselling, and overall product value.
Looking ahead
Looking ahead, the role of data in shaping SaaS valuations will only intensify. We anticipate that the level of scrutiny and the expectation for data maturity and insightful analysis will continue to rise. Gone are the days of presenting high-level metric summaries; investors will increasingly demand granular insights and a clear understanding of the ‘why’ behind the numbers. When it comes to performance and trends, just saying profitability has grown by X% year on year is no longer enough – it needs to be evidenced by granular data and solid analytics.
Investors want to know what’s working now and how your company can scale post-acquisition. Providing the context behind the metrics makes it easier to showcase opportunities for further growth, with potential investors able to leverage these data “assets” to underpin their investment cases. With investor expectations this high, those who fail to do so risk undermining their valuation potential or, worse still, failing to secure the deal.
Furthermore, I believe that companies will need to start demonstrating how they are leveraging data to capitalize on the value that advanced analytics can bring. This could range from using AI-powered analytics to identify at-risk customers to employing machine learning to drive new business growth and customer expansion.
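As a rough sketch of the first of those ideas – flagging at-risk customers from usage signals – here's a minimal example on synthetic data with hypothetical feature names; a production model would be trained on real historical churn labels:

```python
# Sketch: flag at-risk customers from product-usage signals (synthetic data,
# hypothetical feature names; a real model needs historical churn labels).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Features per customer: [logins_last_30d, features_used, support_tickets]
X_train = np.array([[30, 12, 1], [25, 10, 0], [2, 1, 5], [4, 2, 3], [28, 9, 2], [1, 1, 4]])
y_train = np.array([0, 0, 1, 1, 0, 1])  # 1 = churned

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

current = {"cust_A": [27, 11, 1], "cust_B": [3, 2, 6]}
risk = model.predict_proba(np.array(list(current.values())))[:, 1]
for name, p in zip(current, risk):
    print(f"{name}: estimated churn risk {p:.0%}")
```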
Even while there may be applications of AI tools in the SaaS space that aren’t necessarily tied to a firm’s data, most of these revenue-driving applications of advanced analytics and machine learning are only possible when the fundamentals are already firmly in place.
Building compelling value
So, how can SaaS firms proactively use data to build a compelling value story that resonates with potential acquirers? It boils down to not just making data a strategic priority but building the data policies, expertise and infrastructure you need into the fabric of your SaaS business.
Not everything has to be in place from day one; rather, you need to create a strategy that will enable you to ramp up to gathering all the critical data points you will need to answer every question an investor will ultimately ask. Doing this also lays the foundations to take advantage of the latest generative AI advances. As mentioned, AI applied to a shaky data foundation is unlikely to get you results, but applied to the right data foundations it can transform the value of your business.
Luckily, the data points that PE firms and other potential investors now really value are the same insights that will make a fundamental improvement to how effectively you make decisions as your SaaS startup scales. The important thing to remember with any data project is to start with the questions you want to answer. This means understanding modern investors. Ask yourself, what metrics, beyond simple revenue figures, will tell the story of your company’s success and potential?
Aside from the core metrics already mentioned, it could be there are further opportunities to demonstrate differentiation. It could be the diversity of your customer base - both geographically and by sector. It could be that the cost of serving an additional customer and the automation of key processes can provide compelling evidence of scalability.
When you have a clear picture of where your real strength and USP exists, the next step is to develop the data collection, management and analysis systems and policies that will prove what you know to investors.
Further down the line
Further down the line, it's likely that there will also be a strong business case for investment in upskilling and retraining staff across the board.
This should include everyone, including all senior teams. Even today, it still surprises me how few founders and business owners can understand and interpret their core business data, instead relying on a handful of experts. After all, it’s impossible to know what you don’t know – and a second-hand account of somebody else’s understanding, no matter how advanced it may be, could never substitute for your own personal analysis.
By building up your own expertise now, you and your senior team will be best positioned to demonstrate a compelling equity narrative that results in the highest possible valuation at the point of exit.