The time for talk is over. After two years of exploring the potential use cases, growing numbers of organizations are beginning to adopt generative AI (GenAI) to drive tangible business value. Gartner reports that investment in these technologies will continue to rise in the coming months — driving global IT spend to almost USD 6 trillion in the next year.
CIOs are keen to progress beyond the proof-of-concept stage and start putting GenAI to work. Although exciting new capabilities and use cases are emerging on a daily basis, GenAI needs to be built on firm foundations to deliver results. The teams charged with coming up with ideas on how GenAI can be used – and the leaders signing off on their investments of time and money – need a solid understanding of how it works. First and foremost, however, they need to focus on making sure they have the data required to fuel the successful adoption of GenAI tools.
Covering the bases

From Microsoft leadership teams to US courtrooms, experts are sounding the alarm: with AI, ‘garbage in = garbage out’. If they fail to heed these warnings, organizations will not unlock the benefits they are expecting. Before investing time and money into adopting new use cases for GenAI, organizations need to get the data in place to enable it to succeed. Specifically, they need to cover four main bases:
1) Modernize existing data
First, organizations need to transform the existing data sets that will be used to train models and drive insights. They need to map and analyze their current data to understand the existing landscape, then use a mix of data warehousing and data lakes to lay the foundations for a robust architecture. They also need to consider data aggregation, storage, and retrieval requirements to ensure they can conduct analytics in real time. Data modernization projects can take years, but there is no time to waste – they must be completed in a matter of months.
2) Identify and ingest new sources of quality data
Next, they need to enrich existing data with external insights to add crucial holistic context to supercharge AI. To date, ingesting external data sets has been a time-consuming process, but cloud-based Extract, Load, Transform (ELT) solutions can automatically create pipelines. This enables organizations to quickly bring in reliable data sets that can put them on the path to unlocking deeper insights to fuel their AI use cases.
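The ELT pattern described above – land the raw external data first, transform it afterwards – can be sketched in a few lines. This is a minimal illustration with a stubbed external source; the function names (`extract`, `load_raw`, `transform`) and the sample payload are hypothetical, not any real cloud API:

```python
# Minimal ELT sketch: extract an external data set, land it raw, then
# transform it for use alongside existing records.

import json

def extract() -> list[dict]:
    # In practice this would call an external API; here we stub the payload.
    return [
        {"country": "US", "gdp_growth": "2.5"},
        {"country": "DE", "gdp_growth": "0.3"},
    ]

def load_raw(rows: list[dict]) -> str:
    # ELT lands raw data first, so the original payload stays recoverable.
    return json.dumps(rows)

def transform(raw: str) -> dict[str, float]:
    # Transformation happens after loading: cast types, key by country code.
    return {r["country"]: float(r["gdp_growth"]) for r in json.loads(raw)}

enriched = transform(load_raw(extract()))
print(enriched)  # {'US': 2.5, 'DE': 0.3}
```

Because the raw payload is preserved before any transformation, a bad transform can always be re-run without re-ingesting the source.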
3) Proactively remove any bias
Next, organizations need to review the entire data landscape to ensure it is clean. They need to be certain their data can be trusted to inform their AI, driving it to make the right decisions. It’s crucial that they identify and remove any unintended biases that might emerge if they feed this data into their AI. By stepping back to consider the potential biases that could arise in their AI use cases before deploying them, they can head off these problems before they arise.
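As a concrete illustration of that pre-deployment review, a bias check can start with something as simple as measuring group representation in the training data. This is a minimal sketch; the field name, sample rows, and 25% threshold are all illustrative assumptions, not a complete fairness audit:

```python
# Flag groups that are badly under-represented in a training data set.

from collections import Counter

def representation_gaps(records, field, min_share=0.25):
    """Return {group: share} for groups below min_share of the data."""
    counts = Counter(r[field] for r in records)
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items() if n / total < min_share}

training_rows = [
    {"region": "north"}, {"region": "north"},
    {"region": "north"}, {"region": "north"},
    {"region": "south"},
]
flagged = representation_gaps(training_rows, "region")
print(flagged)  # {'south': 0.2}
```

A group flagged here would prompt either collecting more data for it or weighting it appropriately before any model sees the set.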
4) Ensure visibility to underpin data quality and governance
Finally, organizations must eliminate silos, unifying data with end-to-end visibility to create a single source of truth. AI will not be reliable and accurate if fed with conflicting data – so they must be able to identify confusing conflicts and remove them. Data evolves over time, which means it is important to maintain visibility over who has changed or added data, and why. This traceability will help identify and overcome potential mistakes – for example, if synthetic training data has been accidentally used for real-world decision-making.
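One way to picture this traceability is a record that carries its own change history and a flag separating synthetic from real data. The schema below is purely illustrative – a sketch of the idea, not a reference to any particular governance tool:

```python
# Sketch of data traceability: each record tracks who changed it and why,
# and synthetic rows are labelled so they never drive real-world decisions.

from dataclasses import dataclass, field

@dataclass
class Record:
    value: float
    synthetic: bool = False
    history: list = field(default_factory=list)

    def update(self, value, author, reason):
        # Keep the old value plus who changed it and why.
        self.history.append((self.value, author, reason))
        self.value = value

def decision_ready(records):
    # Exclude synthetic training data from real-world decision-making.
    return [r for r in records if not r.synthetic]

rows = [Record(10.0), Record(99.0, synthetic=True)]
rows[0].update(12.0, author="etl-bot", reason="currency restatement")
print(len(decision_ready(rows)))  # 1
```

With this kind of audit trail, a mistaken change can be traced back to its author and reason, and synthetic rows can be filtered out mechanically rather than by memory.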
Increasing AI literacy to capitalize on the opportunity

This data provides the raw materials, but it needs to be used in the right way to drive GenAI success. Building knowledge across the business will enable teams to identify use cases that can really generate value. Multiple departments could potentially benefit from GenAI in different ways, so it’s crucial to start with a clear vision and objective in mind. Organizations that invest budget and hours in training will likely be rewarded with use cases that enable them to confidently deploy GenAI in ways that unlock the fastest ROI.
To enable this, leadership teams must also have a solid level of AI literacy and data literacy. Business leaders need to understand how traditional and GenAI models work, and how underlying data and training can influence the inferences these models present. This will give them a deeper appreciation of the recommendations coming out of an AI-based solution in the context of the business use case, putting them in a much better position to accept or decline such recommendations. This is the whole point of the “human in the loop”, which is a key factor in the success and acceptance of AI-based solutions.
Building on the foundations for successful adoption

By laying solid data foundations, empowering teams to uncover use cases and ensuring leaders can green-light the right projects, organizations will be on the path to successful GenAI adoption. The opportunity is very exciting, and evolving at a rapid pace, so there is no time to lose. CIOs just need to balance the need for speed with a firm focus on making sure no corners are cut. Taking time to lay solid foundations will put them on course for successful GenAI adoption that will unlock value and benefit many different teams across the business.
This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
The growing popularity of Macs and MacBooks in enterprises can in part be attributed to their “secure by design” reputation. And generally, macOS is considered a safe platform, a view widely shared across the tech community.
Although macOS is widely perceived as more secure than Windows, 2024 revealed a worrying trend – a notable increase in Mac-targeted threats. From infostealers like Amos Atomic and Poseidon to advanced nation-state campaigns like BeaverTail and RustBucket, threat actors are exploiting macOS design elements to compromise corporate environments.
An over-reliance on the security mechanisms built into macOS can leave organizations vulnerable to attacks, so it’s key to recognize these risks and understand how to mitigate them effectively.
The Rise of macOS crimeware

There is a growing concern about the presence of malware on macOS, a problem that was relatively minor ten years ago. One contributing factor is the increased prevalence of Macs in business environments – a significant shift since the late 2010s that has made them more attractive to attackers.
Threat actors have realized there is money to be made from Mac users. As a result, cybercriminals are increasingly targeting them, recognizing the value of these devices for conducting malicious activities.
Additionally, there are more targeted attacks in business environments. Beyond general attacks, Mac users in business environments face targeted attacks from sophisticated threat actors who aim to steal sensitive company data or disrupt operations.
Today, there are more threats to Macs than ever before, but awareness of these threats remains low. In contrast, most Windows users are generally aware of the need for the best antivirus software. However, Mac users often believe their devices are safe by design, a misconception that needs to be reconsidered given the current threat landscape.
Mac myth-busting

While the myth that “Macs don’t get malware” has been thoroughly debunked, a lingering perception persists that macOS is inherently safer than other OSes. This belief stems from comparisons to Windows, which faces a staggering volume of malware, but it doesn’t mean that threat actors aren’t actively targeting Macs, too.
2024 saw a significant uptick in macOS-focused crimeware. Infostealers-as-a-service, such as Amos Atomic, Banshee Stealer, Cuckoo Stealer, Poseidon and others, represent a significant portion of these threats. These tools are designed for quick, opportunistic attacks, aiming to steal credentials, financial data, and other sensitive information in one fell swoop.
Amos Atomic, which reportedly began as a ChatGPT project in April 2023, has quickly evolved into one of the most prominent Malware-as-a-Service (MaaS) platforms targeting Mac users. Initially a standalone offering, Amos Atomic has splintered into multiple variants, including Banshee, Cthulu, Poseidon, and RodrigoStealer. These versions are now developed and marketed by competing crimeware groups, spreading rapidly and affecting businesses throughout 2024.
What sets this malware family apart is its shift in distribution tactics. Instead of focusing on cracked games or user productivity apps, it now spoofs a wide range of enterprise applications, significantly broadening its reach and posing a greater threat to corporate environments.
Safe – or unsafe – by design?

For convenience, Apple designed Macs so that a single password could be used to unlock the device and allow administrator functions. This means that by default, the same password is used for logging in, installing software, and unlocking the Keychain – the database built into macOS that stores other passwords, including online credentials saved in the browser, application certificates, and more.
In addition, a built-in AppleScript mechanism makes it easy for attackers to fake a legitimate-looking password dialog box. Malware that successfully spoofs a password dialog box to install a fake program is then able to access all the sensitive data stored in the Keychain.
This straightforward yet effective approach is widely adopted by the rash of infostealers currently plaguing macOS business and home users. Given how deeply these features are integrated into the system itself, this technique is unlikely to be mitigated by Apple any time soon.
Advanced adversaries: Staying hidden in plain sight

In contrast to the quick-hit tactics of smash-and-grab infostealers, advanced adversaries such as nation-state actors aim to persist on the device over time. Their goal is to maintain long-term access to compromised devices, often for espionage or other high-value objectives. With Apple introducing user notifications for background login items in macOS Ventura, attackers have adapted by exploring new ways to remain undetected.
Common techniques include trojanizing software, which consists of compromising popular or frequently used applications to ensure the malicious code runs regularly. This can involve infecting development environments such as Visual Studio and Xcode with malicious payloads.
Threat actors are also leveraging Unix components, exploiting overlooked command-line elements like the zsh environment files (“.zshenv” and “.zshrc”), which execute whenever the user opens a new terminal session, granting the attacker persistent access to the system.
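A defensive illustration of that last point is auditing zsh startup files for lines that look like persistence mechanisms. The patterns below are illustrative assumptions – a starting point for a review, far from a complete detection rule set:

```python
# Flag suspicious lines in zsh startup files (.zshenv / .zshrc).

import re

SUSPICIOUS = [
    re.compile(r"curl\s+.*\|\s*(ba)?sh"),  # pipe-to-shell download
    re.compile(r"/tmp/[\w.]+\s*&"),        # background job launched from /tmp
]

def audit_zsh_profile(text: str) -> list[str]:
    """Return the lines of a startup file that match a suspicious pattern."""
    return [
        line for line in text.splitlines()
        if any(p.search(line) for p in SUSPICIOUS)
    ]

sample = "export PATH=$PATH:/usr/local/bin\n/tmp/.update &\n"
print(audit_zsh_profile(sample))  # ['/tmp/.update &']
```

In a real deployment, checks like this would run across managed fleets via EDR or MDM tooling rather than as a standalone script.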
Such tactics underscore the importance of scrutinizing trusted applications, development tools, and the underlying command line environment.
Defensive strategies for organizations

To protect against the rising tide of macOS threats, organizations should implement proactive and comprehensive security measures.
The perception that macOS is inherently more secure can create a dangerous blind spot for organizations. Macs are not necessarily more “secure by design” than any other computing platform, and the evidence from 2024 demonstrates that threat actors are increasingly targeting them.
Organizations must treat macOS as a primary target in their security strategy, adopting a layered defense approach and educating users about the risks.
By recognizing and addressing these vulnerabilities, organizations can mitigate the risks of betting too heavily on macOS security – and avoid becoming sitting ducks for the next wave of attacks.
A little-known device maker is looking to address the growing concerns about smartphone surveillance as modern devices collect and share extensive user data to build digital profiles.
BraX is working to launch BraX3, a business smartphone designed for those who value their privacy above all else - it runs iodéOS, a de-Googled, open source Android 14-based alternative operating system that blocks ads, trackers, and unwanted data sharing.
Instead, the BraX3 uses dedicated privacy-focused servers for essential services, providing safe browsing with privacy-first search engines like Qwant, Brave, and Ecosia.
The most privacy-friendly smartphone yet?

The BraX3 also employs Lunar Network for geolocation, blending GPS for outdoor navigation and a secure indoor network-based location service. With no Google identity required, users can enjoy a high degree of anonymity.
It also includes an internet traffic analysis tool, which visualizes who is accessing your data, how much is being collected, and where it is sent. By restricting intrusive applications and ensuring only privacy-compliant apps are used, BraX3 minimizes data exposure without sacrificing functionality.
As for hardware, the BraX3's modular design allows users to replace parts using standard tools, with spare parts available for six years post-launch.
It offers a 6.56-inch HD+ display with a 90Hz refresh rate and a 280dpi pixel density. Under the hood, it boasts the Dimensity 6300 processor (octa-core, 2.4GHz, 6nm), paired with 8GB of RAM and 256GB of storage. A 5,000mAh battery with 10W charging ensures lasting performance.
It comes with a 50MP camera on the rear as well as a 5MP front camera for selfies. This device features a fingerprint sensor, NFC, Bluetooth 5.2 and multiple 4G and 5G bands.
Crucially, it also supports an eSIM for international travel, dual SIM, and a MicroSD slot.
While the porting process may face delays, BraX hopes that power users will have the option to run Ubuntu Touch, offering an independent app store and Terminal access.
The BraX3 is available for pre-order for $299.00 via crowdfunding platform Indiegogo. With 2,792 backers at press time, this alternative business smartphone signals a rising demand for tech that prioritizes privacy and the right-to-repair.
Although the likes of Pure Storage, IBM, and Meta believe the writing is on the wall for hard drives, the technology doesn’t look like it will be going away any time soon.
Seagate and its main rival Western Digital are working on magnetic recording methods that will allow the drives to continue increasing in capacity, helping them maintain a clear advantage over SSDs when it comes to storage density.
The main technology leading this charge is HAMR, or heat-assisted magnetic recording, which could see HDDs hitting incredible 100TB capacities. HAMR works by briefly heating the disk surface with a laser to make it easier to write data at higher densities. HDMR - short for heated dot magnetic recording - is HAMR’s likely successor and could lead to even larger drives by focusing the heat and magnetic energy into smaller, more precise areas for even denser data storage.
Not an unreasonable outlay

In a recent Wall Street Journal article covering Seagate’s “fight to store the world’s data”, John Keilman mentioned something which caught my attention: “Seagate said two large cloud-computing customers have each ordered one exabyte’s worth of HAMR storage, which works out to tens of thousands of hard drives.”
Keilman didn’t name names - Seagate wouldn’t have told him who the buyers were - but we can narrow the list of suspects down to the usual big US hyperscalers, including Apple, Oracle, Microsoft, Google, Amazon, and Meta. It’s possible that Chinese hyperscalers could have come shopping for the drives, but that seems unlikely to me.
Keilman doesn’t say what capacity drives were sold, but we can assume they will have been Seagate’s highest-capacity commercial HDD, the Exos M, which ranges from 30TB (CMR) to 36TB (SMR), with a breakthrough 3TB-per-platter density. Based on timing, it’s likely we’re talking about the 30TB models, as the 32TB drive was only added to the range in December 2024, followed by the 36TB model just a month later.
Assuming the hyperscalers in question paid bulk pricing of around $500 per drive (refurbished models of Seagate's Exos 28TB HDD can currently be purchased for as low as $365), their combined bill likely came to somewhere between $33 and $35 million. For a full exabyte of cutting-edge, high-capacity storage, $16 million or so isn't an unreasonable outlay.
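Those figures are easy to sanity-check. The sketch below assumes 30TB drives, decimal exabytes (1EB = 1,000,000TB), and the roughly $500 bulk price quoted above:

```python
# Back-of-the-envelope check: drives per exabyte at 30TB each, and the
# combined bill for two one-exabyte orders at ~$500 per drive.

DRIVE_TB = 30
PRICE_USD = 500
EXABYTE_TB = 1_000_000  # decimal: 1 EB = 1,000,000 TB

drives_per_eb = EXABYTE_TB // DRIVE_TB    # ~33,333 drives per customer
cost_per_eb = drives_per_eb * PRICE_USD   # ~$16.7 million per exabyte
combined = 2 * cost_per_eb                # two customers: ~$33 million

print(drives_per_eb, cost_per_eb, combined)
```

That "tens of thousands of hard drives" per customer lines up neatly with the WSJ's description.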
Seagate previously revealed that a 60TB drive was on its way, and the firm recently announced plans to acquire Intevac, a HAMR specialist, which could help it achieve that 100TB capacity goal faster, as well as ramp up HAMR drive production.
If you've read my previous articles, you should know that PC is my preferred option for gaming. Whether it's with handheld gaming PCs or a full-fledged desktop setup, I believe it offers the best gaming experiences possible with better performance and advanced graphics options.
There's also a much greater level of freedom PC players have over console players: lower game prices thanks to digital marketplaces, free multiplayer online access (which shouldn't even be a debate), and modding capabilities all play an integral part. I can't deny that PC gaming isn't cheap, but there also isn't much of an argument when looking at the likes of the PS5 Pro and its $699.99 / £699.99 / AU$1,199 price - as I've said before, you may as well start saving for a PC build at that price point.
However, my stance on that has changed at least for the time being - not because I think consoles are more powerful than most PCs, no - but because the current landscape of the GPU market is an absolute mess. Nvidia and AMD both launched new GPUs: the Blackwell RTX 5000 series and Radeon RX 9000 series respectively, and getting your hands on any of these graphics cards at MSRP (or even at all) is one heck of a mission.
From scalpers to retailers, you'll more than likely find yourself overpaying for a new mid-range or high-end GPU. Now it's worth mentioning that Intel is also in the mix with its Battlemage Arc B570 and B580 GPUs, but it's got some catching up to do against Team Green and Team Red in terms of performance capabilities and its XeSS upscaling method.
Ultimately, it means that if there was any opportunity for a large amount of console-only gamers (or even new gamers entirely) to join the PC platform, that chance is nearly dead in the water - and I don't see it getting better anytime soon.
GPUs are far too expensive and it's completely unreasonable

It's important I note that I absolutely don't expect powerful graphics cards to be cheap, especially considering the advancement of tech and power capabilities we've seen over the years from Nvidia and AMD. Game developers are now able to provide exceptional and immersive gaming experiences with hyper-realistic visuals, thanks to the power provided by GPUs like the RTX 5090.
This is even possible with midrange cards at high resolutions, with the help of upscaling tools like DLSS and FSR. What I do expect, however, is for hardware to be affordable, especially with less powerful products - and unfortunately, that's the complete opposite of what we're seeing.
Examples of this are evident with the RTX 5090 and RTX 5080: both of these Blackwell GPUs are high-end offerings, priced at $1,999 / £1,939 / AU$4,039 and $999 / £999 / AU$2,019 respectively, and will give gamers the best performance possible this generation. Those prices are arguably too high, particularly when the performance leap over the previous flagship, the RTX 4090, is significant but perhaps not enough to justify paying another $400 over its $1,599 MSRP.
When you add scalpers, low stock, and hardware issues into the equation, it makes matters worse – and we've seen board partners selling the GPUs at inflated prices. It means either you won't find a GPU to buy at all, or if you do, you'll more than likely be overpaying.
You might be thinking it's best to just buy a midrange GPU, but the exact same thing is happening there too. AMD's Radeon RX 9070 series GPU prices have seen a sudden hike - so instead of paying $599 / £569 / around AU$944 for the RX 9070 XT, you'll be paying much more.
It's also worth noting that PC games are released with bad optimization - so even if you do manage to find a powerhouse GPU without overpaying, you'll have to deal with bad performance and game-breaking bugs.
If I was a console-only player, I would stay away too...

As a gamer who is on both PC and console, I can totally understand why most console players are hesitant when advised to build gaming PCs. Yes, I still think PC is the better platform and there's plenty of freedom to be had with your gaming experience - but if you can't even acquire the right hardware at affordable prices, what choice do you have but to stick with a console?
Again, you may end up building your desired gaming PC and then still be met with frustrating performance problems. While performance may not be as good on console, it's a manageable experience with stable frame rates that come without the need to tinker or mess around with settings.
I believe it's a big part of why handheld gaming options like the Steam Deck are so popular: gamers can simply select a game they want to play (with the help of Deck Verified) and dive in. It's not like that isn't the case on Windows PCs, but it's annoying dealing with regular performance drops due to bad optimization, along with Windows 11 and its game incompatibility problems.
Gamers just want to be able to purchase the right hardware and get their money's worth while gaming - and if PCs can't provide that right now because of GPU price inflation, then I can't blame anyone for sticking with a PS5 or Xbox Series X|S.
If the rumors we've heard so far turn out to be accurate, we could be seeing the official launch of the Google Pixel 9a as early as next week – and fresh benchmarks that have appeared online give us some idea of the kind of performance we can expect from it.
These benchmarks come from tipster @KaroulSahil (via Notebookcheck), and are presumably from a device that's being tested somewhere, ahead of the full reveal. The stats include an AnTuTu score of 1,049,844, and Geekbench scores of 1,530 (single-core) and 3,344 (multi-core).
While that AnTuTu score is along the same lines as the existing Google Pixel 9 phones – which you would expect, given that the Pixel 9a is predicted to be running the same Tensor G4 processor inside – the Geekbench scores are some way short of the flagship phones that Google unveiled last August.
There could be a few reasons for this, with the primary one most likely to be that this is a Pixel 9a running pre-launch software that hasn't been properly optimized yet. There might be a few hardware tweaks that still need to be made too.
The price is right?

Google Pixel 9a benchmark results, shared on X on March 15, 2025.
Given the history of this mid-range phone series – see our Google Pixel 8a review, for example – it's unlikely that we're going to be too surprised by what the Pixel 9a has to offer in terms of performance, when it finally shows up.
Typically with these phones, the internal specs have been comparable to the flagship models that came before them, while cost savings have been made in the design and materials. That makes them a more affordable choice if you don't want the most expensive Pixel phones Google has to offer.
As always, pricing is going to be crucial. The Pixel 8a launched for a starting price of $499 / £499 / AU$849, and it looks as though the 128GB model of the Google Pixel 9a is going to match that. However, we have also heard that the variant with 256GB of storage is going to cost a little more than its predecessor.
It seems there's a surprising design decision on the way that we're going to have to come to terms with: Google is apparently getting rid of the classic Pixel camera bump, so the Pixel 9a will have a flatter back than the phones that came before it.