Everyone’s racing to optimize their business with AI.
Turns out, deploying it is much more complicated than companies anticipated.
What’s happening?
S&P Global’s 2023 AI report found that 69% of surveyed organizations have at least one AI project in production.
Most organizations believe AI and machine learning projects will drive up revenue by improving the quality of their products and services, boosting customer satisfaction, accelerating innovation – and more.
Meanwhile, a third of respondents are leveraging AI instead for its potential to lower costs by identifying and resolving operational and IT inefficiencies.
But here’s the kicker: Though the consensus is that AI implementation holds great promise, most organizations aren’t set up to deliver on it.
The number one culprit, according to the report, is data management.
Let’s break this down: AI relies on large volumes of data. Companies have that data; they just have it scattered across multiple formats – from structured data to real-time data to semi-structured IT system data to unstructured rich media data.
With disparate data sets, standardization becomes a major challenge. Many orgs are also struggling with insufficient IT infrastructure and data governance, making the road to AI deployment that much harder.
When you look beyond data, companies are citing compute performance (i.e. how quickly and efficiently a processor can perform calculations and execute tasks) and security as key blockers to their AI efforts.
And that doesn’t even cover deployment itself: transitioning to production environments designed for large datasets brings its own set of challenges.
Wait, There’s More
With AI comes environmental concerns.
Stanford’s AI Index Report found that training GPT-3 released over 500 tonnes of CO2 emissions. To put that in perspective, the average human emits 5.51 tonnes in a year – so one training run produced roughly 90 people’s worth of annual emissions.
For companies with sustainability goals, the question is, can they balance them with their need for innovation? A recent experiment by Google’s DeepMind team suggests they can.
At two experiment sites, the team achieved energy savings of 9% and 13%, respectively, by using AI to control commercial cooling systems.
When applied to AI deployment, the report suggests that companies with modern data architecture will be able to curb negative impacts on sustainability performance.
In other words, though training AI is harmful for the environment, companies leveraging the right infrastructure will be able to limit this effect.
What about talent?
On one hand is data. On the other is the workforce.
In 2022, the MIT Technology Review Insights survey found that when it comes to deploying AI, the greatest challenge for companies with between $1 billion and $500 billion in revenue is hiring skilled workers.
It’s not enough to have AI and machine learning experts; organizations also need translators who can turn business requirements into technical solutions.
As for employees in non-AI roles, businesses need to engage them too, as their input will be valuable in scaling these projects.
“When you deploy machine learning, it’s not just about getting technical trust in the system as an engineer, it’s also about the people who use them actually trusting it as well,” said Aleksander Mądry, director of the MIT Center for Deployable Machine Learning. “This involves guidance and empowerment so that they can confirm the predictions, or intervene when they think something is wrong.”
TL;DR
In the race to deploy AI, companies need to get two things right.
First, make data the priority – not just having it, but building the infrastructure to collect and standardize it. Second, invest in talent.
Thoughtful investments in both are what will separate successful companies from the rest.