IT decision-makers doubt current data architectures will meet future model inferencing requirements

As companies look to expand their use of artificial intelligence (AI) and machine learning (ML) to keep up with customer demands, they are facing hurdles in getting these projects to production and, ultimately, in delivering the desired results to the bottom line.


According to a study conducted by Forrester Consulting, 88% of AI/ML decision-makers expect the use cases that require these technologies to increase over the next one to two years. The research examined the challenges that keep decision-makers from achieving their desired transformation when deploying ML to build AI applications.

The study revealed that companies are developing more and more models based on real-time data. Still, more than 40% of respondents believe their current data architectures won't meet their future model inferencing requirements.

64% of decision-makers say their firms are developing between 20% and 39% of their models on real-time data from data streams and connected devices. As teams develop more models on real-time data, the need for accuracy and scalability becomes increasingly critical. Notably, 38% of leaders are developing roughly a third of their models on real-time data.

Challenges with model inferencing requirements

  • Nearly half of decision-makers cite reliability (48%) and performance (44%) as their top challenges in deploying models with their current databases. Equally concerning, 41% of respondents believe their databases cannot meet the necessary data security and compliance requirements.
  • To achieve the benefits that AI/ML promise, survey respondents said that locating models in an in-memory database would remove key hurdles currently standing in their way. According to the survey, this architecture would allow firms to prepare data more efficiently (49%), improve analytics efficiency (46%), and keep data safer (46%).

As Forrester Consulting concludes, “AI powered by ML models mustn’t slow down applications by necessitating a network hop to a service and/or microservice for an application to use an ML model and/or get reference data. Most applications, especially transactional applications, can’t afford those precious milliseconds while meeting service-level agreements (SLAs).”
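
To make the latency argument concrete, here is a minimal Python sketch of how an extra network hop to a model-serving microservice eats into a per-request latency budget. All figures are illustrative assumptions for the sake of the example, not numbers from the study.

```python
# Illustrative sketch: cost of a network hop for ML inference vs. scoring
# the model where the data already lives. Figures below are assumptions.
NETWORK_ROUND_TRIP_MS = 2.0   # assumed LAN round trip to a model microservice
INFERENCE_MS = 0.5            # assumed time to score the model itself
SLA_MS = 5.0                  # assumed per-request latency budget (SLA)

def remote_inference_latency() -> float:
    """Application calls out to a separate model-serving service."""
    return NETWORK_ROUND_TRIP_MS + INFERENCE_MS

def colocated_inference_latency() -> float:
    """Model is scored alongside the data; no extra hop is paid."""
    return INFERENCE_MS

if __name__ == "__main__":
    for name, latency in [("remote", remote_inference_latency()),
                          ("co-located", colocated_inference_latency())]:
        headroom = SLA_MS - latency
        print(f"{name:11s}: {latency:.1f} ms per call, "
              f"{headroom:.1f} ms left in a {SLA_MS:.0f} ms budget")
```

Under these assumed numbers, the remote call spends roughly half the budget on the hop alone before any application logic runs, which is the trade-off the quote above describes for transactional workloads.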

“Companies are embracing AI/ML to deliver more value for their mission-critical applications, yet need a modern AI/ML infrastructure to support real-time serving and continuous training. There are still gaps that impede companies from making existing applications smarter and delivering new applications,” said Taimur Rashid, Chief Business Development Officer at Redis Labs.
