Tintri’s 2024 predictions

Author : Phil Trickovic, Tintri

02 February 2024

Phil Trickovic, Senior Vice President of Revenue, Tintri

Tintri, specialist in data management solutions for virtualised workloads, shares its predictions for 2024.

1. The flexibility and segmentation of the AI stack will be driven by virtual machines 

2024 will see the first new ‘killer’ generative AI and inference-based critical applications at enterprise scale. We have spent more than a decade training neural networks to perform complex tasks, with training modules typically developed and executed outside of production application environments, most often under non-production SLAs. In 2024, private AI will unlock enterprise data while protecting corporate IP from leaking.

Protecting corporate IP is one of the top three gating factors to broader adoption of AI within the enterprise.

Where will deployed apps live? Virtual machines. VMs will be critical to accelerating the deployment phase of AI inference workloads. Once models are trained, the resulting applications will reside on a VM topology, whilst the ‘big iron’ finds its home continually training the models.

Mark Walsh, Tintri’s VP EMEA, adds: “In Europe, we are seeing the adoption of VMs for production roll-outs, sandbox, and CI/CD workloads. A VM-inclusive topology allows our customers’ data scientists to interact and deploy within traditional production application roll-outs. We have seen massive improvements to our customers’ workflows. All signs point to a big ramp-up in 2024.”

2. Inference takes flight in 2024

Highly-trained inference applications will flood the market in 2024. These innovations require optimised platforms with complex distribution networks. Properly architected, they can realise order-of-magnitude gains in efficiency, and the long-term benefit to the bottom line is undeniable.

To deliver these business-critical applications, we will see a new evolution of application delivery platforms and progressive availability schemas. 2024 will bring an explosion in platform requirements not witnessed since the late 1990s. 

Brock Mowry, Tintri CTO & VP of Products, notes: “In 2024, the platform demand for generative AI applications will drive a new version of the ‘enterprise-ready’ platform. Enterprise IT platforms will need to provide quick and powerful instances to support production, development, and playground environments. As a consequence, these instances will become the new source of workload sprawl within the enterprise. Successfully avoiding such sprawl will dictate the business winners.”

Our world is written in the key of ‘do more with less’, and that is no less true in AI. Inference applications are resource intensive, so the paramount consideration is how businesses will compete when success demands immense, high-priced systems.

Today, AI processes and application management require highly-specialised operators, complicated networking, co-compute capacity, and complex storage features. To that end, we anticipate 2024 to be a year of extensive evolution in current application and data delivery initiatives.

3. Controlling the cost of production application management in 2024

We are staring into an infinitely-expanding abyss of new and quite complex requirements. These innovative workflows will demand new I/O paths, massively-distributed compute methodologies, scaled-up bandwidth, and enormous storage management capacity. That is before we get to the actual application deployment. Sometimes the abyss stares back… Indeed.

Microservice architectures are by nature highly efficient when deployed on purpose-architected platforms. Market beware: on the wrong platform, the converse is true, and it can be a progress killer on your journey. Scope early, build wisely.

Brock Mowry, Tintri CTO & VP of Products, notes: “Highly-autonomous, evolved container platforms will play a crucial role in AI infrastructure in 2024, allowing for increased efficiency and flexibility and reducing application TCO. However, the complexity of container infrastructure will pose a significant challenge for many enterprise IT teams. To bridge this knowledge gap, AI platform vendors will need to invest in user-friendly solutions and educational resources.”

“In 2024, the ability to manage and deliver resource granularity will be paramount. Monolithic, legacy-based platforms will be replaced by ‘workload’-architected public and private AI platform solutions. Massive efficiencies compared with legacy platforms are to be gained by employing this strategy.”

4. The evolution of security in 2024

As the availability of AI increases, so will the number of bad actors and, more importantly, the capabilities they have at hand.

Brock Mowry, CTO & VP Products, adds: “Bad actors are doing their R&D. 2024 will see new and undiscovered attack vectors leveraging the intelligence and power of AI. This will be the new adversary of blue teams in the enterprise.”

The age-old game of cat and mouse has been greatly enhanced, and we will have our collective hands full in 2024. It’s a brave new world!

