Unpacking the key insights on AI strategies from Supercomputing 2024

Supercomputing 2024 blended innovation with thought leadership, exploring practical themes around the challenges and inherent complexities of enterprise AI.

The event showcased cutting-edge technologies and discussions, signaling a shift from experimentation to operational deployment in enterprise AI strategies.

“We’re seeing that the enterprise spending is growing on generative AI, and I think we’re moving beyond the pilot and POC phase that the customers have been in. Now they’re looking for that ROI and they’re looking to justify that spend,” said Chad Dunn (pictured), vice president of product management, artificial intelligence and data management at Dell Technologies Inc. “We see customers trying to get to a consistent infrastructure stack to run generative AI. They need something that not only fulfills their AI needs but also is something that IT can support long term.”

Dunn spoke with theCUBE Research’s Dave Vellante for post-event coverage of SC24, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed how companies can unlock AI’s full potential with new strategies while navigating the complexities of a rapidly evolving landscape. (* Disclosure below.)

Gen AI evolves with new data challenges

AI is no longer a shiny new idea. It has entered the implementation phase, where companies demand real operational value and a return on investment. This shift is driven by the need for scalable, consistent infrastructure capable of supporting long-term IT demands. Another critical concern is security: Companies are reassessing their AI frameworks to ensure responsible and secure operations, according to Dunn.

“We’re now seeing people think much more about security than they had in the past,” he said. “We had this mad dash to the results of generative AI. Now we’ve got to go back and make sure that we’re securing it, that we’re operating it responsibly. And sustainability is also there. No surprise that this sort of technology takes a lot of power and there’s a lot of work between us and our other partners to help our customers stay green as they adopt generative AI.”

Data is the linchpin for successful AI operations, and the challenges are manifold: from discovering relevant data to ensuring data hygiene, maintaining proper formats and upholding governance. Poor-quality data input results in poor outputs and significant resource wastage, Dunn added.

“Customers are challenged with discovering the data that they want to use, the hygiene of the data, the format of the data and the governance of the data,” he said. “If it’s garbage going in, it’s going to be really expensive garbage coming out after what you spend on generative AI. So, data prep, data hygiene and making sure that you have the highest quality data going into the process is important. Data governance is becoming everyone’s job in the company now, not just a group of people who are tasked with looking after the data.”
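For illustration, here is a minimal Python sketch of the kind of pre-ingestion hygiene checks Dunn describes, assuming pandas is available; the column name, checks and sample data are hypothetical examples, not part of any Dell tooling:

```python
# Minimal, generic data-hygiene pass before documents reach an expensive
# gen AI pipeline. Column names and rules are illustrative assumptions.
import pandas as pd


def basic_hygiene(df: pd.DataFrame, text_col: str = "text") -> pd.DataFrame:
    """Drop obviously low-quality rows: missing, empty or duplicate documents."""
    cleaned = df.dropna(subset=[text_col]).copy()          # remove missing values
    cleaned[text_col] = cleaned[text_col].astype(str).str.strip()
    cleaned = cleaned[cleaned[text_col].str.len() > 0]      # remove empty documents
    cleaned = cleaned.drop_duplicates(subset=[text_col])    # remove exact duplicates
    return cleaned


if __name__ == "__main__":
    raw = pd.DataFrame(
        {"text": ["  Quarterly report Q3 ", "", "Quarterly report Q3", None]}
    )
    print(basic_hygiene(raw))  # one clean row survives
```

Checks like these are cheap compared with the cost of letting "expensive garbage" flow into model training or retrieval pipelines.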

Leveraging ecosystem partnerships to simplify AI strategies

The rapid evolution of gen AI tools and ecosystems has introduced significant complexity. To counter it, advances in software are making AI more accessible to enterprises. Frameworks such as Llama Stack, Semantic Kernel and LangChain abstract away the intricacies of AI applications, enabling companies to adopt and operate these technologies more easily, according to Dunn.

“If I go back a year and look at what a stack would look like, there are a lot of components there that a customer needs to know how to operate,” he said. “Now I think we do a good job at the AI Factory of doing a lot of that hard work for them, but software is also starting to catch up.”
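As a rough illustration of that abstraction, here is a minimal sketch assuming a recent LangChain release (the langchain-core and langchain-openai packages) and an OpenAI-compatible endpoint; the model name and prompt are illustrative only:

```python
# Requires: pip install langchain-core langchain-openai, plus an OPENAI_API_KEY
# (or another OpenAI-compatible endpoint). Model and prompt are examples.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template(
    "Answer the question using only the context below.\n\n"
    "Context: {context}\n\nQuestion: {question}"
)
llm = ChatOpenAI(model="gpt-4o-mini")  # any compatible chat model could be swapped in

# The framework composes prompting, model invocation and output parsing
# into a single pipeline, hiding the plumbing from the application team.
chain = prompt | llm | StrOutputParser()

print(chain.invoke({
    "context": "Dell discussed its AI Factory approach at SC24.",
    "question": "What did Dell discuss?",
}))
```

The point is that the framework, rather than each application team, handles prompt templating, model calls and output handling, which is the complexity Dunn says software is starting to absorb.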

To address these challenges, companies are investing in advanced data preparation tools and governance systems. Dell’s Data Lakehouse initiative, for example, preserves the permissions of source systems while integrating structured and unstructured data. The focus is on creating a robust foundation for enterprise AI, enabling seamless and secure data integration across platforms, according to Dunn.
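As a purely hypothetical sketch of what permission-preserving access can look like, the following example filters query results by the groups a user already holds in the source system; the data model and helper names are illustrative and do not describe Dell’s Data Lakehouse implementation:

```python
# Hypothetical permission-aware retrieval: documents carry the access groups
# copied from their source systems, and queries only return what the user
# could already see there. Schema and names are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class Document:
    doc_id: str
    source_system: str
    allowed_groups: set[str]  # permissions carried over from the source system
    content: str


@dataclass
class User:
    name: str
    groups: set[str] = field(default_factory=set)


def query(docs: list[Document], user: User, keyword: str) -> list[Document]:
    """Return only matching documents the user is already entitled to see."""
    return [
        d for d in docs
        if d.allowed_groups & user.groups and keyword.lower() in d.content.lower()
    ]


if __name__ == "__main__":
    corpus = [
        Document("1", "sharepoint", {"finance"}, "FY24 revenue forecast"),
        Document("2", "wiki", {"engineering"}, "Revenue dashboard runbook"),
    ]
    analyst = User("pat", {"finance"})
    print([d.doc_id for d in query(corpus, analyst, "revenue")])  # -> ['1']
```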

“I think you’re going to see the model ecosystem continue to evolve,” he said. “We’re going to continue to see the larger and larger LLMs, but at the same time look for companies to innovate on small, more efficient models that are very purpose-built for the use case at hand.”

Dell’s emphasis on partnerships has become central to its AI strategy for keeping pace with the fast-moving gen AI market. Collaborations with companies such as Meta Platforms Inc. (through Llama Stack) and Hugging Face Inc. are helping Dell offer comprehensive solutions tailored to customer needs.
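For readers who want to experiment with the small, purpose-built models Dunn mentions, here is a minimal sketch using Hugging Face’s transformers library; the checkpoint is a public example chosen for illustration, not a Dell-validated recommendation:

```python
# Requires: pip install transformers torch. The checkpoint below is a small,
# task-specific public model (~67M parameters) used here only as an example.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("The new infrastructure stack cut our inference costs significantly."))
```

Models of this size can run on modest hardware, which is the trade-off Dunn points to when he contrasts ever-larger LLMs with small, efficient, use-case-specific ones.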

Here’s theCUBE’s complete video interview with Chad Dunn, and make sure to catch all of theCUBE’s SC24 coverage:

(* Disclosure: Dell Technologies Inc. sponsored this segment of theCUBE. Neither Dell nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE

