BTEX 2020: How to Manage and Consolidate your AI Workloads

“AI is everywhere,” says Thomas Henson, Business Development Manager, AI/Analytics at Dell Technologies, speaking at CDW’s BTEX 2020 virtual event. “It’s in the cars that we drive, in healthcare imaging and in any of our smart home devices. But where does this technology fit and what can we do to usher in that wave?”

Henson’s presentation highlighted the next generation of voice technology. “We’re starting to see this moving from home automation to where we’re actually seeing it in the enterprise,” he says, pointing to call centres as one business that is relying more heavily on AI. “One of the ways we’re starting to see that being automated is with this next generation of voice. It’s so good now that it’s hard to discern whether you’re talking to a machine or to a human.”

3 key points in the history of machine communications

1950: Alan Turing develops the Turing Test to answer the question “Can machines think?”

1966: ELIZA, an early chatbot from MIT, demonstrates basic communication between humans and machines.

2010s: The emergence of voice assistants like Siri and Alexa.

“People have been going at this since 1950, but in the last 10 years, we’ve really seen some good input,” says Henson.

The key purpose of AI projects – how to extract value from data

“At Dell Technologies, we have a data-first approach,” says Henson. “We want to capture the data in a way that can be accessed by our traditional assets and can improve our traditional assets.”

“If you have a data-first approach, you can fuel innovation and improve your assets across human capital, intellectual property and infrastructure. Understand what data elements you have, so you can make good decisions.”

The infrastructure costs of artificial intelligence

Business intelligence workloads typically involve data measured in megabytes, gigabytes and terabytes. From a performance perspective, compute requirements are generally in the hundreds of CPUs, but can be much higher for large enterprise data warehouses.

Analytics and machine learning workloads, such as anomaly detection, recommendation engines and outcome prediction, typically deal with semi-structured data measured in terabytes to petabytes, and require hundreds or even thousands of CPUs.
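To make that analytics tier concrete, here is a minimal sketch (not from the presentation) of an anomaly-detection pass over semi-structured records, using pandas and scikit-learn’s IsolationForest; the field names and values are hypothetical.

```python
# Minimal anomaly-detection sketch on semi-structured (JSON-like) records.
# Hypothetical data and field names; IsolationForest is one common choice
# for this kind of analytics workload.
import pandas as pd
from sklearn.ensemble import IsolationForest

records = [
    {"duration_s": 12.4, "bytes_sent": 1_200, "retries": 0},
    {"duration_s": 11.9, "bytes_sent": 1_150, "retries": 1},
    {"duration_s": 240.0, "bytes_sent": 98_000, "retries": 7},  # likely outlier
]

df = pd.json_normalize(records)          # flatten the semi-structured input
model = IsolationForest(contamination=0.1, random_state=0)
df["anomaly"] = model.fit_predict(df)    # -1 = anomaly, 1 = normal
print(df)
```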

Deep learning workloads, such as voice-to-text and image and video analysis, deal with fully unstructured data that is typically measured in petabytes or even exabytes. This is where it takes thousands of CPUs, or even GPUs, to analyze the data and train models using deep learning.
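For the deep-learning tier, a minimal sketch of a single GPU-accelerated training step shows where that compute demand comes from; it assumes PyTorch, and the model and data are synthetic stand-ins rather than anything from the presentation.

```python
# Minimal deep-learning sketch (hypothetical model, synthetic data) showing
# a single GPU-accelerated training step.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Tiny stand-in for an image model; real workloads train far larger networks
# over far larger volumes of unstructured data.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3), nn.ReLU(), nn.Flatten(),
    nn.Linear(16 * 62 * 62, 10),
).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(32, 3, 64, 64, device=device)   # one synthetic batch
labels = torch.randint(0, 10, (32,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
print(f"device={device} loss={loss.item():.3f}")
```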

“What we’re trying to do here is solve a business problem,” says Henson. “We want to map that back to the business, find the right data elements and then use the right tools to build a solution. The challenge is how do we go from business problem to proof of concept (POC) and how do we scale that into production?”

Eliminate AI bottlenecks with Dell PowerScale

“Our PowerScale platform gives you the ability to build out a consolidated data lake, where you can build pipelines that are going to help you train your models faster,” says Henson. “The faster you can get to the answer you’re looking for, the faster you can solve that business problem.”

This can improve your data team’s productivity and maximize return on investment. “Whether you’re starting out from POC or scaling into production, your data can stay in place, and you can just bring compute for your deep-learning models into that environment.”

Dell’s AI starter bundle

Henson recommends a Dell solution he calls the AI starter bundle. This includes the Dell Precision Data Science Workstation, along with PowerScale. “This gives you the compute, the networking and that data layer within PowerScale to be able to start off with 5 TB of data, but as that data starts to move, you’ve got the solution that gives you that backbone and best practices for data consolidation.”

“If you bring in multiple Dell Precision Data Science Workstations, you can have multiple different modellers or data scientists building off that one PowerScale. Sharing the data among them, without having to swap data or move things around, means that more data scientists can train those models. We’re bringing that compute to the data.”
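As a rough illustration of that pattern, the sketch below assumes the shared PowerScale data lake is exposed as a network mount at a hypothetical path; each workstation’s training script then reads the dataset in place rather than keeping its own copy.

```python
# Sketch of "bringing compute to the data": every workstation points at the
# same shared PowerScale export (mounted here at a hypothetical path) instead
# of copying the dataset locally.
from pathlib import Path

from torch.utils.data import DataLoader
from torchvision import datasets, transforms

SHARED_DATA = Path("/mnt/powerscale/projects/vision/train")  # hypothetical mount

dataset = datasets.ImageFolder(
    SHARED_DATA,
    transform=transforms.Compose(
        [transforms.Resize((224, 224)), transforms.ToTensor()]
    ),
)
loader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=8)

# Each data scientist runs their own loader against the same in-place data;
# no per-user copies of the dataset are needed.
for images, labels in loader:
    ...  # this workstation's training step goes here
    break
```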

“As you move from POC into production, you know that you have the backbone and can integrate with other models of PowerScale and Isilon to build out your data lake, scale your modellers and scale your compute as well. You can go from POC to production seamlessly.”

To learn more about Dell solutions, please visit CDW.ca/Dell. And be sure to bookmark this page for more coverage of BTEX 2020.