Driving AI in a Hybrid Cloud Environment

November 16th, 2022 | 28:35


One of the greatest challenges to using AI/ML in industry is reducing time to solution so that data scientists and engineers have a chance to iterate. Setting up a machine learning pipeline from scratch can be a complexity nightmare for MLOps engineers.

Steve Huels and Raghu Moorthy join Tony to discuss the value of Red Hat OpenShift Data Science, how the platform is built, and how it can significantly reduce the initial deployment challenge. They also discuss some of the software deployed as part of the Red Hat OpenShift Data Science platform and how these pieces drive cost-effective, accelerated AI solutions from cloud to edge.

Intel-Red Hat partner page:

Open Data Hub:

Red Hat OpenShift Data Science Sandbox:

Steve Huels – Senior Director for Red Hat’s Cloud Services

Raghu Moorthy – Global Director, Intel Sales and Marketing Group

Transcript: Read/Download the transcript.

Posted in: Audio Podcast, Cloud Computing, Code Together, Intel