Inspur’s AI Inferencing and Open Source Project – Conversations in the Cloud – Episode 202

 
 

In this Intel Conversations in the Cloud audio podcast: Vangel Bojaxhi, Global AI & HPC Director at Inspur, joins Conversations in the Cloud to discuss AI inferencing applications and Inspur’s new solution. Vangel describes the collaboration between Intel and Inspur to deliver optimized AI solutions for customers, as well as Inspur’s open source projects for deep learning inference. As an Intel Select Solution, Inspur’s AI Inferencing solution is a fully optimized, tested, and ready-to-deploy configuration that reduces deployment time and cost for end users while ensuring scalability.

Explore Inspur’s AI Inferencing and other solutions here:
inspursystems.com

Discover the open source deep learning inference engine based on FPGA at:
github.com/TF2-Engine/TF2

Learn about Intel Select Solutions and other performance-optimized configurations at:
intel.com/selectsolutions

Posted in: Artificial Intelligence, Audio Podcast, Cloud Computing, Intel, Intel Chip Chat, Intel Conversations in the Cloud
 