Deep Learning Systems with Milena Marinova

The applications that demand deep learning range from self-driving cars to healthcare, but the way models are developed and trained is broadly similar. A model is trained in the cloud and deployed to a device. The device engages with the real world, gathering more data. That data is sent back to the cloud, where it is used to improve the model.
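To make that feedback loop concrete, here is a toy, runnable sketch in Python. The function names and the trivial one-weight "model" are purely illustrative, not any specific cloud, device, or Intel API.

```python
"""A toy sketch of the lifecycle described above: train in the "cloud",
deploy to a "device", gather data in the field, and feed it back for retraining."""

import random


def train_model(dataset):
    # Stand-in for cloud training: fit a single weight w for y ~ w * x
    # by averaging y / x over the dataset.
    return sum(y / x for x, y in dataset) / len(dataset)


def deploy_to_device(weight):
    # Stand-in for pushing the trained model down to an edge device.
    print(f"deployed model with weight {weight:.3f} to device")


def collect_field_data(n=100, true_weight=2.0):
    # Stand-in for the device interacting with the real world and logging data.
    return [(x, true_weight * x + random.gauss(0, 0.1))
            for x in (random.uniform(1, 10) for _ in range(n))]


if __name__ == "__main__":
    dataset = collect_field_data(n=10)      # bootstrap data
    for _ in range(3):
        weight = train_model(dataset)       # train in the cloud
        deploy_to_device(weight)            # deploy to the device
        dataset += collect_field_data()     # device gathers more data
    print(f"final weight after feedback loop: {train_model(dataset):.3f}")
```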

Deep learning is driving change at every level of the stack, from the processor up to the software frameworks at the top. At the hardware level, new chips are being designed to perform the matrix calculations at the heart of a neural net. At the software level, programmers are empowered by frameworks like Neon and TensorFlow. Between the programmer and the hardware, middleware can transform software models into representations that execute with better performance.
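The matrix math those chips and frameworks accelerate is easy to see in code. Below is a minimal sketch of a single dense neural-net layer using TensorFlow 2's eager API; the layer sizes and random inputs are illustrative, and this is a generic example rather than the Neon or middleware path discussed in the episode.

```python
# A single dense layer: activations = relu(x @ W + b), the core matrix multiply
# that deep learning hardware and frameworks are built to accelerate.
import numpy as np
import tensorflow as tf

x = tf.constant(np.random.rand(4, 8), dtype=tf.float32)  # a batch of 4 inputs
W = tf.Variable(tf.random.normal([8, 16]))                # layer weights
b = tf.Variable(tf.zeros([16]))                           # layer bias

activations = tf.nn.relu(tf.matmul(x, W) + b)             # matrix multiply + nonlinearity
print(activations.shape)                                  # (4, 16)
```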

Milena Marinova is the senior director of AI solutions in the Intel AI Products Group. She joins the show today to talk about modern applications of machine learning and how those applications translate into Intel’s business strategy around hardware, software, and cloud.

From September 18 to 20, Milena will be attending the O’Reilly AI Conference, hosted by Intel Nervana and O’Reilly.

Full disclosure: Intel is a sponsor of Software Engineering Daily.

Question of the Week: What is your favorite continuous delivery or continuous integration tool? Email jeff@softwareengineeringdaily.com and a winner will be chosen at random to receive a Software Engineering Daily hoodie. 

Show Notes

Data Skeptic podcast: Generative Adversarial Networks

Software Daily

Subscribe to Software Daily, a curated newsletter featuring the best and newest from the software engineering community.