AWS AI & Machine Learning Podcast

Episode 10: AWS news and demos

February 17, 2020 Julien Simon Season 1 Episode 10

In this episode, I cover new features on Amazon Personalize (recommendation & personalization), Amazon Polly (text to speech), and Apache MXNet (Deep Learning). I also point out new notebooks for Amazon SageMaker Debugger, a couple of recent videos that I recorded, and an upcoming SageMaker webinar.

⭐️⭐️⭐️ Don't forget to subscribe to be notified of future episodes ⭐️⭐️⭐️

Additional resources mentioned in the podcast:
* Amazon Polly Brand Voice:
* Amazon SageMaker Debugger notebooks:
* Numpy for Apache MXNet:
* Automating Amazon SageMaker workflows with AWS Step Functions:
* Deploying Machine Learning Models with mlflow and Amazon SageMaker:
* SageMaker webinar on February 27th:

This podcast is also available in video:

For more content, follow me on:
* Medium
* Twitter

speaker 0:   0:00
Hi, this is Julien from AWS. Welcome to Episode 10 of my podcast. Don't forget to subscribe to my channel to be notified of future videos. In this episode, I will cover some of the latest announcements from AWS, and I will also share some interesting resources. Let's get going.

Let's start the news with regional expansion. In the last few weeks we deployed AI services to additional regions: you can now use Amazon Comprehend in South Korea, Tokyo, and Mumbai; you can also use Amazon Forecast in South Korea; and you can use Amazon Personalize in South Korea. So, a pretty big week for South Korea users, right? And remember, these services will probably come to your regions as well. Keep asking: the more people ask for them, the better the chances they actually get deployed. So get in touch with your AWS contacts, or just keep yelling in the AWS forums, or just yell at me on Twitter if you like.

Now let's take a look at some features that have been added recently. The first one is a new feature on Amazon Personalize, our managed service for building recommendation and personalization models. You can now use up to 50 item attributes to build your recommendation models; previously, you could only use five, so it's a 10x improvement. This means that if you have wide data sets with all kinds of features, signals, and events that show user engagement with the items in your item collection, you can add more of them to the data set and build more relevant models. Okay, so that's pretty cool.

We also added a really cool feature on Amazon Polly, our text-to-speech service. It's now possible for customers to get in touch with the Polly team and build a custom voice: a voice that represents their brand, their company, their service, whatever it is they want to showcase. So let's listen to a couple of examples. The first one is the KFC voice; everybody loves fried chicken, right? Okay, here it is. Hi.
I'm Colonel Sanders, the founder of Kentucky Fried Chicken. Let me tell you a joke. What do you call a chicken crossing the road? Poultry in motion. Got it? Poultry in motion. Food for laughter. All right, well, I'm not so sure about the joke, but you know, I kind of like the chicken. So there you go, that's one example. Let's listen to the other one. This one is for NAB, an Australian customer. Welcome to National Australia Bank. This is the new voice of NAB, created by Amazon Polly, a service that turns text into lifelike speech. This voice has been uniquely created for NAB, providing a consistent experience for our customers whenever they call us. There you go. Well, you can tell it's an Australian accent. No offense, guys, I love you. So now you can get in touch with us, and if you want to build a custom voice, it's one of our capabilities. That's really cool. Now, is anyone going to start the petition for a Jeff Barr voice? Well, I guess I am. So if you agree with me, just keep tweeting: we need a Jeff Barr voice. Sorry, Jeff.

Let's talk about SageMaker. Here, I'm not announcing any new features; I'm pointing you at a collection of new SageMaker notebooks that have been put together by the SageMaker Debugger team. As you probably know, SageMaker Debugger gives you the capability to inspect the internal state of your machine learning models, saving tensor information during training and then starting a debugging job, or even inspecting the tensors yourself. Well, this is exactly what those notebooks do. So if you go to the GitHub repository for SageMaker examples and you zoom in on the SageMaker Debugger folder, you'll find those new examples. And I have to point out the tensor analysis and the real-time analysis examples here: they are absolutely amazing, really, really impressive. There's a BERT example; BERT is a state-of-the-art natural language processing model.
And you can visualize in real time, while it's training, how the model is actually training. There's also a CNN model, a convolutional neural network example, that shows you activation maps: what parts of an image are helping the model figure out which class the image belongs to. This one is trained on the traffic sign data set, and you can visualize the specific parts of the image that, again, are helping the model figure out what class the image is. This is super impressive work, and I really want to congratulate my colleagues who built this. These are really amazing resources. So if you've never looked at SageMaker Debugger, please take a look at these examples: they really push the service to its limits, and you will learn a lot. That's really, really nice.

The next item I want to talk about is actually not new, so maybe I'm the only one who missed it. If that's the case, sorry about that, but I think it just flew under everybody's radar. I'm talking about the NumPy API in Apache MXNet. Apache MXNet is an open source deep learning framework, and as you probably know, AWS does a lot of work on MXNet. A few months ago, the project actually released a NumPy-compatible API, and what this means is that you can use MXNet as a drop-in replacement for NumPy. Whatever NumPy code you have (and you know it's a super popular library), you can just import mxnet.numpy and use MXNet instead of NumPy. So why would you use MXNet instead of NumPy? Well, MXNet is just much faster. MXNet was built for speed: it's natively written in C++, and it makes good use of all your CPU cores, or your GPU cores if you have a GPU on your machine.
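The swap described above can be sketched in just a few lines. This is a minimal sketch, not code from the episode: it assumes MXNet is installed (`pip install mxnet`) and falls back to plain NumPy when it isn't, since the two APIs match for the calls used here.

```python
# Minimal sketch of the drop-in pattern: use MXNet's
# NumPy-compatible module when it is available, otherwise
# fall back to plain NumPy. The code below runs unchanged
# either way, because the APIs match for these calls.
try:
    from mxnet import np   # drop-in replacement for NumPy
except ImportError:
    import numpy as np     # plain NumPy fallback

a = np.arange(12).reshape(3, 4)
b = np.ones((4, 3))
c = np.dot(a, b)           # matrix multiplication, same call in both
print(c.shape)             # (3, 3)
```

On a GPU machine, per the launch blog post, you would additionally set the GPU context when creating the arrays to run the same computation on the GPU.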
In the launch blog post, there's actually a tiny test where large matrices are multiplied using NumPy, using MXNet NumPy on CPU, and using MXNet NumPy on GPU. You can see that MXNet NumPy on CPU is roughly three times faster than NumPy, and of course, if you throw that matrix multiplication at an NVIDIA GPU, it's blazingly fast. And all it takes, again, is really just to import mxnet.numpy and set the GPU context in your NumPy calls if you want to use the GPU. These are really easy modifications, and if you're using NumPy at scale, this is definitely worth a shot. So again, not sure why I missed it; I was probably busy writing those re:Invent posts, but I didn't see a lot of attention brought to this, and I think that's a mistake. You should definitely look at it.

Now let's look at a couple of resources that I built over the last few weeks. It looks like I have a bit of an automation obsession these days, so I recorded two videos showing you how to deploy machine learning models. The first one is based on AWS Step Functions, the state machines that you can define and where you can plug in all kinds of compute tasks, using Lambda, using containers if you want, and you can also use SageMaker. This video will show you how to build a state machine to deploy a SageMaker model. The other one kind of does the same thing: I trained a machine learning model locally on my Mac using XGBoost, then I used an open source project called mlflow to deploy that model locally and test it, and then I deployed it to a SageMaker endpoint. This is a really nice alternative; I think it's a pretty convenient library if you're working with all kinds of frameworks and if you want to deploy locally or in the cloud. So, pretty good. And in a few days, I'm also running a SageMaker tech talk. It's already recorded, so it's gonna be pretty cool.
We'll be covering all the latest announcements from re:Invent and doing a very long demo. So, you know, not too many slides, but lots of code and lots of new features. You can register for free, of course; this is going to be broadcast on the 27th, so there's still time. And as usual, I will put all those URLs in the video description. That's it for this episode. Don't forget to subscribe to my channel to be notified of future videos. There are plenty more coming. Until next time, keep rocking!