Serving Machine Learning Model using AWS SageMaker and Boto3

alpha2phi
4 min read · Jun 27, 2021

Learn how to serve a machine learning model using AWS SageMaker and the Python Boto3 library.


Overview

In my previous articles, we used different technology stacks to serve machine learning models.

In this article, let’s explore AWS SageMaker. We will provision a notebook instance using AWS CDK, use a sample notebook for training and testing, deploy a model endpoint, and then invoke that endpoint using Boto3.

Provision A SageMaker Notebook Instance

Let’s start by provisioning a SageMaker notebook instance using AWS CDK. If you are not familiar with Infrastructure as Code (IaC), do check out this article.

  • I am going to use the below code snippet to provision the VPC, S3 bucket, and SageMaker notebook instance.
  • Run cdk deploy to deploy the stack.
Deploy SageMaker Notebook Instance using CDK
  • Once the operation is completed successfully, you should see the notebook instance in AWS Console.
Notebook Instance
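The provisioning step above can be sketched as a CDK stack in Python. This is a minimal illustration, not the article’s original snippet: the construct IDs, instance type (`ml.t3.medium`), and the `AmazonSageMakerFullAccess` managed policy are assumptions you should tighten for production use.

```python
# Hypothetical CDK (v2) stack: VPC + S3 bucket + SageMaker notebook instance.
from aws_cdk import RemovalPolicy, Stack
from aws_cdk import aws_ec2 as ec2
from aws_cdk import aws_iam as iam
from aws_cdk import aws_s3 as s3
from aws_cdk import aws_sagemaker as sagemaker
from constructs import Construct


class SageMakerNotebookStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # VPC to host the notebook instance.
        vpc = ec2.Vpc(self, "NotebookVpc", max_azs=2)

        # S3 bucket for training data and model artifacts (DESTROY is for demos only).
        bucket = s3.Bucket(self, "ModelBucket", removal_policy=RemovalPolicy.DESTROY)

        # Execution role the notebook assumes; grant it access to the bucket.
        role = iam.Role(
            self,
            "NotebookRole",
            assumed_by=iam.ServicePrincipal("sagemaker.amazonaws.com"),
            managed_policies=[
                iam.ManagedPolicy.from_aws_managed_policy_name("AmazonSageMakerFullAccess")
            ],
        )
        bucket.grant_read_write(role)

        # The notebook instance itself (an L1 CloudFormation construct).
        sagemaker.CfnNotebookInstance(
            self,
            "NotebookInstance",
            instance_type="ml.t3.medium",
            role_arn=role.role_arn,
            subnet_id=vpc.private_subnets[0].subnet_id,
            security_group_ids=[vpc.vpc_default_security_group],
        )
```

With the stack defined in a CDK app, `cdk deploy` synthesizes it to CloudFormation and creates the resources.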


alpha2phi

Software engineer, Data Science and ML practitioner.