30 Days DevOps Challenge - NBA Game Schedule Notification
#Week1-Day2 #DevOpsAllStarsChallenge
Automating NBA Game Schedule Notifications with Azure Services
In this blog post, we'll create a Python function app that requests NBA game schedules from SportsDataIO (sportsdata.io), sends the data to an Azure Service Bus topic, and uses Azure Logic Apps to send an email notification every time a new message is received. We'll also use Visual Studio Code to deploy the function app.
Special Thanks to Ifeanyi Otuonye for providing a very detailed guide and explanation.
Prerequisites
Before we start, ensure you have the following:
An Azure account
Python installed on your local machine (recommended to use Python 3.10 or 3.11)
Visual Studio Code installed
Azure Functions extension for Visual Studio Code
Azure CLI installed
sportsdata.io API key
A Resend account for sending email (Resend.com)
Let’s get started
GitHub resources
Let’s download the repository containing the Python script for our function app and a shell script that deploys resources such as the resource group, Azure Service Bus namespace, topic, and subscription.
git clone https://github.com/annoyedalien/week1-day2.git
cd week1-day2
First, modify the values of the variables before running the script:
vim initial.sh
# Variables
RESOURCE_GROUP="Resource_Group_Name"
LOCATION="Region"
SERVICE_BUS_NAMESPACE="Service_bus_Namespace"
TOPIC_NAME="Service_Bus_Topic_Name"
SUBSCRIPTION_NAME="Service_Bus_Topic_Subscription_Name"
Save and exit.
Run the shell script
./initial.sh
Navigate to the Azure portal to check the provisioned resources.
Create Function App
Now let’s create a function app using Visual Studio Code. Ensure the necessary extensions are installed in VS Code:
Azure Functions
Azure Account
Azure Resources
For this demo, I created a new directory for the function app and opened it in VS Code.
Create a Function Project
Select the folder that will contain the function project
Select a language; since our script is Python, choose Python
Select a programming model (choose V2)
Select a Python interpreter to create the virtual environment (as of writing, Python 3.12 is not supported); for this demo I used Python 3.10
Select a template for the function (Choose Timer trigger)
For the purposes of this demo, we will use a CRON expression that triggers the function every 30 seconds: */30 * * * * * (Azure Functions timer triggers use a six-field NCRONTAB format with a seconds field; the sketch after these steps shows it in place). You can change the expression to fire every second, minute, hour, day, and so on. For a detailed guide on CRON, check https://medium.com/@tushar0618/cron-expression-tutorial-721d85e4c2a7
Once done, VS Code will generate the resources for the function app.
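For context, this is roughly what a V2 programming-model timer trigger looks like in function_app.py with our 30-second schedule; the function name and body below are placeholders of mine, not the repo's actual code:
import azure.functions as func

app = func.FunctionApp()

# Six-field NCRONTAB: */30 in the seconds field fires every 30 seconds
@app.timer_trigger(schedule="*/30 * * * * *", arg_name="mytimer",
                   run_on_startup=False, use_monitor=False)
def nba_schedule_timer(mytimer: func.TimerRequest) -> None:
    # Placeholder body: fetch the NBA schedule and publish it to the Service Bus topic here
    pass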
Copy the function_app.py, function.json, and requirements.txt files from the GitHub repo into the function app's directory, or simply replace the contents of the generated function_app.py and requirements.txt with the versions from the repo.
Create a .env file to store the variables needed by the Python script:
nano .env
SERVICE_BUS_CONNECTION_STR=[Connection String from Service Bus]
SERVICE_BUS_TOPIC_NAME=[Service Bus Topic Name]
NBA_API_KEY=[sportsdata.io API Key]
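The script reads these values at runtime. Here is a minimal sketch of that, assuming python-dotenv is used to load the file for local runs (the repo's script may load them differently):
import os
from dotenv import load_dotenv

# Load .env for local runs; on Azure these become application settings on the function app
load_dotenv()

SERVICE_BUS_CONNECTION_STR = os.environ["SERVICE_BUS_CONNECTION_STR"]
SERVICE_BUS_TOPIC_NAME = os.environ["SERVICE_BUS_TOPIC_NAME"]
NBA_API_KEY = os.environ["NBA_API_KEY"]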
To get the Service Bus connection string:
Navigate to your Service Bus namespace in the Azure portal; under Shared access policies, select RootManageSharedAccessKey and copy the Primary Connection String.
For the NBA_API_KEY, use the API key from your sportsdata.io account (the free-tier subscription is enough for this demo).
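With those three values in place, the core of what the function does on each tick is: call the sportsdata.io NBA API and publish the result to the Service Bus topic. The sketch below illustrates that flow; the GamesByDate endpoint and the one-message-per-game shape are my assumptions, so treat the repo's function_app.py as the source of truth:
import json

import requests
from azure.servicebus import ServiceBusClient, ServiceBusMessage

def fetch_games(api_key: str, game_date: str) -> list:
    # Assumed endpoint: sportsdata.io NBA games-by-date (check their docs for the date format)
    url = f"https://api.sportsdata.io/v3/nba/scores/json/GamesByDate/{game_date}"
    response = requests.get(url, headers={"Ocp-Apim-Subscription-Key": api_key}, timeout=30)
    response.raise_for_status()
    return response.json()

def publish_games(conn_str: str, topic_name: str, games: list) -> None:
    # One Service Bus message per game so each email sent by the Logic App stays readable
    with ServiceBusClient.from_connection_string(conn_str) as client:
        with client.get_topic_sender(topic_name=topic_name) as sender:
            for game in games:
                sender.send_messages(ServiceBusMessage(json.dumps(game)))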
Once the .env file is set, install the dependencies:
pip install -r requirements.txt
After installing the requirements, test the function app locally:
func start
It should trigger the script every 30 seconds. To verify, check the Service Bus to confirm the modified game data is arriving on the topic/subscription.
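If you prefer checking from code rather than the portal, a quick non-destructive peek with the same SDK also works. This is just a convenience sketch, and the subscription name here stands in for the SUBSCRIPTION_NAME you set in initial.sh:
import os
from azure.servicebus import ServiceBusClient

conn_str = os.environ["SERVICE_BUS_CONNECTION_STR"]
topic_name = os.environ["SERVICE_BUS_TOPIC_NAME"]
subscription_name = "Service_Bus_Topic_Subscription_Name"  # your SUBSCRIPTION_NAME from initial.sh

# peek_messages does not remove anything, so the Logic App will still receive these messages
with ServiceBusClient.from_connection_string(conn_str) as client:
    with client.get_subscription_receiver(topic_name=topic_name,
                                          subscription_name=subscription_name) as receiver:
        for message in receiver.peek_messages(max_message_count=5):
            print(str(message))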
If the function app works as expected, it’s time to deploy it to Azure.
Deploy function App to Azure
Click on the Lightning Icon and Select Deploy to Azure…
Create a function app
Enter a globally unique name for the function app
Select a runtime stack (choose Python)
Select location for the new resources
This creates the function app along with a storage account, which stores the app's files and is used for logs and to support features of the underlying Functions infrastructure.
After deployment, the function app will run automatically. Verify that messages are being received on the Service Bus.
After verifying that the function app works as expected, let’s move on to Logic Apps.
Create Logic Apps
In the Azure portal, search for Logic Apps
Click on Add
For demo purposes, choose the Consumption plan
Enter the Logic App name, resource group, and region
Review and create, then hit Create.
On the Logic App page, select the Logic app designer
Click on Add a trigger
In the Add a trigger pane, search for Service Bus
Choose ‘When a message is received in a topic subscription (auto-complete)’
On Create Connection
Enter a Connection Name
Choose Access Key as Authentication Type and Enter the Connection String
Enter Topic Name and Subscription
Move back to the designer view, click the + icon, and add an action
For the send-email action we chose Resend, because that's where we have an account; you can use other email connectors such as SendGrid, Outlook, Gmail, etc.
Enter a Connection Name and API key for Resend
On the Parameters tab, select From, To, Subject, and Text
Enter the From, To, and Subject details, and in the Text field, click the lightning icon beside it
Select Content. This will forward the content of the message received on the Service Bus subscription to your email address.
In the designer view, hit Save
Now it’s time to verify that the Logic App runs successfully
You can also check on Resend.com to confirm the email was sent successfully.
Conclusion
By following these steps, you've created a Python function app that fetches NBA game schedules from SportsDataIO, sends the data to an Azure Service Bus topic and subscription, and uses Azure Logic Apps to send email notifications whenever a new message is received. This setup leverages Azure's serverless and messaging services to automate and streamline your workflow.
Clean up the resources
Delete the resource groups created for this demo to remove all the provisioned resources.