I’d been wanting to do this one since I first saw it last year. Forrest Brazeal, from A Cloud Guru, had the brilliant idea of the Cloud Resume Challenge, and I loved it. But it was on AWS, and I didn’t have any experience with AWS, so I wanted to try to adapt the challenge to Azure. I started it and made progress up to a certain point. Then, for a number of different reasons, I set it aside.
The entire process of building the resume is lengthy, and although I tried to keep this article short, I think it still ran long.
This is the result: My resume up in Azure.
And this is the GitHub repository where all the code lives.
Introducing the Challenge
The challenge is to build an online resume and host it in Azure. There are a number of steps to it; it’s not simply a matter of uploading your PDF and storing it somewhere in Azure. The challenge is challenging! 🙂
This is what we need to accomplish:
- All the code has to be stored in a GitHub repository.
- A GitHub Action must be set up, so that when you commit changes to the code, they are automatically deployed to production.
- There should be a test procedure to test your API code, and this should also be automated via GitHub Actions.
And this is the visual representation of all this:
I actually started building my website way before the challenge, influenced by the original AWS Cloud Resume Challenge. I built my website from a template I downloaded from Start Bootstrap. It’s a very simple page, but that’s what I was more or less looking for.
Back then, I still didn’t have my project hosted on GitHub. I was using VS Code to update and deploy the changes to the storage account. We can actually do it from within VS Code with the Azure Storage extension. VS Code is so cool, and it works so well with Azure. A match made in heaven!
I’m getting ahead of myself. I forgot to mention the creation of the storage account in Azure. A storage account holds all of your Azure Storage data; Azure Blob Storage is the part of it used to store objects, and you can host a static website straight from blob storage. It’s very simple to do: you just go to the Static website feature on the Storage Account panel and enable it. Here’s the tutorial on how to do it.
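If you prefer the command line, the same setup can be sketched with the Azure CLI. The account name, resource group, and local folder below are placeholders, not my actual values:

```
# Enable the static website feature on an existing storage account
az storage blob service-properties update \
    --account-name mystorageaccount \
    --static-website \
    --index-document index.html \
    --404-document 404.html

# Upload the site's files to the special $web container
az storage blob upload-batch \
    --account-name mystorageaccount \
    --destination '$web' \
    --source ./site
```

After that, the portal shows the primary static website endpoint you can browse to.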
The address, however, is huge and cumbersome. So now we’ll move on to another step of the challenge: configuring a custom domain that points to our website, and enabling HTTPS. For that, we’ll need to enable a Content Delivery Network (CDN). CDNs store our content on edge servers that are closer to the users, which makes loading times much faster.
Custom Domain and HTTPS
I bought my domain from Namecheap. I own three domains, all bought from Namecheap, and I’m very happy with their service and their prices.
We’d like our visitors to access our website by using HTTPS. HTTPS is the encrypted version of HTTP, therefore it’s more secure. Always use HTTPS.
First of all, we need to enable Azure CDN for our website. From the Storage Account resource, we can look for Azure CDN and create a new endpoint. For the pricing tier, I chose S3 Standard Microsoft. I believe that’s the only one that would work for me; I didn’t give it much thought, to be honest. The origin hostname field should be filled with your static website endpoint.
To add the custom domain to the CDN endpoint, we first have to go to our DNS provider (Namecheap in my case) and create a CNAME record. Mine looks like this:
Back to the Azure Portal, now we need to associate the custom domain to the CDN endpoint. We navigate to our CDN Profile resource, click on the endpoint created previously and add the custom domain.
For the last part of this step, we’ll enable HTTPS. We’ll need to choose a certificate, and if we don’t have one already, Microsoft can provide us with one. It takes some time to validate and provision the certificate. The page where you request the certificate looks like this:
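For reference, the same custom-domain association and HTTPS enablement can also be scripted with the Azure CLI. This is a sketch with placeholder resource names, not the exact commands I ran:

```
# Associate the custom domain (the CNAME record must already exist)
az cdn custom-domain create \
    --endpoint-name mywebsite \
    --profile-name my-cdn-profile \
    --resource-group my-rg \
    --hostname www.example.com \
    --name www-domain

# Enable HTTPS with a CDN-managed certificate
az cdn custom-domain enable-https \
    --endpoint-name mywebsite \
    --profile-name my-cdn-profile \
    --resource-group my-rg \
    --name www-domain
```

Either way, certificate validation and provisioning can take several hours.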
So that’s what I had done some time ago. I ended up just leaving it like that, because I felt intimidated by the other steps of the challenge. I was trying to figure out which service in Azure would be the equivalent of the service the AWS Challenge was asking for, and I even started building a Cosmos DB database, but then I left the challenge aside. Until now!
The GitHub Repository and the frontend GitHub Action
Back to the challenge with full force, the first thing I did was set up a GitHub repository. I’d been studying and experimenting with GitHub Actions, so this step was super fun.
The Microsoft Documentation offers a great tutorial on how to set up a GitHub Actions workflow to deploy a static website in Azure Storage, and I followed it.
This workflow has two main tasks: it uploads the new files to the Azure Storage Account, and it purges the CDN endpoint so the cached content is replaced with the new code.
This step was fun, though I got confused by the name parameters of the az cdn endpoint purge command. There were so many names that I wasn’t sure what went where, but I ended up figuring it out: --profile-name is the actual name of the CDN Profile resource, and --name is the name of the endpoint, as it appears in the hostname mywebsite.azureedge.net.
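The deploy steps of my workflow boil down to something like this. It’s a sketch following the Microsoft tutorial, with placeholder account, profile, endpoint, and resource-group names:

```yaml
# Sketch of the two deploy steps (all resource names are placeholders)
- name: Upload to blob storage
  uses: azure/CLI@v1
  with:
    inlineScript: |
      az storage blob upload-batch \
        --account-name mystorageaccount \
        --destination '$web' \
        --source .

- name: Purge CDN endpoint
  uses: azure/CLI@v1
  with:
    inlineScript: |
      az cdn endpoint purge \
        --content-paths "/*" \
        --profile-name my-cdn-profile \
        --name mywebsite \
        --resource-group my-rg
```

The workflow also needs an azure/login step before these, with credentials stored as a GitHub secret.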
The Cosmos DB Database
Cosmos DB is a super cool database offering. As its name implies, it’s globally distributed (cosmically, even), so it can be reached quickly no matter where your users are.
In Cosmos DB, databases hold containers, and the containers hold our data. For this project, I created a container that holds the number of visitors to the site. The Azure Function has to retrieve this data and also write new data back, updating the counter each time a visitor drops by.
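The core of what the function does with that container is tiny: read the counter document, add one, write it back. Here’s a pure-Python sketch of just the increment logic; the document shape (an id plus a count field) is my assumption, and the Cosmos DB SDK calls around it are left out:

```python
def bump_visitor_count(doc: dict) -> dict:
    """Return a copy of the counter document with count + 1.

    Assumes a document shaped like {"id": "1", "count": <int>};
    a missing count is treated as 0.
    """
    updated = dict(doc)  # don't mutate the original document
    updated["count"] = updated.get("count", 0) + 1
    return updated

# The function would read this document from the container,
# bump it, and write the result back.
print(bump_visitor_count({"id": "1", "count": 41}))  # {'id': '1', 'count': 42}
```

Keeping this logic separate from the binding plumbing also makes it easy to unit test later.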
The Azure Function
This was the hardest part, as coding is somewhat new to me. I’ve been studying Python, so I decided to write my function in Python.
I created my Azure Function using the VS Code extension, and it’s pretty straightforward. Well, at least the first part is: the main structure of the function and its related files. Functions are triggered by events, and in this case the event is an HTTP request. Every time a client makes a request to our resume, the function is invoked.
We also need to set the function’s binding. This is the connection between the function and another resource. For our challenge, we need to bind the function to our Cosmos DB database. VS Code helps a lot with setting the bindings too, with the add binding feature, from the Azure Functions extension.
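To give an idea of what the bindings look like: the extension generates a function.json file, and with Cosmos DB input and output bindings added it ends up looking roughly like this. The database, container, and connection-setting names below are placeholders for my own:

```json
{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "type": "cosmosDB",
      "direction": "in",
      "name": "counter",
      "databaseName": "resume-db",
      "collectionName": "visitors",
      "id": "1",
      "partitionKey": "1",
      "connectionStringSetting": "CosmosDbConnectionString"
    },
    {
      "type": "cosmosDB",
      "direction": "out",
      "name": "outdoc",
      "databaseName": "resume-db",
      "collectionName": "visitors",
      "connectionStringSetting": "CosmosDbConnectionString"
    }
  ]
}
```

The input binding hands the function the current counter document, and whatever it writes to the output binding is saved back to the container.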
I don’t mean to make this article absurdly long, so I’d like to get into the details of how I built the Azure Function in a later article.
I couldn’t have done it without the help of my challenge buddy Florian. His insight was invaluable! Thank you Florian!
Now this is where I hit that immovable mountain! I know I’ll eventually be able to move it, but right now I still can’t.
I’ve come so far, and I’ve learned so much. Even though this test workflow isn’t working as it should, I feel I can consider the challenge accomplished. At the very least, I’ve accomplished a lot. I admit my limitations, and there’s no shame in that.
For the test, the workflow first runs flake8, a Python tool that checks your code against the PEP 8 style guide and flags programming errors, such as a library that is imported but never used. The second step is to test the code with pytest, a test framework: it finds the tests we’ve written, runs them, and reports the results.
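As a concrete example of what pytest picks up: any file named test_*.py with functions named test_* is collected and run automatically. A minimal test file might look like this, where bump_visitor_count is a hypothetical helper holding the counter-increment logic:

```python
# test_counter.py -- pytest collects files named test_*.py and runs
# every function named test_*.  bump_visitor_count is a hypothetical
# helper that increments the visitor-count document.

def bump_visitor_count(doc: dict) -> dict:
    updated = dict(doc)
    updated["count"] = updated.get("count", 0) + 1
    return updated

def test_count_is_incremented():
    assert bump_visitor_count({"id": "1", "count": 3})["count"] == 4

def test_missing_count_starts_at_zero():
    assert bump_visitor_count({"id": "1"})["count"] == 1
```

Running pytest in that directory finds and runs both tests; in the workflow, flake8 runs over the same files first.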
I wanted to do something fancy and include a GitHub Action that publishes the test results from a JUnit XML file. Somehow it didn’t work: all the tests pass, and I believe I’m setting the correct path to the file that contains the test results, but it still didn’t work.
That was the least of my problems, though. The serious issue is that the action that was supposed to deploy the updated code for my function is actually breaking the function. I’ve tried hundreds of possibilities, but I still can’t figure out what’s wrong with this GitHub Action. As you can see from the image below, the workflow ends with all green checks. The unit test results should display the results, but it shows “No tests found”. As of now, I can’t seem to find a way to fix it.
But let’s move forward. There’s a workaround, and that is to deploy any code changes directly from VS Code. And now everything’s working as it should. I’ll keep trying to solve this problem, but since today is the deadline for the challenge, I’m publishing it as it is.
It’s been a blast!
This challenge has been incredible, and I loved every second of it! I’ve learned so much!
I also got the chance to develop my patience and my endurance. I was about to give up, because of several issues with my function, then with my GitHub workflow. But I pressed on! The result isn’t perfect, far from it, but I enjoyed the journey.
Enjoy the journey!
I can’t thank Gwyneth Peña enough, and the other kind members from the Discord channel. My shout out goes to Rishab Kumar, who’s always willing to help.
Now for the next challenge.