Azure Functions is a cloud service that lets you deploy “serverless” microservices triggered by events (timers, HTTP POST requests, etc.) that automatically scale to meet demand while minimizing latency. The service natively supports functions written in C#, Java, JavaScript, PowerShell, Python, and TypeScript, and now supports other languages as well thanks to last week’s launch of custom handlers for Azure Functions.
A new tutorial walks you through the process of creating a custom handler for a “hello world” R function. The process is fairly straightforward: use a couple of Azure CLI commands to set up a project on your local machine and create the Azure resources, write a “handler” script in R to provide the Web service, and push a Docker container with the Azure Functions runtime, the R engine, and the packages needed to implement your Function. Then, when you want to update your Function, all you need to do is push a new version of the container image with the updated R code. The video below shows a brief demo of the process in action:
The tutorial uses the httpuv package to build a stripped-down Web server behind the Function, but you can make things easier for yourself (at a small cost to performance) by using the plumber package. With plumber, you simply annotate an R function you already have to turn it into a web service suitable for Azure Functions.
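For example, here’s a minimal sketch (not the tutorial’s code; the function and route are made up) of what a plumber-annotated script looks like and how you’d serve it locally:

# plumber.R -- a toy, hypothetical example of a plumber-annotated function
#* Return the square of a number
#* @post /square
function(x) {
  as.numeric(x)^2
}

# In a separate script or the R console, serve it locally with:
#   plumber::plumb("plumber.R")$run(port = 8000)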
In this GitHub repository you’ll find code that implements an Azure Function to predict from a GLM model trained with the caret package, and also a Shiny app to consume the function. Follow the instructions in the README.md file to deploy your Function, and then launch the associated Shiny app. As you adjust the parameters on the left side of the application, the chart on the right updates in real time with an estimate of the probability that a car accident with those parameters would be fatal.
It’s important to note that the model prediction is not being generated by the Shiny app: rather, it’s being generated by an Azure Function running R in the cloud. That means you could integrate the model estimate into any application written in any language: a mobile app, an IoT service, or anything else that can call an HTTP endpoint. Furthermore, you don’t need to worry about how many apps are running or how often estimates will be requested: Azure Functions automatically scales to meet demand.
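Calling the Function is just an HTTP POST, so any client will do. Here’s a hedged sketch in R using the httr package; the endpoint URL and the parameter names are placeholders, and the real field names are whatever predictors the caret model in the repository expects:

library(httr)

# Placeholder URL: substitute your own Function App's endpoint (and key, if required)
endpoint <- "https://<your-function-app>.azurewebsites.net/api/accident"

# Illustrative parameters only -- use the predictor names your model was trained on
params <- list(age = 22, speed = 60, seatbelt = "none")

resp <- POST(endpoint, body = params, encode = "json")
prob_fatal <- content(resp)   # parsed response: estimated probability of a fatal accident
print(prob_fatal)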
Thanks to plumber, the “handler function” providing the Web service is very simple. All I needed to do was add a couple of plumber comments to an existing function that makes a prediction from a set of parameters, turning it into an HTTP POST endpoint compatible with Azure Functions.
#* Predict probability of fatality from params in body
#* @post /api/accident
function(params) {
  model_path <- "."
  model <- readRDS(file.path(model_path, "model.rds"))
  method <- model$method
  message(paste(method, "model loaded"))
  prediction <- predict(model, newdata = params, type = "prob")[, "dead"]
  return(prediction)
}
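To run this under Azure Functions’ custom handler contract, the container’s entrypoint just starts plumber on the port the runtime passes in the FUNCTIONS_CUSTOMHANDLER_PORT environment variable. A sketch (the file name handler.R is illustrative; the repository’s entrypoint may differ in its details):

library(plumber)

# Azure Functions tells a custom handler which port to listen on via this
# environment variable; fall back to 8000 for local testing
port <- as.integer(Sys.getenv("FUNCTIONS_CUSTOMHANDLER_PORT", "8000"))

# handler.R contains the annotated prediction function shown above
plumb("handler.R")$run(host = "0.0.0.0", port = port)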
The only other trick is to create a container containing R and the packages to support plumber and your model, but if you’re familiar with Dockerfiles this should be quite straightforward.
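In essence, the image just needs R plus the handler’s dependencies baked in at build time; the Dockerfile’s install step amounts to something like the following (the package list is illustrative and depends on your model):

# Executed at image build time (e.g. from a RUN step in the Dockerfile) to
# install the packages the handler needs; adjust the list for your own model
install.packages(c("plumber", "caret"),
                 repos = "https://cloud.r-project.org")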
If you’d like to try it out yourself, you’ll find complete instructions at the repository linked below.
GitHub (revodavid): Azure Functions for R with Custom Handlers
Hello David,
First, let me say thank you for this great idea and tutorial. We're currently developing a scalable R API, but I'm having a strange problem when testing the Docker container locally (running the function and R locally works fine).
It seems that the HTTP request is handled by the Azure Function but isn't passed through to the plumber API when testing locally via Docker.
I started a discussion at github:
https://github.com/revodavid/R-custom-handler/discussions/3
Maybe you find the time for helping out because I'm not able to find the problem for myself.
Thank you in advance, best regards,
Ralf
Posted by: Ralf | December 21, 2020 at 20:45