Integrate 2018

June 4-6, London


Jeff Hollan introduces the basics of Azure Functions concepts like triggers and bindings, and demonstrates how to create an Azure Function using Visual Studio. Then he discusses tips for using Azure Functions and its limits.

Azure Functions' role in integration workloads

Integrate 2018, June 4-6, etc.venues, London


Video Transcript

Welcome, everyone, and thank you once again for all your support and passion towards Microsoft technologies. I’m Gowri Ramkumar, Customer Relationship Manager for BizTalk360 and the other products we have. Please allow me to introduce our next speaker, Jeff Hollan. Jeff, this is your fourth Integrate event, and this is the event he really looks forward to every year. He is very passionate about cloud technologies, the Internet of Things, serverless computing and so on. He loves his food and watching movies.

Jeff: It’s a fact.

Gowri: And, of course, his family. He started his career with Microsoft as an intern six years ago, and is now a Senior Program Manager on Azure Functions. So this session is all about Azure Functions: recent updates, patterns, and best practices to follow. Ladies and gentlemen, please put your hands together in welcoming Jeff Hollan.

Jeff: Thank you there. All right. Thank you all very much. And it is true, honestly, this is my favorite conference I go to every year. So, I get the chance to go to a few conferences, Microsoft Build, Inspire, Ignite, some open source conferences. I’m heading to NDC Oslo next week. But truly, this is my favorite place to be. And a big reason for that is that as awesome as Microsoft Build is, and as exciting as it is to have, you know, hundreds of thousands of developers participating, I love this conference because it feels much more like a community. During the breaks people are talking, a bunch of people are talking to us as a team. It’s a very valuable experience to come here and just to be a part of this community.

So, even though over the last few months I’ve switched from focusing solely on Azure Logic Apps to focusing now on Azure Functions, as soon as I saw the email going around asking, “Who wants to come here?” I raised my hand right away. In fact, last year, if you were here at the London event, I got to help out in about two or three, maybe even four different sessions. So I asked Saravana if I could just have all of day one, but he unfortunately said no, and they just gave me these 45 minutes. So, we’ll see what we can do with the time allotted to me. All right?

So, this is all about Azure Functions and how it may or should play a role in integration workloads and scenarios. So, the first question, for those who may not be familiar, is, “What are Azure Functions?” And there are a bunch of different definitions that you could hear for something like Azure Functions. But I’m trying to keep it extremely simple, and this is how I think about Azure Functions: execute your code in response to any event. That’s it. That’s all we’re providing for you with Azure Functions. So, I have an example of a simple function here. I could publish this to Azure, and whenever any event occurs, in this case it’s listening for an HTTP request, it’s going to go execute and run your code for you.
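As a rough sketch of what the speaker is describing, an HTTP-triggered function in the Azure Functions C# model looks something like this (the function name here is illustrative, not taken from the demo):

```csharp
// Minimal HTTP-triggered function: the runtime invokes Run whenever a
// request arrives; hosting and scale are managed by the platform.
[FunctionName("HelloHttp")]
public static IActionResult Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req,
    ILogger log)
{
    log.LogInformation("HTTP trigger fired.");
    string name = req.Query["name"];
    return new OkObjectResult($"Hello, {name}");
}
```

The attribute on the first parameter is the trigger; everything outside this method (servers, scaling, routing) is handled by the service.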

And the value of Azure Functions is that all you have to write is the code that you want to execute. And all the other pieces that traditionally you have to spend your time managing and dealing with and operating are now managed by Azure. So, if you get one request or 1 million requests, all of the wiring to make that happen, all of the scale to make that occur are managed as part of the service. Now, Azure Functions, you’ll hear them often talked about in regards to serverless architectures.

In fact, we’ve talked about serverless and serverless compute here today already, because Azure Logic Apps is another key piece of serverless. So, I wanna define this very briefly for those who may not be familiar with the term; it has been gaining a lot of traction in the technology community over the last few years. Serverless and containers are two of the major paradigms that are starting to change the way that we’re able to build, ship and manage applications. And at its core, serverless is just a way for you to publish and run applications where the server is not forefront in your mind. Okay? So, it’s serverless because you think about servers less, okay? Not because there are no servers. We haven’t figured that piece out yet. Okay?

So, with Azure Functions, you’re not having to deal yourself with what are the servers behind the scenes that are running. In fact, I’ll show you in just a few minutes how you might write a single Azure Function, and within a few seconds, there may be dozens of different servers that are automatically spun up behind the scenes for you to run your workload.

Now, when we need to spin up your resources, we need some event. There needs to be something to let us know, “Hey, run this little piece of code.” So, it does need to be event-driven. Now, you’re very flexible in the types of events that you can use with Azure Functions. This could be when a message is dropped in a Service Bus queue, or when somebody hits this HTTP endpoint. It could be triggers that you’re funneling in through something like Logic Apps: something like SAP has had an event, or Dynamics 365 has had an event. These are all events that let us know, “Hey, go and execute this code, go and execute this workload.” And one of the really nice features of serverless is that you’re only charged when your code is actually running.

So, if you do have an API or a service that may only be running on the weekends, or may have very bursty traffic, you’re only paying when that workload is actually being executed. I have some serverless applications that are actually triggering off of my doorbell at my house. In fact, I wrote a few blog posts about it during the holidays where if you come to my house, I have an IoT doorbell and actually have a serverless app running in Azure behind the scenes that is looking at the person at my door and sending me alerts based on who is on my doorstep, because I just want to be the creepiest guy in my neighborhood that I guess I can be, which is fine. It always sounds so cool until I start explaining it and I’m like, “Man, that really is a little bit of a violation of privacy, I suppose.”

But that said, what’s great is I’m not very popular. Not very many people come to my house, so I’m not paying for this application 99% of the time, but when the UPS person finally does come to deliver my Amazon package, or the pizza person comes to deliver my pizza because I do love food, that’s where my application will now spin up. That’s where my code will execute. Azure actually has an incredible serverless platform because we do see serverless as much more than just functions as a service. If you hear other cloud providers like Amazon talk, they’ll focus mostly on Lambda, which is like Azure Functions. But we know that real applications need much more than just your code. And that’s why in Azure, the serverless application platform includes both Azure Functions as well as Azure Logic Apps for serverless workflows. And we talked about Event Grid already. I believe Dan’s talking later today about how Event Grid can help route events to different pieces of your application.

But really, a serverless architecture can leverage many other services as well. It could be tapping into IoT devices. We showed a cool IoT device this morning running machine learning on the device itself. You’re going to need some place to store and keep your data. That might be something like a SQL database or something more modern like Azure Cosmos DB, as well as rich intelligent services like Azure Cognitive Services, so now you can pull in this intelligence directly into your application. So it’s much more than just an individual component. It’s this entire cloud-first and cloud-native architecture that you get to take advantage of, to enable you to build applications in a fraction of the time. Leveraging tools that traditionally would have taken months to build, now you can build in just minutes. And what we really pride ourselves on is the development experience. I think it’s clear that where Microsoft has spent much of its time is on a rich development experience, whether it’s Visual Studio or more recently with things like Visual Studio Code and Visual Studio Team Services. I keep reading in the paper about how now maybe GitHub will join the Microsoft family, who knows? I’m not speaking or confirming that. I have no idea. I’m not important enough.

But that said, there is this rich development experience to enable you to work with the tools and the features that you care about to build this very seamlessly. So we’ll go into that in a moment as well. Now, there are some concepts to understand with Azure Functions that are just going to make your life easier, whether you’re building your first Azure Function or maybe your 50th. The first is that, in Azure, you actually have a concept of a function app. Now, this function app could contain one or many individual functions. Okay? So think of a function as that little piece of code that’s going to run in response to an event. You may have a few different functions that all go in a single function app.

Now, the reason an app is important is because this is your unit of development and your unit of deployment. I should have actually added in the slide that it’s also your unit of scale. So when you’re deciding how many functions you wanna fit inside of a single function app, just remember that a change to one of those functions in an app means all of those functions get deployed together. So, if you wanna be able to make a change to this function without having to touch that function’s deployment, just separate them into two function apps. It’s not going to cost you any more money. And the second piece is maybe function A receives 10,000 requests every second. You know it’s going to have to scale out very large. And function B may be very small in its utilization. Maybe it doesn’t have to scale out that much. There may be some benefit in that unit of scale, separating those out into two separate function apps. Because those apps are actually going to reflect instances.

I’m going to get into that a little bit later on when we go into some more advanced concepts. The other piece is this concept of a trigger. This is the event source that we will be listening to to let us know when we need to run your code. Now, every function can only have one trigger. And I mentioned a few of those: a queue message, an HTTP request, pushing in some data from a Logic App. There is another concept, though, which is called bindings. Now, a binding isn’t the data that’s going to start your function, but it’s data that you can either pull in or push out of your function as part of the execution. So, a simple example of this is let’s say you’re creating a very simple function API and you have an HTTP endpoint that you want to be called. But maybe the path is like /products/{productId}.

Okay, so you have a path. I’m going to get an HTTP request and it’s going to have an ID of a product as part of that HTTP path. Well, that ID very well may represent an ID in some database. Maybe in Cosmos DB I have a corresponding ID property, and I wanna pull in an entire document. Well, my trigger is going to be HTTP because it’s an API, so I’m going to get an HTTP request. But I could actually create a Cosmos DB input binding to say use the ID parameter of that path and go and pull in data from Cosmos DB before you start the execution. So, you’ll see here I’ve got a table of some of the triggers and bindings. These can be triggers to start an event. This could be data that you pull in, or the other side is maybe my serverless API that I’m creating.
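A sketch of that products example might look like the following. This assumes the Azure Functions C# model with the Cosmos DB binding extension; the database, collection and `Product` type names are hypothetical, not from the talk:

```csharp
// HTTP trigger whose route parameter feeds a Cosmos DB input binding,
// so the matching document is fetched before the code runs.
[FunctionName("GetProduct")]
public static IActionResult Run(
    [HttpTrigger(AuthorizationLevel.Function, "get",
        Route = "products/{productId}")] HttpRequest req,
    [CosmosDB("storedb", "products",
        ConnectionStringSetting = "CosmosDBConnection",
        Id = "{productId}")] Product product)
{
    // "product" is already populated (or null) when this code starts.
    return product == null
        ? (IActionResult)new NotFoundResult()
        : new OkObjectResult(product);
}
```

The point is that no Cosmos DB SDK code appears in the function body; the binding attribute does the lookup.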

Maybe I wanna create some data in something like Cosmos DB as a result. Now, I could obviously just pull in the Cosmos DB SDK and write all the code manually myself to write to Cosmos DB, or I can leverage these bindings and automatically push data out to Cosmos DB as part of the execution. If that doesn’t make a lot of sense right now, just hang tight. I’m going to show you that very quickly in a demo. One of the nice parts about Azure Functions is that we do enable you to develop where and how you want to. In fact, I’ve listed a few of the most popular editors here, or, I guess, this is all-inclusive. So these are the different ways that you can develop. I’ve even included my recommendations of the type of workload and the tool that I feel best fits that workload.

So when you’re just trying out Azure Functions, if you haven’t created one before, you can develop and run them entirely inside of the Azure portal. Go sign up for a free Azure account if you don’t have one. Create a new function. You can write a little bit of code. It’s great. I would say that that’s best just for trying it out or for very simple scripts. You’re not going to get things like debugging or breakpoints, or checking into source control, and those are things that really matter for production-grade functions. So, if you’re doing anything more complex, I recommend you transition to one of these richer editors or IDEs. Visual Studio has a phenomenal C# and F# experience. It feels just like writing a console application. In fact, these changes are pretty recent: since about October of last year, we re-released the Visual Studio extension for Azure Functions. So if you haven’t tried it inside of Visual Studio in the last few months, I’d encourage you to try it again. It really does feel now like I’m writing a console app, but I can publish it to run serverlessly in the cloud as I need.

Visual Studio Code is a more recent one. This is what I use when I’m building functions on my Mac. Lots of Macs on stage today. I don’t know what the deal with that is, but that’s okay. So, Visual Studio Code has an extension. This is perfect if you’re writing non-C# functions. It does support C#; I think Visual Studio has a little bit nicer C# experience today, but there’s no reason you couldn’t also do it in Code if you really like that lightweight editor.

And then we just have rich tooling across the Functions runtime, and Java tooling like Maven, so that if you wanna use something like IntelliJ, Eclipse, or even Notepad, you’re totally welcome to do that. And you can run, debug and develop your functions all locally on your machine. So let’s check this out. I just want to show a very quick demo of developing and debugging an Azure Function on my machine.

So, this is Visual Studio 2017, and what I wanna do quickly is just create an Azure Function. So I’ll go ahead here and create a new project, and in the cloud workload, here is the Azure Functions project. Now, at the top level, I have a few different triggers that I can choose from. There are actually more after I create the app. And I can actually choose between the two different runtime versions that we have. So, we have our generally available version 1, which is written on .NET Framework 4.7. And we have a preview runtime which is on .NET Core 2.0, which we'll take to general availability in the coming months. For this one, let’s go ahead and make it that v2 runtime because I wanna be doing the latest and greatest. And I’m just going to create a simple HTTP-triggered Azure Function. Now, there are many more triggers here that weren’t listed in that first template page. And if I right-clicked my project and started to add more, you’d see a much more comprehensive list. In fact, the list I just showed you is about to get larger as well in the next few weeks.

But this here is my function. So here’s all the code I have to care about. There’s no boilerplate code here; there’s no, you know, config file or startup file. It’s just my function code. And in this case it’s saying I have an HTTP trigger, and it’s going to do a simple request where it’s going to look at the request body and pull out a name property that I can pass in. Nothing too complex. Okay? One of the nice things here with Azure Functions is that I can go ahead and set a breakpoint and run this locally if I want to. So I’ve gone ahead and just clicked the run button in Visual Studio, just like you’ve done with any other projects you may have created inside of Visual Studio. It’s a very similar pattern in Visual Studio Code as well. And what’s actually running on my machine right now is the Azure Functions runtime. This is the exact same runtime that we’ll be running in the cloud. You could also run this runtime on an IoT device, you could run this runtime locally. Wherever you want to run this Azure Functions runtime you can, and it’s given me this nice API that I can go ahead and call here. So, I’ll open up Postman and we can just call it very quickly and even hit a breakpoint.

I’m not going to wait for Postman to load up. Let’s keep going. Now, let’s say more than just getting in a simple HTTP request, I want to log the data into Cosmos DB. So, to do that, I’m going to bring in the Cosmos DB binding. Now, I’m going to go through this a little bit quickly for the sake of time, but one of the pieces to do this is, I’m just going to install this NuGet package for the Cosmos DB binding as part of Functions. So let’s add Cosmos DB support to my function project. There are a number of different extensions that I could pull in if I need to use them. And now I can just write some simple annotation here to let my function know what I want to do with the code. So, here I’m going to write this C# attribute amazingly fast. Wasn’t that impressive? So, here I’m saying, “Hey, I’m going to push that data to Cosmos DB. I have a database called function, I have a collection called function, and I’m going to give you the connection string in this Cosmos DB connection property.” So just remember the Cosmos DB connection setting. We’ll get to that in a second.

And so, what I’m able to say here is, whatever I set to docs, this variable called docs, that’s what I want you to push to Cosmos DB. So let’s say here, right here, I’m getting the request body as a string, and I’m just going to say, “Hey, let’s make docs the request body.” So with that attribute and line of code, now what this function will do is, when I get a request, it’s going to take the request body and push that data to Cosmos DB for me. So I don’t have to worry about creating the connection to Cosmos DB or managing that connection. I just wrote this attribute and said, I’m going to talk to Cosmos DB. Okay? Data in, data out, nothing super complex. This isn’t an integration service, but it enables me to simply pull in and push data out.
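A sketch of that output-binding pattern, assuming the same C# model as above (the database and collection names mirror the ones mentioned in the talk; the rest is illustrative):

```csharp
// Whatever gets assigned to "docs" is written to Cosmos DB after the
// function returns; no SDK or connection-management code is needed.
[FunctionName("LogRequest")]
public static IActionResult Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
    [CosmosDB("function", "function",
        ConnectionStringSetting = "CosmosDBConnection")] out object docs,
    ILogger log)
{
    string body = new StreamReader(req.Body).ReadToEnd();
    docs = new { payload = body };   // pushed to Cosmos DB by the binding
    return new OkObjectResult($"Hello, {body}");
}
```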

Now, the last piece I wanna show quickly: I mentioned there’s this connection string setting that says “Cosmos DB connection.” I actually have a file here called local settings, and when I publish this, I’ll have application settings in the cloud that I can configure to set environment variables or settings for my application. So this is where, when I’m running this locally, I could say, “Hey, I want to have this property called Cosmos DB connection, and I want the value to be this wonderful Cosmos DB connection string that I memorized and just typed for you live.” Okay? So, now when I go ahead and run this same function app, and hopefully Postman is now awake and ready for us, I’m going to put in this endpoint that it’s going to expose.
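The local.settings.json file being described looks roughly like this; the connection string is a placeholder, not a real value:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "CosmosDBConnection": "AccountEndpoint=https://<your-account>.documents.azure.com:443/;AccountKey=<your-key>;"
  }
}
```

Locally these values behave like environment variables; in Azure the same names come from the function app's application settings, which is why the binding attribute can reference "CosmosDBConnection" in both places.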

I’m going to pass an HTTP request to this localhost endpoint, and I’m going to pass in my name, which is Jeff. So let’s wait for this to start up. It’s started up. We’ll go ahead and send this out now. It executed my function. In fact, I could come back here and see all the logs that were generated, and it returned back, “Hello, Jeff.”

Now, what’s cool here, though, is if I come back over to the Azure portal and refresh this Cosmos DB database, I actually have a brand new document here that was just created, and it has the request body stored into Cosmos DB, and it added the required properties for Cosmos DB for me. So if I went ahead here and, you know, said, “Integrate 2018,” I now have a simple REST API that can return back to me a name, but it integrates also with Cosmos DB. So now I have all of these request bodies that are being stored in Cosmos DB for me. Now, this is all running locally. Everything looks like it’s working great. So at this point I could go ahead and publish this project to run in the cloud, which would give me a web-accessible endpoint, so now anybody here could call, you know, https:// whatever I set it to be.

Now, one other thing I wanna show you, which is very cool, is how this simple function that I created, so, here’s one I have created that’s already published in the cloud, can scale to whatever requirements you need to have. So, I don’t know how many of you have had the wonderful pleasure of provisioning and deploying servers, but it’s not an experience where you wake up saying, “I really wanna configure and deploy as many servers as I can today.” So, what I’m showing right now, this is that same function app that I just wrote a second ago, but I’m looking at Azure Application Insights, which actually gives me real-time insights into how my application is performing, and the health of my app. Now, what I wanna show you really quick is here, I can actually see the number of servers that we have had to spin up behind the scenes to process the load for your function. So right now nobody is calling my API. So I just have a single, lonely server sitting around here, beautifully named with this random GUID because we have so many servers that this is what we have to name them, but there’s not much happening here.

But let’s say that this serverless API, maybe it’s processing orders and there’s some promotion that’s just happened. It’s now, you know, some sales day, it’s the day before Christmas, whatever it might be, and now a bunch of requests are coming in. So I’m going to simulate that, and what I want you to pay attention to is the number of servers that are instantly going to be provisioned for me to process this load. So I have here a simple load-testing tool, and what I’m going to do is over 30 seconds, I’m going to send 1000 requests every single second. So I’m about to send 30,000 requests at that function app that I just wrote live in front of you now. And I’m going to do that, and I want you to check out how quickly this single function is able to scale out to process that load for me.

So I’ve gone ahead and pushed “Go”. It’s just warming up right now, but you’ll notice, I’m actually going to zoom out, that within a number of seconds, I’m already at four full-powered Azure servers that have spun up to help process that, and the list keeps growing. Okay? So, I have had to do nothing to say that I want this application to process this heavy load. I just published this to the Azure cloud and pushed a heavy load at it, but instantly now, what are we at? About 5, 6, 7, maybe 10 servers that have been provisioned, spun up and are now running my code, processing these 1000 events every single second. If I kept this open for another few seconds, this would get to about 16 servers that are automatically being deployed for me.

Now, this is really the value proposition that Functions brings, because I just have to focus on my code, but behind the scenes I get this massively scalable distributed system executing my code for me. And each one of these servers is actually processing a number of requests in parallel. This is a pretty big list right now. It looks like I have 16 servers online and it’s only been about 20 seconds. So that’s pretty cool stuff. Very cool that I don’t have to worry about this and I can just focus on my code.

All right, so now that we’ve seen functions and gotten a basic experience of how they work, let’s talk about some of the places that functions may fit, especially in your integration workloads. One that we’ve already talked about today: running some custom code in an integration workflow with something like Logic Apps or BizTalk, right? If some time during your pipeline you need to execute something that maybe isn’t an out-of-the-box connector, maybe needs to be cloud-connected, this is a perfect place to just write an Azure Function and have it run for you in the cloud. Okay? We see tons and tons of Logic Apps users leveraging Functions to extend them with whatever custom code they need to write.

Functions are great on top of event streams of any kind. This is probably, if I were to look at our top 50 Azure Functions customers right now, what a good chunk of them are doing, because these are high-volume scenarios. So, you could think about IoT devices that are emitting telemetry like the temperature or the health of a machine or a device, or an aircraft. Okay? Anything that’s kind of using Event Hubs or Kafka, these big data scenarios, is really nice for functions, because as a developer, I just have to write the code for one single event. I say, you know, “For a single event, I want you to validate this, maybe check some data from that, and push the data here.” I just write the function for a single event, but once I publish it to Azure, it can now scale out and run that same code for thousands of events every single second. But I don’t have to worry about the scale-out piece. I just worry about, “Here’s the code I want to write for a single event.” So it’s a perfect fit for event streams.
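The "write code for one event" pattern on a stream can be sketched like this; the hub name, connection setting name, and `Reading` type are hypothetical, not from the talk:

```csharp
// The runtime calls this once per event and fans the work out across
// instances automatically; only the single-event logic is written here.
[FunctionName("ProcessTelemetry")]
public static void Run(
    [EventHubTrigger("telemetry", Connection = "EventHubConnection")] string message,
    ILogger log)
{
    var reading = JsonConvert.DeserializeObject<Reading>(message);
    // Example single-event rule: flag devices that are running hot.
    if (reading.Temperature > 90)
        log.LogWarning($"Device {reading.DeviceId} running hot: {reading.Temperature}");
}
```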

I showed you basically in a demo how it could be a web API or a mobile backend, participating in an enterprise service bus, where we’re subscribing to queues or topics to go take an action, or push data, or validate data. Or just automating scripts or jobs, you know, setting something to run every hour, or setting something to run whenever someone on the IT team pushes a button to index the database or provision something or run an Azure command. Whatever you might want to be doing, these are really nice places for Azure Functions to fit. Just a simple use case to kind of show this: Carl Zeiss, who creates some very high-tech devices, optics devices and more, has a published use case that you can check out, and they have IoT devices out in the field that are securely connected and pushing to REST systems on the back end. And they’re using Azure Functions to power this, because as they add more devices and add more data, they want their application to be automatically scalable to those new devices.

So the quote from their lead architect is that the scalability of Azure Functions is awesome, or amazing. They can send out millions of records and the platform adjusts automatically to the load. So, just a customer use case of how they were able to use Functions in an IoT scenario to pull data. Now, I have 15 minutes, so I’m going to be very selective in the tips that I share. So let’s go ahead and just do this first one here. This is a very useful tip to know, and it may help that we showed that demo before. Now, I showed you with that scale-test example how Azure Functions automatically adds instances; in the demo I showed you, they were called servers. We automatically add instances for you. Now, an instance is a function app, okay? So when I ran Visual Studio locally and you saw that nice ASCII art, my machine was now a function instance. When we showed that App Insights screen and there were 16 servers, that means that I had 16 instances that were spun up and ready to run.

Now, for the serverless plan, an instance means you have about one core (well, you do have one core) and about a gig and a half of memory. So every instance is a single core and a gig and a half of memory, but we will scale out so that collectively, you can have much more compute than that. Now, one important thing to note with Azure Functions is that an instance may not just be processing one execution at a time, right? You’ve got a bunch of horsepower on these servers. It’s very possible that if it’s doing something pretty basic, it could actually be processing multiple executions in parallel. Now, that’s important to note because it may dictate the behavior and how you write your function.

In fact, the other thing I wanna show you very quickly: if I actually come back here to App Insights, you’ll notice my scale test is finished, but right now I still have 16 servers that are online. We haven’t gotten rid of them yet. They’re still sitting there. My code is still initialized. It’s going to sit there for another few minutes until it finally decides that the load has died down or that my app’s not going to be used, and then we’ll start to de-provision them. Now, that’s important to note because I can actually take advantage of these behaviors and make sure that I’m sharing resources appropriately in my function code. That means when you have something like a connection to a SQL Server, you do not want to create a connection to that SQL Server in every single execution. Because if I now send 30,000 requests at my function, I’m going to get 30,000 SQL clients trying to create a connection, and my SQL Server is going to scream for mercy and probably blow up. Okay?

So, I actually wanna make sure that what I’m doing is proper instantiation. This is what you should not do. If you look at my own personal GitHub account, I have code where I did exactly this probably six months ago. So it’s a very common anti-pattern, even outside of just functions: using var client = new HttpClient(). I’m creating a brand new HttpClient on every single execution. It’s very common. It’s an anti-pattern. If you wanna learn more about why it’s an anti-pattern, I even created a beautiful link for you at the bottom. But what’s going to happen is, if I had this code in my function and then I sent those 30,000 requests to it, I’m going to run out of sockets on those instances very quickly, because every time I get an execution, I’m going to create a new HttpClient, it’s just going to hang around for a bit of time, and pretty soon my function’s going to start throwing errors that say, “Hey, I don’t have any more sockets. I can’t create any more HTTP clients.”

Now, the fix is so simple. And that’s why, even though this is a getting-started-on-functions scenario, I wanted to focus on this one, because it’s super common and the fix couldn’t be easier. Just move the instantiation of these clients, whether it’s a SQL client, an HTTP client, an Event Hub client, a Redis client, anything that’s creating a connection or reserving sockets, and create a static singleton instance. Now, this means that I can reuse that connection and those sockets across multiple executions in parallel. So while those instances stick around, while I have those 16 servers that are still hanging out here, these could still have an HttpClient ready to run. Okay? So, it’s important to note that yes, this is serverless, but it doesn’t mean that every execution happens on a brand new server.
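Side by side, the anti-pattern and the fix look roughly like this (a sketch in the same C# function model; the function name and URL are illustrative):

```csharp
// Anti-pattern: a new HttpClient per execution leaks sockets under load.
//   var client = new HttpClient();   // inside Run: don't do this

// Fix: one static client, shared by all executions on the instance.
private static readonly HttpClient client = new HttpClient();

[FunctionName("CallDownstreamApi")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req)
{
    // Reuses pooled connections instead of opening new sockets each time.
    string body = await client.GetStringAsync("https://example.com/");
    return new OkObjectResult(body);
}
```

Because the instance stays warm between executions, the static field is initialized once and then shared, which is exactly the resource reuse being described.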

We’re going to reuse resources. We’re going to let you create things like HTTP clients or SQL clients. Okay? So that’s the first best practice. I really wanna fit in one more. Let’s see if I can pull this off. Okay. The other one: designing for high scale. I’ve written this beautiful scenario and beautiful art. Okay? So imagine that I have City A and City B, and there’s a body of water separating them. Maybe in City A, I have 1000 people who want to get to City B. But I only have one boat, and the boat is maybe five meters by five meters, so it only has a fixed amount of space. Now, I can put as many people on the boat as I want to go to City B, drop them off and come back. But what I need to decide is how many people can fit on this boat at one time.

I might have a thousand people in City A who want to go to City B, but if all 1,000 got on the same boat, the boat’s going to tip over. It’s going to capsize. So maybe there are smaller people, maybe there are children, and I’m like, “Hey, we can fit, you know, 10 of these people.” Or maybe there are people who are my size and we’re like, “Let’s just stick with one person on the boat at a time, get them across and come back.” And that’s fine. It just means the boat’s going to need to take a bunch more trips to get all 1,000 people across. And even if I have multiple boats, that doesn’t change the fact that I need to decide how many people can be on one of these boats at one time. Right? So maybe I have 3 boats or 30 boats. I still have to decide how many people get on the boat at one time.

Now, why the crap is Jeff talking about boats in the middle of a session about Azure Functions? I love boats. That’s why. No, I don’t. Because with Azure Functions, we’ve talked about this concept of instances, and the instances process multiple executions, but it may be necessary for you to decide how many executions one instance can handle at a time. Maybe your Azure function is doing something as simple as addition, one plus one. Well, you could probably send a bunch of stuff to a single instance. Again, one core, a gig and a half of memory. Maybe you could send 50 of those things to a single instance and it can do one plus one, 50 times in parallel. It’s totally fine.

But maybe you’re doing some advanced mapping or complex work that is taking a few minutes to solve. You might actually wanna force that and say, “You know what? A single instance should maybe only do two or three or four things in parallel at a time.” Now, we do our best effort to scale you out and distribute the load for you, but there is a way for you to override our defaults and tell us exactly how many people you want on a boat at one time. And that is called the host.json file. This is something I knew nothing about until I joined the Functions team, but it’s come in very handy as I’ve worked with a number of different customers. The host.json file allows you to set the limits for what you want an instance to have. So you can say, “Hey, Functions, I know you’re going to do a best effort and send 20 of these things to each instance, but I know for this workload, if you send more than about five things, it might tip over. So, I’m going to limit you to five things per instance.” And we will honor that, and we will see that the load needs to be scaled out more, and so we’ll add more instances to compensate for it.

So one thing worth noting is that there’s this thing called the host.json file, and it allows you to decide exactly how many executions happen on a single function instance. We’re still going to be improving and trying to make it so that people don’t ever have to open this file, and that we’re just so smart that we can figure out everything on our own, but until then, this is a nice override to tell the system the behavior that you need. Hopefully that makes sense.
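As a sketch, a host.json for the v1 runtime that caps per-instance concurrency might look like this; the values are illustrative, and the http and queues settings apply to HTTP-triggered and queue-triggered functions respectively:

```json
{
  "http": {
    "maxConcurrentRequests": 5
  },
  "queues": {
    "batchSize": 5,
    "newBatchThreshold": 0
  }
}
```

With settings like these, an instance processes at most five HTTP requests at a time and pulls at most five queue messages per batch, so the scale controller adds instances rather than piling more work onto one.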

The last thing I wanna show for a tip is just that there are multiple ways to run… Oh, actually, I still have Durable Functions. Holy crap. Okay, here we go. We’re going to jump through this pretty quick. There are multiple ways to run Azure Functions. Okay? So, the consumption plan, this is called the serverless plan. This is the one I’ve been pretty much talking about today, where you pay only for what you use. We’ll provision instances and servers dynamically for you. But if you want, you can also host your functions in an App Service plan. This is a plan where you’re actually able to reserve a specific number of instances that are running for you 24/7, right? Maybe you want 20 instances running all of the time. Always hot, always ready to run. You absolutely can just use an App Service plan. App Service plans also give you some extra benefits with on-premise connectivity options and all the greatness that App Service has. If you want your functions running within your own virtual network, you can run them in an App Service Environment. This is similar to the Logic Apps Integration Service Environment, where now this is your own isolated environment, potentially connected to your own VNet. Those same functions can also run on Azure Stack, the Azure Functions runtime, and Azure IoT Edge.

The same function code, that same function that I wrote a second ago, I could deploy to any of these environments and have it be a symmetrical development and running experience. So that’s how we got the function running on the Pi. We wrote a simple function and we just said, “Hey, run this on a Raspberry Pi.” Everything just worked.

All right. Let’s go to Durable Functions very quickly, because that was a specific request. We’ll see what I can pull off here. Okay, so there is a feature of Azure Functions that just recently went generally available, and there’s been a lot of positive interest in it. It’s a very cool feature. This is an extension to Azure Functions, and what it allows you to do is write long-running or stateful serverless processes, all using code. So, these are things that traditionally would be very difficult, running a single process for, you know, 5, 10, 20 minutes, but being able to maintain the state throughout that entire process. Durable Functions gives you that ability, and you can write this all in C# or JavaScript code.

So this isn’t a visual designer like Logic Apps, this is code. But there are some differences here, and I’m going to get to those in just a second. Okay? So, it’s a service that automatically manages state, checkpoints, and the durability of this process for you. So some of the patterns that it has are, you know, basic function chaining, maybe you wanna call function one, then function two, then function three. Maybe you want to fan out and fan in, like parallelize a bunch of work and then aggregate it together. This pattern is super useful. This will actually let you run an HTTP API, but have it be asynchronous. Now, what I mean by that is that usually with functions, like my “hellojeff” function, I call the code, the code runs, but my connection is kept open until I finally get the response. And if that’s a short-running function, that’s fine, but maybe my function code’s running for four or five minutes.

How in the world do I expect all of my clients, or Logic Apps, for that matter, to keep a connection alive for five minutes waiting for the response? With Durable Functions, there’s actually a first-class pattern where I can say, “Hey, start off this process, but do it in the background.” So my function will actually immediately return a 202 Accepted response and say, “Hey, I’m working on your task right now. Check back with me in a few minutes and I’ll let you know if it’s done.” That’s perfect for something like Logic Apps, because you can call your function, we immediately return to the Logic App and say, “Hey, we’re working on it. Check back in a few seconds.” And Logic Apps will actually keep checking that endpoint until the work is finally completed.
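That 202 pattern comes built in with the Durable Functions HTTP starter. A minimal sketch, assuming the Durable Functions 1.x extension for C#; “LongRunningWork” is a placeholder orchestrator name:

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class HttpStart
{
    [FunctionName("HttpStart")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req,
        [OrchestrationClient] DurableOrchestrationClient starter)
    {
        // Kick off the orchestration in the background.
        string instanceId = await starter.StartNewAsync("LongRunningWork", null);

        // Return 202 Accepted immediately, with status-check URLs that
        // the caller (for example a Logic App) can poll until it's done.
        return starter.CreateCheckStatusResponse(req, instanceId);
    }
}
```

The response body includes a statusQueryGetUri that Logic Apps can follow automatically with its built-in polling behavior.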

I know there’s a lot there. Read the documentation if you wanna learn more. I just don’t have enough time to go into all of it. You can monitor things, you can do human interaction. The graphic is very small, but this is what a Durable Function looks like. Excuse me. So, this is in C#. So, this is that function chaining pattern that I’m showing here on the screen. In C#, I say, “Hey, call function F1, then call function F2, then call function F3, and then F4.” All in C#, in a try/catch block, and the Durable extension will manage the state and queue things appropriately for me. Now, here’s one important thing to note with Durable Functions, which is how it works. Because it does run very differently than how Logic Apps runs. In fact, I think tomorrow, Kevin’s going to explain the workings of how Logic Apps executes behind the scenes.
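The chaining pattern on the slide looks roughly like this in C# with the Durable Functions 1.x extension; F1 through F4 stand in for ordinary activity functions defined elsewhere:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class Chaining
{
    [FunctionName("Chaining")]
    public static async Task<object> Run(
        [OrchestrationTrigger] DurableOrchestrationContext context)
    {
        try
        {
            // Each await checkpoints the orchestration: the extension
            // queues the next activity and persists its result.
            var x = await context.CallActivityAsync<object>("F1", null);
            var y = await context.CallActivityAsync<object>("F2", x);
            var z = await context.CallActivityAsync<object>("F3", y);
            return await context.CallActivityAsync<object>("F4", z);
        }
        catch
        {
            // Error handling or compensation logic would go here.
            return null;
        }
    }
}
```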

Here’s how Durable Functions works. This is very important to know if you’re thinking about using Durable for any of your workflows. So, here I have an extremely simple Durable function up top. I’m creating a list, I’m calling a single function which says, “Hello, Amsterdam.” I should’ve changed that to London. I need to update my slides. And then it’s returning the response. Now, what happens behind the scenes is that I have my Durable function, which is called the orchestrator function, and my activity function, which is saying “Hello, Amsterdam.” And then I have my history, or my state, which is all managed for me.

Now, what’s going to happen is when this runs, the orchestrator function is going to start. It’s going to evaluate that first line, it’s going to create a new list, and it’s going to come here and it’s going to see this await word, and it’s going to say, “Hey, I’m asked to await this other operation.” Now, what it does is it first looks at my state, at the execution history, and it says, “Hey, have we already done this step before? Have we already called say hello with Amsterdam?” And in this case I haven’t. So my history is going to say, “Nope. I’ve never done this before.” So my orchestrator knows, “Okay, that’s fine. I’ll go ahead and do this.” So it calls that activity, says, “Hey, say hello, Amsterdam.” That activity’s going to go ahead and run, and it’s going to update my execution history here on the right to say, “Hey, my task is now completed.” Okay? “Hello, Amsterdam. I did my task. It’s now in the execution history.”

Now, here’s the important part. At this stage, my orchestrator function is going to wake up, but it is not going to start from where it left off. It is going to start from the very top of my code. It’s going to create a new list, it’s going to come to this step, and it’s going to say, “Hey, have I already done this action?” And in this case I have. So, it’s going to actually just pull in the value from that execution history, and then continue on to the next step. So after every function that I call, I replay my code from the beginning, and I pull out the values from my execution history. So this is replaying itself every single time and pulling out the values from my execution history. Now, why does that matter for you? Because when you are writing a Durable Function, your code has to be deterministic.

If you have a condition like, hey, if the current time is greater than, you know, 2:00 p.m., every time it replays, the current time is going to be different. So it’s possible that your condition might break your code, because your code is not deterministic. So there is a very big caveat there: if you need things like the current date time or a GUID, you need to make sure that you’re leveraging the workarounds we give you in your code, so that your code stays deterministic. Okay? So it’s very important. This is not something where you can just go and modify a running version, like you saw in the first few demos. Derek would create a Logic App, and then he’d go into that same Logic App and make a few changes and click save, and everything was fine. That doesn’t work with Durable.
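Those workarounds live on the orchestration context itself. A sketch of the time example: instead of reading DateTime.UtcNow, which changes on every replay, read the context’s replay-safe clock (“AfternoonWork” is a placeholder activity name):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class DeterministicExample
{
    [FunctionName("DeterministicExample")]
    public static async Task Run(
        [OrchestrationTrigger] DurableOrchestrationContext context)
    {
        // Replay-safe: this returns the same value on every replay,
        // because it is recorded in the execution history.
        DateTime now = context.CurrentUtcDateTime;

        if (now.Hour >= 14)
        {
            await context.CallActivityAsync("AfternoonWork", null);
        }
    }
}
```

The same idea covers GUIDs: generate them inside an activity function (or via a context helper where available) rather than calling Guid.NewGuid() in the orchestrator.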

If you go and change your code that’s already running in the cloud, your replays are now going to start replaying on your brand new code. You might blow something up. So you need to be very aware of that when you use Durable.

So the last thing I’m going to show, and I apologize, I’m going through Durable very quickly, but I want to give Paul plenty of time, because I feel like everyone always cuts into his time and we love Paul. So, Durable or Logic Apps? Here’s my two cents. Okay? So, Durable Functions is a stateful process orchestrator, not an integration service. So, if you were looking for an integration service, you probably aren’t going to find all the features that you need from Durable Functions alone. Okay? Logic Apps is much more focused as an integration-as-a-service offering. There is some healthy overlap, though. There definitely are scenarios, even some of the examples I’ve shown, that could be done in Durable Functions or in Logic Apps. And that’s okay. In fact, a lot of it comes down to personal preference.

There are some developers who, for whatever reason, refuse to use a visual designer. I personally think it’s an extremely productive environment once you kind of overcome that initial learning curve. But if that’s something that you really want to do in a Durable Function, that’s okay. There is some personal preference there too. It is important to note, Durable does run everywhere Functions do. So, you could run a Durable Function on a Raspberry Pi. You could run a Durable Function in my local dev environment. You could run a Durable Function on a local server. So that might be one consideration between the two. And the last one I always like to mention is that Durable actually does pair extremely well with Logic Apps, especially with that asynchronous HTTP trigger.

If you want to call an Azure Function that might run for a very long time, but you wanna call it from a Logic App, Logic Apps is not going to keep the connection open for longer than two minutes. So there’s no way you’re going to get a Logic App to wait for an answer for more than two minutes. But if you pair it with Durable, it works perfectly. Durable will immediately return to the Logic App and say, “Hey, I’m working on this.” The Logic App will poll the endpoint for you automatically, and you can run something very long-running. Okay? So Durable is incredibly exciting. I’ve personally been able to build a lot of awesome technology on it. I encourage you all to test it out as well, but it is important to note some of the differences. In fact, there’s an even deeper document, that I have a link to here, that will break down some of the differences.

There’s even been some awesome blog posts from the community that break some of this down as well. So with that, if you’re interested in learning more and getting started, I would recommend that you test out Azure Functions. I’ll be here through most of the week. I’m kind of split between this and another event starting tomorrow. But please feel free to talk to me if you have any questions or feedback on Functions. Check out the samples. I’ve got a few blog posts as well that talk about some more advanced patterns and practices, including calling long-running functions from a Logic App, on my own personal blog. So with that, thank you all very much. You’ve been incredibly attentive and I appreciate that very much. And with that, we will go ahead and turn it over to Paul for our next session. So thank you all.
