Integrate 2018

June 4-6, London

Summary

Jon Fancey & Divya Swarnkar present an array of useful points around Enterprise Integration using Logic Apps, including message handling, disaster recovery, primary and secondary Integration Accounts and tracking capabilities. Divya then demonstrates the SAP ECC connector.

Enterprise Integration using Logic Apps

Integrate 2018, June 4-6, etc.venues, London

Length: 51:25

Video Transcript

Good afternoon, everybody. My name is Lex Hegt, and at BizTalk360 I’m a technical lead. In that role I help…yeah, I give guidance to the team mainly in India to further develop the product.

The next session is done by Divya Swarnkar and Jon Fancey. Well, Jon Fancey doesn’t need any introduction anymore. We’ve seen him earlier today. And Divya is a program manager at Microsoft, and she’s mainly interested, at least business wise, in enterprise integration with Logic Apps. Yeah. Jon, of course, again he’s also working at Microsoft where he’s a principal manager. And for Integrate 2018 they have prepared the session on the favorite topic of Divya, being enterprise integration with Logic Apps. So please welcome Divya and Jon. Thank you. Enjoy yourselves.

Jon: Are you coming up? All right. Hello again. Awesome. You're going to use your mouse? I don't know. Let's see what happens. All right. So what we're going to talk about here in the next 45 minutes is enterprise integration, which is kind of like, "Well, what's that?" considering kind of everything else that you heard today. It's really all of the stuff that nobody else is going to talk about in the context of Logic Apps. So a whole bunch of things. And when you think about enterprise integration, you think about SAP connectivity and ERP systems and that kind of stuff, but we'll also look at some other things around kind of mapping constructs and how to do web services and consume that stuff, and some other things as well around monitoring and around just kind of tracking of data. So it's, you know, a large number of different moving parts that are all required when you really need to do kind of full-blown enterprise integration at large.

So just as a quick level set, Kevin showed this briefly this morning, so let's go into a little bit more detail. A level set for us around what enterprise integration is: it has this core construct called the integration account. The integration account provides XML and B2B capabilities, which allow you to do XSLT and XML validation, as well as Liquid templates. I'll talk a bit more about that later. XML support now isn't just about XML or XSLT 1. We also have support for 2 and 3, as you'll see later on. And then on the B2B side we have protocol support for EDIFACT, X12, and AS2, so you can do trading partner management and be able to create files, send them to trading partners in those industry-standard formats and also, of course, receive them and process them as well, with a full sort of trading partner configuration experience with that too. And, you know, on top of all of that you also have the ability to take flat files and actually convert those to and from XML, as well as being able to track everything that's going on. That little picture on the right-hand side is the Log Analytics tracking portal that shows you kind of what's going on, not just in a single Logic App but actually across all of the Logic Apps that you have deployed into your environment.

You know, the power of this is that, you know, this little function symbol is on the left-hand side because the power of this from the integration account is you can actually do tracking not just from the components that we provide but we provide you with an API as well, and the API allows you to be able to track information from anywhere. It could be from [inaudible 00:03:39] function, it could be from a system on premises because it’s just an HTTP endpoint that you can then augment that sort of tracking stream with your own data from various systems in that sort of pipeline, in that chain of integrations to be able to push all of that data to OMS as well so you really can get true end-to-end monitoring.

Okay, the other important kind of context or concept here in enterprise messaging is what we call the VETER pipeline: validate, extract, transform, enrich, and route, or "root" now that I'm in the U.K. It's the ability to kind of take something and validate it, and if it's XML you wanna validate it against an XML schema. You probably wanna pull values out of that, and we have expressions in Logic Apps that allow you to do that, pull these values out, so you can take sort of subsets of messages, you can take values out and do business logic decisions based on what's actually in the message, and do content-based routing with that as well ("rooting," I apologize), and enrich the message.

You can create new messages using our compose action and some other data operations as well. So you get a sort of full toolbox of being able to take data in any format. You can take it in JSON and convert it to XML. You can then take the XML and validate it. You can then map it, you could extract values out of it, map it again, transform it maybe into a completely different format, do some enrichment on it and create new messages, and then send them off to wherever you want. A canonical use case for this, as I mentioned this morning, is using Azure Service Bus to do this but you could, of course, use any protocol, transport, or connector that we provide.
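To give a feel for what a VETER-style Logic App can look like in code view, here is a minimal sketch. The schema name, map name, and routing threshold are hypothetical, and the integration-account action shapes are written from memory, so treat it as illustrative rather than a definitive definition:

```json
"actions": {
  "Validate_order": {
    "type": "XmlValidation",
    "inputs": {
      "content": "@triggerBody()",
      "integrationAccount": { "schema": { "name": "OrderSchema" } }
    },
    "runAfter": {}
  },
  "Extract_total": {
    "type": "Compose",
    "inputs": "@xpath(xml(triggerBody()), 'string(/*[local-name()=\"Order\"]/*[local-name()=\"Total\"])')",
    "runAfter": { "Validate_order": [ "Succeeded" ] }
  },
  "Transform_to_canonical": {
    "type": "Xslt",
    "inputs": {
      "content": "@triggerBody()",
      "integrationAccount": { "map": { "name": "OrderToCanonical" } }
    },
    "runAfter": { "Extract_total": [ "Succeeded" ] }
  },
  "Route_large_orders": {
    "type": "If",
    "expression": "@greater(float(outputs('Extract_total')), 10000)",
    "actions": {},
    "else": { "actions": {} },
    "runAfter": { "Transform_to_canonical": [ "Succeeded" ] }
  }
}
```

The empty actions/else blocks are where the routing targets would go, for example two different Service Bus send actions, which lines up with the canonical Service Bus use case mentioned above.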

Okay. So let’s keep going. Okay, so on top of all of that we have a large number of message handling capabilities. So a lot of flexibility in terms of the content type and how to deal with that that we provide. You can receive messages not just in JSON format but in XML, as I mentioned, but honestly in binary as well. So we really don’t care what kind of data you’re pushing through Logic Apps. We just provide you with tools to process whatever format that data it is, or if it’s binary we just kind of get out of the day and you can just pass it all the way through. So, you know, with our large message support you could receive binary data of HTTP, and we’ll just chunk it all the way through. You know, it could be a gigabyte in size and we can push it all the way through and maybe send it to blob storage or something like that. If it’s, you know, actual, sort of, structured data then you probably wanna do something more interesting on that and then you can use the other operations that we provide, you know, with XML, etc.

So XSLT, you know, this is a 20-year-old standard. XSLT1 has been around for a long time. We also provide XSLT3 support now as well. And this provides a lot more capabilities, and I'll touch briefly on that in a moment. XSLT3 is backward compatible with XSLT2 as well. So you've got a lot of kind of capabilities now, because XSLT was great but it had a lot of limitations, and now a lot of those limitations are being removed with 3 to make it much easier, doing things like JSON to XML transformations as well, for example.

And, of course, we provide you with the Visual Studio integration, so you have the ability to create maps graphically in Visual Studio. For those of you who are long-time BizTalk users, you'll see a similar mapping experience to what you have in BizTalk, including the ability to actually add parameter support as well, which was always a little tricky in BizTalk. In Logic Apps we've made this very easy, so that if you have parameters defined in your map you'll see those rendered out in the card in Logic Apps and you can just pass values in and out of your map when you process your XML messages.

What else have we got? Some data operations. So probably the interesting ones are the things that can take one data format and convert it into something else. So maybe you've got an array and we allow you to filter that array, select particular values or sort of columns, if you like, out of that array, be able to join values in an array and kind of flatten it, as well as being able to take data and write it out as CSV or HTML format. I mentioned flat file already, and validation. I think EDI I've covered as well, other than the fact that we have batching capabilities, which I shall come onto next.

Batching is kind of interesting. I talked about this last year for those of you who were at this event a year ago. What batching allows you to do, very simply, is decouple the creation of messages or items that you wanna put into a batch from the consumer of that set of items in the batch. So one Logic App, or multiple Logic Apps actually, can add messages to a particular batch. Then you can set up release criteria which control when you de-batch. And those criteria could be based on the number of items that you have in the batch; maybe every 50 you wanna fire that Logic App to consume batches of 50. It could be based on time, so maybe you wanna consume them all at the end of the day. Or it could be based on the actual size of the data in that batch, or any combination of all of those. So you can always make sure that you process everything in the batch at a particular time or after a particular number of items has been reached, so you don't have to wait for it to fill up with 50 items. You can just process every 30 minutes or so, if that's what you wanted to do.
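To make the release criteria concrete, this is roughly what the trigger on a batch receiver Logic App looks like in code view. It's a sketch from memory: the batch name and the numbers are made up, and the exact property shapes may differ slightly from what the designer generates:

```json
"triggers": {
  "Batch_messages": {
    "type": "Batch",
    "inputs": {
      "mode": "Inline",
      "configurations": {
        "OrderBatch": {
          "releaseCriteria": {
            "messageCount": 50,
            "batchSize": 1048576,
            "recurrence": {
              "frequency": "Minute",
              "interval": 30
            }
          }
        }
      }
    }
  }
}
```

Any one of the three criteria, or any combination, can be specified; whichever is met first releases the batch to the receiver.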

You know, the decoupling is interesting because it creates a, you know, separation between the consumer and the sender, so that the Logic App that is actually doing the de-batching is aware of the batch itself but it's not directly aware of the Logic Apps that are sending to the batch. The other thing you can do is…we support partitioning. So you can simply specify a key on the batch. So even within sort of a larger batch of like 100 items or 1,000 items or whatever, you can actually specify a key within that so you can de-batch out of that in groups. There's a lot of flexibility in terms of how you can deal with that. You know, very simple. When you compare with what that looks like when you write code, it's a non-trivial problem that we make very simple for you.
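On the sender side, a Logic App adds items to the batch with a send-to-batch action that names the batch and, optionally, a partition key. Again this is an illustrative sketch: the resource ID is a placeholder and the partition expression is invented:

```json
"Send_to_batch": {
  "type": "SendToBatch",
  "inputs": {
    "batchName": "OrderBatch",
    "content": "@triggerBody()",
    "partitionName": "@triggerBody()?['customerId']",
    "host": {
      "triggerName": "Batch_messages",
      "workflow": {
        "id": "<resource ID of the batch receiver Logic App>"
      }
    }
  },
  "runAfter": {}
}
```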

Okay, let’s change context again. One of the interesting things with EDI and B2B is there’s a lot of requirement around how do you deal with downtime. Most companies who deal with sort of supplier relationships have to deal with very stringent recovery time objectives, often measured in terms of minutes, if not hours. And, you know, when you’ve got everything running in the cloud this may be a concern because it’s no longer in your data center. You can’t go in and just shout at people who are downstairs or in another building. You actually have just to wait, right? You know, if your national data center was offline or something within it was offline you’re kind of stuck, right? Well, not with our solution because you can actually deploy to multiple data centers. You can deploy to two different data centers, your, you know, Logic Apps and everything else associated with that. And then if there was a problem you could move the traffic, move that to the secondary data center and continue on your own terms because we take all of the control numbers and all the other state that we maintain in data center A and replicate that to data center B so that you can actually resume processing in the secondary data center if there was a problem which is kind of, you know, peace of mind type stuff.

The other thing actually which…somebody else might mention this tomorrow so I'll just say it very briefly. The other thing we'll be enabling very shortly is the ability to do DR for all kinds of Logic Apps, being able to replicate the trigger state. So not just kind of B2B stuff, but replicate the trigger state from one region to another. We kind of have a geo-replication story for the trigger state today; we just don't expose it and make it easy for you to replay Logic Apps so they start in the secondary region, but that will be coming very shortly as well.

What else do we have? I think that’s it for that. Let’s have a look at tracking. So I mentioned the way the integration account works in terms of tracking. You have a number of options here actually. You can not just use the integration account for using that API or using our components to write out tracking data and push them to log analytics, but you can also push them to a storage account of your choice, you can push them to event hub and do some downstream processing. You can also use Azure Monitor if it’s kind of hot path monitoring. So Azure Monitor sort of has a hot path pipeline that allows you to set alerts up when thresholds are reached so that you can set this up in your Logic Apps in your environment and get alerts and notifications when certain thresholds are being crossed. Maybe it’s the processing time of a particular Logic App or something else and actually configure that and set up alerts for it.

What else? Custom tracking, yes. So the other thing you can do is actually write your own data items from a Logic App. This is actually exposed in the designer now. It was always a bit fiddly in code view previously: if you wanted to write, I don't know, something like a PO number or something from the data flowing through your Logic App, you kind of had to use expression voodoo to do that. Now we have a nice tabular format for being able to see the tracked properties that you've defined in your Logic App and sort of the path to the data in the messages that are flowing through, and send all of that out to Log Analytics/Operations Management Suite, which is a superset of that, and visualize all of that, and Divya will show this later, in the portal that we provide with our custom solutions. That's both the B2B solution, where we do all the correlation between, you know, the technical and functional acknowledgments and all the goop that is these protocols like EDIFACT and X12, but also a generic Logic Apps dashboard as well for your entire environment: everything that's going on, Logic Apps that are running, Logic Apps that have failed, why they failed, kind of drill into all of that and even write your own queries. So, you know, very powerful stuff and basically essential when you have more than a handful of Logic Apps. How do you deal with all of that and how do you track everything? Okay. Why don't you talk to this one, Divya? It will lead you into a demo.
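As a rough illustration of what the designer writes for you, a tracked property is just a name/expression pair hung off an action in code view. The action, schema, and property names here are made up; the trackedProperties shape and the @action() reference follow the documented pattern:

```json
"Parse_order": {
  "type": "ParseJson",
  "inputs": {
    "content": "@triggerBody()",
    "schema": {
      "type": "object",
      "properties": { "poNumber": { "type": "string" } }
    }
  },
  "runAfter": {},
  "trackedProperties": {
    "PONumber": "@action().outputs.body.poNumber"
  }
}
```

Whatever you put in trackedProperties is emitted with the run's telemetry, so it shows up in Log Analytics/OMS and can be filtered on, as Divya shows later.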

Divya: All right. So we’ve been busy in past few months and especially from the last Integrate that we were here. We have done a lot of improvements to the existing connectors and we have been working on a lot of new connectors as well. So first one is SAP. We already announced in the keynote today that we are doing a private preview for the SAP Trigger. Along with that we are also very soon going to enable schema generation capability in the SAP connector. The next one that we mentioned is about SFTP. SFTP connector today only supports 50 MB. We will be releasing very soon a new version, a new connector, essentially, which would support large message sizes, one GB and more. Next is SOAP as SOAP connector. We have made improvements to SOAP connector as well. We started with…we added like pass-through support for the SOAP connector, also added the capability to connect to on-premise services using the connector.

Finally, batching support, which we are adding incrementally: we provided that for X12 and now we are also adding the batching capability for the EDIFACT connector as well. That should be out very soon. In terms of existing connector improvements, we have improved the message size support for the AS2 connector, which now supports 100 MB. We've also done some improvements for other EDI connectors, such as X12 and EDIFACT, where at runtime you can provide overrides to the delimiters or the preserve-interchange properties that you've provided in your agreements. So with that we'll move on to the demo for the SAP connector. So the connector that we have now is the ECC connector, and it can be used to connect to R/3 systems as well as SAP S/4HANA systems.

The connector provides connectivity to your systems that are on premises, and for that there are some prerequisites. The first one is the data gateway, which should be in the same network as your SAP system. In terms of the setup of the data gateway, you need to install the SAP NCo libraries from the SAP marketplace, for which you need an SAP license. And then finally the gateway should be able to access your SAP system, so depending on the environment you may need to whitelist your gateway so that it has access permissions to your SAP system.

Let’s quickly see how the SAP Trigger works. So this is specifically…in this slide I’m specifically talking about how the newly added SAP Trigger works. So if you create a Logic App with SAP Trigger there are two things that happen with that. The first one is that an RFC server registration takes place on the data gateway. And the second thing is that, as I mentioned earlier, Logic App…the SAP Trigger is a WebHook-based trigger so we register a callback with the RFC server. And this step happens one time when you’re creating the Logic App but at the same time we repeat this at a regular interval. So whatever connection we have established, it gets renewed at some frequency.

On the other hand, when you want to do communication from SAP to Logic Apps which means that when events are triggered from SAP system, they first go to the data gateway where the listener is listening to the RFC or the program IDs, and from the data gateway they are sent to the SAP Trigger via the callback that has been registered at the RFC listener.

So now that we understand the working of the SAP trigger, let's move on to a demo and let's build a Logic App from scratch with the SAP trigger. All right. So I have a new Logic App here, and I'm going to look for the trigger. So there are three mandatory properties here which vary depending on your system. So what I'm entering here is for my system and the program ID that I have registered. I am going to check again that I have the service right. All right. The optional parameter here is the SAP action. So this is the namespace…if you are from the BizTalk Server world you are probably familiar with the namespace URI to provide for the document that you want to receive from the trigger.

Now this is optional because we want to provide complete flexibility for the user here. If you do not provide anything here, then this trigger is going to receive anything that is sent to the program ID specified here. You can also provide…this is an array, which means that you can use this trigger to listen for one or more specific message types. So today, for the purpose of this demo, I'm going to leave it empty, which means that I want to use this trigger to receive any message that comes to this program ID.

And let’s send whatever we are receiving to our supplier. So this is similar to the scenario that we demoed in the keynote today. I’m just simplifying a bit. All right. Let’s save it. Permission contains invalid… I’m going to pick the folder directly from here. Let’s just pick this. Now I have saved my Logic App. I am also going to quickly check the trigger history for this Logic App. So the successful trigger run here indicates that the registration from the Logic App to the gateway was successful, which means that the connection has been established with the SAP server. So now I am going to send a message from my RFC. All right.

Jon: Did it work?

Divya: Nope.

Jon: So there’s a lot going on here, right? So, you know, one of the problems we have that we needed to solve with the data gateway is the ability to, you know, remove all this polling stuff out of the equation and have the data gateway act as a server that SAP can connect to. So you do all the configuration in SAP to tell it where to send the data to and that can’t just be, you know, a Logic App. We need a broker. We need something acting as a proxy, as another…looking like another SAP server for that to send the data to, and then the gateway knows which Logic App to send to and that’s what that trigger firing registration does. It says, “Hey, I’m a new Logic App. I’m a new trigger.” Calls down the gateway, registers itself on the gateway so that when the gateway has the connection established to SAP, the whole thing’s wired up. The other way around when SAP sends data, it can then trigger the Logic App automatically. Did I give you enough time to get it working or I’ll just keep talking?

Divya: Yeah, I think we have seen the trigger running this morning. I'm just going to open that Logic App and show you what it looks like, but you get the idea that configuring the trigger is very simple. You just have to provide a few parameters here, and then you have a running Logic App for that. For those who were not in the keynote I'll quickly open the trigger and show you what that run looks like. So basically whatever is sent from SAP is received by the trigger, and then you can do whatever processing you want to do on the message and integrate with the external system and send a message there. All right.

So in this scenario we have sent a message, a transaction, to the supplier. Now for the next scenario, the supplier, Fabrikam Supplier, who is integrating with Contoso Retail, is sending more than one type of document. So, as we demoed in the morning, we are sending a purchase order to the supplier. In response to that the supplier sends us a ship notification, and the supplier is also sending weekly reports that are much larger in size. So here is my SFTP server on which I have two files. One is the ship notification, that's a smaller file, but then I have a weekly report that's a 200 MB report. So I have already copied these files to the SFTP server and I just enabled my Logic App so that it starts reading those files. So we should see two runs happening here. All right. So we have one run that is in progress and one that just completed, which was probably the smaller file, and let's see what happens here.

So I have two processing paths here. If the file that I'm receiving from the supplier is a report, I do not do any B2B or EDI processing on it; I just send it to a blob and delete it. If the file is not a report, then we process it using the AS2 protocol, and then we do some XML validation on the received file. The XML validation is basically leveraging the SAP schema and validating against that to ensure that the transaction, the message that is received from the external partner, is valid and we won't be getting any errors when this gets submitted to our SAP system.

And finally, using the send action, we are sending this message to the SAP system. Here I want to point out that for XML validation you need the schema from the SAP system. Right now…one second. Right now there is no in-product capability to generate the schemas. If you're coming from BizTalk Server, well, you probably already have those schemas. But for customers who do not have a BizTalk Server environment today, there wasn't any first-class support for this. So we are also adding the capability in-product within Logic Apps to generate the schemas. And this is how that's going to look.

The capability is provided via an action that is also going to be added to the SAP connector, and the action basically takes the namespace, the URI for the transaction, for the message type for which you want to generate the schema. So the URI contains all the relevant information such as the type of the document, version, release, etc. Again, this is an array, which means that you can provide more than one type of transaction here, run this action, and get the schemas for all of them. In order to consume those schemas, you can either put them in blob or somewhere else, but the most natural choice, if you're doing B2B integration using Logic Apps, would be the integration account. So for that I'm using the ARM action to update the integration account with the schema that was generated in the previous action.

This is work in progress. We have plans to improve this experience by providing a first-class action that would allow you to directly upload these artifacts into the integration account. Now let's run this.

Jon: Can you zoom in?

Divya: Can you repeat that?

Jon: Can you zoom in a bit?

Divya: Oh.

Jon: Do you have zooming on that? Or just…yeah.

Divya: All right. So this was…the action has been executed successfully, and I am uploading the schema into the integration account with the name "integrate schema.existing." So let's take a look at the integration account. And you see the schema file here. Right? So with this we now have an SAP connector that allows you to send messages to the SAP server, that supports receiving messages from the server, and also generating the schemas from your SAP instance. Let's look at the…

Jon: Yeah. Let’s see if [inaudible 00:29:05] and make files done.

Divya: …SFTP run, which has probably completed. So it took 3.45 minutes to download a 200 MB file from the SFTP server on this network. This is the action that downloaded the file. Here I have the Fabrikam report, which, as I showed, was 200 MB. I uploaded that to the blob, and finally I deleted the file after downloading it from the server. All right. Let's see. I think…

Jon: Just while you’re reading this, one thing is interesting about the generation of the XML schemer is that, of course, you move this from being a design time to a runtime activity, right. I mean, of course, you can create them in design time but it also opens the door to being able to keep schemas up-to-do date when things in SAP change. So, you know, previous to today you kind of go to Visual Studio and you generate the schemas or you download them from SAP or whatever the mechanisms you used to get them, but now you have the ability to actually create a Logic App to generate the schemas for you and put them in the integration account automatically.

You know, there’s one customer that we work very closely in terms of the requirements and in terms of the usage of Logic Apps and for…they do a lot of kind of SAP and dynamics integration and they’ve seen the sheer time it takes them to implement these types of integrations dropping dramatically from the competitive product that we were using previous to Logic Apps by, you know, 95%, which is pretty impressive. And this is a company that’s very familiar with SAP, very familiar with integration, and still saw a huge difference when they moved to Logic Apps and started using Logic Apps to connect with SAP and with their CRM system.

Okay. Let’s have a look at, you know, another part of this kind of enterprise integration space. You know, very common, as I mentioned earlier, is the necessity to be able to use mapping to change the format of the data. Not just, you know, XSLT1 support but 2 and 3 now. But not just that. You can also use your own custom assemblies and upload those into the integration account as well. You might’ve just seen that when Divya was kind of showing all those buckets in the integration account with the schemas and maps and everything, but you can also upload your own code. So you can take libraries that you already have or write new ones and call those from your maps to do some really interesting things. If there’s no out of box support for the fun toys that you need or for, you know, writing script to do these things, you can actually reference your own assemblies that you can use and reuse in your maps.

The other mapping capability that we have is Liquid. So if you're not familiar, Liquid is a sort of open-source standard that was released by Shopify, which is a platform that allows you to kind of create e-commerce websites. And they kind of realized they had this problem that everyone's requirements are slightly different: creating a generic platform to build an e-commerce website out of required some type of templating, to be able to take a generic engine and customize it and provide a tailored experience for their customers. It was open sourced and released to the community, so we took that and gave you the ability to create Liquid maps. What's interesting about Liquid maps is that they're very, I guess, content heavy, as opposed to XSLT, which is kind of code heavy. So you actually, you know, have a different use case for this, where it's very good if you want to do things like email templating, because most of the map is actually static text and you just want to drop values into that from the kind of state that's flowing through your Logic App.

So you parameterize that and then just sort of map in the values. So imagine this was an email you're sending out to somebody. You just wanna drop in their name and some other details, maybe some order or whatever. So you just have a handful of items and the rest of it is just static text, which you can then manage through a Liquid map. That's a lot easier oftentimes than having to write XSLT, if you're not familiar with that, because it's sort of simplified for, I guess, people who are less familiar with XML and certainly less familiar with XSLT. But it's also very powerful. It has the ability to create loops that you can spin over arrays of data. You can do conditionals, you can create arrays, you can kind of take branching logic and render out different pieces of text depending on the input values. So, you know, a lot of power and flexibility with that. And that also comes with the integration account.

Just a little word on XSLT3. This is actually pretty new. It became a recommendation in 2017. For those of you who aren't familiar with what that means, it's a W3C spec, which goes through a long kind of review process, and as soon as it hits recommendation it's basically done, which means it's safe to implement. So we've taken one of those implementations and provided it in, you know, the serverless awesomeness of Logic Apps, so you don't have to care about how it actually runs or where it runs, and then you can just upload XSLT3 into your integration account and just use it. It has a number of interesting improvements over XSLT1 and even 2. One of them is being able to do JSON to XML mapping and vice versa.

And this is a lot more powerful than if you just use Logic Apps and our expressions to do this, because there's a lot of complex mapping that you can put inside of that XSLT. It's not just about converting something from one thing to the other; you can actually do a whole bunch of mapping on it during that conversion, inside your map. Other things you can do, which were basically impossible to do in XSLT1 and required you to use custom code or script or whatever, are things like evaluations. So now you can provide an XPath in a string, and you can concatenate values and pass a whole bunch of stuff in as the XPath expression and evaluate that dynamically, which is also pretty cool because, as I said, you'd just be writing code to do that otherwise and it gets kind of hard to maintain.

Another thing that XSLT3 provides is what's called TVTs, or text value templates. If you are familiar with jQuery, it's a similar sort of idea, where instead of doing all this verbose kind of value-of and select stuff in XSLT, you can just use curly braces as selectors into the data, into the input XML, and kind of pull values out and concatenate them and change them very, very easily. So it's sort of a shorthand way of being able to path into your data much more easily than XSLT1 provides.
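To make that concrete, here is a small illustrative XSLT 3.0 stylesheet using text value templates. The input element names (Order, OrderId, and so on) are invented for the example; the key part is expand-text="yes", which is what switches the curly-brace syntax on:

```xml
<xsl:stylesheet version="3.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                expand-text="yes">
  <!-- With expand-text="yes", {expr} inside literal text is evaluated
       as an XPath expression against the current node. -->
  <xsl:template match="/Order">
    <Confirmation>Order {OrderId} for {Customer/Name}: {count(Lines/Line)} line item(s)</Confirmation>
  </xsl:template>
</xsl:stylesheet>
```

The dynamic evaluation mentioned a moment earlier is xsl:evaluate in XSLT 3.0, and the JSON support comes from the built-in json-to-xml() and xml-to-json() functions.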

And then the other major enhancement is a load more built-in functions. XSLT1 had a few. This has dozens of these things, across kind of date, arithmetic and math, and string manipulation, and a whole bunch of others, so you have, you know, a lot of flexibility and capability with XSLT3. It's not just a minor upgrade. It's pretty significant. There are a bunch of performance improvements and things as well, but these are sort of the main functional ones.

Okay. We’re running out of time, Divya. Okay. Liquid, actually I think I’ve covered this already. Just a very, you know, minor example of what this looks like. You can take an example of an “if, else” or an “end, if” rather. It can be an “else” in there as well, and “Hello” is the text and you can pass in the username and it just basically piles into like JSON that you can pass into this thing from your Logic App and give you a very simple way of creating a template in Liquid and uploading that to your integration account.

Okay. I know I talk fast. I might start talking faster in a minute because we've only got a few minutes to go. SOAP support. So who here uses XML web services? Who has SOAP services in their organization? Yeah, a whole bunch of people, right. Who still creates new SOAP services? Yeah, still a bunch of people. Yeah. So this is a capability we introduced after last year's Integrate. I think it was in late 2017, and then we made some more enhancements earlier this year, so it came in sort of two steps. The way SOAP support works in Logic Apps is you create a custom connector, and I'll actually show this in a second. The custom connector you basically point at your WSDL, or point at the endpoint of your service, and it automatically generates a connector for that service. So it's strongly typed in that it provides all of the operations that it finds defined in that endpoint and, of course, all the parameters to pass in to that service to be able to call into those operations.

The other thing that we added, you know, earlier this year is pass-through. So you can actually have a very generic way of calling your SOAP services, where you can at runtime pass it whatever you like, provided it matches the contract of that SOAP service. That's interesting because it opens up some more use cases where maybe the Logic App is just acting as a façade to the web service itself. Maybe it's taking JSON in from something, converting that to XML, maybe using a Liquid template to create the XML document that's going to be passed in to the SOAP service, and just parameterizing that from JSON. Being able to mash up different services like that gives a sort of new lease of life to your web services and, you know, more flexibility in being able to consume them.

And the final thing is it also works with the data gateway, so these SOAP services do not have to be publicly facing, because the data gateway works using Service Bus relay under the covers and makes an outbound connection from your data center to Azure. So it's a secure outbound pipe, so there are no firewall issues and no security issues of exposing what may be internal SOAP services on the internet.

Okay. Let’s take a quick look at how that works. So what I have here are just a few things. I’ve got a Logic App which is going to call a web services mapping with SOAP thing. And then I’ve got this thing. which is my custom connector, which is going to add two numbers together. As I said, this is all about enterprise integration so mission critical, 24 by 7 business functionality that you get called up at 3:00 in the morning if it doesn’t work.

So let’s start by looking at that. In fact, actually let’s start looking at the web service itself. So this is a circa, you know, 2005 web service. I can click on the WDSL and I can have a look at this beautiful XML representation, the description of the service. And in here I’ll see that I have an operation called “AddTwoNumbers” and it takes two parameters, first and second. I’ll just zoom in for the people in the back. So two integers that it takes in, and the implementation is over here and looks like this. You can see there’s really no magic here. It really is as complex as adding two numbers together. So it kind of takes, you know, in code…this is the Visual Studio and it’s actually running in the debugger, takes these two numbers and it’s going to return the result.

Okay. Let’s go back to the browser. So when I created custom connector, this is what this looks like. I can point it at my WDSL. So I have a few options here. So first off, custom connector doesn’t do much more than just SOAP. They can call pretty much any REST HTTP end point. So I kind of pick the radio button moving from REST to SOAP, and then that allows me to pick the WDSL, and the WDSL can be uploaded from a file or it could just point straight at the URL. I uploaded it from a file because this is an internal web service running on my machine, so when I sort of look down you can see this is actually just running…if I get down to the bottom here, it just said, “Running on local host.” You know, because I’m running it in the debugger in Visual Studio, this is not running in the cloud. This is not pubically accessible, right?

And I can set an icon on this thing, which is beautiful. I can also specify the call mode. So it's either SOAP to REST, which gives me the rich typed experience where I see the parameters, the first and second parameter values, and that's what I'm going to do here. Or I could do, say, pass-through, which as I said really takes that typing out of the way and lets me call any SOAP endpoint that I point it at. It doesn't really care so much about the operations and definitions in the WSDL, so I can be much more flexible in terms of how I do the processing. Okay, so let's zoom back out again and look at the Logic App itself. I'll just quickly open this up. Wait for the WiFi to crap out. Grief, it was faster a second ago.

Okay, while it’s mulling over there, what I’m going to do is I’m going to just use Postman to call this. So, oops. What I have here is a simple SOAP [inaudible 00:41:40] envelope. I’ve got the first and second numbers here. So just to kind of prove this is real, I will change these and everyone can kind of work that out. Just pop back. It’s loaded my Logic App. I’m going to open this up. So what I’m doing here is I’m just receiving A to B request. As I said, it doesn’t care what it is. In this case it’s XML. Then I’m creating a couple of variables where I’m going to pull using XPath out of that and pick the first item out of the XML incoming message, you know, the value nine, and then I’m going to put that into a variable. And then for the second I’m going to create another variable I’m going to pull a value out for that because that makes it very easy then to use my custom connector. And if I just kind of show you this, it’s really…appears kind of built in, that once that custom connector is deployed I can just type this thing in and I should see, you know, my add two numbers. I actually see, because I had…I just literally used the standard template, where you just see that I get this kind of great data using…to get a data contract thing as well because it’s WCF, but I can pick my add two numbers and kind of open that up and then you see the two values and I can easily just map it in.

But I did that already, so I'll show you that because it's just down here. You see, I just pass in the first and second variables, and then I simply send the response out from that and send it back to Postman. So if I run this, what should happen, because I've got the debugger running in Visual Studio, is it breaks immediately and I can hover over these values and see the first and second, nine and one, that I passed in. And, you know, meanwhile in Logic Apps, if I go back to this and look at the run history, I should see a running Logic App, which is kind of waiting for me in Visual Studio to complete. Let's see if it actually does refresh this time.

Maybe not. Okay. So you see this thing’s running, and if I open it up you’ll see it’s blocked on that call to my SOAP service, which is running on my local machine in Visual Studio on that break point. So I can just let that go and continue, and then pop back to my Logic App. You can see it’s completed. You can see the response came back and it’s just amazing, the power of technology, as I said. Things move so fast that it actually took the numbers 9 and 1 and added them together and got the right answer, 10. All right. Thank you.

And, you know, obviously there’s the result in Postman as well. Okay. Let’s go back to the slides. You get these slides. We don’t need to walk through all of this. I think there’s a whole bunch of things here around OMS. We’ll quickly show that because that really lights up a whole bunch of the telemetry data that we collect. It also provides the ability to take actions on your data, like being able to do bulk re-submissions of failures in operations management suite. You can also filter by run and use your track properties and filter those down as well. So if you’re looking for a particular purchase order number or some other data item, whatever you define and push into log analytics, OMS, you can find it. And you’ll also soon be able to see whether runs have been resubmitted or not so that you know whether you should keep resubmitting these things until they succeed so we’re kind of tracking that as well as the ability to do bulk downloads. So let’s quickly show that, Divya.

Divya: So this is the OMS dashboard. If you have a workspace and you click on the Logic Apps management solution, this is where you land after clicking on the initial donut. So the first tile here shows a view of the Logic Apps that you have and their status, all the runs that have happened for them. This is ordered by the failure count, so the Logic App that has the highest failure count will show up first and so on. The second tile is basically sorted based on the failure type. So you can see the status and the count for them.

And the third is more around the action and the kind of failures that have happened for that action. So each of these can be a different entry point. A DevOps person who wants to just look at the failed Logic Apps and probably take some action on them, for example resubmit the runs, might actually come only to the failed runs in the second tile and click that. If you specifically want to take some action on a specific Logic App you might choose the first one and click on a run there.

So for this let’s take a look at the…we check Logic App. So here most of the runs are successful. In terms of interesting information, you have everything around the status, tracking ID, run ID, so like Jon was mentioning, you can do all sorts of filtering on these parameters to narrow down the results and take some specific action.

I also have tracked properties here. So I do want to mention that now we have first-class support in the designer for tracked properties. You do not have to use the JSON view for that. Go into the settings for the action and you will see tracked properties there. So here, for the tweets, I have added two tracked properties. One is the content of the tweet and the other is who's tweeting it. So in terms of filtering, for example, let's say you want to look at all the tweets that contain SAP; I can clearly see here that out of all 300-plus tweets there are 30 that were specifically about SAP.

So in terms of a business scenario, you could actually push the business properties on which you want to search for your runs and leverage those on this dashboard to filter through those properties. Another capability that we have here is around resubmit. So let's say I'm going to resubmit… All right, let's go to the runs. I'm going to pick some failed Logic App runs here and resubmit them. This is bulk resubmit, which means you can pick one or more runs and resubmit them. You can now go to the Azure portal and see what the status of those runs is. Right now we do not have a capability on this dashboard to differentiate between the runs that were the result of a resubmit, but we are working on it. That is coming pretty soon.

The other capability that we are going to add is around payload download. So we want to enable payload download both for single runs and in bulk. And that is also going to come very soon, and it will be enabled both for Logic App runs and for actions. So you could pick an action, go to that action, and then download the inputs and outputs for it and see why that action had failed. Again, these are different entry points. With this you can actually look at the kind of failure and see which Logic App has failed due to failures in the actions or the trigger.

Jon: So, you know, it's very simple to enable this: when you create a Logic App, you can just switch it on. The other thing you can do is actually pin all of these tiles, all these visualizations that you see in that dedicated portal, in the Azure portal as well. So you really do get the best of both worlds, in that you can create a dashboard, you know, like the ones that we have with all our Logic App resources, and you can actually pin all of these tiles to that as well. So you could see your Logic App's run history for a particular Logic App, but also see all of the telemetry across your Logic Apps and open up all of those views inside the Azure portal as well.

A whole bunch of things is what’s next for the SAP connector. We’re hoping to GA this in the summertime, but there’s also a lot other additional enhancements we wanna make, including SNC support, supporting outbound batching from SAP, supporting VNET connectivity using the integration service environment, so you don’t need the gateway. Also doing things like BAPI commit and native scheme support for SAP so there’s no schema generation required. You could literally just export the schemas straight out of SAP.

I said I was going to talk faster and I am. So then a few other things. The second one down, I think, should be NetSuite. So these are the next set of enterprise connectors that we're thinking about implementing. So, you know, we'd love to have your feedback on which ones are important. We've heard these four a lot, pretty much in this order. So, you know, we're looking at…basically we do six-month planning cycles. We're looking at these to see which ones we can get done in the next six months, before the end of this year. We're also looking at integration account improvements, having a consumption-based integration account. Yeah.

All right. So nobody cares about that, right? We should just carry on as we are, yeah? You heard that Dan Chauvin [SP]. Is he there? Is he there next to you, Stefan? Is he gone? Yeah. So, you know, obviously pricing works great if you’re a very high volume user. Logic Apps doesn’t work quite so great if you’re not. So we are looking at that and how we can fix that and be fairer in terms of the pricing for that if you don’t kind of pile through millions of messages a month through this thing.

What else have we got? Yeah, BizTalk parity. Again, we'll make sure you can see these slides. We really wanna keep closing the gaps with BizTalk. I think that's obvious. BizTalk has a huge amount of rich functionality and always has, you know, on premises. So we wanna introduce some of the, you know, final things that BizTalk has that Logic Apps doesn't. Again, this is sort of a look at our roadmap kind of further ahead. So again, we'd love to have your feedback on this. And I think that's it. All right. Let's wrap it up. Thank you very much.
