Integrate 2016, May 11-13, London
Kevin: Jon and I are here to talk about our enterprise functionality roadmap and what’s coming in Logic Apps with regard to our enterprise capabilities. Today we’ll talk about integration accounts; we saw some intros to them in the keynote. That’s our container for artifacts, where we’ll put the different artifacts you need for doing much of your B2B and XML processing. Then enterprise messaging, where we’ll go into the B2B functionality that enables EDI for X12/EDIFACT/AS2 and some of those enterprise features around message handling. Then how we can do Visual Studio development with the tools that you are used to, and enable those scenarios. Then finally enterprise connectivity, and how we’re going to do hybrid connectivity to the systems you care about on-prem.
First, let’s talk about the integration account. The integration account is a flexible container for your enterprise artifacts: your XML schemas, your maps, your XSLT, trading partners, agreements and the certificates needed for doing B2B interactions over EDI with AS2. That container for artifacts is acting as our vertical. One of the things we want to do with the integration account is first support EDI out of the box, so you can see how we built one of the verticals on top of logic apps. Then we want to make it extensible, so you can take that integration account, add your own metadata to partners and agreements, and have your own trading partner context. You don’t necessarily have to use it for EDI: you can have your own notion of a trading partner agreement for another vertical that you want, and extend the platform in that way.
It’s easy to manage. The enterprise integration pack is an Azure resource, so it goes through Azure Resource Manager and you get all the benefits of using it as a resource. That means you get a management REST API with OAuth. You can tag all of the elements that you put into your account. You can have auditing of all those requests, so you can see who’s adding and removing items from that store, and that’s really important: for a lot of your B2B scenarios we want to make sure that if anybody has modified your trading partners or agreements, you can see it.
Of course, you want to have a REST API; you don’t want to be tapping out trading partners and agreements on some portal, especially when you have hundreds or thousands of them, or when you want to automate creating new partners and agreements in your system. Of course we have PowerShell to enable that as well. Because we’re behind ARM as a resource, you get the secure, scalable and manageable experience you’d expect out of any Azure resource. Let’s see what’s actually...
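To make the automation point concrete, here is a rough Python sketch of building the ARM URI for an integration-account artifact. The resource-provider path and API version shown are assumptions for illustration, not quoted from documentation, and the authentication step is omitted:

```python
# Illustrative only: construct the ARM URI for an integration-account
# artifact. The provider path and api-version below are assumptions.
def artifact_url(subscription, resource_group, account, artifact_type, name,
                 api_version="2016-06-01"):
    """Build a (hypothetical) ARM URI for an integration-account artifact."""
    return (
        f"https://management.azure.com/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Logic/integrationAccounts/{account}"
        f"/{artifact_type}/{name}?api-version={api_version}"
    )

# Creating hundreds of partners then becomes a loop of PUT requests against
# these URIs (OAuth bearer-token acquisition is omitted here).
url = artifact_url("sub-id", "my-rg", "my-account", "partners", "Contoso")
```

The same URI shape would cover schemas, maps and certificates by swapping the artifact type.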
Participant: The containers, do they actually have schemas, or are they just containers?
Kevin: Sorry, what’s your question?
Participant: Containers do they have those artifacts or are they just containers?
Kevin: Let’s go look.
Participant: Can you repeat the question?
Kevin: His question was: do they actually have schemas, or are they just containers? Let me show you what I have here: an integration account, which is an Azure resource. You’ll notice that we have schemas, maps, partners and certificates. The one thing that’s missing here, and this is work in development, is agreements; that will be coming soon to this list of artifacts in the system. If you go ahead and look at schemas, what we have today is XML schemas, so you can upload those XML schemas and actually look at them.
Here you can actually look at the content of that schema. These are XML schemas, so you can add to them, upload new ones, upgrade the schemas that you have, or download them to your system for reference.
Participant: My question is, when you subscribe to this enterprise integration pack, I assume that you get all the EDI-related schemas?
Kevin: I see your question. Your question is: do you get all of the EDI schemas, the EDIFACT schemas, when you ask to be on the early adopter program for the EIP? I’ll leave that question to Jon.
Jon: Yes. The intention is to provide the schemas necessary to support the X12 functionality, and EDIFACT later. We’ll talk about all that later.
Kevin: We have a section on EDI; we’ll talk about that. Here we have our maps. Our maps are XSLT, the XSLT that you’re familiar with, the same ones that get built out of BizTalk Server. The BizTalk Server mapper produces a DLL as well as an XSLT, and it’s the XSLT that gets pushed up here. One of the things we’re doing is bringing those capabilities into Visual Studio 2015, so your maps, flat files and schemas are inherited from BizTalk Server, and now you’re able to upload those to this integration account and use them within logic apps.
Participant: Including the functoids?
Kevin: Including the functoids. We’ve modified it so that the functoids are componentized, and we reference those functoids a little differently now than what was generated before. Then you’ll be able to have your own functoids and scripts that you can reference as well.
Jon: Yeah, that’s right. It’s an interesting problem to solve, but being able to have map portability between on-prem and cloud is an important thing. A lot of people have invested a lot of time in the maps they’ve created, with custom functoids as well as the ones that we provide. We’re working towards a model where we make that possible.
It’s interesting how we do that; we’re working out at the moment how to land it so that we make it a possibility.
Kevin: Yes. We will run your maps on our platform. You don’t have to worry about that.
Participant: So we still have to upload the BTM?
Jon: Well, the way that works.
Participant: It’s a good question.
Jon: You still have to upload the BTM, was the question. Your BTM files contain the XSLT inside them. The way that works is with the tooling provided, with the mapper in the Visual Studio integration, to create these things. We’ve now augmented the build: when you do the build, it spits out the XSLT, making it easy to take the XSLT rather than the BTM file, which is really a class with a set of properties that just returns the XSLT. We’ve optimized that to make it easy to get those maps uploaded.
Participant: If I upload the XSLT, then I don’t need to upload the schema?
Jon: That’s right. You never did, right? The schema is a runtime concern; it’s used by the runtime.
Participant: For validation?
Kevin: Yes, for validation against the schema, and we’ll show that later in the pipeline. Then of course you have partners: for B2B EDI scenarios you have partners and agreements, and for that partner information you can see we have portal experiences for creating a partner. If we look at one of the partners, you see the qualifiers you can use to identify and do resolution on those partners. You can see we have a long list of different qualifiers that you can pick from, and the values you can set on those qualifiers to identify and find those partners.
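That qualifier-based lookup can be sketched roughly as follows. The partner names and identity values are invented for illustration; “ZZ” (mutually defined) and “01” (DUNS) are standard X12 sender/receiver ID qualifiers:

```python
# Invented partner data: each partner carries a list of (qualifier, value)
# identities, and an incoming interchange is matched against them.
partners = {
    "Contoso": [("ZZ", "CONTOSO"), ("01", "123456789")],
    "Fabrikam": [("ZZ", "FABRIKAM")],
}

def resolve_partner(qualifier, value):
    """Return the partner whose identity list contains this qualifier/value pair."""
    for name, identities in partners.items():
        if (qualifier, value) in identities:
            return name
    return None
```

Resolution then just means reading the qualifier and value off the interchange envelope and looking them up.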
Finally, besides the agreements, which I want to show you and which are coming up really soon, we have certificates. For AS2 you need a certificate to sign and encrypt your messages, and we want to manage those certificates for you as well. If you go ahead and add a certificate, there are two parts to it. For the public certificate, we will store the public cert in our store; but if you have a private certificate, you’ll see that Key Vault lights up. We take the private part of the certificate and push it into a Key Vault, so Key Vault is now securely managing your private keys. That is the integration account. One of the great things about it is that we’re packaging this together, and it’ll be easily referenced from your logic app. Now you don’t have to have loose schemas someplace; if you were using the V1 pieces of our EDI components, you had to individually stick the schemas and maps inside the connector, they weren’t reusable, and it was very painful before.
Jon: To add to that, some of the feedback we got from V1 was, as we talked about this morning, that changing our connectors to make them much easier to consume means the burden of provisioning those API apps and connectors has gone away. But with some of the EDI components, the burden also included actually uploading the artifacts into that thing. That made them very hard to manage and very hard to move around, especially when you think about production deployments: these things were locked inside those connector instances, and you had uploaded a sort of persistent state inside them. The integration account was born out of that feedback, and it makes for a much easier management experience, with the ability to upload, download and manage these artifacts.
Kevin: Now you can have multiple logic apps referencing the same integration account, and you get better containerization of those artifacts as well.
Participant: Can you manage the 997 and MDN as well?
Kevin: Can you manage the 997 and MDN as well? Jon is going to go through the demo, and you’ll get to see some of that.
Kevin: For those of you familiar with handling XML in BizTalk, everybody knows the VETER pipeline: in your send and receive ports you have a pipeline set up which does your validate, extract, transform, enrich and route. We’ve enabled those capabilities within logic apps, so we can now do enterprise-style XML messaging in logic apps. For validate, we have the XML validation action, so you can reference one of the schemas in the integration account and we will validate that XML against the schema.
What’s coming next is that if you don’t want to reference a schema in the integration account, you can have more dynamic messaging where you retrieve the schema from some other source. You can do inline schemas as well. Then finally we’ll do schema resolution, so we’ll discover the right schema to use instead of being explicit about which named schema to reference in that validation step.
Participant: Do you have a flat file processor?
Kevin: The question was: do we also have a flat file processor? Yes, we do. For extract, one of the things in the language today is that we’ve created an XPath function, so now you can XPath into your XML as part of the code of your logic app. You can select a single node or node sets with that XPath, and with the designer being updated so that you can use functions within it, it’ll be a much easier experience to reference that XPath within the designer as well.
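As a rough illustration of the extract step, here Python’s `ElementTree` stands in for the Logic Apps `xpath()` function; the order document and element names are invented:

```python
import xml.etree.ElementTree as ET

# A stand-in order document; the element names are invented for illustration.
order = ET.fromstring(
    "<Order><ShipTo><City>London</City><Zip>98052</Zip></ShipTo></Order>"
)

# Select a single node out of the document, as the xpath() function would.
zip_code = order.find("./ShipTo/Zip").text
```

In a logic app the equivalent expression would run inline in the workflow definition rather than in external code.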
Transform, this is the big one. We have the mapper in BizTalk. You can use that same mapper from BizTalk Server, and the XSLT it produces will transform your XML. It has the compatibility we had on-prem with the server, and we’ve also added parameter support. Now you can pass parameters into that map, so you can do enrichment within the transform, and you have a different way to get context. Before, you had promoted properties that you had to reference for context; now you can pass parameters from anything running in your logic app into your transform and utilize that in your transformation. For enrich, compose is an up-and-coming capability. Have a question?
Participant: What happened to the mapper in MABS?
Kevin: What happened to the mapper in MABS? Yes. One of the things we had to think hard about was where the mass of mappers is that people are building today and have built over the last decade, and that was with BizTalk Server. We want to make sure it’s really easy to utilize all that work you did in BizTalk Server and push it to the cloud. When we looked at the number of maps in MABS versus the ones we find in BizTalk Server, it made more sense to focus on the latter and make sure it’s easy for everybody to reutilize the investments they already made in that mapper.
Participant: Are the maps deprecated?
Kevin: Well, it’s not that the maps are deprecated. Azure BizTalk Services is still running; we’re just focusing on XSLT in this product.
Jon: The other thing, if that presents any challenges: if you’re a BizTalk Services customer who has those .trfm maps and wants to start using logic apps, you still have the possibility. We had a V1 action which enabled that, and we plan to do that long term for Logic Apps as well, in the more integrated experience.
Kevin: Today you can enrich your messages in the language of logic apps, where you can insert certain items, but one of the things you can’t do today is construct or compose a new message out of thin air. We will be exposing a new function that allows you to do that, so you can take multiple inputs from the outputs of different actions and compose a new message directly. Question?
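The described compose behavior can be sketched like this; the action outputs and field names are made up for illustration:

```python
# Outputs of two earlier (hypothetical) actions in the workflow:
validate_output = {"status": "Succeeded"}
transform_output = {"customerId": 200, "city": "London"}

# Compose stitches pieces of those prior outputs into a brand-new message
# body that no single earlier action produced.
new_message = {
    "customer": transform_output["customerId"],
    "city": transform_output["city"],
    "validated": validate_output["status"] == "Succeeded",
}
```

The point is that the new message is assembled from several actions’ outputs rather than edited in place.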
Participant: Is there something in the pipeline similar to the pipeline components, where you can add something like a custom pipeline component in BizTalk?
Kevin: The question was: can you add custom components to the pipeline? In BizTalk, if you’re familiar with it, you have for example your receive pipeline with the standard components you can put in it, but you want to be able to add a custom component to that pipeline so that as your messages come in, your code gets run. In logic apps, and we’ll show this in the demo, your logic app now represents your pipeline, but it’s no longer constrained to only certain steps.
Now you can have custom code running either in a function or an API app and just reference it in the middle of what you would consider your VETER pipeline. We’ve opened the doors: now you have a workflow instead of a constrained pipeline.
Participant: What if we don’t want to write C++ code?
Kevin: You don’t want to write C++ code? Some people like to write C++ code. You can write in whichever language you want; you host it, and we’ll call it and make sure it runs. But if you do like C++ and have existing pipeline components, you can take a pipeline component, put it into an API app or a function, and we’ll just call it and it will continue to run. You can reuse the investments you’ve made without much work to re-enable those scenarios in logic apps. Then finally, route. Route is kind of our bread and butter. You call any service; you can use a condition to determine which endpoint to go to, or send to a topic or a number of topics, and the right routing happens based on that call.
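What a custom “pipeline component” becomes in this model is any HTTP-callable unit, such as an Azure Function or API app, that takes a message body and returns a new one. This handler is a hypothetical sketch; the hosting and HTTP plumbing are omitted:

```python
def enrich_handler(message: dict) -> dict:
    """Hypothetical custom step: stamp the message before it continues
    through the rest of the workflow."""
    enriched = dict(message)  # never mutate the incoming body in place
    enriched["processedBy"] = "custom-step"
    return enriched

# The logic app simply calls the hosted endpoint with the previous
# action's output and uses whatever comes back.
result = enrich_handler({"orderId": 42})
```

The same pattern covers wrapping an existing BizTalk pipeline component in an API app, as described above.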
That’s a VETER pipeline. I know. All right, here’s our VETER pipeline. We created a logic app that demonstrates it. As messages are received, we have a request trigger that then goes through an XML validation. That XML validation references a named schema called NewOrders, a schema in our integration account. It takes the body of the request and validates it. After the request is validated, it’s sent to a map. That map does a mapping of the order to a customer, then the output of that map goes to a condition, and you can see that inside the condition we do an XPath to find the ZIP code of the order that came in. Then, based on the ZIP code, we determine which topic we’re actually sending to in Service Bus.
That’s our VETER pipeline. Then we’ll actually do a response back, so you can see that it works.
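The routing condition in that demo boils down to something like the following; the topic names and the ZIP rule are invented for illustration:

```python
def pick_topic(zip_code: str) -> str:
    """Choose a Service Bus topic based on the ZIP code extracted
    (via XPath) from the transformed order. The rule is illustrative."""
    return "orders-west" if zip_code.startswith("9") else "orders-other"
```

In the logic app this is just a condition action branching to two different Service Bus send actions.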
Participant: Would it be possible to have a node selector, instead of writing that XPath in the condition?
Kevin: That’d be nice. That’s a great feature request, thank you. The question was: can you have a node selector instead of having to write the XPath by hand? We don’t do that today, but that’s a great enhancement we could make to the designer experience.
Jon: Just one point on the XML validation: this is the happy path through. We’re keeping it straightforward for demonstration purposes, but the semantics are similar for any of the other actions or connectors in logic apps. If the step failed, the following steps would be aborted because of the failure. We’re not explicitly checking whether it succeeded, because it’s implicit in the behavior that the next step runs when the validation step succeeds. You could have a condition underneath that checks the result, and we provide an enumerated list of the validation errors from that validation step, so you can decide what to do with those before continuing processing.
Kevin: Let’s go ahead and kick that off. Let’s send that message to my pipeline logic app and process it, and hopefully get the right one working. And yes, it responds with the actual message that the order got transformed into. Okay, I want to hand this off to Jon.
Jon: Okay. I showed some of the EDI capabilities this morning in the keynote. These are features we have coming really soon, and we want to get them to you in private-preview form as soon as we possibly can. The X12 and AS2 features I demonstrated this morning showed how you could post in an AS2-encoded message that contains an X12 850 purchase order, break all that down, and then do some processing on it. Behind that are all the flags and everything necessary to send back the signed receipt acknowledgment and the rest of it, to conform to all the specifications. We’re actually just kicking off the work to do the Drummond certification around these pieces as well, so that by the time we get this to general availability we have all the certification in place that you would expect from us.
Coming soon after this is EDIFACT support. It’s hard to provide dates, but we’re working on that next: as soon as we get X12 and AS2 into folks’ hands, we’ll be moving to EDIFACT. Then I just wanted to add to some of the things Jeff was talking about this morning around tracking, because that’s super important for having visibility into these workloads, understanding why failures occur, and getting insights into the data so you can figure things out, get alerted on failures, and deal with and resolve them. We’re piling onto exactly the same mechanism Jeff talked about this morning, the tracked properties. The actions we’re providing actually emit a lot of state around all the EDI processing, everything necessary that we need to track, and that gets pushed into storage. The plan is to build mechanisms around that to consume it and provide a portal to surface it. The more configurable we can make that for our customers, the better: everyone has different requirements in terms of what they want to see and the alerting they want to set, and we think we can provide a great management experience on top of that. As Kevin mentioned... Sorry.
Participant: Lots of existing customers have built their EDI functionality on BizTalk Server. Since you can reuse the agreements, schemas, and maps, do we have a way of taking those into the new scenario?
Jon: Schemas and maps we’ve talked about already so yes. Trading partners and agreements.
Participant: You can support schemas, you can support maps, you can support agreements. I’m asking: is there a way to lift and shift from BizTalk Server to Azure?
Jon: Yes, that’s the plan. With trading partners and agreements, we’ve designed this in a way that has that backward compatibility. We’re not exactly sure what the mechanism will look like to move the artifacts from BizTalk Server to Logic Apps, but it’s definitely in scope to do that later. We understand that some customers have hundreds or thousands of them, and we don’t want you to re-enter all this data, so we want to have some export/import mechanism or something like that to move those to logic apps and run them there, if that’s what they want to do.
I should also mention that schema and party resolution work is coming soon; we’re actually working on that right now. Also schema referencing: schemas being able to reference one another, and making sure we get all of that right. Obviously, when you get into complex schemas, and the EDI ones have references, base types, all that kind of thing, we want support for all of that to work seamlessly as well. I was going to go back into what I showed this morning. Let’s have a look, pull up the 850, and explain in a bit more detail what’s going on here. Here’s the logic app I showed you this morning. I actually ran this through, and you saw the MDN come back in the response.
Kevin: Let me ask a question: how many people here do EDI processing today? Good number, about half.
Jon: That’s great. On BizTalk?
Kevin: On BizTalk. That’s interesting.
Jon: Great. We’ll talk about this in that context again in future, because this is titled a roadmap session and a lot of these things are close to being ready to go into folks’ hands in preview, to give you guys access to try it out and give us feedback. Some of the other things we’re doing: as Kevin mentioned, the way parties are resolved is through the qualifiers and values you set up on the trading partner agreement, resolving an incoming message to that trading partner, and then all of a sudden you have the certificates and everything else associated with it. You can associate all of these things with the logic app.
That’s how we do it currently, but this whole mechanism is very extensible, and we’re going to be working to expose that as well. The integration account itself has extensible metadata capabilities. It’s often the case that, in any kind of workflow engine, people have used various ways of storing configuration for simple lookups of values, and the integration account is ideal for those types of things too. Where you have additional metadata on partners that you want to capture, you can add it, and we’ll make provision for that as well. On top of that, you’ll probably be able to filter or search on the data you’ve tagged or extended these with, and also use it for party resolution. That’s on the roadmap we’re working on: taking the implicit behavior you see here and making it more explicit, so you can influence party resolution with your own metadata, or whatever else you want, using conditions to resolve that information at runtime.
Enterprise connectivity, right. I think one thing that may be obvious to some folks is that right now we have a great connectivity story with all the connectors we have in logic apps to the cloud. Where we obviously want to go is being able to connect to resources on-premises as well. We had this capability in V1, and we’re re-enabling it with the new design and the new features. I just want to talk through some of the connectors we’re going to be lighting up. The first ones we really want to push on are SQL Server, Oracle Database and DB2. We have a bunch more planned as well; in fact we have a long list, but these are the ones we’re fairly sure we can get done in a reasonable timeframe. Paul, who’s sitting over there, is the one that’s going to make this happen. Then MQ Series: IBM WebSphere MQ is one of our super popular adapters in BizTalk, and was also a connector in V1 of logic apps. The other ones are Informix, SAP, Oracle E-Business Suite, File, to get at file systems on-prem, and CICS, to integrate with IBM mainframes as well. We want a rich enterprise connectivity story, and we’re working hard on more concrete plans for when we’ll make these features available. We know SAP is top of the list for all customers, especially when they’re doing EDI, so we want to make that a great experience.
What I wanted to show now is a quick demonstration of what that actually looks like, rather than just talking about it or about when we’re going to have these capabilities. I’m going to jump into logic apps and show you a logic app that I built that goes to an on-prem SQL Server, so I can do CRUD operations against my on-prem SQL from the cloud.
If I pull up this cloud-to-SQL logic app, it’s super simple. I’m just going to show you how it works and show you it running, and then I’m going to flip to my machine back in Redmond; you can see Management Studio open there, and you can see what happens.
If I open this up, you see that when the designer renders I’ve just got a manual trigger, “when an HTTP request is received”. Then I go to the SQL connectivity. I want to insert a row here, so I can enumerate the database I’ve got on-prem. I’ve got two tables in my database, so I can pick customers and then just fill in the values. Obviously, since this is a logic app, the values can come from any preceding step, or I can put in literal values here, or an expression, or anything else that I want, to push data into SQL. I’ll put in a customer ID of 200, and for the customer name let’s put BizTalk360, London. I’m just going to type this in.
Kevin: This is great. From the cloud we’ve been able to dynamically interrogate the database that you made a connection to and discover all the tables that you have and give you a first class experience for filling out those values.
Jon: If I just save that down successfully, I’m going to come here and kick it off manually. Let’s quickly flip over to my machine, which is almost certainly locked; let me make sure I’m not going to type my password on screen. This is my real machine. I’ve got Microsoft SQL Server Management Studio running here, and it’s actually tiny, but you can see that I have no rows in the table. Let me magnify this a little bit. There’s a lag.
Jon: There’s nothing there, so if I flip back... just a sec. If I flip back and run this logic app, it started successfully, you can see it there already, and we flip back and see what happens.
Kevin: Don’t delete the customer.
Jon: Well, you can see: the data appeared there. What’s that?
Kevin: Don’t delete the customer.
Jon: Don’t delete the customer. I know. This is the same database that I had earlier. Same database that I used this morning.
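What the connector performed against the on-prem database in that demo amounts to an ordinary parameterized insert. A sketch with sqlite3 standing in for SQL Server, using the demo’s values:

```python
import sqlite3

# sqlite3 stands in for the on-prem SQL Server; the table and columns
# mirror the customers table from the demo.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")

# The values come from the designer fields (or any preceding step's output).
conn.execute(
    "INSERT INTO customers (id, name, city) VALUES (?, ?, ?)",
    (200, "BizTalk360", "London"),
)
rows = conn.execute("SELECT id, name, city FROM customers").fetchall()
```

The connector’s job is to carry exactly this kind of operation from the cloud down to the on-prem database securely.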
Kevin: You’ve got a question over there Jon.
Jon: Sure. Everywhere.
Participant: I was just wondering, could you highlight what’s different about this versus the existing SQL connector we have today?
Jon: The V1 SQL connector?
Jon: I’m not going to talk about that in detail right now; we’ll have more details to share really soon, exactly how it works. The point of doing this from the cloud was to show that the capability is real and working today. What are those capabilities? We have all the CRUD operations on there, and we were talking about Oracle as well, and the list I mentioned: being able to insert, update, delete. Does that answer the question?
Participant: Can it do stored procedures as well?
Jon: Stored procedures as well. I knew you were going to ask me that question. I think the answer is yes, but I can certainly check on that; if it’s not there, it will be there soon afterwards. It’s definitely one of the high requirements on the list, so I would assume that in the fullness of time we’ll have support, because obviously that’s a necessity. Why would you want SQL XML?
Participant: My incoming payload is XML, and I pass it as JSON.
Jon: You’re making assumptions about how the payloads will be passed around by doing that?
Participant: My incoming payload is XML.
Jon: I see. This is not really... In terms of similarities and differences with the way we do this today, this is kind of an enhancement of what we do today.
Participant: Can I pass an XML file as it is and expect it to work in SQL?
Jon: From SQL?
Participant: To SQL.
Jon: To SQL? I can’t think of a reason why not, but I can certainly go back and find out more about that.
Jon: We haven’t tried it with this one.
Kevin: This is not the V1 connector.
Jon: This is not the V1 connector.
Participant: The V1 connector works?
Jon: The V1 SQL connector does; this is a different thing. We’ll find out, and we’ll make sure it works.
Jon: Yes. More details will be provided very shortly. I just wanted to give you a sneak peek of what’s coming.
Kevin: You’re a tease.
Jon: I just want to show... We’ve got a ton of stuff that you want to see, a lot of things that we talked about today, and all the new features will be released soon. This shows you how much progress we’ve made since we were here last year, across the board: across logic apps, across enterprise connectivity, the whole thing. We’ve been working unbelievably hard. Haven’t we, Jim? It’s been fantastic. I think that’s it. Is there anything else?
Kevin: That’s it, we’re out of time.
Jon: We’re right on time. Again, we’ll throw up the slide: we’re here, we’re listening. Reach out to us. We always want feedback, and we really want to get these bits out as quickly as we can, because we want you to try them out, see what you think of them, and give us great feedback. Thanks.