Webinar
Jul 19
22 min

Strategies to transform your integration initiatives using AI

See how teams are embedding AI into integration projects to speed up delivery, boost flexibility, and bring automation to business users.

Overview

AI is reshaping enterprise integration. Learn how teams are using AI to speed up integration delivery, simplify automation for business users, and improve governance and security. This session covers practical examples, including conversational automation and AI-assisted workflow development, plus a live demo of Tray’s Merlin intelligence layer.

What you’ll learn 

  • How AI is accelerating integration development and delivery

  • New approaches like conversational automation to support business users

  • How to build secure, governed AI workflows with Tray

  • Live demo of Merlin: AI-assisted integration in action

Session chapters

  1. How AI is changing integration strategies

  2. Using conversational automation for faster delivery

  3. Live demo: building workflows with Tray Chat

  4. Introducing Merlin: AI for integration development

  5. Q&A and key takeaways

Transcript

And welcome to the Tray session for the AI deployment summit, where we'll explore how AI and large language models are set to revolutionize your integration and automation projects, both within your company and for B2B. Our speaker this morning is Swati Aggarwal, principal product manager at Tray. Swati, welcome.

Thanks, Vance. Glad to be here.

Yeah. We're glad to have Swati with us this morning. She drives the strategy for how Tray brings OpenAI's large language models into the Tray iPaaS, or integration platform as a service.

She has advised vendors and enterprises on adopting AI, machine learning, NLP, and, most recently, LLMs, and she's got fourteen years of IT experience across apps, cloud, and API integration. So, a great range of technology expertise across both traditional integration infrastructure and AI. That all comes together in Swati's session, strategies to transform your integration and automation initiatives using artificial intelligence.

In this session, we'll learn multiple ways AI and LLMs can speed up and even democratize your integration and automation initiatives. We'll see some exciting examples of AI trends and use cases, including a new category Swati calls conversational automation. And best of all, we'll get a great demo, so you can see for yourself why AI-powered integration will quickly become a foundational technology for governance, trust, and scale.

So great session, great demo in store. Let me quickly remind you, you can download today's slides. Just click that big red button under the view screen and they're yours. You'll see we've also got some other links and assets right here in the lower area.

Just take a look at those downloads, and even a click to the free trial, all available with one click. And we love your questions, so type them into the question box, and we'll make sure to get your question or comment to Swati. So with that, Swati, let me turn to you to tell us about strategies to transform your integration and automation with AI.

Okay. Let's get started here.

In the last few months, we have seen that AI has evolved quite rapidly.

While it's definitely being employed in many industries, the need for it when it comes to iPaaS is stronger than ever.

We at Tray are offering the first ever natural language capability for an iPaaS that anyone can use.

AI is a new automation opportunity, and broadly, it's being used first to speed up delivery.

The velocity at which you can deliver your integrations has gone through the roof. The notion that AI can augment your development is embraced wholeheartedly at Tray.

Secondly, to expand your pool of builders by being a helper that sits at your side and helps you build more intelligent, more comprehensive, and even more efficient integrations.

Lastly, and most critically, for this to really permeate your entire organization, it's essential that you can now deliver automation to your frontliners.

You can now instruct your services to work in a way that you couldn't before. You can get started in a way that's most natural to you.

So what we are going to walk through with Merlin today is what we call conversational automation.

It's a new type of automation that's infused with AI to power three key experiences.

The first: technologists who use it to build and iterate on enterprise processes.

With the introduction of AI, they can build faster. They can build more effectively, and they can build more efficiently.

They can build and deploy more quickly.

The second: managers and frontliners who can use it on demand.

And finally, developers who can use it through APIs that we will surface in the platform, so they can add AI-powered integration and automation to their apps.

And you can see here the pool of end-user personas that we can now cater to with the help of Merlin. From one-off jobs to scheduled workflows.

From IT managers to CMOs to sales executives to service managers.

Applications are practical and endless.

Let's take a sneak peek into our demo today.

Okay. So this is Tray's chat interface.

As a frontline manager, today, I just wanted to know my ten newest Salesforce opportunities.

So let me just submit this request to Merlin.

Once this request is submitted, Merlin adds context to it and sends it over to OpenAI.

OpenAI then parses this and sends the individual tasks back to Merlin. Merlin has then identified the right connector needed. Let me just authenticate myself here.

From security and governance standpoint, Merlin is again asking the user to confirm the authentication.

Now Merlin is making use of our APIs to make the calls to the connectors.

Merlin has fetched the results for the user. Now let me just ask Merlin to send this data to Merlin Video's Slack channel.

Merlin is now asking me to authenticate for Slack.

Merlin has finished processing and sent the data over to the Merlin Videos Slack channel. Let's take a look at that channel.

This is the power of Merlin Chat. You can get answers at the point of decision on critical business problems.
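
To make that chat flow concrete, here is a minimal Python sketch of the lifecycle just described. Every name in it (ask_llm_for_tasks, run_connector, tray_chat_request) is a hypothetical stand-in for illustration, not Tray's actual API.

def ask_llm_for_tasks(prompt: str, context: dict) -> list:
    # Hypothetical: Merlin adds platform context and the LLM returns structured tasks.
    return [{"connector": "salesforce", "operation": "list_opportunities", "limit": 10}]

def run_connector(task: dict, auth_token: str) -> list:
    # Hypothetical: the platform's connectivity API executes the task in-house.
    return [{"Name": "Example opportunity", "Amount": 12000}]

def tray_chat_request(prompt: str, auth_tokens: dict) -> list:
    context = {"available_connectors": ["salesforce", "slack"]}   # added by Merlin
    tasks = ask_llm_for_tasks(prompt, context)                    # parsed by the LLM
    results = []
    for task in tasks:
        token = auth_tokens[task["connector"]]                    # user confirms each auth
        results.extend(run_connector(task, token))                # executed on the platform
    return results

print(tray_chat_request("Show my ten newest Salesforce opportunities",
                        {"salesforce": "token-123"}))

The follow-up prompt in the demo ("send this data to Slack") would simply add a second task to the same loop, gated by a fresh Slack authentication.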

Okay. So this is Tray's workflow builder, which is now powered by Merlin. If I come here and click on the plus button, I'll see a tab to ask Merlin. So today, let's say I just want to get my leads data from Salesforce, and then I want to add it to a Google Sheet. I also want to create a workflow for that so that I can schedule it at a later point in time. Let me just submit the request to Merlin.

Merlin then adds context to this request and sends it over to OpenAI.

OpenAI then parses this request and breaks it into individual tasks and then returns it to Merlin.

Merlin then identifies the right connectors and authentications needed. In this case, it has understood that a Google Sheets connector is needed.

It has further understood that a Salesforce connector is needed to fetch the records from Salesforce.

Let me authenticate myself.

At this stage, Merlin is configuring the individual steps in the workflow.

It has identified the right operations needed and is adding them to those steps, saving time for individual developers.

A new user can understand what different steps are needed to build a workflow and what operations are available.

For a power user, this just saves time.

Merlin has generated the workflow in the workflow builder for the user.

Individual steps are configured.

Authentications are added.

This is the power of Merlin.
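
For illustration, the workflow generated in this demo could be represented roughly like the Python structure below. The field names and operation names are assumptions made for the sketch, not Tray's actual workflow format.

# Assumed, simplified shape of a Merlin-generated workflow for
# "get my leads from Salesforce and add them to a Google Sheet".
generated_workflow = {
    "name": "Salesforce leads to Google Sheets",
    "trigger": {"type": "manual"},                 # could be switched to a schedule later
    "steps": [
        {
            "connector": "salesforce",
            "operation": "find_records",           # hypothetical operation name
            "input": {"object": "Lead", "limit": 100},
            "auth": "salesforce_auth_id",          # authentication confirmed by the user
        },
        {
            "connector": "google_sheets",
            "operation": "append_rows",            # hypothetical operation name
            "input": {"spreadsheet_id": "<sheet-id>",
                      "rows": "{{steps[0].records}}"},  # reference to the prior step's output
            "auth": "google_sheets_auth_id",
        },
    ],
}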

Merlin is essentially an intelligence layer sitting between our experiences and our capabilities.

Merlin is currently powering our workflow builder, and it is also powering Tray Chat. So Merlin really sits between our experiences and capabilities, adding AI and NLP to our workflow builder and to a new AI-powered chat-only interface that managers and frontliners can use.

This is basically just the tip of the iceberg. The possibilities are endless, and our vision for Merlin in the near future is to provide that intelligence layer that can augment anything and everything that you do today with Tray.

And here is why we are so confident about it.

We are the API-first platform, and we are making use of our standardized and publicly available APIs to seamlessly integrate with OpenAI to surface Merlin.

Then we have our modern workflow builder, which offers a low-code UI that now brings AI directly into our workflow-building experiences.

Then we've got a whole library of connectors, six hundred plus, built with standard operations and interfaces, which makes it easy for LLMs to identify the right connectors and operations needed.

And lastly, we have a scalable serverless architecture, which essentially was built to grow as we scale use cases.
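
As a rough illustration of why a standardized connector interface matters here: when every connector exposes its operations with the same metadata shape, that catalog can be handed more or less directly to an LLM for connector and operation selection. The catalog entries and the tool format below are assumptions for the sketch, not Tray's real connector metadata.

# Hypothetical catalog: each connector operation is described the same way.
CONNECTOR_CATALOG = [
    {"connector": "salesforce", "operation": "find_records",
     "description": "Query records of a given Salesforce object",
     "input_schema": {"object": "string", "limit": "integer"}},
    {"connector": "google_sheets", "operation": "append_rows",
     "description": "Append rows to a spreadsheet",
     "input_schema": {"spreadsheet_id": "string", "rows": "array"}},
    {"connector": "slack", "operation": "send_message",
     "description": "Post a message to a Slack channel",
     "input_schema": {"channel": "string", "text": "string"}},
]

def catalog_as_llm_tools(catalog: list) -> list:
    # Convert the uniform metadata into tool definitions an LLM can pick from
    # (the exact tool format varies by provider; this shape is assumed).
    return [{"name": f'{c["connector"]}.{c["operation"]}',
             "description": c["description"],
             "parameters": c["input_schema"]}
            for c in catalog]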

So under the hood, what we are essentially trying to do is turn conversational AI into conversational automation.

And, essentially, we are trying to give our customers the best of both worlds: conversational automation and low-code automation.

Let's understand how it works in the background.

As a user, I submit my request, and Merlin sits as a middleman between the user and OpenAI.

Once Merlin receives the request, it adds context to it and then submits it to OpenAI.

OpenAI parses this request, creates a list of individual tasks, and returns it to Merlin.

Merlin then identifies the right connectors and authentications needed.

As a user, I need to authenticate.

Once authentication is complete, Merlin identifies what configuration is needed, and it sends that schema to OpenAI.

It's just the input schema. No data is being sent over to OpenAI.

OpenAI adds the configuration data on top of it and then sends it over to Merlin.

In the case of the chat interface, Merlin directly executes that, makes API calls using our connectivity API, and returns result on the UI to the user.

In the case of the builder experience, Merlin generates a workflow for the user. The user can then either run that workflow right away or schedule it to run later.

The user then receives the result in the UI.
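
Here is a minimal sketch of the two LLM round-trips in that flow, with hypothetical names: the conversation, context, and input schemas cross over to the LLM, while connector results stay inside the platform.

def llm_parse_request(conversation: str, context: dict) -> list:
    # First round-trip: break the request into individual tasks (assumed shape).
    return [{"connector": "salesforce", "operation": "find_records"}]

def llm_fill_configuration(task: dict, input_schema: dict) -> dict:
    # Second round-trip: fill in the operation's inputs using only its schema.
    return {**task, "input": {"object": "Lead", "limit": 100}}

def execute_in_platform(task: dict, auth_id: str) -> list:
    # Connector execution happens in-house; results go to the chat UI or into
    # a generated workflow, and are never sent back to the LLM.
    return [{"Id": "00Q-example", "Name": "Example lead"}]

def handle(prompt: str, mode: str):
    tasks = llm_parse_request(prompt, context={"connectors": ["salesforce"]})
    configured = [llm_fill_configuration(t, {"object": "string", "limit": "integer"})
                  for t in tasks]
    if mode == "chat":
        return [execute_in_platform(t, "auth-1") for t in configured]  # run now
    return {"steps": configured}  # builder: generate a workflow to run or schedule later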

Now, it is very important to us that Merlin sits on top of our enterprise core functionality.

We have ensured in the past that, from a governance and security standpoint, the platform is well trusted.

When we bring Merlin on top of it, we ensure the same functionality is leveraged for Merlin as well. So, for instance, all roles and permissions, authentications, and data access are applied based on the workspace you're working in.

A full audit trail and notifications are maintained.

We are leveraging our highly parallel serverless architecture for concurrency.

And lastly, we care about security a lot. No data is being sent over to third-party LLMs.

All the processing is happening in-house within the platform.

We are only sharing the context and the conversations with OpenAI or other LLMs.
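
One way to picture that governance point, as a hedged sketch: the same workspace permissions and audit trail gate any call Merlin makes, while record data never leaves the platform. All names here are invented for illustration.

from datetime import datetime, timezone

AUDIT_LOG = []

def authorized(user: dict, workspace: str, connector: str) -> bool:
    # Hypothetical check: the user only reaches connectors granted in this workspace.
    return connector in user.get("workspaces", {}).get(workspace, [])

def audited_connector_call(user: dict, workspace: str, connector: str,
                           operation: str, call):
    if not authorized(user, workspace, connector):
        raise PermissionError(f"{user['name']} cannot use {connector} in {workspace}")
    AUDIT_LOG.append({
        "at": datetime.now(timezone.utc).isoformat(),
        "user": user["name"], "workspace": workspace,
        "connector": connector, "operation": operation,
    })
    return call()   # the actual connector call still runs inside the platform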

Alright. So that concludes my session today. But let me hand over to Vance and see if you've got any questions.

Swati, really great, really eye-popping to see how AI and OpenAI in particular are really helping integration become even easier. Really great topic and really great demo. Thanks very much.

No problem, Vance. Happy to be here.

We're really happy to have you here. Luckily, you saved us some time for questions.

So with your permission, let's turn right to some questions.

Great. Let's do that.

You know, Swati, you had a great demo there, and in fact, you talked about going under the covers. A couple of comments and questions here. I'd like to go under the hood a bit deeper, actually. Merlin seems to understand a lot of integration requirements and a lot about the Tray connectors just from that English-language prompt you were putting in. Architecturally, can you talk a little more about how Merlin, OpenAI, and Tray's platform talk to each other? It seems there's some really fine work being done either in modeling or in the framework. Give us a sense of what's going on under the covers.

Perfect. Yeah. It's a great question.

So Merlin is actually sitting between our platform and OpenAI. It's very tightly integrated with our platform, and it's leveraging the power of the Tray platform.

Tray has got in-house connectors, six hundred plus connectors, and Merlin is leveraging those connectors. Tray has got APIs which are commercially available for our customers, and Merlin is leveraging those too. So if you go under the hood, the first time, as a user, if I just submit my request to Merlin, Merlin adds the context to that request and sends it over to OpenAI. Then OpenAI, you know, uses its brain to either parse that request or add some sort of input data to it, which could be in the form of what configurations or operations are needed, and then sends it over to Merlin.

Merlin is the one that can directly talk to our platform first to identify what connectors are needed, so it can just tap into our own resources for that. And when it has to make a call to those connectors, it just uses the APIs which are available via the platform to make the calls to the connectors and then take actions on behalf of the user. So, essentially, it's just leveraging the brain of OpenAI and the body of our platform to do the magic for the users.

It's really amazing. And the thing that is also very interesting about Merlin is it almost seems like it's a translator or an interface to OpenAI, in that it can learn about not just the general Tray platform, but all of your six hundred individual connectors. Is that a good way to think about it?

Absolutely. And that's the whole idea. Like, we keep talking about composability, and that's the idea, that anyone can use the platform. And these are basically the building blocks that Merlin is taking advantage of.

This is really awesome. Really awesome. Here's another question related to architecture. This is a little bit different. You mentioned Tray's serverless architecture. Why is serverless computing important for AI-based integration the way you're doing it?

So AI-powered automation generally increases the velocity of automation development, and serverless means no extra provisioning is required. You just go based on usage, and that's the beauty of it. It's not that making new calls means you now need new provisioning. That's the advantage: it can just scale with you.

Yeah. So I guess not only is it good for handling complex workflows, but also high-traffic workflows, the scale that you talk about.

That's exactly it.

Perfect. Perfect. You know, this is the elephant in the room when it comes to using OpenAI, Swati. I'm sure you've heard this before, this whole issue of security and governance and issues like that. A couple of questions, kind of related: How secure is the integration with Tray.io? And do you pass sensitive data to OpenAI when Merlin reaches out to OpenAI?

It's quite secure. So, essentially, we are making use of our own governance that we have already implemented at the platform level.

So, at any time, only authenticated users are actually able to see the sensitive information they have access to. And when they're using Merlin, the same logic applies as well. That's the first thing: our governance principles are applicable to Merlin too. They translate directly to Merlin, just as they are available in the platform today. But not just that. With OpenAI being integrated now, we are ensuring that no data is being sent over to the LLMs or third-party models.

What we are essentially sending over is input schema, conversations, context, but no actual customer data is being shared with the third-party LLM platforms.

Wow. That's really great to know. You can get that much information to OpenAI and still keep all my data private. That's really awesome. Swati, this has been a fantastic session, with good Q&A. I think even though time is rushing by, we can fit one or two more questions in.

Let's talk a little bit about how Merlin AI and Tray's traditional low-code approach work together. You showed us a great low-code, graphical output in the demo, where the result of all that OpenAI work is something that looks very low-code friendly. Can users iterate with low code after they've created their AI integration? In other words, once they see that workflow or that map, can they go in and fine-tune it directly in your low-code interface?

Yes, definitely. So the low-code interface is most natural to our developers, Vance.

And when they come onto the platform, they could be either new users or power users. If they're new users, the low-code interface, now powered by Merlin, just gives them a quick start. They can use natural language, and the first version of the workflow will be generated for them. If they're power users, they can generate the first version of the workflow the same way, but they can then also iterate on top of it.

They can add more complex logic if needed. They can add a step. And even when they add a step, they can still use natural language. So they can just say: add a step to send a message to Slack.

And it will just find the right connector, add it to the step, and add all the needed operations to it. So at any point in the workflow builder, you can use Merlin AI either to generate the workflow for the first time or to iterate on it.
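
A small, hypothetical sketch of that iteration step: a prompt like "add a step to send a message to Slack" resolves to a connector and operation, and the new step is appended to the previously generated workflow so it can still be fine-tuned in the low-code builder. The step shape reuses the assumed format from the earlier workflow sketch.

def add_step_from_prompt(workflow: dict, prompt: str) -> dict:
    # In the real flow the LLM would choose the connector and operation from the
    # prompt; here the result is hard-coded purely for illustration.
    new_step = {
        "connector": "slack",
        "operation": "send_message",                  # hypothetical operation name
        "input": {"channel": "#merlin-videos",
                  "text": "{{steps[-1].summary}}"},   # reference to a prior step's output
        "auth": "slack_auth_id",
    }
    return {**workflow, "steps": workflow["steps"] + [new_step]}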

You know, that's really awesome. Let's talk a little bit about the artifact of conversational automation that you've got. Question here says, do you have logs that provide detail around the conversations that take place between the human and Merlin?

Yes. It's very important to us that audit logs are maintained. We're also exploring more opportunities to provide insights in our insights hub. So audit logging is there, and most customers would need it from a governance standpoint. But it's also essential to understand the value that this product is offering and to get deeper insights, and that's why we want to leverage our insights hub to show more detailed, in-depth insights about Merlin and AI.

You know, Swati, this has been fantastic so far. Just a couple implementation questions as we close. Do users end up having to preload any metadata or anything about their workflow usage that they've got inside of their company? Or does Tray basically have a pretty good library of common workflows that are implemented and it just knows how to proceed?

Yes. So Tray definitely has a preloaded library, which can get the customer started. And if they want to fine-tune it depending on their organization's or their department's requirements, they can definitely go about doing that. But we have a very comprehensive self-serve library available for our customers.

Yeah. It's really great to know that I can put in an English-language instruction and that it can be aware of what a common workflow outcome would be, such as loading information into a Salesforce application, without a lot of questions back or a failed operation.

Absolutely.

You know, Swati, this has been amazing. I love seeing how OpenAI is bringing some rich capabilities to low-code integration and, actually, in this case, no-code integration. Swati Aggarwal, principal product manager at Tray.io, with a broad portfolio in charge of bringing Tray together with OpenAI and LLMs. A really great session. Really eye-opening to see how artificial intelligence is supercharging integration to the point that even nontechnical folks can begin to get some very important workflows done without the help of an IT or developer team. This has been awesome. Thank you very much.

Thanks, Vance. It's been a great pleasure to be here.

Yeah. And we've really enjoyed it. And just a quick note, Swati mentioned the free trial. We've also got that here in the breakout room.

I highly recommend you click on that and take a look. We've also got some other great, valuable resources for you to learn more about Tray, the iPaaS platform, and how they've used OpenAI and large language models to make integration even easier. And as you can tell, there is a ton of innovation going on at Tray, more than we had room for today. Here's a slide.

It'll take you directly to the Tray website. Download Swati's slides from this morning, and all these links will be live; they'll take you to some other rich resources on the Tray website. Thanks again, Swati, and thanks again to the audience for some really great questions.
