LinkedIn Live
Expert conversation
Nov 13
31 min

Driving AI success through tech consolidation

Tray.ai CEO Rich Waldron joins CIO analyst Myles Suer to share how tech consolidation helps execs move AI projects from pilot to production.


In this LinkedIn Live session, Rich Waldron joins analyst Myles Suer to explore what enterprise leaders need to scale AI successfully. They cover how platform consolidation reduces friction, why AI requires more than just a model, and how CIOs are reframing integration and governance as enablers, not blockers.

Top takeaways

  • Executives are shifting from isolated AI experiments to long-term platform strategy

  • Consolidation is about more than cost—it’s about reducing risk and accelerating time to impact

  • Enterprise teams leveraging Tray gain both control and agility across the teams building with AI

Session chapters

  • The hidden cost of bad integration

  • Why composability changes everything

  • How CIOs are funding AI: No new money, just smarter bets

  • Where AI is already working: 2 proven enterprise use cases

  • What successful AI adopters do differently

  • How CIOs should start their AI journey

Transcript

Hello everyone. I'm thrilled that you've come to join us today. I'm Myles Suer. I facilitate the CIO Chat, I write for a number of magazines, and I'm a tech analyst, and I consider myself also a tech journalist, and I'm thrilled to be here today with Rich. Rich, you want to give yourself a little bit of an introduction?

Certainly Myles. I'm Rich Waldron, the co-founder and CEO of Tray.ai. Excited to have a conversation today.

Thank you, Rich.

You know, one of the things that's always been near and dear to my heart has been data and integration. It's, in my mind, the lifeblood of an organization. How well it runs is dependent on how well it does these things.

Historically, you know, in my discussions with CIOs and others, the complexity of this has really been a problem, and it really does impact digital transformation. In fact, the CIOs I know typically talk about it as tech debt, because it's such a mess to maintain and run.

Rich, what do you hear from your customers? What have you seen when you look at how historically enterprise integration has worked?

Yeah. Well, firstly, I'd echo the sentiment, and I think if there were going to be a remake of The NeverEnding Story, it would be a digital transformation project, because the evolution of software stacks and data within an organization means that as soon as you reach the point where you feel like you've finally got that critical service integrated into the next one, you're already starting to think about what the next version or iteration of it looks like and how it all carries on in between.

And I think that's why it gets commonly referred to as tech debt, because much like when you write code, almost as soon as you deploy it, another engineer comes along and says, wait, this is old code, what are we going to do with it? And so from a CIO perspective, and from a lot of the conversations that I have, the way in which they're thinking about tackling this problem is with a composable mindset. How do we structure the organization so that, should the data source change, or the process change, or the company strategy change, we've composed the infrastructure in such a way that it's pretty easy for us to switch the sources in and out, and there's some consistency in what we're expecting to be delivered?

And to me, that's the critical piece of the puzzle when it comes to thinking about that data integration challenge or that digital transformation project mentality: how do we get the right stuff in the right place at the right time, and how does the data have some context or capability to understand what's going on around it, rather than four hundred siloed stacks, as it were, which is what many organizations have.

Yeah. It's amazing. Sometimes, people have multiple technologies, and it just becomes a mess. I know that my friend Jeanne Ross at MIT CISR, in her book "Enterprise Architecture as Strategy", talked about a company where 80% of their code, if you can imagine it, was actually involved in integration. And then she talked about how, if you can fix that, if you can standardize how you do all that, you might even be able to save 15% of the IT budget. Have you seen any of those kinds of numbers, Rich?

Regularly.

Because, you know, if you think about the way in which software has been purchased over the last 20 years, or through the phases that we've been through, there was a very IT-controlled environment that existed in the early 2000s and into the 2010s.

As we got through the tail end of that, we saw many point-to-point solutions purchased across the entire organization. And in some organizations, you started to get almost disparate IT support services, so marketing has its own stack, customer success has its own stack, sales has its own stack.

But there isn't somebody looking at it as a whole. And when you start to get into that territory, that's where you can end up having 70, 80, 90, in some cases hundreds, of applications or data services existing within each department. And so when you multiply and stack it up and you think about the predominant effort that goes into structuring a business or operating it, it's totally believable that the lion's share of that effort is going to be integrating, connecting, transforming, and utilizing the data that's stored in all of these areas. So, yeah, that's certainly on the high end, but I can't say I'm surprised by it.

You know, silos are just such a terrible thing. I mean, I remember reading a story about Zara a few years ago where they figured out that by putting in RFID tags they could initially solve their supply chain problems, but then they figured out that data might actually be useful for deciding what they should and shouldn't be making, so marketing wanted that data. And if you silo it, you can't get the real business advantage and the transformation out of it.

Yeah. And that's quite an interesting jumping-off point, because this sort of thought exercise around unstructured data has become really prevalent in the last 12 months. There are so many organizations that, for a long time, have been sitting on petabytes of data that they have figured is important but haven't necessarily figured out what they're going to do with it, how they're going to use it, and, more importantly, how you even process or access it from a technology perspective. And what's been interesting is the second wind of data lakes, and what we're seeing with some of the large data service vendors like Snowflake and Databricks and others is that they're really trying to tackle that centralization problem.

And what that's enabling is that there are organizations I've spent time with this past twelve months that have almost skipped an entire generation in digital transformation, because they went from on-premise technology, started to build out their shift to the cloud, got some way through it, and then what they've ended up doing is actually taking a lot of that data and just putting it in a data lake.

And now it's somewhere where they can do something with it. They can run models on it. They can start to get some intelligence on it. And then it becomes more uniquely accessible across the rest of the organization.

Where that gets really interesting for me, with my Tray hat on, is if you think about the power of integration. For me, that is the power of action. It's the fact that we've got all of the intent, we've got all of the knowledge, we've got the capability; what integration gives me is the power to go and do something useful with it. So there is all this unstructured data, and in your case it's RFID tags. In other cases, it might be the data that's pulled from machines in a manufacturing plant and sent somewhere.

Historically, that might just be used to determine when a machine's starting to get to its end of life. But what happens when you can start analyzing how certain workloads or even certain customers impact the throughput of those machines, and you can start to change the way you operate, or the timing that those machines run, or even the way in which you think about upgrading them or connecting them to the rest of the organization?

Suddenly, that data that was difficult to do anything with has an entirely new power: a way to access it and do something useful with it. And I think that's where the promise of AI has got a lot of people excited, because if you can reason over a mass set of data like that at scale and then determine an insight that you act on, you're going to see far more effective businesses. And you called one out there in Zara. The example I always give is: how did Amazon turn up from nowhere and become one of the biggest companies in the world?

Well, through integration. From the moment that you clicked order to the time that order was delivered to your house, they so tightly integrated every system. There was so much effort put into the way the consistency of the data was carried through the different services. And in some cases, it meant them even building their own custom solutions to solve these problems, but it gave them such an efficiency that they were able, in very short order, to become one of the biggest retail organizations in the world.

So, I think the case in point is there for so many to see that when you get this right that could be the difference between you and the competition.

So let's dig in a little bit on integration. I remember earlier in my career, when I ran the analytics division at HP Software, that we had an interesting problem: we couldn't build the integrations fast enough. I remember sitting behind the integration engineers, and they would spend a month just to build one integration. So one of the things I've been reading about lately is what AI can do. How can AI transform how we do this messy integration? And the reason it's important is that MIT's historical research has said that only 28% of companies have really transformed, because they've fixed and industrialized the mess. So how can AI help the rest of the folks who are behind catch up and do a better job of integration?

Yeah. And I think, before we even get on to AI, let's start with the point that you just made there on the month-long delivery for an integration project, which, in legacy terms, is actually quick. What we saw looking around the market as we built Tray over the last 10 years was that a lot of these projects took a year.

And the speed and pace at which businesses change today, the real time nature of data that we have at our fingertips really puts pressure on that delivery. And each of these integration projects is like an independent engineering project.

The bit that everybody forgets is that the hard part isn't necessarily connecting the services together. The hard part is what happens once they exist. Like, who maintains them? What happens with testing? How do you manage deployment? What happens if the scale of throughput changes? All of these things add pressure points to an integration.

So being able to build these integrations in such a way that they can adapt to change in the business, and it comes back to that composability theme, is very important. But, secondarily, what AI brings to the table is that it can start to augment the creation process. If AI can play a hand in speeding up the way in which you build the integration in the first place, then the speed at which you can implement, make change, and actually transform an organization because of an integration changes tenfold. And where I'm excited about AI getting to is a process-mining environment, where you can start to recognize trends that are occurring because of these integrations and actually start to make those suggestions intelligently within the integrations themselves.

And, there are some prerequisites to being able to do that, such as the way your data is structured, the platforming, or the tooling that you use to create these solutions, but it's real. And that's where I see it having a transformational impact.

Now, how that power is going to impact manufacturing and other process-centered industries is really fascinating to think about, and then even post-product. So it's going to be interesting.

So let's dig into GenAI a little bit. You know, a lot of attention has gone to the LLMs themselves and all the technologies needed to assemble those, but we're still talking about data that needs to go from a transactional system or a data warehouse into a vector database so it can be processed by an LLM.

How important is integration? I read a report the other day that was saying don't forget the integration in the process. So how important is integration, and prepping that data so it's ready to use?

I think integration plays a few roles here, and the analogy that I like to use is: if the AI is the brain, then the integration is the body. If you've got this amazing reasoning power that you can go and apply to all this data, you still have to give it the capability, or the information, to be able to react. And so integration in this sense means: can you connect to these different data sources? Can you bring them into a place where you can do something useful with them?

I think a common use case that I'm seeing amongst many of our customers and prospects centers around knowledge. If you're thinking about it from a customer management perspective, that might mean pulling in recorded phone calls, pulling in emails, pulling in docs content, pulling in support ticket history, and basically building this vector database, this repository of knowledge, that the AI can then go and respond from, so that when you actually prompt it or ask it questions, it's got this rich territory that it can go and access.
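For illustration, here is a minimal sketch of the kind of knowledge-ingestion flow described above: documents pulled in from different sources are normalized, embedded, and indexed so they can be searched at prompt time. This is not Tray's implementation; the embed_text() helper is a trivial stand-in for a real embedding model, and the in-memory list stands in for an actual vector database.

import hashlib
import math

def embed_text(text: str, dims: int = 64) -> list[float]:
    # Stand-in for a real embedding model: hashes character trigrams
    # into a fixed-size vector. Swap in an actual embedding call.
    vec = [0.0] * dims
    for i in range(len(text) - 2):
        bucket = int(hashlib.md5(text[i:i + 3].lower().encode()).hexdigest(), 16) % dims
        vec[bucket] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

# Knowledge pulled in through integrations (support tickets, emails,
# call transcripts), normalized into one shape.
documents = [
    {"source": "support_ticket",  "text": "Customer cannot reset their password after SSO migration."},
    {"source": "email",           "text": "Renewal discussion: customer asked about usage-based pricing."},
    {"source": "call_transcript", "text": "Prospect raised concerns about data residency in the EU."},
]

# The "vector database": each entry keeps its embedding plus the original metadata.
index = [{**doc, "embedding": embed_text(doc["text"])} for doc in documents]

def search(query: str, top_k: int = 2) -> list[dict]:
    # Return the documents most similar to the query by dot product
    # (vectors are normalized, so this is cosine similarity).
    q = embed_text(query)
    scored = [(sum(a * b for a, b in zip(q, d["embedding"])), d) for d in index]
    return [d for _, d in sorted(scored, key=lambda s: s[0], reverse=True)[:top_k]]

for hit in search("password reset problems"):
    print(hit["source"], "->", hit["text"])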

And, really, the restriction on the capability is what knowledge you can actually provide. So the integration plays a really critical role in, firstly, being able to get that data into a place where it can be accessed and something useful is done with it. And in some cases, there's some transformation that may need to occur during that period. But then, secondarily, where it gets interesting is that when AI has a revelation or a suggestion based on being able to reason over this massive dataset, integration can play a role in taking action.

So if I discover from the analysis of all of the sales conversations that have occurred over the last six months that there's a common thread that's missing, or a common piece of content that might be missing, then the AI can actually say, hey, if you go and do these things and push it out to this place or send these emails, we think we could actually impact the win/loss performance that you see within your organization. Well, if you integrate the systems correctly on the downstream, they can actually go and start executing on that for you.

And that's where you get more into that agentic approach of actually being able to carry out an action on a user's behalf.

But to me, the part that gets really exciting is when you can go beyond giving the output and actually take action on the data itself.
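As a rough sketch of that agentic, action-taking step, the snippet below takes suggestions produced by a model and routes them to downstream systems, with an approval gate in front. The connector functions are purely illustrative stubs; in practice they would be the integrations into your CRM, email, and content systems.

from dataclasses import dataclass

@dataclass
class SuggestedAction:
    kind: str     # e.g. "send_email" or "publish_content"
    target: str   # who or where the action applies to
    payload: str  # the content the model drafted

# Illustrative stubs for downstream integrations (email, CMS, CRM, ...).
def send_email(to: str, body: str) -> None:
    print(f"[email] to={to}: {body[:60]}")

def publish_content(location: str, body: str) -> None:
    print(f"[cms] publish at {location}: {body[:60]}")

ROUTES = {
    "send_email": lambda a: send_email(a.target, a.payload),
    "publish_content": lambda a: publish_content(a.target, a.payload),
}

def execute(actions: list[SuggestedAction], approved: bool) -> None:
    # Only act once a human (or an explicit policy) has approved the batch.
    if not approved:
        print("Actions pending review; nothing executed.")
        return
    for action in actions:
        handler = ROUTES.get(action.kind)
        if handler is None:
            print(f"Unknown action '{action.kind}', skipping.")
            continue
        handler(action)

# Example: the model noticed a missing follow-up thread across recent sales calls.
suggested = [
    SuggestedAction("send_email", "prospect@example.com",
                    "Following up on the data residency questions from our call..."),
]
execute(suggested, approved=True)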

So, one of the interesting things is that the analysts have been talking about this. In fact, one major analyst coined the term AI-ready data.

Yeah, so there's been a lot of discussion there, obviously. One of the problems that CIOs are also wrestling with is getting the money to support all of this stuff, so it is not necessarily the case in every company that the business has just opened up the piggy bank and said, here's a chunk of money, go do AI. In many cases, it has to be: go reduce costs. So, when you talk to CIOs in your journey, are you finding that they're having to come up with the money? And then, number two, what are the biggest issues you hear about being ready to use this data? And that can be anything from quality issues to protecting that data.

What are the kinds of things you typically hear?

I think it's a very interesting time to be a CIO, because the speed at which you can go from zero to hero, or the opposite, might be the quickest pace of change that we've ever seen in the role. And that's largely because when you've got a technology as amazing as AI at your fingertips, or the capability is there, that instantly creates a pressure or an expectation within an organization, where the board, the executive team, the organization itself is leaning on that CIO and saying, how are we going to harness this? Are we going to get left behind? There's instantly a pressure. What's our play here? How do we do something with this? And that's a common thread that I've seen and heard and had in many of these conversations.

And then secondarily, when it gets to the budget piece, that's split into two areas. Firstly, ROI is a massive consideration. And I think the statistics that Gartner pulled out most recently suggest that between 60 and 70% of AI projects are expected to fail in some form, because it's so new.

We're trying things out. Is this agent going to take? Is this knowledge agent going to take? Is this GenAI chatbot I built going to be able to deliver on what's promised?

It is putting pressure on budgets and on the experiments that are being run by organizations.

And then secondarily, where is the budget coming from? It's coming from consolidation.

And as you point out, there aren't many organizations that are carving out a fresh AI budget. As for the winners and losers in this change: there are many point solutions that are simply evaporating and going away, and that consolidation is really paying for these AI projects. In some cases, that might be support agent roles, where people are actually being repurposed across an organization. And in other places, you've got lots of technology that's been purchased that is deemed no longer necessary, because we can pull the data out, push it into a service, and do something else with it. So I think consolidation is pretty significant.

And then lastly, I think the way in which the organization has historically operated from an IT perspective has a pretty big impact here. The best-performing organizations I've seen are being extremely agile in how they're rolling these things out. Can we get a prototype built quickly? Can we get it tested within a small group? How quickly can we see whether or not it's working before we go and make that commitment? Anything that's taking six, nine, twelve months to deploy not only runs the risk of being out of date by the time it gets deployed, but you're well into the return-on-investment challenge that I highlighted earlier, because you haven't given yourself enough time to see this thing out in the wild, in which case there's already pressure on you to do something different.

It's interesting. There's been a lot of different reports from different organizations about adoption of GenAI. And the numbers that I remember are like 90-10: maybe 10% have actually gotten these things into production, and there's a whole bunch of issues that are holding people back.

It's interesting. MIT CISR did a recent piece of research. I wrote an article on it, and it talked about how it's got to move from simply buying a GenAI technology and letting the marketing department use it, to an actual app or agent out there.

What are you seeing? You know, those seem to me to be where the compelling things are, but what are you seeing as the early use cases that customers are actually having some success with?

Yeah. I think there are two main areas that I've seen customers be successful.

And, actually, from what I'm seeing, the speed of adoption is better than I've seen reported elsewhere. I've actually been impressed and, in some cases, surprised at extremely large organizations getting things into production early, which is not something that you'd necessarily historically see with new technologies.

The areas where I'm seeing these wins are existing processes that have AI infused into them. So if you've already got a quote-to-cash process set up, integration set up, if you've already got a lead lifecycle management integration set up, being able to go and infuse AI on top of that, being able to add something like sentiment analysis to those existing processes, is where people are getting quick wins, because it's enhancing what they've already got. It's speeding up the pace that they can operate at, and it's a measurable impact.

They've had these things running for some time historically, so they can actually tell, is this making a difference to the organization? And it is. So there's a huge success story in there.
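To make the "infusing AI into an existing process" idea concrete, here is a small, assumed example: an existing lead-routing step gains a sentiment score on the latest customer note before the record moves on. The score_sentiment() function is a crude keyword-based stand-in for whatever model or AI service would actually be called.

NEGATIVE = {"frustrated", "cancel", "disappointed", "slow"}
POSITIVE = {"great", "love", "impressed", "expand"}

def score_sentiment(note: str) -> float:
    # Crude keyword-based score in [-1, 1]; replace with a real model call.
    words = set(note.lower().split())
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def route_lead(lead: dict) -> dict:
    # The pre-existing routing logic, now enriched with a sentiment field.
    lead["sentiment"] = score_sentiment(lead.get("last_note", ""))
    if lead["sentiment"] < 0:
        lead["queue"] = "retention"       # the new signal feeds the existing rules
    elif lead.get("score", 0) > 80:
        lead["queue"] = "fast_track"
    else:
        lead["queue"] = "nurture"
    return lead

print(route_lead({"name": "Acme Co", "score": 65,
                  "last_note": "Customer is frustrated with slow onboarding"}))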

The second one, as has been well publicized, is the support use cases: I'm getting the same or similar requests on a regular basis, and AI does a very good job of reasoning across a dataset it has access to, to provide either the frontline support or even to skeleton out or put together the first response so that a human can go and actually edit it and push it out. And I think the reason those are being successful is that they're processes that people already know.

They're easy to measure against. They're not expecting a user to go and do something different to how they have done historically.

Whereas some of the examples I've seen where people are expecting user behavior to change from day zero, those are much more challenging to get stood up. And I think that the sort of sidecar for this challenge is how do you actually get these things built out. Because if you continue to buy independent AI solutions, then you create a new integration challenge.

Because if we've transitioned from having 500 enterprise applications to saying we're going to shift over to AI, but we're going to buy a hundred AI point solutions, well, you've got the same headache. Then you've got the option of, okay, I'm just going to go turn the AI on for the 500 applications I've already got. Well, who governs that?

Who's the referee that sits in the middle of that battle and tries to figure that out? And which one supersedes the other is a totally different headache.

So then you get into this realm of, well, are we now an application development studio? Do we hire a bunch of engineers, and, actually, do we build custom solutions every single time?

Or do we use a composable iPaaS platform that allows us to build these things out, because the orchestration's already there? That's really the evolution I think has to happen in the iPaaS market, because it's forever been focused on the integration problem, and this is now the modern integration problem.

So I think the twofold challenge is tackling areas that you can measure ROI on consistently and easily. And then secondarily, how do you actually go deliver on these projects in the first place? Because there's a lot of risk entailed with that.

Yeah. I think there's something to one of the things you said earlier: the importance of already having worked out the processes that you're now going to use the AI on in automated form. So where you've figured out step one, step two, step three for a human, the AI can just follow that whole stream.

But coming back to something else you were talking about, at the end, I think it's important to think about what are the early adopters that are successful? What are they doing differently than others? I mean, obviously, they're not adding a bunch more integrations into their system and trying to just cash it over. They're not making a new mess. They're industrializing. Right?

Totally right. There are a few traits. The first one is agility. The best-performing organizations that I've seen adopt AI have made a cultural change, but critically they are comfortable with prototyping, getting things out quickly, and iterating on them.

There's a large law firm that we work with, a 150-year-old company, that now has every intern who joins the business build and submit GenAI projects as part of their year-one assignments. And so if you think about what they're setting up for the future of that organization, it's that the new blood coming in is AI-native from the start. They're thinking about prompts. They're thinking about how to use this technology.

So they're already in that mindset.

Secondarily, that prototyping piece is key, because if you're able to get something in front of somebody and very quickly start to see what the throughput is and how it's performing, even if that's in an initial closed group, and if that means putting together a cross-functional AI unit, which is something I see pretty consistently, those people are getting to a place where they've got the confidence to go and deploy these things more broadly and start to rely on them much faster than anybody else. And in both cases, those organizations have a history of good data centricity and of being focused on integrating the stacks that they've got. They're not interested in adding hundreds of new tools and adding complexity.

If anything, they're reducing complexity and increasing output. And so much of it is how quickly you can get the organization to make the cultural shift, form the groups so that there's cross-functional buy-in, but then critically get that prototyping piece into play, because the speed of change here is the thing that makes such a difference.

I really like that a lot. I did a review of David De Cremer's book, and one of the things he was talking about was that you need to get the CEO on board, and then you need to do the org stuff, and you need to do this in a way where people feel it is making their jobs more rewarding, and all of that.

Yeah.

What would your final guidance be for CIOs getting on this journey? They've got the board coming down on them, but they want to do it in a way that's going to be beneficial to the work.

Yeah. I would start with what are the core processes that you already have stood up, and then how could they be improved, or how could they be influenced, by the addition of intelligent reasoning. So if you already have areas of your business that are well documented, well publicized, well integrated, starting there and looking to accelerate the way in which those areas perform means that you already know what you're building on top of. You've already got something to measure against. You can do it in a contained environment. So it becomes much easier to silo some of this off, adopt the LLM of your choice, which can be interchangeable given the speed of change within these models, and start prototyping with a group that already has an understanding or an expectation of the outcome.

That doesn't mean going and starting a completely new initiative for something that's completely alien within the organization, which is going to sidetrack you for the next 12 months. It's: where are some of the pain points we already have, what are the processes we already have in place, and how could we go and apply this right now? A great example I could give you is your finance department. They are, 100%, processing invoices in some form, and there's no way this is all being done digitally.

So using something like IDP, something that could semantically pull the data from those invoices with a high degree of accuracy and push it into whichever system of record or system of choice you use to handle your downstream financial control process, is an instant way to benefit from AI. It's a project area that you know. It's a critical part of your business, because it's how you get paid, and there's an extremely measurable ROI straight out of the gate. So picking these off and working on these projects quickly, to me, is the way that you can then go back to the board and say, hey, look at this experiment that we ran. It allowed us to go and do x, y, z. That gets you the confidence of the board, the CIO, and the rest of the organization, and then you can start to go and apply that same mentality and approach across the rest of the organization.
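As a rough illustration of that invoice pattern, the sketch below pulls a few fields out of raw invoice text and hands them to a downstream system of record. The regexes in extract_invoice() stand in for a real IDP or document-AI service, and post_to_erp() is a hypothetical placeholder for a finance system's API.

import re

def extract_invoice(raw_text: str) -> dict:
    # Stand-in for an IDP / document-AI service: pull key fields with regexes.
    def find(pattern: str):
        m = re.search(pattern, raw_text, re.IGNORECASE)
        return m.group(1).strip() if m else None
    return {
        "invoice_number": find(r"invoice\s*(?:no\.?|number)[:\s]+(\S+)"),
        "total": find(r"total\s*(?:due)?[:\s]+\$?([\d,]+\.\d{2})"),
        "due_date": find(r"due\s*date[:\s]+([\d/-]+)"),
    }

def post_to_erp(record: dict) -> None:
    # Hypothetical placeholder for pushing into the system of record.
    print("Posting to ERP:", record)

raw = """ACME Corp
Invoice Number: INV-2041
Total Due: $12,450.00
Due Date: 2025-01-31"""

fields = extract_invoice(raw)
if all(fields.values()):   # simple completeness check before acting
    post_to_erp(fields)
else:
    print("Fields missing; routing to a human for review:", fields)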

Well, I've learned a lot, and obviously the CIOs I know agree with you completely. Look for those quick, easy wins, prove that you can generate value, and then go do more complicated things, but starting there is a great place to begin. Any final thoughts, Rich, before we close this off?

Other than the fact that I've really enjoyed this conversation: for anybody that's interested in getting their first agent off the ground, we have a workshop running tomorrow and on the seventeenth of December, which will be shared within the links or found on our home page. So we'd love you to come along and help out. And equally, I'm always happy to share some of the knowledge that I've gained spending so much time with CIOs in this space over the past couple of years. So, yeah, I really appreciate the opportunity to spend some time with you today, Myles.

Thank you, and thank you everybody for coming.

Let's explore what's possible, together.

Contact us