PODCAST

The Future of Event-Driven Integration with James Urquhart

In the book Flow Architectures: The Future of Streaming and Event-Driven Integration, flow is defined as networked software integration that is event-driven, loosely coupled, and highly adaptable and extensible. It is principally defined by standard interfaces and protocols that enable integration with a minimum of conflict and toil. It aims to reduce the cost of real-time integration while allowing data streams to be shared in new and innovative ways, giving birth to a worldwide flow.

Transcript

Aaren Quiambao

Welcome to Coding Over Cocktails, a podcast by Toro Cloud. Here, we talk about digital transformation, application integration, low-code application development, data management, and business process automation. Catch some expert insights as we sit down with industry leaders who share tips on how enterprises can take on the challenge of digital transformation. Take a seat and join us for a round. Here are your hosts, Kevin Montalbo and Toro Cloud CEO and founder David Brown.


Kevin Montalbo

Joining us all the way from Australia is Toro Cloud CEO and founder David Brown. Hi, David. 


David Brown

Hi Kevin.


Kevin Montalbo

And our guest for today is a proven technology executive and a key influencer in the use of distributed systems and technologies in the enterprise setting. He has a background that includes both field and product leadership at startups and large corporations. He was named one of the ten most influential people in cloud computing by the MIT Technology Review, The Next Web, and The Huffington Post, and is a former contributing author to Gigaom and CNET.

He frequently writes and speaks about these disruptive technologies and the business opportunities they present. He is the author of Flow Architectures: The Future of Streaming and Event-Driven Integration, published by O'Reilly. In it, he imagines a new global event-driven network that will drive what he describes as a Cambrian explosion that will change how the world works. And we're here to talk about that today. Ladies and gentlemen, James Urquhart. Great to have you on the podcast, James.


James Urquhart

It's a pleasure to be here and I love that sign man. That is so cool. I got to get one.


David Brown

Thanks, man. We've been waiting for that sign for a while, and we just need a new camera so the resolution comes up better in the background there. But it is a Coding Over Cocktails sign.

James, look, we've been really looking forward to this. The book is really interesting stuff and your predictions are very, very interesting. We often ask on this podcast, you know, what's your prediction for the future? Well, you've written a book about a disruptive technology which you think is going to disrupt the future significantly. Why don't you tell us, to get started: what is flow? What is it that it does?


James Urquhart

Yeah, thank you. So the concept of flow, the fundamental question behind it, is this: what happens when the interfaces and protocols that we use to integrate our applications, so that we can consume real-time event streams, real-time indications of state change in other systems and in our own systems, become standardized? Today, that kind of integration happens all the time. You have stock trading networks, you have healthcare systems, you have manufacturing systems where sensors can indicate there's a problem to a central control system. But the problem is that every single one of those use cases has basically bespoke, made-for-that-purpose interfaces and protocols that enable those things to talk to each other.

I did some work with a technique called Wardley Mapping, and we can talk more about that as we go forward. It allows you to understand the user need, understand the components required to meet that user need, and then map those against an evolutionary scale that goes from incredibly new, novel, and not understood, all the way through custom-built and then product, until they get to a commodity or utility phase where the capability is fully understood and everybody expects a certain behavior. And so my question was: what happens when those interfaces and protocols begin to evolve to the point where they become standard, understood, and ubiquitous? It very quickly occurred to me that that is a huge deal. It's akin to when HTTP gave us a standard interface and protocol for defining and linking information, for understanding how to not only provide a pointer to another piece of information, but to follow that pointer from one piece of information to another, right?

So in the same way, this is about activity, about linking activity in a way that lets you very easily say, hey, I want to make this stream of events representing this activity available to the world, or to specific organizations. And as a consumer, you can say, hey, I want to subscribe to that particular stream so I can react to that information as it becomes available. And if you follow the same train of thought as what happened with HTTP, linking individual pieces of information and so on until we started to see a global graph of highly linked information, I believe that over the long term, and we're talking a decade, maybe more, we will see a very large, massive-scale connected environment where real-time data is being exchanged, consumed, and used to generate new data across industries, across geographic locations, and across the world. So that's the heart of what flow is. It's about how event-driven integration will be changed by standard interfaces and protocols, and how that will change the way we integrate businesses.


David Brown

So it sounds like a pub/sub model for internal architecture and application development. Streaming events, subscribing to events, and having applications respond to those events is a common architectural model for internal application development. My understanding of flow is that it's that, on a global scale. You're making those events publicly available, presumably with some form of security so that you can lock them down to only the subscribers you want. So it's almost replacing RESTful APIs with an event-based subscription.


James Urquhart

I want to correct that slightly, because it's not replacing REST-based APIs today.


David Brown

Sorry, I mean supplementing, not replacing.


James Urquhart

Today, we do many things with API calls that are really about trying to catch an event, a state change that's occurred. So what we're doing is replacing those situations where we're using request-reply models and that's not the most efficient approach. And I'll give you an analogy. There's a gentleman, I can't remember his name right now, who spoke at DevOps Enterprise Summit a couple of years ago about how Walmart adopted an event-driven integration for their real-time inventory systems. The most powerful aspect of that story was what the architecture looks like internally in Walmart's systems to take advantage of receiving data in real time, as opposed to having to find it by making these kinds of tree calls through multiple services, right? So literally, the data that a system requires ends up looking a lot like a lookup table, because it just gets fed with the data from the outside and gets updated on a regular basis.

So in some cases they literally just use a lookup table as the service. Cool stuff. And then he did the math on what that does to availability. Rather than having to chain all those services together and get your availability by multiplying the service levels of all the services you have to call to get your answer, they now have one call to get an answer. Whether or not that answer is 100% accurate depends on how accurate the feed coming down the chain is, right? But the call tree is a single call, so it's way cheaper to achieve five nines of service availability in order to answer that question. That kind of fundamental shift in thinking really changes the nature of when you would use event-driven versus API-driven. I think in a large number of cases where APIs are used today, the idea of using event streams will make more sense.
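To make that availability arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. The service count and uptime figures are illustrative assumptions, not Walmart's actual numbers.

```python
# Illustrative only: chained request/reply availability vs. a single pre-fed lookup.
# The per-service availability and call-tree depth are assumptions for the example.

def chained_availability(per_service: float, n_services: int) -> float:
    """Availability of a call tree where every one of n services must respond."""
    return per_service ** n_services

per_service = 0.999   # each dependency offers "three nines"
n_services = 5        # depth/breadth of the call tree

print(f"Chained call tree: {chained_availability(per_service, n_services):.5f}")
# -> ~0.99501: five three-nines services compound to roughly 99.5%

single_lookup = 0.99999   # one read of a local lookup table fed by the event stream
print(f"Single lookup:     {single_lookup:.5f}")
# The answer is only as fresh as the stream feeding the table, but the availability
# of answering the query no longer multiplies away across the chain.
```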

However, even in flow, the interfaces I talk about are APIs, right? To initiate a connection to a stream, to request, to subscribe. And you're right about the pub/sub thing; that seems likely to be the dominant, if not the only, model that flow will take. But in order to make that subscription, you have to make a request-response call to the topic holder, to the publish-subscribe system. So APIs for request-response situations will still be critical. But it's very, very true that we do an awful lot of things where we make calls on a timed basis, trying to catch information when it becomes available, as opposed to "hey, I have a query, and I need a response to that query right now." That's what flow will replace. That's what the fundamental change will be.
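As a rough illustration of that point, the subscription itself is still a request-response API call, after which events arrive asynchronously rather than being polled for. Here is a minimal sketch; the endpoint, topic name, request and response shapes, and token are hypothetical, since no flow standard defines them yet.

```python
# Hypothetical sketch: one request/response call establishes the subscription,
# then state changes are pushed to the consumer instead of being polled for.
import requests  # assumes the third-party 'requests' package is installed

SUBSCRIBE_URL = "https://streams.example.com/v1/subscriptions"  # invented endpoint

def subscribe(topic: str, callback_url: str, token: str) -> str:
    """Ask a publish/subscribe broker to push a topic's events to callback_url."""
    resp = requests.post(
        SUBSCRIBE_URL,
        headers={"Authorization": f"Bearer {token}"},
        json={"topic": topic, "sink": callback_url},  # invented request shape
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["subscriptionId"]  # assumed response field

# One synchronous call sets up the relationship; from then on, events flow
# to the callback rather than being fetched on a schedule.
sub_id = subscribe("inventory.stock-level",
                   "https://consumer.example.com/events",
                   "REPLACE_WITH_TOKEN")
```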


David Brown

There seem to be so many implications in terms of how it's going to disrupt business and the like, which we'll get to in a second. But can I ask you: you've likened flow to HTTP, the foundation of the World Wide Web, and you predict that flow will become the foundation for linking the world's activity via this "worldwide flow", to use a similar sort of term. What led you to the conclusion that this was inevitable, that this is how business is going to be conducted?


James Urquhart

Yeah. The core element of it is that you look at the needs to satisfy event-driven integration, just event-driven integration, without putting any caveats on it. The only caveat I add is that I'm really, really interested in how you do that integration across organizational boundaries. What do I need to be able to securely, with trust, and in a performant way have one organization say, "I want to subscribe to the feed from another"? There are many examples in the book of different, interesting use cases where that's the case. As I did that analysis: anytime you have integration, the key factor that makes it possible is some form of interface that allows the systems involved to identify a connection between each other. Now, that might be through an intermediate party.

So I am very clear in the definition of flow that it doesn't have to be the producer and the consumer themselves that provide and consume the interfaces and protocols. There can be agents that act on behalf of the producers and the consumers to do that. But when you have that situation, you need those interfaces and protocols to enable, ultimately, those two things to get connected together. So those are two components of the overall solution. The other things you need, besides those two, include things that we all know extremely well: the things that interact with streams. Because just streaming event data with no interaction is moving data around for no really good reason, right? You have to have interaction that goes with flow. And so when you take that on, you also have to map things like queues and processors and sources and sinks as part of the map as well. So I put all of that on there, and I was paying attention to the fact that a lot of those things, the queues and the processors, are commodities today, right? You can go to Amazon today and you have a selection of different types of event processing to choose from: Lambda, Step Functions, EventBridge, and a number of other things. And then you have a number of queues available to you as well.

So Kafka, or a simple message service; a number of other things are available, just to give you a sense of the utility options from one provider, and you have a big choice in the marketplace for them. So you have this ubiquity of core processing elements. Sources and sinks range a little bit more. Sinks, if you're talking about data sinks and analytics, are largely commodity today too; you can get data warehouses, and even short-term processing and real-time visualization, very easily today. Sources range from brand-new sensors we've never seen before all the way to commodity elements. But then you get to the interfaces and protocols. Today, if you look at how people are integrating applications using event streams, especially across organizational boundaries, those situations, as I said earlier, are highly bespoke.

One company may say, hey, we're just exposing our Kafka API through some sort of secure tunnel or something like that, with heavy access control around it, and that's how we're doing it. Another group may say, hey, we're publishing these to EventBridge in AWS, so if you want to connect to this, you go to EventBridge, and this is how you connect to our stream that way. And there's everything from what the stock market does, where they literally sell you a port on a server that you connect to directly and get a very direct network stream, to more casual kinds of ways of connecting. So what got me there is that part of Wardley Mapping is understanding that everything moves from that crazy, novel thing through the stages I talked about earlier, custom and then product, toward that utility phase, right? And if it's a really, really useful capability, new technology will come to meet that capability and slowly subsume the old one, until you get to the point where you've got a very standard offering and there's no real need to differentiate; we know exactly what we want here. There might be little tweaks to specific features built on top of the core capability, there might be tweaks to the service that's offered, but you get to the point where it moves to commodity.

And that's when you ask the question: if we're largely custom, or maybe product in some cases today, what happens when the right technology comes along to make that a utility? To me, that's the core thing. It's somewhat speculative to say, but I did the analysis on the business drivers that need to be there (there's a whole chapter on that in the book) and on what today's technologies are telling us about the trends (there's a whole chapter that talks to that, too). Then you do a little bit of what Simon Wardley, who invented Wardley Maps, calls gameplay against what you see in the current-state map, and you ask some questions about the different ways it might evolve. It became really, really clear to me that, unless there's some fundamental legal or physics barrier that isn't understood today and would block its evolution, over time it's pretty inevitable. And we're getting very close to the point where that's going to start to appear; we can talk more about what I see driving that.


David Brown

But was that like an aha moment, when you came to that conclusion?


James Urquhart

Yeah, yeah. So you can go to Medium or some old blog posts that I did a while ago, which you can search for and look up if you want to, where you can see old, really sloppy versions of the different analyses that I did. You can see literally where I came to the conclusion that, wait a minute, these interfaces and protocols are really important. And when I reached that point, that was definitely an aha moment for me.


David Brown

Definitely. Can we talk about that? You've mentioned a few technologies, so it seems, as you say, some technology is already there to publish messages, put them onto queues, and have people subscribe to them; even some of the security elements are already there. Let's briefly touch on which technologies would need to evolve to support flow. For example, in the RESTful API space, what needed to evolve was a common standard to describe RESTful APIs and subscribe to them. Now, I know we have emerging standards like CloudEvents (cloudevents.io). Are those the sort of technologies which are going to need to evolve to support flow?


James Urquhart

Yeah, let me break down interfaces and protocols just slightly more, because that'll help you understand, but yes, I think you're on the right page. On the interface side, there are really two core elements you need to make flow work. The first is the logical connection, which is really the subscribe command. If you buy into publish and subscribe being the mechanism, then it's: what are your interfaces to publish, and what are your interfaces to subscribe? There's a possibility that there are other, more direct mechanisms, not so much about publish and subscribe, that might be useful in some use cases and might play into what has to be supported.

So you might have a situation where it's not so much about subscribing; it's more about establishing a logical, more conversational connection between a producer and a consumer. But if we go with publish and subscribe, then whatever the publish and subscribe interfaces are, the mechanisms that allow a publisher to say "I want to connect and put things into a topic somewhere" and a subscriber to say "I want to connect and take things out of that topic somewhere", those interfaces are largely the logical connection, and they sit on top of all the networking stuff: TCP/IP and TLS and so on. The other interface that's really critical is discovery. Discovery is important because: what streams do I want to subscribe to? How do I find streams that are valuable to me and my company, especially if the number of streams grows exponentially the way HTML websites did? How do I find what I need to find? There are a couple of ways that could go. One way is that it could just be Google and web pages that describe it:

here's my stream URI, and here's what you need to know in order to consume the data that comes from that stream URI, published in a standard way you can look for. That approach has a very decent chance of being the winner. But there are also a number of companies out there putting out commercial products that are more like an API-driven registry of streams, which do more than just say "here's a stream URI and here's who owns it." They give you programmatic ways of understanding the velocity of data you might expect to get from a stream, what schemas you need to know and understand in order to consume it, and, if I can programmatically determine from a schema how to read it, where to go and get the things my program needs in order to do that. So those things, I think, are important. Then, on the protocol side, there are also two pieces that are really important. The first is the payload: what is it you're sending, what's unique about that state change, that you want to communicate to the other side? Payloads are the one place where it's not going to be standardized in the sense of one ubiquitous protocol; it's likely to be more on an industry-by-industry, or even market-by-market, basis. Things you already see with EDI, and in a number of other places where data standards have been built for exchanging data across boundaries, will get encapsulated into payload form, or new ones will be created to meet new needs. But the metadata gets very interesting.

The metadata is the core description of what's in the payload: what do I need to know to pick it apart? When did this event occur? How has it been routed to me? What was the original source? Those things are very, very important in order to trust the data you're getting, and also to understand a little bit about how to process it and the kinds of things you want to determine from it. And for that metadata there's CloudEvents, cloudevents.io, which you mentioned earlier; it's a project from the CNCF. I think it has the inside track in a big way in terms of being that metadata protocol. It's a very important step in the right direction, and even if it doesn't win, it will inform whatever wins in the long term. But man, I've got to tell you, everything about the way they set it up is correct. It's not itself a wire protocol; it's a metadata definition protocol that can be mapped to a number of different wire protocols. AMQP, MQTT, HTTP: there are mappings to just about everything you can think of. Something like that will have to be in place as well. And so when you look at those things, those are the pieces and elements that have to get better and stronger and fully standardized in order for us to say: now we have standard interfaces and protocols for flow.
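For a feel of what that metadata layer looks like, here is a minimal sketch of a CloudEvents-style envelope built in Python. The attribute names (specversion, id, source, type, time, datacontenttype, data) come from the CloudEvents specification; the inventory payload under "data" is an invented example of the industry-specific part Urquhart describes.

```python
# Minimal sketch of a CloudEvents-style envelope.
# Metadata attributes follow the CloudEvents spec; the payload under "data"
# is a hypothetical, industry-specific body (the part flow leaves to each market).
import json
import uuid
from datetime import datetime, timezone

event = {
    "specversion": "1.0",                              # CloudEvents spec version
    "id": str(uuid.uuid4()),                           # unique id for de-duplication
    "source": "https://streams.example.com/inventory", # hypothetical producer URI
    "type": "com.example.inventory.stock-level.changed",
    "time": datetime.now(timezone.utc).isoformat(),    # when the state change occurred
    "datacontenttype": "application/json",
    "data": {                                          # invented domain payload
        "sku": "A-1234",
        "warehouse": "DC-07",
        "onHand": 412,
    },
}

# The same envelope can ride on HTTP, AMQP, MQTT, etc. via the spec's bindings.
print(json.dumps(event, indent=2))
```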


David Brown

Do you think certain vertical industries will also drive adoption? The finance industry coming up with standards for how to describe a financial transaction, for example?


James Urquhart

Yeah, I mean, I think there are certain places where it's going to be really hard for flow to crack the nut, so to speak. I use high-frequency trading as an example frequently in the book, as the kind of area where a flow-like contract exists already. The catch there, of course, is that the speed and performance you need is really high. But that doesn't preclude flow from giving the financial stock-trading industry the ability to say, hey, we're moving from this port-based approach to a flow interface, and essentially we're going to establish the same connection when all is said and done; the metadata will change a little bit, and the flow will change a little bit as a result.

But yes, I absolutely think certain industries are going to find this more valuable right off the bat. Transportation is one; specifically, logistics is an industry where I look at it and say, yeah, they're already moving in this direction. There are a number of vendors using relatively common, open standards for things like truck telemetry for semis, and those things are already starting to become very, very standard. So I absolutely think industries will drive it, and some industries will be first, others will come later. But man, the power of real-time data is surprising in places where you've been using batch processing for decades, right? I've seen this firsthand: you get a really excited shift in behavior when people can actually see what's happening, say, in an online retail campaign in real time versus 24 hours later.


Kevin Montalbo

All right. Like any disruptive technology, I imagine that new technologies, new industries, and new business models are going to emerge. Have you foreseen any of these emergent industries and business models?


James Urquhart

As of now, I think there's very little today that you can point to and say, "this has been driven by standard interfaces and protocols," because, of course, they don't exist yet. But you can see some hints of the kind of explosion of business opportunity that's out there. I mean, just today we see in the news that Apple and Kia Motors are reportedly working together on a factory to build cars, right? A big part of that is the telemetry capability available from the Apple ecosystem, and what they're able to do in terms of integrating your entire personal portfolio of data; adding cars to the picture makes that possible. You look at self-driving cars, you look at the music and film streaming industries, and some of the capabilities you might see with all these new networks popping up everywhere and everybody doing their own apps. All of that is really indicative that there are things that can be built on top of this. I think it will go much further, though. I do think there's a lot of opportunity for business data streams to be analyzed and combined and reused in ways that we don't see a lot of right now. So I think there's more to come, in a huge way.


Kevin Montalbo

How about for organizations? Do you see any organizational challenges that flow will face? Are companies right now set up to embrace a real-time, event-driven economy?


James Urquhart

Yeah, I think there will be some resistance. One of the things I say a couple of times in the book, and I've believed this for a really long time, is that the first requirement of any really useful business system is trust. If you don't have trust as a baseline, none of the rest of the features and functionality matter. So it will take some time for organizations to trust that, if they make their data available, it will be consumed in a way that's beneficial to them, and to find the business models where that's the case. It'll take a while, as well, for consumers to trust that the data sets they get from somewhere they don't have an upfront contract with are valid and worthy of consumption. I also think there's a fun little debate going on right now about the nature of what an application is. We've used terminology about big-A Applications and little-a applications and all kinds of terms for, you know, what's the difference between a deployable component that's owned by a team and the application from the consumer's perspective, or from the business perspective?

And I think a big part of the reason we have trouble mapping these things is that, in the end, what we really have is a graph of software components out there. We have a bunch of components that have dependencies and communication links with each other, and finding the boundaries is getting harder and harder as we have more and more of them, connected in more interesting and complex ways. So I do think that, in the end, some of the resistance enterprises will have revolves around the way they're organized, the way they're organized to build software and how that software comes out. It's the old Conway's Law conundrum we've been dealt, and the dream of a reverse Conway's Law, where our organizations begin to change around the software components that make us successful. I don't think we'll get there anytime soon, which is why I think flow will be delayed somewhat, because the trust factor isn't only between one business and another and the network in between; within the organization there's also a trust factor about sharing data that has to be overcome. That's why I say it's a 5 to 10 year horizon before it's a mainstream technology. But I do think that the early movers, the people who help drive the standards and then build businesses around those standards early, will have an inside track on really profiting from what happens later.


David Brown

I'm guessing there are also business process challenges in moving to a sequential event-processing model. Many businesses still work in batch processing, so they'll literally need to overhaul their business processes in order to support a real-time event-streaming model.


James Urquhart

Yeah. The real-time part of it has two factors. One is the need for automation in places where we've been able to slide by without it for a long time; I think that's going to be driven in a huge way. I talk a lot about a class of work that's always been easily replaced by software, which I call clerk jobs. A clerk job is somebody who takes something off a queue someplace, performs a prescribed function against that something, and then passes it on to the next step in the process. If you look at the movie Hidden Figures, about the women at NASA who used to compute functions and provide mathematical answers to equations in the fifties and sixties, look at how quickly they were replaced as the IBM mainframe became the core processing system within the organization. I believe a lot of white-collar clerk work is very much at risk as flow pops up, because organizations are going to have to ask: does it make sense for us to try to process a growing velocity of data, coming in through the streams we subscribe to, in a human-driven way?

Or do we finally have to find ways to automate portions of compliance, and automate portions of data reconciliation and data cleanup, as we take a hard look at things? These are hard problems to solve, but maybe increasingly easy ones as machine learning and AI technologies get a little bit better over the course of the same decade that all of this is evolving. So yes, I do think there's a fundamental shift there. What's really going to survive, from a career perspective in most organizations, is the work where you have to use smarts and creativity to either create something of value or to resolve issues that might impact the value of a product or service being offered. In those situations, I think it will be a very, very long time before humans are replaced. But I always say there's a negative to every positive in the way these technologies evolve, and the negative here is that a lot of jobs that have been relatively safe for the last hundred years are probably going to be increasingly at risk, not just because of AI and ML, but because the combination of AI/ML and flow is really going to enable businesses to automate those tasks more easily.


David Brown

Fascinating. Look, what can organizations do to prepare for this? You mentioned the early adopters are going to be the winners, and let's put aside the technology providers, for which there are obviously huge opportunities. For organizations which are going to be publishing or subscribing to streams, how can they get started on this now? Or is it purely theory? Are there things they can do and put into practice now to prepare for this?


James Urquhart

Well, yeah. The good news is that because the interaction technologies are increasingly, more or less, commodities at this point, processing events at scale is a doable thing today with the technologies that are available. So the first thing I point out is: we're all talking about microservices architectures, but a lot of people think of microservices architectures in terms of request-response APIs and REST-API kinds of services. There are a number of works out there that now increasingly talk about how you process streaming data, a data-stream approach, using various Apache projects and different commercial products in the market. You could certainly take a look at that if what you're really after is beginning to understand and sort data; there are a number of things you can do there today using those technologies. There's also a new class of technology out there around building models of the real world in digital form, a digital twin model. If you Google the term "digital twin", you'll find a billion things about it. One of the ones I've worked with a fair bit is an open source project called SwimOS, and the company behind it, Swim, which is producing it and delivering commercial products around it. The idea there, and this is a class of tech, not just one product, is that you process the stream as it comes in, and you determine from the stream's data that an entity or agent exists in the system.

You build a digital twin for that agent; then, through the data, you determine how it's connected to other agents in the system. You begin to build this graph model in memory of what these things are and how they're connected to each other. Then, interestingly, it uses more of a machine learning approach to determine over time: how should I react as the other agents I'm aware of change? What actions should I take? What signals should I emit, or how should I update my state to reflect what I'm seeing elsewhere? It's in heavy use in things like traffic systems; I think the city of Las Vegas is one the Swim team holds up as an example, Palo Alto as well. But they also do models of entire cell phone networks for major providers, where they know what towers are doing, what cell phones are doing, and what's connected to what at any given moment, based on the stream of data coming from those towers and from the cell phone network systems themselves. And they do that on a surprisingly small number of hardware systems, so it's a very cost-effective way of understanding how those things are related. Another element you can look at and build around today is the cloud world and serverless. People doing serverless systems today are building prototype systems that would work very well in a world where they're consuming an external stream instead of an API call; a lot of Lambda today is triggered by a call to an API Gateway, and you could just replace that with the function consuming a stream. You can do that today as well.
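To make the digital-twin pattern concrete, here is a small, hedged sketch in plain Python. It is not the SwimOS API; it simply illustrates building an in-memory graph of agents from a stream, with invented event fields.

```python
# Illustrative sketch of the digital-twin pattern described above.
# Not SwimOS: just plain Python showing the shape of the idea.
# Event fields ("entity", "linked_to", "state") are invented for the example.
from dataclasses import dataclass, field

@dataclass
class TwinAgent:
    """In-memory stand-in for one real-world entity seen in the stream."""
    entity_id: str
    state: dict = field(default_factory=dict)
    neighbors: set = field(default_factory=set)

    def on_update(self, new_state: dict) -> None:
        # React to a state change; a real system might emit signals here.
        self.state.update(new_state)

class TwinGraph:
    def __init__(self) -> None:
        self.agents = {}  # entity_id -> TwinAgent

    def ingest(self, event: dict) -> None:
        """Create/update twins and their links as events arrive from the stream."""
        agent = self.agents.setdefault(event["entity"], TwinAgent(event["entity"]))
        agent.on_update(event.get("state", {}))
        for other_id in event.get("linked_to", []):
            other = self.agents.setdefault(other_id, TwinAgent(other_id))
            agent.neighbors.add(other_id)
            other.neighbors.add(event["entity"])

# Feeding a couple of hypothetical traffic events:
graph = TwinGraph()
graph.ingest({"entity": "intersection-12", "state": {"phase": "green"},
              "linked_to": ["intersection-11", "intersection-13"]})
graph.ingest({"entity": "intersection-13", "state": {"phase": "red"}})
print(len(graph.agents), "twins tracked")  # -> 3 twins tracked
```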

The big thing is to build it in a way that decouples the interfaces and protocols you depend on from the processing in the systems downstream, so you have the ability to adapt very quickly and cheaply as you move forward. Decoupling the implementation of the event processing from the consumption of the data in the business systems themselves is, I think, a very important thing to keep in mind, so that you can begin to consume what I believe will be standard libraries. You won't have to write all of this yourself to consume flow, because a lot of it will be very, very standardized around those standards, and you'll be able to say: give me a known object type out of what I've received from the stream, which I can then consume downstream in what I'm doing. So definitely keep in mind that those interfaces and protocols are what are going to evolve and change, and build with that in mind; but then take advantage of the architectures available today, and of new architectures as they start to evolve, where they make sense for your use cases. And there are a number of other books out there, from O'Reilly and others, about event-driven microservices and serverless programming and things like that, which can help your organization be ready and already on its way when flow appears.
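Here is a minimal sketch of that decoupling advice, assuming an invented envelope format and domain type: the idea is that only the thin adapter changes if the stream's interfaces and protocols evolve, while the business logic keeps consuming a known object type.

```python
# Sketch of decoupling transport/envelope parsing from business logic.
# The envelope shape and StockLevel fields are assumptions for illustration;
# only parse_envelope() would change if the stream's interface or protocol changes.
import json
from dataclasses import dataclass

@dataclass
class StockLevel:
    """Known domain object the business code consumes, independent of the wire format."""
    sku: str
    warehouse: str
    on_hand: int

def parse_envelope(raw: bytes) -> StockLevel:
    """Thin adapter: envelope and protocol knowledge lives here and only here."""
    envelope = json.loads(raw)          # e.g. a CloudEvents-style envelope
    body = envelope["data"]
    return StockLevel(sku=body["sku"], warehouse=body["warehouse"], on_hand=body["onHand"])

def on_stock_level(update: StockLevel) -> None:
    """Business logic sees only domain objects, never the stream's wire format."""
    if update.on_hand < 10:
        print(f"Reorder {update.sku} for {update.warehouse}")

# Simulated message pulled off a subscription:
raw = json.dumps({"type": "stock-level.changed",
                  "data": {"sku": "A-1234", "warehouse": "DC-07", "onHand": 4}}).encode()
on_stock_level(parse_envelope(raw))
```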


Kevin Montalbo

All right, I actually have a suggestion for organizations who want to prepare for flow, since you talked about books. The suggestion is basically: just read your book, right, James?


James Urquhart

Sure. You know, my book is a good way to get a sort of high-level, national-map-scale overview of what the problem set is and what's available. It's not a book that's going to give you a recipe for exactly how to build your systems today to be event-driven in that way. But yes, I highly recommend to everybody who's interested in this idea of moving to real-time stream integration with other organizations, whether within your company or outside of it, to take a look at the Flow Architectures book, because I think I did a really, really good job of mapping out all kinds of different questions that you have to keep in mind and think about, and that the market will have to solve, in order for flow to really become trustworthy and useful in the future.

And there's a ton of opportunity there. Not only are there instructions about how to be prepared so you can react to flow; there's also a lot of description of the spaces that need to be solved, if you're an entrepreneur or an investor, in order for flow to really come together and be useful. You could build that as a vendor of an integration without standards today, in very specific situations, and then expand it to take advantage of the standard when it becomes available. There's also a bunch of material about what industry organizations can do, if you're part of an industry trade group or you're in a government organization with a standards body that you adhere to.

What can you do today to define the payload standards, and to be prepared to quickly declare a consumption standard around something like CloudEvents once it becomes clear that it's the standard that's available? So there's a lot of opportunity in the book beyond just, hey, what do you need to do as a software architect or software developer to use flow. It's really a book meant for investors and entrepreneurs and business leaders and technology leaders, a number of different audiences that are going to be able to take advantage of and leverage flow as it becomes available.


Kevin Montalbo

All right. We want to congratulate you, here at Toro Cloud, on your book. Before we wrap this up, where can our listeners go to learn more about you?


James Urquhart

Yeah, you know, the best places to find me these days: I've been a long-time Twitter user, and that's the number one place I usually go to communicate, look for ideas, and meet new people. But I'm increasingly on LinkedIn as well, and I find that platform very useful in ways that Twitter is not. Those are two great places to find information. I also just launched a blog, literally just launched it. It's called Flow Arc Book, and it's on Blogger, at flowarcbook.blogger.com. I'm about to post my second post ever on that blog, but it will be a good place to see how I break things down, maybe a little differently than I did in the book, and how my thinking evolves and changes. I'm also going to try to stay on top of any news that's really important as flow comes out and comes to fruition. I'm seeing interesting things in a number of industries that indicate there are businesses that could transform quickly if this technology were available, and I'll try to highlight those as I go forward. So those are the main ways to get hold of me.


Kevin Montalbo

All right, that's a wrap for this round of Cocktails! To our listeners, what did you think of this podcast episode? Let us know in the comments section of the podcast platform you're listening on. Also, please visit our website at www.torocloud.com for a transcript of this episode, as well as our blogs and our products. We're also on social media: Facebook, LinkedIn, YouTube, Twitter, and Instagram. Talk to us there, because we listen. Just look for Toro Cloud. Again, thank you very much for listening to us today. On behalf of the entire team here at Toro Cloud, this has been Kevin Montalbo for Coding Over Cocktails. Cheers!

