Speaker 0 00:00 Hi folks, and welcome to The Workflow Show. I'm Jason Whetstone, senior workflow engineer here at Chesapeake Systems, and I'm joined once again by my cohost, Chesapeake senior solutions architect Ben Kilburg. Hi, Jason. Hey, Ben. So, AI, ML, what are they and why should you care? Is this some scheme to automate people out of their jobs? Is it a supplement to their current responsibilities, and what solutions are out there? How can you harness the power of AI and ML to make your content work for you? How can we integrate these solutions with, say, a media asset management platform? Here today to discuss these and other related topics, it's Chesapeake's own chief executive officer, Jason Paquin. Hello, Jason. Hi. Thanks for having me on. Thank you for being here, Jason. And I'd also like to welcome Ryan Steelberg, president of Veritone. Thanks for joining us, Ryan. Thank you guys, I appreciate being here. Great. So let's get started with some background info as usual. Ryan, tell us about yourself and what led you to where you are today.
Speaker 1 00:59 Sure. Well, I started in ad tech back in the mid-nineties. My brother and I started one of the first internet ad serving companies, called AdForce. We were sort of the main competitor against DoubleClick for many years, through a hyper-growth stage of the web. What was really interesting about it was that it was a hosted platform. We ran it more or less as a managed service on behalf of a few thousand prominent publishers, including the old Yahoo and GeoCities and Lycos and all these names from the past. And we scaled it to more or less managing the delivery of over a hundred billion ads a month, primarily display ads. It was a fascinating, awesome, fun company.
Speaker 1 01:47 We took it public, and ultimately it was acquired after we took it public. But what was interesting is that it kind of infused the data pedigree into my brother and me. How we served the initial ads: our main partner initially was AOL, and back then we weren't trying to do IP inferencing and trying to glean insights about who we were serving ads to. We actually knew who you were. We had an encrypted code, so when somebody would log in through CompuServe or AOL, we knew exactly whether it was a male or female, the city they lived in, et cetera. Then through that period of time the privacy laws changed, and we had to retool the whole architecture, as did DoubleClick, to try to figure out, candidly, who are these people we're serving ads to. And not just the people we're serving ads to, but what's the association with the content that we're serving ads upon? So you had to do something
Speaker 0 02:48 Like crazy research into, like, this person's present browsing history, or...
Speaker 1 02:54 All of the above: the initial onset of cookies, class C IP addresses and how those change over time, proxy addresses. There was so much money flowing in and so much demand for figuring out internet advertising that we were forced to deal with a deluge of data. Think about it: every time I have to make one decision about what's the right display ad, the banner ad, the 468-by-60-pixel banner ad that is the bane of the world's existence today but was ubiquitous back in the day, what are all the different components and data sets that I have to sift through to make the right decision on the right ad and serve it to you in a few milliseconds?
Speaker 1 03:42 So making sure that I don't get ads about football stuff, for example, or whatever. You bring up a good point: making sure that we don't serve the wrong ad to the right person is as important at times as serving the right ad to the right person. Ultimately, it was then that we were exposed to the necessity of looking at different versions of AI and machine learning, relatively rudimentary neural networks and different elements back then. There was so much data that we had to start developing systems so we could learn from what was working and what was not working in terms of different demographic target sets. And that was really the beginning of our education, and of our professional careers, in dealing with lots of data and AI.
Speaker 1 04:33 Machine learning was definitely an important tool. It's gotten a lot more sophisticated over the years, but it was definitely a tool that we were invoking way back in 1994, 1995. Ryan, out of curiosity, was that all really based on more static ads on the sites, like banner ads, or did you also at that time start getting into ads shown inside video, in a VOD platform? It's a great point. No, back then it was just interruption-based ads, display ads. That was the bread and butter. Ironically, you touch on kind of the early precursor to Veritone. I would say we became poster children for this: if there was a place to serve a disruptive, interruption-based ad, and I'll put display ads in that category, we were serving it. And it got to a point where the average American was traditionally exposed to about a hundred different sponsorships or promotions in a day.
Speaker 1 05:32 And then, as you know, by 2006 and 2007 they were exposed to over 3,000 unique promotions or ads per day. That's a big problem. And that was mostly display. So looking at native-based ads, advertising that's embedded or associated directly with the content itself, really was sort of the second generation of our focus. It started to emerge a good 15 years later than when we originally built the initial AdForce business back in the day. Gotcha. So for our listeners, a lot of this came from targeted advertising, really, just making sure that you're getting the ads that really are going to be meaningful to you. Connect that now with what Veritone is today. What are you guys doing now?
Speaker 1 06:20 So I think one of the things we learned is that my brother and I became really interested in native-based advertising. Native is really the term that the web started to birth, but for those of us who have been exposed to traditional media and cinema, whether through branded integrations or product placements or advertorials, there are all different flavors of, I would say, more native or more organic advertising, as compared to a commercial break and display-based ads. So we initially started just finding some interesting startups, and we became strong investors in a multitude of different startups out there that were trying to figure out the native-based advertising ecosystem. Some of them did relatively well, but for the most part, a lot of our seed investments ultimately failed.
Speaker 1 07:13 And so Chad and I kind of looked at it and said, you know what, let's give it a shot ourselves and see if we can figure some of this stuff out. The initial problem set that we set out to tackle, in about 2011, was this: one of the companies that we owned was an endorsement-based ad agency. It was a relatively small boutique agency that was on the cutting edge of working with live hosts, on TV but primarily on broadcast radio, including hosts on ESPN and Rush Limbaugh and Premiere Radio Networks and others. They really became the go-to boutique shop if you wanted to do live reads or have the radio host weave a live commercial into his show.
Speaker 1 08:04 This was the agency to go to. And so we were fascinated by that. During our period of exploration and obsession with all things native, we acquired that business. In effect, we started just passively watching how things played out. The business grew to a point, but then it ran into an interesting wall. Commercials that air in stop sets, and this is the same for television stations as it is for radio stations, are historically booked through a traffic system, an advertising traffic system, and the output of an advertising traffic system is logs. So in effect, the logs will tell you exactly, okay, if there are five commercials in a stop set, when each commercial aired, the length of the commercial, down to the millisecond when it started to air. Native-based advertising was amorphous. It was not showing up.
Speaker 1 09:02 It didn't show up in any log. By default, it's organic; somebody is just talking. So they ran into a problem where it was hard to build attribution models or evaluate ad efficacy if you couldn't verify when exactly Rush Limbaugh or another host was actually doing the live read. Right? Yep. And so it was at that point that we started to look around for an interesting technology solution to that problem. That was really our first foray into dealing with natural language processing and speech-to-text: to programmatically and intelligently record and analyze potentially thousands of different broadcast streams to find exactly when those native ads were airing. And that was really the impetus, the initial kernel, that led us to launch Veritone.
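[Editor's note: for readers who want a concrete picture of the verification problem Ryan describes, here is a minimal sketch of scanning time-coded speech-to-text output for native ad mentions. The segment structure, sponsor name, and matching logic below are our own illustration, not Veritone's implementation.]

```python
# Sketch: find "mentions" of a sponsor in time-coded transcript segments.
# Segment data and sponsor terms are invented for illustration; a real
# system would consume the output of a speech-to-text engine.

def find_mentions(segments, sponsor_terms):
    """Return (start_seconds, matched_term, text) for each segment
    whose transcript contains a sponsor term."""
    hits = []
    for seg in segments:
        text = seg["text"].lower()
        for term in sponsor_terms:
            if term.lower() in text:
                hits.append((seg["start"], term, seg["text"]))
    return hits

# Example time-coded transcript segments (invented):
segments = [
    {"start": 312.4, "text": "We'll be right back after this."},
    {"start": 318.9, "text": "Folks, I sleep better thanks to AcmeMattress."},
    {"start": 402.1, "text": "Now back to the phones."},
]

mentions = find_mentions(segments, ["AcmeMattress"])
for start, term, text in mentions:
    print(f"{start:.1f}s  {term}: {text}")
```

[In practice the matching would be fuzzier than an exact substring test, but the shape of the workflow is the same: transcribe, scan, emit timestamped mentions that can feed an attribution model.]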
Speaker 0 09:59 That's a really interesting story. Can you sort of like put that on the timeline? Just a range? Is it, uh, early two thousands or, yeah,
Speaker 1 10:07 Well, on the timeline: we acquired the agency in 2010, and 2011 is when we started to build prototypes. It wasn't called Veritone back then; it was actually called the Cognitive Media Platform, CMP. That was kind of our code name for the project, which I still like, by the way. And that was in about 2011. Ultimately, we decided to incorporate as Veritone in 2014, and here we are a few years later.
Speaker 0 10:42 Yeah. That's great. That is a really interesting story, Ryan. I'd like to move on and take a broad look, because I know our listeners have heard AI and ML, artificial intelligence and machine learning, many, many times here on The Workflow Show. They've probably heard it from people above them, maybe people in the more corporate part of the organization, because I think they've become buzzwords in the industry, maybe coming from larger industries or industries outside of the M&E space. So let's start with AI. What is AI? Then we'll talk about what ML is and how they're different.
Speaker 1 11:25 So, you know, AI is a very broad term. It encapsulates a lot, and it's been around for a very long time. I'm going to put it into just two general categories: computational AI and cognitive AI. We've experienced and had the luxury and the benefit of computational AI for a very long time. You can argue that the calculator, which we've had for a very long time, is a fabulously powerful form of augmenting human capabilities by doing computations brilliantly fast. That trend, obviously Moore's law and other things in the whole silicon revolution, means we've gotten really, really good at, and continue to push the envelope for, computational transactions. And AI is definitely a native part of that. I'm going to say it's artificial intelligence because in effect it's an artificial form of what humans have historically had to do manually.
Speaker 1 12:27 The machines are doing it for them or assisting them in that effort. I'll put that on the computational side. The cognitive side is a lot more challenging, and that's, I would say, a newly emerging major class of artificial intelligence. That's really where it's been nearly the exclusive domain of human beings, as compared to machines, for a long time: understanding meaning that is potentially arbitrary or relative to an individual. It's not just adding up ones and zeros, or looking at a number and breaking it down into its smallest components, but a qualitative aspect. What's that piece of content, that show, about? What is the conversation about? What are the subtleties and the color cues in the background of a movie in production? And maybe even, why should we care?
Speaker 1 13:19 Right. Yeah. Well, I think we all appreciate the computational aspects; they have significantly aided human society, and we're still seeing the benefits of those. On the cognitive side, this is where it gets more challenging: there's so much data being produced today that, candidly, humans can't keep up. You were touching, in the preamble of the show, on how this can impact people's jobs and careers. Well, candidly, we think AI and machine learning, particularly as they relate to cognitive services, will help save a lot of jobs, primarily because if a business can't keep up with the load, it's going to have to find a new business model, and potentially that person is going to lose their job. AI will assist that individual in sifting through what's out there. Because of cheap compute and almost unlimited storage, we're producing data, and content is just a form of data, at an astronomical clip that is impossible for people to keep up with. AI and machine learning, as we cross over from just computational AI to cognitive AI, will assist us in this next phase of hyper-growth in content, data, and data services. Gotcha. So the machine learning aspect is really this cognitive side. When we think of cognitive, we think of learning; that's the cognitive part of the AI.
Speaker 0 14:52 Sure.
Speaker 1 14:52 Well, I think classic machine learning really comes in supervised and unsupervised forms, that is, with or without a human to teach it. If I'm creating lots of data and I want to analyze that data and then act upon it, machine learning, deep learning, and other elements can do a lot of that now, in either a fully autonomous manner in some respects, or a semi-autonomous manner. But humans are inefficient as it relates to acting upon data. If there's an anomaly in a dataset, it may take us humans a very long time to spot it; machines can do it very quickly and then act upon it. So when you look at these systems that claim they're getting smarter: are they really getting smarter, or are the machines just able to update the systems faster without human intervention? I'd put that on the computational side rather than the cognitive side; machine learning obviously crosses over into both. But I think that's most of what we're talking about here.
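[Editor's note: a toy illustration of the anomaly point above. A simple z-score check flags an outlier in a dataset instantly, where a human scanning rows might take far longer. The data and threshold are invented for illustration.]

```python
# Sketch: flag values whose z-score (distance from the mean in units
# of standard deviation) exceeds a threshold. Threshold is illustrative.
import statistics

def flag_anomalies(values, z_threshold=2.5):
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

# Nine ordinary readings and one obvious outlier:
readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 58.7, 10.1, 9.7, 10.0]
print(flag_anomalies(readings))  # the 58.7 reading stands out
```

[Real anomaly detection in production uses far more robust methods, but the asymmetry Ryan describes holds: the machine scans millions of rows in the time a person reads one.]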
Speaker 2 15:53 Well, what really excites me in this area of machine learning, and certainly getting into the cognitive side of AI: a passion of Chesapeake Systems for a long time now has been automating processes so that creative people can actually do their creative jobs, which a machine cannot do. A machine today can't, in our opinion, sit down and review content, decide how to cut that content up, and actually create that story. It can try; it can come close, perhaps, and those things are always evolving. But again, that's been a passion of Chesapeake Systems for so long, and this really is getting to the next level of it. We started off getting into automation systems where we would be excited to automate the delivery of a transcoded file and automate the transcode of the file itself, so it was consistent and validated, and a human didn't have to waste their time, especially when they were hired to be a creative. That was a very common thing we saw when we walked into a client's site. So you're describing a new level of that, where we're getting into: how efficient is this business process? How efficient are these humans at doing these different things that are part of their job today? And for the efficiency of the business, how can we change that formula so that humans are doing what humans should be doing?
Speaker 0 17:10 Yeah. And I actually have an anecdote that really speaks to what you're talking about, for those who may still be saying, ah, I don't really know, I feel like I'm going to lose my job here, maybe I do a lot of logging or tagging of metadata and you're telling me the machine's going to do all that and I might have to do something else. I listened to a podcast; I'm sorry, I wish I remembered which one. It wasn't The Workflow Show; I believe it was an NPR podcast. There was an organization that was trying to determine if elephants in their native habitat were being poached. They did this by setting up, if I recall, about a hundred different field recorders in various places, high up in the trees so they wouldn't get damaged, and they let them record for several weeks, I think.
Speaker 0 18:02 And then they had to go back and analyze all of that content. What they were looking for was a very specific sound that an elephant makes when it's feeling a certain way, and that was going to tell them, essentially, what was going on. They were also listening for gunshots, because that's an indication the elephants are being hunted and poached. They had calculated how long it would take to listen to all hundred or so of those recordings for the amount of time they were recording, and it was some ungodly number of man-hours just for a human to listen to that content. Then they were able to put together an algorithm to process all of the content and look for the sound, and they fed it through, and it took like four hours, you know? So this isn't necessarily a throw-people-at-the-problem solution. People get distracted and people make mistakes, and sure, machines can crash and things like that, but generally they're pretty reliable when you give them a reliable task to achieve.
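[Editor's note: a rough sketch of how an algorithm like the one described might flag gunshot-like impulses: compare each audio frame's energy to the background level. The synthetic samples below stand in for real recordings; a production detector would use trained audio classifiers rather than a simple energy ratio.]

```python
# Sketch: detect short, loud impulses in an audio signal by comparing
# each frame's energy to the average energy of all frames. Frame size
# and ratio are illustrative choices.

def detect_impulses(samples, frame_size=4, ratio=10.0):
    """Return starting sample indices of frames whose energy jumps
    well above the average energy across the recording."""
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples) - frame_size + 1, frame_size)]
    energies = [sum(s * s for s in f) / frame_size for f in frames]
    background = sum(energies) / len(energies)
    return [i * frame_size for i, e in enumerate(energies)
            if e > ratio * background]

# Mostly quiet synthetic signal with one loud burst in the middle:
quiet = [0.01, -0.02, 0.015, -0.01] * 5
burst = [0.9, -0.85, 0.95, -0.8]
signal = quiet + burst + quiet

print(detect_impulses(signal))  # index where the burst begins
```

[The point of the anecdote survives the simplification: a machine can sweep weeks of audio for candidate events in hours, leaving humans to review only the hits.]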
Speaker 1 19:04 And I think we've got to be realistic. It's not like we have greatly expanding budgets in our state park system, right? Let's say you wanted to do something similar for endangered species in Yellowstone National Park. They're not going to hire 5,000 interns or others to do this work. They're simply not going to do it. So to your point, that's a use case nobody would ever have tried to tackle, simply because it would be inconceivable that they could finance it or staff it and have the budgets to do so. Another one that's kind of similar: the deputy director of the intelligence community for the DOD was estimating that the explosive growth of new locations and accessible data is so fast that to keep pace, the U.S. federal government would have to hire 8 million additional analysts to keep up with the data. 8 million. I'll forward the article.
Speaker 1 20:04 It's fascinating. It's documented, and it's public domain information now. And obviously that's not going to happen, right? So in effect, we need to tap into machines just to keep up with the current human staff that we do have. Again, I don't think you're going to see a great impact on jobs, particularly on the cognitive side, but I do agree that it's going to help individuals remain competitive, simply because we're producing a lot more data, both directly and passively. AI and machine learning, if they do nothing else, will allow the current infrastructure of human capital to keep up. Right. And we're getting to that phase. That output
Speaker 2 20:48 from AI certainly is incredibly accurate. We have an example where a client is using free interns to do a lot of their logging, and so the argument is, why would I pay for machine learning when I have free interns? An easy argument to make is: are your free interns good at what they're doing? I have plenty of stories about that, and they aren't. Well, sometimes they are, but there's a lot of review that has to be done by the management of those free interns.
Speaker 0 21:14 Yeah. And let's be realistic: it's not necessarily their fault. They're doing the best they can most of the time, and they probably really do care about what they're doing, but they may not have all the insight that's needed to, say, tag thousands of assets with the right metadata.
Speaker 2 21:32 And to get back to the topic of whether we're automating people out of jobs: in this scenario, is there an automation of students out of experience? No. They should be doing things that give them more experience. They could be walking around with a producer on a shoot and actually getting experience being creative, versus plugging in metadata, which really doesn't interest them all that much.
Speaker 0 21:54 No, but it should a little bit, because, and we've talked about this on The Workflow Show before, we do need some education from our educational institutions around media management, what that looks like, and how to do taxonomy. That's what we need to do to get more education in that field.
Speaker 2 22:12 So Ryan, I think this is a good time to also go into Automate Studio, a product that Veritone has been putting together. We've been touching on the automation side of this in this conversation for a few minutes now. We received a demo of Automate Studio a few weeks ago, I want to say, and it was really great to see the approach, not just in the capabilities but also in the presentation layer of automation in your world of machine learning: really tying the output of one engine into the input of another engine, and then another output, and further automating what these engines push out, to do more with machine learning than a lot of clients thought they could do in the first place. Can you speak a bit to the approach with Automate Studio and where that came from?
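[Editor's note: the engine-chaining idea Ben describes, the output of one engine feeding the input of another, can be sketched as a simple pipeline of nodes. The engines below are stand-in functions with invented behavior, not Veritone APIs.]

```python
# Sketch: compose small processing "nodes" into a pipeline, the way a
# node-based workflow designer wires steps together.

def transcribe(media):
    """Stand-in for a speech-to-text engine."""
    return {"transcript": media["audio_text"]}

def detect_brands(payload):
    """Stand-in for an entity-detection engine: naively treats
    capitalized words in the transcript as brand mentions."""
    brands = [w for w in payload["transcript"].split() if w.istitle()]
    return {**payload, "brands": brands}

def run_pipeline(media, nodes):
    result = media
    for node in nodes:
        result = node(result)
    return result

clip = {"audio_text": "thanks to Acme for sponsoring today"}
out = run_pipeline(clip, [transcribe, detect_brands])
print(out["brands"])
```

[A low-code tool exposes the same composition visually: each node is a box, each hand-off an arrow, and the "code" is the wiring rather than the functions.]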
Speaker 1 23:02 Yeah, absolutely. For the last few years, we've only been able to work with partners and end customers through really two integration points. One is an API layer. More sophisticated groups who have the appropriate systems and internal dev resources love that, and for some of our larger customers, or those who have candidly committed budgets to these new innovative programs, the API is the way to go. As I like to say, we're just a super intelligent, highly scalable, highly accurate bit blower. On the other extreme are customers who want plug-and-play utilization through a well-defined user interface and user experience. Say it's the ad sales department, and they want to use the output of AI to immediately start building attribution models for ads, right?
Speaker 1 23:57 For example, that may only be a viable opportunity for that client if there is a turnkey, stupid-simple application that makes all that complexity of cognition and AI a very easy workflow for them to manage. So you really had two extremes. The problem is the UI, despite how simple it may be. What's interesting, as we were discussing at the beginning of the show, is that some people love the user interface Veritone provides through its Essentials application, which gives the ability to search and discover through the output of AI. But if I give that to some groups, it's dead on arrival: way too complex, they can't figure it out. So having to wait for a full-blown, well-designed, hardened application to be built was becoming inefficient. If I have this brain, Automate Studio is really asking: how can I create a low-code environment, short of a whole development shop building an application, to harness AI and machine learning for my specific business purpose?
Speaker 1 25:03 How can I equip a host of sales engineers or solutions architects to start building new and innovative AI and machine learning workflows without (a) having to go directly to GraphQL and our API layer, or (b) having to either finance or wait for us to build a hardened and, I'll call it, finite application? Automate Studio is our solution. Primarily we're only offering it to partners now, and candidly we use it as a major internal tool. We are going to be opening it up as an end product and start marketing and selling it as an independent, end-to-end solution, probably Q1 or even Q2 of next year. Again, Automate Studio very simply provides a low-code way to build dynamic AI-driven workflows, with the full power of AI accessible to it.
Speaker 1 25:59 But again, you and I can start building our own creative workflows that would have been inconceivable, or candidly way too expensive to even try to tackle, just a few short months ago. Great. So I'm picturing, correct me if I'm wrong, something that might look like a Vantage workflow designer or an Aspera Orchestrator or something like that: node-based, drag and drop, arrows between all the workflow points. Spot on. And, you know, we're disciples: we love Alteryx, Alteryx is an amazing company, and some of these interesting ETL development tools. So we wanted to create something, again, on a Node-RED type of framework, that is easy to use. People understand what a node is; they understand drag and drop. And we look at the proprietary IP that people come up with when they build their own amazing workflows as being as valuable to a company as an app or a proprietary new concept.
Speaker 1 26:57 So we feel the same way. We're thrilled about it. That's great. Give us an example of what you might be doing with Automate Studio, just something simple maybe. Yeah, so I'll give you one example. In one of our use cases, for a major broadcaster, we historically have been ingesting static flat files of Nielsen data on their behalf, so they could start to correlate when certain events were happening in their broadcasts with Nielsen ratings. That was working relatively effectively for a while. But they also started to acquire new data sets, and without having to build complex adapters or other elements, they wanted to start to ingest and analyze new forms of connected TV data sets, set-top-box data, and the like, without having to change the application, because the application, as it was built, was presenting to them what we call mentions.
Speaker 1 27:59 That is, when these events were happening and what the Nielsen rating was. But they really wanted a different perspective from set-top-box data as compared to summary and diary-based Nielsen data. Working with them, we used Automate Studio to create a flow to ingest the set-top-box data from their internal system through a secure process, then process it. We did some data cleansing and data normalization, and then we developed correlation engines that would more or less align the set-top-box data against the same time sequencing and timecoding so it coincided with the Nielsen data for each specific event. We were able to do this, candidly, in a few hours, and it was really powerful and exciting for both companies. We were able to ideate, prototype, and deliver, in effect, a working and scalable version of this application and workflow that they could immediately start generating value from.
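[Editor's note: a hedged sketch of the correlation step described above: pairing broadcast events with set-top-box activity that falls within a timecode window. All field names and figures are invented for illustration; the real Automate Studio flow handles ingestion, cleansing, and correlation through configurable nodes.]

```python
# Sketch: correlate Nielsen-rated broadcast events with set-top-box
# tune-in records by aligning timecodes within a window.

def correlate(broadcast_events, stb_events, window_seconds=30):
    """Pair each broadcast event with the set-top-box household counts
    whose timestamps fall within `window_seconds` of its timecode."""
    results = []
    for ev in broadcast_events:
        tuned = sum(s["households"] for s in stb_events
                    if abs(s["time"] - ev["time"]) <= window_seconds)
        results.append({"event": ev["name"],
                        "nielsen_rating": ev["rating"],
                        "stb_households": tuned})
    return results

broadcast = [{"name": "halftime promo", "time": 3600, "rating": 2.1}]
stb = [{"time": 3590, "households": 1200},
       {"time": 3615, "households": 800},
       {"time": 5000, "households": 300}]

report = correlate(broadcast, stb)
print(report)
```

[Putting the two data sets side by side this way is what lets the broadcaster compare panel-based Nielsen ratings against actual tune-in behavior for the same event.]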
Speaker 1 29:13 That's one example. One other that's really interesting, actually in a different vertical, but I think apropos to what we're talking about in terms of human automation, is one of our earlier public safety clients, the Anaheim Police Department. Before we even built some of our initial applications, we talked with the Anaheim Police Department about using AI to sift through their known-offenders databases and correlate those against footage within an investigation. We built a workflow through Automate Studio that basically did all of this outside of an application: it more or less simulated all the piping, running cognition against the audio and video footage as it was being ingested within an investigation, and then doing real-time lookup, analysis, and correlation against the known-offenders database. They were able to prototype that whole thing in a few days and find out, okay, boom, this really works for us, or here are the changes and tweaks we want to make. Only then did we start building the actual application, hire the front-end developers, and start to build the user interface around it. So you were able to
Speaker 0 30:23 Sort of build the framework for this, for the solution before you started looking at UI. And that's great.
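[Editor's note: the known-offender correlation Ryan describes typically reduces to comparing face embeddings by similarity. This is a toy sketch with made-up vectors and thresholds; real systems use trained face-recognition models and carefully validated match criteria.]

```python
# Sketch: match a face embedding from footage against a database of
# enrolled embeddings using cosine similarity.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match(probe, database, threshold=0.9):
    """Return the database ID whose embedding is most similar to the
    probe, if the similarity clears the threshold; otherwise None."""
    best_id, best_score = None, threshold
    for person_id, emb in database.items():
        score = cosine(probe, emb)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

db = {"subject-17": [0.9, 0.1, 0.4], "subject-42": [0.1, 0.8, 0.2]}
probe = [0.88, 0.12, 0.41]
print(match(probe, db))
```

[The workflow prototyped in Automate Studio would wrap steps like this, detect faces, extract embeddings, look up matches, into nodes that run as footage is ingested.]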
Speaker 1 30:30 It's a whole fascinating new way of thinking about the best way to build a UI and a user experience.
Speaker 0 30:36 I can imagine, especially for public safety. We've got a lot of officers wearing body cams and things like that, so this has a really huge impact there.
Speaker 1 30:47 Yeah, it's an exciting area. I mean, that's an area with a lot of sensitivities. To put it in context: even myself, am I comfortable having passive surveillance on me wherever I go, with AI running on it? I have a problem with that personally. Am I okay with using AI in the context of an investigation, trying to find the bad guys? Absolutely. So it's like anything: it's a fine balance. These are extremely powerful tools; we just have to make sure we have the right processes and governance in place to take advantage of these new tools and AI and machine learning, but also continue to protect ourselves.
Speaker 0 31:29 Yeah, sure. That actually brings up a point I hadn't thought about before, which is the privacy aspect of using AI. Any other points you want to offer on that?
Speaker 1 31:37 Well, I think we're going to have to have some interesting conversations, and again, I'm not going to take a political viewpoint on whether this should be governed at the city, state, county, or national level. But I think we need to protect a person's identity. When we convert an individual from analog into a digital numbering system, like my face, should it be universally indexed by machines? In effect, I'd then have a code for my face that is interchangeable with every system out there. There are a lot of movies, a lot of biblical things, the number of the beast and all of that, that were really scary about this stuff. So again, I just think a centralized, ubiquitous database of humans, used with AI, would be a very dangerous thing, and hopefully we can find ways to continue to get good yield out of these technologies without compromising our personal rights. And for our listeners, Veritone is not storing your Face ID.
Speaker 2 32:49 It just lives in your phone. So I'd like to jump to a comment you made in giving us that example, Ryan: you made the statement "on their behalf," and I want to come back to that in a second. But to give a little background first: Veritone is a partner of Chesapeake Systems, and one of the things we evaluate in the partners we bring to our clients is the passion of that partner, and not only the passion, but the ability to follow through on what they're passionate about in supporting our customers, with a full understanding of what that client is doing, how they're doing it, and what the output of any work we do together as partners needs to be for our clients to be successful.
Speaker 2 33:38 And one thing I've seen in Veritone's approach, in our evaluation of Veritone in the market as we've chosen our partners over the many years now, is, one, your approach with AI, where you have an operating system and a platform that isn't just handing people engines and saying, "work with these cool engines." You're giving them a platform to work with those engines on, and then going further with Automate Studio, really solving a need around the use of those engines in your platform with yet another offering. But then you made that statement, "on their behalf." It's something I've heard a handful of times from different individuals at Veritone: you're getting your hands dirty. What I'd like to hear more about is, what are those service offerings that you've found benefit both the customer and Veritone, where you understand the customer's real needs, get your hands dirty, and become part of the solution for them? What are those offerings, and what have you gained from them?
Speaker 1 34:36 Well, thankfully, our background, and this has at times been a bit of a learning curve for my brother and me, is that we've sold every one of our solutions from our previous companies client-direct. We never went through channel partners, resellers, system integration groups, or consulting shops. We were calling the businesses directly, and we were the single pane of glass. Although that's challenging at times, and it takes a while to build up the kind of sales machine and brand that gives you the capability of doing that, the one luxury is that I'm talking to the executives directly. I'm not getting a filtered representation of what we're talking about, or of what kind of solution we're trying to deliver and deploy together.
Speaker 1 35:24 I'm hearing it straight from the horse's mouth, and that's invaluable when you're trying to introduce something brand new to the opportunity. Candidly, we have to have some foresight into saying, okay, how could these tools apply to an advertising-minded broadcaster? We do have to come up with a lot of our own ideas, concepts, and use cases first, before I have the discussion. But the reality is that the big ideas, the big breakthroughs from the combination of technologies we're bringing to the market, are never going to be realized unless both parties have an intimate relationship where the client is willing to really open up and talk about what they're trying to achieve, or candidly, what their problems are. You have to have that white-glove service, and obviously that level of trust with the customer, so they can really open up and express what's going on and what problems they're looking to solve.
Speaker 1 36:17 There's only so much we can do as a technology provider and independent service vendor if they're not willing to open up. That's been a philosophy of ours for every business we've ever had, and we look for partners like Chesapeake and others that have that same passion and discipline. Because again, the last thing we want to do is push; I have no bias toward any specific infrastructure deployment or AI model. What will be the best solution? If the client really understands what that solution is, great; or let us help you figure it out, and let us go find the right components and build the best solution for you, whatever your goals are. To reference back to what we were talking about: let's say that accuracy in speech is the most important thing.
Speaker 1 37:05 They don't care about cost or performance, just accuracy. Depending on those variables, that may warrant us as the service provider saying, okay, here's the right cognitive profile or recipe for you. Another group says, hey Ryan, I've got a tight budget, and I have so much content that candidly I'm willing to compromise a little on transcription accuracy as long as I can come in under budget. Great, thank you very much; here's a recommended solution for that. So I think anybody who's not providing that level of service in this industry, particularly in the SaaS-based ecosystem we're living in today, is not going to be in business too long. And I think we share that same principle.
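The tradeoff Ryan describes, picking a cognitive profile from a client's accuracy, cost, and speed priorities, can be sketched as a simple weighted score over candidate engines. This is purely illustrative: the engine names, metrics, and scoring function are invented for the sketch and are not Veritone's actual profile mechanism.

```python
# Illustrative only: engine names and metrics are invented, not Veritone's catalog.
ENGINES = [
    {"name": "engine-a", "accuracy": 0.95, "cost_per_hour": 2.40, "speed": 1.0},
    {"name": "engine-b", "accuracy": 0.88, "cost_per_hour": 0.60, "speed": 3.0},
    {"name": "engine-c", "accuracy": 0.91, "cost_per_hour": 1.10, "speed": 2.0},
]

def minmax(vals):
    """Scale a list of numbers to the 0..1 range (all-equal lists map to 1.0)."""
    lo, hi = min(vals), max(vals)
    return [(v - lo) / (hi - lo) if hi > lo else 1.0 for v in vals]

def pick_engine(engines, weights):
    """Return the engine name that best matches the client's priorities.

    weights: relative importance of 'accuracy', 'cost', and 'speed'.
    Cost is negated before scaling so that cheaper engines score higher.
    """
    acc = minmax([e["accuracy"] for e in engines])
    cheap = minmax([-e["cost_per_hour"] for e in engines])
    fast = minmax([e["speed"] for e in engines])
    scores = [
        weights["accuracy"] * a + weights["cost"] * c + weights["speed"] * s
        for a, c, s in zip(acc, cheap, fast)
    ]
    return engines[scores.index(max(scores))]["name"]

# An accuracy-first client versus a budget-conscious client:
print(pick_engine(ENGINES, {"accuracy": 0.8, "cost": 0.1, "speed": 0.1}))  # → engine-a
print(pick_engine(ENGINES, {"accuracy": 0.2, "cost": 0.7, "speed": 0.1}))  # → engine-b
```

The same content, run through two different recipes, yields two different recommendations, which is the "consultative" step a service provider performs for each client.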
Speaker 2 37:49 Right. And I've definitely seen examples, to use another one, where you've brought multiple offerings in the industry together, like when you acquired Wazee Digital, I guess about a year ago now, bringing what is now DMH, Veritone Digital Media Hub, into the platform you've built, and then marrying that up with the AI and ML services to really enhance what someone gets out of the library of content they're pushing out for licensing. That's yet another example of solving problems for clients by bringing those tools together. And in addition, back to the "on their behalf": there are certainly scenarios where Veritone is helping to host, and correct me if I'm wrong, to host, process, and handle a lot of that licensing for clients, because it's another role the client might not have someone to fill, but it's a huge value to them. And in that client-direct mindset, you're walking in and saying, how can we help you get there? Because you have value in your content; let us help bring that out for you.
Speaker 1 38:51 I love the Wazee acquisition, and part of it was just out of necessity. We saw a market inefficiency that really wasn't being served. There were MAMs everywhere, production-based MAMs, and we really saw a limitation in the industry for what I'll call distribution MAMs. Great, I've already created the content, or ingested and compiled all the content in my production MAM. What do we want to do next with it, as it relates to social media distribution, packaging up derivatives for Alexa, or potentially licensing my content to a movie studio, whatever it is? We felt that trying to extract a lot of those use cases from a lot of the traditional MAMs we saw just wasn't there.
Speaker 1 39:40 And so I was exposed to Wazee Digital many years ago, when it was T3Media and, before that, Thought Equity. The licensing was intriguing to me: in a world where I can index any archive or any media silo with the right rule sets and permissions and expose that content, there's a lot we can do with it. So again, do we try to push DMH on every single customer? Absolutely not. When we see an opportunity to introduce the Digital Media Hub platform for a client in a way that will augment their existing preferred or installed MAM, awesome. That just goes to the consultative approach we want to take with all of our major customers.
Speaker 2 40:27 Right. And speaking of MAM, your point on the distribution MAM I think is really important. In the world we've been in for a while now at Chesapeake, we go in and help a customer understand MAM, understand the value of MAM. And after they start down the road of implementing a MAM, we then find we have to get them to understand the value of a librarian, or at least someone generally managing the MAM: not just the engineering side of the MAM, the administration, which we can help them with, but the editorial side of the MAM, the taxonomies, the scenarios where you're putting a lot of automation and business process in your organization around the MAM and someone has to make sure it's actually happening. Do you find, not just in that distribution-MAM world with the Wazee offering, now DMH, but also in AI and machine learning in general, that there are roles that need to be developed into the org chart of a client taking on these new technologies you offer? Is there a common thread there?
Speaker 1 41:32 Yeah, there absolutely is. And
Speaker 1 41:35 You know, the data czar. Having a common lens for looking at my assets, all my audio, my video, my structured and my unstructured data, one that's consistent and kept ubiquitous across my enterprise, is critical, and some companies are farther ahead on that than others. For example, we're working with CBS News right now on a huge effort they're tackling, trying to digitize their entire historical archive. We've been a partner of theirs for a long time: they're a DMH customer of ours, a cognitive partner of ours, and also a licensing partner of ours, meaning that Veritone, through our division Veritone Digital, formerly Wazee, has the job of helping sell and license their content.
Speaker 1 42:25 To this point, we've only indexed or digitized about two to three percent of their archive. That's it. So there's a huge amount sitting there, and most of it is on film or on tape. So how do I even prioritize what content to digitize or index next? They actually have, they call it a librarian, their own data specialist who understands their preferred taxonomy and comes up with an intelligent game plan and a yield criterion: what should we prioritize next for the effort and expense of digitizing, to make this type of content more accessible and more of a leverageable asset? A great example, obviously: we just had the 50th anniversary of the moon landing, right?
Speaker 1 43:18 We put out some promotions around that a while ago, but a lot of that content wasn't digitized until just a couple of years ago. That's a great example where some material is pretty obvious: you might want to take on the expense and go index and surface it. But today the world is so dynamic that a trending topic or an event somewhere in the world may be the impetus for a broadcaster or studio to quickly go, oh my gosh, we potentially have something really interesting around that topic. Let's go find it. And as you guys know, that can be a very daunting task, depending on where their content is and how it's been indexed.
Speaker 2 Yeah. One thing we've seen for many years now is that a lot of our clients' decisions, about their infrastructures and what they're implementing in our world, are driven by engineers.
Speaker 2 44:08 And we're now getting to the point of: there's value behind your content, and these systems can help you get that value out. That is not an engineering decision. We have some clients where we talk to them about these offerings and the engineer in them says, "my content's not worth anything. What do you mean? It's news B-roll." And we kind of roll our eyes at that, because there can be real value in a lot of this content. Do you have some surprise wins, where clients really didn't think their library was going to be all that valuable, and then once you got it out there and did some processing on it, it really blew them away?
Speaker 1 44:44 We have several. Putting their content out there started to expose some of their primary, but also secondary and tertiary, content to a larger audience. It gave them great insight: wow, we really should be producing more of this type of content. They weren't able to envision where the demand curve was until they actually put their content out there. CBS is another great one. We're seeing much higher demand for local content, not just their national classics like CBS News, 60 Minutes, and 48 Hours. Obviously CBS owns a ton of local stations and also has a lot of CBS affiliates, and they're finding, whether it's for crime drama shows or podcast development (crime drama is a huge category right now), a market for a lot of their content that they didn't previously think existed. That's great.
Speaker 2 45:44 Yeah, a great use case there. It's not just getting value out of the content sitting there that you're reselling; it's getting value out of the analysis of it, to determine what other content like it you should make, so you can reach more of your devoted viewers and show them the content they're excited about. A lot of our audience for The Workflow Show, historically, is very engineering-heavy, and engineers get really excited about AI and machine learning and bringing these tools into their environment. It's a long process to get those adopted and to do the integrations for some of those clients. And I guess a big message for those clients is: bring this up the chain. Bring it up to the CFO, and to others managing the business as a whole, when it comes to content decisions. It might meet a metadata-logging need that makes everyone's job a lot easier, but you can also show the value of everything else that comes with it, to make it an easier ask internally at your organization.
Speaker 1 46:46 Yeah. What are you not capturing when you do your logging that could be so valuable, and that could protect the rest of the organization so they don't have to pay for it again? As I like to say, if you're satisfied with the accuracy of the metadata once you've processed it once, that should be extensible throughout the organization, whether it's the marketing department that wants access to that metadata, or the advertising sales group, or the finance group. Right now they're often too siloed. The people who have been the gatekeepers of the historical MAMs and the archives need to know that if they're getting budget allocation for Chesapeake or for Veritone to process their content, they almost have an obligation to ask: what are the use cases we can serve internally with this?
Speaker 1 47:39 Sometimes they're very protective and don't want to get outside their boxes, but I'm sorry: if you're in IT, you can walk in to see the CFO and say, hey, I have a growth opportunity now, meaning this expenditure we just made for operational efficiency and workflow optimization over here can, I think, help increase our advertising yield on the other side of the line. And I think you touched on this earlier: people have to be either strategically or financially minded now. They're not making technology decisions in a silo anymore.
Speaker 2 48:14 Absolutely. Yeah. Ryan, I know you have a background in many different areas of content, starting all the way back in the ad world, where you and your brother Chad got your start, all the way to now, where we're really talking about content creators and supporting content creators. But you've mentioned supporting the police, doing facial recognition, getting into the more investigative side of things, and obviously still ad sales, which is not our world directly. Are there insights from all of these different industries that Veritone has its hands in that you're able to bring into the content creation world for our clientele?
Speaker 1 48:55 Yeah, and it goes both ways. Let me start the other way, and I'll work back toward media and entertainment in a second. It's amazing. We're excited and proud, and I think a lot of our media and entertainment customers should be proud, that candidly, a lot of the intelligence and improvements in our systems that we've had to make to process sports footage and news footage, or audio for commercial broadcast radio, have helped improve our products for catching bad guys. Our systems, our processes, and our engines are better, smarter, faster, and cheaper, and public safety is a direct beneficiary of that, without compromising any of the proprietary data of our media and entertainment customers. That's a great example, because we launched in media and entertainment first and then expanded into public safety and justice a few years later.
Speaker 1 49:49 So really, those newer verticals got the benefit of that access. Going backwards: the one common thesis I've seen in my career is that you really don't know the opportunity until you achieve scale. Scale is such a weird concept, but I'll give you a few specific examples of what I think we're going to start to see as we leverage AI and machine learning more across the organization of a media and entertainment partner or customer. I think it's going to be the secondary big idea, the "oh my gosh, wow, that's there." A good example is Google. The Google search engine was a classic third-generation search engine. We all grew up with WebCrawler and Infoseek and Lycos and all those groups, and boom, Google comes around with, candidly, a slightly different algorithm that included PageRank, and it very quickly became one of the dominant search engines out there.
Speaker 1 50:44 And it was just a search bar, nothing else, just a search bar. Not until they achieved a level of scale did some creative people, and not just engineers, say: hey, we have so much insight about these websites now, and about the web traffic through Google Analytics. What about an ad network? We can build a next-generation distributed ad network, and they built AdSense around that. That's just one example where you didn't even see what ultimately became a roughly $24 billion annual line item for Google, a secondary product around search. I think in content we're going to see the same thing. Hyper-personalization of video and recommendation is kind of the buzzword right now, but I think it's still relatively siloed. We're in the midst of the streaming wars, and candidly, the streaming wars remind me so much of the early days of the web, right?
Speaker 1 51:39 We had walled-garden experiences that were kind of superficial. "This is the internet." Where is it? Is it AOL's version of the internet? Is it CompuServe's version of the internet? I think ultimately it will happen; it's just a question of how long before these siloed barriers are broken down. Whatever we think hyper-personalization of content is today, I think we're in the very first inning of it. Ultimately, when AI gets to a point of cost effectiveness and accuracy, and we have more of a generalized, unified data schema for this content, for audio and video, the shackles are going to come off, and my content experience is going to be programmatically built for me, a hundred percent personalized.
Speaker 1 52:30 When I want to watch content, and I only have 15 minutes in my commute, because my car and my phone know my average commute is 15 to 20 minutes, well, guess what? The content experience that's packaged and delivered to me is going to be 15 to 20 minutes. So with all these little baby steps we're taking here, once we achieve scale, scale of indexes, scale of accuracy of cognition, and scale of content providers, we're going to start to see the next generation of the content experience in the years ahead.
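The commute scenario amounts to packing the most relevant clips into a fixed time budget. Here's a minimal greedy sketch with invented clip data; a production recommender would of course be far more sophisticated than a relevance sort.

```python
def pack_playlist(clips, budget_seconds):
    """Greedily pick clips in descending relevance until the time budget is spent."""
    playlist, remaining = [], budget_seconds
    for clip in sorted(clips, key=lambda c: c["relevance"], reverse=True):
        if clip["duration"] <= remaining:
            playlist.append(clip["title"])
            remaining -= clip["duration"]
    return playlist, budget_seconds - remaining

# Hypothetical clips with durations in seconds and relevance scores.
clips = [
    {"title": "local-news", "duration": 420, "relevance": 0.9},
    {"title": "sports-recap", "duration": 600, "relevance": 0.7},
    {"title": "weather", "duration": 90, "relevance": 0.8},
    {"title": "feature-doc", "duration": 1800, "relevance": 0.95},
]
playlist, used = pack_playlist(clips, budget_seconds=15 * 60)
print(playlist, used)  # → ['local-news', 'weather'] 510
```

Note how the most relevant clip (the 30-minute documentary) is skipped entirely because it can't fit the 15-minute window: the time budget, not raw relevance, shapes the experience.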
Speaker 2 53:05 I think part of that next generation is also what you mentioned earlier about using machine learning to understand more about the content viewers are really watching, which you, as a content creator, can learn from to make more of it. We're seeing examples where it's clear that a lot of these content creators have to start making more; the investors expect to see their investments yield results. One recent example: Netflix just announced that they may start putting out some content week by week, instead of all at once as bingeable content you watch over a weekend. You have to believe these decisions come from merging the business side, deciding how they need to move ahead to create more value as an organization making content and making money, with the information coming from analytics on their viewers' data and how they're watching, and not just their own viewers but their competitors' and other content creators' viewers, really redefining how they bring content to market.
Speaker 1 54:10 Yeah, you're spot on. And I think at times we try to look at content using the same models, when the recommendation engines and the intelligence Netflix can garner from long-form content are so radically different from a YouTube experience, where the average video is three and a half to four minutes long, right? And what do those models look like for the individual? Google obviously owns YouTube, with Susan Wojcicki running it, and they've gotten so good at keeping a child hooked. I have three children, ages 10 to 15, and it's incredible how well they're able to keep a child hooked on the very next video. Right. To your point, they have a lot of transactional touchpoints to act upon quickly. That's right. It reminds me of the video game industry a little bit.
Speaker 1 55:01 I mean, the biggest problem I see with all these streaming groups is that I only have so many hours in the day to consume content, right? I can't possibly consume ten different shows at the same time and obviously still walk outside and have a job. So you have to look at the sheer amount of money some of these big groups are spending, eight, nine, ten billion dollars a year, knowing that only ten percent of it could actually hit. Only a very few companies in the world can do that. But I think that touches your point: if the data is out there, it's going to help us get smarter faster. Amazon obviously has a tremendous amount of scale. If they start to release content at a higher frequency and cadence, it hopefully will give them better insights so they can make more intelligent content investment decisions.
Speaker 2 55:47 Great. Ryan, you've mentioned Chad a couple of times. What is his role in the organization, and how long have you two been in business together as brothers?
Speaker 1 55:55 Glad you asked about that. Well, pretty much right out of school. Chad is two and a half years my senior. We grew up in a very entrepreneurial family; my dad was an entrepreneur, with all different types of businesses as we were growing up, so there's definitely some level of genetics in there. It kind of derailed my personal focus of becoming an orthopedic surgeon. I was a bio and premed major at UCLA. My brother was a mathematician, thinking about going to Princeton, and he decided to go to USC. We met up in the summer after my sophomore year at UCLA and came up with our first idea together, and it had nothing to do with advertising. Just to tell you anecdotally: we were the very first people to port a multiple-listing database of residential real estate properties to the World Wide Web.
Speaker 1 56:47 We did that primarily because my parents were involved at the time in residential real estate, and we were just dumbfounded that these real estate agents were still using a 14.4 modem to dial in to this legacy multiple-listing-service database to pull data. We were like, wow, wouldn't it be killer to extract that and publish it on the World Wide Web with a graphical interface? So that was the initial idea, kind of a brainstorm over the summer, and we got financed. Because we put out kind of a beta, VC firms in the Valley started calling my parents, trying to get in touch with Chad and me, and we ended up deciding to take capital. So the precursor to our ad tech business was actually a residential real estate product, what we know today as Zillow or Trulia, and that's what we built in 1993. We ultimately decided to get out of the real estate business when we developed the ad tech technology, which we thought was going to be a much bigger opportunity for us longer term.
Speaker 1 57:48 So back to Chad and me: we've been working together pretty much nonstop since that point. Over the years our brains have gotten a little more similar in every aspect and train of thought; you could say we're more provincial, or our ideas are not as unique. We don't know if that's necessarily good or bad, but the way we've split up the organizations, going back years now, is that I pretty much handle the business side: sales, marketing, business development. Chad has been a hundred percent IT and tech, and we meet in the middle. We've been pretty collaborative, right down the middle, 50/50 owners in every endeavor, and it's worked so far. We're extremely blessed; it's obviously very rare that we've been able to work together this long and haven't killed each other yet.
Speaker 0 58:38 That's fantastic. Yeah, that's amazing. Thanks for sharing that. Well, with that, we'd like to conclude our episode. I'd like to thank Ryan Steelberg, president of Veritone, for joining us today. Thanks for your time. Yes, thank you, Ryan. Thank you, gentlemen; I appreciate it. And I'd like to thank our CEO, Jason Paquin, for taking time out of his busy day to join us today. Thank you. I would also like to thank my cohost, senior solutions architect Ben Kilburg. Thank you, Ben. And thank you, Jason. The Workflow Show is produced here at Chesapeake Systems, our home in Baltimore, Maryland. It's co-produced by my cohost, Ben Kilburg, and our sales operations manager, Jessica Amantha. And we'd like to thank you so much for listening today. We hope you like listening to the podcast as much as we enjoy creating it. If you have any questions or concerns, or stories about media asset management, storage, infrastructure, or integrations, email
[email protected] or visit our website at <inaudible> dot com. Thanks for listening. I'm Jason Whetstone.