#68 API First Live Video and Media Ingest with Cinedeck

The Workflow Show
February 08, 2022 | 00:39:46
Show Notes

On this episode of The Workflow Show, hosts Jason and Ben welcome Jane Sung, COO; Charles d'Autremont, Product Development; and Ilya Derets, Project Manager, from Cinedeck. They discuss Cinedeck as a live media ingest solution, including the interesting workflows that can be managed with their product and how being API-first differentiates it from other solutions.


Episode Transcript

Speaker 1 00:00:08 This is The Workflow Show, the podcast covering stories about media production technology, from planning to deployment to support and maintenance. We cut through the hype and talk about the nuts and bolts of secure media solutions to offer you some workflow therapy. I'm Jason Whetstone, senior workflow engineer and developer for CHESA, and I'm Ben Kilburg, senior solutions architect at Chesapeake Systems. In this episode, we welcome technologists from Cinedeck, which specializes in live media ingest solutions. Ben and I will discuss how their API-first platform addresses some of the challenges of multi-site live media ingest over long distances. Our guests from Cinedeck are Jane Sung, chief operating officer; Charles d'Autremont, product development; and Ilya Derets, project manager. First, I'd like to remind our listeners to subscribe to The Workflow Show so you know when we have released new episodes; it also helps us know how many of you are listening on a regular basis, so we can keep doing this thing we love doing. Send your feedback or suggestions to workflow [email protected], and you can tweet or hit us up on LinkedIn at The Workflow Show. Now let's get into our discussion with Cinedeck. So let's talk about the nuts and bolts of this a little bit. When you talked about moving this ingest platform into the cloud, or trying it out, what did that look like? Was it basically spinning up some VMs and some sort of compute infrastructure in the cloud? What storage are you recording to, and what does the compute look like, as long as it's not trade secrets or anything, right?

Speaker 2 00:01:42 So actually, when we started thinking about whether we could move into the cloud, we started with the requirements for the system to run our software, and we made a list: we need a GPU, we need a CPU of a certain class, we need so much RAM, and we need a particular operating system. Then we started going to the cloud providers to see what they could provide, and found that AWS actually offers instances with GeForce video cards on board, which we can easily use for Cinedeck, since the GPU is one of our major requirements. And there's a really great CPU there, a Xeon with enough computational power. So that's how it started.

Speaker 1 00:02:41 So in terms of the compute, it sounds like it was a pretty academic exercise: what would we spec out if we were building a bare-metal system, and how does that translate to cloud requirements? Okay, what about storage?

Speaker 3 00:02:54 Well, for recording, whenever we spin up an AMI instance, we attach an EBS volume, so it's recording to the EBS volume. Then once the recording is over, it can be moved over to whatever cloud storage you have, S3 or whatever.

Speaker 1 00:03:13 Gotcha. So we should define AMI as the Amazon Machine Image, and EBS as Elastic Block Store (not the "blastic luck story," as my dyslexic brain wants to spit out), which is direct-attached, usually SSD, storage that you record to directly from the instance.

Speaker 3 00:03:38 And using that kind of storage allows us to have workflows like edit-while-capture in the cloud. There are a lot of other interesting workflows that came out of being able to put our product in the cloud.
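To make the setup described above concrete (an instance launched from an AMI, recording to an attached EBS volume, with finished files pushed to S3), here is a minimal boto3 sketch. The AMI ID, instance type, device name, paths, and bucket are placeholders, not Cinedeck's actual deployment:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a GPU instance from a (placeholder) ingest AMI, attaching an
# extra EBS volume to act as the recording scratch disk.
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",    # placeholder AMI ID
    InstanceType="g4dn.xlarge",         # assumption: stands in for "GPU plus Xeon-class CPU"
    MinCount=1, MaxCount=1,
    BlockDeviceMappings=[{
        "DeviceName": "/dev/sdf",
        "Ebs": {"VolumeSize": 500, "VolumeType": "gp3",
                "DeleteOnTermination": False},   # keep recordings if the instance dies
    }],
)
print("recording instance:", resp["Instances"][0]["InstanceId"])

# Later, on the instance itself, a finished recording moves off the EBS
# volume to object storage, as described in the conversation.
s3 = boto3.client("s3")
s3.upload_file("/mnt/record/show_cam1.mxf",   # hypothetical path on the EBS volume
               "my-ingest-bucket",            # placeholder bucket
               "captures/show_cam1.mxf")
```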
Let's talk about that. Yeah. We have a partnership with a company called LucidLink, who I think you guys, Speaker 1 00:04:00, know, we're very familiar with them. Speaker 3 00:04:03 You can actually just load LucidLink on that same image. Got it. You can write a proxy file, because what Cinedeck does, you know, we can record hi-res and low-res, but you can write that H.264 proxy file to the LucidLink drive and then have your editors editing locally from that same LucidLink share. So that's really cool. Then you're not having your editors spin up instances of remote desktop or whatever; they can just edit locally, using LucidLink as the sort of bridge between what's being ingested in the cloud and what they have access to locally.

Speaker 1 00:04:48 Yeah. Just as an aside, LucidLink is a technology that we have talked about on the show before, and we are going to be having some folks from LucidLink on the show in a future episode, probably very soon. Just to throw that out there, in case anyone listening is not familiar with LucidLink: basically, what we're talking about is the ability to have things in the cloud that can be cached down to your local system, giving you the ability to use them as if they were local. As I kind of explained it, that seems like it would be perfect for what we're talking about here.

Speaker 3 00:05:23 You know, I think the idea of having remote workflows is so important now because of COVID; there's just been a mass migration to people wanting a cloud solution. Maybe they're not ready for it right now, but I think people are more seriously thinking about, well, how do I actually move this production into the cloud? For a live event, for instance, you could just go with your encoders. We're also working with Videon, so you can go to a live event with a bunch of Videon encoders, stream live to a Cinedeck instance in the cloud, and record XDCAM files. I'm actually talking about an upcoming POC for a live event. They're going to be recording two stages, five cameras per stage. The person operating Cinedeck is going to be in LA; the live event is happening in Atlanta. The editors are going to be everywhere. And basically, by capturing all of those XDCAM files directly to the cloud, they just have a collaborative distributed workflow. Like, boom, that's it. They don't need to do anything else.

Speaker 4 00:06:38 Let's break that down. So we've got our encoders on-prem, sending a stream of HD or 4K or whatever the specific resolution might be, Speaker 3 00:06:50 as HEVC, Speaker 4 00:06:53 okay, which is High Efficiency Video Coding, right? The data rate is a little bit lower and the quality is maybe a little bit higher than H.264. So that's what you guys are receiving on the Cinedeck ingest server side. But once you get that, you're flipping it over to an edit codec that gets stored on the EBS storage, right?

Speaker 3 00:07:17 Right. So they're asking us to create XDCAM files, XDCAM 50, from that H.265, encoding it live with matching timecode across all of the cameras (stage one has one timecode, et cetera) and all matching metadata, so that when the editor gets the files, they can do an edit-while-capture multicam edit while the event is happening.
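A rough sketch of the proxy-to-LucidLink pattern just described: transcode an incoming feed to an H.264 proxy written straight to a mounted LucidLink volume, where editors see it as a growing file. The SRT URL and paths are hypothetical, and this assumes an ffmpeg build with SRT support:

```python
import subprocess

# Hypothetical SRT listener input and a LucidLink volume mounted as an
# ordinary filesystem path.
SRT_IN = "srt://0.0.0.0:9000?mode=listener"
PROXY_OUT = "/Volumes/lucidlink/proxies/stage1_cam1_proxy.mp4"

subprocess.run([
    "ffmpeg", "-i", SRT_IN,
    "-c:v", "libx264", "-preset", "veryfast", "-b:v", "8M",
    "-c:a", "aac", "-b:a", "192k",
    # Fragmented MP4 so the file stays readable while it is still growing.
    "-movflags", "frag_keyframe+empty_moov",
    PROXY_OUT,
], check=True)
```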
And the editor is not anywhere near any of it. What's cool about this POC is that we're working with a company called Envoy, who is helping with the streaming side; we're actually partnering with them for this particular POC. And he was saying, before this workflow, they would set up an entire video village. They would rent out a couple of hotel rooms for a week or two before the event.

Speaker 3 00:08:21 And there would be drives, just a lot of messy drives and sneakernetting and all of that kind of stuff. There's more room for error that way, you know? So this is a super streamlined workflow that gets your media directly into the cloud. Your editors could literally be anywhere, editing with Premiere and creating content on the fly. And what's key is that you continue to use the workflow you know. I'm sure you guys have worked with a lot of editors; we have too, at Cinedeck. Editors like to work the way they like to work. They want the format that they want, and they don't want anything else. So to be able to deliver edit-ready content while the event is still happening, that's kind of cool. And literally the production team is going there with a couple of cameras and Videon encoder boxes, and that's it.

Speaker 4 00:09:22 For this POC, is there any backup happening, any confidence recording on site, just in case the network has issues?

Speaker 3 00:09:30 Yeah, I'm not a hundred percent sure. I'm on the Cinedeck side, so I don't know what they're doing in terms of that. I did hear that they're floating around Blackmagic recorders, the HyperDecks, as the backup record. Got it.

Speaker 4 00:09:47 Right. So there might be two SDI outs through a switcher from the cameras, one going to a HyperDeck and the other going to the encoder, so at least they've got a safety copy that's redundant if the network has an issue. Though these days, with NDI and SRT, we've got some pretty decent bulwarks against network goblins, especially with SRT. First off, SRT is the Secure Reliable Transport mechanism, right? I think it's something Haivision came up with and then made public for a lot of folks. Nothing to do with the subtitle files, right? Exactly. What's cool about it is that, just like file acceleration software, it uses UDP to make sure the packets of data are going as fast as they can across the wire.

Speaker 4 00:10:38 One of the things it does to make sure the packets get there is keep a buffer on the sending and receiving sides. On the receiving side, if things aren't in the right order, or if it detects that it's missing a packet, it can say, hey, send me that one again. And because there's a little bit of buffer time, it can stick that one packet back in order again, and then, boom, you've got the stream just as you would expect it to be, and the network goblins have been mitigated by your long sword of SRT power.

Speaker 3 00:11:10 We've had to do a little bit of optimization on our side to receive those SRT packets. Do you want to speak more to that?

Speaker 2 00:11:20 Yeah. Actually, SRT was more straightforward than NDI; SRT is a format really created for streaming over the internet. That's totally true.
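Ben's description of SRT's loss recovery (UDP underneath, with a receive-side buffer that spots gaps, asks the sender to resend, and reorders packets before handing them downstream) can be modeled in a few lines. This is a toy illustration of the idea, not the actual libsrt implementation:

```python
class SrtLikeReceiveBuffer:
    """Toy model of SRT-style loss recovery: hold out-of-order packets,
    request retransmission of missing sequence numbers, and release
    packets downstream only once they are contiguous."""

    def __init__(self, request_retransmit):
        self.expected = 0            # next sequence number owed downstream
        self.pending = {}            # seq -> payload, parked out of order
        self.request_retransmit = request_retransmit

    def on_packet(self, seq, payload):
        delivered = []
        if seq < self.expected:
            return delivered         # duplicate of an already-delivered packet
        self.pending[seq] = payload
        # Gap detected: NAK every sequence number still missing below `seq`.
        for missing in range(self.expected, seq):
            if missing not in self.pending:
                self.request_retransmit(missing)
        # Hand off the contiguous run starting at `expected`, in order.
        while self.expected in self.pending:
            delivered.append(self.pending.pop(self.expected))
            self.expected += 1
        return delivered

# Packet 0 is released immediately; when 2 arrives before 1, the buffer
# NAKs 1 and parks 2; once 1 is resent, 1 and 2 are released in order.
buf = SrtLikeReceiveBuffer(lambda seq: print(f"NAK: resend packet {seq}"))
for seq, data in [(0, b"p0"), (2, b"p2"), (1, b"p1")]:
    print(buf.on_packet(seq, data))
```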
NDI is less suited to that, let's say.

Speaker 4 00:11:37 Okay, that makes sense. And the big benefit of both, obviously, is that they go over Ethernet: we're transporting things over the network, and that's what they're built to do. But one is built for the internet, and the other is essentially built for networks in general. So what did you guys find you needed to do around long-haul transmission of NDI to make it play nice?

Speaker 1 00:12:02 That's a good question, Ben.

Speaker 2 00:12:04 The main thing was synchronization between the audio and video in NDI. Unfortunately, they only provided that audio synchronization in the latest version of the SDK, which came out a few months ago, and we were already in the middle of resolving that issue ourselves. That was the major problem we faced with NDI, but once it was resolved, it works pretty nicely.

Speaker 4 00:12:29 Got it. So it was more of a software fix for NDI in general that made it play a little bit nicer.

Speaker 2 00:12:35 Yeah. The issue was around audio: sometimes it was not synchronized with the video stream, and it could be off by 500 milliseconds or so, which is quite significant.

Speaker 4 00:12:48 And when you're doing something multicam, that's really important, for sure.

Speaker 5 00:12:52 Yeah. NDI is really optimized for monitoring; in real life, it's a monitoring mechanism. The reason they transport audio and video separately is that they drop video frames when necessary, but they don't drop audio. So while that's nice in a monitoring environment, it's problematic when you want to record every frame coming out of a source, let's say.

Speaker 1 00:13:18 Great, that's really interesting. I wondered about this before, because that's the way I always thought of NDI, as more of a monitoring thing, but I did notice it had been used in production for straight transmission of signals and many other uses. That often can be challenging for folks like us, Charles. It's like when we start adopting some of the highly compressed codecs that some of these cameras shoot in: they're mainly designed to be distribution formats, and then we find cameras shooting in those formats, which can introduce problems when we actually try to use the material. So in this case, you see what I'm saying, you're trying to use NDI as an actual transport mechanism, and it's not really designed for that, so you have to work around all of its intricacies to make sure it's reliable.

Speaker 5 00:14:09 And that's true of HEVC: it's a distribution format, but because it's so space-efficient, the camera makers all jumped onto it the same way they did with H.264. That's great for storage cards and such, but it's terrible for editing.

Speaker 1 00:14:28 Exactly. It doesn't make it any easier for us, the tech folks behind everything, when Premiere says, oh, it doesn't matter what kind of footage you have, just suck it into Premiere and we can edit with it, when it comes down to managing those files and trying to put together a system that is responsive.

Speaker 5 00:14:46 I mean, that's kind of true.
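Ilya's NDI sync problem boils down to this: audio and video arrive as separate streams, so the recorder has to pair them by timestamp before writing a file. A toy sketch of that pairing, with made-up queues and a tolerance of half a frame (not NDI SDK code):

```python
from collections import deque

FRAME_DUR = 1 / 29.97  # seconds per video frame at 29.97 fps (example rate)

def pair_av(video_q: deque, audio_q: deque, max_skew: float = FRAME_DUR / 2):
    """Toy A/V aligner: yield (video, audio) pairs whose timestamps agree
    to within half a frame. Queues hold (timestamp_seconds, data) tuples."""
    while video_q and audio_q:
        vt, vframe = video_q[0]
        at, achunk = audio_q[0]
        if abs(vt - at) <= max_skew:
            video_q.popleft()
            audio_q.popleft()
            yield vframe, achunk
        elif at < vt:
            audio_q.popleft()   # audio predates the next frame: skip ahead
        else:
            video_q.popleft()   # no audio near this frame: skip it (toy behavior)
```

A real recorder would do something smarter than dropping unmatched data, but the 500 ms audio skew the guests mention is exactly what a tolerance check like this would flag.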
And NewTek would have you believe that everybody's happy with their recorded files too, but editors, just like Jane was saying, just don't buy that. They may not have a good reason for it; it may simply be a word on a piece of paper they know, or it could be superstition. It could be any number of things, but people like what they like, and if it's not that, it's a very, very uphill battle to get somebody to change.

Speaker 4 00:15:18 There's the difference between long-GOP codecs, which have higher compression rates but still look fantastic, and I-frame codecs, which might be a little weightier but have each individual frame segmented by itself in time, with the compression happening within the frame. The latter is certainly less taxing on the editorial process. So as an editor, I get why they'd want to use something like ProRes on a regular basis: they just know it's going to work.

Speaker 5 00:15:54 Well, sure. And even XAVC is an I-frame codec, but the camera variants aren't, so XAVC that's coming off a camera is long-GOP. And HEVC in particular bogs down any kind of editing system very, very quickly if you get multiple layers, because it's so compute-intensive to decode. So yeah, that's absolutely true. An editor will have a much better experience just using some kind of I-frame codec.

Speaker 4 00:16:23 But that's why we have fine folks like Cinedeck who can come in, take those long-GOP streams, record them as I-frame codecs, and present them out, even transcoding them on the fly. So if we wanted something like ProRes Proxy, you could spit it into LucidLink, and our friends in the edit bays, or we should say on their couches or in their home offices, can grab the files and the bits of information as they're growing during capture. Cinedeck can spit that out to LucidLink, and an editor at home could have that growing file in their timeline, chewing it up and doing the cool things they need to do with it, right?

Speaker 2 00:17:05 Yep. And we even tested recording ProRes HQ in full HD, 1080p, directly to LucidLink, and there was almost no delay: something like a five- or six-frame delay between the recording and playback on the other end. So it's really close to real time.

Speaker 1 00:17:25 Yeah, that's great. One of the things I want to transition to here, and we don't have to spend too much time on it, but I just want to call it out, CHESA being very much about partnerships between organizations, sometimes organizations with completely different end games: it strikes me that the success of what you guys are doing right now is really based on those partnerships. You mentioned LucidLink; of course, that's a big one, and this technology seems very heavily based around it, so it's great that the partnership is there. Jane, you also mentioned Envoy for this particular POC. Let's spend a little time focusing on the role of partnerships. I think it's great that you're able to develop them. We do the same thing every day in what we do, but being able to have discussions like this is great, because it makes all the light bulbs go off over our heads.

Speaker 3 00:18:12 Yeah. Yeah.
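Before the conversation turns to partnerships: the on-the-fly conversion just described (long-GOP HEVC in, I-frame edit codec out) can be sketched with ffmpeg, which ships a ProRes encoder. The file names are invented, and this stands in for, rather than reproduces, Cinedeck's internal transcode:

```python
import subprocess

SRC = "stage1_cam1_hevc.mp4"            # hypothetical long-GOP HEVC source
DST = "stage1_cam1_prores_proxy.mov"    # I-frame edit codec out

subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "prores_ks",     # ffmpeg's ProRes encoder
    "-profile:v", "0",       # 0 = ProRes Proxy; 3 would be ProRes HQ
    "-c:a", "pcm_s16le",     # uncompressed audio, edit-friendly
    DST,
], check=True)
```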
So, on that note, we're working with LucidLink: we're actually doing a webinar with them and a company called Dixon Sports, who does live logging. It's going to be an edit-while-capture with live logging, with Premiere first, and then we'll also do it with Avid workflows. Everything's going to be in the cloud. The idea is that we'll be capturing in the cloud, somebody will be live-logging against the proxy file that's being written to the LucidLink drive, and then the editor will be able to get the logging metadata, the tags, and drag and drop those logged H.264 files into their timeline so they can start editing right away. Once you start putting everything in the cloud, a gazillion ideas just start sparking off, right? You're like, oh, we can do this.

Speaker 3 00:19:08 Now this editor can be here, and now we can include this. There are just so many possibilities. Another thing about partnership is: okay, great, we have this product where you can record whatever you want into the cloud and have this whole production workflow, but how do you get the signal into the cloud? That, to us, was an important thing to address right off the bat with our capture-to-the-cloud product. We've already worked with Videon, who is our partner, to get the signal into the cloud. I know we're going to be testing with Makito pretty soon. And apparently NewTek has something called NewTek Bridge, which I think is basically NDI wrapped in an SRT wrapper; apparently there's an app that runs on the receiving end and one that runs on the sending end. But having a reliable partner, a partner where we can say, okay, when you want to capture something into the cloud, this is what you use,

Speaker 3 00:20:12 that's important for us, to be able to say that and stand behind it. We also use Larix. I don't know if you know what that is, but it's an app that you use on your Android or iPhone, and it streams SRT. It's amazing, and it's free. Actually, when we put our first instance of Cinedeck in the cloud, that's what we used, because we were like, what can we use? We just didn't have partnerships at that point. We were speaking with a reseller in the UK, and he said, why don't you try Larix? So we did, and we all downloaded it on our phones, and the first internal demo we had of Cinedeck recording SRT in the cloud was just our developer with his phone pointed at himself, recording into the cloud. Very cool.

Speaker 3 00:21:05 Oh, that's great. Yeah. So, that being said, we're also looking to the future with this cloud capture solution, because we realize that video is sort of everywhere. Everybody needs to create video, small or big, no matter where you are; people are creating a ton of content. Our next step is to have an even easier way to capture directly in the cloud. So even though our new 2.0 client is brand new, and people like it, and it's been successfully used already, we're looking to make it even simpler with a one-button record. You could literally set up three iPhones or three Android phones with Larix, say for a home cooking show with multiple angles, and have a one-button multicam experience for just a creator.
And I think it's important for higher ed and a lot of other applications where you don't have technical people, but they need quality content recorded. How do you address that market?

Speaker 1 00:22:21 I would even say some very technical people who just aren't video people might need such an easy user interface. One of the use cases I can think of is YouTube influencers, for example: some of them have the wherewithal to put a system like that together, but they're not necessarily video editors. So that sounds like a great use case.

Speaker 3 00:22:44 And all on demand, right? So it's on a per-use basis. We haven't nailed down the exact pricing, but it'll be extremely affordable, basically. So, just to be able to drag and drop all of your camera files into your bin and have everything lined up so you can quickly do a multicam edit, because frankly, a multicam edit is what we're used to seeing now. Charles and I were kind of looking at the validity of editing multicam, even as a creator. If you look at the sitcoms from the seventies and now (I think there was a story that came out a while ago about this), basically, in the seventies you had like seven to ten cuts per scene. Now you have like 300 cuts per scene. That's a normal multicam edit now, and people are used to it.

Speaker 5 00:23:39 If you look at movies from the sixties, they'll have a shot that's a minute long. What have you watched that was made in the last 25 years that has a shot that's even 10 or 15 seconds long?

Speaker 4 00:23:53 Right? I've seen films that look like they're a continuous shot all the way through.

Speaker 3 00:23:59 Very crafty editing, that.

Speaker 5 00:24:00 But that's a thing, right? That itself is a thing. For 99.9% of the world, a two- or three-second shot is a long shot. It's constantly like that. The RuPaul's Drag Race show we were talking about is a case in point: there's probably not a shot longer than maybe a second and a half in the entire show, except maybe some of the stage performances. Other than that, they're always cutting it so that every person who's talking, you move to them. You never hear people talk off-camera.

Speaker 4 00:24:34 Right.

Speaker 3 00:24:36 And multicam, frankly, is a really affordable way to get higher-quality content, because you're not having to re-set up for every single shot. As a YouTuber, if you're doing an exercise video or a cooking show or anything instructional: me, as a viewer, I want to be able to see more than one shot, because I want to look all the way around, and because that's how you learn. So being able to create content quickly, by having all of those multiple camera angles saved and ready to go, is important. That's what we're planning on doing next: making more of a platform-based record solution that is super easy to use, literally a one-button setup. I'm really excited about that.

Speaker 4 00:25:34 Yeah, that sounds exciting.
Now, speaking of synchronization: say somebody does have some ingest servers. Maybe they've been a long-time customer and they've got some ZX85s sitting around, but they also want to do some cloud stuff simultaneously. Maybe they bring one ZX on site and plan to do some confidence recording with that, but they also want a replicated workflow into the cloud, with some encoders along as well. You can control all of that from the same interface and sync everything up simultaneously, right?

Speaker 5 00:26:11 Yeah. Our client runs really anywhere, so any device that has a public-facing IP address can be controlled, and we've done that. I was at a trade show in Oklahoma, and I had the client on my laptop, on the hotel wifi, controlling a machine in Ukraine, one in Biggin Hill in the UK, which is Formula 1, and one in my garage-slash-machine-room here. So you can connect to and control any machine that's connected to the internet in some way. Awesome.

Speaker 2 00:26:48 Yeah, because from the client application's standpoint, there is no difference where the Cinedeck server application actually is; it uses the same API for any instance.

Speaker 4 00:26:59 Gotcha. So if I were a broadcaster and I wanted to use multiple Cinedecks and have a centralized control room, probably a whole lot less expensive than traditional broadcast transmission lines, with everything going through the interwebs, or a combination of internet and maybe some more traditional stuff, I could control all of that from one application. That's kind of cool.

Speaker 3 00:27:28 Yeah. You can definitely control on-prem Cinedecks, like you were saying, and you can control Cinedeck instances in AWS. We were even talking with somebody who wanted to run Cinedeck on AWS Outposts; I don't know if you know that product, but it's AWS's hybrid solution. So you can literally go to AWS Marketplace, purchase a Cinedeck AMI to run on Outposts, control that locally, record, and push to the cloud. You can also just have Cinedeck running in the cloud and control it with the one user interface. I think we at Cinedeck were surprised at how many different applications there were for the new client; we didn't envision it. It's a lot like NDI, where maybe NewTek thought, oh, it'll just be for monitoring, and then it morphed into all these other things, because we in this broadcast space are always under the gun to find new solutions to problems, just taking what's out there and trying to make it work. So even for us, when somebody asks, oh, can you do this? We're like, yeah, I guess we can. We hadn't thought about it; it's not a solution we advertise, but yeah, we can do it. So it's always fun. When I heard what you described, I was like, yeah, yeah, we can do that.

Speaker 4 00:29:12 Yeah. And that new client application runs as an independent piece of software on a Mac or PC, right? So there isn't a specific web user interface that people necessarily need to log into; it's more of a piece of software that they run to control all of the incoming signals.

Speaker 3 00:29:33 Yeah, that's correct.
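Because, as the guests describe, the client speaks the same API to every instance, a centralized controller is essentially a loop over hosts. The endpoint, payload, and host list below are invented for illustration; this is not Cinedeck's published API:

```python
import requests

# Hypothetical fleet: one on-prem ZX and two cloud instances, all
# reachable over IP and all speaking the same (invented) API.
HOSTS = ["10.0.1.20", "54.210.8.15", "54.210.8.16"]

def start_record_everywhere(clip_name: str) -> None:
    """POST the same start-record call to every host, applying one
    naming convention so the files line up in the edit later."""
    for cam, host in enumerate(HOSTS, start=1):
        resp = requests.post(
            f"http://{host}/api/record/start",          # invented endpoint
            json={"filename": f"{clip_name}_cam{cam:02d}"},
            timeout=5,
        )
        resp.raise_for_status()
        print(f"{host}: rolling on {clip_name}_cam{cam:02d}")

start_record_everywhere("atlanta_stage1")
```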
But I think when we have this new product, the one-button multicam, that's going to be a web-based application that you log into. And you'll be able to use that new UI with the existing product as well; it's just the client. It's a question of how you want it to look and how you want to operate it: do you want the super simple mode, where it's literally one button, or do you want a little more sophistication? We want to offer varying levels, because we've realized that not every user is the same. You can publish a knowledge base, a user manual, and videos, and some people just won't read them. You have to offer varying levels of accessibility to the end users,

Speaker 1 00:30:27 or somebody else will.

Speaker 3 00:30:29 And you can do that with the API too. You were saying, Jason, that you really love the idea of an API, and frankly, you can do exactly that. Say somebody is using a Reach Engine or any MAM, and you say, you know what, my client loves their MAM and that's all they want to see: you can do something very simple with our API.

Speaker 1 00:30:53 Okay. Another thing that really strikes me here is that you can spin up these cloud record instances on demand. That means you don't necessarily have to invest upfront in a huge infrastructure, right?

Speaker 5 00:31:06 Right, and even if you have some infrastructure already, if you need more, you can spin up those instances. We have a number of customers who have studio installations, but the number of cameras goes up and down depending on the production. In the past, they would have to rent or buy more gear; now, if they get a show that needs 56 GoPros, they can run those instances just for that show.

Speaker 1 00:31:34 Gotcha. Okay, that's great. That gives you the ability to scale up and down. You might make a baseline investment based on, you know, what's the minimum viable product here in our studio, and then spin up record instances on demand as needed.

Speaker 4 00:31:48 Yeah, exactly. And I can imagine, too, for more event-driven production companies who know they might do six big events a year: rather than shelling out a hundred thousand dollars upfront for a system, they might get away with a whole lot less just by spinning up the number of channels they need on a per-instance or per-event basis, and get all of the functionality, perhaps even improved functionality, as time goes along.

Speaker 2 00:32:17 And there are always the last-minute changes. Someone decides, we need to put one extra camera here, it would be really great for you. Previously you were limited by the number of channels you had, but here you can just spin up another one.

Speaker 3 00:32:33 Yeah, that's great. It just means you can move your production budget, as needed, to OPEX.

Speaker 1 00:32:41 That's a great way to put it. That's the whole financial model of the cloud, really, which we've talked about many times: you can move that investment. And this is just furthering that capability.
Speaker 4 00:32:53 What about other integration touchpoints? We've talked a little bit about the API and what we can do in terms of control, but you guys do some interesting things with the API where, say, we want very specific file names: we could send those file names to Cinedeck and say, for this upcoming record, use this naming convention and name the files this way. And even if people were tagging things from the user interface, we could send things back to, say, a media asset management or digital asset management system via XML, so that metadata somebody just clicked in to add on whatever segment was being recorded in the Cinedeck 2.0 UI gets back into a MAM really easily, for instance. Right?

Speaker 2 00:33:44 Yes, definitely, we have that. We have real-time notifications about what happens on the server: you can be connected to the server with a WebSocket and get all the real-time updates. So if someone connects from the client and updates something, you immediately get a notification that something was updated, and you can check whether it's aligned with what you need or not. If not, you can just roll back the changes, or live with them. Cool.

Speaker 1 00:34:12 Great. Yeah, the ability to hook into WebSockets gives us the ability to trigger changes in the MAM when changes are made in the external platform, which is great.

Speaker 4 00:34:24 Jason, tell me more about WebSockets.

Speaker 1 00:34:26 Well, it's kind of based around the idea of a notification system. Let's back up a little. When you have a platform with an API, when you as the user do something, a call goes from your web browser to the platform to tell it to do something. That's great, but it's a call-and-response system: you make a request to the server, there's a response to that request, and sometimes that's the only communication that happens. Then there's the communication that comes from the server to you or to the platform, notifying you that something happened, some sort of event. So let's take the idea of a transcoder. A transcoder has finished transcoding a file, and there are a couple of ways of knowing when that file is done.

Speaker 1 00:35:10 We could sit there and keep polling the API, say every two seconds. Well, now you've got a process firing an API call every two seconds, and the response may just be, "I'm not done yet." That's a lot of network chatter, and it's not the most efficient way of dealing with things; you might have hundreds of these processes going at once, so all that chatter is just not great. A WebSocket system instead gives us the ability to hook into events as they happen. The transcoder could say, I'm done with this file; an event would then be emitted from the server saying this file is done, and whoever is listening on the WebSocket reacts. In some systems, you can even filter: we only care about updates, or only about new records, or only about this type of job. So in some systems there's the ability to filter what we're listening for, but basically we're listening for changes and reacting to those changes, rather than a user pressing a button and sending the call. It's a way of reacting to changes.
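Jason's polling-versus-WebSockets point maps directly to code: instead of asking "are you done yet?" every two seconds, subscribe once and react to events as the server emits them. A minimal sketch using the Python websockets library, with an invented URL and message shape:

```python
import asyncio
import json
import websockets  # pip install websockets

async def watch_events():
    # Invented endpoint and event schema; a real server documents its own.
    async with websockets.connect("ws://ingest-server:8080/events") as ws:
        async for raw in ws:
            event = json.loads(raw)
            # React to pushed events instead of polling on a timer.
            if event.get("type") == "transcode_complete":
                print(f"file ready: {event.get('file')}")
                # ...update the MAM, kick off the next workflow step, etc.

asyncio.run(watch_events())
```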
Speaker 3 00:36:13 Speaking of things other than WebSockets, since you just explained those: one of the things we can do with MAMs is create custom XMLs. I think in the government space that's especially important, because there might be telemetry information, that kind of stuff, that's really critical for government and research work. There are a lot of things Cinedeck does that we don't talk about actively all the time, because they're really specific to a particular deployment, but adding that kind of telemetry information, and supporting things like IRIG timecode, the military-based timecode, those are all things that are really important for some of our customers. We have a lot of government customers, and that's what they use; that's why they found Cinedeck. We actually have a customer, Ed. Charles, maybe you want to tell the story of Ed, because he does something really cool with the captions.

Speaker 5 00:37:14 Yeah, right. So Ed, I think, is an integrator for underwater exploration.

Speaker 3 00:37:23 Yes.

Speaker 5 00:37:24 A very specific video niche. He does ROVs, usually on tender ships, and that's where the Cinedecks are installed. And he did a cool thing with captions inserters: they take the temperature, salinity, and other types of sensor data and push that to a captions inserter, and then we grab the captions, this awesome data, from the video stream with the inserted telemetry and write it to a QuickTime file, which then has all the relevant telemetry, with very little latency, inside the file.

Speaker 3 00:38:10 And it's human-readable, right? So you can turn it on and...

Speaker 5 00:38:14 Yeah, you can turn it on and off in your QuickTime player or whatever. It used to be that those things would come in through various types of data capture and have to be correlated and timestamped somehow to match; now it all just comes together. We're actually working with him to remove the captions inserter from the process, because Cinedeck runs on a PC, and a PC can take in data in all kinds of different ways, so we can do that directly without the inserter.

Speaker 3 00:38:44 I'm always floored, again and again, by what people are able to do with the stuff that's out there.

Speaker 1 00:38:51 Give a determined, tenacious person who is passionate about their field a platform that's extensible, and you'll find they will create something really, really cool. Well, I think that's a great place to wrap, guys. I want to thank you all for your time: Jane Sung, COO of Cinedeck; Ilya Derets, project manager for Cinedeck; and Charles d'Autremont, product development for Cinedeck. Thanks for your time, guys. And thank you, listeners. The Workflow Show is a production of CHESA and More Banana Productions. Original music is composed and produced by Ben Kilburg. Again, subscribe to The Workflow Show, and reach out to us on LinkedIn and Twitter at The Workflow Show. Thanks for listening. I'm Jason Whetstone.

Other Episodes

#66 Virtual Reality in Media and Entertainment with Lucas Wilson, CEO of SuperSphere
February 10, 2022 | 00:59:53

This week on The Workflow Show, Jason and Ben are discussing virtual reality with Lucas Wilson of Supersphere. Our hosts ask Lucas about the...

#13 "The New Frontier for Audio and Video Search"
April 03, 2013 | 00:29:42

In this episode, Nick and Merrel discuss the jaw-dropping audio and video search capabilities offered by two companies, Nexidia and Nervve Technologies. Keep in...

#23 MAM Project Planning & Integration
July 02, 2014 | 01:21:04

This episode of The Workflow Show is a recording of the opening panel discussion at our MAM Project Planning and Integration Symposium held in...