Speaker 0 00:00 On our last segment of The Workflow Show, we talked about some of the things we're doing, uh, CHESA and Adobe and BeBop and all of these great things we're doing in the industry to enable us to work wherever we are, whether that be at home or at your parents' house or wherever you might be during the, uh, COVID-19 pandemic. Uh, so now what we'd like to do is shift a little bit. We talked about, uh, Team Projects. Uh, we talked about the benefits of using cloud collaboration tools like Frame.io, Iconik, things like that. Now we're going to talk about shared projects, how they differ from Team Projects, and then we're going to get into how Adobe integrates with MAM (media asset management) platforms. Uh, maybe some discussion about AI, artificial intelligence, and metadata. And finally, we're going to talk about some roadmap items with Dave Helmly of Adobe. So Dave, let's start with this question. We talked about Team Projects. Let's talk about shared projects. How are they different?
Speaker 1 00:57 Yeah, so shared projects will be released very soon. On the last episode, we talked a little bit about, uh, this new Creative Cloud feature we have, uh, the beta program, which allows people to start testing out, um, you know, beta features that we're working on. One of them that's just about ready to come out is called shared projects. And we battle-tested it, um, which is actually a great term, considering we used the folks on the Terminator team at Blur Studio. Our team worked on, uh, finishing all the different features and getting editor acceptance around shared projects. So what is a shared project? For those used to traditional editing systems like Avid: Avid uses, you know, what's known as bins. A bin on an Avid system is equivalent to a Premiere Pro project, a .prproj file.
Speaker 1 01:50 And once you view it that way, your brain kind of gets around it. But if you come from that Avid world, you might look at this and go, well, I just want an Explorer-style view of all of my bins, AKA projects in Premiere, and I don't want all of, you know, the media and all this other minutiae to get in the way of what's actually going on inside these container files, which really are almost like folders, if you will, almost what bins are. So I can look in there and say, oh, well, that was the day-one edit with the RED cameras, and here's how this is organized. So there are ways now, using shared projects, where we can give you that same sort of experience: I can view a master project and all of my projects under it.
Speaker 1 02:41 And just that view lets me manage what projects are coming and going. I don't get any, um, repetitive media, uh, trying to load, because it's just project data. Um, and it's really done well. I do have a couple of people working on videos. Um, I'll give a shout-out to, uh, Colin Smith, who worked for me for many, many years, is now retired, and has a YouTube channel called VideoRevealed, highly recommended. Colin will be working on some shared-projects videos. And I've asked Colin to not only go down this path of how maybe an Avid editor who also has to use Premiere... you know, we're doing a lot of work with Avid these days. Another great company. They've got pretty awesome storage and awesome MAMs, uh, wanting to connect to Premiere, and they've got some great solutions for that. So we're trying to give Avid editors a way to look at Premiere for what it can do in addition to the tools they already have.
Speaker 1 03:35 But let's not forget the creator. You know, we've got multiple teams in a building that may be just, you know, traditional Adobe or traditional Final Cut, uh, editors, and this is just the way they've worked forever, but they have a need to share projects. And so we have to sort of attack this, um, tutorial, if you will, from a creator standpoint. So you'll be seeing some videos that won't really present it as a structured workflow like on an Avid; they'll feature it as "how can creators use this new workflow." So that'll be coming, and it is blazingly fast. We can load movie reels in no time flat. Very happy, uh, to see this finally coming out. That's great. I think that's something that's going to be very welcome to our listeners. We have a lot of very highly technical listeners, and, you know, I like to make sure that we're speaking to everyone.
Speaker 1 04:26 And there are videos on it now, by the way; we've released a few on shared projects. And just to give a shout-out to our partners: SNS released a video of their storage using shared projects in a collaborative workflow, uh, which I thought was pretty cool. So a lot of the vendors are starting to come out with that, um, on the storage side, to really talk about how we get people working together. And by the way, it does not require the cloud to edit. That's the main difference: Team Projects bounces project changes up and down the cloud, while shared projects live in a shared local environment. Because Hollywood, as you guys know, uh, which is where this was originally designed for, is locked down. Those machines are unplugged from the internet. They actually have internet police that go into these studios, and when our team worked on Deadpool, um, you know, we had to make sure all these machines were disconnected from the internet.
Speaker 1 05:16 And so this is completely designed for local networks. Um, and I will say, for people going down this path, it really helps to have an integrator help you get this thing going, especially with shared storage. So that's where you guys will definitely step in to help these folks out and get it set up smoothly. That's one of the big values that we offer: we, uh, we sort of get and grok and understand and put together all of these different technologies. So shared projects comes with the same marriage counseling features, right, Dave? We're making sure that we're not overwriting each other? A good question, and the answer is that actually can't happen. The reason: think of it as, you know, a door that has a lock on it, and basically your spouse is locked out now.
Speaker 1 06:02 So if you're on a timeline, there are red locks and green locks and different symbols to sort of tell you. An example might be a movie where we have several reels, or several sequences, set up, and an assistant comes in in the morning. I work a lot with the Coen brothers. Um, you know, they have an assistant who comes in in the morning, sets all the timelines up, gets everything going, and then, you know, the brothers come in to start doing their edits and everything's organized. Well, if the assistant is still organizing and they want to edit, they can see that the assistant has that sequence and it has a lock on it. Now, if they want to, they can double-click on it and get into a read-only instance to start, maybe while you're having your coffee, or just getting an idea, or maybe you're just a reviewer.
Speaker 1 06:50 I can completely do whatever I want with that timeline except change anything. I can bring up another timeline; we call it pancaking, where we have two timelines stacked on top of each other. I can do a select-all, grab clips, drag them down, and then I'm free to manipulate them if there's an edit I want to start experimenting with, and I'm basically creating my own instance, um, from that. So again, yeah, this is a traditional way of locking people out and unlocking those instances and allowing people to do that. And that's a great question. It's almost like a sort of infinite versioning. Exactly. Everybody has a different... yeah. Okay, cool. So very much as opposed to there being one current version that can be overwritten and all that kind of stuff.
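The red-lock/green-lock behavior Dave describes is, at heart, a lock-file pattern. This is not Adobe's actual implementation, just a minimal sketch of the idea under that assumption: the first editor to grab a sequence gets write access, and everyone else drops into a read-only view.

```python
import os
from pathlib import Path

class ProjectLock:
    """Toy lock-file pattern: the first writer gets the (red) write lock;
    everyone else falls back to a read-only (green) instance."""
    def __init__(self, project: Path, editor: str):
        self.lockfile = project.with_suffix(".lock")
        self.editor = editor
        self.read_only = True

    def acquire(self) -> bool:
        try:
            # O_CREAT | O_EXCL is atomic: only one editor can create the lock
            fd = os.open(self.lockfile, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.write(fd, self.editor.encode())
            os.close(fd)
            self.read_only = False
        except FileExistsError:
            self.read_only = True   # someone else holds the write lock
        return not self.read_only

    def holder(self) -> str:
        # Who is currently editing this sequence?
        return self.lockfile.read_text()

    def release(self):
        if not self.read_only:
            self.lockfile.unlink()
```

The key design point is the atomic create-exclusive open: there is no window where two editors can both believe they hold the write lock.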
Speaker 1 07:34 That's, that's cool. Yeah. And we've tried to make it as, as simple as possible from, from almost like a project management from a master project. So we're going to open up the whole movie. I do have some broadcasters that I'm talking to about this that aren't traditionally in this workflow where I'm like, well, if you have a lot of segments that have a lot of the same pieces, you know, create a, you know, a March folder, March 20, 20, when maybe we've got a campaign, I would have normally said March madness for example, where we have a, uh, an event going on and I want to be able to have a master project where I'm going to be pulling things from different timelines and multiple people all working from the same master file and I can see what everybody else is doing. And then when I'm done, great way to archive, right?
Speaker 1 08:18 Let's put all that on the ma'am. And then maybe next year you pull out that March project again and look and see what's going on and make another, um, you know, workflow based off of, of last year's work. So there's a lot of different ways to use this, which is why I mentioned Collins channel on YouTube video revealed is to try to get people into those ideas about, Hey, I'm not a traditional editor but I need to collaborate and work with multiple people. So write nice and again in the same way, same VPN or the same network instance with no cloud connect. Awesome. Dave, you mentioned, go ahead. Go ahead Ben. Oh, sorry. I was just gonna say, uh, you had mentioned a lot of cool Hollywood projects and I know Adobe has been working on kind of an awesome Blab out there so that you can bring people in and get good feedback and really Polish the tools.
Speaker 1 09:09 What can you tell us about that? Yeah, so we, we opened up, that's a great question. We opened up as a result of a lot of the movies, you know, working with, you know, David Fincher on some of his projects and we have a Hollywood team out there. Um, I used to run, you know, a lot of those teams, but I hired a guy, um, actually from, from avid who was there for 16 years. A great guy, been with us eight years now. And he is sort of now running all of Hollywood for us. And we have another guy we hired from the industry. I actually has an Oscar for Titanic. Uh, he's been helping us for the past 15 years, get this off the ground. And since then with all these movies that we then release, we've done a lot of things, even most recently like stranger things and, and other really cool projects.
Speaker 1 09:54 Um, we're working with, with Netflix and some of the streaming services now we need a full on Hollywood engineering team. And in Santa Monica, not too far from the pier. We actually have an Adobe office, uh, in Hollywood with a complete edit Bay with movie reels loaded from certain things that we can then battle test ourselves. We may bring in someone from blur studios for example, or maybe someone from venture studio, you know, into our office and sort of say, Hey, you know, we've been working on this feature for you like we were doing with shared project. Let us show you what we've got. And the engineer that's actually writing the feature is physically there giving the demo and then sometimes in many cases they go, yeah, that's close. So then, which is great, that's the feedback we want, which is sort of why we came out with this beta prime is we want more people having an opinion on, on what happens.
Speaker 1 10:49 So anyway, so that that office has been instrumental and lot of great things are coming out of that office. And it's not only just for Hollywood, but on the Hollywood edit. There is so many things that are time sensitive, demanding. Again, a lot of cases, 90% of the cases, no internet access. So you can't have a lot of these cloud things that you're used to. We need fast storage, fast reels to load, obviously quality, stability, performance, all of these different things on these demanding project. I think Deadpool, when that real loads, it's almost 800,000 clips that come in. Um, and I've actually seen that whole movie load on the premiere timeline. It's pretty amazing to watch. And I've worked with a good friend of mine vaccine Umansky, you know, he edited lots of great things. Uh, my favorite thing he edited by the way, is Sharknado to which Vashi puts it great.
Speaker 1 11:45 You know, um, VOD she's also got, got, got a great blog. Uh, you know, shark NATO is a, is a series that knows what it is, right? Which is what makes that movie so much fun cause it knows just how, how silly and how horrible the uh, some of the, uh, the graphics are on that and in a fun way. But we do take a lot of these, these, uh, editors, um, you know, very seriously on what keeps them productive. And there's lots of things out there that happen to people where, you know, performance and stability. So we've really been working amazingly hard to get the message out about, you know, when things or get you into trouble when you're trying to load old projects. That's probably a good topic. If you guys want to talk a little bit about that sort of opening the sort of legacy projects and some of this new twofold is, is one, you know, and I do talk to engineering.
Speaker 1 12:37 I mean I spend, and I'm on engineering calls just about every day. And um, and the thing that makes our team different is we have a direct access to engineering and I don't know any other company. I have very good friends at Apple. I know some great people at avid and black magic and, and other groups. But you know, we have, I just pick up the phone and talk to the guy that is our media formats guy and it's like, Hey, you know, there's this weird thing going on with Canon MXF files with this new camera and I, I'll use it, go out and start sourcing materials. Actually, I was just talking to a vendor this morning that'll be coming out with pro raw for their camera and I was actually testing it this morning before I jumped onto this, um, to this call. But anyway, as we're working on all these different things, one of the things I like to point out when you're opening up some of those legacy projects and a legacy project, by the way, we just came out with premiere 1404 last week and any time you see like a dot O something, it means the changes are probably minimal.
Speaker 1 13:38 Um, that, you know, if you sort of re read the changes versus like a fourteen.one or something like that. Right. Um, but you know, one example is, you know, if I'm working on a documentary for the last two years, there's this thing I read all the time on, you know, moving to premiere pro, you know, Facebook group, which has a really good one. There's some other ones that are out there. Don't ever upgrade a project in the middle of a, well, sometimes you can't. That's not applicable because I'm working on a documentary and you've come out with some new feature that is just, you know, like auto reframe. We were going to talk about AI in a bit, but you know, this way of just analyzing things that are going on or other technologies where I have to have this. So one of the things by the way that gets people in trouble is they don't understand, um, when they're bringing media in, we create cached files.
Speaker 1 14:26 Uh, by the way, all of these systems do, you know, final cut creates optimize files. And there's this ways. So when you hit the space bar, instantaneously audios, fires up, frames fire up, you know, the Aja or black magic or bluefish card all fire up. So these, these temporary cash files and longer cash files are created. By the way, this is an addition to audio wave form files that get created. So always different files get created to give you that snappy response. Well if we've gone in on a new version and we've said, Hey, we've now made H two, six, four or some other codec faster, that means the math has changed. And one of the things, and I think it's a fail on our part, I'll just say that we don't actually warn you that you're using old cash, meaning that the math has changed.
Speaker 1 15:14 So you know, I'm certainly lobbying for a way to, to, to go to engineering and say, you know, we've got a warn customers and take them to a preference panel before the project opens and recommend we delete all media cash just, and there's an option to do that. But only if there's no project open. So when I read some of these people that just want to punch the screen, which I totally get, you know, some projects gone wonky, either it's crashing or it's just not playing back right or something's missing. When you go in to clean up your cash, do you have to make sure no projects are open and then you'll be given a second option called delete all media cash. And when you do that, it wipes everything out. You've got to rebuild all that media. So you're gonna go get a cup of coffee for a minute if it's a big project, but at least you, you've got clean media based on what I like to call new math for the codex.
Speaker 1 16:07 So now you're going to get fast response. It's very clean versus, you know, some wonky HD six for cash or other cash that gets created. So we've changed a lot of this and premiere a 14, the 2020 version about how all of that, while the menu item looks the same, the behind the scenes as scenes as a much deeper cash cleaning. And again, I'll just say, you know, hats off to Chesapeake because when they call you guys, these are things that, you know, right? So when people are frustrated, they should kind of reach out and say, Hey, you know, we need some recommendations on best practices for how to do this in any way. I would, would recommend, you know, go into Facebook. Uh, even if you don't have an account set up a fake account that basically, uh, allows you to get tech support for, uh, for after effects and premiere and, and a good one's called moving to premier pro where you can just sort of read.
Speaker 1 16:59 And by the way, it's not all pretty all the time. And let's just face it, it's forums and we take that very seriously. We actually don't jump in on the support forums a lot because we like user to user interaction, but we have somebody on the team is always kind of watching that. You'll see me chime in every now and then. I actually took some screen grabs and some steps about how to properly clean your cash, uh, and, and things like that. But yeah, so we've been doing a lot of work we can performance and stability lately, but I, I do think that, you know, as much sort of issues as we had, you know, last year there was some windows issues kind of here and there. You know, there's people that are scared by the way to move to Mac 10, 15. And, uh, you know, a Mojave and I'm like, you should totally move to Mojave as soon as you can and your Mac can run that.
Speaker 1 17:46 What Apple's done with metal on that is, is just amazing for HD six for playback HD six, five. And I always call it metal too. That's not an official name, but that's just what I call it in my head cause it has changed so much since we had from the initial days of metal. It actually is our preferred method on the Mac for, uh, for playback. And you'll see that open Seattle is now been depreciated and it actually, we put depreciate, which is a warning to you, maybe a use metal and of course on windows you get your choice of open Seattle or Kuda a ticket to get that going. But, uh, I find, uh, just for people that are curious, you know, Mojave, I'm on my Mac book just rocks the house. It's great. Even on something like a 2017 laptop, it does great.
Speaker 1 18:32 Versus the new 16 inch that I'm using now. Um, works, works pretty awesome. So metal is very similar to Kuta. And, um, I think it's what it is, is a Catalina is 10 15, right? Catalina. Catalina. Yeah, you're right. Mojave and Catalina are the ones that I recommend. Gotcha. 10, 15 Catalina I totally recommend at a minimum I recommend a Mojave a 10, 14 yeah, thanks for that. Um, I have two different systems that I run all the time, so get rid of those Nvidia cards people, at least for users that wonder, you know, what is metal, what is Kuda if you go back years ago to the layer B days, you know, Intel Larrabee, which is part of their graphic system and this consortium to bring all of these sort of graphics things together at a sub level, if you will, and the operating system to make it a first class citizen in the operating system and make a baseline.
Speaker 1 19:31 That was sort of the beginnings of, of what Kuda finally became because what happened was, you know, AMD and Apple, I think, I can't remember all of the Intel, a bunch of other people were actually in on this consortium. People started to pull out and basically Nvidia said, we're just about done. So they just decided to go ahead and release it. So Kuda and metal are basically the same. Can't really say that the same thing, but they're the same idea to where things are at a, at a very low low level call, which is why people really want to see that come around. And I think so long as Apple's controlling the drivers on their side with metal, you'll get really, really great performance. I think a lot of us would have wanted to see Kuda on the Mac sort of come back. But I think Apple's got enough going on right now where that's, that's harder.
Speaker 1 20:22 I would just would love, I mean, Nvidia knows, I meet with them all the time. Great company. I just love to see them come out with, uh, with metal support, uh, because people want that hardware. And I will say the AMD hardware has actually been pretty good on the Mac. I don't really have a whole lot of whole lot of complaints, but I would like to see in an equal playing ground there for sure. Yeah. Yeah. That's great. Finally, so speaking of that, we've got a new Mac pro, right? The 2019 with the crazy afterburner card. Um, have you guys been playing with that? I've got one right over here that I am using all the time. So, um, I will say I stole it from a friend of mine. Um, just kinda hold it up for the cameras <inaudible> as I can get. Yeah, we, um, we, we got went in a few weeks ago and right away just noticed, um, how much of a beast it is, where it was.
Speaker 1 21:19 So that was the joke. So we actually, beta prime will be the first way to start testing, um, after burner. And I've been um, consulting, a lot of really good friends of mine that are master trainers and both final cut, um, and, and premiere and resolve. And we've been testing this and I am happy to say, uh, premiere is kicking ass, uh, right now and my test then it's, I'm already doing eight K 29, nine, seven tests. I have done a bunch of AK 60 test. Um, just to put a strain on it. And really it's just a matter of cutting everything. Um, in half, you know, if you can do, you know, two or three layers of 60, then that tells you that you can do twice as many in 29, nine, nine, seven. Right? But I am fully seeing, you know, in anywhere from three to four to five layers of, of eight K, uh, 29, 97 and you know, it starts to tap out at least now, I don't know where this will end up cause we're just finishing up testing right now.
Speaker 1 22:15 But my guess is by the time you put a Luma tree on each one, uh, at eight K a 29, 97, you'll probably be at about about four layers. So, so look at that in real world workloads. So if you're trying to do a multicam edit at 4k or even 10 80, uh, afterburners is going to do a lot of good. And I don't know if you saw this last week, Apple just allowed people to order that by itself, which is pretty awesome, right? The rumor is Apple is going to let some other people develop for it as well, which is really cool. Yeah. And right now it only does pro Rez D code to stop, do pro Rez in code and it is a programmable processor that I'm looking forward to for some other cool things to come out on that. And by the way, I just, there's a, you know, shout out to Intel to say, uh, Hey guys, I think the windows users would like to see something on the Intel side and I, you know, not to say that Adobe is going to jump on support.
Speaker 1 23:12 It depends on if they can do as good a job as I've seen an Apple's done on, on afterburner because I will say it's impressive and there's a, you know, it's certainly won me over in the and the tests that I've done. Uh, awesome. And hope to get that, uh, front and center for people sooner than later. So, uh, you want to talk a little bit about AI? Yeah, yeah. Well, by the way, I do want to mention one thing that I showed at, uh, at max just while we were talking about afterburner and performance. Um, wouldn't people like to see after effects go faster? Everybody? Yeah, sure. I'm sure nobody wants that, Dave. Yeah. So, um, at Adobe max last fall, that's our huge, uh, creator conference that happens every fall. Pretty awesome. We have about 15,000 people show up to that. I held a little, uh, you know, geek session on the last day, uh, that was supposed to be a small session to show people after effects running and multithreading environment, which I think a lot of people would love to see because today after effects, really when they come to you guys, they come to Chesapeake and say, Hey, this is what I'm doing.
Speaker 1 24:18 And you're kind of like, okay, well after effects, pretty much a single threaded app, we need to sell you as high a clock as you can get. So maybe you wouldn't get a 28 core machine. You would get a 16 with a hot clock and some memory and a decent GPU card to play. Um, play the timeline, the composition back that way. Well now I showed a dual Xeon, I can't remember how many, I think it was 110 threads we had going out on that particular one and it was hitting all of them at about 85% just slam on that machine and it was pretty impressive to watch it, chunk it out. And it's going to be processor independent so it doesn't make a difference if it's an Intel or Arisun or whatever. It's just looking at threads and cores and we're working really hard to bring that out, um, as soon as we can.
Speaker 1 25:06 So testing is fully underway. Uh, right now. And again, I did show that is probably some tweets out on after effects that that session I had, which was supposed to be small last fall, had over 300 people. I think crowding to sort of watch that. So I just want to let my aftereffects users know that we know that, you know, speed and certainly performance and aftereffects as is front and center on everybody's mind with, especially with these new hardware coming out. Yeah. Awesome. Yeah, that's great. So yeah, you, you had mentioned, uh, you know, AI. So a couple things where, um, we're working on with uh, with AI, the first thing that we brought to the AI movement if you will, at Adobe, which is trying to keep the creative alive and getting rid of the mundane crap that nobody wants to do. One of which would be if I have a 16 by nine video and I need to convert it by nine by 16 the storyline changes.
Speaker 1 26:02 We're also seeing this in new platforms like Quimby that are coming out and I'm working with some of the broadcasters and other media houses that have to develop two different sizes and you have to be able to tell a story and a nine by 16 and keeping the talent centered or whatever you want to do. So we actually use AI to read what we call salient regions on the screen to find out what's happening. We face track, we object track, all these different things to figure out what are we looking at, how should we track it? There's one demo we do, we're actually, we have one of our engineers throwing a Frisbee to her dog and we sort of pick up that there's two things on the screen and there's this action that goes in. So we start off on the human, she throws the Frisbee, we track the Frisbee and we see the dog and we watched the dog go to the person and return the Frisbee.
Speaker 1 26:53 All of that is tracked automatically via AI. And we came up with a workflow that doesn't change scaling and position key frames from what you are ready now. So rather than come out with some funky new control that just says, Oh you got to do this and you've got to kind of relearn how to key frame that we just throw all the key frames to things that you already know and it's gone gangbusters. I mean everybody just loves the fact that we've gone down this path. Cause I do get asked all the time, is Adobe AI going to get rid of me or getting rid of the creativity that is me? And the answer is no. We just want to get rid of the mundane and allow you to go back in and we don't show any AI is going to take the job. You know, you need to proxy that and look at that and sort of figure out how to put your own twist on that and you're going to noodle key frames as long as you can afford that projects do so.
Speaker 1 27:44 Right. Yeah. I mean this, this whole scenario that you just described here with the difference between a 16 by nine and a nine by 16 yeah. Frame. That is something that really hadn't occurred to me. I had occurred. To me, storytelling is very different than it changes the storytelling. Exactly. So I think that's an amazing example of, I wouldn't call it a new problem, it's kind of a new way of working based on the way we are consuming the media and the way consumers are actually consuming the media on their mobile devices. I mean that's the whole reason for the nine by 16 so for users that know about Quimby, I don't know if you, if you, have you looked at that platform much? But this is going to be a new streaming service that's coming out. I think Katzenberg and others are behind this movement.
Speaker 1 28:23 But I'm in this technology. It's basically it says, you know, I'm looking at my phone this way and it's more comfortable to hold the phone this way. So as soon as I turn my phone, you know, the phone senses that I've turned it recenters the video and basically calls on the nine by 16 video on the server at that frame without interrupting the audio track. So no matter how you turn the phone this way, the audio that remains constant and the content creators have to pivot on that exact frame. Assuming you, you know, you've got the bandwidth to make the story consistent. But if I'm at a football game and I'm in a 16 by nine and I'm off to one side of the screen, I can see the action, uh, maybe over my, my right shoulder. And now when I turn the phone, I have to change that edit.
Speaker 1 29:14 You know, I either have to, uh, have myself or I have to put me at the top of the screen of the bottom and have something, you know, however, I'm going to maneuver that so they have to be able to get it right and tell that story. And I think that's going to be a very popular format. I think Quimby's off to a very cool start. And we've been working with a lot of clients on that, on that workflow and a lot of them are using auto reframe as that feature. So that's really amazing. Yeah. Pretty cool. And then, you know, other things around Adobe, which is our, uh, AI platform, you know, trying to look at them mundane. I think we all agree speech to text is Pia, right? So nobody, nobody wants to have to deal with that, but it's a, seems to be the number one feature that I think when people start to talk about AI and some of these discussions that we have about a solutions architecture, uh, it's, it's almost a, an assumption that, that in any kind of AI integration is going to include speech to text.
Speaker 1 30:06 Yeah. And all the things that go with it. Because the thing about speech to text, just for a lot of viewers that think closed caption, which is you know, which is just used in the U S outside the U S we have what's known as open captions, um, that uh, are actually much more flexible. But uh, it's not so much about anybody that has an impairment or things like that. It's really more of, you know, I'm on the subway and I just want to be able to read it or I'm in bed and I don't want to interrupt someone next to me. I want to be able to hit mute. Or sometimes I just prefer to read what's going on and watch the action, not, not get distracted by whatever else is on the audio. So it's really, really important. And I also need to be able to flip languages.
Speaker 1 30:47 So if it's coming in English, I need to be able to flip it and then have someone proof the Spanish or whatever. So we are looking at that right now; we've been doing that for a while. If someone needs to do this today outside of what Adobe is working on, you can use a product that I use a lot called Transcriptive, by Digital Anarchy. Jim Tierney has done a great job developing this beautiful platform that has lots of different options at, you know, six cents a minute or less sometimes. So it's pretty cheap to get this working, and it integrates with Premiere quite well. Trint is another one. There are a couple of these services, so people that need this today can get any type of caption formatting. We're looking at it right now and have an internal project going on that's just starting to look at that.
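The caption-formatting plumbing that services like Transcriptive and Trint handle can be sketched in a few lines. This is a generic illustration of turning timed speech-to-text segments into SRT-style captions; the input shape here is an assumption, not any vendor's actual schema:

```python
# Minimal sketch: timed transcript segments -> SRT caption text.
# Input is assumed to be (start_sec, end_sec, text) tuples.

def to_timestamp(seconds):
    """Format seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def to_srt(segments):
    """Emit numbered SRT blocks from (start, end, text) segments."""
    blocks = []
    for i, (start, end, text) in enumerate(segments, 1):
        blocks.append(f"{i}\n{to_timestamp(start)} --> {to_timestamp(end)}\n{text}\n")
    return "\n".join(blocks)
```

Flipping languages, as discussed above, would mean translating the `text` field of each segment while the timings stay fixed, which is why a human proofing pass still matters.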
Speaker 1 31:39 So you guys have done a whole lot of good integrations back and forth with the MAM folks. And I've noticed there's a whole lot more openness: we're able to get much more information out of Adobe and go back and forth between the MAM platforms. If somebody is using a media asset management platform as their central hub for collaboration and for understanding everything about the assets people are reusing in projects, they can manage those assets separately outside of projects. Adobe has been really fantastic about working the way editors want to work, but also bringing things in and out so that people can continue to use the tools they're comfortable with while being good citizens within their larger media environments. It just makes everybody's lives easier. So, awesome. Save all metadata. Never throw metadata away.
Speaker 1 32:37 All metadata is important data, right? Yeah. Speaking of close captions, I mean that I keep them on all the time just cause I want the extra metadata, right? I might not understand the word because you might want to edit by word. You might want to search or, or whatever. Right. And we're looking at, you know, other technologies, you know, we've shown some of these types of technologies, you know, during Adobe max, go up on YouTube and look at Adobe max sneaks and you can get to see the peak of the future. We actually showed auto reframe, uh, that was shown, um, last year, uh, this past max that we had last fall 2019, we showed an awesome remover. I think you and I were talking about that. And the precession ready for this. Yeah, we all need that. So yeah, it'll, it'll look at certain things or maybe it's removing an F bomb or at least highlighting them, letting me know where we are, where we're working fast and furiously, uh, on some of that technology to really take a look at it.
Speaker 1 33:33 And it's really doing an amazing job of analyzing what's going on. And I think eventually, with metadata, I've seen some other internal projects that might get us down to the realm of a quick auto cut based on what they're saying and what we're seeing. Because with things like shot detection, I can look at information, look at what's happening on the screen, and give an assistant editor's view again. We never wanted to make an auto edit and say, add it straight to the supply chain. It's more like, let us just take what's going on here, do some auto cuts on it, and let you get in there and say, let me look at this 30-, 60-, or 90-second rendition and see if we got you in the ballpark of where you want to be.
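The shot-detection idea behind such an auto cut can be shown with a toy sketch. Real tools analyze decoded video; here the "frames" are just brightness histograms so the logic stays visible, and the threshold is purely illustrative:

```python
# Toy shot detection: compare consecutive frame histograms and
# propose a cut wherever the difference spikes past a threshold.

def histogram_diff(a, b):
    """Sum of absolute differences between two equal-length histograms."""
    return sum(abs(x - y) for x, y in zip(a, b))

def detect_cuts(frames, threshold):
    """Return the frame indices where a new shot appears to start."""
    cuts = []
    for i in range(1, len(frames)):
        if histogram_diff(frames[i - 1], frames[i]) > threshold:
            cuts.append(i)
    return cuts
```

The cut list is exactly the kind of assistant-editor output described above: a starting point a human refines, not a finished edit pushed into the supply chain.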
Speaker 1 34:22 So I'm not really pre-announcing any features, just things that we're thinking about and noodling around with. Some of these things come out of Adobe MAX Sneaks, where we can get in and start looking at them. But I do think audio is sometimes more important than video, certainly in a podcast. We just want to be able to clean these things up as quickly as possible and get back to creating, not the mundane, right? Right. So much of what we look for in an AI integration or platform is information about the content itself, as opposed to moving the content, bringing it into a platform, or delivering it to a platform. It's about the actual content. Yeah. And that's a task that I think has made the complexities of media management that much greater, because the more media you have, the more data you have, and the more time it takes to get that data in there.
Speaker 1 35:19 And that seemed to be a huge sticking point for a lot of creatives and a lot of organizations they like, they want all the benefits of a media media, asset manager, media asset management platform, whatever. But uh, it's that resource of time or that resource of a, of a, of a stakeholder or a person to put in front of that and gather all that data. That seems to always be a, you know, a thing that's getting in the way of using some of these platforms more effectively. I think we've done a fairly decent job for a lot of the partners out there. We call them partner panels. So when you're inside premiere and you guys have installed a lot of these extensions, this is another thing you know, that allows us to make a, you know, one of the panels. So those are the squares that you see in premiere that you can resize or aftereffects and other apps.
Speaker 1 36:05 So we give them their own space. It actually connects to their main applications. Sometimes it's a headless app, sometimes it's a limited, scaled-down version of just what an editor needs to see, and then you can drag and drop and export and do all these different things. We had talked about SNS before. One of the things I like to point out about those guys is that a lot of the MAM vendors, EditShare and the like, and a bunch of these other guys, have a way to see the media and look at the metadata. That's all fantastic. Now they're able to add AI on top of that. So SNS has a really cool feature I was just playing with that allows me to go out to the server, put some files on the server, and it can start doing some AI analysis of those files.
Speaker 1 36:50 Cause I, you know, I, I connect to an AWS instance in the back and it's sending those files up to the cloud in the background and doing analysis on those and then giving me more options in premiere on what to do with that. And again, we just make what's called our API APIs or software development kit. API is available to those guys and then they just kind of go crazy with it, again, resulting in a pretty unique workflow for each of those, uh, those partners that work with Adobe. Yep. That's great. Yeah. Um, we were talking about AI, I just remembered that when we were talking about Adobe max undoubtably the coolest demo to see when they go up to YouTube, maybe you guys can give them a link. It's called project fast mask. So imagine you've got someone, um, dancing on the street in Brooklyn or whatever.
Speaker 1 37:39 And I just need to pick up the dancer because I want to do something in the background, you know, and I don't want to have to get out and composite that person out frame by frame or use auto tracking. So we use AI to identify that's a person I see his legs, his arms, his head, I know what he's doing and we track their motion amazingly fast and then allow you to separate those two and put layers, you know, confetti or whatever behind the person. Blur it out. Maybe just a logo to have an, when I've shown project fast mask, which is in development right now too. Um, you know, a lot of our creative customers in the media space, they go bananas because there's no way you would have time to do that in the past. Be able to say, you know, I need to get this quick. You know, maybe there's an advertisement for Coca Cola or something where I need to put that logo behind the talent. You know, there's no way you're going to be able do that. Just given a time constraint. Even for something like social media. Well project fast mask is going to be one of those AI technologies for the aftereffects user that just says, you know, this is what AI is to them. Um, so it's something more interesting than maybe text. So it's project fast. That's great.
Speaker 0 38:53 Yeah, that's fantastic. Fun. Cool. Well, Dave Helmly of Adobe, thank you so much for joining us today. This has been a really awesome discussion. Thank you for this wealth of information you've imparted to us on remote editing during the COVID-19 crisis, integrations with MAMs and AI, and what's coming up soon for Adobe. So that's great. We want to thank you for taking the time today. Yeah, it was great hanging out. Awesome. And I'd like to thank my cohost Ben Kilburg, senior solutions architect for CHESA. Thank you, Ben. Thank you, Jason. The Workflow Show is co-produced by my cohost Ben Kilburg and CHESA sales operations manager Jessica Amantha. Thanks for listening to The Workflow Show. We have a lot of fun producing this podcast, and we want to know what you think. Email
[email protected] with your stories about media asset management, workflow orchestration, and media automation. We also want your ideas for subjects and topics to discuss in future episodes. If you need some workflow therapy, get in touch with one of us. You can also visit our website, chesa.com, anytime. Thanks for listening, and stay healthy. I'm Jason Wetstone.