Stephen Beaulieu: Welcome to the 1999 BeDC. I want to go over some of the logistics of the day.
We're going to talk in general about the Media Kit and give you all the introductory information you need, so that by the end of the morning you have a basic understanding of how things work. This afternoon we're going to split into two different tracks: the media node track over on the other side, and the media application track here in this room. They're going to cover different things. The media application track is going to talk about what you need to do to put together an application that uses media nodes to do whatever it is you want to do. The media node track gets into the nitty-gritty of writing the nodes that actually do the work.
Here's what we hope you will get out of today. The goal is that you leave with a basic understanding of how the Media Kit works at the level of concepts. We're not going to be giving you code snippets today, and we're not going to be telling you that you have to call this or that function. We want to give you general concepts so you understand how the pieces are put together.
Sometime in the next week or two we will provide sample code for a lot of what we're talking about today, and that will show you the nuts and bolts of it. Today we want concepts.
One thing you will note, and you will like this, is that we've put a lot of changes into the Media Kit to make it easier for you to write applications and nodes. Those changes are coming out in Genki, our next release of the operating system.
At lunchtime, if you go back out to the registration desk, we have disks for all of you with the fourth beta of Genki, so you can actually get to work on this as we get the sample code up and ready for you in the next week or so.
Here's how we're going to start this morning. First we will do a quick overview of the Media Kit, and then we will talk about the building blocks, which is a slight change from this morning's printed schedule.

We will then talk about how to put those building blocks together and how they interact, and how applications interact with them. Then we'll take questions.
Let's get straight into it.
So, the overview. We're going to talk about media and the BeOS: what our concept of media is, how the media system is put together, general concepts. And we're going to show you what the media player application that was demonstrated yesterday actually looks like from a programmer's perspective.
So what in the world is media? Let's get definitions down. Media is data associated with time: movies, sound, and so on. And one of the key concepts is keeping it all in sync.

Say you have a QuickTime movie playing, and it's got both audio and video. You want to play those and keep them in sync regardless of what you might be doing to them along the way. You might want to apply some sort of effect to the video, or add a delay or reverb to the sound, and you still want to make sure everything plays in sync.

So that is media, and that is what we're trying to accomplish: realtime manipulation of media in a realtime system.
What advantages does the BeOS have for doing that? We hope to be able to show you that the 4.5 release has a lot of advantages.
We have very little latency in the system, and that is important for a lot of developers and a lot of end users.

We have some example audio cards where you can capture sound, do some work on it, and output it again with latencies of about six milliseconds in and out, where two milliseconds of that is inside the card each way. That is really low latency. By comparison, on Windows and the MacOS you typically see latencies of 30 to 50 milliseconds for doing the same things. That is one of the main advantages we're hoping to provide: you can get data in and out faster, which makes it much more likely that you can do realtime effects while still keeping the flow going.
The media system allows you to calculate this latency on the fly, so each of the nodes and applications can determine roughly how much time it will take to do things and act accordingly. That's good because it allows you to be pretty flexible: if things change, you can make adjustments and keep playing correctly.
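To make that concrete, here is roughly what asking for a latency looks like through the roster. Take this as a sketch against the Genki beta headers, and the producer node here is just a stand-in for one you already created:

    #include <MediaRoster.h>

    // Ask the system how long the chain upstream of a producer
    // needs, so buffers can be scheduled early enough to arrive
    // on time. Latencies are in microseconds.
    media_node producer;   // stand-in: obtained elsewhere
    bigtime_t latency = 0;
    BMediaRoster* roster = BMediaRoster::Roster();
    if (roster->GetLatencyFor(producer, &latency) == B_OK) {
        // start work `latency` microseconds before each buffer's
        // performance time
    }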
Time can come from any source. We have a concept of a timer inside the kit, called a time source, that anyone can set up. So you could have an external black burst clock; if someone writes a time source node for it, everyone can work with that.

The audio output card has a clock on it, so you can sync everything to that. The video input has a clock on it too, and you can sync to that instead if it is more appropriate for your application.
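In code, pointing a node at a time source is a couple of roster calls. A sketch, where my_node stands in for a node you already hold:

    #include <MediaRoster.h>

    // Slave a node to the preferred system time source, which is
    // typically the clock on the audio output card.
    BMediaRoster* roster = BMediaRoster::Roster();
    media_node time_source;
    roster->GetTimeSource(&time_source);

    media_node my_node;    // stand-in: obtained elsewhere
    roster->SetTimeSourceFor(my_node.node, time_source.node);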
Another thing is that we have very flexible descriptions of media formats, and the ability to extend them as new media formats come out.
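That flexibility comes from the fact that a format description is a tagged union in which any field can be left as a wildcard, meaning "anything works here". A sketch of describing stereo floating-point audio, assuming the wildcard constant as it appears in the Genki beta headers:

    #include <MediaDefs.h>

    media_format format;
    format.type = B_MEDIA_RAW_AUDIO;
    format.u.raw_audio = media_raw_audio_format::wildcard;
    format.u.raw_audio.channel_count = 2;    // stereo
    format.u.raw_audio.format = media_raw_audio_format::B_AUDIO_FLOAT;
    // frame_rate, byte_order and buffer_size stay wildcards,
    // to be filled in when two nodes negotiate a connection.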
Another advantage of the system is that we're component-based. We have this concept of a media node, which we will go into in depth throughout the day, so you can get an idea of what we're doing there.

The media node provides a universal interface. You deal with this one class, and regardless of what your node does, whether it is a filter, whether it reads from a file, whether it writes out to a file or to a video output card or anything else, we provide one interface that everyone knows how to work with. It is very extendable.

As people come up with new media formats, and as people come up with new cards and better capabilities, our system can expand by basically adding a new component for each of them.
The actual format that two nodes use to talk to each other is negotiated by the nodes themselves. So if you are an application writer and all you want to do is play this file with this effect applied to it, the nodes will handle figuring out the optimum format to pass the data back and forth, which is good: it is handled for you.
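From the application side, that negotiation is wrapped up in a single Connect() call on the roster. A sketch, where producer and consumer stand in for nodes you already hold:

    #include <MediaRoster.h>

    BMediaRoster* roster = BMediaRoster::Roster();
    media_node producer, consumer;   // stand-ins: obtained elsewhere

    // Find a free output and a free input to wire together.
    media_output out;
    media_input in;
    int32 count = 0;
    roster->GetFreeOutputsFor(producer, &out, 1, &count, B_MEDIA_RAW_AUDIO);
    roster->GetFreeInputsFor(consumer, &in, 1, &count, B_MEDIA_RAW_AUDIO);

    // Start from a mostly-wildcard format; on success the two nodes
    // have negotiated, and `format` comes back fully specified.
    media_format format;
    format.type = B_MEDIA_RAW_AUDIO;
    roster->Connect(out.source, in.destination, &format, &out, &in);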
We also have a system for creating user interfaces on the fly for each node in the system.
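From the node's side, those interfaces are published as a web of parameters. A sketch of what a node with one gain knob might set up, assuming it derives from BControllable as in the Genki betas:

    #include <ParameterWeb.h>

    // Inside the setup code of a BControllable-derived node:
    BParameterWeb* web = new BParameterWeb();
    BParameterGroup* group = web->MakeGroup("Gain");
    group->MakeContinuousParameter(1, B_MEDIA_RAW_AUDIO,
        "Level", B_GAIN, "dB", -60.0, 18.0, 0.5);
    SetParameterWeb(web);   // applications can now build a UI from it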
So, anyway, that is what we think our advantages are. Let's actually get to work on the beginning concepts.

We're going to look a little bit at what media looks like in the system itself: what a media node is, what a media application is, what a media add-on is, and what the Media Roster is. A real quick overview.
As we said, media is data associated with time. In the BeOS, media data is passed around in buffers, and there is an actual class for them, BBuffer, that the nodes use. But the basic concept is that we deal with things in buffers of data.

The media node. A media node is basically an object that processes media buffers. Very straightforward.
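To give you a feel for it, here is the kind of thing a node that consumes raw audio might do when a buffer arrives. The function name is made up; a real node would do this inside its BufferReceived() hook:

    #include <Buffer.h>

    // A sketch of processing one buffer of raw float audio.
    static void
    process_buffer(BBuffer* buffer)
    {
        float* samples = (float*)buffer->Data();
        size_t count = buffer->SizeUsed() / sizeof(float);
        bigtime_t when = buffer->Header()->start_time;
        // ... work on `count` samples meant to play at time `when` ...
        buffer->Recycle();   // hand the buffer back for reuse
    }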
So what is a media application? A media application is a program on the BeOS that hooks together a bunch of nodes to perform some sort of task, like capturing to a file.

The media add-on. For those of you who might not be familiar, an add-on is just a shared library that can do various things. A media add-on is an add-on that knows how to create media nodes. For example, there is an add-on that our system mixer lives in, and it creates an audio mixer node that you can use.
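The contract for a media add-on is deliberately small: the shared library exports one function that hands the system a BMediaAddOn subclass. A minimal sketch, with MyAddOn as a made-up example class:

    #include <MediaAddOn.h>

    // A real add-on would override CountFlavors(), GetFlavorAt() and
    // InstantiateNodeFor() to describe and create its nodes.
    class MyAddOn : public BMediaAddOn {
    public:
        MyAddOn(image_id image) : BMediaAddOn(image) {}
    };

    // The one entry point every media add-on exports; the Media
    // Add-On Server calls it after loading the shared library.
    extern "C" BMediaAddOn*
    make_media_addon(image_id image)
    {
        return new MyAddOn(image);
    }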
So, the Media Roster. The Media Roster is basically the object that applications use to hook nodes together. That is a key concept that we will come back to throughout the day.

Applications use the Media Roster object to talk to nodes and tell them what to do. Nodes do not use the Media Roster; it is an application-only device. That distinction is important as we go on throughout the day.
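Getting hold of the roster is the first thing a media application does; there is one shared instance per application. A quick sketch:

    #include <MediaRoster.h>

    // The roster is a singleton; Roster() creates it on first use.
    BMediaRoster* roster = BMediaRoster::Roster();

    // Ask it for well-known system nodes, such as the system mixer.
    media_node mixer;
    if (roster->GetAudioMixer(&mixer) != B_OK) {
        // the media system isn't available
    }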
So, let's actually look at the way the Media Kit architecture is laid out from a systems standpoint.
Those of you who have been programming on the BeOS for a while should be familiar with our basic programming model. You have an application, and we provide software kits, usually in shared libraries, that the application talks to. Those in turn talk to servers that do some of the work in the back end, which then talk to the kernel and drivers and eventually the hardware.

How does this apply to the Media Kit? Pretty straightforward. There is the media application, which talks to the Media Kit, where the Media Roster lives. And the two main servers we'll be dealing with in the back end are the Media Server, which handles a lot of the communication, and the Media Add-On Server, which is an application that loads up the media add-ons and gives you access to nodes that you didn't write yourself.
So from the application's standpoint, this is how it works. The application gets a copy of the Media Roster. The application might have a media node that it wrote itself; it doesn't live in an add-on, but the application wants to be able to use it. It can create that node directly and talk to it through the Media Roster.

If the node you want to use lives in an add-on you didn't create yourself, you still use the Media Roster, which talks to the Media Server, which will then talk to the Media Add-On Server, and that gives you access to the media add-ons for creating nodes from there. That is the basic model.
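In code, nodes that live in add-ons show up as "dormant" nodes, and the roster hides both servers from you. A sketch of finding and instantiating a node that can read a given file; the ref here stands in for one you got from the user:

    #include <Entry.h>
    #include <MediaRoster.h>

    BMediaRoster* roster = BMediaRoster::Roster();
    entry_ref ref;             // stand-in: the file to play

    // Ask for an add-on node able to produce buffers from this file...
    dormant_node_info info;
    if (roster->SniffRef(ref, B_BUFFER_PRODUCER, &info) == B_OK) {
        // ...and have the Media Add-On Server create it for us.
        media_node file_node;
        roster->InstantiateDormantNode(info, &file_node);
    }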
The demo from yesterday basically showed you this. We've got a new application coming out in Genki called the media player; it basically plays media files.

This is roughly what the internals of that application look like from a programming standpoint.
So here is what the application has done. Each of these blue circles represents a node in the system.
So there is the file you want to play and then there is the screen and the speaker you want to get to. A lot of what we're going to talk about today, especially on the application side, is how you get this to work.
In general, the media player sees the QuickTime file and says: I need to find a node that knows how to read QuickTime. It sees that the audio coming out of it is raw, plain audio data, so it hooks that up to the system mixer, which in turn passes it out to the sound output, which actually plays the sound. The video, though, is Cinepak-compressed.

So you have to find a video decoder, hook it up, and then that goes to some node that knows how to draw to the screen.
One thing you will notice here is that the sound output is the time source. The card itself has the ability to keep track of time, because that is how it has to put the information out.

And all of the nodes in the system are slaved to that time source, so that when you are supposed to play something at time 50, everyone knows what time 50 is and can make sure it happens on time. That is the overall goal.
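Putting those pieces together, starting the chain in sync looks roughly like this; the node variables stand in for the connected nodes from the diagram:

    #include <MediaRoster.h>
    #include <TimeSource.h>

    BMediaRoster* roster = BMediaRoster::Roster();
    media_node sound_output, file_node;   // stand-ins: obtained earlier

    // Get a usable view of the sound output's clock.
    BTimeSource* ts = roster->MakeTimeSourceFor(sound_output);

    // Start far enough in the future that the first buffers can
    // make it down the whole chain on time.
    bigtime_t latency = 0;
    roster->GetLatencyFor(file_node, &latency);
    bigtime_t start_at = ts->Now() + latency + 10000;   // 10 ms slack

    roster->StartNode(file_node, start_at);
    ts->Release();   // MakeTimeSourceFor() hands us a reference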
That is the quick overview. We've talked about media in the BeOS and the basic concepts, which we will go into in detail in just a moment: media nodes, media applications, and the fact that the Media Roster is used to let those two communicate.

The media add-ons are where the media nodes live. And we've talked about the architecture of the Media Kit, the servers and the add-ons.

We've shown you kind of what we're working towards. What we want to do through the day is show you how to put together those node chains and, from the node side, how you actually get around to passing buffers down the chain. So we'll go ahead and switch over.
This is Owen Smith, who will be coming up in a moment. He's one of my DTS engineers. Hey, I forgot to introduce myself. Bad form.

My name is Stephen Beaulieu, by the way. I am the Developer Technical Support manager here. But we'll switch over and have Owen talk about the building blocks of the Media Kit. Owen.