[Kdenlive-devel] Cutting list file format specification, Version 0.02
Jason Wood
jasonwood at blueyonder.co.uk
Sat May 18 22:43:00 UTC 2002
On Saturday 18 May 2002 9:05 pm, Christian Berger wrote:
> > > Well I don't know if we'll ever need it, but it sure is an interesting
> > > piece of information. Maybe we should also define a "creator" command,
> > > telling us which program created the cutlist. Maybe also a date and
> > > comment field. Those fields definitely are good for recovering broken
> > > hard disks :)
> >
> > Of course, if your hard disk has died, recovering the cutting list is
> > probably the least of your worries -
>
> Or think of undeleting files.
From experience it's normally the video files which get accidentally deleted
if the uninitiated are editing (what, you mean I still actually need the
video clips? I thought I'd put it all in the project... :-)), and there isn't
a whole lot that can be done about that except to make sure that you've got
backups!
> > > > Secondly, I have added a couple of extra parameter value types -
> > > > String, ID, and time and interpolation. We need to decide exactly on
> > > > how time should be represented within the file format.
> > >
> > > Well I'd say we use time in milliseconds. Unless we want to be limited
> > > to film and PAL/SECAM we have to be able to handle weird frame rates
> > > which aren't a multiple of 0.01. We cannot really do that with
> > > 00:00:00:00 formats.
> >
> > If we are likely to need, say, ten-thousandths of a second for the
> > correct accuracy at times, then rather than milliseconds just measuring
> > in seconds and allowing any fraction after the decimal point might be
> > better than choosing milliseconds.
> > Eg. 330.176215 seconds.
>
> This might be an idea.
>
> > The question is, should time be represented in
> > hours/minutes/seconds.fraction-of-a-second, or should it be represented
> > as just seconds.fraction-of-a-second?
>
> Only seconds. That hours/minutes/seconds system is braindead, and too
> complicated. I mean think of leap seconds and all that stuff. Besides I
> don't see why we should base our file-format on the revolution speed of
> a planet :)
Ummm.... I wasn't actually expecting to be editing files of a size where
leap-seconds would need to be calculated :-) Anyway, it's not that
complicated - you just divide the integer second part by 60 a couple of times
to get the minutes and hours, and reverse the procedure when building it back
up. The suggestion here is human readability over simplicity, but it's not a
major hassle to parse - three numbers instead of one, basically, and the
conversion is really simple. Something like:
double returnSeconds(int hours, int minutes, double seconds) {
    return (hours * 3600) + (minutes * 60) + seconds;
}
Again, this is a cutting list thing, not an internal thing - I would expect you
to be working in seconds/fractions internally.
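Going the other way (writing a time back out in the human-readable form) is just
as short. A rough sketch - the function name and the zero-padding are mine,
nothing here is part of the spec:

#include <cstdio>

/* Splits a time in seconds into hours, minutes and seconds.fraction and
   prints it as HH:MM:SS.SSSSSS - e.g. 330.176215 comes out as
   00:05:30.176215. Purely illustrative formatting. */
void printTime(double totalSeconds)
{
    int hours = (int)(totalSeconds / 3600);
    int minutes = (int)((totalSeconds - hours * 3600) / 60);
    double seconds = totalSeconds - hours * 3600 - minutes * 60;
    printf("%02d:%02d:%09.6f\n", hours, minutes, seconds);
}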
> > The main extension that I was later thinking of for the interpolation
> > would be some form of "freehand" ability - where you have multiple
> > points, so instead of just [start, end], you could have [start, middle1,
> > middle2, middle3, middle4, end]
> >
> > Of course, then you need to move on to the question of, "what if we don't
> > want them all equally spaced out? How do we handle the syntax?" which is
> > why for the moment, I think it's best to keep it simple. Since any
> > interpolation could be handled using multiple scenes anyway, I think this
> > is one area which shows why versions on cutting lists can come in handy!
>
> Well I think here my version is best, since it's a real program we can
> do "anything" in Forth.
We can also do anything in C/C++... when it comes down to it though, the
cutting list merely describes what is being done; it should not have anything
to do with carrying it out.
I only expect to ever support three types of interpolation: linear (moving
between two points), freehand (where the values move randomly about, and so
we simply generate a list of values in the order they occur), and possibly
Bezier - even this is unlikely though, as it will be just as easy to generate
a list of freehand points as it would be to describe the interpolation as a
language.
When the UI and cutter become sophisticated enough to actually want to handle
other forms then we can consider moving to a more complex definition and add
1 to the major version number of the file format. However at present,
whichever format is chosen, there is no need to handle anything more
complicated than linear interpolation between two points.
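For the record, linear interpolation between two keyframe values is a one-liner
anyway; something along these lines (the names are mine, not part of the format):

/* Linearly interpolates a parameter between its start and end values;
   t runs from 0.0 (start of the scene) to 1.0 (end of the scene). */
double lerp(double startValue, double endValue, double t)
{
    return startValue + (endValue - startValue) * t;
}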
> > > Scenes also can have a length of less than a frame. This might be
> > > important if you want to "fill up" scenes.
> >
> > ??? I don't understand this point.
> > I would have assumed that a scene would be at least a frame, otherwise it
> > would never get rendered by the cutter.
>
> Well, but it would create a space in our file format, besides it creates
> audio.
Why would we want to create space in the file format of less than a frame? It
would either be ignored (as the next scene writes to the same frame) or would
lead to a gap of a frame, thus meaning that the next scene did not start at
the time that it was supposed to (and suggesting an error on the UI's or
cutting list generator's part).
And how does it create audio? Audio will still be measured using the same
timings as video. If you are considering a gap where there is audio and no
picture, then it will still be limited to a minimum of 1 frame in size -
otherwise a gap in video will not be seen or we miss the beginning of the
next scene.
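Just to make the frame-granularity point concrete, I'd expect the cutter to
quantise times to frames roughly like this (assuming a constant frame rate -
the rounding choice is mine):

/* Converts a time in seconds to a frame index at the given frame rate.
   Anything shorter than one frame collapses to zero frames, which is why
   a sub-frame gap in the cutting list cannot survive into the output. */
int toFrame(double seconds, double framesPerSecond)
{
    return (int)(seconds * framesPerSecond + 0.5);  /* round to nearest frame */
}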
Something that needs considering is whether audio should be generated in the
same cutting list as the video. Audio will obviously have to use separate
connections from the video, as the effects that it passes through will be
completely different. A lot of the time, the audio will also be coming from
completely different files to the video - and depending on the codec, may
very well end up in a different "audio" output.
Incidentally, that's a very good point. The codec used when editing with the
Matrox RT2000 card uses a separate file to store video and audio - audio
being stored in .wav files. This is a really useful feature, because it means
you can use any sound editor going to fiddle with the sound, normalise it,
filter out noise, etc. etc. etc. to a much greater degree than is possible in
a program not dedicated to sound editing (i.e. kdenlive). Following the same
concept with the editing codec will be a very good idea.
> > > Maybe we should somehow distinguish between file IDs and connection IDs
> > > since I have to have a "list" of all the files being used in a scene
> >
> > The trouble is that once we get inside of a scene, we would then need a
> > way to distinguish between them whilst parsing them.
>
> Actually we have to.
No you don't. In one case you have a file input which... generates frame
images when asked for them and in the other case you have a connection input
which... generates frame images when it's asked for them. From an effect's
point of view, the two should look identical. From the cutter encoder's point
of view, again, the two should look identical.
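To illustrate the kind of inheritance I have in mind - the class and method
names here are only a sketch, nothing is decided:

/* Placeholder for whatever the frame image type ends up being. */
struct Frame { /* pixel data */ };

/* Anything that can hand out frame images looks the same to an effect or
   to the encoder, whether it reads a file or another connection. */
class FrameSource {
public:
    virtual ~FrameSource() {}
    virtual Frame getFrame(double seconds) = 0;
};

class FileInput : public FrameSource {
public:
    virtual Frame getFrame(double seconds);   /* decodes from a video file */
};

class ConnectionInput : public FrameSource {
public:
    virtual Frame getFrame(double seconds);   /* pulls from another scene's output */
};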
> > I think a better solution here is using C++ and making use of inheritance
> > and polymorphism.
>
> Ohh well I've worked with those techniques in Pascal, they might be
> useful, especially for the Forth interpreter. However C++ still is not
> much more than a smart assembler.
That's what any programming language is. The whole idea of using a high-level
programming language is that it makes the programmer's job easier.
> Actually our files format can be seen
> as a much more sophisticated language than C :)
Sorry, but that is completely false. If our file format does become anywhere
near as complicated as any programming language then we have made a lot of
bad design choices, because it is serving a much more restrictive purpose.
Hell, even the formal specification for Scheme is about 40-50 pages long and
that's as simple a language as you can get.
> I'd personally prefer
> writing it in Pascal which at least has proper string handling. :)
Umm... so does C++. That's what the std::string class is for. You shouldn't
ever need to work with null-terminated strings if you don't want to.
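For example (nothing Kdenlive-specific here, just the standard library doing
the work - the text itself is made up):

#include <string>
#include <iostream>

int main()
{
    std::string name = "scene";                               /* no manual allocation */
    std::string line = name + " start=0.0 end=330.176215";    /* no null terminators to juggle */
    std::cout << line << " (" << line.size() << " characters)" << std::endl;
    return 0;
}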
Also there are plenty of libraries all over the place for doing complex stuff
with strings, but our file format does not need them - that's the point - it
is simple to parse. If it does need to get more complex then we should
change the format to XML and use one of the DOMs out there to do all the
parsing work for us (incidentally, this is what the project file for the GUI
will be doing).
Actually, it might not be a bad idea for the cutting lists as well... sure as
hell would remove all the parsing hassles that we may have: commands and scenes
become tags, parameters become elements, easily extensible... hmmm...
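Just to show how little work that would leave us, a rough sketch using Qt's DOM
classes (QDomDocument is only one possible DOM, and the tag and attribute names
are invented on the spot):

#include <qdom.h>
#include <qstring.h>

/* Walks the top-level elements of a (hypothetical) XML cutting list.
   The "scene"/"start"/"end" names are purely illustrative. */
void parseCuttingList(const QString &xml)
{
    QDomDocument doc;
    doc.setContent(xml);
    QDomElement root = doc.documentElement();
    for (QDomNode n = root.firstChild(); !n.isNull(); n = n.nextSibling()) {
        QDomElement e = n.toElement();
        if (!e.isNull() && e.tagName() == "scene") {
            QString start = e.attribute("start");
            QString end = e.attribute("end");
            /* ...hand the scene's start/end times over to the cutter here... */
        }
    }
}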
> > > Well about the smoothing input of the chroma effect (I'd call it
> > > chromakey) - how would you define it? I mean it's a color value.
> >
> > Ok, here I am thinking that this "smoothing" value determines the range
> > of values over which the chroma works. For example, you specify a blue
> > flatcolor image as the chroma - RGB(0.0, 0.0, 0.8)
> >
> > We could then specify another flatcolor video as the smoothing. A higher
> > value for the color indicates a wider range - effectively, any color with
> > a value over 0.5 would mean that the value would always be chromakeyed to
> > some degree.
> >
> > The advantage of using a color is that we can specify separate smoothing
> > factors for the different parts of the color value. E.g., since we might
> > want a larger range of blue colors to be accepted than their red and
> > green components, then we might specify a flatcolor video for smoothing
> > of say, RGB(0.05, 0.05, 0.2) (meaning a small variation in red and green
> > would be chromakeyed, and a wider variation in blue would be chromakeyed).
> > Smoothing might not be the best word here - threshold might be better.
> >
> > Smoothing would then just be a single value - any chroma effect would get
> > "smeared out" a little depending on smoothing to help remove jaggies
> > around the edge of the effect.
>
> I understand it. However I think we might wait with it for some time. It
> might be too time-consuming to do, but it's definitely something we can
> do in the future.
Nah, but implementing any of the effects we have defined is already "well in
the future" as it is. Except perhaps the flatcolor one :-)
> > > Maybe we should try something like this: a Forth effect. The same
> > > interpreter used in interpolation could calculate the whole effect.
> > > It would be called for every pixel and it gets the pixel values of the
> > > incoming images on the stack and leaves the outgoing pixel on top of
> > > it. When done efficiently that wouldn't be too slow.
> >
> > A simple way to make effects in some semi-interpreted way would be
> > cool.
>
> We can do it that way.
I would suggest that we write the UI, cutting list generator and a working
cutter first though...
> Servus
> Casandro
Incidentally, effects should be separated from the main cutter code, so that
they can be reused by other cutters - I'm thinking that separate libraries may
be a good plan.
Cheers,
Jason
--
Jason Wood
Homepage : www.uchian.pwp.blueyonder.co.uk