Call for ideas: Music Stream applications.


Erik Sandberg
Hi all,

A couple of days ago, I mentioned my thesis on music streams. The response was
largely positive, which I'm happy about.

A brief summary:
I'm writing a master's thesis in which I'm implementing something called Music
Streams for LilyPond. A music stream can be seen as a normalisation /
simplification of a music expression: it contains only Events, and the events
have been assigned to contexts. There are also special events for setting
properties, creating contexts, incrementing time, and so on. Music streams
completely separate the music-expression parser from the formatter, so you
can convert a .ly file to a music stream, look at the stream, tweak it,
whatever, and then load it into the formatter. The music stream is still a
music description format, just like .ly (it contains no layout information).

We do not yet know whether music streams will make it into LilyPond; what I'm
doing is an experiment. So please don't expect anything.

That said, I'm looking for potential applications of music streams. This is
relevant for choosing adequate data structures.

The most important applications we've found so far are:
- A cleaner implementation of \partcombine and friends.
- Loading/saving music streams to disk, to use them as a stable music
representation format. It would be easier to maintain version compatibility
for music streams than for .ly files.
- Converters from other formats (MusicXML, MIDI) to music streams, together
with a good stream->ly converter. This could save some duplicated work.
- A stream->MusicXML converter.
- Some GUI-related features (not very important now; a GUI requires lots of
additional work in other areas before it can be done).
- Custom music stream filters. You could write a Scheme function that does
some analysis of all music inside a context and tweaks it based on that.
Clever page breaks are one example: the analysis would look for repeats and
long rests, and the tweaks would insert page-breaking penalties (see the
sketch below).
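
To illustrate, here is a very rough sketch, in Scheme, of what such a filter
could look like. None of these names exist; the stream API has not been
designed yet, so event-class, full-measure-rest?, event-moment, insert-event!
and the breakPenalty property are all invented for the sake of the example:

  ;; Sketch only: every function and property name below is invented.
  ;; Idea: walk the stream, and after each whole-measure rest insert a
  ;; SetProperty-style event that makes a page break cheaper at that point.
  (define (add-break-penalties stream)
    (for-each
     (lambda (ev)
       (if (and (eq? (event-class ev) 'MusicEvent)
                (full-measure-rest? ev))
           (insert-event! stream
                          (event-moment ev)
                          '((class . SetProperty)
                            (symbol . breakPenalty)
                            (value . -100)))))
     (stream-events stream))
    stream)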

I'm looking for more applications, especially ideas on what stream filters can
be used for. I'm interested in applications where a music stream-based
solution is clearly better than using music functions or applycontext.

To give a rough idea of what a music stream is, here is a simple example of
what it could look like:

\score { << \new Voice { c2 d e f } \addlyrics { x y z w } >> }
=>
#<Stream_event (stream-event (0 ((id . \new) (type . Score) (ops) (unique . 1)
(class . CreateContext) (context . 0) (name . StreamEvent)))) >
#<Stream_event (stream-event (1 ((id . \new) (type . Staff) (ops) (unique . 2)
(class . CreateContext) (context . 1) (name . StreamEvent)))) >
#<Stream_event (stream-event (2 ((id . \new) (type . Voice) (ops) (unique . 3)
(class . CreateContext) (context . 2) (name . StreamEvent)))) >
#<Stream_event (stream-event (2 ((id . uniqueContext0) (type . Voice) (ops)
(unique . 4) (class . CreateContext) (context . 2) (name . StreamEvent)))) >
#<Stream_event (stream-event (1 ((id . \new) (type . Lyrics) (ops) (unique .
5) (class . CreateContext) (context . 1) (name . StreamEvent)))) >
#<Stream_event (stream-event (0 ((moment . #<Mom 0>) (class . Prepare)
(context . 0) (name . StreamEvent)))) >
#<Stream_event (stream-event (4 ((music . #<Music NoteEvent>) (class .
MusicEvent) (context . 4) (name . StreamEvent)))) >
#<Stream_event (stream-event (5 ((music . #<Music LyricEvent>) (class .
MusicEvent) (context . 5) (name . StreamEvent)))) >
#<Stream_event (stream-event (0 ((class . OneTimeStep) (context . 0) (name .
StreamEvent)))) >
#<Stream_event (stream-event (0 ((moment . #<Mom 1/2>) (class . Prepare)
(context . 0) (name . StreamEvent)))) >
#<Stream_event (stream-event (4 ((music . #<Music NoteEvent>) (class .
MusicEvent) (context . 4) (name . StreamEvent)))) >
#<Stream_event (stream-event (5 ((music . #<Music LyricEvent>) (class .
MusicEvent) (context . 5) (name . StreamEvent)))) >
#<Stream_event (stream-event (0 ((class . OneTimeStep) (context . 0) (name .
StreamEvent)))) >
#<Stream_event (stream-event (0 ((moment . #<Mom 1>) (class . Prepare)
(context . 0) (name . StreamEvent)))) >
#<Stream_event (stream-event (4 ((music . #<Music NoteEvent>) (class .
MusicEvent) (context . 4) (name . StreamEvent)))) >
#<Stream_event (stream-event (5 ((music . #<Music LyricEvent>) (class .
MusicEvent) (context . 5) (name . StreamEvent)))) >
#<Stream_event (stream-event (0 ((class . OneTimeStep) (context . 0) (name .
StreamEvent)))) >
#<Stream_event (stream-event (0 ((moment . #<Mom 3/2>) (class . Prepare)
(context . 0) (name . StreamEvent)))) >
#<Stream_event (stream-event (4 ((music . #<Music NoteEvent>) (class .
MusicEvent) (context . 4) (name . StreamEvent)))) >
#<Stream_event (stream-event (5 ((music . #<Music LyricEvent>) (class .
MusicEvent) (context . 5) (name . StreamEvent)))) >
#<Stream_event (stream-event (0 ((class . OneTimeStep) (context . 0) (name .
StreamEvent)))) >
#<Stream_event (stream-event (0 ((moment . #<Mom 2>) (class . Prepare)
(context . 0) (name . StreamEvent)))) >
#<Stream_event (stream-event (0 ((class . OneTimeStep) (context . 0) (name .
StreamEvent)))) >
#<Stream_event (stream-event (4 ((class . RemoveContext) (context . 4) (name .
StreamEvent)))) >
#<Stream_event (stream-event (0 ((class . Finish) (context . 0) (name .
StreamEvent)))) >

--
Erik


Re: Call for ideas: Music Stream applications.

Trevor Bača
On 8/28/05, Erik Sandberg <[hidden email]> wrote:

> [full quote of Erik's original message snipped]

Hi Erik,

I think the concept of the music stream is a fantastic idea, for all of the
reasons you give in your list of examples, particularly the point that a
normalized input format would be more stable between releases than, say, .ly
files can be expected to be. Even a cursory scan of the archives shows that
this added stability (which, it seems, would engender a more robust version
of convert-ly) would benefit users considerably.

I have not yet taken advantage of \partcombine and related functions,
but it seems apparent that a reimplementation using music streams
would be *much* simpler than the current implementation. After a
couple of releases, such a reimplementation would presumably be more
stable, which is a direct user benefit. I ran into a minor bug with
\setAssociatedVoice the other day when engraving a song with
multivoice alternatives in the vocal part; it seems clear to me that
\setAssociatedVoice would be another good candidate for a more stable,
powerful reimplementation under music streams (just as the thesis
analyzes for \lyricsto).

Other applications for music streams occur to me but I'd like to ask
two questions that I've been mulling over before I hazard more
guesses:

1. Will music streams represent all the "default" items that LilyPond
inserts *after* parsing an input .ly file, such as clefs, time signatures
and the like?

2. The example music stream included here shows (if I'm reading correctly)
"what is connected to what" primarily by stream events sharing the same
music moment (in the vertical direction) or by sharing the same context (in
the horizontal direction); i.e., a NoteEvent and a LyricEvent connect to each
other (in the vertical direction) by virtue of sharing the same music moment,
while two notes in the same voice are connected to each other by sharing the
same context. First, are these correct assumptions? Second, will music
streams represent information about "what is connected to what" in ways in
addition to voice and moment concurrency? Examples might include: tempo
indications connecting (left aligning, say) with time signatures (not
possible explicitly under the current Lily implementation, though Han-Wen is
willing to take on the feature under sponsorship); voices wandering over two
or three different staves, as frequently happens in piano music (which is
already possible under the current Lily implementation using \change);
hairpin start or stop points connecting to non-moment-based items like
barlines (also not currently possible, but again I think there's been
discussion on the list). To this list, of course, you could add the point
you brought up on the mailing list the other day about multistaff chords.

Very exciting so far; let me know what you think about the two questions above.



Trevor Bača
[hidden email]

Re: Call for ideas: Music Stream applications.

Erik Sandberg
On Sunday 28 August 2005 15.53, Trevor Baca wrote:

> On 8/28/05, Erik Sandberg <[hidden email]> wrote:
>
> I have not yet taken advantage of \partcombine and related functions,
> but it seems apparent that a reimplementation using music streams
> would be *much* simpler than the current implementation. After a
> couple of releases, such a reimplementation would presumably be more
> stable, which is a direct user benefit. I ran into a minor bug with
> \setAssociatedVoice the other day when engraving a song with
> multivoice alternatives in the vocal part; it seems clear to me that
> \setAssociatedVoice would be another good candidate for a more stable,
> powerful reimplementation under music streams (just as the thesis
> analyzes for \lyricsto).
(I think that analysis is obsolete btw)

> Other applications for music streams occur to me but I'd like to ask
> two questions that I've been mulling over before I hazard more
> guesses:
>
> 1. Will music streams represent all the "default" items that LilyPond
> inserts *after* parsing an input .ly file, such as clefs, time signatures
> and the like?

It represents the same information that a music expression currently does.

AFAICR, time signatures and clefs are handled internally by setting
properties, which an engraver later uses to produce the symbols. Music
streams will contain SetProperty events that set those properties
accordingly, but they will contain no explicit clefs or time signatures.
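
For example, a \time 3/4 could end up as something roughly like the event
below, written in the same alist style as the dump in my first mail; the
field names symbol and value are only my working guesses, not a fixed format:

  ;; Rough sketch, field names not final: a SetProperty stream event that
  ;; sets the timeSignatureFraction property of context 2 to 3/4.
  (stream-event
   (2 ((symbol . timeSignatureFraction)
       (value . (3 . 4))
       (class . SetProperty)
       (context . 2)
       (name . StreamEvent))))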

> 2. The example music stream included here shows (if I'm reading correctly)
> "what is connected to what" primarily by stream events sharing the same
> music moment (in the vertical direction) or by sharing the same context (in
> the horizontal direction); i.e., a NoteEvent and a LyricEvent connect to
> each other (in the vertical direction) by virtue of sharing the same music
> moment, while two notes in the same voice are connected to each other by
> sharing the same context. First, are these correct assumptions?

Yes.

> Second, will music streams represent information
> about "what is connected to what" in ways in addition to voice and
> moment concurrency?

No.

I've been thinking about supporting 'comment' events, which would just be
arbitrary hints, ignored by the typesetter.

This could be useful for converters (e.g. music stream => MusicXML), where
each lyric _must_ be connected to a note. It could also be useful for
music stream => ly conversion, e.g. to hint that a Lyrics context can be
represented with \lyricsto.
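
For example, a hint telling a converter which Voice a Lyrics context follows
might look roughly like this; none of the field names are decided, it is
only meant to show the flavour:

  ;; Sketch only: a Comment event hinting that the lyrics in context 5
  ;; follow the notes in context 4.  Field names are placeholders.
  (stream-event
   (5 ((hint . associated-voice)
       (target . 4)
       (class . Comment)
       (context . 5)
       (name . StreamEvent))))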

> Examples might include: tempo indications
> connecting (left aligning, say) with time signatures (not possible
> explicitly under the current Lily implementation, though Han-Wen is willing
> to take on the feature under sponsorship);
That decision belongs to a later step in processing, I think. My guess would
be that alignment tweaking like that would be done through the spanner's
grob properties.

In any case, whatever can be represented in a music stream should always be
representable in .ly, and vice versa, so any new alignment feature must be
representable in .ly as well. Music streams don't (and can't) change the
model LilyPond uses to describe music.

> voices wandering over two
> or three different staves, as frequently happens in piano music (which is
> already possible under the current Lily implementation using
> \change);
This is represented by letting the events belong to a Voice, and by inserting
ChangeStaff stream events that make that voice move across staves.
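
Such an event could look roughly like this (again just a sketch; the staff
field name is a placeholder):

  ;; Sketch only: a ChangeStaff event that moves the Voice in context 4 to
  ;; the Staff with context id 3, corresponding to \change Staff in .ly.
  (stream-event
   (4 ((staff . 3)
       (class . ChangeStaff)
       (context . 4)
       (name . StreamEvent))))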

> To this list, of course, you
> could add the point you brought up on the mailing list the other day
> about multistaff chords.

That is also not particularly related to music streams, AFAICS. The problem
with those chords is that one voice belongs to two staves at once, which
breaks the tree hierarchy.

--
Erik

