Now that a number of MEGA-mini MPIs have made their way out into the world, it’s time to start writing some software to support their extended features, and to help with further development. One of these features is the YMF-262 (OPL3) chip included in the MEGA-mini.
Though VGM-style music is pretty easy to prepare and play back using a few cross-platform tools and a player that I wrote for the CoCo, I wanted to try something a little different: something that would let you use the CoCo itself to compose music for the chip.
Thinking about how I might want to go about this, I decided on some features.
- would run on a single-speed (0.89 MHz) CoCo 1/2
- would use the CoCoVGA’s 64 column text mode
- would generate sound data designed to play from interrupt
- would keep the resulting data as compact as possible (without using compression)
- would use a ‘tracker-like’ interface
One of the first things I determined was the general form the data would take. Using a tracker-like format, the song pattern is 14 channels (columns) wide and uses one line per interrupt period to display the events for each channel.
I determined that at a minimum, we would need to define the following events (in the data) in order to operate the YMF-262…
- a ‘start interrupt period’ indicator per line (IRQ period)
- a ‘key-on’ event – the command to start playing a note
- a ‘key-off’ event – the command to stop playing a previously issued ‘key-on’
- an ‘end-of-song’ indicator
Deciding to structure these events as a series of commands, I came up with the following command (event) format. Commands can be of variable byte size, and are arranged in channel order (0-13) after an IRQ indicator. The upper nibble of a command byte indicates the command. The lower nibble indicates the channel the command is for. If the command requires additional data, it will immediately follow the command byte.
$Fx – start of IRQ period.
Data between this marker and the next ‘$Fx’ byte is the event data for this period, appearing in channel order (0-13). Data is only present for a channel if there is an event for the period.
$Ex – ‘key-on’ event. This is a two-byte command.
The lower nibble is the channel number. A note/octave byte follows, with the note encoded in the upper nibble (A-G = 1-7) and the octave number (0-7) in the lower nibble.
$Dx – ‘key-off’ event. One byte.
The lower nibble is the channel (0-13).
$Cx – ‘song-end’
Indicates the end of song data.
These are the possible events I’m implementing to start with in building the composer. This scheme leaves room for eleven other commands to be defined at some later time if needed, but for now, this gives us something to work with.
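To make the format concrete, here’s a minimal sketch of an encoder for this byte stream in Python. The constant and function names are my own for illustration, not from the actual composer, and I’m assuming the lower nibble of the ‘$Fx’ marker is unused:

```python
# Command bytes from the format above: upper nibble = command, lower = channel.
IRQ_MARK = 0xF0   # $Fx - start of IRQ period
KEY_ON   = 0xE0   # $Ex - key-on, followed by a note/octave byte
KEY_OFF  = 0xD0   # $Dx - key-off
SONG_END = 0xC0   # $Cx - end of song data

NOTE_NUM = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5, "F": 6, "G": 7}

def key_on(channel, note, octave):
    """Two-byte key-on: command byte, then note (upper nibble)/octave (lower)."""
    return bytes([KEY_ON | channel, (NOTE_NUM[note] << 4) | octave])

def key_off(channel):
    """One-byte key-off for the given channel (0-13)."""
    return bytes([KEY_OFF | channel])

def irq_period(*events):
    """One IRQ period: $Fx marker, then events in channel order (0-13)."""
    return bytes([IRQ_MARK]) + b"".join(events)

# Two periods: channel 0 starts a C in octave 4, then releases it next tick.
song = irq_period(key_on(0, "C", 4)) + irq_period(key_off(0)) + bytes([SONG_END])
```

With this encoding, the example song above is just six bytes: `F0 E0 34 F0 D0 C0`, which shows how compact the stream stays when most channels are silent in a given period.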
The byte stream used in the composer during editing is stored in the same arrangement as the final generated data, in order to use the available system memory as efficiently as possible.
With the basic data format settled on, we move on to the user interface, laying down the basic structure of the pattern editor screen.
The editor screen
The layout of the editor screen is shown above. You can see the pattern laid out by line number (IRQ period) and sound channel. Fourteen of the available eighteen 2-operator channels on the YMF-262 are used, due to screen width. Fourteen is quite a few instruments to work with, though. The remaining four can be used for additional in-program sound effects, depending on the software (game or whatnot) using the generated song.
The following video shows some of the initial progress. Pattern editor window display, navigation, and IRQ based parsing and playback of the data have been implemented. Next I’ll be working on the user interface for entering events in the editor.
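The IRQ-based parsing mentioned above can be sketched the same way: a decoder that walks the stream, treating each ‘$Fx’ marker as the start of a new tick. This is only an illustration of the format in Python, not the actual player code, and the event-tuple shape is my own invention:

```python
def decode(song):
    """Walk the byte stream, yielding (period, event, channel, note, octave)."""
    period = -1
    i = 0
    while i < len(song):
        cmd, ch = song[i] & 0xF0, song[i] & 0x0F
        if cmd == 0xF0:          # $Fx - start of a new IRQ period
            period += 1
            i += 1
        elif cmd == 0xE0:        # $Ex - key-on; note/octave byte follows
            note_oct = song[i + 1]
            yield (period, "key-on", ch, note_oct >> 4, note_oct & 0x0F)
            i += 2
        elif cmd == 0xD0:        # $Dx - key-off
            yield (period, "key-off", ch, None, None)
            i += 1
        elif cmd == 0xC0:        # $Cx - end of song
            return
        else:
            raise ValueError(f"unknown command ${song[i]:02X}")
```

On the CoCo, the real player would do the equivalent work inside the interrupt handler: consume events until the next ‘$Fx’ byte, write the corresponding YMF-262 registers, then return until the next tick.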
Off to a good start
Since the last update I’ve been working on implementing some of the interface features for placing and removing events. I have a method of placing notes by octave and note in the pattern, and was even able to grab a piece of sheet music off the internet and enter a recognizable tune into the first channel in just a few minutes.
I’ll be working on ways to make entering events quicker and easier as I work on other features of the program, such as loading and saving sequences to the CoCo SDC. My plan right now is to save the data out into files on the FAT volume that’ll be easy to take into a modern machine to use in cross-development environments for putting together CoCo software.
But before I implement loading/saving, I think it’s time to nail down how timing will be used in the composer. In particular, I’d like to make it easier to transcribe sheet music into the pattern. This should also make it easier for actual musicians who are accustomed to reading and writing music to work with. After some study of how timing information is encoded into sheet music, I think I’ve settled on how to handle timing.
My understanding is now that 4/4 is the most commonly used time signature, and so we’ll go with that as we move along. The 4/4 time signature indicates 4 quarter notes per measure. Song tempo is measured in beats per minute (BPM) and relates to our notes in that it gives us our note (and rest) durations, which are measured in beats.
From this, we’re able to calculate the duration of a note in seconds. For instance, a quarter note in a 120 BPM song would last half a second (120/60 = 2 beats per second).
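That calculation is simple enough to sketch directly (function name is mine, for illustration):

```python
def note_seconds(bpm, beats):
    """Duration in seconds of a note lasting `beats` beats at the given tempo."""
    return beats * 60.0 / bpm

# In 4/4 a quarter note is one beat, so at 120 BPM:
quarter = note_seconds(120, 1)       # 0.5 seconds
whole   = note_seconds(120, 4)       # 2.0 seconds
```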
Different pieces of music use all sorts of different tempos (bpm), so our composition software must take that into account. In order for the timing of our pattern based system to closely follow the usual system for music, we’ll use beats for our timing as well.
Notes can go down to very short durations, as little as a 256th note, or 1/64th of a beat. This is probably a little impractical for what we’re trying to do here, as we are also trying to minimize (or at least be reasonable with) storage and playback requirements of music on the CoCo.
Take, for example, a song at 120 BPM. A whole note would last 2 seconds, and a 256th note 0.0078125 seconds (1/128 of a second). I think a more reasonable lower limit on note duration for our purposes is a 32nd note (1/8 of a beat), which will also be more in line with human hearing perception at the tempos we’ll likely be using.
So, going from that reasoning, our shortest duration, measured in beats, will be 1/8 of a beat. Since our composer operates on interrupt periods, this means 8 IRQs per beat will be the basis of our timing, from which we’ll calculate the timer setting for our IRQs.
In other words, from the user’s perspective, you’ll set your beats per minute, and then just remember that each line is 1/8 of a beat. The BPM will be adjustable to make it easy to transcribe and write music.
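Putting the pieces together, the IRQ rate follows directly from the tempo. A quick sketch of the math (again, my own function names):

```python
IRQS_PER_BEAT = 8  # one pattern line = 1/8 beat (a 32nd note in 4/4)

def irq_rate_hz(bpm):
    """IRQ frequency needed for 8 pattern lines per beat."""
    return bpm / 60.0 * IRQS_PER_BEAT

def irq_period_ms(bpm):
    """Milliseconds between IRQs, i.e. the timer setting we need to hit."""
    return 1000.0 / irq_rate_hz(bpm)

# At 120 BPM: 2 beats/second * 8 lines/beat = 16 IRQs/second,
# or one pattern line every 62.5 ms.
```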
*** more to come ***