Which communication protocol is used in AdLib sound card?

By "AdLib sound card" I mean the AdLib Music Synthesizer Card released in 1987.



Wikipedia says that the AdLib uses frequency modulation synthesis to produce sound, but does it use the MIDI communication protocol or some other method?




It's entirely possible that the FM synthesis uses MIDI under the hood to control the FM synthesis chip. The other option would be a specific driver control library which would be linked into the software.



-- From a question on sound.stackexchange.com

Tags: ibm-pc, gaming, sound

asked 13 hours ago by john c. j.

  • This type of device is (was?) well supported in Linux. So the driver details captured in that code-base would tell you pretty much everything.

    – jdv
    13 hours ago

  • Or, if you can find a copy of the "AdLib Music Synthesizer Card Programmer's Manual". My feelings are that MIDI was not used under the hood, as programming this thing was very much a low-level thing, where a handful of specific operators gave you access to oscillators, mixers, etc. (The control was such that you didn't have to use FM if you didn't want to -- there was an additive mode.)

    – jdv
    13 hours ago

  • @jdv Yes, I'm sure that MIDI wasn't used, because it's very easy to find YouTube videos with titles like "AdLib vs. MIDI". One: youtube.com/watch?v=He_mlHj7tOU, another: youtube.com/watch?v=v9snl7f5oms

    – john c. j.
    12 hours ago

  • Sound cards based on Yamaha's OPL2 / OPL3 FM synthesis chips had a predefined set of patch definitions corresponding to the 128 standard General MIDI instruments. The MIDI driver software simply loaded the appropriate data into the chip as required. Computer games typically defined their own set of "non-standard" sounds, but still used MIDI to play the music. MIDI is a very compact data format and there would have been little value in trying to "invent a better wheel" to do the same task.

    – alephzero
    12 hours ago

  • @johnc.j. Both options in the videos use MIDI. They are using the same MIDI file to control two different synthesizers. (Virtually every form of computer music still uses MIDI to control the synths, even if the synth is accessing hundreds of GB of sampled audio to generate the actual sounds.)

    – alephzero
    12 hours ago

2 Answers

As documented by Jeffrey S. Lee, the AdLib simply provides raw programmatic access to its OPL2:




The sound card is programmed by sending data to its internal registers via its two I/O ports: ...



The sound card possesses an array of two hundred forty-four registers; to write to a particular register, send the register number (01-F5) to the address port, and the desired value to the data port.



After writing to the register port, you must wait twelve cycles before sending the data; after writing the data, eighty-four cycles must elapse before any other sound card operation may be performed.
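
To make that register protocol concrete, here is a minimal sketch in DOS-era C. The address/data port pair 0x388/0x389 is the card's conventional base, the delay loops burn the required wait by reading the status port a fixed number of times (a common convention, since ISA port reads take roughly the same time regardless of CPU speed), and the outportb/inportb names are Turbo C's; all of those details are illustrative assumptions rather than anything the quoted text specifies.

    #include <dos.h>   /* outportb()/inportb() as in Turbo C; other DOS compilers differ slightly */

    #define ADLIB_ADDR 0x388   /* address/status port (conventional AdLib base) */
    #define ADLIB_DATA 0x389   /* data port */

    /* Write one value to one OPL2 register, honouring the post-write waits
       by reading the status port a fixed number of times. */
    void opl_write(unsigned char reg, unsigned char val)
    {
        int i;

        outportb(ADLIB_ADDR, reg);        /* select the register */
        for (i = 0; i < 6; i++)           /* short wait after the address write */
            inportb(ADLIB_ADDR);

        outportb(ADLIB_DATA, val);        /* write the value */
        for (i = 0; i < 35; i++)          /* longer wait after the data write */
            inportb(ADLIB_ADDR);
    }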




So, no MIDI, no other high-level format. The card produces nine channels of sound, each of which is the product of two sine-derived functions; you can instead configure it as six of those channels plus five percussion channels.



It supports automatic application of ADSR but otherwise it is a simple modal device. Set the current instrument set, their frequencies and volume envelopes, then they'll play continuously until you tell the card otherwise.
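
As an illustration of that modal behaviour, here is a hypothetical note-on/note-off pair built on the opl_write sketch above. The register layout (0xA0+channel for the low byte of the frequency number, 0xB0+channel for the key-on bit, block and the two high frequency bits) is the OPL2's melodic-channel scheme; the channel's operator and instrument registers are assumed to have been programmed already.

    /* Key a note on one of the nine melodic channels (0..8).
       fnum is the 10-bit frequency number, block the 3-bit octave. */
    void opl_note_on(int ch, unsigned fnum, unsigned block)
    {
        opl_write(0xA0 + ch, fnum & 0xFF);                 /* low 8 bits of the F-number */
        opl_write(0xB0 + ch,
                  0x20 | ((block & 7) << 2) | ((fnum >> 8) & 3));  /* key-on | block | F-number high bits */
    }

    void opl_note_off(int ch)
    {
        /* Clearing the key-on bit starts the release phase; a fuller player
           would keep the block/F-number bits and clear only bit 5. */
        opl_write(0xB0 + ch, 0x00);
    }

Once keyed on, the channel keeps sounding (subject to its envelope) until something clears that bit, which is exactly the "no autonomy" point made below.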



So, unlike MIDI or other formats like it, there's no inherent sequencing or timing of notes — the card has no autonomy in proceeding through music. It just makes the noises you've currently assigned to it.



(and as to implementation of that expensive-sounding audio generation, see this reverse engineering; summary: it's all log tables)

answered 12 hours ago by Tommy, edited 4 hours ago

  • This is correct so far as it goes, but it's only half the story. To produce some "interesting" sounds you need to change the values on the OPL registers under real time control. MIDI has always been used for that purpose, since it is a very compact representation of "music notation". The OPL chip is the "musical instrument," and MIDI is the "performer" on the instrument.

    – alephzero
    12 hours ago

  • @alephzero the question is "Which communication protocol is used in AdLib sound card?"; if you chose to use MIDI, that decision would live entirely in your code over on the CPU and the communication protocol you'd use with the AdLib sound card would still just be setting registers as and when you feel like it. So this is half the story as to how you'd write an AdLib music player, but the entire story as to the functioning of the AdLib card.

    – Tommy
    11 hours ago

  • @CodyGray the only omission was the original author's name, which is now corrected. Fragile vanity makes me observe that, even before editing, a link to the original was provided and the text was explicitly demarcated as a quote.

    – Tommy
    3 hours ago

  • Thanks for the update! Indeed, it was mostly there already.

    – Cody Gray
    1 hour ago

As another answer has said, the OPL2/3 chip on the AdLib is driven by directly writing the registers that control the pitch, volume, and tone of each channel. But how does a game (or other music player) know what values to send when? There are a few different approaches.




  1. Raw data. The IMF format used by Apogee and early Id titles is a typical example of this. The file is a sequence of instructions, where every instruction consists of which register to write, what value to write, and how long to wait before processing the next instruction. This is easy to integrate into a game engine, and uses pretty much the minimal possible amount of CPU on playback. The DRO format used by DOSBox's OPL capture is a slightly more advanced version of this concept that produces smaller file sizes, and there's at least one game in existence that stores its music files on disk as DRO (I know because I wrote the music playback code for it). A playback sketch in this style follows the list.


  2. Raw data with separate instrument definitions. A simple way to save space (and simplify composer tools) is to define an instrument by directly dumping the OPL register values needed to create a given sound, and store the instruments separately from the note data. Then on playback you need commands to set the instrument for a given channel (which loads a bunch of registers from the instrument data), play a note at a given pitch and volume (loads the pitch, volume, and keyon registers), and stop a note (clears the keyon register). The ROL format used by AdLib themselves is an example of this method.


  3. MIDI. If you have instrument definitions for each of the 128 General MIDI instruments (you could get these from AdLib, or someone else, or make your own), and you parse MIDI events and do a little bit of conversion between the value ranges used by MIDI and the value ranges used by AdLib, you can play MIDI files. The Windows 3.x through 9x MIDI Mapper driver did this, as did several standalone DOS MIDI players and DOS games.


  4. CMF (Creative Music Format) was a very popular format for AdLib music, which is almost MIDI, except that the headers are different (it isn't recognized as a MIDI file), and each file contains its own set of instrument definitions (like option #2) instead of using a standardized set of General MIDI patches. Program numbers in the MIDI data correspond to the built-in instruments rather than GM instruments. The "SBFMDRV" music driver provided by Creative Labs supports playback of CMF, and a number of games make use of it.


  5. Anything else. Since the programming interface to the OPL chip is so simple and low-level, other things are possible that I don't have time to write about. Procedurally generated music, using AdLib for sound effects rather than music by tweaking the register values in real time, AdLib S3M modules, etc., etc.
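
As a sketch of the raw-data approach from option 1, the loop below steps through an array of (register, value, delay) records once per timer tick. The record layout and tick handling show the general shape such players take rather than any particular format's exact specification (IMF, DRO and friends differ in field widths and tick rates), and opl_write is the hypothetical register-write helper sketched under the previous answer.

    #include <stddef.h>

    void opl_write(unsigned char reg, unsigned char val);   /* see the earlier sketch */

    /* One raw-stream event: which register, what value, and how many
       player ticks to wait before processing the next event. */
    struct opl_event {
        unsigned char  reg;
        unsigned char  val;
        unsigned short delay;
    };

    /* Call once per player tick (e.g. from a timer interrupt handler). */
    void raw_player_tick(const struct opl_event *song, size_t count,
                         size_t *pos, unsigned *wait)
    {
        if (*wait > 0) {             /* still sitting out the previous delay */
            (*wait)--;
            return;
        }
        while (*pos < count) {
            const struct opl_event *e = &song[(*pos)++];
            opl_write(e->reg, e->val);
            if (e->delay > 0) {      /* a non-zero delay ends this tick's work */
                *wait = e->delay - 1;
                return;
            }
        }
    }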

answered 8 hours ago by hobbs

  • If I could, I would upvote twice :-)

    – john c. j.
    7 hours ago

  • Option 2 is very similar to what you typically do for SID music (c64 et al) :)

    – Felix Palmen
    55 secs ago