Fun with Abstractions: Multimedia Edition

If you're just joining us, you may want to catch up on previous emails.

Today we start talking about the elephant in the room. How do we get our game server to communicate and coordinate with the lighting, sound, and video control consoles for the game? These theatrical design elements were very intricate, and we needed bidirectional communication. That is to say, events that happened in the software needed to be able to trigger cues in the theatrical controls, and manual cues in the theatrical controls needed to be able to trigger events in our software.

Let's look at this problem and decompose it a bit. Taken at face value, we have the following devices in the room that all need to talk to each other:

  • 1 Game server
  • 4 Terminal Puzzles
  • 1 Pretend supercomputer bent on global destruction
  • 1 Lighting console
  • 1 Sound console
  • 1 Video console

But we can break these into two groups:

Game Controls

  • 1 Game server
  • 4 Terminal Puzzles
  • 1 Pretend supercomputer bent on global destruction

Theatrical Controls

  • 1 Lighting console
  • 1 Sound console
  • 1 Video console

When we look at it this way, the problem starts to come into focus. There's already precedent for theatrical controls sending cues to each other: they can communicate using an old but very mature protocol called MIDI. And we've already determined that our game controls can communicate via WebSockets. So what we really have is two groups of machines that need to talk to each other, and our problem becomes one of translation. We've effectively reduced our big unknown to the question: "How do we translate between WebSocket events and MIDI messages?"
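To make the translation problem concrete, here's a minimal sketch in Python of what one direction might look like. Everything here is invented for illustration: the event names, the note mapping, and the function itself are assumptions, not our actual implementation. The one real detail is the MIDI wire format, where a Note On message is three bytes: a status byte, a note number, and a velocity.

```python
import json

# Hypothetical mapping from game events to MIDI note numbers.
# The real cue assignments would come from the lighting, sound,
# and video designers.
EVENT_TO_NOTE = {
    "puzzle_solved": 60,
    "game_over": 61,
}

def websocket_event_to_midi(message: str) -> bytes:
    """Translate a JSON WebSocket event into a raw MIDI Note On message.

    A MIDI Note On is three bytes: status (0x90 | channel),
    note number (0-127), and velocity (0-127).
    """
    event = json.loads(message)
    note = EVENT_TO_NOTE[event["type"]]
    channel = 0    # MIDI channel 1
    velocity = 127 # full velocity
    return bytes([0x90 | channel, note, velocity])

# A terminal puzzle reports completion over the socket,
# and out comes something a console could listen for:
print(websocket_event_to_midi('{"type": "puzzle_solved"}').hex())
```

The other direction, MIDI in to WebSocket event out, would be the mirror image of this lookup. The interesting part, which we'll get to, is where this translation layer actually lives.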

Any ideas?

More tomorrow.


Did you like this?

I send a daily email with tips and ideas like this one. Join the party!
