
Tag: Game Audio

Collective Creativity in Game Design & Music

Games and sound go together like macaroni and cheese. Sound is a large part of what makes games fun. So, what are some of the hot-button issues that game designers need to know about the process of creating game sound — and what do audio folks need to know about game design and production? Good questions!

This roundtable brings together composers, sound designers, game designers, animators, programmers, producers, and anyone else involved in the production process. Organizers open up the floor and discuss the latest and greatest trends in games, as well as the most common production roadblocks. Let’s come together to develop a deeper understanding of process and pipeline.

 

Cooperative Design Roundtable: Collective Creativity in Game Design & Music (Presented by the IGDA)

Steve Horowitz (Audio Director, IASIG/GAI)

Location: Room 211, South Hall

Date: Wednesday, March 20

Time: 5:00 pm – 6:00 pm

30th Annual IASIG Town Hall

Since its inception in 1994, the IASIG has been driven by audio-focused volunteers with the goal of uniting the audio hardware and software industries with creators, researchers, and educators to advance the state of the art for the industry.

Want to improve game audio workflow or the professional audio tools available to you?

Join fellow game audio educators, sound designers, composers, audio programmers, and other audio professionals at the Interactive Audio Special Interest Group (IASIG) Town Hall.

The IASIG operates as a Special Interest Group within both The MIDI Association and the International Game Developers Association, representing game audio developers in those organizations.

Steve Horowitz (Audio Director, IASIG/Game Audio Institute)

Tom Poole (Director, JUCE / Audio Developers Conference)

Dave Berol (Audio / Voice Director, Surfaceink Product Design & Development)

Austin Smith (Content Manager, Sound Designer & Composer, Game Audio Institute/Blackwater Soundworks)

Pat Scandalis (CEO/CTO, moForte Inc., MIDI Association MPE WG Chair)

Location: Room 3002, West Hall

Date: Thursday, March 21

Time: 12:45 pm – 1:45 pm

Pass Type: All Access Pass, Core Pass, Summits Pass, Expo Pass, Audio Pass, Indie Games Summit Pass – Get your pass now!

Game Developers Conference

The 38th annual Game Developers Conference (GDC), taking place March 18-22 at the Moscone Convention Center in San Francisco, delivers the game industry’s greatest source of content and connections—with innovative exhibitors, awe-inspiring technology, and powerful talks led by game developers who live and breathe this industry.

Game devs and industry experts who attend GDC 2024 can expect the return of many activities and events that make the Game Developers Conference the place to be for game developers. Attendees can attend a comprehensive selection of lectures, panels, and roundtable discussions, tour the GDC Expo to check out companies’ latest products and solutions, and head to the GDC Play area to try the newest indie sensations before they come out.

In addition, the Independent Games Festival and the Game Developers Choice Awards will take place live on Wednesday, March 20—with a simultaneous broadcast in the event app and on the GDC Twitch channel.

Orchestral Articulation Profile NAMM 2024

There are many orchestral sample libraries in the market, and they are essential for film scoring, game audio, studio, and live MIDI applications. These orchestral libraries have many kinds of articulations.  

For example, a string library might have a different set of samples for every articulation including marcato, staccato, pizzicato, etc.

However, there is no industry-standard method: the way these articulations are selected differs from developer to developer. Many developers use notes at the lower end of the MIDI note range for “key switching,” but the actual keys used vary between developers. Some developers use CC messages to switch between articulations, but again there is no industry-wide consistency. Some plugin formats now support per-note selection of articulations, but the method for inputting that data differs from application to application.
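The inconsistency described above can be illustrated with two hypothetical keyswitch maps — the library names, note assignments, and articulation labels here are invented for illustration, not taken from any real product:

```python
# Two hypothetical vendors' keyswitch maps: the same articulations
# live on different low MIDI notes, so the same musical intent
# requires different MIDI data per library.
LIBRARY_A_KEYSWITCHES = {24: "sustain", 25: "staccato", 26: "pizzicato"}  # C1, C#1, D1
LIBRARY_B_KEYSWITCHES = {0: "sustain", 2: "pizzicato", 4: "staccato"}     # C-2, D-2, E-2

def keyswitch_for(library: dict, articulation: str) -> int:
    """Return the MIDI note number that selects `articulation` in a library's map."""
    for note, name in library.items():
        if name == articulation:
            return note
    raise KeyError(articulation)

# Selecting pizzicato sends note 26 in one library but note 2 in the other:
print(keyswitch_for(LIBRARY_A_KEYSWITCHES, "pizzicato"))  # 26
print(keyswitch_for(LIBRARY_B_KEYSWITCHES, "pizzicato"))  # 2
```

A part programmed against one library's keyswitches therefore triggers the wrong samples (or stray low notes) when pointed at another library — exactly the problem a standard Profile aims to remove.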

It is the goal of the MIDI-CI Profile for Note On Orchestral Articulation to provide a consistent way to encode articulation information directly in the MIDI 2.0 Note On message, using the Attribute Type and Attribute Data fields.
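To make the encoding concrete, here is a minimal sketch of packing a MIDI 2.0 (Universal MIDI Packet) Note On with its Attribute Type and Attribute Data fields. The field layout follows the MIDI 2.0 UMP specification; the articulation code value used in the example is hypothetical, not an actual value defined by the Profile:

```python
def note_on_with_articulation(group, channel, note, velocity16, attr_type, attr_data):
    """Pack a MIDI 2.0 (UMP) Note On as two 32-bit words.

    Word 0: message type 0x4, group, status 0x9, channel, note number, Attribute Type.
    Word 1: 16-bit velocity, 16-bit Attribute Data.
    """
    word0 = ((0x4 << 28) | ((group & 0xF) << 24) |
             (0x9 << 20) | ((channel & 0xF) << 16) |
             ((note & 0x7F) << 8) | (attr_type & 0xFF))
    word1 = ((velocity16 & 0xFFFF) << 16) | (attr_data & 0xFFFF)
    return word0, word1

ATTR_PROFILE_SPECIFIC = 0x02  # Attribute Type 0x02 = Profile Specific (MIDI 2.0 spec)
PIZZICATO = 0x0007            # hypothetical code; real values are defined by the Profile

# Middle C at half velocity, tagged with an articulation in the Note On itself:
w0, w1 = note_on_with_articulation(0, 0, 60, 0x8000, ATTR_PROFILE_SPECIFIC, PIZZICATO)
print(f"{w0:08X} {w1:08X}")  # 40903C02 80000007
```

Because the articulation rides inside the Note On message itself, no separate keyswitch note or CC message is needed, and the selection applies unambiguously to that one note.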

In arriving at this Profile, a study was made of orchestral instrument families, choir, big band instruments, guitar, keyboard instruments, and various non-western instruments to evaluate the degree to which they share common performance attributes and sound production techniques. Notation symbols and performance indications were also considered to determine, for example, how successfully a violin note marked with a trill might result in a musically meaningful or analogous articulation when the part is copied to an instrument as far afield as timpani—all without the composer having to re-articulate the timpani part, at least initially.

The Profile provides a comprehensive yet concise system of articulation mapping that includes a wide palette of articulation types and supports articulation equivalence across eight instrument categories.

The Profile was designed to offer articulation equivalence — a system of articulation mapping that allows a passage articulated for one instrument to be copied to another track and played back with an equivalent or analogous articulation, regardless of the target instrument type.
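The equivalence idea can be sketched as a lookup from an abstract articulation to the closest analogous technique per instrument category. The category names and mappings below are illustrative assumptions, not the Profile's actual tables:

```python
# Hypothetical equivalence table: one abstract articulation maps to the
# nearest analogous playing technique for each instrument category.
EQUIVALENCE = {
    "trill":     {"strings": "trill",     "timpani": "roll",     "choir": "trill"},
    "pizzicato": {"strings": "pizzicato", "timpani": "staccato", "choir": "staccato"},
}

def translate(articulation: str, target_category: str) -> str:
    """Map an articulation to its analogue in the target category,
    falling back to the original name when no mapping is defined."""
    return EQUIVALENCE.get(articulation, {}).get(target_category, articulation)

# A violin trill copied to a timpani track plays back as a roll:
print(translate("trill", "timpani"))  # roll
```

In this scheme a composer copies the articulated violin part to the timpani track unchanged, and the receiving instrument resolves each articulation to something musically meaningful for itself.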

When implemented by sample library developers, the Profile will aid composers in two significant ways.

First, it will simplify the process of substituting or layering sounds from the same or different sample libraries. Second, it will allow composers to quickly audition and orchestrate unison passages by copying an articulated part to other tracks and hearing it play back with equivalent or analogous articulations.

Join us on YouTube for The Orchestral Articulation Profile at NAMM 2024.
