MIGS 2025: Industry Insights from APM Music’s Jon Foster
- Ezra Broder
- Nov 18
- 4 min read

For over 20 years, the video game community has gathered in Québec every November for the Montréal International Game Summit. APM's Jon Foster attended this year's conference and shares his thoughts on Canada's longest-running B2B video game event.
What are the most critical takeaways from this year’s event and why?
MIGS 2025 stood out as an incredibly energizing event. It was intimate, well-curated, and full of the kinds of conversations that actually move the industry forward. The venue created a great atmosphere for real connection; those quick coffee run-ins often turned into deeper discussions about where games are headed and how we’re all adapting to new creative and technical realities.
What really impressed me was the quality and focus of the programming. Sessions on creative collaboration in audio development reinforced just how essential cross-discipline teamwork has become, while the panels diving into new AI capabilities sparked thoughtful conversations about both opportunities and challenges ahead. The content struck that rare balance of being practical, future-focused, and creatively inspiring.
As a first-time attendee who also attends larger conferences like GDC, I came away genuinely impressed. MIGS has established itself as a global-caliber event, one that's absolutely worth attending if you're looking to build meaningful relationships and explore new avenues for collaboration in the game industry.
Talk about specific panels you attended and what came out of those.
The “Creative Collaboration in Game Audio Development” panel was a major highlight for me. Hearing Oleksa Lozowchuk (Interleave), Rob Bridgett (SIE), Audrey Dubois (2K/Cloud Chamber), Bénédicte Ouimet (Ubisoft), and Louis-Philippe Dion (Haven Studio) break down the pillars of modern game audio was not only fascinating, but incredibly relevant to how I think about supporting developers with the right music.
A recurring theme was the power of a well-defined creative vision, or a north star, that guides audio decisions from day one. The panelists stressed how crucial it is to articulate that vision early and keep it alive throughout the production cycle, especially as teams scale and new voices join the project. I loved their emphasis on building teams that mix deeply experienced audio leads with newer team members who bring raw passion and fresh thinking. My biggest takeaway was that those early conversations, where abstract, often non-musical ideas get translated into clear musical direction, are absolutely essential. When the entire team shares a unified creative aim, it becomes far easier to keep everyone aligned all the way to launch.
Another session that left an impression was “The Role of Sound in Video Games,” where Jorge Peirano (Gameloft) used practical examples to show how audio shapes player experience on both technical and emotional levels. From directional cues that help players anticipate enemy proximity to musical and sonic moments that heighten intensity, it was a great reminder that audio isn’t just supportive, it’s foundational. Seeing gameplay clips stripped of sound versus fully mixed versions really reinforced how much we rely on audio for immersion, feedback, and satisfaction, especially in interactive spaces.
I also found the “Ubisoft and SOCAN Foundation Screen Music Lab” panel especially inspiring. Simon Landry (Ubisoft) and Julien Boumard Coallier (SOCAN Foundation) shared how the program nurtures emerging composers and gives them real opportunities to score interactive media. It was encouraging to see a major studio invest so intentionally in the next generation of game-music creators, and it underscored how much room there is for new voices in the space.
Finally, “Players as Creators: UGC in the Age of AI,” hosted by Andy Mauro (Storycraft), opened up one of the most forward-looking conversations of the week. The discussion explored what we do (and don’t) classify as UGC today, and showcased just how dramatically AI is reshaping development workflows. Seeing examples of a simple 2D image being transformed into a navigable 3D environment, or a solo developer building an entire game in weeks using AI-generated art pipelines, really challenged everyone’s assumptions about scale, creativity, and production velocity. It was one of the most talked-about panels for good reason.
How does music help games tell powerful, entertaining stories and engage fans?
Music is often the final puzzle piece that elevates a game from “well-made” to genuinely memorable. It delivers the emotional clarity that visuals and gameplay alone can’t fully achieve. In many ways, it becomes the sonic identity of the world, an anchor players return to every time they step back into the experience.
A great score transports us instantly. It can communicate mood, tension, and narrative intention before a single line of dialogue is spoken. We’ve all had that moment entering a new area in a game where, even without seeing a threat, the shift in music tells us danger is near. That subconscious guidance is part of what makes gameplay feel intuitive and alive.
Music also grounds us in the authenticity of the world. Think of the hundreds of diegetic tracks woven into Marvel’s Spider-Man 2 - coming from bars, restaurants, street performers, carnivals - that give Manhattan its lived-in texture. Those details make the city feel real, not just rendered.
And of course, there’s the power of an iconic theme. Whether it’s the unmistakable Norse choir of God of War or the electrifying builds and drops of “Die For You” from Valorant Champions, these musical moments become part of the cultural footprint of the game itself. They hype us up, stay in our heads long after we’ve put the controller down, and shape the emotional memory of the worlds we love.