29 May 2025

Nordic Game Conference 2025 - Day 2

Yes! I had the opportunity again to visit a conference! Love to do those! I went to NGC back in 2017, so I went with fond memories of last time and I can say this edition was certainly just as good.


Unfortunately, we had to miss the very first day because Dries, my colleague at DAE, and I had to teach our last class of the semester that Tuesday. So we flew into Copenhagen on Tuesday, took the train to Malmö, and settled into our hotel, which was right next to the venue, Slagthuset. By sheer coincidence our good friends and ex-colleagues from Die Keure were attending the conference as well, so we met up with them and went for a drink at Mello Yello, a bar that offered a discount for everyone with a conference badge - a smart move, because the bar was filled to the brim while the others on the same square were as good as empty.

(Small disclaimer: I am very bad at remembering to take pictures, so there aren't too many of those.)

The next morning we went to pick up our badges and start our first conference day: NG25 Spring, day 2!

After the introduction by Matilda Smedius (a voice actor mostly known for her role as Brigitte in Overwatch) we stayed for the first talk.

Essential AI systems for combat games

This talk by Marie Meyerwall was, I think, targeted at the students in the audience, since it was basically a game AI 101: an overview of the algorithms and systems used in gameplay AI, like hierarchical finite state machines, decision trees, GOAP, pathfinding, etc. I imagine it would be a brilliant introduction for our Gameplay Programming course at DAE, where we indeed cover all these topics. My main takeaways were:

  • Make arenas round, to avoid edge cases where NPCs get stuck in corners.
  • NPC vs NPC combat is the hardest to get right.

The complete talk is available on YouTube and there is a more extended version on the GDC vault.

Future Games Malmö 

Apart from attending talks, my mission was also to connect with other schools and companies. Since I teach game projects myself, this is my main interest. I'm not the best networker, but I donned the hat and went all in.

I first talked with students at the booth of Future Games Malmö. At this school they do a lot of game projects, starting from the first year. The (programmer) students get to choose between engine and gameplay development and are taught both C++ and C#. The program is only 2.5 years, with the last half year being an internship (similar to DAE), which means the actual program itself is only 2 years!
What stood out to me is that students at the school have a strong dislike for mobile game development because of its performance restrictions. I found that odd, since those restrictions actually give you the most interesting challenges while developing a game.
Unfortunately there were no teachers around to talk to. They were supposedly going to be there in the afternoon, but I never saw any (I went back a few times).

The Game Assembly Malmö

Next I talked with teachers and students from TGA. I actually had several interactions with them during the conference, since some of the talks were given by TGA teachers. Their program is very similar to Future Games': it also takes 2.5 years, of which 7 months are an internship - many of their students were at the conference in search of a position. In the first two years, 50% of the time is spent in class and the other 50% in game projects. By the time the students are looking for an internship, they have already made 8 games! Most notably, in the second year they exclusively use an in-house engine. This gives the students a unique challenge in engine development, but it limits their opportunities to experiment with cutting-edge technology, since they have to write that technology themselves.

The people I spoke to acknowledged that they should promote the games made by their students more; at the moment those games are nowhere to be found on the internet. In contrast, all the games made by DAE students can be found on itch.io and Steam (yes, shameless self-promotion).

Diversion.dev

Diversion.dev had a booth promoting a new source control system targeted at game development and virtual productions. They specifically market themselves as an alternative to Perforce, and a 70% cheaper one at that. I'll be looking into this tool, and if we decide to use it in our courses you can expect a new blog post about it as well.

I also learned that business cards are dead. Everyone at NGC used the LinkedIn app, which can apparently generate a QR code that the other person scans with their own LinkedIn app, and you're immediately connected. Loved it!

No 2 steps alike

This was an interesting talk from two employees of Motorica about their product. 

The premise was that animators spend 2-3 weeks on an animation, 80% of which consists of chores rather than actually defining the character. 2 weeks sounds like a lot, but creating a character animation is an iterative process; you discover the identity of the character while creating the animation, so you often need to go back and alter what you've made. This makes character animation very costly.

Motorica is basically an AI tool that can blend various "styles" into an animation. For example, if you have a walk cycle, you can blend "zombie-like" walking into it, or adapt it for a giant that weighs 200kg, etc. Artists can train the AI to learn new styles, so you can easily create new animations by just changing parameters. Whether or not that solves their premise, it looks like something I'd like to experiment with together with our students. They do not have time to create animations during group projects, so this tool might prove very useful for them!

Swedish internships and why they aren’t slavery

I walked in on this talk by coincidence, but I'm glad to have seen it. Ida Fontaine from TGA Stockholm gave an interesting overview of how the Swedish educational system around game schools is organized.

Sweden has 6 master's programs and 8 bachelor's programs in game development. They all provide internships, and in Sweden it is mandatory that the internship takes up one third of the total education! The students write their thesis during their internship, which takes up to 30 weeks (at DAE it is 18).

Internships are viewed as a course within the school, but you execute it at a company, so it is not to be regarded as a job. You still get student loans, insurance and other student benefits during the internship (pretty much the same as in Belgium). 

She made the statement that interns are better than juniors, since the students have already developed 8 games before going into their internship (see earlier). A requirement for companies that take interns is that the students' supervisors have a minimum of 2 years of industry experience.

To help the students find an internship they organize a meet-and-greet event where companies and students can get to know each other, much like our career fair at the Unwrap conference.

Fireside chat with Peter Molyneux

Next was a fireside chat with the legendary Peter Molyneux, who talked about his history as a game designer and the one last game he plans to make. I am not such a fanboy myself, so I did not stay for the entire chat, but if you're interested you can watch it in full on YouTube.

Space Marine 2: Building a Global Warhammer 40K Experience

Instead I went to see a talk about the localization of Space Marine 2. I am not too familiar with the Warhammer 40K universe (although I painted some space marines back in the day, when I was 16).

I had no idea this universe was that big: there are 600+ novels, the "Horus Heresy" series alone has 64 (!) books, and there's much, much more. While developing and translating the game it was super important to adhere to the original lore, so fans can recognize the books and novels in the game.

There was a very complicated structure of approval by various stakeholders for translating every piece of text. Crucial to this was the concept of "IP terms": words that are specific to the 40K IP. These words need a specific approval procedure when translated, while more common sentences and words could be translated by the teams themselves. Most important here was having consistent localization teams and sustainable scheduling. I like that last term; sustainable means doable for the team executing it, without burning them out with overwork and tight deadlines. The sustainable schedule helped retain the translators, which was important for consistency.

Saving the game industry

I then went to help save the game industry. I'm glad I did, because:


During this talk we collectively voted on new rules to be followed by the games industry, which we hope will save it. I don't remember them all, but I particularly loved this one: release dates are final and release is mandatory. This means that if you announce a release date, that's it. It can no longer change, and your game will be released on that day in whatever state it is in at that moment.

Before each vote there was time for questions via an app, and I must say the Belgians were good at asking questions, since many of ours got selected to be asked on stage :)

Next year we'll convene again and vote on amendments and new rules if necessary.

Beer 

And that was the last talk of the day. There was a free drink offered by Sharkmob, we went for food at Paddy's, and we ended up networking at Mello Yello again, where I met Mika Karhulahti from JAMK, a game development school in Finland, and Christophe Laduron from HEAJ, the game school from the French-speaking part of Belgium.

(Continue reading about what happened on day 3 here.)

Nordic Game Conference 2025 - Day 3

(Read about my day two at NGC 2025 here.)

Responsible people that we are at this age, we did not overdo it the night before, so we were all fresh and ready for day three.

Building games in 2025: Emerging models

There was another fireside talk, this time about Indiana Jones and the Great Circle, but I did not stick around. Instead I went to a talk about publishing games in 2025, given by three studio directors.

I noticed there was absolutely no mention of journalists or the press in the talk; it seems classic media is no longer how you get your game on the market. Now it is all social media, community building and influencers. They mentioned "MVCs", minimum viable communities, next to an MVP as a necessity to launch a game successfully.

They spoke about the concept of "nimble studios": small studios that, through these new channels, manage to create and publish large games. They were of course referring to Expedition 33 and the like.

Even publishers are getting outdated - a statement was that a publisher should have a community of their own, built around the IP they publish. That way they still have something to offer a developer (besides money).

From AAA to Education – Designing a game engine for learning game development

Now this was an interesting talk. Björn Ottosson (who needs to update his LinkedIn picture), a teacher at TGA, talked about the in-house game engine their students need to use during their second year for the game projects they work on.

He also gave some extra insight into how TGA organizes their program, specifically their game projects. Over the entire program there are 7 cross-discipline game projects with 10-20 students per project (!), giving them an output of 90+ games each year. That is quite different from what we do at DAE, where there are typically three of those projects with only 6 students per project, giving us a similar output of 100+ games per year.

The 1st year students create two games in Unity, but after that all their projects are made with TGE - The Game Engine. In the second year there are 3 game projects, all made with the same group, so the students stay in their group for the entire year! These groups need to build their own game engine and can start from TGE.

TGE itself is written in C++ for Windows and uses DirectX 11 for rendering and Dear ImGui for UI. They acknowledge that DX11 is old and are planning to upgrade to something more modern, but it is unclear at this moment whether that will be Vulkan, DX12 or even SDL3.

Interestingly, the game runtime is very barebones, leaving much for the students to decide on how they want to approach things.

Also interesting is that every object in a scene is saved to a separate file, instead of having all the data in a central scene file. This makes it easier to work with version control, a good thing for students who are just starting to get the hang of git.

I stayed after the talk to have a chat with Björn and he showed me a bit of the setup in Visual Studio: students get several classes in the engine that teach them how to implement game features and related engine code. Really nice!

CPU + GPU = Heart

I then went to a talk by AMD about optimization strategies for an APU compared to those for a dedicated GPU. Let's be honest, the talk was a little over my head, but my main takeaway was that whatever GPU optimizations exist also benefit an APU in the same way. An APU mostly has better memory access patterns, since there is no need to copy data from the CPU to the GPU - they simply access the same buffers.

Asset streaming in Valheim

I then went to a talk about asset streaming in Valheim. The game uses Unity, and there you have some options for asset streaming, like Addressables. Addressables, however, have the potential downside of loading the same data multiple times, which in the context of Valheim became an issue.

Because of that they built their own system on top of Asset Bundles - in other words, an alternative to Addressables, which are also built on top of Asset Bundles. They introduced the concept of "soft refs", which are loaded only when necessary, in contrast to "hard refs", which is how they referred to the way all assets are loaded in Unity by default.

The main difference is that instead of having to define your own Addressable groups, these groups are calculated at build time via analysis of the soft refs in the game.

Beyond Vibe coding (AI agent coding)

The speaker of this talk, Simon McCallum, was truly remarkable. It was not so much about actual vibe coding (which I would like to try out at some point) but rather more about AI philosophy.

In 2010, he said, we were "DJ-programming": combining libraries and frameworks into a working application. Now vibe coding, or agentic programming, is where the programmer has multiple AI agents working together, performing all kinds of tasks to develop a program - from file creation and scripting to interacting with GitHub - mostly running in Docker containers.

Another way to approach AI is "navigator coding". Instead of asking for solutions to specific problems, you ask for multiple options and make the choice yourself. By steering the AI in this way you remain in the driver's seat and your brain still thinks along with the AI, instead of simply copying its output.

An interesting insight was this: today's first-year students will never be better than AI. AI already knows a lot more about programming than a first-year, and it is also learning faster than them. Hence a student can never outperform an AI. I argued that this assumes the AI doesn't make any mistakes along the way, which at this point is not yet the case.

He wrapped up with some tangible tips on how to use Copilot.


Nordic Game Awards

No more talks after that! At 18:00 another happy hour was hosted by Sharkmob, after which the Nordic Game Awards show took place, which is always a very fun thing to watch. We were even allowed to bring beers into the theatre, so the atmosphere was quite nice to say the least :)

The winner was Indiana Jones and the Great Circle, but I must say many impressive games passed the revue.

After the award show it was networking time, followed by an after party with marioke (karaoke but the lyrics are about games). Unfortunately there is absolutely no footage of me singing something on stage. 

Games! 

The last day of the conference was not very interesting and mostly targeted at students: most of the booths were gone, and there were only some masterclasses and a single stage with beginner talks.

What I did not mention yet is that there were a lot of games to play at the event, and you could vote on them for the Nordic Game Awards. My favorite didn't make the cut, however, so I do want to give a shout-out to Dicetris, a game I absolutely loved to play and am looking forward to seeing released on mobile.

It's already playable on CrazyGames in a sort of early access.

Thanks!

If you've read this far, then I extend to you my sincerest respect and gratitude. Let's connect on LinkedIn ;)


23 June 2024

Integrating Wwise 2023.1 as a plugin for Unreal 5.4

This past semester I've been teaching a few classes for the 2nd year sound students at Howest - Digital Arts & Entertainment in the course "Sound Integration 2". The main goal: make them accustomed to using C++ in Unreal, but also enable them to integrate Wwise as a plugin in an Unreal project without destroying the project for their peers.

In courses where the sound students work together with art and programming students on game projects, we often had the situation where a sound student wanted to use the Wwise plugin, tried to add it to the project, and subsequently broke the project for everyone. After a few hours of trying to get it fixed they would usually give up and just use the built-in audio tools in Unreal, or use FMOD.

But we want them to be able to use Wwise, so we needed to fix this. I myself have little knowledge of Wwise, but I do know a bit of C++ and Unreal, so I took a shot at clarifying for the students how they should integrate Wwise.

And boy, it is indeed not for the faint of heart. The following is a summary of my research into getting a completely working plugin; it took me quite a while to iron out all the details. There are numerous sources that guide you through this, but they're all very specific to a certain Wwise and Unreal version. The same goes for this blog post of course, although the steps below work for multiple versions.

Wwise as a local plugin

It turns out there are two ways of integrating Wwise into Unreal: either as a local plugin in your project, or as an engine-wide plugin. Let's start locally, as this is the easiest way.

  • Make sure to install Wwise 2023.1.x - there are quite a few differences between Wwise SDKs, so no guarantees are given that this process will work for other versions. I used 2023.1.0 when I started writing this article and have since updated to 2023.1.4. You need to repeat this process if you update Wwise.
  • Make sure to have Unreal 5.4.2 installed - same deal, there are differences between Unreal versions that might make this process different. I've also tried this with 5.3.2 and the process is the same. And again you need to repeat this process if you update Unreal.
  • If you don't have one already, create a new C++ Unreal project.
  • In the Wwise launcher you can now click the "Integrate Wwise in project..." button


  •  Set everything up as requested


  • If you leave the Wwise project path empty, a new Wwise project will be created inside the Unreal project. I'd rather not mix these so I recommend creating a Wwise project first next to the Unreal project and then entering the path to that project here, as in the screenshot.
  • If you do create a separate Wwise project then don't forget to point the soundbank folders to the Unreal project

  • I recommend adding an event with a simple test sound to verify your integration.
  • When Wwise is done integrating you can open the Unreal project. The plugin will be automatically active.
  • In the project settings we need to target the Wwise project and define the Soundbanks folder


  •  And enable this so assets get reloaded after banks are generated:


  •  Via Window -> Wwise browser the soundbanks can now be generated

  •  There should be a folder WwiseAudio in the content browser containing a soundbank


  • If not, I noticed it helped to restart Unreal.
  • I've added an AKAmbientSound actor in the scene that uses an AK event to play my test sound. That should be functional now.

And done! We now have a working integration of the Wwise plugin in our project, at a whopping size of 7.3GB! If you use git for your versioning needs you'll notice that git is quite unable to sync this. Perforce (which we use at DAE) can handle it, but it still slows everyone down a lot.

Wwise as an engine plugin

One way to avoid having to upload this enormous plugin is to have the plugin engine-wide instead of inside the project. With this approach all your peers need to install the plugin in their engine (once), but at least it's not part of the project anymore. For this approach Wwise gives us a nice warning:

They were not kidding... Let's be experts here:

  • Close Unreal, we'll be creating the plugin via command line. 
  • The Wwise plugin is configured to be enabled by default. Switch that off, or all your Unreal projects, whether they use Wwise or not, will load the plugin, and that's not desired. In the Wwise.uplugin file, which you can find in your Unreal project, change this line from true to false:
"EnabledByDefault": false,
  • For some reason the Wwise.uplugin is missing some modules it needs to be compiled as an engine plugin, so in that file, add the text below at the end of the modules list. Note: this seems to be fixed in Wwise 2023.1.1, so from that version on this step is obsolete.
,
{
    "Name": "WwiseUtils",
    "Type": "Runtime",
    "LoadingPhase": "None"
},
{
    "Name": "WwiseProcessing",
    "Type": "Runtime",
    "LoadingPhase": "None"
},
{
    "Name": "WwiseEngineUtils",
    "Type": "Runtime",
    "LoadingPhase": "None"
},
{
    "Name": "WwiseObjectUtils",
    "Type": "Runtime",
    "LoadingPhase": "None"
}
  •  Open a terminal and navigate to this folder in your Unreal installation: "Unreal\Build\BatchFiles"

  • Now run this command (replace YourProjectPath with the path to the project you integrated Wwise with):
    .\RunUAT.bat BuildPlugin -plugin="C:\YourProjectPath\Plugins\Wwise\Wwise.uplugin" -package="C:\Temp\Wwise" -TargetPlatforms=Win64
    If all goes well, this should build the plugin. In the package parameter we specify where this plugin should be placed, in the example it's "c:\Temp\Wwise".


    This takes a while though, so go grab lunch.
  • Copy the folder "ThirdParty" from the local plugin to the built plugin. We now have all the files for the entire engine-wide plugin. You can zip the content and distribute it as you like.
  • We actually don't need everything: the ThirdParty folder contains files for Visual Studio 2019, for example, so if you're only using Visual Studio 2022 you can remove those older files. The same goes if you're not targeting Win32 machines.


  • To install it into Unreal as an engine plugin, place the content of the built Wwise folder in the Plugins\Marketplace folder of your engine. If that folder does not exist yet, create it.
  • We can now remove the Wwise plugin from your project folder, since we don't need the local plugin anymore.
  • Since the plugin is no longer enabled by default (as it should be), we need to enable it specifically for our project. That can be done via the project settings in Unreal, or by simply adding this entry to the "Plugins" array in your uproject file with a text editor:
{
    "Name": "Wwise",
    "Enabled": true
}

If we open our Unreal project, the Wwise integration is still completely functional, and our project is 7.3GB smaller.

Let me know in the comments if this has helped you in any way!

11 April 2019

GDC 2019 - Part Three

This is the final post on my trip to GDC19; find the first here and the second here.

The last three days of GDC there was also the expo on which we had a booth as part of the Belgian pavilion, so I had less time to attend talks. This last post wraps up the talks I attended during those three days.

The Making of 'Divinity: Original Sin 2'

I just had to go to this session by Swen Vincke, my former employer. He talked about the rough ride Original Sin 2 was. It was partly a trip down memory lane for me, as not much has changed at Larian ;). Very nice to meet up with ex-Larian colleague Kenzo Ter Elst, who was attending the same talk!

"Shadows" of the Tomb Raider: Ray Tracing Deep Dive

Somehow this happened to be the first talk I attended on ray tracing, my favorite subject of all, while I had actually planned for many more. I still had time :)

I just read that all the slides of the GDC19 talks by NVIDIA are online, so you can already check those!

The good thing about this talk is that it brought me somewhat up to speed with all the new RT stuff. I mean, I "know" raytracers, having written quite a few as a student, but it has been 10+ years since I actively did anything with ray tracing!

There is a bunch of new shaders we can write: rays are generated per light type by "raygen" shaders, and we have anyhit and closesthit shaders. Even translucency gets handled by these shaders.

What I did not realize before GDC, but now fully understand, is the importance of the denoising step in the RT pipeline. GI in ray tracing always yielded noisy results unless you calculated a massive amount of rays. In all applications of RT I've seen at GDC, only one ray was cast per step of a path, yielding incredibly noisy results. So denoising is a central part of real-time ray tracing, and a lot of optimization needs to go into this step. For example, in this talk they showed a penumbra mask - areas where we know there is a half-shadow - and denoised only those areas.

Interesting too were the acceleration structure concepts, BLAS and TLAS (Bottom and Top Level Acceleration Structure). In Tomb Raider, a BLAS was used per mesh, while TLASes were regarded as scenes.

Real-Time Path Tracing and Denoising in 'Quake 2'

Another RT-focused talk, this time on how a Quake II build received ray tracing. It started as a research project called q2vkpt that can be found entirely on GitHub. After Christoph's part of the talk, Alexey from NVIDIA detailed what extra features and optimizations they added.

I played the game at the NVIDIA booth for a while and had a short chat there with Eric Haines (some guy who apparently just released a book on ray tracing - nice timing). In the demo, with my nose to the screen, I could easily see what are called "fireflies": pixels that are outliers in intensity and do not denoise very well.

No matter how good the ray tracing, something still looks off if you ask me, but this was explained in the talk: the original textures of Quake contained baked lighting, and while they made an effort to remove it, that was not entirely possible.

'Marvel's Spider-Man': A Technical Postmortem

I think this talk was the best I saw at GDC19. Elan Ruskin announced that he would go through his slides fast and that it would not be possible to take pictures of them. Boy, was he right! It was amazing: he went super fast, almost never missed, and was always crystal clear. Luckily his slides can be found here.

Some things that stood out to me:

  • They worked on it for three years, producing 154MB worth of source code.
  • They use Scaleform for their UI; we used that at Larian too, though I'm not sure if they still do.
  • Concurrent components are hard
  • Scenes were built with Houdini; they defined a JSON format for their levels that could easily be exported from and into Houdini. It's that "into" that struck me as odd - I learned that you run into issues when you have a loop in your art tools - but hey, if it worked for them...
  • They used a 128m grid size for their tiles, which were hexes! Hexes allow for better streaming, because three tiles cover almost 180 degrees of what you're going to see next, while with square tiles you'd need to stream 5 of them.
  • During motion blur (while swooping through the city) no extra mipmaps get streamed
  • There are a few cutscenes that can't be skipped in the game: they are actually animated load screens. It was cool to see how everything outside the cutscene got unloaded and the new content loaded in.
  • At some point the game covered 90% of a Blu-ray disc and still needed to grow, so they needed quite a few clever little compression tricks to get everything on one disc.
  • One example: instead of using a classic index buffer, they stored offsets (+1, +2, etc.), which yielded better compression results.

This talk is a must-watch! Good thing it's available on the vault for free!

Back to the Future! Working with Deterministic Simulation in 'For Honor'

Last but definitely not least was this session on lockstep deterministic simulation by Jennifer Henry. In For Honor, only player input is sent over the network. There is no central authority, meaning that every peer simulates every step of the game. Every player keeps a history of 5 seconds' worth of input. If a delayed input arrives, then - since the simulation is completely deterministic - the whole simulation gets redone starting from the delayed input.

This proved to be hard. For starters, floating-point results can differ between AMD and Intel, and then there's multithreading, random values, etc...

The central takeaway: "don't underestimate desync". Jennifer showed a few cases. In debug mode desyncs get reported to Jira, but in a live build a snapshot gets recorded. The game then either tries to recover, kicks the diverging peer, or disbands the entire session.

Input gets processed quickly: 8 times per frame! The physics steps at 0.5ms.

Definitely worth watching on the vault!

Wrap-up

So that's it! On Friday, after the expo closed, we went sightseeing a bit in SF and found a cool museum of very, very old arcade machines! My smartphone takes really bad pictures, so I cannot show much of them, but this one was really cool:

We went for dinner at "Boudin", a restaurant-plus-bakery where you could buy bread in all kinds of weird shapes; this one was nice:

Thanks for reading!

08 April 2019

GDC 2019 - Part Two

This is part two of my notes on GDC 2019, read the first part here.

Follow the DOTS: Presenting the 2019 roadmap

Intrigued by Unity's keynote, I decided to attend this session. It was a rather high-level talk with lots of pointers to other content (for example, lots of talks were held at Unity's own booth). The main takeaway for me were the ECS samples that can be found on GitHub. There is also a new transform system coming in 2019; curious about that as well.

At the keynote it was announced that Havok Physics will be integrated with Unity, alongside a custom, completely C#-based physics solution from Unity themselves. Personally I trust the in-house version a bit more atm, but maybe Havok will be more performant after all? It's just weird to have the two options.

There is also a new API in the works to control the Unity update loop. Not sure why, since I think it will only complicate things.

At the moment the C# job system and the Burst compiler have been released. ECS is due later in 2019, and the plan is to port all other internal systems over to ECS by the end of 2022.

It is sneakily never mentioned anywhere, but I asked during the Q&A session: yes, Havok will still require a separate license.

Creating instant games with DOTS: Project Tiny

Built upon DOTS, the goal of Project Tiny was to create a 2D web game smaller than 100kb. For that they stripped the editor of anything that was too much for the project: "Tiny Mode". For scripting they introduced TypeScript... Why??? We just got rid of JavaScript! Luckily they announced that they're going to switch this back to C#. It's unclear to me why they even bothered with TypeScript.

The goal in the end is that you can select the libraries you need for your game and remove all the others. Tiny Mode will then be called "DOTS Mode". It is only targeted at the web for now, but mobile platforms will be added later. A bit more info can be found here.

A cool part of "DOTS Mode" is that the runtime runs in a separate process, even in the editor. This means it can even run on another device while you're working in the editor! It also implies that there will no longer be a need to switch platforms; conversion of assets will happen at runtime.

Another part of the DOTS improvements is vastly improved streaming; awake and start times have all but been eliminated, so that sounds promising too.

IMGUI in the editor is also completely deprecated; UI will be built with UIElements.

Looking forward to these changes, I might test this with a 2D GIPF project I'm working on...

Procedural Mesh Animation with non-linear transforms

This talk by Michael Austin was serious cool! He illustrated how we could implement easy wind shader code with non linear transforms. But then he went on and made extremely nice visual effects with only very few lines of math in the vertex shader.

I did not have the time to note it all down thoroughly enough to reproduce it here, but I really recommend checking out this talk on the vault! If I find anything online I'll add it here - my fingers are itching to get started on a demo :)

Cementing your duct tape: turning hacks into tools

Not really my field of interest, but the speaker was Mattias Van Camp, an ex-DAE student but (more importantly) an ex-Kweetet.be artist! He even mentioned Kweetet during his introduction - the logo was on the screen!

He then defined the term "duct tape" as used in his talk: duct tape is a hack that you throw away. What followed were two examples of duct-tape code they had to write at Creative Assembly to work with massive amounts of assets. Both examples boiled down to the DRY principle, this time applied to art assets instead of code or data. They used Marmoset Toolbag to generate icons from Max files, for example, all automatically. Continuous integration FTW!

Are Games Art School? How to Teach Game Development When There Are No Jobs

Next I attended another session of the educators summit. Speaker Brendan Keogh made a case that game schools are art schools, meaning that once you graduate there are practically no jobs available. There were some interesting stats:

The sources for that data can be found here.

He then continued to make a case that we should train "work-ready" game dev students.

I'm a real fan of the first sentence on that slide! Students often do not realize this and we should tell them this indeed.

Another good takeaway for me was the notion to not have first-year students create "a game like X" (which we actually do in Programming 2 and Programming 4) but instead have them make "a game about Y". And Y can be anything, so you're not restricted to games only. The students will be much more likely to create something truly unique.

Something I should mention too: "Videogames aren't refrigerators". Just so you know.

Belgian Games Café

We quickly visited IGDA's Annual Networking Event, which was nice but not very interesting. After that we went to the Belgian Games Café: good cocktails, but no real beer :). Nice venue!

And it was cool to meet so many Belgian devs. The party really got started once David started showing off his retro DJ skills :)