I recently watched this on YouTube, showing a huge number of animated character instances.
The game is:
Ultimate Epic Battle Simulator.
This is kind of crazy, and performance looked pretty good in the other video I watched as well.
I hope we will add support for this mass of instances in the next few releases :0.
I think patches are welcome.
Also I’d add my personal $100 on top if someone implements this for OpenGL/Linux.
Even the engine the game was developed in (Unity) doesn't have out-of-the-box support for those numbers. Something has to be left to the game programmers.
There are probably some clues in the developer's videos as well, if you look closely.
Just watched something really interesting. The key point in the video was sending a texture to the GPU that holds the animation data.
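For anyone curious, here's a rough sketch of what "animation data in a texture" can look like on the CPU side before upload. This is my own guess at a layout (one skeleton pose per texture row, four RGBA texels per 4x4 bone matrix), not necessarily what this game's developer does; the function name and frame format are made up:

```python
import numpy as np

def bake_animation_texture(frames):
    """Pack per-frame bone matrices into a float RGBA texture array.

    frames: list of poses, each pose a list of 4x4 matrices (one per bone).
    Each bone matrix takes 4 RGBA texels (one per matrix row), so one texture
    row holds a full skeleton pose and the V coordinate selects the frame.
    The vertex shader can then fetch any (bone, frame) pose per instance
    instead of the CPU sending per-instance bone uniforms.
    """
    n_frames = len(frames)
    n_bones = len(frames[0])
    # height = frames, width = bones * 4 texels, 4 floats (RGBA) per texel
    tex = np.zeros((n_frames, n_bones * 4, 4), dtype=np.float32)
    for f, pose in enumerate(frames):
        for b, mat in enumerate(pose):
            tex[f, b * 4:(b + 1) * 4, :] = np.asarray(mat, dtype=np.float32)
    return tex

# Toy example: 2 frames, 3 bones, identity poses
frames = [[np.eye(4)] * 3 for _ in range(2)]
tex = bake_animation_texture(frames)
print(tex.shape)  # (2, 12, 4)
```

On the GPU side the shader would sample this texture per instance with its own animation frame offset, which is what lets thousands of characters animate independently from one draw call.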
But what about AI? Is it on GPU too? Or is it just for cutscenes?
Oh no, you wouldn’t want to do the AI on the GPU. This is just for display representation. Personally, at least for my game, I have the AI decisions running in a different thread.
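A minimal sketch of the "AI on its own thread" split, assuming a producer/consumer setup (the `Agent` class, `decide` method, and queue hand-off are all hypothetical, not anyone's actual engine code): the AI thread produces decisions, and the render thread only ever drains the queue and updates the display representation.

```python
import queue
import threading

class Agent:
    """Hypothetical agent; decide() stands in for real AI logic."""
    def __init__(self, agent_id):
        self.agent_id = agent_id

    def decide(self):
        return (self.agent_id, "move")  # placeholder decision

def ai_tick(agents, out_queue):
    # Runs on the AI thread: compute decisions, never touch render state.
    for agent in agents:
        out_queue.put(agent.decide())

agents = [Agent(i) for i in range(4)]
decisions = queue.Queue()
worker = threading.Thread(target=ai_tick, args=(agents, decisions))
worker.start()
worker.join()  # a real game would keep rendering instead of blocking here

# Render thread drains the queue and updates only the display representation.
results = [decisions.get() for _ in range(decisions.qsize())]
print(results[0])  # (0, 'move')
```

The design point is just that the two threads never share mutable state directly; the queue is the only hand-off, so the renderer can't stall on AI work.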
I think for 4K entities the AI is what will hog the most CPU…
I did a test running 10,000 AIs in my game, having each of them handle a single event on the thread. No issues whatsoever. Furthermore, I created my own scripting language to fire the event, and ran the game for over 1000 years (while I slept, to make sure I didn't have any thread locks).
(while this isn’t the best test, as long as you’re managing your AI thread properly, you shouldn’t see issues with the CPU).
That looks like a fun project to implement. Not too different from GPU particles, either.