Stealing Panda3D's Render Pipeline


#21

Look at https://gitter.im/urho3d/Urho3D for continued discussion on the bgfx work, among other things. TLDR: it is not trivial, and could result in loss of functionality or having to do some things differently.


#22

@cadaver
Quite offtopic, but have you considered coming back to leading Urho3D development?
It’s been more than a year since you stepped down, and, no offence to the other core developers, the leadership void you left behind does not look like it has been filled yet.
Maybe starting a patreon for Urho (I know it was discussed before) is not such a bad idea.
There might be more than a few people in the Urho community who would rather pay some amount monthly for Urho development than for a bloated commercial engine license (I would, for sure). Good leadership is a great incentive.
Just think about it. It’s quite obvious that you still care about the project.


#23

No; I believe things must happen naturally according to people having a use for the engine, and wanting to take more responsibility. I don’t, any more, so I’d be forcing myself.

Also, I think I have said it before, but for me personally money would be an anti-incentive, as it would mean more pressure to get things right. I think I’ve also said that for the project as a whole, donations / Patreon could very well be a good idea, if there just are other people who want to pursue it.

Obviously I care for the project in the sense that it’s nice seeing people working on it, keeping it going, and keeping it on their minds (even if it means the same people bickering about it in several gitters), but like I said, things can’t be forced.


#24

@boberfly did most of bgfx porting already: https://github.com/boberfly/Urho3D/tree/feature/bgfx

What @cadaver said is very true. The port is at a point where it stops being simple: bgfx expects things to be done a bit differently. Going full-bgfx would make sense, as the engine could be amended as needed to accommodate bgfx. So far the work has stalled, and we have no idea whether it will ever be finished.


#25

I’ve said it before and I’ll say it again: I don’t believe the renderer is what is holding Urho3D back. If anything is, it’s less the renderer itself and more that the supported platforms have complicated it to such a degree that no general-purpose API can really cover them all.

The real holdback is the hardware that those who can do things actually have access to. I just don’t have reliable access to GL 4.3 hardware (or hardware that can run it well) to be able to do GL compute, so while I’ve completed OpenCL/DX11 compute, I can’t just magically make that work elsewhere.

I can plop my geometry/tessellation shader code onto GitHub, but that isn’t as great a help as many might think: the hash structures Urho3D uses rely on 32-bit keys. Enjoy collision city; even in basic tests, stuff collided constantly.
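As a back-of-the-envelope illustration of why 32-bit hash keys collide so readily, here is a generic birthday-bound sketch (not Urho3D code; the function name is my own):

```python
import math

def collision_probability(n: int, bits: int = 32) -> float:
    """Birthday-bound probability that n random keys hashed into a
    2**bits space produce at least one collision, using the standard
    approximation 1 - exp(-n*(n-1) / (2 * 2**bits))."""
    space = 2 ** bits
    return 1.0 - math.exp(-n * (n - 1) / (2 * space))

# How quickly a 32-bit key space saturates:
print(f"10k keys: {collision_probability(10_000):.1%}")
print(f"77k keys: {collision_probability(77_000):.1%}")
print(f"1M keys:  {collision_probability(1_000_000):.1%}")
```

At roughly 77,000 distinct keys the odds of at least one collision already reach about 50% (since sqrt(2 · 2^32 · ln 2) ≈ 77,000), so a scene’s worth of hashed resources hitting collisions constantly is exactly what the math predicts.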


#26

From my beginner’s perspective, working with sharpreality as the wrapper, being able to import and display other people’s work easily and on the fly would enhance the immediate utility of Urho3D. bgfx would help, it seems, but I’m still dependent upon sharpreality’s uptake of the urho3D-next implementation.

As I work through the various feature samples and get them running on a HoloLens, there is a bit of conversion work, related to the sharpreality API. But it’s the same conversion for almost all of them, and not hard or time-consuming once you get the pattern. Still, there is a conversion to do; it’s mostly camera- and scene-related, plus reformatting.

Most of my time is spent adapting to an AR environment: providing appropriate control of the scene with a different set of inputs and triggers, making use of extra information such as the surrounding physical environment, and handling a generally 360-degree view, around the scene and around the camera, depending on body movement; additional degrees of freedom of sorts when using AR.

I describe this time as opening wide the door into AR rather than fumbling with Urho3D’s API (even though I do fumble a lot, including around rendering). It’s not holding me back, though; it’s just a different set of issues than other APIs might have, depending on what services are offered.

For what I’ve been doing, the Urho3D API and engine are a great tool as is. I see your point about supporting many platforms versus supporting the most active ones and positioning for straightforward future enhancements.

But for me, overall, the Urho3D API is quite powerful, and I think my current limitation is on-the-fly import of models, though I also wish I could get shadows working and had networking and 3D audio. Still, even without these capabilities out of the box with sharpreality, there is so much available that works great. Even something simple like the animating scene is quite stunning visually in AR, and it opens the mind’s door to many possibilities that don’t readily appear when viewing on a screen versus standing in the middle of it in holographic space.


#27

IMHO:
PBR should be fixed to work on the mainstream platforms Urho supports.
More rendering APIs should be supported and enhanced.
Edit: and rendering should also be moved into another thread.

Maybe we should learn something from what game engines such as Godot have been doing.


#28

PBR should be fixed to work, on mainstream platforms Urho supports.

What are those mainstream platforms?

We can use business analysis data here: year over year, Linux and Mac are reliably not mainstream platforms for game purchases (dual-booting muddies the numbers, but guess what, it doesn’t matter; what counts is which platforms games are being purchased for). Should all Linux and Mac support be abandoned?

PBR has an awkward history in Urho3D. IIRC I was the third to begin major work on it, with the objective of being real-time on Intel HD4000 / Ivy Bridge hardware, and I eventually got fed up with the general griping that PBR causes (that’s exactly what happened; I still google the last-straw user regularly to see his odd porn interests). DragonCastJosh picked it up from me and carried it to its current state, adding his own flavor while dropping some of the things I had done that were specific to low-spec machines, like the YCoCg interleave, because they did in fact over-complicate the pipeline.

If you want a unified workflow that works everywhere, you’re going to have to accept a preprocessing step and remap your data to make that happen.

Guess what: everything but Visual Studio absolutely sucks at that. It’s 2019, and everything except Visual Studio is incompetent at batch or partial-exec tasks. Everything. How many CMake or Premake commands do you want to memorize? How many batch files do you want to write to do those tasks and clean up after them?


#29

Note: I would be 100% behind, and would contribute to, a full rewrite targeting a reasonable end-game common graphics API like Diligent, as it covers a sufficiently broad scope compared to something narrow like bgfx. To be clear, my beef with bgfx is specifically its clearly misinformed comments about geometry shader performance, doing exactly what every single GPU vendor said DO NOT DO; and none of the bgfx utility code has any merit either, just grossly failed attempts at the STL.


#30

When I picked it up from you I had no real prior rendering knowledge and was very new to programming on this scale, but I had a passion for advanced rendering techniques.

But overall, it came down to this: if I struggled to understand a topic or some of the work you did, I put it aside for a later date. That date never came, as I became busy with work after landing a job thanks to my work on PBR.

Since then I have learned a lot and have started playing around in a local branch of Atomic, with plans to submit a much better PBR system with many of the accompanying features you would expect in a modern engine.

As for platforms, I now have a few Android devices, so I am able to test and develop a version that works on Android, possibly stripped down and simplified.


#31

Didn’t know about this Diligent thing. I’ll add it to the list of possible candidates. Thx.


#32

Does Diligent support Metal on iOS? There is an open ticket to add Metal support…


#33

That’s not clear. I’m going to try that next.


#34

I’m giving Diligent a try.

  • On Mac, it supports Vulkan through MoltenVK, and it works.

the dreaded cube

  • On iOS, it is still OpenGL, and it has some problems.

I’m now going to try Android.