Ray Tracer Sandbox in Vulkan

Author: wyldfire

Score: 95

Comments: 34

Date: 2020-11-06 17:33:21

Web Link

________________________________________________________________________________

jlarocco wrote at 2020-11-06 21:55:07:

Perfect timing! I just got a new machine with a Quadro RTX4000 card, and I've been looking for example code to see what it can do.

It still blows my mind that Mandelbrot fractals (and now ray tracing) that would take tens of minutes to render at 640x480 on my first computer now render at 60fps on a 4K display with real-time panning/zooming.

gallerdude wrote at 2020-11-06 19:06:45:

Off-topic, but is there any mid-level 3D API that I can plug vertices into directly? Everything I see is either super low-level (Vulkan, OpenGL) or super high-level (Unity).

jleahy wrote at 2020-11-06 19:29:19:

Like it or not, the answer is probably OpenGL. If you go with version 3.3 (a good choice in my opinion), it's really only a few function calls before you're submitting vertices, and it's going to be supported on any platform. Use GLFW and you don't need to worry about platform stuff (creating windows is a pain otherwise).

Counting the lines of a renderer I have lying around, it's 300 lines (including plenty of whitespace). That's very full-featured: it correctly positions the window in the center of the monitor, handles errors properly, takes mouse and keyboard input, sets up the view/projection matrices, uses shaders, etc.
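
For reference, here's a rough, minimal sketch of that skeleton (assuming GLFW plus the glad loader; error handling and the window-placement/input/matrix niceties are left out):

    // Minimal GLFW + OpenGL 3.3 core sketch: open a window, upload one
    // triangle's vertices, and draw it every frame.
    #include <glad/glad.h>
    #include <GLFW/glfw3.h>

    static const char* kVertSrc =
        "#version 330 core\n"
        "layout(location = 0) in vec2 pos;\n"
        "void main() { gl_Position = vec4(pos, 0.0, 1.0); }\n";

    static const char* kFragSrc =
        "#version 330 core\n"
        "out vec4 color;\n"
        "void main() { color = vec4(1.0, 0.5, 0.2, 1.0); }\n";

    static GLuint Compile(GLenum type, const char* src) {
        GLuint s = glCreateShader(type);
        glShaderSource(s, 1, &src, nullptr);
        glCompileShader(s);
        return s;
    }

    int main() {
        glfwInit();
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        GLFWwindow* win = glfwCreateWindow(800, 600, "triangle", nullptr, nullptr);
        glfwMakeContextCurrent(win);
        gladLoadGLLoader((GLADloadproc)glfwGetProcAddress);

        // Link the two tiny shaders into a program.
        GLuint prog = glCreateProgram();
        glAttachShader(prog, Compile(GL_VERTEX_SHADER, kVertSrc));
        glAttachShader(prog, Compile(GL_FRAGMENT_SHADER, kFragSrc));
        glLinkProgram(prog);

        // Submit the vertices: one VBO holding the data, one VAO describing its layout.
        const float verts[] = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };
        GLuint vao, vbo;
        glGenVertexArrays(1, &vao);
        glGenBuffers(1, &vbo);
        glBindVertexArray(vao);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
        glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(float), nullptr);
        glEnableVertexAttribArray(0);

        while (!glfwWindowShouldClose(win)) {
            glClear(GL_COLOR_BUFFER_BIT);
            glUseProgram(prog);
            glBindVertexArray(vao);
            glDrawArrays(GL_TRIANGLES, 0, 3);
            glfwSwapBuffers(win);
            glfwPollEvents();
        }
        glfwTerminate();
    }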

jleahy wrote at 2020-11-06 19:45:43:

And just for you, I pulled up some very old code of mine (from 2012) and stripped it down so that it renders a single triangle (and lets you move around), nothing more.

https://github.com/jleahy/gldemo

That's a full OpenGL setup, with all the fiddly bits taken care of. Obviously it's a bit over the top (you don't need vertex buffer region management or SIMD matmul for one triangle), but that's about as big as it gets.

You could port that to any language with OpenGL bindings.

phendrenad2 wrote at 2020-11-06 19:42:22:

Agreed. And despite being "deprecated", the "fixed-function" OpenGL 1.x API is still implemented on all major platforms (for now?), and it's even more high-level (although your graphics may look somewhat dated). Plus you can mix and match API levels (I think), so you can start with 1.x and move some things to 3.x/4.x-style OpenGL when you need shaders. I wish Vulkan weren't such a steep learning cliff in comparison.
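
As a rough sketch of how much higher-level the fixed-function path is (assuming GLFW, which hands you a compatibility-profile context by default, so immediate mode still works where drivers expose it):

    // Legacy OpenGL 1.x: no shaders, no buffers, no vertex layout; just emit vertices.
    #include <GLFW/glfw3.h>   // also pulls in the OpenGL 1.x declarations

    int main() {
        glfwInit();
        GLFWwindow* win = glfwCreateWindow(800, 600, "legacy triangle", nullptr, nullptr);
        glfwMakeContextCurrent(win);

        while (!glfwWindowShouldClose(win)) {
            glClear(GL_COLOR_BUFFER_BIT);
            glBegin(GL_TRIANGLES);                            // immediate mode
            glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
            glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
            glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
            glEnd();
            glfwSwapBuffers(win);
            glfwPollEvents();
        }
        glfwTerminate();
    }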

jleahy wrote at 2020-11-06 19:59:27:

It'll be there forever (compatibility), but the problem is there's not a lot of overlap between 1.x and 3.x/4.x. I think people are better off sticking to the 3.x core profile. Shaders just aren't that hard, and fixed-function is a really outdated way of thinking.

Generally I'd also say stick with 3.x (rather than 4.x) unless you need tessellation shaders.

moron4hire wrote at 2020-11-06 22:09:09:

I believe with the latest versions of the APIs, the fixed-function pipeline is now emulated on top of the programmable pipeline, but the two don't mesh well, so you can end up with particularly bad performance. I've had to find and install the old DX9 runtime to get some older games to run at a reasonable frame rate.

jlarocco wrote at 2020-11-06 22:06:12:

I'm actually working on a library like that for Common Lisp, but it's still very much a work in progress.

My goal is to have a framework where I can experiment with different parts of OpenGL by subclassing something and overriding a method or two, and have it "just work" with everything else in the framework (shaders, viewers, animation, user interaction, etc.).

https://github.com/jl2/newgl/tree/buffer-refactor

flohofwoe wrote at 2020-11-06 23:01:27:

Shameless plug, check out sokol_gfx.h (and maybe sokol_gl.h, which is a simple GL 1.x style API on top of sokol_gfx.h):

https://github.com/floooh/sokol

sxp wrote at 2020-11-06 20:41:51:

For quick 3D visualization, I've used three.js and a minimal HTML page. It has loaders for many common 3D file formats, so you can start with one of the samples at

https://threejs.org/examples/

and hack around until it loads the data that you want.

fulafel wrote at 2020-11-06 22:15:42:

You may be looking for a 3D rendering engine. There are lots of choices depending on your target platform, preferred programming language, license preferences, requirements for real time vs animation or image production, etc.

typedef_struct wrote at 2020-11-06 19:29:42:

https://github.com/google/filament

and ThreeJS are probably the closest

PudgePacket wrote at 2020-11-06 23:16:23:

WebGL is honestly a pretty good middle ground between them.

moron4hire wrote at 2020-11-06 21:26:26:

I'm in kind of the same boat. I grew very dissatisfied with Unity earlier this year and began looking for alternatives. I explored writing my own rendering engine, given that my requirements aren't very high.

Back in January, I discovered this really cool .NET Standard 2.0 library for abstracting Direct3D 11, Vulkan, Metal, OpenGL 3, and OpenGL ES 3 called Veldrid:

https://github.com/mellinoe/veldrid

The documentation is pretty good, for its own parts, and it has a fair number of examples for setting up things like different windowing libraries. I was able to put together a set of code for a single demo running in Windows Forms, WPF, and Xamarin Forms fairly easily. It also has support for SDL2.

Currently, I'm going through a Codevember exercise where I teach myself WebGL from scratch.

From what I've learned so far, most of the graphics APIs these days work in very similar ways. And Veldrid smooths over the few differences (especially in the case of OpenGL). WebGL does, too, in that it presents an OpenGL front-end while the back-end can be implemented in different graphics APIs (for example, on Windows it's actually implemented in D3D 11 through ANGLE).

In general, you need to create a Shader Program--which is a combination of multiple Shaders of different types, e.g. Vertex, Compute, and Fragment--construct one or more Buffers into which you load data (generally in one big blob), and configure how ranges within those Buffers map to attribute locations in your Shader Program.

However, I've generally found that there is little documentation _anywhere_ on how to architect a data pipeline to use all of the various GPU resources efficiently. Everyone talks about "use as few draw calls as possible". But they don't really tell you how to achieve that.

My feeling is that a Shader Program is loosely analogous to a Material in something like Three.js or Unity. I'm guessing that the ideal approach is to take all of the Meshes in Three.js, or all of the Renderers in Unity, that share the same Material, and then combine their Geometries into a single block of memory. And that's where Uber Shaders come in, as an attempt to also combine all of the different ways in which you'd want to render different Materials into a single Shader Program.
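
As a sketch of that batching idea (hypothetical types, not taken from Three.js or Unity): meshes sharing one Material get their geometry appended into a single vertex/index block, so the whole group can be drawn with one draw call instead of one per mesh.

    #include <cstdint>
    #include <vector>

    struct Vertex { float px, py, pz, nx, ny, nz, u, v; };

    struct Mesh {
        std::vector<Vertex>   vertices;
        std::vector<uint32_t> indices;
    };

    // One batch == one Material == one draw call.
    struct Batch {
        std::vector<Vertex>   vertices;  // uploaded once into a single GPU buffer
        std::vector<uint32_t> indices;
    };

    Batch MergeMeshes(const std::vector<Mesh>& meshes) {
        Batch batch;
        for (const Mesh& m : meshes) {
            // Indices must be offset by the vertices already in the batch.
            const uint32_t base = static_cast<uint32_t>(batch.vertices.size());
            batch.vertices.insert(batch.vertices.end(), m.vertices.begin(), m.vertices.end());
            for (uint32_t i : m.indices)
                batch.indices.push_back(base + i);
        }
        // The renderer then binds the Material's Shader Program once and issues a
        // single indexed draw over batch.indices.
        return batch;
    }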

slx26 wrote at 2020-11-06 19:06:15:

There are two GIFs that are very nice but extremely heavy, almost 80 MB together. You might not want to load that on mobile data.

Sohcahtoa82 wrote at 2020-11-06 20:57:12:

Why would they use GIF instead of HTML5 video? It could have significantly reduced the file size.

speedgoose wrote at 2020-11-06 21:13:58:

You can't embed videos in GitHub's markdown.

waiseristy wrote at 2020-11-06 21:03:04:

I had no idea someone wrote a Vulkan backend for ImGUI. Very useful!

https://github.com/Zielon/PBRVulkan/blob/master/PBRVulkan/Ra...

Impossible wrote at 2020-11-06 22:06:17:

ImGUI has backends for every graphics API in regular use, and some that might be considered deprecated (DX9), including game engine abstraction layers for Unreal and Unity and proprietary console APIs (not public, of course). Here's the full list of public graphics API and windowing system backends that are part of the official release (

https://github.com/ocornut/imgui/tree/master/backends

). With the possible exception of stb_image, it's one of the most integrated open source libraries in games and graphics.

greggman3 wrote at 2020-11-07 02:14:04:

It's a mistake to think Dear ImGUI has "backends". Dear ImGUI is a few files that generate a vertex list. It then has a bunch of "examples" that show how to render that list with various APIs. Any slightly competent graphics programmer could take that vertex list and render it in any API. Those examples are not Dear ImGUI. They are just examples.

This is the code

    for each command list
      upload vertex data
      upload index data
      set scissor
      bind texture
      draw

That's it! Adapt that to any API, whatever you're using.

Because it's an example, and because so many inexperienced programmers use it, they assume it's some official part of the library. It's not.
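
For example, here's a hypothetical adaptation of that loop in C++: the Engine* functions below are made-up placeholders for whatever your own renderer exposes; only the ImDrawData/ImDrawList/ImDrawCmd fields come from Dear ImGUI itself (circa its 2020-era API).

    #include "imgui.h"

    // Made-up engine hooks: swap in your own buffer/texture/draw calls.
    void EngineUploadVertices(const ImDrawVert* data, int count);
    void EngineUploadIndices(const ImDrawIdx* data, int count);
    void EngineSetScissor(float x, float y, float w, float h);
    void EngineBindTexture(ImTextureID tex);
    void EngineDrawIndexed(unsigned int index_count, unsigned int first_index,
                           unsigned int vertex_offset);

    void RenderImGui(const ImDrawData* draw_data) {
        for (int n = 0; n < draw_data->CmdListsCount; n++) {
            const ImDrawList* list = draw_data->CmdLists[n];
            EngineUploadVertices(list->VtxBuffer.Data, list->VtxBuffer.Size);
            EngineUploadIndices(list->IdxBuffer.Data, list->IdxBuffer.Size);
            for (int c = 0; c < list->CmdBuffer.Size; c++) {
                const ImDrawCmd& cmd = list->CmdBuffer[c];
                const ImVec4& clip = cmd.ClipRect;  // x1, y1, x2, y2 in framebuffer space
                EngineSetScissor(clip.x, clip.y, clip.z - clip.x, clip.w - clip.y);
                EngineBindTexture(cmd.TextureId);
                EngineDrawIndexed(cmd.ElemCount, cmd.IdxOffset, cmd.VtxOffset);
            }
        }
    }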

nitrogen wrote at 2020-11-07 02:53:26:

You both seem to be right, based on this commit message:

_> Moving backends code from examples/ to backends/… 24 days ago_

greggman3 wrote at 2020-11-07 05:43:57:

Moving them from examples to backends doesn't change anything. That's likely just a sign that "examples" is now examples of using the API, not examples of integrating with some GPU API.

One of the main reasons/goals/points of Dear ImGUI is being able to integrate it into existing projects. It's deliberately API agnostic. It generates a very simple command list that you can then trivially render yourself in your own engine.

Everything else, meaning all the platform-specific code in the repo, is there entirely to help people understand how to use it or to get started.

It's a mistake to think otherwise, because it incorrectly limits Dear ImGUI's perceived usefulness. If you believe you have to take a backend or use one of the APIs provided, then you'll mistakenly believe it would be hard to put it in your existing game. If you understand what it's really about, then it's trivial to put it in almost anything. That is by design.

shmerl wrote at 2020-11-06 18:54:52:

The only outliers who haven't adopted Vulkan today are MS, Sony, and Apple. MS and Apple have very lock-in-minded people among those making these decisions.

Not sure what stops Sony from doing it.

Things went better with OpenXR.

pjmlp wrote at 2020-11-06 19:44:14:

Along with several embedded OEMs and the whole CAD/CAM and scientific industry, which are focused on OpenGL/DirectX and have no patience for the low-level idiosyncrasies of Vulkan and its ever-increasing number of extensions. That's why Khronos decided to try to advocate Vulkan to them via ANARI.

Let's see if it's as successful as OpenCL 2.x, which had to be rebooted back to OpenCL 1.2 for version 3.0.

OpenXR doesn't define the 3D APIs; it's just device management, and one can use it with whatever API one feels like.

And then there is Hollywood switching to CUDA-based render farms, like OTOY, because Vulkan Compute doesn't deliver what they need.

Arelius wrote at 2020-11-06 22:14:21:

Everyone says Nintendo supports Vulkan, but in the past that wasn't actually the case outside of marketing materials, and nobody has confirmed that this has changed since.

jsheard wrote at 2020-11-07 02:43:38:

The Switch went through Vulkan 1.0 conformance testing, and then again for Vulkan 1.1 and 1.2

https://www.khronos.org/conformance/adopters/conformant-prod...

There's also a platform integration extension attributed to NVIDIA and Nintendo

https://www.khronos.org/registry/vulkan/specs/1.2-extensions...

It seems unlikely they would go to that trouble if Vulkan weren't actually exposed to developers.

Arelius wrote at 2020-11-06 22:26:24:

So who does that leave? Linux, Android, and Stadia? And maybe Nintendo? There are about as many platforms not supporting it as there are supporting it.

m-p-3 wrote at 2020-11-06 19:27:53:

.

bazooka_penguin wrote at 2020-11-06 19:39:25:

They already had a low-level graphics API for the PS4, GNM, before Mantle. In fact, EA DICE's Johan Andersson, who did most of the groundwork for Mantle, said he got his ideas from working with console APIs. At the time he was working on the Mantle spec, that would probably have been the Xbox 360 or PS3 APIs, or an early form of the PS4 API. The X1 didn't launch with a low-level API, but they provided one some time before 2014, according to a Metro Redux dev.

pjmlp wrote at 2020-11-06 19:35:42:

Sony already has several low-level APIs, which are much easier to use than Vulkan and precede it.

bch wrote at 2020-11-06 19:42:09:

Are you talking about gnm, gnmx[0] ?

[0]

https://en.wikipedia.org/wiki/PlayStation_4_system_software

pjmlp wrote at 2020-11-06 19:46:21:

Yep.

It is kind of DirectX 12-like, with a shading language similar to HLSL.

Also, the Switch is always referred to as an example of Vulkan support, and they also do OpenGL 4.6, but what really matters for native titles on the platform is NVN.

Arelius wrote at 2020-11-06 22:15:31:

And I'm not sure that it actually has Vulkan support. I can confirm that OpenGL is at least technically supported.

shmerl wrote at 2020-11-06 20:15:32:

AMD also makes chips for the Xbox, so that isn't really a reason for them. I haven't seen Sony saying they want Vulkan to succeed. Like MS with DX12 and Apple with Metal, they have so far stuck with their own GNM.