
We need a real name for the shading language...

icculus opened this issue 3 years ago • 16 comments

Right now it's just SDL Shading Language (SDLSL).

Which is fine, whatever, but maybe something better would be nice, if anyone has a good idea.

icculus avatar Jun 29 '22 18:06 icculus

I wouldn't worry about the name yet, unless you want to advertise it straight away.

Why not any of the following?

  • DMSL - DirectMedia Shader Language
    • Maybe pronounce it (or even write it) as Damsel?
  • ISLE - The Icculus Shader Language Extraordinaire
    • Naming things after yourself is always a good idea! Linux and Git are both named after Linus Torvalds, after all.
  • Iccken or Slolus
    • Share the love and name it after yourself and slouken
  • Sizzl
    • SDL Shader Language -> SSL -> Szl -> Sizzle -> Sizzl
    • Because you want to date your name to a fashion from 10 years ago of dropping vowels
  • Athena, Minerva, Ptah, or Bragi
    • Who doesn't love naming a language after a god? 👀

gingerBill avatar Jul 14 '22 09:07 gingerBill

Linux and Git are both named after Linus Torvalds, after all

Hah!

icculus avatar Jul 14 '22 16:07 icculus

Honestly, SDSL would be perfect for multiple reasons:

  • GLSL and HLSL both have 4 letters, and end with SL, so SDSL already fits right in.
  • It could just stand for Simple DirectMedia Shader Language, which is logically sound.
  • It doesn't stray too far from SDL's name itself; as much as I love the other options, this is important.

ENDESGA avatar Aug 02 '22 03:08 ENDESGA

SDLSLSL ;P SDL Simple Little Shader Language

sridenour avatar Aug 03 '22 23:08 sridenour

Tesseract

You can project a tesseract into 3D space in various ways: cell-first, face-first, edge-first, or vertex-first. In a way, it's like translating higher-dimensional geometry into a simpler dimension.

One of the stated goals of the SDL Shader Language was to compile down to a bytecode, then be able to translate that bytecode into other formats.

Since shading languages can deal with geometry, it seemed fitting to me. Or maybe I'm overthinking it and comparing apples (translating bytecodes) to oranges (projecting higher-dimensional geometry into lower dimensions).

Dawilly avatar Aug 04 '22 04:08 Dawilly

Tesseract

I'm probably not going to do this, but I really want to do it for the A Wrinkle in Time reference. :)

icculus avatar Aug 28 '22 02:08 icculus

I'm late to the party but

Tesseract

might lead to confusion with the open-source Tesseract game (tesseract.gg)

I'm not a fan of acronyms, so I think I'll keep saying "SDL shader language" because the association is simple:

"ah, it's SDL 's shader language"

Meanwhile, a project that already uses a lot of acronyms can end up with alphabet soup:

"what does <insert seven-letter acronym here> even mean? What is WTFPL, what is DX12?"

MatheusKS95 avatar Dec 25 '22 20:12 MatheusKS95

Why not simply use the WebGPU Shading Language (https://www.w3.org/TR/WGSL/)? It aims to be a cross-platform one.

Why create another one? It feels like How-Standards-Proliferate yet again.

lazalong avatar Jan 19 '23 07:01 lazalong

SDLSL is perfectly fine in my opinion. Simple and to the point, just like the language itself is presumably supposed to be.

Akaricchi avatar Jan 19 '23 15:01 Akaricchi

Why not simply use the WebGPU Shading Language (https://www.w3.org/TR/WGSL/)? It aims to be a cross-platform one.

https://github.com/libsdl-org/SDL_shader_tools/blob/main/docs/README-SDL_gpu.md#why-a-new-shader-language

icculus avatar Jan 19 '23 17:01 icculus

Why not simply use the WebGPU Shading Language (https://www.w3.org/TR/WGSL/)? It aims to be a cross-platform one.

https://github.com/libsdl-org/SDL_shader_tools/blob/main/docs/README-SDL_gpu.md#why-a-new-shader-language

Disclaimer: I'm in the peanut gallery with respect to SDL_gpu, but I've been a long-time SDL user, and I think the rationale articulated here needs to be rethought, or at least reworded.

HLSL, GLSL, and Metal Shader Language are all fairly complex languages, so to support them we would either need to build something that handles all their intricacies, or pull in a large amount of source code we didn't write and don't have a strong understanding of...including, perhaps, a dependency on something massive like LLVM.

They are complex because shading is complex. Control flow, barriers, wave/group semantics, data uniformity, execution uniformity, helper lanes, and all sorts of constructs need semantic grounding in the language. Some of the complexity is accidental, sure, but a lot of it is not. And something "massive like LLVM" isn't all that massive when juxtaposed with the problems that need to be solved.

By writing the compiler ourselves for a simple language, we could guarantee it'll be small, run fast, offer thread safety (so you can distribute compiles across CPU cores), accept a custom allocator, and be easily embedded in offline tools and also in games that want to compile shaders on-the-fly.

Nobody I know in the industry cares about parallel compilation of an individual shader. If you author a shader where this performance delta would be important, you've likely hit your GPU's shader instruction limit. We gain throughput by compiling many shaders in parallel (and games ship hundreds to tens of thousands of them).

Furthermore, existing compilers like DXC already support custom allocators directly, and this is common practice in all major game engines.

Tools are readily available that will translate shader source from one language to another, so our gamble is that in the worst case, it's just one more target and developers can write in HLSL/GLSL as they would anyhow.

More often than not, these tools do not exist, are not integrated, or are bad for various reasons (bad source mappings, or at the very least, another artifact you need to carry around in your debug pipeline). It's important for tools like PIX, RenderDoc, RGA, NSIGHT, and a plethora of others to be able to map bytecode back to source code. Most of these tools support HLSL and GLSL, and I have no idea what source-to-source translation tools are integrated in any of the above.

But also...I think this is worth saying out loud: almost every popular shading language made a conscious choice to be as close to C code as possible, and we can make some changes to that syntax to make a better language. We don't have to reinvent the wheel here, just make it a bit more round.

HLSL has templates. Templates! And classes, inheritance, namespaces, interfaces, and plenty of other constructs that don't resemble C. Making a new shading language is reinventing the wheel by definition...

jeremyong avatar Sep 03 '23 21:09 jeremyong

And something "massive like LLVM" isn't all that massive when juxtaposed with the problems that need to be solved.

What do you need LLVM for in a basic transpiler? You can leave complex optimization passes to the backend, that's probably a better idea anyway.

We gain throughput by compiling many shaders in parallel (and games ship hundreds to tens of thousands of them).

The paragraph you are quoting is talking about parallel shader compilation, not computation.

HLSL has templates. Templates! And classes, inheritance, namespaces, interfaces, and plenty of other constructs that don't resemble C. Making a new shading language is reinventing the wheel by definition...

I don't think SDLSL (or whatever the name ends up being) really needs any of those things. Just being able to write truly cross-platform shaders without crazy preprocessor hacks, and to compile them at runtime without a massive dependency, is much more valuable for me. As far as I'm aware, that wheel hasn't been invented so far, and it's a damn shame.

More often than not, these tools do not exist, are not integrated, or are bad for various reasons (bad source mappings, or at the very least, another artifact you need to carry around in your debug pipeline). It's important for tools like PIX, RenderDoc, RGA, NSIGHT, and a plethora of other tools to be able to map bytecode back to source code. Most of these tools support HLSL and GLSL and I have no idea what "source to source" translation tools are integrated in any of the above.

It's annoying but not the worst thing in the world. If SDLSL shaders can store some basic debugging information like variable and function names, and the transpiler is able to translate that into the backend format (e.g. SPIR-V), which can be decompiled into GLSL or HLSL by those tools, that should be good enough for debugging. It'll probably end up being pretty close to the actual source code too, if the language is kept simple.

Akaricchi avatar Sep 03 '23 23:09 Akaricchi

What do you need LLVM for in a basic transpiler? You can leave complex optimization passes to the backend, that's probably a better idea anyway.

It's not just LLVM, in the case of DXC, it's clang, which provides a full frontend with diagnostics and other features that operate on the AST itself, in addition to reflection capabilities used to reflect bindings, struct layouts, and other such niceties.

The paragraph you are quoting is talking about parallel shader compilation, not computation.

Yes obviously (computation was a typo on my part).

I don't think SDLSL (or whatever the name ends up being) really needs any of those things. Just being able to write truly cross-platform shaders without crazy preprocessor hacks, and to compile them at runtime without a massive dependency, is much more valuable for me. As far as I'm aware, that wheel hasn't been invented so far, and it's a damn shame.

The premise of the original statement was that "other languages are just based on C" which is trivial to refute. It honestly probably only applies somewhat to GLSL, but even that's not entirely fair.

It's annoying but not the worst thing in the world. If SDLSL shaders can store some basic debugging information like variable and function names, and the transpiler is able to translate that into the backend format (e.g. SPIR-V), which can be decompiled into GLSL or HLSL by those tools, that should be good enough for debugging. It'll probably end up being pretty close to the actual source code too, if the language is kept simple.

Getting symbols to show up properly for an NSIGHT capture is non-trivial, even if you go the DXC -> DXIL route. If you have a custom shading language, have fun, but good luck when you're trying to diagnose a bad page fault or performance issue.

If you want to make your own shading language and bytecode "just because," go for it, but the rationale as stated makes little sense to me.

jeremyong avatar Sep 04 '23 01:09 jeremyong

If using SDL_gpu means pulling in a big 3rd-party shader compiler and possibly a large WebGPU/WGSL runtime, well then forget it.

The point of SDL_gpu doesn't seem to be ultra-high-performance, AAA-quality 3D graphics anyway, but rather developers who want less demanding 3D (or 2D with shaders) without having to write and maintain their own backends for each platform they want to support. Using WGSL would mean adding a lot more complexity and surface area to the SDL_gpu API to support all of WGSL's additional features.

sridenour avatar Sep 04 '23 23:09 sridenour

Let's not all have this argument in the "what should we name the shading language?" issue, please.

icculus avatar Sep 05 '23 05:09 icculus

Was wondering why I was getting so many notifications all of a sudden from this. SDSL is still what I call it, because it has the same rhythm as GLSL/HLSL. And frankly, as I stated previously, it makes the most sense. All the other tangents here are just cluttering this space, and they're all redundant to the original question. It's up to icculus to choose regardless.

ENDESGA avatar Sep 05 '23 06:09 ENDESGA