What's wrong with d3d12? It works perfectly fine for what it does. In my experience it causes a lot fewer issues than Vulkan. And it's not really due to Windows not supporting Vulkan correctly, since my experience with Vulkan has mostly been on Linux.
I don't dislike Vulkan either; it's just that I don't see the point of replacing something that works pretty well.
Adopting Vulkan doesn't mean removing DirectX 12, just like adopting SPIR-V doesn't mean removing HLSL. No one said anything about getting rid of anything.
I don't think it's reinventing the wheel, since Vulkan was ready quite a bit after d3d12, but yeah, I guess maybe it could become the standard on Windows after d3d12 becomes obsolete...
But that's going to be in quite a while since I can't think of an actual feature (for end users) that is missing from one vs the other right now.
Everything on Windows already uses d3d12/DirectX basically so it would actually be a huge wheel reinvention to migrate to a standard just for the sake of it.
I think saying that DX was first, so it's Vulkan that was reinventing the wheel, is incorrect given the historical context.
AMD and DICE developed a prototype API called Mantle, which is what both DX12 and Vulkan are based on.
Both Vulkan (glNext back then) and DX12 were announced around the same time. VK came a bit later, as standards bodies are usually slower to reach decisions, but it's not like VK was reinventing anything from DX.
I remember we were having a laugh reading early DX12 documentation as it was in parts just copied from Mantle with names unchanged in places!
> DirectX 12 was announced by Microsoft at GDC on March 20, 2014, and was officially launched alongside Windows 10 on July 29, 2015.
> Vulkan 1.0 was released in February 2016.
What people forget is that Mantle was basically a proprietary AMD API that they developed and kept to themselves until, well, the release of Metal in 2014 and DX12 in 2015.
Only then did they "graciously" donate Mantle to Khronos for the development of modern APIs.
Vulkan was not just late. It suffers from the same issues as OpenGL before it: designed by committee, lackluster support from the major players.
AMD indicated from the beginning they wanted it to become the universal API.
Opening stuff up formally also takes time. So it all was going towards Vulkan in one form or another and no one was forcing MS to push DX12 NIH while this was happening.
And counter to your point, despite Mantle being "proprietary", MS directly used it to create DX12 (same as Vulkan used it), so AMD clearly didn't have any complaints about that.
> AMD indicated from the beginning they wanted it to become the universal API.
Was it an indication, or was there any actual work done? Such as supporting anything other than AMD cards, inviting others to collaborate, etc.?
> despite Mantle being "proprietary", MS directly used it to create DX12
I can't remember the term for it: what do you call it when a single company develops something with little to no external input and collaboration, even if it's sorta kinda open?
As for "NIH"... Microsoft has/had a much bigger investment and interest in new gaming APIs than the few AMD cards that Mantle supported. And they already had their own platform, their own APIs, etc. Makes sense for them to move forward without waiting for anyone.
Over time the work was obviously done for Mantle → glNext → Vulkan. And that's because AMD were positive about this idea. Their initial presentation of Mantle was in that vein, i.e. to kickstart the progress of a common API.
MS just decided to do that whole thing for their NIH in parallel using parts of Mantle practically verbatim. It wouldn't have been possible without AMD basically allowing it.
Ah you are right, I forgot that they both were announced at around the same time. It just feels like Vulkan took forever. To the point where some teams at my job had to use OpenGL even for greenfield projects for quite a while after Vulkan was first announced (even when they wanted to use Vulkan).
I wonder if that means that dx12 and Vulkan could have a good interop/compatibility story, since they both have similar origins.
The same can also be said about D3D12; it is at least 'heavily inspired' by Mantle. In the end, not much of Mantle survived in Vulkan either, though. Mantle was a much cleaner API than Vulkan because it didn't have to cover as many GPU architectures as Vulkan does (Mantle especially didn't have to care about supporting shitty mobile GPUs).
It's mostly on us, the developers.
Vulkan is fully supported on windows.
I would say that if you want to have multi-platform support just use Vulkan.
Covers most of the platforms (especially if you include MoltenVK) [0].
Though, for games, if you want to support Xbox, that usually throws a curveball into API choice planning, as that might be a more important target than Linux/Android/Mac/iOS (maybe even combined) for your game. So if you already have to support DX for that..
vulkan is already supported on windows as a first-class citizen by all major IHVs. I am not sure what this "adoption" you speak of would entail. If you're talking about replacing d3d12, that actually is a terrible idea.
what do you mean when you say "built into the os"? d3d12 is just an api. the d3d runtime is user-space, and both the UMD it calls into and the KMD are supplied by the hardware vendor. In the end, both a d3d app and a vulkan app end up talking to the very same KMD. See here for reference:
D3D is clearly more integrated into the OS than Vulkan is.
Most importantly, Windows includes a software D3D renderer (WARP) so apps can depend on it always being present (even if the performance isn’t spectacular). There are lots of situations where Vulkan isn’t present on Windows, for example a Remote Desktop/terminal server session, or machines with old/low-end video cards.
These might not be important for AAA games, but for normal applications they are.
Another example: Windows doesn’t include the Vulkan loader (vulkan-1.dll), apps need to bundle/install that.
> D3D is clearly more integrated into the OS than Vulkan is.
sure, but addressing the two points that you brought up would not entail changing windows _the operating system_, just the stuff that ships with it. you could easily ship SwiftShader alongside warp, plus the loader library; both of those are just application libraries as far as the os/kernel is concerned. of course now we're in the territory of arguing about "what constitutes an OS" :-)
I say this because vulkan is hamstrung by being an "open API" intended to run on a very wide range of devices including mobiles. this has major repercussions, like the awkward descriptor set binding model (whereas d3d12's descriptor heaps are both easier to deal with and map better to the actual hardware that d3d12 is intended to run on, see e.g. https://www.gfxstrand.net/faith/blog/2022/08/descriptors-are...). overall d3d has the benefit of a narrower scope.
Another problem with being an open API is that (and this is my own speculation) it's easier for IHVs to collaborate with just Microsoft to move faster and hammer out the APIs for upcoming novel features like work graphs for example, vs bringing it into the public working group and "showing their cards" so to speak. This is probably why vk gets all new shiny stuff like rtrt, mesh shaders etc. only after it has been in d3d for a while.
One could argue this is all solvable by "just" adding a torrent of extensions to vulkan but it's really not clear to me what that path offers vs d3d.
I would guess that if DX didn't exist the iteration on VK side would just be faster. Through extensions, like you've mentioned.
In the end it might have even sped up the adoption of such features. Currently, if you have a multiplatform engine, even though Windows is like 99% of your PC player base, it's still sometimes a tough decision to just use a feature that you can't support on all your targets.
Does that support extend to ARM? Not sure if it's still the case, but I recall that early Windows on ARM devices didn't have native Vulkan (and I believe OpenGL was translated to DirectX via ANGLE).
I haven't laid my hands on any ARM windows devices so I wouldn't be able to tell you. I'd be somewhat surprised if the newer snapdragon stuff doesn't have vulkan support because qcom supports vulkan first-class on its gpus. in fact, on newer android devices OpenGL support might already be implemented on top of vulkan, but don't quote me on that.
But are you saying that compared to DX or just in general?
We're talking here about potential DX replacement, not about design in general and the bulk of it is very similar for both APIs.
There are some small quirks from Vulkan being made to be easily extensible which in the end I consider worth it.
I personally like how consistent the API is in both patterns and naming. After using it for a while, it's easy to infer what function will do from the name, how it will handle memory, and what you'll need to do with that object after the fact.
The DirectX specs are much better than both the OpenGL and Vulkan specs because they also go into implementation details and are written in 'documentation language', not 'spec language':
If you search for a 'D3D12 spec', what you actually find is that D3D12 doesn't have a specification at all. D3D12's "spec" is only a document that states the differences from D3D11. There's no complete, holistic document that describes D3D12 entirely in terms of D3D12. You have to cross-reference back and forth between the two documents and try to make sense of it.
Many of D3D12's newer features (e.g. Enhanced Barriers, which are largely a clone of Vulkan's pipeline barriers) are woefully underspecified, with no real description of their precise semantics. Just finding out whether a function is safe to call from multiple threads simultaneously is quite difficult.
I don't think that going into implementation details is what I would expect from an interface specification. The interface exists precisely to isolate the API consumer from the implementation details.
And while they're much better than nothing, those documents are certainly not a specification. They're individual documents, each covering a part of the API, with very spotty coverage (mostly focusing on new features) and an unclear relationship to one another.
For example, the precise semantics of ResourceBarrier() are nowhere to be found. You can infer something from the extended barrier documentation; something is written on the function's MSDN page (with vague references to concepts like "promoting" and "decaying"); something else is written on other random MSDN pages (which you only discover by browsing around, since there are no specific links). But at the end of the day you're left to guess the actual assumptions you can make.
*EDIT* I don't mean to say that the Vulkan or SPIR-V specification is perfect either. One still has a lot of doubts while reading them. But at least there is an attempt at writing a document that specifies the entire contract between the API implementer and the API consumer. Missing points are in general considered bugs and sometimes fixed.
> I don't think that going into implementation details is what I would expect from an interface specification.
I guess that's why Microsoft calls it an "engineering spec", but I prefer that sort of specification over the Vulkan or GL spec, TBH.
> The interface exists precisely to isolate the API consumer from the implementation details.
In theory that's a good thing, but at least the GL spec was quite useless because concrete drivers still interpreted the specification differently - or were just plain buggy.
Writing GL code precisely against the spec didn't help with making that GL code run on specific drivers at all, and Khronos only worried about their spec, not about the quality of vendor drivers (while some GPU vendors didn't worry much about the quality of their GL drivers either).
The D3D engineering specs seem to be grounded much more in the real world, and the additional information that goes beyond the interface description is extremely helpful (source access would be better of course).