Quote:
Originally posted by Cerowyn
The overhead of developing hardware abstraction layers (think OpenGL or DirectX), thread scheduling, memory management, etc. is too much for a game developer (who is already spending millions of dollars on just the game elements themselves) to shoulder without making games prohibitively expensive.
This is true, although back in the Win3.1 days it was less of a big deal to roll almost everything yourself. There were games that required you to exit Windows and run under DOS/4GW. The Wikipedia page for DOS/4GW mentions that it was popularized by Doom, so I guess that was one of them. I think some games even shipped with bootable DOS disks (probably using DR-DOS or FreeDOS) and DOS/4GW.
Nowadays you'd have to, as Cerowyn says, provide a 3D graphics subsystem, your own networking stack, drivers for all sorts of crazy hardware, etc. Maybe you could leverage Linux for this, but that's still a long way from your goal -- you'd still have to do all sorts of hardware detection and configuration every time you boot your game, which is painful. The benefit of using the user's installed OS is that in most cases the OS came with the proper drivers because it was preinstalled on the machine, and in the remaining cases, it's the user's problem to get the drivers working before your game even starts up.