
Why was Doom developed on a NeXT?

What was the reasoning behind using a NeXT computer to develop Doom? Why did id Software decide this was a better option than a PC, considering their games primarily targeted the PC market? It seems like an obscure (and expensive?) choice in hindsight.

4 Answers
John Carmack 

No regrets at all!

I bought our first NeXT (a ColorStation) just out of personal interest. Jason Blochowiak had talked to me about the advantages of Unix-based systems from his time at college, and I was interested in seeing what Steve Jobs' next big thing was. It is funny to look back: I can remember honestly wondering what the advantages of a real multi-process development environment would be over the DOS and older Apple environments we were using. Actually using the NeXT was an eye-opener, and it was quickly clear to me that it had a lot of tangible advantages for us, so we moved everything but pixel art (which was still done in Deluxe Paint on DOS) over. Using Interface Builder for our game editors was a NeXT-unique advantage, but most Unix systems would have provided similar general-purpose software development advantages (the debugger wasn't nearly as good as Turbo Debugger 386, though!). Kevin Cloud even did our game manuals, starting with Wolfenstein 3D, in FrameMaker on a NeXT.

This was all in the context of DOS or Windows 3.x; it was revelatory to have a computer system that didn’t crash all the time. By the time Quake 2 came around, Windows NT was in a similar didn’t-crash-all-the-time state, it had hardware accelerated OpenGL, and Visual Studio was getting really good, so I didn’t feel too bad moving over to it. At that transition point I did evaluate most of the other Unix workstations, and didn’t find a strong enough reason not to go with Microsoft for our desktop systems.

Over the entire course of Doom and Quake 1's development we probably spent $100,000 on NeXT computers, which isn't much at all in the larger scheme of development. We later spent more than that on Unix SMP server systems (first a quad Alpha, then eventually a 16-way SGI system) to run the time-consuming lighting and visibility calculations for the Quake series. I remember one year looking at the Top 500 supercomputer list and thinking that if we had expanded our SGI to 32 processors, we would have just snuck in at the bottom.

Vladislav Zorov

In addition to Andrea Ferro's answer (that it had good development environments), keep in mind you're talking about this man: John Carmack coded Quake on a 28-inch 16:9 1080p monitor in 1995.

What did you have in 1995? An Intel Pentium 90 MHz? From the point of view of something that can drive a Full HD monitor (1920x1080), that’s a toy :)

About the choice being expensive: in software development, the hardware is often the cheapest component. The salary of a programmer can be over a hundred thousand dollars per year, and a game takes multiple years to make, so spending $10–15k on a computer just isn't that big a deal.

P.S. There's now an official answer: John Carmack's answer to Why was Doom developed on a NeXT?

Andrea Ferro

Because at the time Doom was developed, NeXT had the best software development environment in existence. It was very similar to (actually a little better than) what we have today on a Mac.

By comparison, the software development environment for Windows at the time was Notepad and the DOS command line.

Jenny Baron

So if NeXT was based on Unix, and MacOS is a derivative of that, why can't Linux get to the level of the MacOS GUI?
There are a couple of issues:

First is willpower. Linux development is done either by hobbyists or, to some degree, by companies. Hobbyists work on whatever they want, and it's often not graphics stuff. Companies (and the distributions count here) work on whatever they think they need, which often is not graphics stuff either. Apple can order 1500 people to work on graphics stuff.

Second is inertia. For various technical and philosophical reasons, people in Linux land like to keep using the same software and programming interfaces even if they are extremely old. The X11 window system is ancient in computer terms, and these days it is something of a large series of hacks built on top of each other to get the vaguely modern features that are available. A ton of people consider that good enough. That makes progress incredibly difficult, because everyone is held back by the windowing system.

The Wayland windowing system is a pretty big step forward here and looks like it's going to end up taking over, but that'll be a while. I seem to remember that Ubuntu has their own as well, but I don't remember what it's called.

Third is taste. Apple has a lot of it (in my opinion), but they also have decades of experience and researchers and human interface labs and all sorts of resources that the vast majority of open source software doesn't have. So open source software often looks like a clone of other software (GIMP versus Photoshop) or just has some sort of generic or inscrutable interface. A lot of the most popular desktop environments on Linux look a hell of a lot like stuff that was on Windows or Mac OS X. Or they look like stuff from the 80s, because the developers were used to that and like it. Either way, Apple has graphic designers it can put on any project, whereas there aren't a lot of graphic designers who donate their time to open source projects. So a lot of open source interfaces are made by programmers doing their best, but that often doesn't compare. Even if a graphic designer came along and suggested something, it's possible the programmers would reject it due to their own personal tastes.

Finally there's focus. Apple has one desktop operating system and it looks a certain way. They spend all their time on it. There are two major desktop environments on Linux, along with a number of smaller ones. Some distributions have their own. Some may be Linux-only; others are constrained by what's available on the other platforms they support, like OpenBSD or FreeBSD. In short, there's a nontrivial amount of duplicated effort. Whether that's good or bad depends on how you see the situation.

But you also have choices being made. Apple goes out of their way to make their desktop extremely smooth and nice to use. The Linux kernel would never accept patches that make the GUI much smoother at the expense of the system running efficiently for other things; the patches would have to have a negligible effect otherwise to get accepted. Apple can decide that if something makes the GUI smoother or enables a neat new feature but slows the absolute maximum network speed by 1%, that's OK. They have an absolute focus on user experience for their software. Linux and other open-source software doesn't. To some degree Windows doesn't either.

Actually, Android is an excellent example of this. Google took Linux, applied a ton of patches, wrote their own GUI layer, and did some other stuff to get the UI as good as they could and make some of the things they cared about easy to do. In the end it's basically not Linux (as in GNU/Linux, the whole OS); it just uses the kernel. Over the last couple of years, Android has slowly been getting some of its code changes into the kernel, and some of the updates made to the kernel by the normal process have replaced some of Android's custom code, making everyone's lives better. But that takes a lot of time and a company the size of Google to do it. It would be a Herculean task for a small team of developers. But that's what it takes to compete with Apple's GUI.

The long and short of it is that it's hard to make a really good GUI on Linux. Distributions can try to make things better (Ubuntu has done a great job here and pushed user experience A LOT compared to previous distros), but it's hard to get the kind of singular focus that Apple (or Microsoft or Google) can choose to apply when a huge chunk of your labor force is volunteer.

On the other hand, Linux has produced an incredible server operating system that's amazingly flexible. Open source has also produced a number of others, like OpenBSD and FreeBSD. OS X has never been anywhere near as good at being a server as Linux is, thanks to its relentless pursuit of excellence in scalability and high-speed operation. That's the trade-off the Linux community as a whole seems to have made.
