I feel I owe folks an apology... the bit about it being a devkit was supposed to be tongue-in-cheek and I completely failed at conveying that. So sorry, chaps. I'm not sure where the whole NDA thing stands regarding devkit specs, but they are certainly nowhere near the system spec I use and outlined above. What I can say, though, is that a heck of a lot of next-gen console development is largely carried out on nicely-specced PC workstations that are generally higher-spec than the console (and its devkit variant). Prior to a new console launch, developers are briefed and provided with a general system specification of what the new system(s) are likely to be capable of (or at least some of the more popular or first-party developers are briefed!), and obviously this gives a developer an early insight into what they may or may not be able to do in terms of exploiting the system. But relying on those specs is a dangerous game, as they can and do change; hence game engines have to be scalable, catering for the lowest common denominator right up to the latest bleeding-edge hardware... in an ideal world.
Typical multi-platform game development studios will have a build environment whereby a common code base is shared across all platforms - i.e. code that can run on any platform and doesn't tie itself specifically to one system. In addition, they'll usually have an abstracted game engine of some flavour that allows them to deploy a project specifically for a given system. That way, the developer can issue a "draw my polygon" command using the abstracted API (without worrying about whether it's on PS4, XB1, or PC) and, depending on which system it is deployed to, the relevant platform-specific code implements that "draw my polygon" command. This is obviously a somewhat simplified explanation, but you get the idea... and it's not always so straightforward, but I'm talking ideal world here.
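For anyone curious what that abstraction looks like in practice, here's a tiny C++ sketch of the idea. All the names (Renderer, PCRenderer, drawPolygon, drawLevel) are made up for illustration - no real engine does it quite this simply - but the shape is the same: game code talks to a common interface, and each platform supplies its own implementation behind it.

    #include <cstdio>
    #include <memory>

    // Common interface the platform-agnostic game code talks to.
    class Renderer {
    public:
        virtual ~Renderer() = default;
        virtual void drawPolygon(const float* verts, int count) = 0;
    };

    // One implementation per target platform; the build for each
    // platform compiles in the relevant backend only.
    class PCRenderer : public Renderer {
    public:
        void drawPolygon(const float* verts, int count) override {
            // ...real code would call into the PC graphics API here...
            std::printf("PC backend: drawing polygon with %d vertices\n", count);
        }
    };

    // Shared game code: it only ever sees the abstract Renderer.
    void drawLevel(Renderer& r) {
        const float tri[] = { 0.f, 0.f,  1.f, 0.f,  0.f, 1.f };
        r.drawPolygon(tri, 3);
    }

    int main() {
        std::unique_ptr<Renderer> r = std::make_unique<PCRenderer>();
        drawLevel(*r);  // same call regardless of platform; only the backend differs
    }

Swap PCRenderer for a console-specific implementation at build time and drawLevel doesn't change at all - that's the whole point of the shared code base.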
As a rule of thumb, devkits are usually a souped-up version of the consumer system - with additional RAM and facilities for debugging and profiling code (through software and/or hardware hooks). To answer some of the above: yes, debugging is an expensive process and massively hits performance. It's not uncommon to see 60fps games (or 100Hz physics engines) brought to their knees. Thankfully, consumers only get to see the release build. I can't really say much more because I know nuffink.
As for the two machines I listed above, the i7-990X Extreme-based box is the everyday machine and has the SLI'd 680s, but only 32GB of RAM. This system also has the Samsung 840 Pros, and they are fantastic! It's the dual Xeons (the other system) that pack the real power, and these are used for serious number-crunching / processing of large data sets. In terms of GPU, that box is kitted out with a high-end Quadro card (Quadros are cr*p for games and stupidly priced).