Is my game "based off" Portal because it borrows the same mechanics, puzzle-solving, enemy list, characters, and control scheme? Or is it a completely independent work?
Likewise, Portal shipped with Half-Life 2, and portions of the code used to make Portal work later found their way into Left 4 Dead and Counter-Strike. Some of the story elements (like "the cake is a lie") and the puzzle solving also showed up in later Valve games. Would it make sense to say that Left 4 Dead is based off of Portal because it's made by the same people and shares some code and design ideas? Or is it a completely different work because Left 4 Dead was a co-op run-from-zombies game while Portal was a solo puzzle-platformer?
Linux is "based off" of UNIX in the same way that the hypothetical 'fan-made Portal sequel' is.
Decades ago "UNIX" stopped referring to a particular operating system and became a certification. Imagine it's 1985 and Nintendo is busy making the first Metroid game. It ships and is hugely popular, so we get a couple of sequels. Other companies see that people like the whole "explore a huge map to collect power-ups" style of game. Some companies start making games that kinda look like Metroid but aren't very much fun: we're all disappointed.
Nintendo decides to create a 'certification', sort of like their "Seal of Quality", that you can buy for your game. If you make a game with a huge map and power-ups, you can ask Nintendo to sign off that it's "fun enough", that it has an appropriately large map with hidden areas, and that there are certain kinds of power-ups like the "ball". If you do that, you can use the 'Metroid' name in your game: maybe there's a 'Star Trek Metroids' or a 'Metroids vs. 1988 Olympics'. If you pay Nintendo and they say okay, then you can make a real Metroid game, not just a game with similar elements.
That's what it means to be a UNIX today: You sit down, you make an operating system, and then a third party checks what you've done and says "yup, this is UNIX" and then you get to call your operating system "UNIX".
In the case of UNIX, what matters is the system interfaces (that is: how my program talks to the operating system, not how users interact with the software). Being a UNIX lets me know with 100% certainty that there's a function called msgsnd() that accepts a queue identifier (an int), a pointer to a message structure, a size, and an int of flags, and returns an int, to push messages onto a queue. The UNIX specs are about maintaining source-level compatibility. It's a spec for an OS that can run on hardware from the 1960s until today, so it makes no sense for it to talk about things like integer sizes, how interrupt tables should be laid out, or calling conventions (what if your architecture doesn't have enough registers to pass all the arguments?). It tells you what the code will look like, not what the compiled binary will look like.
So why doesn't everyone go for UNIX certification? There's not much of a market for it. You can write a UNIX-like operating system (that is: an OS that will let you compile and run UNIX programs on it) without actually paying anybody. While having a certification that says "this will work" is nice, the real test is "does it work when I try". For the most part, Linux does the UNIX thing well enough that nobody is willing to pay for certification; these days there are only a handful of specialized cases where you'd actually care. If there's no money to be made, and you can do all the same things anyway, why bother with the certification?

That's why there are only a handful of UNIXes out there: they're systems targeted (mostly) at governments and large corporations with purchasing requirements that say "must use a UNIX system". They have those rules because they run specialized software that costs more than your house, with dedicated maintenance people earning 6-figure salaries to keep it going. Saving $50,000 on the hardware/software isn't worth the risk of running on something that isn't backed by somebody you can sue if it breaks.
Back to the BSD side of the question: it's a combination of things. Part of it was the popular support enjoyed by groups like the FSF and the GNU project: the BSD license offers some freedoms that the GPL doesn't, but those freedoms didn't seem to appeal to people that much. Part of it might be that BSD generally follows the various specs that make up UNIX pretty closely, which means there's less room to experiment.
A couple of years ago I ran the team that maintained software for AIX (a "big iron UNIX" if there ever was one). I spent most of my day doing the actual work on a MacBook Pro, then I'd transfer the source over and build it with IBM's compiler when it was ready to ship.
Likewise, about 99% of the programs I use on a daily basis work equally well on Mac OS X (a real UNIX), BSD (I can't believe it's not UNIX), and Linux (not UNIX, but who cares). The only caveat is that I have to compile them for each operating system; I can't just download some file and run it like it's a Java JAR or something.
There are two things to know:
So far as I know, there isn't a popular Linux or BSD distribution that ships with support for the Mach-O binary format used by Mac OS X, and Mac OS X doesn't support the ELF format used by most Linux distributions.
Mac OS X does understand the XCOFF binary format used by AIX, so in principle you should be able to build a binary that runs on both AIX and Mac OS X, but I've never tried.
For example, asking AIX (which ships with X11) to move windows around on the screen is done using the X protocol. On Mac OS X, access to other windows is restricted by default (to improve security: X11 sucks in this respect). If you want to ask some window that you don't own to resize, you have to go through the Accessibility framework, where the user can approve your application's access request.
Unfortunately, X11 isn't a standard part of Mac OS X (though Apple does make a package available to install it, you can't count on it being there). Likewise, Apple's Accessibility framework doesn't run anywhere but Mac OS X. Without that common ground, I can't write software that runs on both Mac OS X and AIX without a bunch of hacks to make it work.
That's really the problem we haven't addressed: UNIX provides common ground for solving very low-level problems, but once you get much higher (e.g. anything above reading/writing data or accessing memory), you're outside the scope of what it means to be UNIX, and systems will differ.
That sort of 'common ground' problem means that you have to stick to pretty low-level solutions in order to be truly cross-platform source-compatible.
Mac OS X provides a handful of libraries that are also common on UNIX, BSD, and Linux systems (e.g. Apple owns CUPS, which is the most popular way to handle printers), so you could write a (command-line/ncurses) printer manager that should work basically everywhere. What those systems don't share is all the stuff that makes for pretty graphical applications. If you wanted to run the Mac OS X version of Microsoft Office or Adobe Photoshop on Solaris, you wouldn't get very far, because those applications depend on Mac-specific functionality that you won't find on other UNIX systems.