Unix philosophy

It's much deeper than most people realize.

"Everything is a file", "everything is a stream of bytes", "do one thing and do it well" - these may sound like empty platitudes to any experienced hacker who has experienced the struggle of working with modern GNU/Linux systems - but they reflect, in fact, some of the most fundamental principles of software in general.

This is my attempt to reveal the peculiarities hiding beneath all the nice-sounding buzzwords that the whole industry keeps repeating but nobody bothers to follow. First, let's list some of the mistakes most companies make when building large software.

Dividing responsibility doesn't make you immune to architectural issues!

It just makes you blind to them, because nobody looks at the high-level structure.

Usually it happens like this: a team of 2-3 smart people sits down and writes up the high-level architecture for the whole project, based on assumptions and guesses about the problem domain (made even harder by the fad of "Agile" programming - mostly understood as "do the first thing that comes to your mind and put off fixing it until some other time") - then throws a bunch of programmers at specialized tasks without any knowledge of the underlying system.

Result: distributed stupidity.

A bunch of programmers who have no idea how anything outside their specific sub-domain works, each doing whatever it takes to make the next "iteration" run on their corner of the code hell, hoping that someone more motivated will eventually come down and fix it all.

The Unix Way: Start small and make the system easy to use and understand.
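
A minimal sketch of what "start small" can look like in practice: a complete, useful tool is often just a filter that reads one byte stream and writes another, knowing nothing about where the bytes come from or go. (The uppercasing filter below is an invented example, not any particular existing tool.)

    /* upper.c - a do-one-thing filter: copy stdin to stdout, uppercased.
     * Terminals, files and pipes all look the same to it. */
    #include <stdio.h>
    #include <ctype.h>

    int main(void)
    {
        int c;
        while ((c = getchar()) != EOF)
            putchar(toupper(c));
        return 0;
    }

Because it speaks plain byte streams, it composes with everything else unchanged, e.g. ps aux | ./upper | grep ROOT - a hypothetical pipeline, but any producer and consumer of text would do.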

Agile programmers should especially learn from the Unix way.