“Back in the day”, you might say, “I created my site with Vi and was proud of it”. Well, I did. Can’t say I was proud, as such, just that the technology of the day was such that Vi was a perfectly usable tool (for those who knew how to use it) for building websites. Later on, Dreamweaver, ColdFusion, Quanta, NetObjects and a multitude of other programs joined the fray, promising assistance that ranged from simply completing your tags for you to figuring out what CSS you should use.
These days, I believe we have largely moved beyond hand-crafted web pages. At least, I have: my first site was actually composed using “vi” on an HP Unix workstation and uploaded to the then-new Demon Internet. Later sites evolved slightly; I used “vi” or, sometimes, Quanta, the KDE web development tool, to edit them, and they employed a small amount of automation: server-side includes ensured that the menus, header and footer of the site were kept consistent across all the pages.
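For those who never met them, server-side includes were about as simple as automation gets: the web server scanned each page for special comments and spliced in the named files before sending the result out. Something along these lines (the file names here are purely illustrative):

```html
<!-- page.shtml: the server expands each directive before serving the page -->
<!--#include virtual="/includes/header.html" -->
<!--#include virtual="/includes/menu.html" -->
<p>The content unique to this page goes here.</p>
<!--#include virtual="/includes/footer.html" -->
```

Change `menu.html` once and every page on the site picks up the new menu.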
These days, we expect somewhat more: dynamic content that changes as we use the site, like Amazon’s “Page You Made” or the BBC site’s ability to adjust the homepage topics you see to suit your tastes. Even smaller sites may find a use for enabling content depending on access privileges, so that staff members can see the intranet links and the general public can’t. Doing things this way, however, is a paradigm shift. No longer is there a concept of one page in which you put this content, and another page in which you put that. Now, we have multiple options, and it may even be the user who ends up selecting which options they see.
Building a website using that principle, then, is going to take more than just a page of HTML and the Vi editor. One of the tools that is rapidly coming to the fore in this area is the CMS, or Content Management System, and it is what it says, really: a way of managing a bunch of content that goes to form your website. There are several technologies involved, one of the main ones being PHP, the recursively named “PHP: Hypertext Preprocessor”, which is, now, a high-level language that lives within the web server. PHP programs executing on the server can therefore respond to requests for pages by constructing the page on the fly, rather than just sending back a pre-written file as used to happen. Doing this isn’t really that hard: you need some spec of what the pages are called and a source of the content - sometimes as snippets stored in files. The tricky part is not spending several years making it happen - and then another couple when the CEO signals a shift in focus. One of the ways of avoiding this is to use someone else’s labour, and that is where systems like Drupal come in.
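The “construct the page on the fly” idea really is that small. Here is a minimal sketch - in Python rather than PHP, and with invented names throughout - of a single script that answers every page request by looking up a stored snippet and wrapping it in HTML:

```python
# A toy "build the page on the fly" responder. In a real CMS the
# snippets would live in files or a database; here a dict stands in.
SNIPPETS = {
    "home": "Welcome to the site.",
    "about": "We make things.",
}

def render(page_name):
    """Build a complete page from a stored snippet, not a static file."""
    body = SNIPPETS.get(page_name, "Page not found.")
    return f"<html><body><h1>{page_name}</h1><p>{body}</p></body></html>"

print(render("home"))
```

Everything a CMS adds - users, permissions, menus, themes - is elaboration on that one loop.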
Drupal is a CMS, and it’s also written in PHP - PHP 5.3, to be precise. There are alternatives: Joomla and WordPress are the more obvious ones. Drupal is also GPL licensed, and this is important because everything in Drupal - the core and almost all of the additional stuff - is free to use, free to change to suit your needs, and free to improve, whether the user is a Fortune 500 company (and some are) or a single hacker in their mum’s bedroom (and some are). This freedom is partly about money, but mostly about the ability to do stuff. Hosting the website will cost as much as any other site, and you may still need to pay someone to make stuff that’s peculiar to your site. But in using the huge (2000+) suite of preconstructed elements that is Drupal, you are avoiding having to maintain, find security holes in, fix bugs in, and add new features to what is normally a very large chunk of your site. Only the “glue” - perhaps some parts of the layout (theme) - will be solely your responsibility. And that is good. Very good. A modern CMS like Drupal is not small, and it can wield a staggering degree of functionality out of the box. To maintain it for just one company would be cripplingly expensive.
So what *is* Drupal then?
Well, it’s a PHP program that runs in the web server. Requests for pages from web browsers are all fed to one entry point, and a database - often MySQL - is consulted to work out what to do next. Components of Drupal called modules can signal that some pages “belong” to them, and they then take up the task of building some of the main page elements, but other modules can, and normally do, chip in with other elements. For example, the main content may have a “breadcrumb” trail - the “parent” and “grandparent” pages of the current one - at the top of the page. Another module may add a menu, to indicate other pages on the site, and yet another may add the footer. The whole is pulled together and then themed: a consistent set of HTML tags and CSS rules is applied so that the page looks as the designer intends. Only after this is complete is the page sent to the user’s browser.
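The flow just described can be sketched in miniature - again in Python, with every name invented for illustration rather than taken from Drupal’s actual API. Modules register themselves, each contributes its element, and a final “theme” step wraps the lot:

```python
# A toy version of the assembly described above: registered "modules"
# each chip in an element, then a theme step wraps the whole page.
modules = []

def register(fn):
    modules.append(fn)
    return fn

@register
def breadcrumb(path):
    # The "parent" and "grandparent" pages of the current one.
    parents = path.strip("/").split("/")[:-1]
    return "Home > " + " > ".join(parents) if parents else "Home"

@register
def menu(path):
    return "[About] [News] [Contact]"

@register
def footer(path):
    return "(c) Example Site"

def build_page(path, content):
    elements = [m(path) for m in modules]           # each module chips in
    body = "\n".join([elements[0], content] + elements[1:])
    return f"<html><body>\n{body}\n</body></html>"  # the "theme" step

print(build_page("/news/2011/drupal", "Main article text."))
```

Only once every module has contributed and the wrapper is applied does anything go back to the browser - which is exactly why the next question tends to get asked.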
Surely this takes a very long time?
Well, it can. However, we now routinely run web servers on machines that would have been considered supercomputers not that long ago. Drupal also does some very intelligent caching: not just caching whole pages - as tools like Squid do - but caching page elements, lookups and menus, so that when a request comes in the answer is already prepared and can just be squirted back.
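The trick with element-level caching is that the expensive work - say, the database queries behind a menu - happens once, and every later request reuses the stored fragment. A minimal sketch of the idea, with all names invented for illustration:

```python
# Cache individual page fragments, not just whole pages.
cache = {}
calls = {"menu": 0}

def cached(key, build):
    """Return the fragment for `key`, building it only on the first request."""
    if key not in cache:
        cache[key] = build()
    return cache[key]

def build_menu():
    calls["menu"] += 1           # stands in for a slow database lookup
    return "[About] [News] [Contact]"

first = cached("menu", build_menu)
second = cached("menu", build_menu)  # served from the cache; no rebuild
print(calls["menu"])                 # the expensive work ran exactly once
```

Because the menu fragment is cached separately from the page, it stays valid even when the main content changes - which is where this approach beats whole-page caching.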
Isn’t it very complicated then?
Well, it can be, but one of the delights of Drupal is that you can get into it and do stuff with very little. Some experience setting up a web server - about as much as you needed a decade ago - is needed to get started, and then you can do an enormous amount with just a web browser, clicking and selecting stuff as you want. Some things - creating new modules, or making a new theme, for example - need more knowledge, but there is such a wide variety of module code and themes already available that simple requirements probably won’t call for it.
Where can I get it?
Jump off to Drupal.org for a wander around, and if you want a chat with folk, they can be found on IRC too.