However, PHP has its downsides as well: standard GET-style query-string links aren't all that search-engine friendly. Granted, you're probably better off with some sort of templating system, but you can still change sections of code across multiple HTML documents very quickly if you know what you're doing on a *nix box.
As for making things search-engine friendly, I find that having a PHP script parse the REQUEST_URI and serve up what looks like a plain .htm or .html page works pretty damn well. For example:
Using an .htaccess file you can tell Apache to treat a file as executable PHP even if it has no .php extension. Now, say you have a script that generates the pages of your comic (let's call it comic.php). You'd then create another file, called simply comic, that does nothing but include (or require) comic.php.
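As a sketch of the .htaccess part (this assumes a classic mod_php setup; on PHP-FPM setups the handler is configured differently, e.g. via SetHandler and a proxy target):

```apache
# Run the extensionless file "comic" through the PHP handler.
# <Files> scopes this to that one file, so other extensionless
# files in the directory are left alone.
<Files comic>
    ForceType application/x-httpd-php
</Files>
```

The comic file itself then just contains something like `<?php require 'comic.php';` and nothing else.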
Now you can go to
http://www.mysite.com/comic and apache will execute comic.php (as the include suggests). But this is just the start. The comic.php script would first parse the REQUEST_URI from the $_SERVER global and pull some unique url from it to generate the page. Because of the way this works, going to the url
http://www.mysite.com/comic/comic1page1.html would STILL execute comic.php. So when comic.php executes and parses the url, it would be able to tell that it needs to render comic #1's first page (you can have the URL setup and parse however you like).
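A minimal sketch of what comic.php might do, assuming the comicNpageM.html scheme from the example (the render_comic_page() function is hypothetical, standing in for whatever actually builds the page):

```php
<?php
// comic.php - pulled in by the extensionless "comic" file.
// The browser asked for something like /comic/comic1page1.html;
// dig the comic and page numbers out of the request URI.
$uri = $_SERVER['REQUEST_URI'];

// Sensible defaults if the URL doesn't match the scheme
// (e.g. someone hits /comic with no page specified).
$comic = 1;
$page  = 1;

if (preg_match('#/comic/comic(\d+)page(\d+)\.html$#', $uri, $m)) {
    $comic = (int) $m[1];
    $page  = (int) $m[2];
}

// ...now generate the page however you normally would:
echo render_comic_page($comic, $page); // hypothetical rendering function
```

The regex is just one way to do it; since the whole path is yours to invent, you can make the URL scheme as simple or as fancy as you like.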
Parsing aside, the URL http://www.mysite.com/comic/comic1page1.html is handled exactly the way you'd normally handle a GET request like http://www.mysite.com/comic.php?comic=1&page=1. The only difference is that a search engine will pick up the former much more easily.
NOTE: Sorry if this seems a bit confusing; it's almost 2:00am and it's been a rough day.