Having an idea of how well a given (web) application performs is one of the important things to consider when evaluating it. How would it behave on a "normal" day, on a busy day, or during a social-media-driven visitor flood? How many system resources does the application need under each of those circumstances? How much storage does your application need after a few hundred, thousand or million users? And how fast will it be after that?
You would want to know these things before you put it online.
Gathering and analyzing this information can point you to bottlenecks in your design or infrastructure and, after you fix the problems, can help you verify that everything performs well again.
A benchmarking tool could also help you choose an appropriate solution or framework for the job at hand.
Running these benchmarks by hand can be cumbersome, so you could automate that as well. Adding graphs, charts and advanced report generation would be an interesting next step.
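To make the automation idea concrete, here is a minimal sketch of such a harness in Python. Everything in it is illustrative: `run_benchmark`, `summarize` and the dummy workload are invented names, and in practice the workload would be a real request against the application under test.

```python
import time
from statistics import mean, quantiles

def summarize(samples):
    """Reduce raw timing samples to the numbers worth graphing."""
    pct = quantiles(samples, n=100)  # pct[94] is the 95th percentile
    return {"min": min(samples), "mean": mean(samples),
            "p95": pct[94], "max": max(samples)}

def run_benchmark(workload, iterations=50):
    """Time `workload` repeatedly and return a summary report."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        workload()
        samples.append(time.perf_counter() - start)
    return summarize(samples)

# Stand-in workload for demonstration; a real harness would issue an
# HTTP request or invoke the application under test here.
def dummy_workload():
    sum(range(10_000))

report = run_benchmark(dummy_workload)
print({k: "%.6f" % v for k, v in report.items()})
```

A scheduler (cron, CI job) could run this periodically and feed the summaries into whatever graphing or reporting you bolt on top.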
Of course, there is already quite a lot of good benchmarking software that may well scratch your itch. Some of these tools support protocols other than HTTP and/or offer other nice features. Here are a few:
For ad-hoc benchmarking and integration within scripts, quite a few people seem to prefer httperf. Tsung, for example, seems able to generate a realistic load on the target machines using various protocols.
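As a rough illustration of the fixed-rate request pattern that tools in this space generate, here is a self-contained Python sketch. It is not how httperf or Tsung work internally; the local test server, `run_fixed_rate` and all other names are invented for the example.

```python
import threading
import time
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler
from statistics import mean

class QuietHandler(SimpleHTTPRequestHandler):
    """Suppress access logging so the benchmark output stays readable."""
    def log_message(self, *args):
        pass

def run_fixed_rate(url, rate_per_sec, num_requests):
    """Issue num_requests GETs at a fixed rate; return per-request latencies."""
    interval = 1.0 / rate_per_sec
    latencies = []
    for _ in range(num_requests):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        latencies.append(time.perf_counter() - start)
        # Sleep off the remainder of the interval to hold the target rate.
        elapsed = time.perf_counter() - start
        if elapsed < interval:
            time.sleep(interval - elapsed)
    return latencies

# Serve the current directory on an ephemeral port, purely so the demo
# has something local to benchmark.
server = HTTPServer(("127.0.0.1", 0), QuietHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/" % server.server_address[1]
latencies = run_fixed_rate(url, rate_per_sec=20, num_requests=10)
print("requests: %d, mean latency: %.4f s" % (len(latencies), mean(latencies)))
server.shutdown()
```

Real tools add the parts that matter at scale: many concurrent connections, ramp-up schedules, and detailed reporting.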
Before conducting more serious benchmarks, there are a few special things to consider: