The post is nice, as it makes two good points:
First, use a dedicated system for testing; do not use your local machine.
Second, be aware of warm-up times and exclude them from test results.
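To illustrate the warm-up point, here is a minimal sketch (the class name and workload are hypothetical, chosen only for illustration; it assumes a JVM runtime, where JIT compilation makes the first iterations unrepresentative):

```java
// Minimal sketch: separate a warm-up phase from the measured phase.
// The workload and all names are illustrative, not from the post.
public class WarmupBenchmark {

    // Hypothetical workload under test: sums the first n integers.
    static long workload(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        final int n = 1_000_000;
        final int iterations = 10;

        // Warm-up phase: let the JIT compile the hot path.
        // These timings are deliberately discarded.
        for (int i = 0; i < iterations; i++) {
            workload(n);
        }

        // Measurement phase: only these iterations count toward the result.
        long start = System.nanoTime();
        long result = 0;
        for (int i = 0; i < iterations; i++) {
            result = workload(n);
        }
        long elapsed = System.nanoTime() - start;

        System.out.println("result = " + result);
        System.out.println("avg ns per iteration = " + (elapsed / iterations));
    }
}
```

In practice, a harness such as JMH handles warm-up iterations for you; the sketch above only shows why the separation matters.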
Both are very valid points; however, there is more to say on performance testing. I personally do not like the example of the different loops. In my 10 years of enterprise software development, I have never even thought about writing a test for this. Likewise, in my performance engineering engagements, the problem has never been the use of the wrong loop type. In short, I see no point in writing these kinds of tests.
Additionally, I want to stress that the goal on a development machine is not to collect performance metrics, but rather to identify conceptual bugs that might lead to performance problems. You can read more on this topic at http://blog.dynatrace.com/2009/01/07/how-to-find-invisible-performance-problems
So performance testing is more than just measuring response times. It is about validating all performance-relevant aspects of software. Development should focus on finding architectural problems rather than on microbenchmarks.
If you really want to run performance tests in your Continuous Integration environment, you should focus on real-world use cases rather than microbenchmarks.
I believe the author is well aware of this; however, it is not stated in the post.