Date: Thu, 30 May 1996 09:36:40 +0100 (WET DST)
From: "Carlo E. Prelz" <>
Subject: Benchmark comparisons - a summary of the last year
Hi folks.
Does anybody remember my benchmark comparison messages? I stopped sending them to the list because I could no longer compile every release on my home machine, and patch-to-patch diffs had become a bit meaningless. But whenever I did compile a release, I ran the benchmark software and updated the list of results, which can be perused at http://www.linpro.no/cgi-bin/bm (choose machine Pimpinel).
I also found some time to upgrade the comparison software by adding an option to generate a (rough) graph of the results.
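In case you wonder what the graphing option amounts to: the idea is nothing more sophisticated than scaling the values onto a grid of asterisks. Here is a minimal sketch of that idea in C (an illustration only, not the actual code of my comparison software):

#include <stdio.h>

/* Plot n values as a rough ASCII graph: one column per value,
 * HEIGHT rows spanning the range from minimum to maximum. */
#define HEIGHT 15

void ascii_graph(const double *v, int n)
{
    double min = v[0], max = v[0];
    int row, i;

    for (i = 1; i < n; i++) {
        if (v[i] < min) min = v[i];
        if (v[i] > max) max = v[i];
    }
    for (row = HEIGHT - 1; row >= 0; row--) {
        for (i = 0; i < n; i++) {
            /* scale each value to a row index between 0 and HEIGHT-1 */
            int h = (max > min)
                ? (int)((v[i] - min) / (max - min) * (HEIGHT - 1) + 0.5)
                : 0;
            putchar(h == row ? '*' : ' ');
        }
        putchar('\n');
    }
}

int main(void)
{
    double sample[] = { 114, 120, 130, 380, 390, 403, 398 };
    ascii_graph(sample, sizeof sample / sizeof sample[0]);
    return 0;
}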
This morning I added the results for 1.99.9 and then ran some comparisons with all the results that I have. I think they are interesting, so I am posting them to the list.
Here is the first graph:
Variable: Process Creation Test
[ASCII graph garbled in the archive: one column per kernel 1-63 (listed below); values ranged from 114 to 403]
The numbers along the x axis (1-63) correspond to the following patchlevels:
1. 1.2.10 #59 Tue Jun 13 09:46:17 MET DST 1995 i486
2. 1.3.3 #2 Mon Jun 19 21:16:39 MET DST 1995 i486
3. 1.3.4 #4 Mon Jun 26 22:25:32 MET DST 1995 i486
4. 1.3.5 #5 Thu Jun 29 20:59:28 MET DST 1995 i486
5. 1.3.6 #7 Fri Jun 30 20:40:51 MET DST 1995 i486
6. 1.3.7 #12 Thu Jul 6 18:28:12 MET DST 1995 i486
7. 1.3.7 #13 Fri Jul 7 11:04:49 MET DST 1995 i486
8. 1.3.8 #14 Fri Jul 7 20:06:44 MET DST 1995 i486
9. 1.3.9 #17 Tue Jul 11 20:15:53 MET DST 1995 i486
10. 1.3.10 #20 Thu Jul 13 20:58:09 MET DST 1995 i486
11. 1.3.11 #22 Tue Jul 18 22:26:10 MET DST 1995 i486
12. 1.3.12pre #23 Wed Jul 19 21:55:21 MET DST 1995 i486
13. 1.3.12 #24 Tue Jul 25 21:46:19 MET DST 1995 i486
14. 1.3.13 #25 Thu Jul 27 20:36:51 MET DST 1995 i486
15. 1.3.14 #27 Mon Jul 31 20:44:57 MET DST 1995 i486
16. 1.3.15 #28 Wed Aug 2 19:27:20 MET DST 1995 i486
17. 1.3.16 #29 Tue Aug 8 20:01:44 MET DST 1995 i486
18. 1.3.17 #30 Wed Aug 9 17:54:56 MET DST 1995 i486
19. 1.3.18 #35 Mon Aug 14 23:12:30 MET DST 1995 i486
20. 1.3.19 #36 Wed Aug 16 15:21:34 MET DST 1995 i486
21. 1.3.20 #40 Fri Aug 18 21:39:56 MET DST 1995 i486
22. 1.3.30 #42 Mon Oct 2 20:20:25 MET 1995 i486
23. 1.3.31 #44 Wed Oct 4 11:17:15 MET 1995 i486
24. 1.3.32 #45 Sat Oct 7 15:32:43 MET 1995 i486
25. 1.3.33 #49 Wed Oct 11 20:53:06 MET 1995 i486
26. 1.3.34 #50 Fri Oct 13 22:50:54 MET 1995 i486
27. 1.3.35 #51 Tue Oct 17 04:35:07 MET 1995 i486
28. 1.3.36 #56 Mon Oct 23 21:45:26 MET 1995 i486
29. 1.3.37 #59 Sun Oct 29 10:32:57 MET 1995 i486
30. 1.3.38 #67 Wed Nov 8 21:54:33 MET 1995 i486
31. 1.3.39 #68 Thu Nov 9 17:28:43 MET 1995 i486
32. 1.3.41 #69 Mon Nov 13 18:19:38 MET 1995 i486
33. 1.3.42 #70 Thu Nov 16 17:24:27 MET 1995 i486
34. 1.3.43 #71 Tue Nov 21 18:04:53 MET 1995 i486
35. 1.3.44 #73 Sat Nov 25 19:25:29 MET 1995 i486
36. 1.3.45 #75 Mon Nov 27 14:07:50 MET 1995 i486
37. 1.3.48 #76 Mon Dec 18 06:46:42 MET 1995 i486
38. 1.3.49 #77 Fri Dec 22 21:05:57 MET 1995 i486
39. 1.3.50 #78 Mon Dec 25 11:18:00 MET 1995 i486
40. 1.3.51 #79 Wed Dec 27 19:16:06 MET 1995 i486
41. 1.3.52 #80 Sat Dec 30 19:44:08 MET 1995 i486
42. 1.3.53 #81 Wed Jan 3 19:44:42 MET 1996 i486
43. 1.3.54 #82 Fri Jan 5 18:13:52 MET 1996 i486
44. 1.3.56 #83 Mon Jan 8 20:39:50 MET 1996 i486
45. 1.3.57 #84 Fri Jan 12 23:03:13 MET 1996 i486
46. 1.3.58 #85 Thu Jan 18 19:53:09 MET 1996 i486
47. 1.3.59 #86 Mon Jan 29 08:19:18 MET 1996 i486
48. 1.3.60 #89 Thu Feb 8 21:46:53 MET 1996 i486
49. 1.3.61 #91 Fri Feb 9 20:45:18 MET 1996 i486
50. 1.3.70 #92 Fri Mar 1 23:22:22 MET 1996 i486
51. 1.3.74 #93 Thu Mar 14 21:59:39 MET 1996 i486
52. 1.3.77 #94 Thu Mar 21 21:34:50 MET 1996 i486
53. 1.3.88 #96 Sat Apr 13 16:25:42 WET DST 1996 i486
54. 1.3.91 #97 Thu Apr 18 21:55:26 WET DST 1996 i486
55. 1.3.96 #98 Sat Apr 27 21:01:06 WET DST 1996 i486
56. 1.3.97 #99 Mon Apr 29 22:37:37 WET DST 1996 i486
57. 1.3.98 #100 Sun May 5 10:45:03 WET DST 1996 i486
58. 1.3.99 #102 Tue May 7 19:25:10 WET DST 1996 i486
59. 1.3.100 #103 Fri May 10 17:56:32 WET DST 1996 i486
60. 1.99.3 #104-pre-2.0 Mon May 13 21:17:09 WET DST 1996 i486
61. 1.99.5 #105-pre-2.0 Sat May 18 08:59:59 WET DST 1996 i486
62. 1.99.7 #106-pre-2.0 Wed May 22 07:09:29 WET DST 1996 i486
63. 1.99.9 #107-pre-2.0 Wed May 29 21:06:19 WET DST 1996 i486
As you can see, there was a great improvement between #21 and #22 - that is, between 1.3.20 and 1.3.30. I was away from Lisbon for a month and a half at that time, so that is a sort of black hole... There are a couple more "serious" holes - .61 to .70 and .77 to .88. I apologize profusely :-)
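For those who have never looked inside the Byte suite: the process creation test essentially counts fork()/wait() round trips completed in a fixed interval. A minimal sketch of that kind of loop (my illustration, not the actual benchmark source; the 10-second duration is made up):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <unistd.h>
#include <sys/wait.h>

/* Count fork()/wait() round trips completed in SECONDS seconds. */
#define SECONDS 10

int main(void)
{
    time_t end = time(NULL) + SECONDS;
    long count = 0;

    while (time(NULL) < end) {
        pid_t pid = fork();
        if (pid < 0) {
            perror("fork");
            exit(1);
        }
        if (pid == 0)
            _exit(0);           /* child exits immediately */
        wait(NULL);             /* parent reaps it */
        count++;
    }
    printf("%ld forks in %d seconds\n", count, SECONDS);
    return 0;
}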
Anyway, here are graphs for other benchmarks:
Variable: Pipe Throughput Test
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 14885 to 19011]
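The pipe throughput test essentially measures how fast a single process can write blocks to a pipe and read them back. Something like this (again just a sketch; the block size and duration are made up):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <unistd.h>

/* Write and read back 512-byte blocks through a pipe for a fixed
 * interval, counting the blocks moved. */
int main(void)
{
    int fd[2];
    char buf[512] = { 0 };
    time_t end = time(NULL) + 10;
    long blocks = 0;

    if (pipe(fd) < 0) {
        perror("pipe");
        exit(1);
    }
    while (time(NULL) < end) {
        if (write(fd[1], buf, sizeof buf) != sizeof buf ||
            read(fd[0], buf, sizeof buf) != sizeof buf) {
            perror("pipe i/o");
            exit(1);
        }
        blocks++;
    }
    printf("%ld blocks of %d bytes in 10 seconds\n",
           blocks, (int)sizeof buf);
    return 0;
}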
Variable: Pipe-based Context Switching Test
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 3007 to 10965]
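The context switching test is more interesting: two processes pass a token back and forth over a pair of pipes, so every exchange forces the kernel to switch between them. A sketch of the idea (illustration only):

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/wait.h>

/* Two processes pass a one-byte token back and forth over two pipes;
 * every exchange forces a context switch. */
#define ROUNDS 100000

int main(void)
{
    int p2c[2], c2p[2];         /* parent-to-child and child-to-parent */
    char token = 'x';
    pid_t pid;
    long i;

    if (pipe(p2c) < 0 || pipe(c2p) < 0) {
        perror("pipe");
        exit(1);
    }
    pid = fork();
    if (pid < 0) {
        perror("fork");
        exit(1);
    }
    if (pid == 0) {             /* child: echo the token back */
        for (i = 0; i < ROUNDS; i++) {
            read(p2c[0], &token, 1);
            write(c2p[1], &token, 1);
        }
        _exit(0);
    }
    for (i = 0; i < ROUNDS; i++) {  /* parent: send and await the echo */
        write(p2c[1], &token, 1);
        read(c2p[0], &token, 1);
    }
    wait(NULL);
    printf("%d round trips; run under time(1) to get the rate\n", ROUNDS);
    return 0;
}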
Variable: Execl Throughput Test
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 52 to 121]
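The execl test measures how many times per second a process can replace itself with a new program image. A sketch of the self-exec trick (the binary name ./execl_test is made up; run it under time(1)):

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

/* Re-executes itself until the counter in argv[1] reaches zero.
 * Compile to ./execl_test (the name is made up for this sketch). */
int main(int argc, char *argv[])
{
    int left = (argc > 1) ? atoi(argv[1]) : 1000;
    char buf[32];

    if (left <= 0) {
        printf("done\n");
        return 0;
    }
    sprintf(buf, "%d", left - 1);
    execl("./execl_test", "execl_test", buf, (char *)NULL);
    perror("execl");            /* reached only if the exec failed */
    return 1;
}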
Variable: C Compiler Test
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 42 to 54]
Variable: Arithmetic Test (type = double)
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 5057 to 5071]
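As you would expect, the arithmetic test hardly depends on the kernel at all: it is a tight loop of double-precision operations that lives entirely in user space, which is why the range above is so narrow. Something like this (a sketch, with made-up operands chosen so that x stays bounded):

#include <stdio.h>
#include <time.h>

/* Tight loop of double-precision arithmetic; with the operands in
 * registers this measures raw CPU/FPU speed, not the kernel. */
int main(void)
{
    double x = 1.0, y = 1.000001;
    time_t end = time(NULL) + 10;
    long loops = 0;

    while (time(NULL) < end) {
        x = x * y / y + 1.0 - 1.0;  /* one each of *, /, +, -; x stays ~1 */
        loops++;
    }
    /* print x so the compiler cannot optimize the loop away */
    printf("%ld loops, x = %g\n", loops, x);
    return 0;
}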
Variable: Dc: sqrt(2) to 99 decimal places
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 5357 to 13048]
Variable: Dhrystone 2 without register variables
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 46284 to 51002]
Variable: Recursion Test--Tower of Hanoi
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 632 to 728]
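The recursion test solves the classic Tower of Hanoi, so it mostly measures function-call overhead rather than the kernel. For reference, a sketch of the classic algorithm (moving n disks takes 2^n - 1 moves):

#include <stdio.h>

/* Classic recursive Tower of Hanoi; the work is dominated by
 * function-call overhead, since each "move" is trivial. */
static long moves;

void hanoi(int n, int from, int to, int via)
{
    if (n == 0)
        return;
    hanoi(n - 1, from, via, to);
    moves++;                    /* "move" disk n from `from` to `to` */
    hanoi(n - 1, via, to, from);
}

int main(void)
{
    hanoi(20, 1, 3, 2);
    printf("%ld moves (expected %ld)\n", moves, (1L << 20) - 1);
    return 0;
}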
Variable: Shell scripts (2 concurrent)
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 41 to 57]
Variable: System Call Overhead Test
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 26550 to 32191]
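The system call overhead test just hammers a handful of cheap system calls in a loop, so the per-iteration cost is dominated by kernel entry and exit. A sketch (the particular calls here - getpid(), getuid(), dup()/close() - are my illustration, not necessarily the exact mix the suite uses):

#include <stdio.h>
#include <time.h>
#include <unistd.h>

/* Hammer a few cheap system calls in a loop; the per-iteration cost
 * is dominated by kernel entry/exit overhead. */
int main(void)
{
    time_t end = time(NULL) + 10;
    long count = 0;

    while (time(NULL) < end) {
        getpid();               /* trivial syscalls */
        getuid();
        close(dup(0));          /* dup stdin, then close the copy */
        count++;
    }
    printf("%ld iterations in 10 seconds\n", count);
    return 0;
}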
Graphs can be obtained for every single benchmark in the Byte magazine set.
Here are a few FAQs I received back when I was regularly posting the comparison results to the net:
Q: Where can I find the benchmark software?
A: I am using the Unix benchmark set that was put together by Byte magazine, back when it was not yet terminally Microsoft-biased. You can find the software at
ftp://tsx-11.mit.edu/pub/linux/sources/test-suites/benchmark.tar.Z
The file contains some docs that explain the significance of each test. I made a couple of patches to the sources. I don't have them here with me right now, but I may post them, or upload them to tsx-11, if there are enough requests.
Q: How do you compare the results?
A: I wrote a small program that compares result sets and dumps the differences to stdout. Then I wrote some CGI stuff to generate benchmark comparisons of a) many variables, two result sets; b) many result sets, one variable, sorted; c) many result sets, one variable, graph output. The software can access results for different machines. Dag Asheim from Oslo was so kind as to let me install my software on his machine (our link is 14,400 baud :~-( ), so you can all run comparisons on my data over the WWW. Point your form-capable browser to http://www.linpro.no/cgi-bin/bm. My home machine is called Pimpinel. Mirtillo is a portable on which I have run two benchmark sets (I may run one more one of these days). Harappa is a set of results sent to me by David Niemi, who was working on a newer version of the benchmark software (due to the fractal characteristics of my life during the past few months, I lost contact with him...).
I can distribute my software, but there are no docs.
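The core of the comparison is trivial, though. Something along these lines (a simplified sketch that assumes one "name value" pair per line; the real Byte log files are messier than this):

#include <stdio.h>
#include <string.h>

/* Compare two simplified result files of "name value" lines and
 * print the relative change of each variable, assuming both files
 * list the same variables in the same order. */
int main(int argc, char *argv[])
{
    FILE *a, *b;
    char name_a[128], name_b[128];
    double va, vb;

    if (argc != 3) {
        fprintf(stderr, "usage: %s old.log new.log\n", argv[0]);
        return 1;
    }
    a = fopen(argv[1], "r");
    b = fopen(argv[2], "r");
    if (!a || !b) {
        perror("fopen");
        return 1;
    }
    while (fscanf(a, "%127s %lf", name_a, &va) == 2 &&
           fscanf(b, "%127s %lf", name_b, &vb) == 2) {
        if (strcmp(name_a, name_b) == 0 && va != 0.0)
            printf("%-30s %+.1f%%\n", name_a, (vb - va) / va * 100.0);
    }
    fclose(a);
    fclose(b);
    return 0;
}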
Q: Can I provide some results for my machine so that they can be compared with the others?
A: Sure. Run the Byte magazine benchmarks (apply my patches first!). Collect the "log" files that you will find in the results subdirectory after each run (remember to rename the log file, or copy it elsewhere, before starting the next run, or it will be overwritten). Send them to me along with some information about the machine. I need:
- A name for the machine
- An e-mail address where the machine owner can be contacted
Further info can be added. I have defined the following headers:
- CPU
- Speed
- Memory
- Memory speed
- Hard disk controller
- Elf linking
- a.out linking
- GCC version
- Notes
Not all of them need to be filled in, and new headers can easily be added if you want to provide more info.
Q: These results are meaningless/wrong/unreliable/etc.
A: I try to be as uniform as possible in running the benchmarks. My home machine has remained practically unchanged for the whole period the results cover (it is still an a.out system), and during the last ~8 months I have done very little real work on it.
I always run the benchmark set after a reboot. I log in as root, start the run, log out and (generally) go to sleep. A benchmark run lasts a few hours.
Some results vary widely and irregularly (see the pipe throughput test above). I can't help it: these are the numbers that I get.
Q: The disk IO results are a bit strange.
A: I know. Some results are missing altogether for all releases up to 1.3.13: at that time I discovered a bug in the software that made all the earlier results rubbish.
There was also a period during which the IO results oscillated between two distinct levels. This behavior ended at release 1.3.53, when performance jumped to a higher level. Here is the graph for File Copy (30 seconds):
[ASCII graph garbled in the archive: one column per kernel 1-63; values ranged from 939 to 3636]
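For completeness: the file copy test shovels data through a buffer between two files for a fixed interval and reports the throughput. A rough sketch (I use /dev/zero as a stand-in source and a made-up output name, so this shows only the shape of the test, not the real thing):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <fcntl.h>
#include <unistd.h>

/* Copy data in 1 KB chunks for ~30 seconds, counting KB moved.
 * Beware: on a fast disk this can write a lot of data. */
#define BUFSIZE 1024

int main(void)
{
    char buf[BUFSIZE];
    int in, out;
    time_t end = time(NULL) + 30;
    long kb = 0;
    ssize_t n;

    in = open("/dev/zero", O_RDONLY);   /* stand-in for the source file */
    out = open("copy.tmp", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (in < 0 || out < 0) {
        perror("open");
        exit(1);
    }
    while (time(NULL) < end) {
        n = read(in, buf, sizeof buf);
        if (n <= 0 || write(out, buf, n) != n) {
            perror("copy");
            exit(1);
        }
        kb++;
    }
    printf("%ld KB copied in 30 seconds\n", kb);
    unlink("copy.tmp");
    close(in);
    close(out);
    return 0;
}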
****************************************************************************
The bottom line is that our beloved system has improved a lot during the last year. Thanks to Linus, all the kernel gurus, and everybody else who helped make this wonderful utopia real!
Carlo
--
Carlo E. Prelz - fluido@marktest.pt
"If the Way and its Virtue had not been set aside, what need would there be
to talk so much about love and rectitude?" (Chuang-Tzu)