In this page I want to show you everything you need to deploy HammerDB into your SQL Server environment and set it up to run as a SQL Agent job on an automated schedule.
The recommended way to familiarise yourself with HammerDb is to use the GUI that comes with the application. It's a great tool and relatively straightforward to use. The main bugbears I have with the GUI are that it requires you to log onto a machine somewhere to run it, and that you have to manually edit the config parameters. This leaves you prone to mistakes like entering the wrong number of warehouses between tests, which directly affects the results you receive. There is also no way to schedule it to run at a specific time. If you want to be more scientific about the tests you perform, to me it's important that you use the same parameters between tests, that you repeat the tests at the same time(s) so you can be sure things like backups or other processes running in the background aren't impacting your results, and that you repeat the test at different set times of the day to generate more accurate averages.
The official documentation also includes instructions for running workloads via the command line. Again, this is very useful (in fact the PowerShell scripts I have developed use this approach), however there's no native way to schedule the tests to run at a set time. Yes, you can manually set up a scheduled task or use SQL Agent to call .bat files that run the required commands, but this is a laborious and time-consuming task which you aren't going to appreciate if you've got hundreds of different SQL instances to manage.
The scripts I have developed will deploy the HammerDb executables and command line batch files, and set up a SQL Agent job to run repeatable tests on a schedule. The benchmark results are by default written to .log files in the temp directory of the account running the commands; this is usually the SQL Agent service account's temp directory unless you're using a proxy account. The SQL Agent job will gather the NOPM (New Orders Per Minute) and TPM (Transactions Per Minute) benchmark statistics from the temp directory and log them into a csv file or SQL table. This makes gathering and reporting benchmark results much easier.
These scripts are by no means a finished product; I'm sure there are many improvements that can be made and additional options that could be added to make them more useful. I should also add that I'm not a PowerShell expert, but I like to think I know enough to achieve the end result I'm looking for. There are probably methods I have used that are inefficient or don't adhere to best practices; this is just what I have had time to develop so far, and I wanted to share it as I thought it could be useful for others. I would love for additions and improvements to be made by people in the SQL community, and welcome any feedback that can help make things better. Please leave feedback in the comments. Like the majority of other people, I have a day job and other responsibilities, so I will try to update this as and when I can.
Let's get started. "If a picture's worth a thousand words, why bother with that when a video's worth a million?"
The following video talks through the process for downloading the required files and deploying this onto a Hyper-V VM running Windows Server 2016 and SQL Server 2016.
Please forgive my first attempt at recording and editing a video like this. If it proves useful to anyone then I might consider recreating a better-quality version, but for now I'm using mediocre video software.
I'd suggest watching the video to begin with, then if you need a step-by-step breakdown and more detail, work your way through the instructions below. On with the Hammering!
*Disclaimer: ensure you properly test using HammerDb in your environment. Running heavy workloads against VMs can impact other VMs on the same hosts; likewise, if running in Azure you could rack up expensive charges, so it's best to make sure you have done some testing beforehand.*
Step by step setup
It is intended that you run the deployment\install scripts from a central server, which will remotely install HammerDb along with a benchmark and benchmark log import SQL Agent job.
In the following examples the deployment server is called SQL03 and the target server upon which we want to install HammerDb and run benchmark tests is called SQL02. Both servers are running Windows 2016 and SQL Server 2016.
The Swarchy HammerDb install scripts have 3 prerequisites that must be met before the scripts can be run to deploy HammerDb into your environment.
1. Download and install dbatools on your deployment server (you may need to use the offline install guide for this if your deployment server doesn't have internet access)
2. Download the latest version of HammerDb (4.1 as of time of writing) https://hammerdb.com/download.html
3. Download the T101 Swarchy HammerDb Benchmark Setup Scripts for SQL Server https://github.com/swarchy/HammerDBBenchmark
I recommend downloading these into a folder called HammerInstall on your internet-enabled device
1. Copy the downloaded files to a folder, e.g. C:\temp\HammerInstall, on your deployment machine, aka SQL03
2. Copy dbatools-master to your machine's PowerShell module directory, C:\Program Files\WindowsPowerShell\Modules
Note: these install instructions assume your deployment machine is not connected to the internet and therefore use the offline install approach for dbatools. You should be OK to use the online install method if internet access is available on your machine.
3. Extract dbatools-master and rename the dbatools-master folder within to dbatools
4. Copy the dbatools folder to the root of the Modules directory
Directory structure should be as follows
5. Open a PowerShell prompt and run Import-Module dbatools
6. Once the prompt returns, issue the following statement (changing the server name) to test connectivity to the SQL Server you intend to run the benchmarking scripts against, aka SQL02. Check that results are retrieved successfully
Invoke-DbaQuery -SqlInstance sql02 -Query 'SELECT name FROM sys.databases'
7. Create a new folder called HammerMedia within C:\temp\HammerInstall
8. Extract the HammerDb application ("HammerDB-4.1-Win") and copy it into the HammerMedia folder
Should look like
9. Back in the C:\temp\HammerInstall directory, extract the HammerDbBenchmark-main zip
10. Once extracted, copy the HammerScripts folder to C:\temp\HammerInstall
11. Delete the .zip files and the HammerDBBenchmark-main folder as these are no longer required
Your directory should look as follows
12. Browse to C:\Temp\HammerInstall\HammerScripts\WorkloadConfig and copy the contents to C:\Temp\HammerInstall\HammerMedia\HammerDB-4.1-Win\HammerDB-4.1
C:\Temp\HammerInstall\HammerMedia\HammerDB-4.1-Win\HammerDB-4.1 directory should contain new .cmd and .tcl files.
These files define the workload that you will be running within the benchmarks.
I intend to go into more details about these within another blog post, however for more information about their contents in the meantime I would suggest reading https://hammerdb.com/docs/ch09.html
13. Still on the deployment server (SQL03), open PowerShell ISE as administrator
14. Within ISE, open C:\Temp\HammerInstall\HammerScripts\InstallHammmerDb.ps1
This is the main script used to install the HammerDb benchmark tool and the import\workload scripts on a remote machine: the machine you intend to run benchmarks against, which is SQL02 in this demo environment.
Scroll down to the bottom of the script to line 148 where the InstallHammerDb function is called
There are several parameters that you will need to change here for your environment
InstallHammerDb -SourceDir "$Root\HammerScripts" -HamMedia "$Root\HammerMedia\HammerDB-4.1-Win\HammerDB-4.1" -SQLInstance "SQL02" -ELogFileDir "$Root\SetupLogs\" -Destination "D:\HammerDbBenchmark"
| Parameter | Description | Example |
| --- | --- | --- |
| -SourceDir | The directory where your HammerDbInstall scripts are stored | In this example our scripts are stored in C:\Temp\HammerInstall\, which is determined with the $MyInvocation command on line 146 |
| -HamMedia | The directory where the HammerDb media has been downloaded to | In this example we have placed the HammerDb media within C:\Temp\HammerInstall\HammerMedia\HammerDB-4.1-Win\HammerDB-4.1 |
| -SQLInstance | The name of the SQL instance that you intend to deploy the scripts against and set up the benchmarking jobs (not yet available for named instances) | In this example we will be setting up the benchmarking jobs against the SQL02 SQL Server |
| -ELogFileDir | The directory where any installation logs and error logs will be written on your deployment server | In this example it uses the $MyInvocation command on line 146 to set the directory to C:\Temp\HammerInstall\SetupLogs |
| -Destination | The directory where you want to deploy the HammerDb install media and scripts on your destination server | In this example a new folder will be created in the root of SQL02's D drive called D:\HammerDbBenchmark |
15. Once you have set the parameters on line 148, the script should be ready to run within PowerShell ISE or your chosen PowerShell IDE
16. Once the script has run you should see an output similar to the below, directing you to check the timestamped log for any errors
You can copy and paste the command in yellow to open the log in Notepad
17. If all has gone well you should see commands reported as completing successfully. Any errors will also be reported in this log.
There is also a log generated by RoboCopy which details the copying of the HammerDb install media to the target server aka SQL02
18. Connect to your target server, aka SQL02, and check that the HammerDbBenchmark folder has been created on the D drive and that two SQL Agent jobs have been created
19. Schedule your HammerDb_Benchmark job to run at a suitable time. Alternatively, if this is a test environment where you're sure you won't impact any other services, you can right-click and run the job from step 1 to check it's working as expected.
I typically recommend running tests at the same time over a course of a few days and schedule the import job to run at the end of the day once the benchmark data has been generated.
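That scheduling can itself be scripted with dbatools rather than set up by hand in SSMS. The sketch below is illustrative only: the job names assume the defaults created by the install scripts, and you should verify New-DbaAgentSchedule's parameters against the dbatools documentation for your version before using it.

```powershell
# Hypothetical sketch: run the benchmark nightly at 22:00 and the log
# import at 23:30, after the benchmark data has been generated.
# Job names and times are assumptions; adjust for your environment.
New-DbaAgentSchedule -SqlInstance 'SQL02' -Job 'HammerDb_Benchmark' `
    -Schedule 'NightlyBenchmark' -FrequencyType Daily -FrequencyInterval 1 `
    -StartTime '220000' -Force
New-DbaAgentSchedule -SqlInstance 'SQL02' -Job 'HammerDBLogImport' `
    -Schedule 'NightlyImport' -FrequencyType Daily -FrequencyInterval 1 `
    -StartTime '233000' -Force
```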
There are 4 steps within the HammerDb_Benchmark job as shown
Step 1 Drop Existing Database
As part of the benchmark, a new database called tpcc will be created in which the workload will be generated. This step checks whether that database exists and deletes it if so, ensuring a fresh workload database is created for each iteration of the test.
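The drop-if-exists logic is along these lines. This is a sketch rather than the job step's exact T-SQL, shown here run from a dbatools session:

```powershell
# Sketch only: drop the tpcc database if it exists so each test run
# starts from a freshly built schema. SINGLE_USER with ROLLBACK IMMEDIATE
# clears out any lingering connections before the drop.
$dropSql = @"
IF DB_ID('tpcc') IS NOT NULL
BEGIN
    ALTER DATABASE tpcc SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE tpcc;
END
"@
Invoke-DbaQuery -SqlInstance 'SQL02' -Query $dropSql
```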
Step 2 Build New Tpcc database
This creates a new tpcc database using the default data and log directories for the SQL instance.
The database is created by the HammerDb process according to what is specified within the D:\HammerDbBenchmark\HammerMedia\CustomAutoTestBuild.tcl file copied over as part of the install process.
As mentioned, I intend to go into more details about these within another blog post, however for more information about their contents in the meantime I would suggest reading https://hammerdb.com/docs/ch09.html
Step 3 Reconfigure Database
This step reconfigures the tpcc database so that, by default, the mdf data file is pre-sized to 2GB, the ldf log file is pre-sized to 1GB and the recovery model is set to SIMPLE.
This is done to prevent the database from needing to auto-grow, which could impact the TPM/NOPM results achieved, since file growths can be disk-intensive operations.
The database is set to simple recovery because we will not be running any log backups whilst the benchmark workload is running; this helps prevent excessive log growth and minimizes the chances of any background operations impacting the benchmark results.
I would certainly recommend changing these values according to the size of the workload database and the size of the benchmark you intend to run. They have been set at these low levels for this basic demonstration, but if you are configuring a workload with multiple virtual users and several warehouses, you should do some testing to work out what works best for your configured workload.
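For reference, the reconfiguration described above boils down to T-SQL along these lines. This is a sketch, not the job step's exact code, and the logical file names are assumptions; check sys.database_files for the names your build actually used:

```powershell
# Sketch: pre-size the data/log files and switch to SIMPLE recovery so
# autogrowth and log maintenance don't skew the TPM/NOPM figures.
# Logical file names ('tpcc', 'tpcc_log') are assumptions.
$reconfigSql = @"
ALTER DATABASE tpcc SET RECOVERY SIMPLE;
ALTER DATABASE tpcc MODIFY FILE (NAME = 'tpcc', SIZE = 2GB);
ALTER DATABASE tpcc MODIFY FILE (NAME = 'tpcc_log', SIZE = 1GB);
"@
Invoke-DbaQuery -SqlInstance 'SQL02' -Query $reconfigSql
```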
Step 4 Run Benchmark
This is the step where the AutoRunWorkload.cmd file is called to run the workload that has been defined within the CustomAutoTestWorkload.tcl file previously copied over to the target server as part of the setup.
This kicks off the process that actually runs the workload
When the workload is running you should see the Tclsh Application process consuming resources on the target server aka SQL02.
As mentioned, I intend to go into more details about these files within another blog post, however for more information about their contents in the meantime I would suggest reading https://hammerdb.com/docs/ch09.html
20. Once the HammerDb_Benchmark job has finished, check the job output for any errors
21. Browse to the temp directory of the account the HammerDb_Benchmark job is configured to run as. By default this is usually the SQL Agent service account, which can be found in SQL Server Configuration Manager
In this example the temp directory is
The directories are usually within C:\Users\UserAccount\AppData\, but the AppData folder may be hidden, so you will need to either tick the option to view hidden items in File Explorer or browse directly to the directory by typing the full path
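If you'd rather not open Configuration Manager, dbatools can tell you the agent service account, from which the temp path follows. A sketch (check the property names against your dbatools version):

```powershell
# Find the SQL Agent service account on the target server, then derive
# the usual per-user temp path from it.
$agent = Get-DbaService -ComputerName 'SQL02' -Type Agent
$account = ($agent.StartName -split '\\')[-1]   # e.g. SVC_SQLA
"C:\Users\$account\AppData\Local\Temp"
```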
22. Check that the two files have been generated
Within the hammerdb.log file you should see that the schema build reported completing successfully
Within the other file hammerdb_60A78D651BF403E293739313.log, you should see the TPM and NOPM results
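If you ever want to pull those figures out by hand, the result line can be parsed with a simple pattern match. This is a rough sketch: the regex assumes HammerDB's usual "achieved N NOPM from N ... TPM" result line, so verify it against your own log before relying on it:

```powershell
# Extract NOPM/TPM from a HammerDB result log (the pattern is an assumption).
$match = Select-String -Path 'hammerdb_60A78D651BF403E293739313.log' `
    -Pattern 'achieved (\d+) NOPM from (\d+)' | Select-Object -First 1
if ($match) {
    $nopm = [int]$match.Matches[0].Groups[1].Value
    $tpm  = [int]$match.Matches[0].Groups[2].Value
    "NOPM=$nopm TPM=$tpm"
}
```

This is essentially what the import job does before writing the values out to csv.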
Once you have generated some benchmark results within the temp directory, you can run the HammerDBLogImport job to collect the results into a csv file
1. On the target server where you have been running the benchmark tests (SQL02), open the CreateHammerDBLogImportJob.ps1 script with PowerShell ISE
In this example ours is on SQL02, stored within D:\HammerDbBenchmark\CreateHammerDBLogImportJob.ps1
2. Scroll to the bottom of the script to the line that calls the ImportHammerLogs function and set the parameters accordingly for your environment
ImportHammerLogs -BaseFolder "D:\HammerDbBenchmark" -HammerLogsFolder "C:\Users\SVC_SQLA\AppData\Local\Temp" -WriteOp "csv" -LogSQLInstance "sql02" -LogDatabase "HammerResults" -LogTable "SQL02_HammerBenchmarkRlt"
| Parameter | Description | Example |
| --- | --- | --- |
| -BaseFolder | The directory on the target server where the HammerDb executables and PowerShell scripts have been installed | In this example this is on SQL02 within D:\HammerDbBenchmark |
| -HammerLogsFolder | The temp directory of the account being used to run the HammerDb_Benchmark job. This is usually the SQL Agent service account and is where the HammerDb results are written | In this example this is the temp directory of TICKLE\SVC_SQLA on SQL02, which is C:\Users\SVC_SQLA\AppData\Local\Temp |
| -WriteOp | Specifies whether to write the results to a csv file, to a SQL table, or both. As of writing (21/05/2021) the only option is csv; I intend to add SQL table at a later date | csv |
| -LogSQLInstance | Not required; for future enhancement | |
| -LogDatabase | Not required; for future enhancement | |
| -LogTable | Not required; for future enhancement | |
3. Once the parameters are set and the .ps1 file has been saved, you can run the HammerDBLogImport job in SQL Agent
4. Check the job output for any errors recorded
5. Within D:\HammerDbBenchmark on the target server, aka SQL02, three new directories should have been created
HammerLog – Logs any errors or progress messages generated from the import jobs
HammerProcessed – The job moves the benchmark files from the SQL Agent job account's temp directory to this folder once they have been written to a csv file
HammerResults – Contains the csv file with the NOPM and TPM results
6. Check the HammerLog for any errors
Depending on how many benchmark log files you imported, you should see a message similar to the below.
7. Check that the HammerResults folder contains a csv file
8. You can copy this to a machine that has Excel installed to view the results in columnar format
For peace of mind, you can check the HammerProcessed folder to see what TPM/NOPM results were written to the original log files, and check that the number of files and values match what is within the csv
9. Schedule the HammerDBLogImport job to run on a repeating schedule if required.
If you keep the existing csv file within the HammerResults folder, any new test log results will be appended to it each time the job is run, so you could leave it there and copy it into Excel after a couple of days of running benchmarks
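You can also get quick averages straight from the csv with PowerShell instead of Excel. A sketch only: the file name and the NOPM/TPM column headers below are assumptions, so check the header row of your csv first:

```powershell
# Average the collected benchmark figures across all imported runs.
$results = Import-Csv 'D:\HammerDbBenchmark\HammerResults\HammerResults.csv'
$results | Measure-Object -Property NOPM -Average
$results | Measure-Object -Property TPM -Average
```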
So far I have only tested this with Windows 2016 and SQL 2016; I will strive to test 2017 and 2019 next. I'm also interested in running some tests against Azure SQL Database and different sizes of VM in Azure in future articles.
HammerDB Download: https://hammerdb.com/download.html
Hammer DB Git Hub Link: https://github.com/swarchy/HammerDBBenchmark
If you're running this in an environment that doesn't have access to the internet, you will find the offline dbatools install method most useful: https://dbatools.io/offline/
I must say a massive thank you to Jeff Maaks, who developed a similar set of PowerShell scripts (https://github.com/jmaaks/HammerDB-utils) for automating HammerDB on SQL Server and helped me work out how to parse the HammerDB results from the output logs. I must also shout out https://dbatools.io/ for providing the Write-DbaDbTableData function used to import the HammerDB results into a SQL table. To be fair, I could have used another method to write to the SQL table, but dbatools is such a great set of PowerShell scripts, with regular updates and a great community of followers, that anything I can do to introduce more people to dbatools is a good thing in my opinion. Thanks also to Brent Ozar, whose fun sessions and blogs encouraged me to have a go at starting a blog, and to do it around HammerDb when he actually replied to one of my questions.
see Brent’s blog on running load tests https://www.brentozar.com/archive/2012/06/load-test-sqlserver/
Lastly I have to say thank you to my colleagues who first introduced me to HammerDb as part of a performance troubleshooting exercise we were conducting.
As mentioned, this certainly isn't a finished product and there are probably issues/questions that I haven't thought about. This has just been a project to keep me busy and to learn some new skills throughout lockdown; I hope it will be of use to people in the community, and I welcome feedback.