To generate HTTP requests, just use the provided HttpSampler.

The following example uses 2 threads (concurrent users), each sending 10 HTTP GET requests to http://my.service.

Additionally, it logs the collected statistics (response times, status codes, etc.) to a file for later analysis if needed, and checks that the 99th percentile of response times is less than 5 seconds.
```cs
using static Abstracta.JmeterDsl.JmeterDsl;

public class PerformanceTest
{
    [Test]
    public void LoadTest()
    {
        var stats = TestPlan(
            ThreadGroup(2, 10,
                HttpSampler("http://my.service")
            ),
            // this is just to log details of each request's stats
            JtlWriter("jtls")
        ).Run();
        Assert.That(stats.Overall.SampleTimePercentile99, Is.LessThan(TimeSpan.FromSeconds(5)));
    }
}
```
TIP
When working with multiple samplers in a test plan, specify their names (e.g.: HttpSampler("home", "http://my.service")) to easily check their respective statistics.
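For instance, a minimal sketch with two named samplers. Only the named HttpSampler overload comes from the tip above; the per-label statistics accessor used at the end is an assumption, so check the stats API for the exact member:

```cs
var stats = TestPlan(
    ThreadGroup(2, 10,
        HttpSampler("home", "http://my.service"),
        HttpSampler("users", "http://my.service/users")
    )
).Run();
// The per-label accessor name is an assumption; only stats.Overall is shown in this guide's examples.
Assert.That(stats.Labeled("home").SampleTimePercentile99, Is.LessThan(TimeSpan.FromSeconds(5)));
```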
TIP
JMeter .Net DSL uses Java to execute JMeter test plans. If you need to tune JVM parameters, for example to specify the maximum heap memory size, you can use EmbeddedJMeterEngine and the JvmArgs method, like in the following example:
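A minimal sketch of the idea; EmbeddedJMeterEngine and JvmArgs are named in the tip above, but the exact spelling, chaining, and JVM argument are assumptions:

```cs
var stats = TestPlan(
    ThreadGroup(2, 10,
        HttpSampler("http://my.service")
    )
).RunIn(new EmbeddedJMeterEngine()
    .JvmArgs("-Xmx4g")); // e.g., raise the JVM maximum heap size to 4GB
```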
Since JMeter uses log4j2, if you want to control the logging level or output, you can use something similar to this log4j2.xml, using "CopyToOutputDirectory" in the project item so the file is also available in the dotnet build output directory (check Abstracta.JmeterDsl.Test/Abstracta.JmeterDsl.Tests.csproj).

TIP
Depending on the test framework you use and the way you run your tests, you might see JMeter logs and output in real time, at the end of the test, or not at all. This is not something we can directly control in JMeter DSL; it heavily depends on the dotnet environment and testing framework implementation.

When using Nunit, to get real-time console output from JMeter, you might want to run your tests with something like `dotnet test -v n` and add the following code to your tests:
```cs
private TextWriter? originalConsoleOut;

// Redirecting output to progress to get live stdout with nunit.
// https://github.com/nunit/nunit3-vs-adapter/issues/343
// https://github.com/nunit/nunit/issues/1139
[SetUp]
public void SetUp()
{
    originalConsoleOut = Console.Out;
    Console.SetOut(TestContext.Progress);
}

[TearDown]
public void TearDown()
{
    Console.SetOut(originalConsoleOut!);
}
```
Provided examples use Nunit, but you can use other test libraries.

Explore the DSL in your preferred IDE to discover all available features, and consider reviewing existing tests for additional examples.

The .Net DSL currently does not support all use cases supported by the Java Dsl, and only focuses on a limited set of features that cover the most commonly used cases. If you identify any particular scenario (or JMeter feature) that you need and is not currently supported, or easy to use, please let us know by creating an issue and we will try to implement it as soon as possible. Usually porting JMeter features is quite fast, and porting existing Java DSL features is even faster.

TIP
If you like this project, please give it a star ⭐ in GitHub! This helps the project be more visible, gain relevance, and encourages us to invest more effort in new features.

For an intro to JMeter concepts and components, you can check the JMeter official documentation.

TIP
Here is a sample project in case you want to start one from scratch.

TIP
JMeter .Net DSL uses the existing JMeter Java DSL, which in turn uses JMeter. JMeter Java DSL and JMeter are Java-based tools, so Java 8+ is required for the proper execution of DSL test plans. One option is downloading a JVM from Adoptium if you don't have one already.

TIP
Keep in mind that you can use .Net programming to modularize and create abstractions that allow you to build complex test plans that are still easy to read, use and maintain. Here is an example of a complex abstraction built using Java features (you can easily extrapolate it to .Net) and the DSL.

Check HTTP performance testing for additional details when testing HTTP services.

## Run test at scale

Running a load test from one machine is not always enough, since you are limited by the machine's hardware capabilities. Sometimes it is necessary to run the test using a cluster of machines to generate enough load for the system under test. Currently, the .Net DSL provides two ways to run tests at scale, but we plan to support more in the future (as the Java DSL does). If you are interested in a feature that is not yet covered, please ask for it by creating an issue in the repository.

### Azure Load Testing

Using Azure Load Testing to execute your test plans at scale is as easy as including the Abstracta.JmeterDsl.Azure package in your project and running the test like this:
```cs
using Abstracta.JmeterDsl.Azure;
using static Abstracta.JmeterDsl.JmeterDsl;

public class PerformanceTest
{
    [Test]
    public void LoadTest()
    {
        var stats = TestPlan(
            ThreadGroup(2, 10,
                HttpSampler("http://my.service")
            )
        ).RunIn(new AzureEngine(Environment.GetEnvironmentVariable("AZURE_CREDS")) // AZURE_CREDS=tenantId:clientId:secretId
            .TestName("dsl-test")
            /*
            This specifies the number of engine instances used to execute the test plan.
            In this case, it means the test will run 2 (threads in thread group) x 2 (engines) = 4 concurrent users/threads in total.
            Each engine executes the test plan independently.
            */
            .Engines(2)
            .TestTimeout(TimeSpan.FromMinutes(20)));
        Assert.That(stats.Overall.SampleTimePercentile99, Is.LessThan(TimeSpan.FromSeconds(5)));
    }
}
```
> This test uses AZURE_CREDS, a custom environment variable containing tenantId:clientId:clientSecret with proper values for each. Check the Azure Portal tenant properties for the proper tenant ID for your subscription, and follow the Azure guide to register an application with proper permissions and secret generation for test execution.

With Azure, you can not only run the test at scale but also get additional features like nice real-time reporting, historic data tracking, etc. Here is an example of how a test looks in Azure Load Testing:

[Image: Azure Load Testing Example Execution Dashboard]

Check AzureEngine for details on usage and available settings when running tests in Azure Load Testing.
WARNING
By default, the engine is configured to time out if test execution takes more than 1 hour. This timeout exists to avoid potential problems with Azure Load Testing executions that are not detected by the client, which would keep the test running indefinitely until interrupted by a user. That may incur unnecessary expenses in Azure and is especially annoying when running tests in an automated fashion, for example in CI/CD. It is strongly advised to set this timeout properly in each run, according to the expected test execution time plus some additional margin (to account for additional delays in Azure Load Testing test setup and teardown), to avoid unexpected test plan execution failures (due to timeout) or unnecessary waits when there is some unexpected issue with the Azure Load Testing execution.
TIP
If you want to get debug logs for HTTP calls to the Azure API, you can add the corresponding setting to an existing log4j2.xml configuration file.

### BlazeMeter

You can easily run a JMeter test plan at scale in BlazeMeter like this:
```cs
using Abstracta.JmeterDsl.BlazeMeter;
using static Abstracta.JmeterDsl.JmeterDsl;

public class PerformanceTest
{
    [Test]
    public void LoadTest()
    {
        var stats = TestPlan(
            // the number of threads and iterations are ultimately overwritten by the BlazeMeter engine settings
            ThreadGroup(2, 10,
                HttpSampler("http://my.service")
            )
        ).RunIn(new BlazeMeterEngine(Environment.GetEnvironmentVariable("BZ_TOKEN"))
            .TestName("dsl-test")
            .TotalUsers(500)
            .HoldFor(TimeSpan.FromMinutes(10))
            .ThreadsPerEngine(100)
            .TestTimeout(TimeSpan.FromMinutes(20)));
        Assert.That(stats.Overall.SampleTimePercentile99, Is.LessThan(TimeSpan.FromSeconds(5)));
    }
}
```
This test uses BZ_TOKEN, a custom environment variable in <KEY_ID>:<KEY_SECRET> format, to get the BlazeMeter API authentication credentials.
Note that it is as simple as generating a BlazeMeter authentication token and adding .RunIn(new BlazeMeterEngine(...)) to any existing JMeter DSL test to get it running at scale in BlazeMeter.

BlazeMeter will not only allow you to run the test at scale but also provides additional features like nice real-time reporting, historic data tracking, etc. Here is an example of how a test would look in BlazeMeter:

[Image: BlazeMeter Example Execution Dashboard]

Check BlazeMeterEngine for details on usage and available settings when running tests in BlazeMeter.
WARNING
By default, the engine is configured to time out if test execution takes more than 1 hour. This timeout exists to avoid potential problems with BlazeMeter executions that are not detected by the client, which would keep the test running indefinitely until interrupted by a user. That may incur unnecessary expenses in BlazeMeter and is especially annoying when running tests in an automated fashion, for example in CI/CD. It is strongly advised to set this timeout properly in each run, according to the expected test execution time plus some additional margin (to account for additional delays in BlazeMeter test setup and teardown), to avoid unexpected test plan execution failures (due to timeout) or unnecessary waits when there is some unexpected issue with the BlazeMeter execution.
WARNING
BlazeMeterEngine always returns 0 for sentBytes statistics, since there is no efficient way to get this metric from BlazeMeter.
TIP
In case you want to get debug logs for HTTP calls to the BlazeMeter API, you can add the corresponding setting to an existing log4j2.xml configuration file.

## Advanced threads configuration
JMeter DSL provides two simple ways of creating thread groups, which cover most scenarios:

- specifying the number of threads and the number of iterations each thread should execute before ending the test plan
- specifying the number of threads and the duration for which each thread should execute before the test plan ends

This is how they look in code:
```cs
ThreadGroup(10, 20, ...) // 10 threads for 20 iterations each
ThreadGroup(10, TimeSpan.FromSeconds(20), ...) // 10 threads for 20 seconds each
```
But these options are not enough when working with many threads or when configuring complex test scenarios (like incremental or peak tests).

### Thread ramps and holds

When working with many threads, it is advisable to configure a ramp-up period to avoid starting all threads at once, which can distort performance metrics and load generation.

You can easily configure a ramp-up with the DSL like this:
```cs
ThreadGroup().RampTo(10, TimeSpan.FromSeconds(5)).HoldIterating(20) // ramp to 10 threads over 5 seconds (1 thread every half second), then iterate each thread 20 times
ThreadGroup().RampToAndHold(10, TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(20)) // similar to the above, but after ramping up, hold execution for 20 seconds
```
Additionally, you can use and combine these same methods to configure more complex scenarios (incremental, peak, and any other types of tests) like the following one:
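A sketch of such a combined profile, using only the ramp methods just shown; the Children method for attaching samplers to the thread group is an assumption:

```cs
ThreadGroup()
    .RampToAndHold(10, TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(20))   // warm up to 10 threads
    .RampToAndHold(100, TimeSpan.FromSeconds(10), TimeSpan.FromSeconds(30)) // increment to steady load
    .RampTo(200, TimeSpan.FromSeconds(10))                                   // spike to peak load
    .RampToAndHold(100, TimeSpan.FromSeconds(10), TimeSpan.FromSeconds(30)) // back to steady load
    .RampTo(0, TimeSpan.FromSeconds(5))                                      // ramp down to finish
    .Children(
        HttpSampler("http://my.service")
    )
```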
If you are a JMeter GUI user, you may also be interested in the provided TestElement.ShowInGui() method, which shows the JMeter test element GUI and can help you understand what the DSL will execute in JMeter. You can use this method with any test element generated by the DSL (not just thread groups).

For example, for the above test plan you would get a window like the following one:
TIP
When using multiple thread groups in a test plan, consider setting a name (e.g.: ThreadGroup("main", 1, 1, ...)) on them to properly identify the associated requests in statistics & jtl results.

Check DslThreadGroup for more details.
## Test plan debugging

A usual requirement while building a test plan is to review requests and responses and debug the test plan for potential issues in the configuration or behavior of the service under test. With JMeter DSL you have several options for this purpose.

### View results tree
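A minimal sketch, assuming ResultsTreeVisualizer (named in the tips below) can be added as a test plan child:

```cs
TestPlan(
    ThreadGroup(1, 1,
        HttpSampler("http://my.service")
    ),
    ResultsTreeVisualizer() // pops up JMeter's View Results Tree while the plan runs
).Run();
```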
This will display the JMeter built-in View Results Tree element, which allows you to review request and response contents in addition to the collected metrics (time spent, sent & received bytes, etc.) for each request sent to the server, in a window like this one:
TIP
To debug test plans, use just a few iterations and threads to reduce execution time and ease tracing by having less information to analyze.
TIP
When adding ResultsTreeVisualizer() as a child of a thread group, it will only display sample results from that thread group. When added as a child of a sampler, it will only show sample results for that sampler. You can use this to review only certain sample results in your test plan.
TIP
Remove ResultsTreeVisualizer() from test plans when it is no longer needed (i.e., when debugging is finished). Leaving it in might interfere with unattended test plan execution (e.g., in CI), since test plan execution does not finish until all visualizer windows are closed.
WARNING
By default, View Results Tree only displays the last 500 sample results. If you need to display more, use the provided ResultsLimit(int) method to change this value. Take into consideration that the more results shown, the more memory required, so use this setting with care.

### Debug JMeter code
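The JVM arguments the following note refers to can be sketched with the EmbeddedJMeterEngine from earlier and the standard JDWP agent syntax; the port and exact chaining are assumptions:

```cs
var stats = TestPlan(
    ThreadGroup(1, 1,
        HttpSampler("http://my.service")
    )
).RunIn(new EmbeddedJMeterEngine()
    // suspend=y blocks the JVM until a remote debugger attaches on port 8000
    .JvmArgs("-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=*:8000"));
```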
Note that we changed the suspend flag to y to block test execution until a Remote JVM Debug session is run from the IDE. Then:

1. Run the JMeter .Net DSL test. The test should block until you start Remote JVM Debug in the Java IDE.
2. Start the Remote JVM Debug session in the Java IDE.
3. Wait for a breakpoint to activate and debug as usual 🙂.
TIP
The JMeter class in charge of executing thread logic is org.apache.jmeter.threads.JMeterThread. You can find the classes used by each DSL-provided test element by checking the Java DSL code.
### Dummy sampler

In many cases you want to test parts of a test plan without directly interacting with the service under test: avoiding any potential traffic to the servers, testing border cases that might be difficult to reproduce with the actual server, and avoiding the variability and potential unpredictability of actual server interactions. In such scenarios, you can replace actual samplers with DummySampler (which uses the Dummy Sampler plugin) to test extractors, assertions, controller conditions, and other parts of the test plan under specific conditions/results generated by the samplers.

In contrast to JMeter's default, the DSL configures dummy samplers with response time simulation disabled. This speeds up debugging, since there is no need to wait for simulated response times (sleeps/waits). If you want a more accurate emulation, you can turn it on through the ResponseTimeSimulation() method.
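A minimal sketch; the DummySampler factory and its response-body argument are assumptions based on the plugin's behavior, while ResponseTimeSimulation comes from the paragraph above:

```cs
TestPlan(
    ThreadGroup(1, 1,
        DummySampler("{\"status\": \"OK\"}") // stands in for a real sampler, always "answering" this body
            .ResponseTimeSimulation() // re-enable simulated response times (disabled by default in the DSL)
    )
).Run();
```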
Check DslDummySampler for more information on additional configuration and options.

### Test plan review in JMeter GUI

A usual requirement for new DSL users who are used to the JMeter GUI is to review the JMeter DSL generated test plan in the familiar JMeter GUI. For this, you can use the ShowInGui() method on a test plan to open the JMeter GUI with the test plan preloaded.

This can also be used to debug the test plan, by adding elements (like view results tree, dummy samplers, etc.) in the GUI and running the test plan.
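For example, a sketch of opening a plan in the GUI instead of running it; all element names come from earlier examples:

```cs
TestPlan(
    ThreadGroup(2, 10,
        HttpSampler("http://my.service")
    )
).ShowInGui(); // opens the JMeter GUI preloaded with this test plan
```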
## Reporting

Once you have a test plan, you will usually want to analyze the collected information. This section contains a few ways to achieve this; in the future, we plan to support more (as the Java DSL does). If you are interested in a feature that is not yet covered, please ask for it by creating an issue in the repository.

### Log requests and responses

The main mechanism provided by JMeter (and Abstracta.JmeterDsl) to get information about generated requests, responses, and associated metrics is the generation of JTL files.
This can be easily achieved by using the provided JtlWriter, like in this example:
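This reuses the same pattern as the first test plan in this guide; all element names come from earlier examples:

```cs
TestPlan(
    ThreadGroup(2, 10,
        HttpSampler("http://my.service")
    ),
    JtlWriter("jtls") // writes the collected sample results into the jtls directory
).Run();
```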
By default, JtlWriter writes the most commonly used information for evaluating the performance of the tested service. If you want to trace all the information of each request, you may use JtlWriter with the WithAllFields() option. Doing so provides all the information at the cost of additional computation and resource usage (leaving fewer resources for the actual load testing). You can also tune which fields to include and log only what you need; check JtlWriter for more details.

TIP
JtlWriter automatically generates .jtl files using this name format: <yyyy-MM-dd HH-mm-ss> <UUID>.jtl.
If you need a specific file name, for example for later post-processing logic (e.g., using a CI build ID), you can specify it by using JtlWriter(directory, fileName).

When specifying the file name, make sure to use unique names; otherwise, the JTL contents may be appended to previously existing jtl files.
An additional option, specially targeted towards logging sample responses, is ResponseFileSaver, which automatically generates a file for each received response. Here is an example:
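A minimal sketch; the ResponseFileSaver constructor argument (a file name prefix) is an assumption:

```cs
TestPlan(
    ThreadGroup(2, 10,
        HttpSampler("http://my.service")
    ),
    ResponseFileSaver("responses/response") // hypothetical prefix; one file is generated per received response
).Run();
```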
Check ResponseFileSaver for more details.

## Response processing

### Use part of a response in a subsequent request (aka correlation)
It is a usual requirement while creating a test plan for an application to use part of a response (e.g., a generated ID, token, etc.) in a subsequent request. This can be easily achieved using JMeter extractors and variables.

#### Regular expressions extraction
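A sketch of the idea, assuming a RegexExtractor factory matching the DslRegexExtractor class linked below, a Children method for nesting post-processors, and JMeter's ${VAR} variable syntax; the endpoints and variable names are illustrative:

```cs
TestPlan(
    ThreadGroup(2, 10,
        HttpSampler("http://my.service/accounts")
            .Post("{\"name\": \"John Doe\"}", new MediaTypeHeaderValue(MediaTypeNames.Application.Json))
            .Children(
                RegexExtractor("ACCOUNT_ID", "\"id\":\"([^\"]+)\"") // stores the first capture group in the ACCOUNT_ID variable
            ),
        HttpSampler("http://my.service/accounts/${ACCOUNT_ID}") // uses the extracted value in a subsequent request
    )
).Run();
```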
Check DslRegexExtractor for more details and additional options.

## Protocols

### HTTP

Throughout this guide, several examples have been shown for simple cases of HTTP requests (mainly how to do GETs and POSTs), but the DSL provides additional features that you might need to be aware of.

Here we show some of them, but check JmeterDsl and DslHttpSampler to explore all the available features.

#### Methods & body
As previously seen, you can do simple GETs and POSTs like in the following snippet:
HttpSampler("http://my.service")// A simple get
+HttpSampler("http://my.service")
+ .Post("{\\"field\\":\\"val\\"}",newMediaTypeHeaderValue(MediaTypeNames.Application.Json))// simple post
+
But you can also use additional methods to specify any HTTP method and body:
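A sketch; the Method and Body setter names are assumptions mirroring the Java DSL's method() and body():

```cs
HttpSampler("http://my.service")
    .Method("PUT") // any HTTP method name
    .Body("{\"field\":\"val\"}")
```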
#### Headers

You might have already noticed, in some of the examples shown, a couple of ways to set headers. For instance, in the following snippet, the Content-Type header is being set in two different ways:
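A sketch of the two ways: the two-argument Post overload is shown earlier in this section, while the dedicated ContentType setter and the single-argument Post overload are assumptions:

```cs
HttpSampler("http://my.service")
    .Post("{\"field\":\"val\"}", new MediaTypeHeaderValue(MediaTypeNames.Application.Json)) // Content-Type via the Post overload
HttpSampler("http://my.service")
    .ContentType(new MediaTypeHeaderValue(MediaTypeNames.Application.Json)) // Content-Type via a dedicated setter
    .Post("{\"field\":\"val\"}")
```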
These are handy methods for specifying the Content-Type header, but you can also set any header on a particular request using the provided Header method, like this:
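For example (the Header method is named above; the two-argument name/value signature is assumed):

```cs
HttpSampler("http://my.service")
    .Header("X-First-Header", "val1")
    .Header("X-Second-Header", "val2")
```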
Additionally, you can specify headers to be used by all samplers in a test plan, thread group, transaction controller, etc. For this, you can use HttpHeaders like this:
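A sketch, assuming HttpHeaders exposes the same Header name/value chaining:

```cs
TestPlan(
    ThreadGroup(2, 10,
        HttpHeaders().Header("X-Header", "val"), // applies to all samplers in this scope
        HttpSampler("http://my.service"),
        HttpSampler("http://my.service/users")
    )
).Run();
```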
#### Cookies & caching

JMeter DSL automatically adds a cookie manager and a cache manager for automatic handling of HTTP cookies and caching, emulating browser behavior. If you need to disable them, you can use something like this:
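A sketch, assuming HttpCookies and HttpCache factories with a Disable method, mirroring the Java DSL's httpCookies().disable() and httpCache().disable():

```cs
TestPlan(
    ThreadGroup(2, 10,
        HttpCookies().Disable(), // turn off automatic cookie handling
        HttpCache().Disable(),   // turn off automatic HTTP cache emulation
        HttpSampler("http://my.service")
    )
).Run();
```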
# Support

## Community Support

The JMeter DSL project has a vibrant and active community that provides extensive support, on a best-effort basis, to its users. Community support is primarily offered through the following channels:

- Discord server
- GitHub Issues: For bug reports, feature requests, or any specific problems you encounter while using JMeter DSL, GitHub Issues is the place to go. Create an issue, and the community will jump in to assist you, propose improvements, and collaborate on finding solutions.
- GitHub Discussions

The community is actively involved in proposing new improvements, answering questions, assisting in design decisions, and submitting pull requests. Together, we strive to enhance the capabilities and usability of JMeter DSL.

## Enterprise Support by Abstracta

In addition to community support, Abstracta offers enterprise-level support for JMeter DSL users. Abstracta is the main supporter of JMeter DSL development and provides specialized professional services to ensure the success of organizations using JMeter DSL. With Abstracta's enterprise support, you can accelerate your JMeter DSL implementation and get access to:

- Dedicated support team: Get prompt answers and peace of mind from a dedicated support team with the expertise to help you resolve issues faster.
- Customizations: Receive tailored solutions to meet your specific requirements.
- Consulting services: Access a team of experts to fine-tune your JMeter DSL usage, speed up implementation, and work on your performance testing strategy and overall testing processes.

Abstracta is committed to helping organizations succeed with JMeter DSL by providing comprehensive support and specialized services tailored to your enterprise needs.
JMeter DSL has received valuable support from industry-leading companies (Abstracta, BlazeMeter, OctoPerf, and Azure), contributing to integration features and promoting the tool. We would like to acknowledge and express our gratitude to these companies.