Has anyone performance tested a PEGA CRM application?
This is my first time performance testing something more complex than what I'm used to, and I'm running into a good number of issues figuring it out.
It fires tons of asynchronous/AJAX requests, and for the most part I believe I've managed to correlate most of them. There are a lot of values that need correlating, but it is extremely hard to know which ones actually do.
One thing I had to do was put the headers in the correct order, and then it worked? I'm not sure whether header order should ever matter.
For example, there are tons of things like:
pzBFP (most likely generated within the source code; not sure how to generate it myself)
pzPostData (is part of the URL at the end, so it may be something like a session token)
pzHarnessID (I believe this passes parameters to fill parts of the UI)
TABTHREAD (I believe this is just a tab, like in a browser; if I opened another tab in the web app to do something, it would create a new tab)
pzCTKN (this is the CSRF token)
pzTransactionID
AJAXTRACKID
pzuiactionzzz
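Once you know where a value like pzTransactionId appears in a response, a JMeter post-processor can capture it for reuse in later requests. A minimal sketch of a Regular Expression Extractor as it would appear in a .jmx file (the regex, reference name, and the assumption that the value appears as `pzTransactionId=...` in the response body are all illustrative; adjust them to how the value actually appears in your recordings):

```xml
<RegexExtractor guiclass="RegexExtractorGui" testclass="RegexExtractor"
                testname="Extract pzTransactionId" enabled="true">
  <!-- search the response body, not the headers -->
  <stringProp name="RegexExtractor.useHeaders">false</stringProp>
  <!-- later samplers reference the value as ${pzTransactionId} -->
  <stringProp name="RegexExtractor.refname">pzTransactionId</stringProp>
  <!-- assumed pattern; verify against an actual response -->
  <stringProp name="RegexExtractor.regex">pzTransactionId=(\w+)</stringProp>
  <stringProp name="RegexExtractor.template">$1$</stringProp>
  <!-- an obvious sentinel makes extraction failures easy to spot -->
  <stringProp name="RegexExtractor.default">pzTransactionId_NOT_FOUND</stringProp>
  <stringProp name="RegexExtractor.match_number">1</stringProp>
</RegexExtractor>
```

Attaching the extractor as a child of the sampler whose response contains the value, and then substituting `${pzTransactionId}` into subsequent requests, is the usual pattern; the same approach applies to pzHarnessID, AJAXTRACKID, and the rest.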
A lot of these values appear and are then regenerated with new pzTransactionID, AJAXTRACKID, and pzHarnessID values throughout the workflow. I'm not entirely sure why some of the requests don't work sometimes, even though the headers, request body, parameters, and correlated values all match what I see in Fiddler, the browser dev tools, and the recorded scripts.
It's not. Just record the same test scenario twice using the HTTP(S) Test Script Recorder. All the values which differ between the two recordings are subject to correlation (or parameterization using suitable JMeter Functions, like timestamps, random strings, etc.)
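The record-twice approach can even be partially scripted: dump the recorded request URLs from each run and diff the query parameters to surface the dynamic ones. A minimal sketch (the URLs and parameter names below are hypothetical examples, not real Pega traffic):

```python
from urllib.parse import urlparse, parse_qs

def dynamic_params(recording_a, recording_b):
    """Compare two recordings of the same scenario and report the
    query parameters whose values differ between runs -- these are
    the candidates for correlation."""
    candidates = set()
    # assumes both recordings contain the same requests in the same order
    for url_a, url_b in zip(recording_a, recording_b):
        params_a = parse_qs(urlparse(url_a).query)
        params_b = parse_qs(urlparse(url_b).query)
        for name in params_a.keys() & params_b.keys():
            if params_a[name] != params_b[name]:
                candidates.add(name)
    return sorted(candidates)

# Hypothetical URLs: pzTransactionId differs between runs, so it
# needs correlation; pyActivity is static, so it does not.
run1 = ["https://host/prweb?pyActivity=DoThing&pzTransactionId=abc123"]
run2 = ["https://host/prweb?pyActivity=DoThing&pzTransactionId=xyz789"]
print(dynamic_params(run1, run2))  # ['pzTransactionId']
```

The same idea extends to POST bodies and headers; anything that differs run-to-run either needs to be extracted from an earlier response or generated (timestamp, random string) at runtime.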
There are also semi-automated and automated correlation solutions, like the Correlation Recorder Plugin.
You might also be interested in PEGA community materials: