OAuth Authorization Code Flow
Description
This scenario covers the OAuth authorization code flow. At the beginning of the test, basic authentication is performed, returning an SSO session identifier. The same session is reused by all subsequent test iterations. Each test iteration starts with a request to the authorization endpoint requesting an authorization code; this request carries the SSO session identifier. The next request presents the authorization code to the token endpoint and obtains an access token and a refresh token.
Each thread (user) creates a new session at the beginning, which is then reused by all test iterations performed by that thread. Thus, obtaining the session, including authentication, has minimal impact on the test result.
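For illustration, the following minimal Java sketch shows the two requests of one test iteration outside JMeter. The endpoint paths, cookie name, client values, and class name are illustrative assumptions, not the actual DirX Access values; in the real test they correspond to the User Defined Variables listed under Testing Tool Input below.

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class AuthorizationCodeIteration {

    // Placeholder values; in the JMX file these come from the
    // User Defined Variables described in the Variables section.
    static final String IDP = "https://idp.example.com:443";                  // IDP_HOST_NAME:IDP_HOST_PORT
    static final String AUTHZ_PATH = "/oauth2/authorize";                     // OAUTH_PROVIDER_AUTHZ_PATH (assumed)
    static final String TOKEN_PATH = "/oauth2/token";                         // OAUTH_PROVIDER_TOKEN_PATH (assumed)
    static final String CLIENT_ID = "perf-test-client";                       // OAUTH_CLIENT_ID
    static final String CLIENT_SECRET = "change-me";                          // OAUTH_CLIENT_SECRET
    static final String REDIRECT_URI = "https://client.example.com/callback"; // REDIRECT_URI

    public static void main(String[] args) throws Exception {
        // Redirects are not followed so the authorization code can be
        // read from the Location header of the redirect response.
        HttpClient http = HttpClient.newBuilder()
                .followRedirects(HttpClient.Redirect.NEVER)
                .build();

        // SSO session identifier obtained once by basic authentication;
        // the cookie name here is an assumption.
        String ssoCookie = "SSO_SESSION=<session-id>";

        // Step 1: request an authorization code from the authorization endpoint.
        String authzUrl = IDP + AUTHZ_PATH
                + "?response_type=code"
                + "&client_id=" + enc(CLIENT_ID)
                + "&redirect_uri=" + enc(REDIRECT_URI);
        HttpRequest authzRequest = HttpRequest.newBuilder(URI.create(authzUrl))
                .header("Cookie", ssoCookie)
                .GET()
                .build();
        HttpResponse<Void> authzResponse =
                http.send(authzRequest, HttpResponse.BodyHandlers.discarding());

        // The code is returned as a query parameter of the redirect URI.
        String location = authzResponse.headers().firstValue("Location").orElseThrow();
        String code = location.replaceAll(".*[?&]code=([^&]+).*", "$1");

        // Step 2: exchange the authorization code for access and refresh tokens.
        String form = "grant_type=authorization_code"
                + "&code=" + enc(code)
                + "&redirect_uri=" + enc(REDIRECT_URI);
        String basicAuth = Base64.getEncoder().encodeToString(
                (CLIENT_ID + ":" + CLIENT_SECRET).getBytes(StandardCharsets.UTF_8));
        HttpRequest tokenRequest = HttpRequest.newBuilder(URI.create(IDP + TOKEN_PATH))
                .header("Authorization", "Basic " + basicAuth)
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(form))
                .build();
        HttpResponse<String> tokenResponse =
                http.send(tokenRequest, HttpResponse.BodyHandlers.ofString());
        System.out.println(tokenResponse.body()); // JSON containing access_token and refresh_token
    }

    static String enc(String s) {
        return URLEncoder.encode(s, StandardCharsets.UTF_8);
    }
}
```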
Deployment Diagram
The deployment is as described in the General Deployment Diagram.
Sequence Diagram
Figure 1 depicts one iteration of the test, focusing on the OAuth authorization code flow; it does not include basic authentication.
Considered Influence Factors
The fact that the SSO session generated in the first test iteration is used by all subsequent requests minimizes the impact of authentication and SSO session generation on the test result. This allows the test to focus on the OAuth authorization code flow itself: its computational complexity and its JVM memory and cache memory requirements.
Chosen Testing Types
Maximal Performance State
Results
Figure 2 depicts the values collected when performing this testing scenario in both single-server and dual-server deployments. With a correctly configured load balancer, the throughput almost doubles when the second server is added.
Figure 2 also shows that the throughput remains almost constant with regard to the number of existing SSO sessions and issued OAuth tokens, even at counts in the hundreds of thousands.
What is specific about this OAuth scenario (compared to some other performance testing scenarios) is that it consists of multiple request/response pairs. Figure 2 depicts only the most significant requests, the Access Token Requests. To reach the full potential of our OAuth implementation, we had to significantly extend our testing environment.
A comparison of the SAML and OAuth scenarios shows that the throughput of OAuth is several times higher than that of SAML. Consider this in situations where federation is needed with components that support both protocols.
Recommended/Expected Values
The values shown in Figure 2 are generally the values to aim for when configuring the system.
Testing Tool Input
The JMX file enclosed below can be imported and, with marginal changes depending on the environment parameters, used for testing in the target environment.
Variables
The JMX file should be updated to match your environment. Open the JMX file in JMeter, select OAuthIdpAuthorizationCodeFlowTest, and update the User Defined Variables:
- IDP_HOST_NAME - the DirX Access server host name.
- IDP_HOST_PORT - the DirX Access server port where the OAuth endpoints are exposed.
- OAUTH_PROVIDER_AUTHZ_PATH - the OAuth Authorization endpoint URL path.
- OAUTH_PROVIDER_TOKEN_PATH - the OAuth Token endpoint URL path.
- OAUTH_CLIENT_ID - the client_id of the OAuth client.
- OAUTH_CLIENT_SECRET - the client_secret of the confidential OAuth client.
- REDIRECT_URI - the OAuth client redirect_uri.
- AUTHN_APP_PATH - the URL path of the DirX Access authentication application.
- USERS_PATH - the path to a CSV file containing <user name>, <password> rows used for authentication (see the example below).
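For illustration, a USERS_PATH CSV file could look as follows (the user names and passwords are placeholders):

```
perfuser001,Password001
perfuser002,Password002
perfuser003,Password003
```

With the variables set, the test can then be run, for example, in JMeter's non-GUI mode; the file names below are placeholders:

```
jmeter -n -t OAuthIdpAuthorizationCodeFlowTest.jmx -l results.jtl
```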
Unstable Environment
Results

Figure 3 depicts the behavior of this testing scenario in an unstable environment. The expected behavior, clearly visible in the graph, is that once the system is fully functional again after a malfunction, the throughput returns to a stable value. The graph depicts successful requests only.
The graph contains annotations marking the unstable behavior. The first part of our stability testing consists of turning the nodes (DXA Servers and DXA Cache Servers) off and on. The second part uses software to simulate network instability. The whole stability testing scenario takes 2 hours and 14 minutes, and as in the Performance section, we focus only on Access Token Requests. The DXA nodes and their corresponding graph annotations are listed in the following table:
| DirX Access node | Graph annotation |
|---|---|
| DirX Access Server 1 | S1 |
| DirX Access Server 2 | S2 |
| DirX Access Cache Server 1 | Cache 1 |
| DirX Access Cache Server 2 | Cache 2 |
In the first part, we focus on the DXA Servers and their ability to process requests with both, one, or no servers running. Figure 2 depicts the throughput of this testing scenario, for which we had to extend the testing environment; the stability testing does not need this extended environment, because here we focus mainly on the return to the original throughput. This is why turning off one of the DXA Servers is not visible on the graph.

The next tests focus on the DXA Cache Servers and how they affect the OAuth behavior. Here we focus on the time at which the cache servers were turned back on, testing the assumption that none of the servers or cache servers notices when the other cache server goes down. While testing this, keep in mind that one cache server is configured as the leader (Cache 1) and the other as a follower (Cache 2). The graph shows that turning one cache server off lowers the throughput, but only for a moment. Turning both cache servers off, on the other hand, results in errors, because the OAuth scenario depends on the cache servers.
The second part uses software to simulate network instability. The goal here is not to cut off a specific node and interrupt every communication attempt, but to interrupt the communication between selected nodes. The graph annotations describe the way in which the communication is interrupted: bidirectional means that communication between the mentioned nodes is interrupted in both directions, while unidirectional means that the first mentioned node cannot communicate with the second, but the reverse direction is still possible.
As can be seen, interrupting the bidirectional communication between the cache servers decreases the throughput, but the OAuth scenario is still able to fulfill the received requests.
The last three annotations are a bit more complex than the previous ones. For example, consider the last annotation: bidirectional S1, S2 → Cache 2 & Cache 1 Cache 2. It means that neither DXA Server can communicate with Cache 2 and Cache 2 cannot communicate with them, while the DXA Servers can still communicate with each other and with Cache 1. The part after the & indicates that the DXA Cache Servers cannot communicate with each other, but Cache 1 can still communicate with the DXA Servers.
The last few minutes on the graph conclude our testing round and show that the testing environment has returned to its original values. The whole stability test shows that our OAuth implementation recovers from every simulated instability of the network environment.