Oracle® Fusion Middleware Administrator's Guide for Oracle Access Manager 11g Release 1 (11.1.1), Part Number E15478-02
The Oracle Access Manager Access Tester enables IT professionals and administrators to simulate interactions between registered OAM Agents and OAM 11g Servers to help troubleshoot issues involving agent connections and to test policy definitions. This chapter introduces the Oracle Access Manager Access Tester and how to use it. The following topics are provided:
Before you can perform tasks in this chapter:
Ensure that the OAM Administration Console, OAM run-time Server, and registered OAM Agent are running
Confirm the application domain and policies for one or more resources, as described in Chapter 9.
The Access Tester is a portable, stand-alone Java application that ships with Oracle Access Manager 11g. The Access Tester provides a functional interface between an individual IT professional or administrator and the OAM Server.
IT professionals can use the Access Tester to verify connectivity and troubleshoot problems with the physical deployment. Application administrators can use the Access Tester to perform a quick validation of policies. In this chapter, the term "administrator" represents any individual who is using the Access Tester.
The Access Tester can be used from any computer, either within or outside the WebLogic Server domain. Both a graphical user interface (known as the Console in this chapter) and a command-line interface are provided. Command line mode enables complete automation of test script execution in single or multi-client mode environments.
By appearing to be a real agent, the Access Tester helps with policy configuration design and troubleshooting, and sometimes with troubleshooting OAM Server responsiveness. When using the Access Tester, you must appear to be the real end user; the Access Tester does not actually communicate with a real end user.
To use the Access Tester, you must understand and administer authentication and authorization policies for an application or resource that is protected by Oracle Access Manager 11g.
The Access Tester enables you to:
Configure a request to be sent to the OAM Server that emulates what a real agent would send to the OAM Server in a real environment.
Send your request to the OAM Server and receive a response that is identical to the response a real Agent would receive. The Access Tester uses the OAM Access Protocol (OAP) API to send requests over the OAP channel to the OAM Proxy running as part of the OAM Server. The OAM Server processes the request and returns a response.
Process and display the server response.
Proceed in the manner a real agent would to handle the response. For example, if a WebGate determines that a resource is protected by a certificate authentication scheme, then it must obtain the end user's certificate from the http SSL connection.
In the case of a certificate authentication scheme, you must point the Access Tester to a certificate to be used as the end user's credentials.
In addition to simulating the Agent while performing functions in the previous list, the Access Tester enables you to:
Review performance characteristics of intended policy changes
Track the latency of authentication and authorization requests
Stress test the OAM Server to establish low- and high-performance watermarks relative to desired user loads, and to size back-end hardware
Establish performance metrics and measure them on an ongoing basis to prove desired outcomes
During basic operations, the Access Tester does not make any determination about the Server response and whether it is a right or wrong response (for instance, whether or not resource X is protected, or user Y is authorized to access resource X). When operating the Access Tester, you must be aware of the policy configuration to determine if a specific response is appropriate.
The Access Tester offers advanced functionality that enables you to group a number of individual requests into a test script that can be sent to the OAM Server for processing. The output of such a test run can be captured by the Access Tester and used to compare against a similar document containing "known good" responses. In this way, the Access Tester can be used for automated testing of policy configuration against errant changes.
For more information, see the following topics in this chapter:
The two primary types of actors in the OAM architecture are the policy servers (OAM Servers) and OAM policy enforcement agents (WebGates or AccessGates). In the security world, Agents represent the policy enforcement point (PEP), while OAM Servers represent the policy decision point (PDP):
The Agent plays the role of a gatekeeper to secure resources such as http-based applications and manage all interactions with the user who is trying to access that resource. This is accomplished according to access control policies maintained on the policy server (OAM Server).
The role of the OAM Server is to provide policy, identity, and session services to the Agent to properly secure application resources, authenticate and authorize users, and manage user sessions.
This core OAM product architecture revolves around the following exchanges, which drive the interaction between the Agent and OAM Server. To expose interoperability and the key decision points, Figure 10-1 illustrates a typical OAM Agent and OAM Server interaction during a user's request for a resource.
The following overview outlines the processing that occurs between OAM Agents and OAM Servers. During testing, the Access Tester emulates the Agent and communicates with the OAM Server while the administrator emulates the end user.
Process overview: Interoperability between OAM Agents and OAM Servers
Establish server connectivity: The registered OAM Agent connects to the OAM Server.
The user requests access to a resource.
Validate resource protection: The Agent forwards the request to the OAM Server to determine if the resource is protected.
Protected: The OAM Server responds with the type of credentials required.
User credentials: Establishing the user identity enables tracking for audit and SSO purposes, and conveyance to the application. For this, the Agent prompts the user for credentials.
Authenticate user credentials: The Agent forwards the supplied user credentials to the OAM Server for validation.
Authentication Success: The Agent forwards the resource request to the OAM Server.
Authorize user access to a resource: The Agent must first determine whether the user is allowed to access the resource by forwarding the request for access to the OAM Server for authorization policy evaluation.
The Agent grants or denies access based on the policy response.
The Access Tester supports only Open and Simple connection modes for communication with the OAM Server.
Note:
The Access Tester does not currently support OAM Servers and Agents configured for Cert mode transport security.

The Access Tester encrypts all password-type values that it saves to configuration files and test cases. All network connectivity inherits the NetPoint Access Protocol (NAP) limit of a single connection pool (one primary or secondary connection pool).
Persistence: The Access Tester manages a number of data structures that require persistent storage between Access Tester invocations. XML-file-based storage is provided for the following types of information:
Configuration data to minimize data entry between invocations of the application (OamTestConfiguration)
Test scripts consisting of captured test cases (OamTestScriptCase)
Statistical data representing execution metrics from a test run (OamTestStats)
XML Files for Input, Logging, and Analysis: The following XML files are produced when you run the Access Tester to process test scripts:
Configuration Script: config.xml is the output file generated using the Save Configuration command within the Access Tester. The name of this document is used within the input script to provide proper connection information to the Access Tester running in command line mode. For details, see "About the Saved Connection Configuration File".
Input Script: script.xml represents a script that is generated by the Access Tester after capturing one or more test cases. For details, see "About the Generated Input Test Script".
Target Output Script: oamtest_target.xml is generated by running the Access Tester in command line mode and specifying the input script. For details, see "About the Target Output File Containing Test Run Results". For example: -Dscript.scriptfile="script.xml" -jar oamtest.jar
Statistics: oamtest_stats.xml is generated together with the output script. For details, see "About the Statistics Document".
Execution Log: oamtest_log.log is generated together with the output script. For details, see "About the Execution Log".
For more information, see "About Access Tester Modes and Administrator Interactions".
In Console mode, the Access Tester provides a single window for interactions with the user. All Access Tester operations are available in the main window, which performs as a central dashboard where users can submit specific details for the test case and view responses.
Alternatively, you can use the Access Tester in command line mode and develop test scripts, which you can run interactively or in batch mode for automated execution to maximize productivity and minimize costs and resources.
Run-Time: The Access Tester requires nap-api.jar in the same directory as the main jar oamtest.jar. Starting the application requires oamtest.jar.
Regardless of the mode you choose for running the Access Tester, your primary interactions with the Access Tester include:
Issuing Requests and Reviewing Results
You use the Access Tester to issue requests to the OAM Server to validate resource protection, policy configuration, user authentication, and user authorization. You can immediately analyze test case results and also retain the data for longer-term analysis, if needed.
Managing Test Scripts
You can build test scripts by capturing the data generated by test execution, which is available as stand-alone documents. You can run the test script for manual or automated analysis. The Access Tester provides for some automated analysis after each test run, while collecting a full set of statistics to enable after-the-fact analysis.
Managing OAM Server Connectivity
You can manage application settings that include server connection information.
Figure 10-2 depicts the flow of information during operations in both Console and command-line modes. Details follow the figure. Advanced operations include building and executing test scripts.
Note:
Table 10-1 describes the process flow of information during both Console mode operations and command-line mode operations.
Table 10-1 User Interactions Using Console Mode versus Command Line Mode Operations
Console mode | Command Line Mode
---|---
The user starts the Access Tester from the command line. | The user or a shell script starts the Access Tester in command line mode.
The user opens a previously saved configuration file to populate the application fields, including server connection fields, and minimize data entry. Alternatively, the user can use the Console and enter data manually. | The Access Tester starts processing test cases based on the input script.
The user clicks the Connect button to open the connection with the OAM Server. | The Access Tester opens a connection with the OAM Server based on details in the input script.
Resource Protection: The user performs steps in a sequence to validate resource protection, authenticate user credentials, and authorize user access. | Resource Protection: The Access Tester starts processing test cases based on the input script.
When the test completes, the Access Tester generates the output files described in "XML Files for Input, Logging, and Analysis". | Once the script completes, the Access Tester generates the output files described in "XML Files for Input, Logging, and Analysis".
The user repeats steps as needed to complete validation. | The user repeats steps as needed to complete validation.
The following overview outlines the tasks involved with using the Access Tester, and the topics where more information can be found in this chapter.
Task overview: Testing OAM 11g connections and policies includes
Review the following topics:
Perform and capture tests using the Access Tester Console as described in "Testing Connectivity and Policies from the Access Tester Console":
The Access Tester consists of two jar files that can be used from any computer, either within or outside the WebLogic Server domain. This section describes how to install the Access Tester, which involves copying the Access Tester jar files to a computer from which you want to run tests. The Access Tester must be started from a command line regardless of the mode you choose for test input: Console mode or command line mode. This section is divided into the following topics:
Starting the Access Tester Without System Properties For Use in Console Mode
Starting the Access Tester with System Properties For Use in Command Line Mode
This topic describes how to install the Access Tester for use on any computer. Following installation, the Access Tester is ready to use. No additional setup is required.
To install the Access Tester
Ensure that the computer from which the tester will be run includes JDK/JRE 6. For example, you can test for Java as follows:
java -version
The previous command returns the following information:
java version "1.6.0_18"
Java(TM) SE Runtime Environment (build 1.6.0_18-b07)
Java HotSpot(TM) Client VM (build 16.0-b13, mixed mode)
On a computer hosting the OAM Server, locate and copy the Access Tester Jar files. For example:
Oracle_HOME/oam/server/tester/oamtest.jar
Oracle_HOME/oam/server/tester/nap-api.jar
Store the jar file copies together in the same directory on any computer from which you want to run the Access Tester.
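For example, the staging step might look like the following sketch. The ORACLE_HOME default and the destination directory are assumptions for illustration; substitute the paths used in your environment.

```shell
# Stage the Access Tester jars side by side in a local directory.
# The ORACLE_HOME default and destination path are hypothetical examples.
SRC="${ORACLE_HOME:-/u01/app/oracle/middleware}/oam/server/tester"
DEST="$HOME/oamtester"
mkdir -p "$DEST"
for jar in oamtest.jar nap-api.jar; do
  # oamtest.jar expects nap-api.jar in the same directory at run time,
  # so both files must land in the same destination.
  if [ -f "$SRC/$jar" ]; then
    cp "$SRC/$jar" "$DEST/"
  fi
done
echo "Jar staging directory: $DEST"
```

Copying both files in one pass avoids the most common startup failure, in which oamtest.jar cannot locate nap-api.jar.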
Proceed as follows, depending on your environment and requirements:
Starting the Access Tester Without System Properties For Use in Console Mode enables you to manually drive requests.
Starting the Access Tester with System Properties For Use in Command Line Mode
Executing a Test Script enables you to use a test script that has been created against a "Known Good" policy configuration and marked as "Known Good".
The Access Tester supports a number of configuration options that are used for presentation or during certain aspects of testing. These options are specified at startup using the Java -D mechanism, as shown in Table 10-2, which describes all supported system properties.
Table 10-2 Access Tester Supported System Properties
Property | Access Tester Mode | Description and Command Syntax
---|---|---
log.traceconnfile | Console and Command Line modes | Logs connection details to the specified file name. -Dlog.traceconnfile="<file-name>"
display.fontname | Console mode | Starts the Access Tester with the specified font. This can be useful in compensating for differences in display resolution. -Ddisplay.fontname="<font-name>"
display.fontsize | Console mode | Starts the Access Tester with the specified font size. This can be useful in compensating for differences in display resolution. -Ddisplay.fontsize="<font-size>"
display.usesystem | Console mode | Starts the Access Tester with the default font name and size (Dialog font, size 10). -Ddisplay.usesystem
script.scriptfile | Command Line mode | Runs the script <file-name> in command line mode. -Dscript.scriptfile="<file-name>"
control.configfile | Command Line mode | Overrides the script's "configfile" attribute, which contains the absolute path to the configuration XML file with the connection information. The Access Tester uses the configuration file to establish a connection to the Policy Server indicated by the Connection element. -Dcontrol.configfile="<file-name>"
control.testname | Command Line mode | Overrides the script's "testname" attribute of the Control element, a string naming the test series to be used in naming output script, stats, and log files. Output log files begin with <testname>_<testnumber>. -Dcontrol.testname="<String>"
control.testnumber | Command Line mode | Specifies the control number to be used in naming output script, stats, and log files. Output log files begin with <testname>_<testnumber>. Although the auto-generated string is a 7-digit number based on the current local time (2-character minutes + 2-character seconds + 3-character hundredths), any string can be used as the control number as long as it is valid in a file name. -Dcontrol.testnumber="<String>"
control.ignorecontent | Command Line mode | Overrides the script's "ignorecontent" attribute of the Control element, which indicates whether the Access Tester should ignore differences in Content between the original test case and current results. -Dcontrol.ignorecontent="true" (or "false")
control.loopback | Command Line mode | Runs the Access Tester in loopback mode to test the Access Tester for internal regressions against a known good script. Used for unit testing the Access Tester. -Dcontrol.loopback="true"
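As a rough illustration of the auto-generated control number format described for control.testnumber (minutes + seconds + a 3-character sub-second component of the current local time), a shell approximation might look like this. It is not the tool's actual code, and the sub-second value here is a stand-in:

```shell
# Build a 7-character control number: MMSS plus a 3-digit suffix.
# This approximates the format Table 10-2 describes; the suffix is a
# stand-in for hundredths, since plain POSIX date has no sub-second field.
mmss=$(date +%M%S)              # 4 characters: minutes and seconds
sub=$(( $(date +%s) % 1000 ))   # stand-in for the sub-second component
ctrl=$(printf '%s%03d' "$mmss" "$sub")
echo "control.testnumber candidate: $ctrl"
```

Any such string is acceptable as long as it can appear in a file name, since it is embedded in the names of the output script, stats, and log files.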
To manually drive (and capture) requests and view real-time response through the graphical user interface, start the tester in Console mode. This procedure omits all system properties, even though several can be used with Console mode.
The jar file defines the class to be started by default; no class name need be specified. Ensure that the nap-api.jar is present in the same directory as oamtest.jar.
See Also:
To start the Access Tester in console mode without system properties
From the directory containing the Access Tester jar files, enter the following command:
java -jar oamtest.jar
Proceed to one of the following topics for more information:
This section is divided into the following topics:
To run a test script, or to customize Access Tester operations, you must start the tester in command line mode and include system properties using the Java -D option.
When running in command line mode, the Access Tester returns completion codes that can be used by shell scripts to manage test runs. When you run the Access Tester in Console mode, you do not need to act upon codes that might be returned by the Access Tester.
Shell scripts that wrap the Access Tester to execute specific test cases must be able to recognize and act upon exit codes communicated by the Access Tester. In command line mode, the Access Tester exits using System.exit(N), where N can be one of the following codes:
0 indicates successful completion of all test cases with no mismatches. This also includes a situation where no test cases are defined in the input script.
3 indicates successful completion of all test cases with at least one mismatch.
1 indicates that an error prevented the Access Tester from running or completing test cases. This includes conditions such as No input script specified, Unable to read the input script, Unable to establish server connection, Unable to generate the target script.
These exit codes can be picked up by shell scripts ($? in the Bourne shell) designed to drive the Access Tester to execute specific test cases.
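A wrapper script can branch on these codes. The sketch below is a minimal example; the jar location and script path are assumptions, and the java invocation is commented out so the helper functions can be exercised on their own:

```shell
# Hypothetical wrapper around a command line mode test run. The jar and
# script paths are illustrative; report_exit maps the documented exit codes.
run_test() {
  java -Dscript.scriptfile="$1" -jar oamtest.jar
}

report_exit() {
  # Translate an Access Tester exit code into a human-readable result.
  case "$1" in
    0) echo "PASS: all test cases completed with no mismatches" ;;
    3) echo "MISMATCH: completed with at least one mismatch" ;;
    1) echo "ERROR: the Access Tester could not run or complete the test cases" ;;
    *) echo "UNKNOWN: unexpected exit code $1" ;;
  esac
}

# Typical use (requires oamtest.jar and a captured input script):
# run_test tests/script.xml
# report_exit $?
```

A nightly job could call report_exit after each run and fail the build on any non-zero code, which is the usual way to catch errant policy changes automatically.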
Use the following procedure to start the Access Tester in command line mode and specify any number of configuration options using the Java -D mechanism.
To start the Access Tester with system properties or for use in command line mode
From the directory containing the Access Tester jar files, enter the command with the appropriate system properties for your environment. For example:
java -Dscript.scriptfile="\tests\script.xml" -Dcontrol.ignorecontent="true" -jar oamtest.jar
After startup, proceed to one of the following topics for more information:
This section introduces the Access Tester Console, navigation, and controls.
Figure 10-3 shows the fixed-size Access Tester Console. This is the window through which users can interact with the application if the Access Tester is started in Console mode. The window cannot be resized. Details follow the figure.
At the top of the main window are the menu names within a menu bar. Under the menu bar is the tool bar. All of the commands represented by buttons in the tool bar are also available as menu commands. The Access Tester Console is divided into four panels, described in Table 10-3.
Table 10-3 Access Tester Console Panels
Panel Name | Description
---|---
Server Connection | Provides fields for the information required to establish a connection to the OAM Server (a single primary server and a single secondary server), and the Connect button. See also: "Establishing a Connection Between the Access Tester and the OAM Server".
Protected Resource URI | Provides information about a resource whose protected status needs to be validated. The Validate button is used to submit the Validate Resource server request. See also: "Validating Resource Protection from the Access Tester Console".
User Identity | Provides information about a user whose credentials need to be authenticated. The Authenticate button is used to submit the Authenticate User server request; the Authorize button is used to submit the Authorize User server request. See also: "Testing User Authentication from the Access Tester Console".
Status Messages | Provides a scrollable status message area containing messages displayed by the application in response to user actions. See also: "Observing Request Latency".
Text fields support right-clicking to display the Edit menu and drag-and-drop operations using the mouse and cursor.
There are four primary buttons through which you submit test requests to the OAM Server. Each button acts as a trigger to initiate the named action described in Table 10-4.
Table 10-4 Command Buttons in Access Tester Panels
Panel Button | Description
---|---
Connect | Submits connection information and initiates connecting.
Validate | Submits information provided in the Protected Resource URI panel and initiates validation of protection.
Authenticate | Submits information provided in the User Identity panel and initiates authentication confirmation.
Authorize | Submits information provided in the User Identity panel and initiates authorization confirmation.
Table 10-5 identifies additional Access Tester Console buttons and their use. All command buttons provide a tip when the cursor is on the button.
Table 10-5 Additional Access Tester Buttons
Command Buttons | Description
---|---
Open Configuration (icon) | Loads connection configuration details that were saved to an XML file (config.xml, by default). You can refresh the information in the Console by clicking this button.
Save Configuration (icon) | Saves connection configuration details to a file (default name, config.xml). You can add the name of this document to the input script to provide proper connection information to the Access Tester running in command line mode. The Save command button at the bottom of the Console saves the content of the Status Messages panel to a log file.
Clear (icon) | Clears fields on a panel containing the icon. The tool bar action clears all fields except connection fields if the connection has already been established.
Capture (icons) | Captures the last named request to the capture queue, together with the corresponding response received from the OAM Server. Together, the request and response create a test case. The capture queue status at the bottom of the Console is updated to reflect the number of test cases in the queue. You can save the contents of the capture queue to create a test script containing multiple test cases using the Generate Script command on the Test menu or a command button.
Generate Script (icon) | Generates a test script that includes every test case currently in the capture queue, and asks if the queue should be cleared. Do not clear the queue until all your test cases have been captured and saved to a test script.
Run Script (icon) | Runs a test script against the current OAM Server. The Status Messages window is populated with the execution status as the script progresses through each test case.
Import URI (icon) | Imports a copied URI from the clipboard after parsing it to populate fields in the URI panel.
Show Password (icon) | Displays a dialog showing the password in clear text.
The Access Tester provides the menus described in Table 10-6. All menu items have mnemonics that are exposed by holding down the ALT key (on Windows systems). There are also command accelerators (keyboard activation) available using the CTRL-<KEY> combination defined for each menu command.
Table 10-6 Access Tester Menus
Menu Title | Menu Commands
---|---
File | Note: To minimize the amount of data entry, the Save Configuration and Open Configuration menu commands (and tool bar command buttons) allow specific Connection, URI, and Identity information to be saved to (and read from) a file. Thus, it becomes fairly simple to manage multiple configurations. Also, the configuration file can be used as input to the Access Tester when you run it in command line mode and execute a test script.
Edit | Provides standard editing commands, which act on fields.
Test | Note: You can use functions here to capture the last request and response to create a test case that you can save to a test script to be run at a later time.
Help |
This section describes how to perform quick spot checks using the Access Tester in Console mode with OAM Servers.
Spot checks or troubleshooting connections between the Agent and OAM Server can help you assess whether the Agent can communicate with the OAM Server, which is especially helpful after an upgrade or product migration. Spot checks or troubleshooting resource protection that can be exercised by Agents and OAM Servers can help you develop end-to-end tests of policy configuration during the application lifecycle.
The following overview identifies the tasks and sequence to be performed and where to locate additional information about each task.
Note:
You can capture each request and response pair to create a test case, and save the test cases to a script file that can be run later. For details, see "Creating and Managing Test Cases and Scripts".

Task overview: Performing spot checks from the Access Tester Console
Start the Access Tester, as described in "Installing and Starting the Access Tester".
Add relevant details to the Server Connection panel and click Connect, as described in "Establishing a Connection Between the Access Tester and the OAM Server".
Enter or import details into the Protected Resource URI pane and click Validate, as described in "Validating Resource Protection from the Access Tester Console".
Add relevant details to the User Identity panel and click Authenticate, as described in "Testing User Authentication from the Access Tester Console".
After successful authentication, click Authorize in the User Identity panel, as described in "Testing User Authorization from the Access Tester Console".
Check the latency of requests, as described in "Observing Request Latency".
Before you can send a request to the OAM Server you must establish a connection between the Access Tester and the server. This section describes how to establish that connectivity.
You enter required information for the OAM Server and the Agent you are emulating in the Access Tester Connection panel and then click the Connect button. The Tester initiates the connection, and displays the status in the Status Messages panel. Once the connection is established, it is used for all further operations.
Caution:
Once the connection is established, it cannot be changed until you restart the Access Tester Console.

Figure 10-4 illustrates the Server Connection panel and controls.
Table 10-7 describes the information needed to establish the connection. The source of your values is the OAM Administration Console, System Configuration tab.
Table 10-7 Connection Panel Information
After entering information and establishing a connection, you can save details to a configuration file that can be re-used later.
Use the following procedure to submit your connection details for the OAM Server.
Prerequisites
Installing and Starting the Access Tester
See Also:
"About the Connection Panel"

To test connectivity between the Access Tester and the OAM Server
In the Server Connection Panel (Table 10-7), enter:
Primary and secondary OAM Proxy details
Timeout period
Communication encryption mode
Agent details
Click the Connect button.
Beside the Connect button, look for the green check mark indicating the connection is established.
In the Status Messages panel, verify a Yes response.
If the connection still cannot be made, start the Access Tester Console using the Trace Connection command mode and look for additional details in the connection log. Also, ask the OAM administrator of the OAM Server to review the policy server log.
Save Good Connection Details: From the File menu, click Save Configuration and enter a name for this configuration file (or use the default name, config.xml).
Before a user can access a resource, the Agent must first validate that the resource is protected. Using the Access Tester, you can act as the Agent to have the OAM Server validate whether or not the given URI is protected and communicate the response to the Access Tester, as described here.
You must enter required information for the resource you want to validate in the Access Tester Protected Resource URI panel, and then click the Validate button.
To minimize data entry, you can import long URIs that you have copied from a browser and then click the Import URI command button. The Tester parses the URI saved to the clipboard and populates the URI fields in the Access Tester.
Figure 10-5 illustrates the panel where you enter the URI details to validate that the resource is protected. When combined, the URI fields follow RFC notation. For example: http://oam_server1:7777/index.html
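The Import URI button performs this decomposition for you. As a rough shell equivalent, the following sketch splits an example URI into the four fields the Protected Resource URI panel uses; the parsing is a simplified illustration (it assumes an explicit port and a non-empty path), not the tool's actual logic:

```shell
# Decompose a URI into the Scheme, Host, Port, and Resource fields used
# by the Protected Resource URI panel. Simplified illustration only.
uri="http://oam_server1:7777/index.html"
scheme=${uri%%://*}      # everything before "://"
rest=${uri#*://}         # host:port/resource
hostport=${rest%%/*}     # host:port
host=${hostport%%:*}
port=${hostport##*:}
resource="/${rest#*/}"   # leading slash plus the path component
echo "$scheme $host $port $resource"
```

For the example URI this yields the values http, oam_server1, 7777, and /index.html, matching the fields described in Table 10-8.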
Table 10-8 describes the information needed to perform this validation.
Table 10-8 Protected Resource URI Panel Fields and Controls
Field or Control | Description |
---|---|
Scheme |
Enter http or https, depending on the communication security specified for the resource. Note: The Access Tester supports only http or https resources. You cannot use the Access Tester to test policies that protect custom non-http resources. |
Host |
Enter a valid host name for the resource. Note: Your <host:port> combination specified in the Access Tester must match one of the Host Identifiers defined in the OAM Administration Console. If the host identifier is not recognized, OAM cannot validate resource protection. |
Port |
Enter a valid port for the URI. Note: The <host:port> combination specified in the Access Tester must match one of the Host Identifiers as defined in the OAM Server. If the host identifier is not recognized, OAM cannot validate resource protection. |
Resource |
Enter the Resource component of the URI (/index.htm in the example). This resource should match a resource defined for an authentication and authorization policy in the OAM Administration Console. Note: If protected, the resource identifier that you provide here must match the one specified in an authorization policy in the OAM Administration Console. |
![]() |
Click this button to parse and import a URI that is saved on a clipboard. |
Operation |
Select the operational component of the URI from the list provided in the Access Tester. The OAM Server does not distinguish between different actions, however. Therefore, leaving this set to Get should suffice. |
Get Auth Scheme |
Check this box to request the OAM Server to return details about the Authentication Scheme that is used to secure the protected resource. If the URI is protected, this information is displayed in the Status Messages panel. |
Validate |
Click the Validate button to submit the request to the OAM Server. When the response is received, the Access Tester displays it in the Status Messages panel. |
![]() |
A green check mark appearing beside the Validate button indicates a "Yes" response; the resource is protected. The Status Messages panel provides the redirect URL for the resource and that credentials are expected. Note: If you checked the Get Auth Scheme box, the name and level of the Authentication Scheme that protects this resource are also provided in the Status Messages panel. |
Red circle |
A red circle appearing beside the Validate button indicates that the resource is not protected. A "No" response also appears in the Status Messages panel. |
You can capture each request and response pair to create a test case, and save multiple test cases to a script file that can be run later.
Use the following procedure to submit your resource information to the OAM Server and verify responses in the Status Messages panel.
Prerequisites
Establishing a Connection Between the Access Tester and the OAM Server
See Also:
"About the Protected Resource URI Panel"To confirm that a resource is protected
In the Access Tester Protected Resource URI panel, enter or import your own resource information (Table 10-8).
Click the Validate button to submit the request.
Review Access Tester output, including the relevant data about the resource such as how the resource is protected, level of protection, and so on.
Beside the Validate button, look for the green check mark indicating the resource is protected.
In the Status Messages panel, verify the redirect URL, authentication scheme, and that credentials are expected.
Capture the request and response to create a test case for use later, as described in "Creating and Managing Test Cases and Scripts".
Retain the URI in the Console to minimize data entry and server processing during subsequent tests.
Proceed to "Testing User Authentication from the Access Tester Console"
This topic provides the following information:
Before a user can access a resource, the Agent must validate the user's identity based on the defined authentication policy on the OAM Server. Using the Access Tester, you can act as the Agent to have the OAM Server authenticate a specific userID for the protected resource. All relevant authentication responses are considered during this policy evaluation.
Figure 10-6 illustrates the Access Tester panel where you enter the information needed to test authentication.
Table 10-9 describes the information you must provide.
Table 10-9 Access Tester User Identity Panel Fields and Controls
Field or Control | Description |
---|---|
IP Address |
Enter the IP Address of the user whose credentials are being validated. All Agents communicating with the OAM Server send the IP address of the end user. Default: The IP address that is filled in belongs to the computer from which the Access Tester is run. To test a policy that requires a real user IP address, replace the default IP address with the real IP address. |
User Name |
Enter the userID of the individual whose credentials are being validated. Note: The Access Tester enables the user name and password fields only when the resource is protected by an authentication scheme that requires those credentials; otherwise, these fields are disabled. |
Password |
Enter the password of the individual whose credentials are being validated. |
? |
Click this button to display the password in clear text within a popup window. |
User Certificate Store |
The file (in PEM format) containing the X.509 certificate of the user whose credentials should be authenticated. If the URI is protected by the X.509 Authentication Scheme, the Access Tester uses the PEM-formatted X.509 certificate as a credential instead of (or in addition to) the username and password. If the Authentication Scheme does not require an X.509 certificate, this field is disabled. Note: For certificate-based authentication to work, the OAM Server must be properly configured with root CA certificates and SSL keystore certificates. See Appendix E for details about securing communication between OAM 11g Servers and WebGates. |
... |
Click this button to browse the file system for the user certificate store path. |
Authenticate |
Click the Authenticate button to submit the request to the OAM Server and look for a response in the Status Messages panel. Note: The type of credentials supplied (username/password or X.509 certificate) must match the requirements of the authentication scheme that protects the URI. |
Authorize |
After the user's credentials are validated, you can click the Authorize button to submit the request for the resource to the OAM Server. Check the Status Messages panel for a response. |
Green check mark |
A green check mark appearing beside the Authenticate button indicates authentication success; the Status Messages panel also reports "yes" (authentication was successful) and provides the user DN and session ID. A green check mark appearing beside the Authorize button indicates authorization success; the Status Messages panel also reports "yes" (authorization was successful) and provides application domain details. |
Red circle |
A red circle appearing beside the Authenticate button indicates authentication failure; the Status Messages panel also reports "no" (authentication was not successful). A red circle appearing beside the Authorize button indicates authorization failure; the Status Messages panel also reports "no" (authorization was not successful). |
You can capture each request and response pair to create a test case, and save multiple test cases to a script file that can be run later.
Use the following procedure to submit the end user credentials to the OAM Server and verify authentication. All relevant authentication responses are considered during this policy evaluation.
Prerequisites
Validating Resource Protection from the Access Tester Console with URI information retained in the Console
See Also:
"About the User Identity Panel"To test user credential authentication
In the Access Tester User Identity panel, enter information for the user to be authenticated (Table 10-9).
Click the Authenticate button to submit the request.
Beside the Authenticate button, look for the green check mark indicating the user is authenticated.
Not Successful: Confirm that you entered the correct userID and password and try again. Also, check the OAM Administration Console for an active user session that you might need to end, as described in Chapter 12.
Capture the request and response to create a test case for use later, as described in "Creating and Managing Test Cases and Scripts".
Retain the URI and user identity information and proceed to "Testing User Authorization from the Access Tester Console".
Before a user can access a resource, the Agent must validate the user's permissions based on defined policies on the OAM Server. Using the Access Tester, you can act as the Agent to have the OAM Server validate whether or not the authenticated user identity can be authorized to access the resource.
Use the following procedure to verify the authenticated end user's authorization for the resource. All relevant authorization constraints and responses are considered during this policy evaluation.
Prerequisites
Testing User Authentication from the Access Tester Console with all information retained in the Console
See Also:
"About the User Identity Panel"Note:
Once the protected resource URI is confirmed and the user's identity is authenticated from the Access Tester, no further information is needed: simply click the Authorize button to submit the request. However, if you change the resource, you must start the sequence anew: validate, then authenticate, and then authorize.

To test user authorization
In the Access Tester User Identity panel, confirm the user is authenticated (Table 10-9).
In the Access Tester User Identity panel, click the Authorize button.
Beside the Authorize button, look for the green check mark indicating the user is authorized.
Not Successful: Confirm the authorization policy using the OAM Administration Console.
In the Status Messages panel (or execution log file), verify details about the test run.
Capture the request and response to create a test case for use later, as described in "Creating and Managing Test Cases and Scripts".
Proceed to:
To understand OAM Server performance you must know how well the OAM Server handles requests passed by the Agent. While there are many ways to expose a server's metrics, it is sometimes useful to expose server performance from the standpoint of the Agent. Using the Access Tester, you can do just that as described here.
Prerequisites
"Installing and Starting the Access Tester"
Task overview: Observing request latency includes
"Testing User Authentication from the Access Tester Console"
Check latency information in the execution logfile as shown here, as well as in other files generated during a test run. For example:
...
[2/3/10 11:03 PM][info] Summary statistics
[2/3/10 11:03 PM][info] Matched 4 of 4, avg latency 232ms vs 238ms
[2/3/10 11:03 PM][info] Validate: matched 2 of 2, avg latency 570ms vs 578ms
[2/3/10 11:03 PM][info] Authenticate: matched 1 of 1, avg latency 187ms vs 187ms
[2/3/10 11:03 PM][info] Authorize: matched 1 of 1, avg latency 172ms vs 188ms
...
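Summary lines like these can be scraped for trend tracking across test runs. The sketch below infers the line format from this example only; it is not a documented log contract, and the field names are assumptions:

```python
import re

# Matches per-request-type lines such as:
#   [2/3/10 11:03 PM][info] Validate: matched 2 of 2, avg latency 570ms vs 578ms
PAT = re.compile(r"(\w+): matched (\d+) of (\d+), avg latency (\d+)ms vs (\d+)ms")

def parse_latency_summary(log_text):
    """Return {request_type: (matched, total, first_avg_ms, second_avg_ms)}."""
    results = {}
    for match in PAT.finditer(log_text):
        kind, matched, total, first, second = match.groups()
        results[kind] = (int(matched), int(total), int(first), int(second))
    return results
```

The two millisecond values are captured as-is because the excerpt does not state which of the pair is the current run and which is the baseline.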
Proceed to:
Test management refers to the creation of repeatable tests that can be executed at any time by an individual administrator or system. Quick spot checks are very useful and effective in troubleshooting current issues. However, a more predictable and repeatable approach to validating server and policy configuration is often necessary. This approach can include testing OAM Server configuration for regressions after a product revision, or during a policy development and QA cycle.
To be useful, such tests must allow for multiple use cases to be executed as a group. Once the test scripts have been designed and validated as correct, replaying the tests against the OAM Server helps identify regressions in a policy configuration.
This section provides the information you need to perform test management in the following topics:
A test case is created from the request sent to, and response data received from, the OAM Server using the Access Tester. Among other data elements, a test case includes request latency and other identifying information that enables analysis and comparison of old and new test cases.
Once captured, the test case can be replayed without new input, and then new results can be compared with old results. If the old results are marked as "known good" then deviations from those results constitute failed test cases.
The test case workflow is illustrated by Figure 10-7.
Task overview: Creating and managing a test case
From the Access Tester Console, you can connect to the OAM Server and manually conduct individual tests. You can save the request to the capture queue after a request is sent and the response is received from the OAM Server. You can continue capturing additional test cases before generating a test script and clearing the capture queue. If you exit the Access Tester before saving the capture queue, you are asked if the test cases should be saved to a script before exiting. Oracle recommends that you do not clear the queue until all your test cases have been captured.
Once you have the test script, you can run it from either the Access Tester Console or from the command line.
You can save each test case to a capture queue after sending the request from the Access Tester to the OAM Server and receiving the response. You can capture as many individual test cases as you need before generating a test script that will automate running the group of test cases. For instance, the following outlines three test cases that must be captured individually:
A validation request and response
An authentication request and response
An authorization request and response
Table 10-10 describes the location of the capture options.
Table 10-10 Access Tester Capture Request Options
Location | Description |
---|---|
Test menu Capture last "..." request |
Select this command from the Test menu to add the last request issued and results received to the capture queue (for inclusion in a test script later). |
Tool bar command button |
Select this command button from the tool bar to add the last request issued and results received to the capture queue (for inclusion in a test script later). |
If you exit the Access Tester before saving the capture queue, you are asked if the test cases should be saved to a script before exiting. Do not clear the Access Tester capture queue until all your test cases have been captured.
To capture one or more test cases
Initiate a request from the Access Tester Console, as described in "Testing Connectivity and Policies from the Access Tester Console".
After receiving the response, click the Capture last "..." request command button in the tool bar (or choose it from the Test menu).
Confirm the capture in the Status Messages panel and note the Capture Queue test case count at the bottom of the Console, as shown here.
Repeat steps 1, 2, and 3 to capture in the queue each test case that you need for your test script.
Proceed to "Generating an Input Test Script".
A test script is a collection of individual test cases that were captured using the Access Tester Console. When individual test cases are grouped together, it becomes possible to automate test coverage to validate policy configuration for a specific application or site.
You can create a test script to be used as input to the Access Tester and drive automated processing of multiple test cases. The Generate Script option enables you to create an XML file test script and clear the capture queue. If you exit the Access Tester before saving the capture queue, you are asked if the test cases should be saved to a script before exiting.
Note:
Do not clear the capture queue until you have captured all the test cases you want to include in the script.

Such a script must follow these rules:
Allows replay by a person or an automated system
Allows replay against different policy servers without changing the script, so that test scripts can be shared
Allows comparison of test execution results against "Known Good" results
Following are the locations of the Generate Script command.
Table 10-11 Generate Script Command
Location of the Command | Description |
---|---|
Test menu Generate Script |
Select Generate Script from the Test menu to initiate creation of the script containing your captured test cases. |
Tool bar command button |
Select the Generate Script command button from the tool bar to initiate creation of the script containing your captured test cases. After you specify or select a name for your script, you are asked if the capture queue should be cleared. Do not clear the capture queue until all your test cases are saved to a script. |
Prerequisites
To record a test script containing captured test cases
Perform and capture each request that you want in the script, as described in "Capturing Test Cases".
Click the Generate Script command button in the tool bar (or choose it from the Test menu) to include all captured test cases.
In the new dialog box, select or enter the name of your new XML script file and then click Save.
Click Yes to overwrite an existing file (or No to dismiss the window and give the file a new name).
In the Save Warning dialog box, click No to retain the capture queue and continue adding test cases to your script (or click Yes to clear the queue of all test cases).
Confirm the location of the test script before you exit the Access Tester.
Personalize the test script to include details such as who, when, and why the script was developed, as described next.
This section describes how to personalize and customize a test script.
The control block of a test script is used to tag the script and specify information to be used during the execution of a test. You might want to include details about who created the script and when and why the script was created. You might also want to customize the script using one or more control parameters.
The Access Tester provides command line "control" parameters (test name, test number, and so on) that change processing of a script without changing the script itself. This enables you to configure test runs without having to change "known good" input test scripts. Table 10-12 describes the control elements and how to customize them.
Table 10-12 Test Script Control Parameters
Control Parameter | Description |
---|---|
ignorecontent="false" |
Ignores differences in the Content section of the test case when comparing the original OAM Server response to the current response. The default is to compare Content sections. This parameter can be overridden by a command line property when running in command line mode. Default: false (compare Content sections). Values: true or false. In command line mode, use Ignorecontent=true to override the value specified in the Control section of the input script. |
testname="oamtest" |
Specifies a prefix to add to file names in the "results bundle" as described in the previous section. In command line mode, use Testname=name to override the value specified in the Control section. |
Configfile="config.xml" |
Specifies the absolute path to a configuration XML file that was previously created by the Access Tester. In command line mode, this file is used by the Access Tester to locate connection details to establish a server connection. |
Numthreads Reserved for future use |
indicates the number of threads to be started by the Access Tester to run multiple copies of the test script. This supports stress testing of the OAM Server. Default: 1 |
Numiterations Reserved for future use |
indicates the number of iterations of the test should be performed by the Access Tester. This provides for longevity testing of the OAM Server. Default: 1 |
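The precedence rule in Table 10-12 (command line properties override the script's Control section) can be sketched as a simple merge. This is an illustration of the documented behavior, not the Access Tester's own code; treating the command line property names as case-insensitive is a simplifying assumption based on the mixed capitalization shown above:

```python
def effective_controls(script_controls, cli_overrides):
    """Merge the script's Control attributes with command line properties.

    Command line values take precedence over the Control section of the
    input script, per the control parameter rules.
    """
    merged = dict(script_controls)
    # Normalize property names (e.g. Ignorecontent=true) to the script's
    # lowercase attribute names before applying the override.
    merged.update({key.lower(): value for key, value in cli_overrides.items()})
    return merged
```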
Prerequisites
Generating an Input Test Script
To customize a test script
Locate and open the test script that was generated by the Access Tester.
Add any details that you need to customize or personalize the script.
Save the file and proceed to "Executing a Test Script".
Once a test script has been created against a "Known Good" policy configuration and marked as "Known Good", it is important to drive the Access Tester using the script rather than specifying each test manually using the Console. This section provides the following topics:
You can interactively execute test scripts from within the Access Tester Console, or use automated test runs performed by command scripts. Automated test runs can be scheduled by the operating system or a harness such as Apache JMeter, and executed without manual intervention. Other than lack of human input in command line mode, the two execution modes are identical.
Note:
A script such as .bat (Windows) or .sh (Unix) executes a test script in command line mode. Once a test script is created, it can be executed using either the Run Script menu command or the Access Tester command line.

Table 10-13 describes the commands to execute a test script.
Table 10-13 Run Test Script Commands
Location | Description |
---|---|
Test menu Run Script |
Select the Run Script command from the Test menu to begin running a saved test script against the current policy server. The Status message panel is populated with the execution status as the script progresses. |
Tool bar command button |
Select the Run Script command button from the tool bar to begin running a saved test script against the current policy server. The Status message panel is populated with the execution status as the script progresses. |
Command line mode |
A script such as .bat (Windows) or .sh (Unix) executes a test script in command line mode. Once a test script is created, it can be executed using either the Run Script menu command or the Access Tester command line. |
The following overview describes how the Access Tester operates when running a test. Other than lack of human input in command line mode, the two execution modes are identical.
Process overview: Access Tester behavior when running a test script
The Access Tester loads the input xml file.
In command line mode, the Access Tester opens the configuration XML file defined within the input test script's Control element.
The Access Tester connects to the primary and secondary OAM Proxy using information in the Server Connection panel of the Console.
In command line mode, the Access Tester uses information in the Connection element of the configuration XML file.
In command line mode, the Access Tester checks the Control elements in the input script XML file to ensure none have been overwritten on the command line (command line values take precedence).
For each original test case defined in the script, the Access Tester:
Creates a new target test case.
Sends the original request to the OAM Server and collects the response.
Makes the following comparisons:
Compares the new response to the original response.
Compares response codes and marks as "mismatched" any new target test case where response codes differ from the original test case. For instance, if the original Validate returned "Yes", and now returns "No", a mismatch is marked.
When response codes are identical and the "ignorecontent" control parameter is "false", the Access Tester compares Content (the name of the Authentication scheme or post-authorization actions that are logged after each request). If Content sections differ, the new target test case is marked "mismatched".
Collects the new elapsed time and stores it in the target test case.
Builds a new target test case containing the full state of the last server request and the same unique ID (UUID) as the original test case.
Updates the internal statistics table with statistics for the target test case (request type, elapsed time, mismatched, and so on).
After completing all the input test cases, the Access Tester:
Displays summary results.
Obtains and combines the testname and testnumber, and generates a name for the "results bundle" (three files whose names start with <testname>_<testnumber>).
Note:
Shell scripts can automate generating the bundle by providing testname and testnumber command line parameters.

Obtains testname from the command line parameter. If not specified on the command line, the testname attribute of the input script's Control block is used.
Obtains testnumber from the command line parameter. If not specified, testnumber defaults to a 7-character numeric string based on the current local time: 2-character minutes, 2-character seconds, and 3-character hundredths.
Generates the "results bundle": three files whose names start with <testname>_<testnumber>:
The target XML script contains the new test cases: <testname>_<testnumber>_results.xml
The statistics XML file contains a summary and detailed statistics of the entire test run, plus those test cases marked as "mismatched": <testname>_<testnumber>_stats.xml
The execution log file contains information from the Status Message panel: <testname>_<testnumber>_log.log
In command line mode, the Access Tester exits with the exit code as described in "About the Access Tester Command Line Mode".
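The default testnumber format and the results-bundle naming described in the steps above can be sketched as follows. This is a minimal illustration of the documented format, not the Access Tester's own code, and the helper names are hypothetical:

```python
import datetime

def default_testnumber(now=None):
    """Default testnumber: a 7-character numeric string from the local time,
    built as 2-character minutes + 2-character seconds + 3-character hundredths."""
    now = now or datetime.datetime.now()
    return f"{now.minute:02d}{now.second:02d}{now.microsecond // 10000:03d}"

def bundle_names(testname, testnumber):
    """Names of the three files in the results bundle."""
    prefix = f"{testname}_{testnumber}"
    return [f"{prefix}_results.xml", f"{prefix}_stats.xml", f"{prefix}_log.log"]
```

For example, a run at 11:03:02 PM would yield a testnumber beginning "0302", and with testname "oamtest" the bundle files would start with "oamtest_0302".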
Prerequisites
Generating an Input Test Script
To run a test script
Confirm the location of the saved test script, as described in "Generating an Input Test Script".
Submit the test script for processing using one of the following methods:
From the Access Tester Console, click the Run Script command button in the tool bar (or select Run Script from the Test menu), then follow the prompts and observe messages in the Status Message panel as the script executes.
From the command line, specify your test script with the desired system properties, as described in "Starting the Access Tester with System Properties For Use in Command Line Mode".
java -Dscript.scriptfile="\tests\script.xml" -Dcontrol.ignorecontent="true" -jar oamtest.jar
Review the log and output files and perform additional analysis after the Access Tester compares newly generated results with results captured in the input script, as described in "Evaluating Scripts, Log File, and Statistics".
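A launcher wrapper can assemble the command line shown above. The sketch below only builds the argument list; the jar location and the use of a relative script path are assumptions taken from this chapter's examples, and running it still requires oamtest.jar to be present:

```python
def build_oamtest_command(script_file, ignore_content=None, jar_path="oamtest.jar"):
    """Assemble the java command line for a command-line-mode test run.

    System properties passed here override the corresponding values in the
    input script's Control block.
    """
    cmd = ["java", f"-Dscript.scriptfile={script_file}"]
    if ignore_content is not None:
        # Boolean -> "true"/"false" as expected by -Dcontrol.ignorecontent
        cmd.append(f"-Dcontrol.ignorecontent={str(ignore_content).lower()}")
    cmd.extend(["-jar", jar_path])
    return cmd
```

The resulting list can be handed to subprocess.run(), after which the process exit code can be checked as described in "About the Access Tester Command Line Mode".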
This section provides the following information:
At the end of a test run, a "results bundle" is generated containing three documents:
Target script: An XML document containing new test cases
Execution log: A text file containing the messages displayed during script execution
Execution statistics: An XML document containing test metrics and a list of mismatched elements
The matching pair of test cases in the original and target scripts shares the test case ID. This ID is represented by a UUID value, which makes it possible to compare individual test cases in the original script with those in the target script. For more information, see "About the Generated Input Test Script".
The statistics document contains the summary and detail statistics, as well as a list of test cases that did not match. The detailed statistics can be used for further analysis or to keep a historical trail of results. The summary statistics are the same statistics displayed at the end of the test run and can be used to quickly assess the state of a test run. The list of mismatched test cases in the statistics document contains the IDs of test cases that triggered a mismatch, together with the reason for each mismatch, as seen in Table 10-14.
Table 10-14 Mismatched Results Reasons in the Statistics Document
Reason for a MisMatch | Description |
---|---|
Result |
The test cases did not match because of the difference in OAM Server response codes (Yes versus No). |
Content |
The test cases did not match because of the differences in the specific data values that were returned by the OAM Server. The specific values from the last test run that have triggered the mismatch are included. |
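The comparison that produces these "Result" and "Content" reasons can be approximated offline by diffing an original script against its generated results script, matching test cases by UUID. The sketch below is not the Access Tester's implementation; it assumes the schema namespace and element layout shown in the examples later in this chapter:

```python
import xml.etree.ElementTree as ET

# Namespace taken from the generated script examples in this chapter.
NS = {"t": "http://xmlns.oracle.com/idm/oam/oamtest/schema"}

def load_cases(script_xml):
    """Map each case UUID to its response code and Content lines."""
    cases = {}
    for case in ET.fromstring(script_xml).findall(".//t:case", NS):
        resp = case.find("t:response", NS)
        content = tuple((line.get("type"), line.text)
                        for line in resp.findall("t:content/t:line", NS))
        cases[case.get("uuid")] = (resp.get("code"), content)
    return cases

def find_mismatches(original_xml, target_xml, ignore_content=False):
    """Return {uuid: 'Result' | 'Content'} for mismatched test cases."""
    original, target = load_cases(original_xml), load_cases(target_xml)
    reasons = {}
    for uuid, (code, content) in original.items():
        t_code, t_content = target.get(uuid, (None, ()))
        if code != t_code:
            reasons[uuid] = "Result"      # response codes differ (Yes vs No)
        elif not ignore_content and content != t_content:
            reasons[uuid] = "Content"     # returned data values differ
    return reasons
```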
This is the output file that is saved using the Save Configuration command on the File menu; the default file name is config.xml. This connection configuration file includes details that were specified in the Access Tester Console, Server Connection panel.
Note:
An input test script file is also generated as described in the following topic. The name of the configuration file is used in the input test script to ensure that running the Access Tester in command line mode picks up connection information defined in the connection file.

Example 10-1 Connection Configuration File
<?xml version="1.0" encoding="UTF-8" standalone="yes"?> <oamtestconfig xmlns="http://xmlns.oracle.com/idm/oam/oamtest/schema" version="1.0"> <connection timeout="30000" minnconn="1" mode="open"> <agent password="00030d05101b050c42" name="agent1"/> <keystore rootstore="" keystore_password="" keystore="" global_passphrase=""/> <primary> <server maxconn="1" port="2100" addr="oam_server1"/> </primary> <secondary> <server maxconn="1" port="0" addr=""/> </secondary> </connection> <uri getauthscheme="true"> <scheme>http</scheme> <host>oam_server1</host> <port>7777</port> <resource>/index.html</resource> <operation>Get</operation> </uri> <identity> <id>admin1</id> <password>00030d05101b050c42</password> <ipaddr>111.222.3.4</ipaddr> </identity> </oamtestconfig>
The input test script is generated by using the Access Tester and capturing your own test cases. The "configfile" attribute of the "Control" element is updated after creation to specify the connection configuration file to be used in command line mode for establishing a connection to the OAM Server.
Example 10-2 Generated Input Test Script
<?xml version="1.0" encoding="UTF-8" standalone="yes"?> <oamtestscript xmlns="http://xmlns.oracle.com/idm/oam/oamtest/schema" version="1.0"> <history description="Manually generated using agent 'agent1'" createdon="2010-02-03T22:28:00.468-05:00" createdby="test_user"/> <control numthreads="1" numiterations="1" ignorecontent="false" testname="samplerun1" configfile="config.xml"/> <cases numcases="4"> <case uuid="465a4fda-d814-4ab7-b81b-f3f1cd72bbc0"> <request code="Validate"> <uri getauthscheme="true"> <scheme>http</scheme> <host>oam_server1</host> <port>7777</port> <resource>/index.html</resource> <operation>Get</operation> </uri> </request> <response elapsed="984" code="Yes"> <comment></comment> <status>Major code: 4(ResrcOpProtected) Minor code: 2(NoCode)</status> <content> <line type="auth.scheme.id">LDAPScheme</line> <line type="auth.scheme.level">2</line> <line type="auth.scheme.required.creds">2</line> <line type="auth.scheme.redirect.url">http://dadvmh0172.us.oracle.com:14100/oam/server/</line> </content> </response> </case> <case uuid="009b44e3-1a94-4bfc-a0c3-84a38a9e0f2a"> <request code="Authenticate"> <uri getauthscheme="true"> <scheme>http</scheme> <host>oam_server1</host> <port>7777</port> <resource>/index.html</resource> <operation>Get</operation> </uri> <identity> <id>weblogic</id> <password>00030d05101b050c42</password> <ipaddr>192.168.1.8</ipaddr> </identity> </request> <response elapsed="187" code="Yes"> <comment></comment> <status>Major code: 10(CredentialsAccepted) Minor code: 2(NoCode)</status> <content> <line type="user.dn">cn=weblogic,dc=us,dc=oracle,dc=com</line> </content> </response> </case> <case uuid="84fe9b06-86d1-47df-a399-6311990743c3"> <request code="Authorize"> <uri getauthscheme="true"> <scheme>http</scheme> <host>oam_server1</host> <port>7777</port> <resource>/index.html</resource> <operation>Get</operation> </uri> <identity> <id>weblogic</id> <password>00030d05101b050c42</password> <ipaddr>192.168.1.8</ipaddr> </identity> 
</request> <response elapsed="188" code="Yes"> <comment></comment> <status>Major code: 8(Allow) Minor code: 2(NoCode)</status> <content/> </response> </case> <case uuid="61579e47-5532-42c3-bbc7-a00828256bf4"> <request code="Validate"> <uri getauthscheme="false"> <scheme>http</scheme> <host>oam_server1</host> <port>7777</port> <resource>/index.html</resource> <operation>Get</operation> </uri> </request> <response elapsed="172" code="Yes"> <comment></comment> <status>Major code: 4(ResrcOpProtected) Minor code: 2(NoCode)</status> <content/> </response> </case> </cases> </oamtestscript>
This example was generated by running the Access Tester in command line mode and specifying the script.xml file as input to execute the 4 captured test cases:
java -Dscript.scriptfile="script.xml" -jar oamtest.jar
Notice the various sections in Example 10-3. As shown in the execution log, this test run found no mismatches, and shows that 4 out of 4 requests matched.
Example 10-3 Output File Generated During a Test Run
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<oamtestscript xmlns="http://xmlns.oracle.com/idm/oam/oamtest/schema" version="1.0">
  <history description="Generated from script 'script.xml' using agent 'agent1'"
           createdon="2010-02-03T23:03:02.171-05:00" createdby="test_user"/>
  <control numthreads="1" numiterations="1" ignorecontent="false"
           testname="oamtest" configfile=""/>
  <cases numcases="4">
    <case uuid="465a4fda-d814-4ab7-b81b-f3f1cd72bbc0">
      <request code="Validate">
        <uri getauthscheme="true">
          <scheme>http</scheme>
          <host>oam_server1</host>
          <port>7777</port>
          <resource>/index.html</resource>
          <operation>Get</operation>
        </uri>
      </request>
      <response elapsed="969" code="Yes">
        <comment></comment>
        <status>Major code: 4(ResrcOpProtected) Minor code: 2(NoCode)</status>
        <content>
          <line type="auth.scheme.id">LDAPScheme</line>
          <line type="auth.scheme.level">2</line>
          <line type="auth.scheme.required.creds">2</line>
          <line type="auth.scheme.redirect.url">http://dadvmh0172.us.oracle.com:14100/oam/server/</line>
        </content>
      </response>
    </case>
    <case uuid="009b44e3-1a94-4bfc-a0c3-84a38a9e0f2a">
      <request code="Authenticate">
        <uri getauthscheme="true">
          <scheme>http</scheme>
          <host>oam_server1</host>
          <port>7777</port>
          <resource>/index.html</resource>
          <operation>Get</operation>
        </uri>
        <identity>
          <id>weblogic</id>
          <password>00030d05101b050c42</password>
          <ipaddr>111.222.3.4</ipaddr>
        </identity>
      </request>
      <response elapsed="187" code="Yes">
        <comment></comment>
        <status>Major code: 10(CredentialsAccepted) Minor code: 2(NoCode)</status>
        <content>
          <line type="user.dn">cn=weblogic,dc=us,dc=oracle,dc=com</line>
        </content>
      </response>
    </case>
    <case uuid="84fe9b06-86d1-47df-a399-6311990743c3">
      <request code="Authorize">
        <uri getauthscheme="true">
          <scheme>http</scheme>
          <host>oam_server1</host>
          <port>7777</port>
          <resource>/index.html</resource>
          <operation>Get</operation>
        </uri>
        <identity>
          <id>weblogic</id>
          <password>00030d05101b050c42</password>
          <ipaddr>111.222.3.4</ipaddr>
        </identity>
      </request>
      <response elapsed="172" code="Yes">
        <comment></comment>
        <status>Major code: 8(Allow) Minor code: 2(NoCode)</status>
        <content/>
      </response>
    </case>
    <case uuid="61579e47-5532-42c3-bbc7-a00828256bf4">
      <request code="Validate">
        <uri getauthscheme="false">
          <scheme>http</scheme>
          <host>oam_server1</host>
          <port>7777</port>
          <resource>/index.html</resource>
          <operation>Get</operation>
        </uri>
      </request>
      <response elapsed="171" code="Yes">
        <comment></comment>
        <status>Major code: 4(ResrcOpProtected) Minor code: 2(NoCode)</status>
        <content/>
      </response>
    </case>
  </cases>
</oamtestscript>
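Because the output file is plain XML in the schema namespace shown above, it can be inspected programmatically as well as visually. The following sketch uses Python's standard-library ElementTree to list the request type, response code, and elapsed time for each test case; the helper name `summarize_cases` and the file name `target.xml` are illustrative, not part of the Access Tester.

```python
import xml.etree.ElementTree as ET

# Namespace declared on the <oamtestscript> root element
NS = {"t": "http://xmlns.oracle.com/idm/oam/oamtest/schema"}

def summarize_cases(path):
    """Return (request code, response code, elapsed ms) for each test case."""
    root = ET.parse(path).getroot()
    results = []
    for case in root.findall(".//t:case", NS):
        request = case.find("t:request", NS)
        response = case.find("t:response", NS)
        results.append((request.get("code"),
                        response.get("code"),
                        int(response.get("elapsed"))))
    return results
```

For the sample output above, each tuple would show a "Yes" response code, confirming that every Validate, Authenticate, and Authorize request succeeded.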
The statistics file (_stats.xml) is generated, together with the target output script, during the test run identified in the execution log. The script.xml file was used as input to execute the 4 captured test cases. The test run found no mismatches and shows that 4 out of 4 requests matched.
A sample statistics document is shown in Example 10-4. Notice the various sections that provide statistics for this run, which you can compare against statistics from an earlier "known good" run.
Example 10-4 Sample Statistics Document
<oamteststats xmlns="http://xmlns.oracle.com/idm/oam/oamtest/schema" version="1.0">
  <history description="Generated from script 'script.xml' using agent 'agent1'"
           createdon="2010-02-03T23:03:02.171-05:00" createdby="test_user"/>
  <summary>
    <total>
      <nummatched>4</nummatched>
      <numtotal>4</numtotal>
      <avgelapsedsource>238</avgelapsedsource>
      <avgelapsedtarget>232</avgelapsedtarget>
    </total>
    <validate>
      <nummatched>2</nummatched>
      <numtotal>2</numtotal>
      <avgelapsedsource>578</avgelapsedsource>
      <avgelapsedtarget>570</avgelapsedtarget>
    </validate>
    <authenticate>
      <nummatched>1</nummatched>
      <numtotal>1</numtotal>
      <avgelapsedsource>187</avgelapsedsource>
      <avgelapsedtarget>187</avgelapsedtarget>
    </authenticate>
    <authorize>
      <nummatched>1</nummatched>
      <numtotal>1</numtotal>
      <avgelapsedsource>188</avgelapsedsource>
      <avgelapsedtarget>172</avgelapsedtarget>
    </authorize>
  </summary>
  <detail>
    <source>
      <validate>
        <yes>2</yes>
        <no>0</no>
        <error>0</error>
        <mismatch>0</mismatch>
        <elapsed>1156</elapsed>
      </validate>
      <authenticate>
        <yes>1</yes>
        <no>0</no>
        <error>0</error>
        <mismatch>0</mismatch>
        <elapsed>187</elapsed>
      </authenticate>
      <authorize>
        <yes>1</yes>
        <no>0</no>
        <error>0</error>
        <mismatch>0</mismatch>
        <elapsed>188</elapsed>
      </authorize>
    </source>
    <target>
      <validate>
        <yes>2</yes>
        <no>0</no>
        <error>0</error>
        <mismatch>0</mismatch>
        <elapsed>1140</elapsed>
      </validate>
      <authenticate>
        <yes>1</yes>
        <no>0</no>
        <error>0</error>
        <mismatch>0</mismatch>
        <elapsed>187</elapsed>
      </authenticate>
      <authorize>
        <yes>1</yes>
        <no>0</no>
        <error>0</error>
        <mismatch>0</mismatch>
        <elapsed>172</elapsed>
      </authorize>
    </target>
  </detail>
  <mismatch numcases="0"/>
</oamteststats>
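Because the statistics document uses the same schema namespace, the overall counters under summary/total can be read with a few lines of standard-library Python. The following sketch is illustrative only: the helper name `totals` is not part of the Access Tester, and the file path is whatever _stats.xml name your run generated. A simple check that nummatched equals numtotal flags any mismatch against the "known good" run.

```python
import xml.etree.ElementTree as ET

# Namespace declared on the <oamteststats> root element
NS = {"s": "http://xmlns.oracle.com/idm/oam/oamtest/schema"}

def totals(path):
    """Extract the overall counters (nummatched, numtotal, average
    latencies) from the summary/total section of a _stats.xml file."""
    total = ET.parse(path).getroot().find("s:summary/s:total", NS)
    return {child.tag.split("}")[1]: int(child.text) for child in total}
```

For the sample document above, `totals(...)` would report 4 matched of 4 total, with average elapsed times of 238 ms (source) and 232 ms (target).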
This sample execution log was generated together with the target output script during a test run using script.xml to execute 4 test cases. The test run found no mismatches, and shows that 4 out of 4 requests matched.
As you review this example, notice that the information it provides is the same as the information displayed in the Status Messages panel of the Access Tester. The log records the test cases, the test name, the connection configuration file, the agent name, the connection status, the request validation status, the authentication scheme, the redirect URL, the expected credentials, the authentication status and user DN, the session ID, the authorization status, the final validation status, and summary statistics. Notice also that the target script and statistics document were generated by this run.
Example 10-5 Execution Log
[2/3/10 11:02 PM][info] Setting up to run script 'script.xml'
[2/3/10 11:02 PM][info] Loading test cases and control parameters from script
[2/3/10 11:02 PM][info] Loaded 4 cases
[2/3/10 11:02 PM][info] Control data for this test run:
[2/3/10 11:02 PM][info] Test name : 'samplerun1'
[2/3/10 11:02 PM][info] Configuration file : 'config.xml'
[2/3/10 11:02 PM][info] Ignore content : 'false'
[2/3/10 11:02 PM][info] Loading server configuration from file
[2/3/10 11:02 PM][info] Loaded server configuration
[2/3/10 11:02 PM][info] Connecting to server as agent 'oam_agent1'
[2/3/10 11:03 PM][info][request] Connect : Yes
...
[2/3/10 11:03 PM][info] Test 'samplerun1' will process 4 cases
[2/3/10 11:03 PM][info][request] Validate : Yes
[2/3/10 11:03 PM][info] Authentication scheme : LDAPScheme, level : 2
[2/3/10 11:03 PM][info] Redirect URL : http://oam_server1.us.company.com:2100/server/
[2/3/10 11:03 PM][info] Credentials expected: 0x01 (password)
[2/3/10 11:03 PM][info][request] Authenticate : Yes
[2/3/10 11:03 PM][info] User DN : cn=admin1,dc=us,dc=company,dc=com
[2/3/10 11:03 PM][info] Session ID : -1
[2/3/10 11:03 PM][info][request] Authorize : Yes
[2/3/10 11:03 PM][info][request] Validate : Yes
[2/3/10 11:03 PM][info] Summary statistics
[2/3/10 11:03 PM][info] Matched 4 of 4, avg latency 232ms vs 238ms
[2/3/10 11:03 PM][info] Validate: matched 2 of 2, avg latency 570ms vs 578ms
[2/3/10 11:03 PM][info] Authenticate: matched 1 of 1, avg latency 187ms vs 187ms
[2/3/10 11:03 PM][info] Authorize: matched 1 of 1, avg latency 172ms vs 188ms
[2/3/10 11:03 PM][info] Generated target script 'samplerun1_0302171__target.xml'
[2/3/10 11:03 PM][info] Generated statistics log 'samplerun1_0302171__stats.xml'
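When test runs are automated in command-line mode, the overall "Matched X of Y" summary line can be pulled from the execution log with a small script instead of reading the log by eye. The sketch below is an assumption, not an Access Tester utility: the helper name `parse_summary` is hypothetical, and the regular expression is written against the summary line format shown in Example 10-5.

```python
import re

# Matches the overall summary line, e.g.
# "Matched 4 of 4, avg latency 232ms vs 238ms"
SUMMARY = re.compile(r"Matched (\d+) of (\d+), avg latency (\d+)ms vs (\d+)ms")

def parse_summary(log_text):
    """Return (matched, total, target_ms, source_ms) from the first
    overall summary line, or None if the log has no summary."""
    m = SUMMARY.search(log_text)
    return tuple(int(g) for g in m.groups()) if m else None
```

A wrapper script could fail an automated run whenever the matched count is less than the total, signaling a policy or connectivity regression.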