This documentation is for WSO2 API Manager 1.10.0.

Publishing API Runtime Statistics Using RDBMS


This section explains how to set up WSO2 Data Analytics Server (WSO2 DAS) with the RDBMS client to collect and analyze runtime statistics from the API Manager, using the WSO2 API Manager UI.


From API Manager 1.10.x onwards, you can fetch summarized data using two clients:

  1. The REST client, which fetches data directly from WSO2 DAS.
  2. The RDBMS client, which fetches data from the RDBMS.

By default, the API Manager is configured to use the REST client. If you want to configure the API Manager and DAS without the RDBMS, see Publishing APIM Runtime Statistics Using REST Client. To use the RDBMS client, select the client in the <APIM_HOME>/repository/conf/api-manager.xml file, as described in the Configuring WSO2 API Manager section below.

Configuring WSO2 DAS

  1. Download WSO2 Data Analytics Server.
  2. If the API Manager and WSO2 DAS run on the same machine, open the <DAS_HOME>/repository/conf/carbon.xml file and increase the default server port of DAS by setting the offset value as follows:

    Code Block
    <Offset>3</Offset>

    This increments all ports used by the server by 3, which means the WSO2 DAS server will run on HTTPS port 9446. The port offset increments the default ports by a given value and avoids possible port conflicts when multiple WSO2 products run on the same host.

  3. Define the datasource declaration according to your RDBMS in the <DAS_HOME>/repository/conf/datasources/master-datasources.xml file. 


    This database is used to store the summarized data after WSO2 DAS completes the analysis. WSO2 API Manager later uses this database to fetch the summary data and display it on the API Manager dashboard. A MySQL database is used here as an example; however, it is also possible to configure it with H2, Oracle, etc. Note that you should always use WSO2AM_STATS_DB as the datasource name.

    The auto-commit option should be disabled when working with WSO2 DAS. Set this in the JDBC URL or by adding <defaultAutoCommit>false</defaultAutoCommit> under the datasource <configuration> element, as shown below:

    Code Block
    <datasource>
        <name>WSO2AM_STATS_DB</name>
        <description>The datasource used for setting statistics to API Manager</description>
        <jndiConfig>
            <name>jdbc/WSO2AM_STATS_DB</name>
        </jndiConfig>
        <definition type="RDBMS">
            <configuration>
                <url>jdbc:mysql://localhost:3306/stats_db</url>
                <username>db_username</username>
                <password>db_password</password>
                <driverClassName>com.mysql.jdbc.Driver</driverClassName>
                <validationQuery>SELECT 1</validationQuery>
                <defaultAutoCommit>false</defaultAutoCommit>
            </configuration>
        </definition>
    </datasource>

Configuring the MySQL database


  1. Similar to WSO2 Business Activity Monitor (BAM), WSO2 DAS does not automatically create the table structure in the database; you must create it manually. If you are using MySQL as the database, download the MySQL JDBC driver and paste it into the <DAS_HOME>/repository/components/lib directory. 
  2. Import the schema declaration script that matches your RDBMS from the <APIM_HOME>/dbscript/stat/sql folder into the above database. For example, use mysql.sql to create the schema in a MySQL database. 
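
    The two steps above can be sketched as commands; the driver version, database name, and credentials below are placeholders for your own setup:

    Code Block
    # Copy the MySQL JDBC driver into WSO2 DAS (driver version is an example)
    cp mysql-connector-java-5.1.34-bin.jar <DAS_HOME>/repository/components/lib/

    # Create the statistics schema in the target database
    mysql -u db_username -p stats_db < <APIM_HOME>/dbscript/stat/sql/mysql.sql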

Uploading the API Manager analytics file

WSO2 DAS uses SparkSQL to analyze the data. All definitions about the data published from WSO2 API Manager and the way it should be analyzed using Spark are shipped to WSO2 DAS as a .car file. 

  1. Download the API Manager analytics (.car) file.
  2. Start the WSO2 DAS server and log in to the Management Console.
  3. Navigate to the Carbon Applications section under Manage and click Add.
  4. Point to the downloaded file and upload.
    Warning

    Similar to WSO2 Business Activity Monitor (BAM), WSO2 DAS does not automatically create the table structure in the database. Therefore, it needs to be done manually, as -Dsetup does not work.

Configuring WSO2 API Manager

  1. Download WSO2 API Manager.
  2. Open the <APIM_HOME>/repository/conf/api-manager.xml file and set the statistics client to RDBMS by uncommenting the RDBMS client entry and commenting out the REST client entry. 

    Code Block
    <!-- For APIM implemented Statistic client for DAS REST API -->
    <!--<StatisticClientProvider>org.wso2.carbon.apimgt.usage.client.impl.APIUsageStatisticsRestClientImpl</StatisticClientProvider>-->
    <!-- For APIM implemented Statistic client for RDBMS -->
    <StatisticClientProvider>org.wso2.carbon.apimgt.usage.client.impl.APIUsageStatisticsRdbmsClientImpl</StatisticClientProvider>
  3. Start the API Manager server and log in to the Admin Dashboard (https://<Server Host>:9443/admin-dashboard/).
  4. Click Configure Analytics under the Settings section.
  5. Select the Enable check box to enable statistical data publishing. The settings to configure analytics appear.
  6. Set the event receiver configurations according to the DAS server (e.g. tcp://localhost:7614 as the URL and admin/admin as the username/password). 

    Event receivers refer to the endpoints to which events are published from the API Gateway. WSO2 DAS includes a Thrift receiver, which acts as the default event receiver. By default, this receiver listens on port 7611 when no port offset is set, and the port increments according to the offset set for the DAS server in step 2 above (7611 + 3 = 7614).

    Alternatively, the DAS server connection details can be configured in the <APIM_HOME>/repository/conf/api-manager.xml file, under the APIUsageTracking element:

    Code Block
    <DASServerURL>{tcp://localhost:7614}</DASServerURL>
    <DASUsername>admin</DASUsername>
    <DASPassword>admin</DASPassword>

    Note that if you follow this step, the analytics configuration added via the Admin Dashboard UI is overwritten by the configuration provided in api-manager.xml when you restart the server.

  7. Click Add URL Group to save the configuration. 
  8. Set the configurations under the Data Analyzer Configurations section (e.g., https://localhost:9446 as the URL and admin/admin as the username/password), then click Save.

    This is the endpoint at which the data analyzer (WSO2 DAS, in this case) resides, and it is used to deploy the DAS C-App containing the summarization logic to WSO2 DAS. By default, WSO2 DAS listens on port 9443 when no port offset is set, and the port increments according to the offset set for the DAS server in step 2 above (9443 + 3 = 9446).
  9. Clear all settings under the Statistics Summary Datasource section and provide the datasource definition used to store the summarized statistical data. Enter the URL, username, and password that you configured for the WSO2AM_STATS_DB datasource in WSO2 DAS. The tables are created automatically when the Spark script runs; you only need to create the schema. The same configurations are done on the DAS server.
    • URL: The connection URL for the RDBMS datasource
    • JDBC Driver Class: The fully qualified Java class name of the JDBC driver
    • Username/Password: Credentials to be passed to the JDBC driver to establish a connection
  10. Click Save. This deploys the Analytics toolbox, which describes the information that is collected, how the data is analyzed, and the location of the database where the analyzed data is stored.


    Tip: To edit the datasource connection pool parameters, click the Show More Options link.


If you are using MySQL as the database, copy and paste the MySQL driver library to the <AM_HOME>/repository/components/lib directory.
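
When configuring the Statistics Summary Datasource section in step 9 above, the values mirror the WSO2AM_STATS_DB datasource defined in WSO2 DAS. For example, assuming the example MySQL database used earlier (all values are placeholders for your own setup):

    Code Block
    URL:               jdbc:mysql://localhost:3306/stats_db
    JDBC Driver Class: com.mysql.jdbc.Driver
    Username:          db_username
    Password:          db_password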

Invoking the sample

Invoke an API to generate traffic and see the statistics. 

  1. Log in to the API Publisher and deploy the sample (CalculatorAPI).


    Configuring the API for statistics of usage by destination

    To view the API Usage by Destination statistics in the API Publisher, follow the steps below to enable usage tracking for the API.

    1. Edit the API you created and go to the Implement stage.

    2. Expand Show More Options under Endpoints and select Enabled for Destination-Based Usage Tracking.

    The CalculatorAPI is now configured to track destination-based usage statistics.

  2. Log in to the API Store and subscribe to the API you created. 
  3. Invoke the sample using the API Store or the cURL command and wait a few minutes for the analytics to be generated. 
  4. In the API Publisher, click API Usage under the All Statistics section. For more information, see Viewing API Statistics.
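
As an example of step 3 above, the sample CalculatorAPI can be invoked with cURL as follows; the access token is a placeholder, and the Gateway port 8243 and the calc/1.0 context assume a default setup:

    Code Block
    curl -k -H "Authorization: Bearer <access-token>" "https://localhost:8243/calc/1.0/add?x=2&y=3"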

Purging Data (optional)

Data purging is an option for removing historical data in WSO2 DAS. This is important because it is not possible to delete tables or table data in WSO2 DAS. By purging data, you can maintain high performance in data analysis without removing the analyzed summary data. Only the raw event data published by the API Manager is purged; it is contained in the ORG_WSO2_APIMGT_STATISTICS_* event tables.



Using the admin console

  1. Navigate to the Data Explorer and select one of the event tables mentioned above. 
  2. Click Schedule Data Purge.
  3. On the dialog box that appears, set the time and days within which you want to purge data and save.
  4. Repeat the steps for all of the tables above and wait for the data to be purged.

Using the global method

Note that this will affect all tenants.
  1. Open the <DAS_HOME>/repository/conf/analytics/analytics-config.xml file.
  2. Change the contents under the <analytics-data-purging> element as shown below:

    Code Block
    <analytics-data-purging>
        <!-- Indicates whether purging is enabled. To enable data purging for a cluster, this property needs to be enabled on all nodes. -->
        <purging-enable>true</purging-enable>
        <cron-expression>0 0 12 * * ?</cron-expression>
        <!-- Tables to include in purging. Use a regular expression to specify the table names to include. -->
        <purge-include-tables>
            <table>.*</table>
        </purge-include-tables>
        <!-- All records inserted before the specified retention time are eligible for purging. -->
        <data-retention-days>365</data-retention-days>
    </analytics-data-purging>
  3. Save your changes.