This documentation is for WSO2 Data Analytics Server 3.0.1.

Apache Spark is used as the core analytics engine in DAS 3.0.0. For information on writing Spark queries to analyze the collected data, see Spark Query Language.

Analytics scripts

Analytics scripts are used when you need to execute a set of Spark queries in sequence. You can also schedule an Analytics script so that it executes automatically at given times (e.g., fire at 12 noon every day, or fire every minute starting at 2 p.m. and ending at 2:59 p.m. every day). You configure this schedule using a cron expression. For more information about cron expressions, see Cron Trigger Tutorial.
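The two schedules mentioned above correspond to Quartz-style cron expressions with six mandatory fields (seconds, minutes, hours, day-of-month, month, day-of-week, where ? means "no specific value"):

```text
0 0 12 * * ?    Fire at 12:00 p.m. (noon) every day
0 * 14 * * ?    Fire every minute starting at 2:00 p.m. and ending at 2:59 p.m., every day
```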

You can add, edit, and delete scripts, and provide a schedule for a script to execute, as described below.

Adding a new script

Follow the steps below to add a new Spark script.

  1. Log in to the WSO2 DAS Management Console.
  2. In the Main tab, click Scripts to open the Available Analytics Scripts page.
  3. Click Add New Analytics Script to open the Add New Analytics Script page. Then enter the following details related to your script as shown in the example below.

    Script Name: MyFirstAnalyticsScript
    Spark SQL Queries:

    define table Log (server_name string, ip STRING, tenant INTEGER, sequence LONG, summary STRING);

    SELECT ip FROM Log;

    SELECT server_name, count(*) FROM Log GROUP BY server_name;

    SELECT COUNT(*) FROM Log WHERE summary LIKE '%Joe%';

    SELECT substr(summary, 1, 5) FROM Log;

    SELECT LAST(ip) FROM Log;

    Cron Expression: 0 * * * * ?

    This cron expression schedules the script to execute every minute. From the time you save the script, it is executed at the beginning of every minute (e.g., 10:21:00, 10:22:00, 10:23:00, ...).

  4. Click Execute to run the provided queries. The results of each query are displayed.
  5. Click Add to save the configured script.
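Although DAS hands these queries to Spark, most of the sample queries are standard SQL, so you can preview what they return on invented rows using plain sqlite3 (the Spark-specific LAST() aggregate is omitted here; the table data below is hypothetical, for illustration only):

```python
import sqlite3

# Invented sample rows matching the Log schema defined in the script above.
rows = [
    ("app-node-1", "10.0.0.1", 1, 100, "Joe logged in"),
    ("app-node-1", "10.0.0.2", 1, 101, "request served"),
    ("app-node-2", "10.0.0.3", 2, 102, "Joe logged out"),
]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Log (server_name TEXT, ip TEXT, tenant INTEGER, "
            "sequence INTEGER, summary TEXT)")
con.executemany("INSERT INTO Log VALUES (?, ?, ?, ?, ?)", rows)

# SELECT server_name, count(*) FROM Log GROUP BY server_name;
print(con.execute("SELECT server_name, COUNT(*) FROM Log "
                  "GROUP BY server_name ORDER BY server_name").fetchall())
# → [('app-node-1', 2), ('app-node-2', 1)]

# SELECT COUNT(*) FROM Log WHERE summary LIKE '%Joe%';
print(con.execute("SELECT COUNT(*) FROM Log "
                  "WHERE summary LIKE '%Joe%'").fetchone()[0])
# → 2

# SELECT substr(summary, 1, 5) FROM Log;
print(con.execute("SELECT substr(summary, 1, 5) FROM Log").fetchall())
# → [('Joe l',), ('reque',), ('Joe l',)]
```

On the real server, the same queries run against the event data persisted in DAS tables rather than in-memory rows.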

Editing a script

Follow the steps below to edit an existing Analytics script.

  1. Log in to the WSO2 DAS Management Console.
  2. In the Main tab, click Scripts to open the Available Analytics Scripts page.
  3. Click Edit for the script you want to edit.

  4. Change the content of the script as required. You can update the scheduling information as well. 

    If you do not enter a value for the schedule, the script is not scheduled to execute. To verify that your script is valid, click Execute; this runs the queries in the queries window.

    For example, you can edit the script created above to remove its schedule.

  5. Click Update to save the changes.

Deleting a script

Follow the steps below to delete an Analytics script.

  1. Log in to the WSO2 DAS Management Console.
  2. In the Main tab, click Scripts to open the Available Analytics Scripts page.
  3. Click Delete for the script you want to delete.
  4. Click Yes in the dialog box that appears to confirm the deletion.

    Deleting a script cannot be undone; the script is completely removed from the system. Deleting a script also deletes the scheduled task associated with it.

Executing a script

You can execute a script manually while adding or editing it, without using a scheduled task. This immediately runs the script content currently in the queries window. You can also execute a script outside edit mode as described below. During this operation, WSO2 DAS fetches the script content and passes it to Spark, which executes all the queries in the script. Once the execution completes, the results are displayed.

Follow the steps below to execute the script content.

  1. Log in to the WSO2 DAS Management Console.
  2. In the Main tab, click Scripts to open the Available Analytics Scripts page.
  3. Click Execute for the script you want to execute.
  4. The script execution job is immediately dispatched to the Spark engine, and the results are displayed once the job completes.