This documentation is for WSO2 DAS version 3.0.1. View documentation for the latest release.


I see an exception stating "Cannot run program "null/bin/java"" when running DAS. What is going wrong?

This happens when the JAVA_HOME environment variable is not set to point to the installed JDK/JRE location. It needs to be explicitly set in your environment.
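As a minimal sketch, you can set the variable in your shell before starting DAS (the install path below is an example; substitute your own JDK/JRE location):

```shell
# Point JAVA_HOME at your JDK/JRE installation (example path -- adjust
# to wherever Java is installed on your machine).
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"

# DAS launches the JVM as $JAVA_HOME/bin/java; when JAVA_HOME is unset,
# that expands to "null/bin/java", which produces the error above.
echo "DAS will launch: $JAVA_HOME/bin/java"
```

Adding these lines to your shell profile (e.g. ~/.bashrc) makes the setting persistent across sessions.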

How can I scale up DAS?

If you want to scale up DAS to receive a large amount of data, you can set up multiple receiver nodes fronted by a load balancer. If you want to scale up the dashboards (presentation layer), you can set up multiple dashboard nodes fronted by a load balancer.

I only see one DAS distribution. How do I create a DAS receiver node, analyzer node, or dashboard node?

The DAS distribution contains all the features you need. You prepare each node by uninstalling the features that are not required for its role, using the feature installation/uninstallation capability that comes with WSO2 DAS. For example, if you want to create a receiver node, uninstall the analytics feature and the dashboard feature, and you will have a receiver node.

Can I send custom data/ events to DAS?

Yes, you can. An SDK is provided for this. For a detailed article on how to send custom data to DAS using this SDK, see Creating Custom Data Publishers to BAM/CEP.

You can also send data to DAS using the REST API. For information on sending data to DAS using the REST API, see REST APIs for Analytics Data Service.
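As a rough sketch of the REST approach, a record can be posted as JSON over HTTPS. The endpoint path, table name, field names, and admin/admin credentials below are assumptions for illustration; check the REST API documentation for your deployment:

```shell
# Hedged example: publishing one record to DAS over the REST API.
# Host, endpoint path, table name, and credentials are assumptions.
DAS_HOST="https://localhost:9443"

# A single record for a hypothetical table "ORG_WSO2_SAMPLE".
PAYLOAD='[{"tableName": "ORG_WSO2_SAMPLE", "values": {"sensorId": "s1", "temperature": 28.5}}]'

# -k accepts the self-signed certificate shipped with the default pack.
curl -k -u admin:admin \
     -H "Content-Type: application/json" \
     -X POST "$DAS_HOST/analytics/records" \
     -d "$PAYLOAD" || echo "DAS is not reachable at $DAS_HOST"
```

The same payload shape (a JSON array of records, each with a table name and a values map) can carry multiple records in one request.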

How do I define a custom KPI in DAS?

The approach is to first publish your custom data to DAS. After the data arrives, define analytics that match your KPI. To visualize the result of this KPI, you can write a gadget using HTML and JavaScript, or use the Gadget generation tool. For all artifacts related to defining a custom KPI, see the KPI definition and monitoring sample.

Can DAS do real-time analytics?

WSO2 DAS is built for batch-based analytics on large data volumes. However, it can do real-time analytics if you install the WSO2 CEP feature on top of the DAS server. By design, DAS and CEP use the same components to send and receive data, so they are compatible when processing data. The WSO2 CEP server is a powerful real-time analytics engine capable of evaluating queries based on temporal windows, pattern matching, and much more.

I see that the DAS samples write their results to an RDBMS. Why is this?

DAS does this for two reasons. The first is to promote a polyglot data architecture: results can be stored in an RDBMS or any other data store. The second is that many third-party reporting tools, such as Jasper and Pentaho, already have extensive support for RDBMSs. With this sort of support for a polyglot data architecture, any reporting engine or dashboard can be plugged into DAS without any extra customization effort.

Why do I get a read timeout in the analytics UI after executing a query?

This happens when there is a large amount of data to analyze. The UI times out after 10 minutes if processing the data takes longer than that.

I am getting the error "Thrift error occurred during processing of message...". Any idea why?

If you get the following error while trying to publish data from the DAS mediator data agent, check whether you have specified the receiver and authentication ports correctly and have not mixed them up. The default values are:

receiver port = 7611

authentication port = 7711

TID: [0] [BAM] [2012-11-28 22:46:40,102] ERROR {org.apache.thrift.server.TThreadPoolServer} -  Thrift error occurred during processing of message. {org.apache.thrift.server.TThreadPoolServer}
    org.apache.thrift.protocol.TProtocolException: Bad version in readMessageBegin
        at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(
        at org.apache.thrift.TBaseProcessor.process(
        at org.apache.thrift.server.TThreadPoolServer$
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(
        at java.util.concurrent.ThreadPoolExecutor$
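The relationship between the two ports can be sketched as below. The host is an assumption for illustration; by default the authentication port is the receiver port offset by 100, and the data agent uses a tcp:// URL for events and an ssl:// URL for authentication:

```shell
# A sketch of the default data agent endpoints (adjust the host for
# your deployment). Swapping these two URLs causes the Thrift
# "Bad version in readMessageBegin" error shown above.
RECEIVER_URL="tcp://localhost:7611"   # event receiver port
AUTH_URL="ssl://localhost:7711"       # authentication port (receiver + 100)

echo "Publish events to $RECEIVER_URL, authenticate against $AUTH_URL"
```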
