
CONNECTING TO DATA SOURCES

Table of Contents

General Information about Data Sources
    Understanding Data Sources
    Create New SQL Data Source
    Create a New Plug-in Data Source
    What does the Refresh Metadata button do?
    Metric Insights' Plugins
    Registering Your Server with Google to use their Plug-in Resources
    Populate a Data Source with an uploaded List of Reports
How to Link to External BI Tools
    What are external reports?
    How to link to External report content
    How to link to a Parameterized External report
Configuring Remote Data Collectors
    Overview of Remote Data Collection
    Configure a Remote Data Collector
    Installing Insightd on Windows Servers
    Upgrading Insightd on Windows Servers
    Installing Insightd on Linux Servers
    Upgrading Insightd on Linux Servers
    Troubleshooting a Remote Data Collector
JDBC Connections
    Establishing a JDBC Connection
    Add a new JDBC Driver
    Validation errors: "Value '0000-00-00' can not be represented as java.sql.Date"
Sourcing data from 1010data
    Establish connectivity to 1010data
    Collect data from 1010data
    Creating 1010data macro code for your 1st Basic Metric
Sourcing Data from Adaptive Planning
    How to establish connectivity to Adaptive Planning plugin
    How to collect data from Adaptive Planning plugin
Sourcing Data from Basecamp
    Establish Connectivity to Basecamp
    How to Collect Data using Basecamp Plug-in
Sourcing data from Business Objects
    Establish Connectivity to Business Objects
    Collect Data from a Business Objects Document
Sourcing Data using Elastic Search
    Establish Connectivity to Elastic Search
    How to Collect Data using Elastic Search Plug-in
Sourcing Data from Facebook
    Establish connectivity to Facebook
    How to Collect Data from Facebook
Sourcing Data from Fogbugz
    Establish connectivity to FogBugz
    How to collect data from Fogbugz
Sourcing data from Google Analytics
    Establish connectivity to Google Analytics
    How to fetch data from Google Analytics
Collecting data from Google BigQuery
    Establish connectivity to Google BigQuery
    How to fetch data from Google BigQuery
Sourcing Data from Google Calendar
    How to establish connectivity to Google Calendar
    How to fetch data from Google Calendar
Sourcing Data from a Google Spreadsheet
    Establish connectivity to Google Spreadsheets
    Collect data from a Google Spreadsheet
Sourcing Data from Graphite
    Establish Connectivity to Graphite
Sourcing Data from IBM CoreMetrics
    Establish connectivity to IBM Coremetrics
Sourcing Data from Marketo
    Establish Connectivity to Marketo
Sourcing Data from Mixpanel
    How to Collect Data using Mixpanel Plug-in
Sourcing Data using OLAP
    Establish Connectivity to OLAP
    How to Collect Data using OLAP Plug-in
Sourcing Data from Omniture
    Establish Connectivity to Omniture
Sourcing Data from QlikView
    Getting ready for using Qlik with Metric Insights
    Establish Connectivity to QlikView
    How to Collect Data using QlikView Plug-in
Sourcing Data using RSS
    Establish connectivity to RSS
    Collect data from RSS
Sourcing data from Salesforce
    What is the difference between SalesForce plug-in and SalesForce SOQL plug-in?
    Establish connectivity to Salesforce
    How to collect data from Salesforce
Sourcing data from Salesforce SOQL
    Establish connectivity to Salesforce SOQL
    Setting up Salesforce OAuth
    How to collect data from Salesforce SOQL
    What is the difference between SalesForce plug-in and SalesForce SOQL plug-in?
Sourcing Data from Tableau
    Hosting choices for Tableau Server
    Hosting choices for Tableau Online
    Establish Connectivity to Tableau Server
    Establish Connectivity to Tableau via a Remote Data Collector
    Establish connectivity to Tableau Online
    How to collect data from Tableau
    How to create an External Report from Tableau
    Getting Started with Push Intelligence for Tableau
    Best practices for Tableau
    Getting latest data from Tableau
    Embed database credentials in Tableau
    Handling Date/Time columns in Tableau
    Sending a Tableau Workbook
    Using SSO on Tableau Server
Sourcing data from Treasure Data
    Installing the Treasure Data Plugin
    Establish connectivity to Treasure Data
    How to collect data from Treasure Data
Sourcing Data from Uptimerobot
    Establish Connectivity to Uptimerobot
    How to Collect Data using Uptimerobot Plug-in
Sourcing Data using a Web Service
    Connecting to a Web Service

General Information about Data Sources


Understanding Data Sources


GENERAL:
There are two types of Data Sources:
Configurable: Data Sources for which the source database and access credentials are user-defined. Each Metric Insights customer can establish and maintain SQL-based access to internal databases and Plug-in connections to external sources, including Web Services and a wide range of providers.
Non-Configurable: Data Sources received by all customers that require no customer maintenance or configuration.
A. Configurable Data Sources
You can use either of the following to pull data into Metric Insights:
SQL: Any source accessible through a JDBC driver is considered a SQL data source.
This includes traditional databases as well as NoSQL sources with JDBC access; e.g.
Hadoop Hive.
Plug-in: Used to obtain data that is not fetched via a JDBC driver. Plug-ins are special components, built by Metric Insights, that send a native fetch command to a Data Source and return the results in a way that allows Metric Insights to consume the data. A list of supported Plug-ins can be found here.
NOTE: Web Services are considered to be a Plug-in configured to access data from a custom
Web Service
B. Non-Configurable Data Sources available to all customers are:
Manual/CSV Data: Enter data manually or upload it from a CSV file. Enter Data Manually and Upload Data from CSV provide additional information and links.
External Process: Push data to Metric Insights (does not require a credential set, but can authenticate if desired); a Data Posting URL is provided. See Source Metric Data from an External Process for more information.
Existing Reports: Use data in one or more Reports to populate a new Report. See Source a Report from another Report.
Existing Metrics: Use data in one or more Metrics to populate a new Metric or Report. See Source a Report from a Metric.
Local Excel File: Upload data (on PCs only) from an MS Excel file
C. Remote Data Collectors
If a Data Source requires a Remote Data Collector, see Configure a Remote Data
Collector for more information

1. Data Source Editors


Information on creating new Data Sources can be found at:


SQL Data Sources


Plug-in Data Sources

2. Select a Data Source on an Element Editor

Once you have completed the Add Element pop-up and are directed to the Metric or Report
Editor, the browser automatically scrolls down to the Metric/Report Calculation section
1. The Data Source drop-down list contains all of the SQL and Plug-in Data Sources that
have been defined for your instance, as well as all non-configurable Data Sources
2. You still have to select a Data Collection Trigger that fetches the data
3. Based on your Data Source selection, you will see either a Plug-in Command or SQL
Statement text box into which you enter the fetch command (see the example below)
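For example, a SQL Statement for a simple undimensioned daily metric could look like the following sketch (the table and column names are illustrative only and will differ in your own database):

select order_date, sum(order_total)
from orders
group by order_date
order by order_date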


3. Select a Data Source on a Functional Editor

The Event, Dimension and Data Dependency Editors all have data collection settings similar to
those on the Element Editors
The example above shows the Dimension Editor


Create New SQL Data Source


You can use a JDBC driver to pull data into Metric Insights using a SQL fetch command. This includes traditional databases as well as NoSQL sources with JDBC access, e.g., Hadoop Hive.
This article describes the general process for creating a SQL data source.
Information on creating a data source based on a plug-in can be found here.
For a description of Metric Insights' overall approach to Data Sources, click here.

1. Open Data Source list

2. Create New Data Source


2.1 Choose type

1. Select "SQL"
2. Click Next step


2.2 Complete all settings

1. Provide a descriptive Name for the connection.


2. Enter the Username.
3. Enter a Password.
4. Enter the Host Name.
5. The Port number will be set by default, based on your choice of JDBC Driver. Change it if
necessary.
6. Enter the Database name.


7. Select a JDBC Driver. (If you do not see the driver that you wish to use in the drop-down
provided, contact Metric Insights for assistance.)
8. Optionally, specify the maximum number of concurrent Threads per Trigger Event to be used
in background processing when the system updates metrics and reports for this data source. If
you do not specify any value for this setting, batch data collection processing will be single-threaded.
9. If you select "Yes" to specify that this Data Source is remote; i.e., behind a firewall, you will be
required to select a Connector or create a new one. See Define a Remote Data Collector for more
details.
10. Save.
11. Click Test Connection to ensure that settings are correct. (Some data sources, e.g., Hive,
cannot be tested this way.)
Note: The JDBC string will be created automatically based on your other inputs. In some cases,
however, it will not be possible to infer the correct string without additional inputs. If the Connection
Test fails, please check the documentation for your JDBC driver.
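For example, for a MySQL database named demo on host dbserver.example.com (an illustrative hostname), the automatically generated string would typically take the form:

jdbc:mysql://dbserver.example.com:3306/demo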

2.3 Review and close the confirmation

1. If the connectivity is established, the confirmation message appears; click OK to continue


NOTE: It is not possible to directly test Hive connectivity from the connection editor since there is
no standard query that can be run against a HiveQL instance. Contact Metric Insights for
assistance in validating a HiveQL connection.


Create a New Plug-in Data Source


Data that is not fetched using a JDBC driver is collected using a Plug-in. Plug-ins are special
components, built by Metric Insights, that send a native fetch command to a data source. It then
formats the results in a way that allows Metric Insights to consume the data. A list of supported
Plug-ins can be found here. It is also possible to access data from a custom web service by using
a Web Service Plug-in.
This article describes the general process for creating a Plug-in data source.
Information about how to create a new SQL data source is available here.
For a description of Metric Insights' overall approach to Data Sources, click here.

1. Open Data Source list

2. Create New Data Source


3. Choose type

1. Select "Plug-in"
2. Click Next step


4. Complete all settings

1. Select a Plug-in from the drop-down list of configured Plug-ins. If you do not find the one
you need, contact your system Administrator or support@metricinsights.com
2. Create a unique name for this data source
3. If you select "Yes" to Use Remote Data Collector?, you will be required to select a
collector or create a new one For more information, reference Define a Remote Data
Collector
4. When the supported plug-in is configured, the required Plug-in Connection Profile
Parameters are defined and are defaulted into the Parameters grid once you select the
Plug-in setting
5. Any of these parameters may be edited by clicking the Edit link


4.1 Edit Parameters

1. Use the Edit icon to enter/modify the parameter


2. Change the parameter to be passed to the Data Source
Save
Repeat this process as necessary to update any/all parameters.
Save again before leaving the Editor so that your Data Source will be ready for use in defining an
Element.


What does the Refresh Metadata button do?


Learn what the Refresh Metadata button does in the floating save bar in the SQL Data Source
Editor.

For SQL data sources, if using the visual editor (1), table/column metadata from the database is
collected periodically and cached on the Metric Insights server. When users go into the Visual
Editor from an Editor, the cached metadata is used instead of making a database hit to pull up
table/column information.
The Refresh Metadata button (2) allows you to force the refresh of this metadata.


Metric Insights' Plugins


About Plugins
Metric Insights has a flexible plug-in infrastructure which allows us to connect to many different
data sources to pull metrics into the KPI warehouse.
In order for Metric Insights to have access to your data source, however, in some cases it is
necessary to install Insightd (a thin remote data collection process) on a machine inside your
network. If a plug-in requires Insightd to be running on a machine in your network, it will be noted
below.
Note: Even if a plug-in is listed as not requiring Insightd, if your network configuration is such that
your data source is locked down and the Metric Insights server does not have access to it, you will
still need Insightd to run on a machine behind your firewall that has network access to your data
sources. General stand-alone installations of Metric Insights should not require this, however.
The list below includes our most popular plug-ins. If you don't see the plug-in that you need in
the list below, please contact Metric Insights.

Supported Plug-in List

Software | Description | Insightd Required?
1010data |  | No
Adaptive Insights |  | No
Amazon RedShift |  | No
BOBJ | Business Objects plugin | Yes - on the Business Objects server
CoolaData |  | No
Elasticsearch |  | No
Facebook | Facebook plugin | No
Flat File | Periodically import CSV files from a directory on a server | No
Google Analytics | Google Analytics PHP plugin | No
Google BigQuery |  | No
Google Calendar | Google Calendar plugin for importing a Google calendar as an event calendar | No
Google Spreadsheet |  | No
Graphite | Graphite plugin | No
Hadoop Pig | Pig Latin plugin that runs against Hadoop; the last statement must be a DUMP statement | Yes - on a node configured to run Pig against your cluster
IBM Coremetrics | IBM Coremetrics plugin | No
Marketo | Marketo plugin | No
MicroStrategy |  | No
Mixpanel | Mixpanel plugin | No
MongoDB | MongoDB shell command plugin; the last statement must include a printjson() statement | No
OLAP | Use the MDX query language to query OLAP cubes like those in Microsoft SSAS | No
Omniture Reporting | Omniture is an online marketing and web analytics tool offered by Adobe Systems | No
QlikView |  | Yes
QlikSense |  | Yes
Salesforce | Salesforce plugin to pull in metrics from the popular online CRM software | No
Splunk |  | No
SQL Databases | Relational databases and other datastores with a JDBC driver, like Hadoop Hive, MySQL, Postgres | No
SugarCRM |  | No
Tableau |  | No
Teradata |  | No
TreasureData | Treasure Data Plugin | No
Web Service | Periodically hit a web service to get measurement values | No


Registering Your Server with Google to use their Plug-in Resources

You must register with Google every application instance (domain) that requires access to
BigQuery, Analytics, Calendar, and other plug-in services that Google offers. Registering allows
you to obtain a client ID and client secret.

1. Google APIs Console

Go to: https://code.google.com/apis/console/?api=plus
1. You can use the current Google console as seen here. However, the steps in this
document use the "old console".
2. Choose the Go back link (yellow section) to use the "old console".

2. Register with Google

1. Click the Services tab and select services for your instance


2. Go to API Access tab in the left panel.


3. Create a new Client ID

1. Click Create an OAuth 2.0 client ID

3.1 Create Client ID

Provide


1. Product name
2. Product logo
3. Home Page URL

3.2 Create a new Web Application client id for your Metric Insights server

1. Select Web application


2. Enter site or hostname for your Metric Insights instance (in this example the hostname is
document.metricinsights.com)
3. Click Create client ID

3.3 Record Client ID and Client Secret for use in application configuration file

1. Record Client ID


2. Record Client secret


3. Click Edit settings...

3.4 Edit client settings

1. For Authorized Redirect URIs, enter the /path/to/callback value as /editor/service/validategoogleoauth2. In this example, where the hostname is document.metricinsights.com, the
full value becomes https://document.metricinsights.com/editor/service/validategoogleoauth2
Note: This value is referenced by Metric Insights in the const.php config file via:
define('GOOGLE_OAUTH2_REDIRECT_URI', HOSTNAME.'/editor/service/validategoogleoauth2');

2. For Authorized JavaScript Origins, leave the default that it provides. In this example where
the hostname is document.metricinsights.com, the default is https://document.metricinsights.com

4. Store the client ID credentials on the Metric Insights server


SSH to the Metric Insights server
1. Edit the file /var/www/iv/engine/config/const.php
# vi /var/www/iv/engine/config/const.php

2. Enter the values provided above for the variables shown below:
Modify the lines below inserting values obtained from Google for your host:


define('GOOGLE_OAUTH2_CLIENT_ID', '761272366824.apps.googleusercontent.com');
define('GOOGLE_OAUTH2_CLIENT_SECRET', 'yT4nDBBppanARIbWhU5WHNLG');


Populate a Data Source with an uploaded List of Reports


A list of reports may be "stored" within a plug-in Data Source after being uploaded from a file,
either a CSV or another supported file type.

1. Create a Plug-in Data Source


2. Upload List from a txt file

1. Click Load from CSV File
2. Browse and select a file
3. Click Import
4. Review results of upload
5. Review the list of External Reports in the grid


3. Create an External Report using the Data Source

Create a new External Report, paying special attention to these settings:


1. Report Type: Choose a type consistent with the Data Source being used (In this example,
Tableau)
2. Show Report in: For this example, choose "Viewer"
3. Plug-in Connection Profile: Select an option consistent with the Data Source being
used (In this example, Tableau)
4. External Report: Pick a report that has the desired data from the list of uploaded Reports
5. External Report URL: Populated automatically once you choose a report
6. Collect Image


4. Review Image

The image reflects the Report selected from the list stored in the Data Source


How to Link to External BI Tools


What are external reports?


This article describes how Metric Insights interfaces with third-party BI tools like Tableau,
MicroStrategy, etc.

Overview
When creating a new metric, the last two choices allow you to link to reports you've already
developed in other BI platforms, like Tableau, Crystal Reports, or MicroStrategy. The difference
between an 'External Report' and 'Other External Content' is that with an External Report you
can actually drill down from another Metric Insights tile into the specified external report. External
Content, however, is just any URL of your choosing that will not be passed any dynamic
parameters from Metric Insights.

URL Templates
For Other External Content, a static URL is defined for each report. However, for External
Reports that you wish to drill down to, you can use a URL template with the following
variable placeholders:
:measurement_start_time
:measurement_end_time
and, if the metric is dimensioned, the dimension's bind_variable. So, if the metric is
dimensioned by sales channel, for example, you might also have the :channel template
variable (see the example below).
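For example, a drill-down URL template for an external report dimensioned by sales channel might look like the following sketch (the host and parameter names are illustrative only):

https://bi.example.com/reports/daily_sales?start=:measurement_start_time&end=:measurement_end_time&channel=:channel

Metric Insights substitutes the appropriate value for each placeholder when a user drills down from a tile.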


Thumbnails
Metric Insights tiles auto-update and are regenerated based on the value of the metric.
In contrast, external report thumbnails are just a static image that the report developer sets.
Sometimes report developers will take a screenshot of the external report and store it with
the external report definition to allow for a more realistic and seamless transition from Metric
Insights to the external BI tool.

External tool authentication


Each URL that Metric Insights links to will, of course, require some form of login for that
system. Metric Insights will not be able to bypass authentication to any other system.

End result
Once an external report is created, a business user will be able to seamlessly link to a variety of
relevant reports all from their own personal list of favorites or report categories they have access
to.


How to link to External report content


This article describes creating new 'External Content' that will be linked from Metric Insights. If
you wish to drill to external content based on dimension values, see how to set up drill paths for
external reports.

1. On any page, use Admin Menu to begin creation of the element


2. Create a new External Content report

Select Other External Content


3. Set up the Metric Insights report metadata

Even though the report is already created in your external BI tool, Metric Insights still needs to
know a couple of things about this particular content in order to relate it to other elements.
To facilitate this, set:
1. Measurement Interval
2. Measure of
3. Category
4. Topic

That way, if you're looking at another monthly report that happens to look at the same measure, your
external content report will also show up to the user as a related report.


4. Set up the external content URL

1. In this case, we'll just use a static web page that we've created for this example:
http://external.metricinsights.com/daily_sales_country.php
2. Select 'SAVE' to display the External Content Editor


5. Select where to display the content

1. Select whether to display your content in the Metric Insights Viewer or on the external web page
2. Select 'Upload Image File' to display a pop-up where you can add a Preview Image


6. Set preview image

Take a screenshot of your report and upload it as the preview image. This image should be a .png
file and, to avoid distortion, should have a width-to-height ratio of 1.8 (in other words, the width is
nearly twice the height; for example, 1440 x 800 pixels).


7. Enable and Publish

1. After you're satisfied, save


2. Make visible on the Home Page
3. Enable and Publish


8. Verify the new external content

Now that it's enabled, make sure the new external content tile shows up properly on the Home
Page. Double-clicking the tile should take you to the URL you expect.


How to link to a Parameterized External report


This article describes creating a new 'External Report' that will be linked from Metric Insights.

1. Create a new External Report

From the Add a new Element menu in Metric Insights, select External Report.


2. Set up the Metric Insights report metadata

Even though the report is already created in your external BI tool, Metric Insights still needs to
know a couple of things about this particular content in order to relate it to other elements.
To facilitate this, set the:
1. Measurement Interval
2. Measure of
3. Category
4. Topic

That way, if you're looking at another monthly report that happens to look at the same measure, your
external content report will also show up to the user as a related report.


3. Choose the report type or add a new one

A report type is used for organizational purposes. In addition, though currently unimplemented, it
will allow administrators to associate an icon URL with each report type so that users can
differentiate between report types when scrolling through their list of favorites.
NOTE: This will be auto-populated in future releases.


Create the report type

Only the Name field is required

4. Set up the external report URL template

External report URLs can by default take special parameters to help get relevant information. For
example, a report may take start_date and end_date parameters when running. To facilitate
this, Metric Insights has the following variables available for use in the report URL template.
:measurement_start_time - Metric Insights will automatically choose reasonable start
and end times based on the report frequency and drill-down path. For example, if you're
looking at a Monthly Sales report and you click on January 2012, the
:measurement_start_time will be 'Jan 01, 2012 00:00:00' with :measurement_end_time =
'Jan 31, 2012 23:59:59'
:measurement_end_time - see above.
For dimensioned metrics only: a dimension name. For example, if your Dimension is
'Country' and you set up the Dimension to have a bind variable name of 'country_id', you
would be able to use :country_id in your URL template (see the example below).
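Putting these together, a URL template for the Monthly Sales example above might look like the following sketch (the host and parameter names are illustrative only):

https://bi.example.com/reports/monthly_sales?start_date=:measurement_start_time&end_date=:measurement_end_time&country=:country_id

When a user drills down on January 2012 for a particular country, Metric Insights replaces :measurement_start_time, :measurement_end_time, and :country_id with the corresponding values before opening the URL.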


5. Set up a screenshot for preview images

Take a screenshot of your report and upload it as the preview image. This image should be a .png
file and, to avoid distortion, should have a width-to-height ratio of 1.8 (in other words, the width is
nearly twice the height).


6. Save and enable the external report

After you're satisfied, save and enable the external report so that it may be visible in the
dashboard.


7. Verify the new external content

Now that it's enabled, make sure the new external content tile shows up properly in the Tile View
on the Home Page. Double-clicking the tile should take you to your external BI tool, where you'll
need to log in.


Configuring Remote Data Collectors


Overview of Remote Data Collection


Overview
In order to safely connect a Metric Insights instance that is hosted in the cloud to data sources
that live behind your firewall, you will need to download Insightd, a thin remote data collection
framework that can be used to run queries on Metric Insights' behalf.
Insightd will run on a machine behind your firewall and will periodically poll Metric Insights for the
queries it needs to run. Once these queries are assigned and completed, the results are
securely posted back to the Metric Insights server to be turned into your visualizations.


Configure a Remote Data Collector


In order to safely connect a Metric Insights instance that is hosted in the cloud to data sources
that live behind your firewall, you will need to download Insightd, a thin remote data collection
framework that can be used to run queries on behalf of Metric Insights.
Insightd will run on a machine behind your firewall and will periodically poll Metric Insights for the
queries it needs to run. Once these queries are assigned and completed, the results are
securely posted back to the Metric Insights server to be turned into your visualizations.

1. Select 'Remote Data Collectors' from Admin menu


2. Add a new remote data collector

1. Give your new remote data collector a name


2. Enter the hostname of your remote data source

3. Connect a data source to the new remote data collector


3.1 Click the name of the new remote data collector to edit it


3.2 Add a SQL database or Plug-in data source to the remote data collector

Note: A single remote data collector can connect to multiple data sources if they are all accessible
through the same remote host.


3.3 Choose the SQL database or Plug-in connection that you want to connect
to

Note: We will connect to Tableau for demonstration purposes.


4. Finally, download the Insightd installer onto your server

As mentioned in the overview, the Insightd installer will need to run on a server behind your
firewall. Metric Insights allows you to download a pre-configured package of Insightd to install on
your machine.

5. Install insightd
Now, copy the downloaded zip file to the server that will be running Insightd.
Please refer to the following sections to install insightd on Windows and Linux servers.
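For example, you could copy the downloaded zip file from your workstation to the collector machine with scp (the user name and host name below are illustrative only):

scp remote_data_collector.zip user@collector-host:/tmp/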


Installing Insightd on Windows Servers


This article assumes you have configured a Remote Data Collector, as described in
Configure a Remote Data Collector.

1. Download the Insightd installer from Metric Insights


On the Windows Server that you plan to install on, open a web browser and point it to your Metric
Insights server. Log in as admin, and go to Admin -> Remote Data Collectors.

2. Select the data collector that you wish to download


3. Download the Insightd zip file


When downloading the zip file, you might get an error message like the following that warns you
that the insightd zip isn't normally downloaded. Tell the browser that you're sure you want the zip
file. We promise it's not dangerous.


4. Extract the zip file and run the installer executable.


In a Windows Explorer window, right click on the downloaded zip file and extract all the insightd
files to a directory.


4.1

4.2

5. Configure the Business Objects SDK location if desired


If you wish to install the SAP Business Objects plugin, the location of the SDK is necessary. This
should be the directory where all the Business Objects jar files live (usually something like
C:\Program Files\Business Objects\Common\4.0\java\lib). If you are not using the Business
Objects plugin, just use the default.
When done, click 'Close' to finish the installer.


6. Start the Insightd Windows Service


Navigate to the Windows Services list and start the Metric Insights Daemon service


7. Check the Windows Event log for any errors


If you notice that the service hasn't started for some reason, check the event log for more
information. There will be a special folder in the Event Log for the Metric Insights Daemon.
In the case below, Insightd failed to start up because it couldn't reach the Metric Insights server for
some reason. This can happen if port 8443 is not open on your Metric Insights instance.
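A quick way to confirm connectivity from the collector machine is to check that the Metric Insights server answers on port 8443, for example with curl if it is available (substitute your own Metric Insights hostname):

curl -vk https://your-metricinsights-server:8443/

If the connection is refused or times out, work with your network team to open port 8443 between the collector and the Metric Insights server.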


Upgrading Insightd on Windows Servers


This article walks you through the steps to upgrade your version of Insightd on Windows Servers.
Insightd is the Windows service that runs the Remote Data Collector for your Metric Insights
application. You can review more about what this is in the Overview of Remote Data Collection.
http://help.metricinsights.com/m/Connecting_to_Data_Sources/l/107311-overview-of-remote-data-collection

1. Get new version


To get a new version, follow the steps on how to download the new version in the article for
Installing Insightd on Windows Servers.
Start with Step 1 and continue to Step 3, the steps that explain how to download the Insightd
installer.
http://help.metricinsights.com/m/Connecting_to_Data_Sources/l/107314-installing-insightd-on-windows-servers

2. Uninstall old version


After getting the new version of the Remote Data Collector in the previous step, the next steps are
to uninstall the old version and then to install the new version.
The next several steps walk you through how to uninstall the old version of the Remote Data
Collector.


2.1 Backup config.ini file


Before uninstalling the old version of the Remote Data Collector, make a backup copy of your
config.ini file (located at "Metric Insights\Insightd\conf\config.ini") and save it somewhere such as
your home directory.
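For example, from a Command Prompt you could copy the file to your home directory (this assumes the default install location shown later in this article; adjust the paths if your installation differs):

copy "C:\Program Files (x86)\Metric Insights\Insightd\conf\config.ini" "%USERPROFILE%\config.ini.bak"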


2.2 Stop Windows Service


Now stop the Remote Data Collector if it is running. Find the Metric Insights Daemon in the
Windows Services control panel. If it is running then the panel will show you an option to Stop the
service and to Restart the service. If the service is not running, then it will only show you an option
to Start the service.
If the service is running, stop the Windows Insightd service by clicking Stop the service.


2.3 Uninstall Windows Service


Once the Insightd Windows Service is stopped, it is safe to uninstall the service.
Uninstall the Insightd service:
Run the uninstall-insightd.exe file located at "Metric Insights\Insightd\bin\uninstall-insightd.exe". You can double-click it to run it.

3. Install new version


Finally, after completing the above steps of getting the new version of the Remote Data Collector,
and then uninstalling the old version, the next step is to install the new version.
The next several steps walk you through how to install the new version of the Remote Data
Collector.


3.1 Install the Windows Service


To install the Insightd Windows Service, follow the steps in the article Installing Insightd on
Windows Servers that explains how to extract the installer from the downloaded file and then how
to install the service. Only go as far as running the installer. Do not start the Insightd Windows
Service yet. Follow Step 4 through Step 5 in the article Installing Insightd on Windows Servers.
http://help.metricinsights.com/m/Connecting_to_Data_Sources/l/107314-installing-insightd-on-windows-servers
If you get "Error opening file for writing: ...Win32EventLogAppender.dll" just click Ignore and
continue.


3.2 Restore config.ini


Copy the config.ini file you saved in a prior step to the "Metric Insights\Insightd\conf\" directory.

3.3 Start the Windows Service


Finally, start the Insightd Windows Service. Continue at Step 6 in the article for Installing Insightd
on Windows Servers.
http://help.metricinsights.com/m/Connecting_to_Data_Sources/l/107314-installing-insightd-on-windows-servers


Installing Insightd on Linux Servers


This article assumes you have configured a Remote Data Collector, as described in
Configure a Remote Data Collector.

1. Download the Insightd installer from Metric Insights


On the Linux Server that you plan to install on, open a web browser and point it to your Metric
Insights server. Log in as admin, and go to Admin -> Remote Data Collectors.

2. Select the data collector that you wish to download


Select the remote data collector you wish to install.


3. Download the Insightd zip file


When downloading the zip file, you might get an error message like the following that warns you
that the insightd zip isn't normally downloaded. Tell the browser that you're sure you want the zip
file. We promise it's not dangerous.

4. Extract the zip file and start Insightd


root@document:~# unzip remote_data_collector.zip
root@document:~# cd datacollector/bin
root@document:~# ./insightd start &


Upgrading Insightd on Linux Servers


This article walks you through the steps to upgrade your version of Insightd on Linux Servers.
Insightd is the Linux daemon that runs the Remote Data Collector for your Metric Insights
application. You can review more about what this is in the Overview of Remote Data Collection.
http://help.metricinsights.com/m/Connecting_to_Data_Sources/l/107311-overview-of-remote-data-collection

1. Uninstall old version


The following steps walk you through how to uninstall the old version of the Remote Data
Collector.

1.1 Stop Insightd


Stop the Remote Data Collector if it is running. Find the location where the remote data collector is
running and stop it.
Stop the current remote data collector:
./datacollector/bin/insightd stop

If it's not running then you might get the following message:
"The insightd can not be stopped because it is not currently running."

1.2 Save Insightd (save current configuration)


Move files to a saved location. You can give the name of the new directory anything you want.
mv ./datacollector/ ./datacollector-todays-date

2. Get new version


To get a new version, follow the steps on how to download the new version in the article for
Installing Insightd on Linux Servers.
Start with Step 1 and continue to Step 3, the steps that explain how to download the Insightd
installer. At this point do not extract the zip file. Do not start up Insightd.
http://help.metricinsights.com/m/Connecting_to_Data_Sources/l/107315-installing-insightd-on-linux-servers


In rare cases you might receive the remote data collector by other means, for example directly
from Metric Insights (such as a patched version or one with a new feature). In that case, just
follow the directions given for that situation.

3. Install new version


The next several steps walk you through how to install the new version of the Remote Data
Collector.

3.1 Install Insightd


Install Insightd
unzip remote_data_collector.zip

This will install the remote data collector (i.e., extract the contents of the zip file) into the
./datacollector directory. If you obtained this file via a different means then it might have a different
name.

3.2 Check if new configuration is correct


Check if new configuration is correct
cat ./datacollector/conf/config.ini

If you downloaded this from your Metric Insights system described in the above steps, then this
step is not necessary. If you obtained the remote data collector via a different means then this step
is recommended.

3.3 Restore Configuration (if new configuration not correct)


If you downloaded this from your Metric Insights system described in the above steps, then this
step is not necessary. However, if you obtained it via a different means and you need to restore
your previous configuration for some reason then you can do the following
mkdir -p ./datacollector/conf
cp -p ./datacollector-todays-date/conf/config.ini ./datacollector/conf/

3.4 Start Insightd


Start insightd
./datacollector/bin/insightd start &


Troubleshooting a Remote Data Collector


This article provides some tips for troubleshooting problems with your Remote Data Collector

1. Check if remote data collector is active


From the Admin menu navigate to the Remote Data Collectors screen.
The Remote Collectors screen lists all your Remote Data Collectors and provides a visual cue
as to whether each remote data collector is active. It also lists the last time the collector successfully
communicated with the system.


1.1 Active remote data collector


If your remote data collector is active, it will show with a white background, and the
"Last Heartbeat Time" column will have a value within a couple of minutes of the current system time.

1.2 Inactive remote data collector


If your remote data collector is not active, it will show with a pink background, and the
"Last Heartbeat Time" column will show the last time it successfully connected. The image
below is for a remote data collector that is not active - the Last Heartbeat Time is old and the
background is pink. You will need to troubleshoot further.

2. Restart the remote data collector


On the machine where the remote data collector is running, stop and restart the remote data
collector


2.1 Windows - Restart remote data collector


Find the Services menu on the Windows machine and stop and restart the remote data collector.
The service is named 'Metric Insights Daemon'

2.2 Linux - Restart remote data collector


Find the directory where the remote data collector is running and stop and restart the process. See
instructions for where you installed the program.
Stop the remote data collector:
/var/www/datacollector/bin/insightd stop

Start the remote data collector:


/var/www/datacollector/bin/insightd start


3. Check error messages from the remote data collector on the remote machine

On the machine where the remote data collector is running, view the error messages from the
remote data collector.

3.1 Windows - Check error messages from the remote data collector on the
remote machine
Find the Event Viewer menu on the Windows machine and choose the events for the service
named 'Metric Insights Daemon'. Errors are flagged in red and provide information on problems
that the remote data collector encountered.

3.2 Linux - Check error messages from the remote data collector on the
remote machine
View the errors in the log.
cat /var/log/insight/insightd-error.log


4. Check error messages on Metric Insights server


View the errors in the log.
Debian instance:
cat /var/log/apache2/pyinsight-error.log

CentOS instance:
cat /var/log/httpd/pyinsight-error.log

5. Verify correct settings in the remote data collector configuration file


5.1 Verify settings in the UI on Metric Insights


1. From Admin menu navigate to Data Collectors screen.
2. From Data Collectors screen navigate to Remote Collector Editor screen by clicking on the
data collector entry in the list.
3. Note the setting values for "Username for remote agent" and "Password for remote agent".
You will compare these to what is used on the remote data collector machine. In this
example the values are "tableau.metricinsights.com" and "myPassword123".
4. Note the domain of your Metric Insights instance. In this example it is
"demo.metricinsights.com" as seen in the web browser url.


5.2 Windows - Verify settings in configuration file on remote machine


Navigate to the location of the configuration file used by the remote data collector
C:\Program Files (x86)\Metric Insights\Insightd\conf

1. View the configuration file "config.ini". You can view it with Notepad editor.
2. Compare the setting values for "hostname" with "Username" in previous step. Do the same
for "password". In this example the values are "tableau.metricinsights.com" and
"myPassword123".
3. Compare the setting value for "requestURL" with the domain in previous step. In this
example the value is "demo.metricinsights.com".

5.3 Linux - Verify settings in configuration file on remote machine


Find the directory where the remote data collector is running and view the configuration file
"config.ini". See instructions for where you installed the program.
View the configuration file "config.ini". You can use vi editor or print it to screen as in the example
below.
cat /var/www/datacollector/conf/config.ini


[authentication]
hostname = tableau.metricinsights.com
password = myPassword123
requestURL = demo.metricinsights.com
[parameters]
FetchRestartWait = 300
PollingInterval = 1
PostingRestartWait = 300
PostingTimeoutInterval = 10
HeartbeatAlertInterval = 10
PostingRestartsAllow = 3
CacheDNSLookup = 0
FetchRestartsAllow = 3
FetchTimeoutInterval = 30

1. Compare the setting values for "hostname" with "Username" in previous step. Do the same
for "password". In this example the values are "tableau.metricinsights.com" and
"myPassword123".
2. Compare the setting value for "requestURL" with the domain in previous step. In this
example the value is "demo.metricinsights.com".


JDBC Connections


Establishing a JDBC Connection


Before you can run any queries, you will need to set up a connection to your data sources.

1. On any page, use Admin Menu to begin creation of the JDBC Source Database Connection

2. On Source Database Connections, open the pop-up


3. Set type of new Data Source connection

Select SQL type


4. Describe the Connection

1. Provide a descriptive Name for the connection.


2. Enter the Username.
3. Enter a Password.
4. Enter the Host Name.
5. The Port number will be set by default, based on your choice of JDBC Driver. Change it if
necessary.
6. Enter the Database name.


7. Select a JDBC Driver. (If you do not see the driver that you wish to use in the drop-down
provided, contact Metric Insights for assistance.)
8. Optionally, specify the maximum number of concurrent Threads per Trigger Event to be used
in background processing when the system updates metrics and reports for this data source. If
you do not specify any value for this setting, batch data collection processing will be single-threaded.
9. Save.
Note: The JDBC string will be created automatically based on your other inputs. In some cases,
however, it will not be possible to infer the correct string without additional inputs. If the Connection
Test fails, please check the documentation for your JDBC driver.

5. Validate created connection from Data Source Editor


6. New connection can be found in Data Sources list

NOTE: the test feature does not currently work for testing connections to Apache Hive.
1. Click the Test link to verify connectivity to the data source.
2. If you need to further edit the connection, click on the name link of the data source to access the
pop-up editor

7. Confirm success

1. If the connectivity is established, the confirmation message appears; click OK to continue


NOTE: It is not possible to directly test Hive connectivity from the connection editor since there is
no standard query that can be run against a HiveQL instance. Contact Metric Insights for
assistance in validating a HiveQL connection.


Add a new JDBC Driver


The JDBC Driver drop-down in the Add/Edit a SQL Data Source Editor includes the list of JDBC
drivers that have been installed on the Metric Insights server.
You may add to this list of JDBC drivers by following the steps below.
PRE-REQUISITES:
You must have the driver JAR file stored locally on your computer.
For the last step, you will need to ask Metric Insights to register this driver.

1. Obtain the JAR file


1. Identify the proper JDBC driver JAR file for the driver to be added
2. Download to your local machine
NOTE: A list of sources for JDBC drivers can be found at: http://www.sqlsummit.com/JDBCVend.htm

2. Open the list of Data Sources

1. Using the Admin drop-down list, choose Data Sources


2. Click the ADD icon to the right of the Data Sources grid.
Optionally, you can edit an existing SQL Data Source by clicking its Name link and add your new
JDBC driver from there; if this is your approach, skip to Step 4

3. Add the new Driver

Select Add New Driver Jar File

Connecting To Data Sources

Page 87

4. Select the new driver

1. Use the Browse button to find and Open the driver .jar file that you downloaded
and saved to your local hard drive
2. Once the Jar file is in the text box, click SAVE

5. Register Driver in Metric Insights


Contact Metric Insights and we will register this driver in the system. Metric Insights will add your
JDBC driver information to the system (the mysql jdbc_driver table), including the JDBC string
template to use (e.g., jdbc:postgresql://<host>:<port>/<db>), a sample SQL statement to test
connectivity, and the JDBC class name to invoke in the driver (e.g., org.postgresql.Driver).
NOTE: Once complete you will then be ready to create a Data Source using this new JDBC driver

Connecting To Data Sources

Page 88

Validation errors: "Value '0000-00-00' can not be represented as java.sql.Date"
ANALYSIS:
This is a MySQL-specific problem that does not happen often. It is caused by storing zero dates
('0000-00-00 00:00:00') in MySQL and trying to convert those into date objects in Java.
Unfortunately, Java does not understand dates in this format, so the MySQL JDBC driver will
throw this error by default.
RECOMMENDED ACTION:
Apart from modifying the source data, the easiest solution is to modify the JDBC connection URL
for the affected Data Source to include ?zeroDateTimeBehavior=convertToNull as shown below.
REFERENCES:
For more information about this JDBC parameter, see the Datetimes bullet under section
20.3.3.3.3 of the MySQL Manual.

1. Problem examples:
1.1 Example: "Value '0000-00-00' can not be represented as java.sql.Date"

Connecting To Data Sources

Page 89

1.2 Example: "Cannot convert value '0000-00-00 00:00:00' from column N to TIMESTAMP"

2. Solution:
2.1 Navigate to the data source editor for the affected data source
Make sure the data source drop down is pointing to the appropriate data source, then click the
gear icon to the right to edit the data source.

Connecting To Data Sources

Page 90

2.2 Fix the SQL Data Source URL

The suggested fix is to append '?zeroDateTimeBehavior=convertToNull' to the end of the JDBC
string. So the original string "jdbc:mysql://localhost:3306/demo" would become
"jdbc:mysql://localhost:3306/demo?zeroDateTimeBehavior=convertToNull"

Connecting To Data Sources

Page 91

Sourcing data from 1010data

Connecting To Data Sources

Page 92

Establish connectivity to 1010data


1. Go to Data Source Editor

2. Add New Data Source

Connecting To Data Sources

Page 93

3. Select "Plug-in" as data source type

4. Select '1010data' from Plug-in picklist

Connecting To Data Sources

Page 94

5. Additional fields will be displayed for entry

1. Enter a unique Name for your Data Source
2. Set Use Remote Data Collector to "No"
3. Enter an authorized Username and Password

Connecting To Data Sources

Page 95

6. Profile Parameters are generated automatically

Generated parameters may be modified/added via the gear icons

7. Why don't normal 1010data UIDs work with Metric Insights?

Metric Insights runs queries in the background. If a user is logged in with the same UID (or two
metrics try to update at the same time), one session will fail or will kick the other off.

8. What are SAM Pools?


SAM pools are a mechanism used to associate multiple sessions on the 1010data backend with a
single set of credentials.
Using SAM credentials allows Metric Insights instances to keep metrics up to date without
interfering with the user or with each other.

Connecting To Data Sources

Page 96

Collect data from 1010data


This article will show you how to create a Metric using a 1010data plug-in as a Data Source. It
assumes that you have already established connectivity to your 1010data server.

1. Add New Element to start

2. Create a simple metric without wizard

Connecting To Data Sources

Page 97

3. Define Basic Metric Information

Connecting To Data Sources

Page 98

4. Adjust Metric Calculation block

1. Set 1010data plug-in in Data Source field
2. Enter Table Name
3. Complete the Macro Code fetch command
4. Validate Plug-in Command

Connecting To Data Sources

Page 99

5. Collect Data

1. If validation finishes successfully, you will see how many records were fetched; the information
message is displayed in green. If an error occurs during validation, the error message is
displayed in red.
2. Collect Data

Connecting To Data Sources

Page 100

6. View Chart

Connecting To Data Sources

Page 101

Creating 1010data macro code for your 1st Basic Metric


This will show you how to create a very simple Metric - a visualization that measures sales over
time.

1. Build your macro code

Connecting To Data Sources

Page 102

1.1 Formatting date to mm/dd/yyyy

Insert a willbe in the Edit Actions screen to reformat the date to a mm/dd/yyyy format. Here is the
example code: <willbe name="Date" value="date" format="type:date4y"/>

Connecting To Data Sources

Page 103

1.2 Select Rows to show only recent data.

1. Use is greater than in the drop down - this will be used on the Metric Insights side.
2. Regardless of the actual dates you are analyzing, choose a recent date to limit row count.

Connecting To Data Sources

Page 104

2. Build a Tabulation

Connecting To Data Sources

Page 105

3. Now we have Daily Sales for Total Month.

Connecting To Data Sources

Page 106

4. Copy Macro Code to paste into Metric Insights.

Metric Insights takes raw macro code from 1010data. Copy the 1010data macro code out of the
Edit Actions window to paste into Metric Insights.
To see how to build a Metric from scratch, click the following link:

Connecting To Data Sources

Page 107

5. Paste your code into Metric Insights Metric Editor

Paste the code into the Macro Code box in the Metric Editor.

Connecting To Data Sources

Page 108

6. Remove the table name note at the beginning of the code, and paste the table name into the
Table Name field.

1. Paste the Table Name into the Table Name field & remove the table note from the code.
2. Click the Validate Plug-In Command button to verify the code.

7. Validated Code

Once validated, the macro code window will convert from red to green. You will also see the
number of records fetched from your code.

Connecting To Data Sources

Page 109

8. Replace your date selection in the code with :last_measurement_time
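For instance, if the row-selection step in your macro code currently compares the date column to a fixed literal, the change might look like this (a sketch that assumes a <sel> selection step and a column named date; your macro code may differ):
<sel value="date > 20140101"/>
becomes
<sel value="date > :last_measurement_time"/>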

Connecting To Data Sources

Page 110

Sourcing Data from Adaptive Planning

Connecting To Data Sources

Page 111

How to establish connectivity to Adaptive Planning plugin


An Administrator can use the process described in this article to create a new Plug-in Data
Source that is required to allow Elements to fetch data from Adaptive Planning to create a
visualization in Metric Insights.

1. Go to Data Source editor

1. Select Data Sources in Admin menu

Connecting To Data Sources

Page 112

2. Add New Data Source

1. Select to Add New Plug-in

Connecting To Data Sources

Page 113

3. Select "Plug-in" as Data Source Type

Choose Plug-in value

4. Select 'Adaptive Planning' from Plug-in picklist and input meaningful Name

Press Save

Connecting To Data Sources

Page 114

5. Plug-in Data Source Editor will open

1. Enter Username
2. Set Password

6. Save Adaptive Planning Plug-in

Save plug-in

Connecting To Data Sources

Page 115

How to collect data from Adaptive Planning plugin


This article will show you how to create an Element using an Adaptive Planning plug-in as a data
source. It assumes that you have already established connectivity to your Adaptive Planning
account.

1. Add new Element

Select Add New Element in the Admin menu

Connecting To Data Sources

Page 116

2. Select Report

Choose Report

Connecting To Data Sources

Page 117

3. Choose Adaptive Planning plug-in as a Data Source for new element

Connecting To Data Sources

Page 118

4. Add XML Request and Validate Statement

Connecting To Data Sources

Page 119

5. Publish your new element

Connecting To Data Sources

Page 120

Sourcing Data from Basecamp

Connecting To Data Sources

Page 121

Establish Connectivity to Basecamp


An Administrator can use the process described in this article to create a new Plug-in Data
Source that is required to allow Elements to fetch data from Basecamp to create a visualization in
Metric Insights.

1. Go to Data Source editor

Connecting To Data Sources

Page 122

2. Add New Data Source

3. Select "Plug-in" as Data Source Type

Connecting To Data Sources

Page 123

4. Select 'Basecamp' from Plug-in picklist and input meaningful Name

Connecting To Data Sources

Page 124

5. Plug-in Data Source Editor will open

1. Enter Username
2. Set Password
3. Click on gear icon to set User ID parameter.

6. Add User ID parameter to profile

Connecting To Data Sources

Page 125

7. Save Basecamp Plug-in

Connecting To Data Sources

Page 126

How to Collect Data using Basecamp Plug-in


This article will show you how to create an Element using a Basecamp plug-in as a data source. It
assumes that you have already established connectivity to your Basecamp account.

1. Add new Element

Connecting To Data Sources

Page 127

2. Choose Basecamp plug-in as a Data Source for new element

1. Choose Basecamp as your Data Source.


2. Input your Plug-in Command manually, OR
3. Use Visual Editor.

Connecting To Data Sources

Page 128

3. Basecamp Query Builder is called by Visual Editor link

You can select fields in the Included Fields section, and filter or group the selection using the
Filter on and Group By options.

Connecting To Data Sources

Page 129

4. Publish your new element

1. Validate your plug-in command


2. Then Collect data
3. And Enable & Publish your Metric.

Connecting To Data Sources

Page 130

Sourcing data from Business Objects

Connecting To Data Sources

Page 131

Establish Connectivity to Business Objects


Metric Insights uses a plug-in Data Source to connect to Business Objects (BOBJ). This allows
data from an existing Business Objects report to be used to build an element using Metric Insights
tools. In order to prepare the connection, you must first create the Data Source using the process
described below.
PRE-REQUISITE
Your Metric Insights instance must be configured to support Business Objects.

1. Create Business Objects plug-in

1. Find Data Sources item in Admin menu

2. Prepare to create a new Plug-in Data Source

1. Click Add New Data Source

Connecting To Data Sources

Page 132

3. Set type of new Data Source

1. Select Plug-in
Click Next Step

4. Define the basics

1. Select "BOBJ Data Collection Plugin" from the Plug-in pick list (NOTE: If this option is
not found, contact your Administrator or support@Metricinsights.com for assistance since
your instance has not been configured for BOBJ
2. Enter a meaningful Name
3. Optionally, enter the Username and Password required for this Plug-In

Connecting To Data Sources

Page 133

Click Save

5. Continue the Data Source Definition

1. Click the Edit icon for auth.type parameter

5.1 Add auth.type parameter

1. Input auth.type value in this pop-up


Save

Connecting To Data Sources

Page 134

6. Continue the Data Source Definition

1. Click the Edit icon for cms parameter

6.1 Add cms parameter

Input cms value in this pop-up


Save

Connecting To Data Sources

Page 135

7. Add Remote Data Collector

Click Add New Remote Collector

7.1 Define Remote Data Collector

1. Select Remote Collector from the list


Click Save.

Connecting To Data Sources

Page 136

8. Save Business Objects plugin

Connecting To Data Sources

Page 137

Collect Data from a Business Objects Document


By fetching data from an existing Business Objects (BOBJ) report, you can use the data to create
a Metric Insights element.
PRE-REQUISITE
You must have both the BOBJ plug-in configured in your Metric Insights instance and have created
a Plug-in Data Source for BOBJ.

1. Create an element

1. Select Add an Element in the Admin menu

Connecting To Data Sources

Page 138

2. Select type of the element

1. Choose Metric type


2. Check Use step-by-step wizard
Click Next step

Connecting To Data Sources

Page 139

3. Provide Basic Information

1. Set Measurement interval


2. Set Measure of
Click Next:Collect data

Connecting To Data Sources

Page 140

4. Prepare Metric for Data Collection

1. Find BOBJ plug-in in Data Source drop-down list
2. Select a Data Collection Trigger from the pick list
3. Select Business Objects Document from drop-down list
4. Enter the Plug-in Command manually or using the Visual Editor link

Connecting To Data Sources

Page 141

4.1 Select desired columns

You may select a Field and set an Expression for it; Fetch options are set below the table of fields

5. Validate plug-in command

Connecting To Data Sources

Page 142

6. Collect Data

1. If validation is finished successfully, you will see how many records are fetched displayed
in green font; if an error occurs during validation, an error message is presented in red
font.
Click Collect Data

Connecting To Data Sources

Page 143

7. Review results

1. You will get a message showing how many values were recorded.

Click Next to generate a Preview

Connecting To Data Sources

Page 144

8. Review Chart and complete definition

1. Check "Make Visible...."


Click Enable metric to add a tile to your Home Page

Connecting To Data Sources

Page 145

Sourcing Data using Elastic Search

Connecting To Data Sources

Page 146

Establish Connectivity to Elastic Search


An Administrator can use the process described in this article to create a new Plug-in Data
Source. It is required to allow Elements to fetch data using Elastic Search.

1. Go to Data Source editor

Connecting To Data Sources

Page 147

2. Add New Data Source

3. Select "Plug-in" as Data Source Type

Connecting To Data Sources

Page 148

4. Select 'ElasticSearch' from Plug-in picklist and input meaningful Name

5. Plug-in Data Source Editor will open

1. Click on gear icon to input your server's Endpoint (server can be local or remote).
2. Provide Index parameter - the name of your Database.

Connecting To Data Sources

Page 149

3. Optionally, provide Type parameter - the name of the table in your Database.

6. Save Elastic Search Plug-in

1. Save your changes.

Connecting To Data Sources

Page 150

How to Collect Data using Elastic Search Plug-in


This article will show you how to create an Element using an Elastic Search plug-in as a data
source. It assumes that you have already established connectivity to Elastic Search.

1. Add new Element

Connecting To Data Sources

Page 151

2. Choose Elastic Search plug-in as a Data Source for new element

1. Choose Elastic Search as your Data Source.


2. Input your Plug-in Command.

Connecting To Data Sources

Page 152

3. Publish your new element

1. Validate your plug-in command.
2. You will see sample results under the 'Validate' button.
3. Save the Element to continue configuring.
4. When you have finished configuring the Element, Enable & Publish it.

Connecting To Data Sources

Page 153

Sourcing Data from Facebook

Connecting To Data Sources

Page 154

Establish connectivity to Facebook


This article describes the information required to set up a Plug-in Data Source that fetches data
from Facebook for use by an Element to create a Metric Insights visualization.

Required Variables
Include the following const.php variables:
1. FACEBOOK_OAUTH2_APP_ID
2. FACEBOOK_OAUTH2_APP_SECRET
3. FACEBOOK_OAUTH2_REDIRECT_URI
4. FACEBOOK_OAUTH2_PERMISSIONS

The above are standard OAuth2 parameters required by the Facebook API and are described in the
appropriate Facebook docs:
http://developers.facebook.com/docs/reference/dialogs/oauth/
http://developers.facebook.com/docs/howtos/login/server-side-login/
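A minimal sketch of the corresponding const.php entries, following the same define() pattern shown for the Salesforce OAuth settings later in this guide (the values below are placeholders, and the permission shown is only an example):
define('FACEBOOK_OAUTH2_APP_ID', '1234567890');
define('FACEBOOK_OAUTH2_APP_SECRET', 'your-app-secret');
define('FACEBOOK_OAUTH2_REDIRECT_URI', 'https://your-metric-insights-server/');
define('FACEBOOK_OAUTH2_PERMISSIONS', 'read_insights');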

Connecting To Data Sources

Page 155

How to Collect Data from Facebook


This article will show you how to create an Element (Report in this case) using a Facebook plug-in
as a data source. It assumes that you have already established connectivity to your Facebook
account.

1. Add new Element

Connecting To Data Sources

Page 156

2. Choose Facebook plug-in as a Data Source for new element

1. Choose Facebook as your Data Source.


2. Input your Plug-in Command
3. To see the Facebook Query Language (FQL) Reference, click the appropriate link in the hint
box.
NOTE: Do not forget that the Facebook authentication token has to be reissued every 60 days.

Connecting To Data Sources

Page 157

3. Publish your new element

1. Validate your plug-in command


2. After successful validation, Save your Report to see the Sample Result Set
3. When you have finished configuring the Report, Enable & Publish it.

Connecting To Data Sources

Page 158

Sourcing Data from Fogbugz

Connecting To Data Sources

Page 159

Establish connectivity to FogBugz


This article describes the process of creating a new Plug-in Data Source for accessing FogBugz
so you can query the Data Source for a new or existing element.
FogBugz is an integrated web-based project management system with bug/issue tracking,
discussion forums, customer relationship management, and evidence-based scheduling.

1. Open Admin Menu

Click the Data Sources link to access Data Sources list

Connecting To Data Sources

Page 160

2. Add new Data Source

1. Click the Add New Data Source button


2. Select Plug-in as type
3. Click Next step button

Connecting To Data Sources

Page 161

3. Define main settings

1. Select 'FogBugz' plug-in in the drop-down list
2. Enter Name
3. Complete credential fields
4. Click Save
5. Edit Plug-in Data Source page opens

Connecting To Data Sources

Page 162

4. Set required parameter

1. Click the gear icon to open the Edit Plug-in Data Source Parameter pop-up
2. domain: Enter the domain name of your FogBugz account
3. Click Save

Connecting To Data Sources

Page 163

How to collect data from Fogbugz


This article will show you how to create an Element using a Fogbugz plug-in as a data source. It
assumes that you have already established connectivity to your Fogbugz account.

1. Add new Element

Connecting To Data Sources

Page 164

2. Choose Fogbugz plug-in as a Data Source for new element

1. Choose Fogbugz as your Data Source.
2. Select one of the Filters from the drop-down.
3. Input your Plug-in Command manually, OR
4. Use Visual Editor.

Connecting To Data Sources

Page 165

3. Fogbugz Query Builder is called by Visual Editor link

You can select fields in the Included Fields section, and filter or group the selection using the
Filter on and Group By options.

Connecting To Data Sources

Page 166

4. Validate your Plug-in Command

1. Validate your Plug-in Command

Connecting To Data Sources

Page 167

5. Enable and Publish Report

1. Upon successful validation, you will see the Sample Result Set.

2. Save the Report and continue to configure it, OR
3. When configuration of the Report is complete, Enable and Publish.

Connecting To Data Sources

Page 168

Sourcing data from Google Analytics

Connecting To Data Sources

Page 169

Establish connectivity to Google Analytics


This article shows how to create a new plug-in connection to Google Analytics.

1. Open Plugin Connection Profile Editor

Start by creating your new metric or report and performing the following actions in the Metric
Calculations section of the Metric Editor (or Report Editor):
1. Expand Data Source drop down list
2. Select Add New Data Source

Connecting To Data Sources

Page 170

1.1 Set plug-in type for new data source

2. Specify the name and credentials for the new profile

Next, perform the following steps to define your new connection profile:
1. Pick 'Google Analytics' as the Plug-in
2. Specify a name for your new Plug-in
3. Specify your Google Analytics Username
4. Provide the password for the specified username
5. Optionally specify number of threads per trigger

Connecting To Data Sources

Page 171

Save

3. Plugin Data Source Editor will display for additional input

Click on the gear icon next to the Profile ID variable

4. Enter your Profile ID from Google Analytics

Enter your Profile ID in the pop-up window.

Connecting To Data Sources

Page 172

4.1 Finding your Google Analytics Profile ID: Step 1

Login to your Google Analytics account and click the Admin link (top right-hand corner) to access
your Admin page. Then choose the profile that you want.

4.2 Finding your Google Analytics Profile ID: Step 2

Click the Profile Settings tab and find your Profile ID near the top of the tab

Connecting To Data Sources

Page 173

4.3 Enter Profile ID into Plug-in Data Source parameter

5. Connecting the Data Source to your Elements


You are now ready to connect your new data source to a metric or report

Connecting To Data Sources

Page 174

How to fetch data from Google Analytics


1. Assign the Profile to your New Element

Next, close the profile, return to the Element Editor, and set the newly defined profile as the source
for your metric/report.

Connecting To Data Sources

Page 175

2. Define the Google Analytics Query

You can define your Google Analytics query by either:


1. Typing it into the Plug-In command field
2. Clicking on the 'Query Builder' link to build the query using drop-downs
The Query Builder includes drop-downs and examples to facilitate the construction of your query.
If you're not familiar with Google Analytics, we suggest that you start there.

Connecting To Data Sources

Page 176

3. Query Builder makes it easy to define complex Google Analytics queries

The Query Builder provides an example for each entry. Just click on the desired box and the
example will appear on the right. The first three boxes (i.e., 'dimensions', 'metrics', and 'segment')
include drop-down menus that will enable you to select entries by clicking your mouse.
Note: The 'segment' box will enable you to select or define a Google Analytics 'advanced
segment', which is a type of filter. It should not be confused with the dimensions that are used in
dimensioned metrics and reports.
You can fetch data incrementally from Google Analytics by referencing :last_measurement_time
for the start-date in the Query Builder. In the example above, the system will fetch visitors by day
incrementally by substituting the last date for which data was collected for the start-time variable in
the query expression.
IMPORTANT: By default, Google Analytics returns data for a relatively brief date range. If you
want a larger date range, a start-date must be specified.
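For illustration, the builder entries for the incremental visitors-by-day example above might look like this (a sketch using standard Google Analytics Core Reporting API names; your metric, dimensions, and profile will differ):
dimensions: ga:date
metrics: ga:visitors
start-date: :last_measurement_time
end-date: today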

Connecting To Data Sources

Page 177

4. Handling dates in Google Analytics

As you know, Google identifies weeks and months by number, e.g., '11'. Metric Insights requires
dates, e.g., 11/1/2014. If you look in the 'Time' section of the Visual Editor, you will see three
special Metric Insights functions which are designed to address this issue:
mi:hour returns a datetime value that is computed by concatenating ga:date and ga:hour
(i.e., the native Google Analytics 'hour' function, which simply returns the hour of the day).
mi:week returns a date value corresponding to the first day of the week that is identified by
ga:week (i.e., the native Google Analytics 'week' function, which simply returns the week of
the year).
mi:month returns a date value corresponding to the first day of the month that is identified
by ga:month (i.e., the native Google Analytics 'month' function, which simply returns the
month of the year).

Connecting To Data Sources

Page 178

Collecting data from Google BigQuery

Connecting To Data Sources

Page 179

Establish connectivity to Google BigQuery


This article describes the process of creating a new Plug-in Data Source for accessing Google
BigQuery so that you can query the data source for a new or existing Metric.
PRE-REQUISITE
Before you use any Google plug-in resources, you must have registered your server with Google.
See Registering Your Server with Google for more information.

1. Create BigQuery plug-in

Find Data Sources item in Admin menu

2. Add New Data Source

Connecting To Data Sources

Page 180

3. Select the plug-in type of data source

4. Define the Profile

1. Specify "Google BigQuery" as the Plugin


2. Provide a Plug-in name
3. Save

Connecting To Data Sources

Page 181

5. Obtain the "Token" parameter

Before you can complete this step, you must have registered your server with Google. See
Registering Your Server with Google for more information.
1. Enter a BigQuery Job Id for the "job_id_prefix" parameter. This is simply a tag that you will
see in your BigQuery logs so that you can tell which queries are coming from Metric
Insights. You can leave it as the default value or change it if you like. The value will not
affect your connection.
2. Click the Gear icon for "token" parameter

Connecting To Data Sources

Page 182

5.1 Login using credentials for Google BigQuery account

1. Confirm by clicking Allow Access

5.2 Note the Token ID

1. Notice that a token ID appears in the Parameters grid in the Value column
2. Save changes

Connecting To Data Sources

Page 183

6. Check if profile is valid in the Metric Editor

Open the Metric editor

1. Set BigQuery plug-in as Data Source for the element
2. Click the Refresh link next to the Project field and choose a project from the drop-down
3. Enter a statement into the Plugin Command box (see the example below).
4. Click the Validate Plug-in Command link to run it against Google BigQuery
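For example, a simple aggregate query of the kind you might validate here (a sketch only; the dataset, table, and column names are hypothetical):
SELECT DATE(created_at) AS day, COUNT(*) AS signups
FROM my_dataset.signups
GROUP BY day
ORDER BY day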

Connecting To Data Sources

Page 184

How to fetch data from Google BigQuery


1. Add a new Element

1. Open the Admin Menu on your Home Page


2. Click "Add an Element"

2. Select type of the element and use wizard for creation

Connecting To Data Sources

Page 185

3. Provide main information

Set Measurement interval (1) and Measure of (2) for metric. Then press Collect data (3).
If you don't already have a Measure, see Define Your First Measure

Connecting To Data Sources

Page 186

4. Prepare metric for Data Collection

1. Find Google BigQuery plug-in in Data Source list


2. Set Trigger
3. Select Project from drop-down list
4. Input Plug-in Command in the text box

Connecting To Data Sources

Page 187

5. Validate plug-in command

6. Enter Last Measurement Time to specify what data points should be fetched

Connecting To Data Sources

Page 188

7. Collect Data

1. If validation finishes successfully, you will see how many records were fetched; the information
message is displayed in green. If an error occurs during validation, the error message is
displayed in red.
2. Set the time point from which data should be collected.
3. Press Collect Data

Connecting To Data Sources

Page 189

8. Fetch is completed

You will get a message showing how many values were recorded. Press the Next button to generate a Preview

Connecting To Data Sources

Page 190

9. Metric creation is finished

Check "Make Visible" flag is set to true and click Enable metric to add tile on your Home Page

Connecting To Data Sources

Page 191

Sourcing Data from Google Calendar

Connecting To Data Sources

Page 192

How to establish connectivity to Google Calendar


In order to retrieve Event data from a Google Calendar, you must have a Plug-in Data Source.
This article covers the creation process.
PRE-REQUISITE:
Before you use any Google plug-in resources, you must have registered your server with Google.
See Registering Your Server with Google for more information.

1. Define a new Data Source

1. Select Data Sources from Admin menu to open the Data Sources grid
2. Click Add New Data Source

Connecting To Data Sources

Page 193

1.1 Choose Type

1. Choose "Plug-in"
2. Click Next step to open the Add Plug-in Data Source pop-up

1.2 Define initial settings

1. Plug-in: Select "Google Calendar" from the drop-down list


2. Name: Enter a title for your new Google Calendar plug-in
3. Click Save to open the Edit Plug-in Data Source Editor

Connecting To Data Sources

Page 194

1.3 Complete Definition

1. Click the Edit icon for the token variable in the Parameters section.
2. On the Pop-up window that appears, view/choose your Google Calendar account and
sign in

Connecting To Data Sources

Page 195

1.4 Allow application to use data

Click Accept

1.5 Test connection to be confident of correct settings

Click Test Connection to receive a validation confirmation

Connecting To Data Sources

Page 196

How to fetch data from Google Calendar


An Event Calendar can be populated automatically based on entries in a Google Calendar. With
this capability, a user can maintain a set of planned events in a Google Calendar (e.g., promotions
or seasonal sales) and have them automatically appear in Metric Insights.
PRE-REQUISITE:
You must have created a Google Calendar Plug-in set to access your Google Calendar to serve
as your Event Source.
RELATED ARTICLES
For more information on Event Calendars and how they apply to Metrics, see:
Understanding Events
Understanding Key Events
Associate an Event Calendar with a Metric

1. Open the Admin drop-down

1. Select More

Connecting To Data Sources

Page 197

1.1 Find Event Calendars in Admin Menu

1. Select "Event Calendars" to open the Event Calendar grid page

Connecting To Data Sources

Page 198

2. Create New Event Calendar

1. Click Add New Event Calendar icon or button
2. Name: Enter a title for your calendar (appears as a legend on the associated Chart View)
3. Event Source: Select your Google Calendar's plug-in from the drop-down list
4. Calendar: Select the appropriate Google Calendar from the available calendars in your
Google account
5. Save the new Event Calendar to open its Event Calendar Editor

3. Format the Plug-in Command - Option 1 - Using the 'FIELDS' statement

1. The FIELDS statement is used to fetch data from the Calendar. The following statement selects all
the columns:
FIELDS: start, end, name, description, COUNT(*);


2. The START_TIME/END_TIME clauses are used to restrict data by time values that are taken
from the start/end columns.
START_TIME: 2013-01-01;
END_TIME: 2013-10-10;
3. Calendar Data can be filtered by STATUS clause: confirmed, tentative, cancelled. The following
statement selects only cancelled events
STATUS: cancelled;
4. GROUP_BY_DATE_SCOPE is used if the COUNT(*) field has been selected in the FIELDS statement.
It helps to group data by date field, as in the combined example below.
GROUP_BY_DATE_SCOPE: day;
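Putting these clauses together, a complete Plug-in Command might look like the following (the dates and status are arbitrary examples):
FIELDS: start, COUNT(*);
START_TIME: 2013-01-01;
END_TIME: 2013-10-10;
STATUS: confirmed;
GROUP_BY_DATE_SCOPE: day;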

Connecting To Data Sources

Page 200

4. Format the Plug-in Command - Option 2 - Using the Visual Editor

After making your selections using checkboxes and calendar icons, click Save to generate your
command and place it in the Plug-in Command text box

Connecting To Data Sources

Page 201

5. Complete the Event Calendar definition

After making any necessary edits to the Event Calendar's settings:


1. Click Validate Plug-in Command and review results
2. After the Command has been successfully validated, click Collect Data to fetch Event
data
Once you have finished editing and saved all of your changes, the Event Calendar is ready to be
used and associated with one or more Metrics

Connecting To Data Sources

Page 202

Sourcing Data from a Google Spreadsheet

Connecting To Data Sources

Page 203

Establish connectivity to Google Spreadsheets


In order to retrieve data from a Google Spreadsheet, you must have a Plug-in Data Source.
This article covers the creation process.
PRE-REQUISITE:
Before you use any Google plug-in resources, you must have registered your server with Google.
See Registering Your Server with Google for more information.

1. Define a new Data Source

Select Data Sources from Admin menu to open the Data Sources grid

Connecting To Data Sources

Page 204

1.1 Add new Data Source

Click Add New Data Source

Connecting To Data Sources

Page 205

1.2 Choose Type

1. Choose "Plug-in"
2. Click Next step to open the Add Plug-in Data Source pop-up

1.3 Define initial settings

1. Plug-in: Select "Google Spreadsheet" from the drop-down list


2. Name: Enter a title for your new Google Spreadsheet plug-in

Connecting To Data Sources

Page 206

3. Click Save to open the Edit Plug-in Data Source Editor

1.4 Complete Definition

1. Click the Edit icon for the token variable in the Parameters section.
2. On the Pop-up window that appears, view/choose your Google account to get access to
Google Drive with your spreadsheets and sign in

Connecting To Data Sources

Page 207

1.5 Allow application to use data

Click Accept

Connecting To Data Sources

Page 208

1.6 Test connection to be confident of correct settings

Click Test Connection to receive a validation confirmation

Connecting To Data Sources

Page 209

Collect data from a Google Spreadsheet


A Metric Insights element can be populated automatically based on data fetched from a Google
Spreadsheet.
PRE-REQUISITE:
You must have created a Google Spreadsheet Plug-in that is set to access Google
Spreadsheets to serve as your Data Source.

1. Create Report

1. Click Add an Element from Admin menu


2. Select Report
3. Click Next Step

Connecting To Data Sources

Page 210

2. Set general parameters

1. Enter parameters for this report


2. Click Save

Connecting To Data Sources

Page 211

3. Adjust Data Collection section

1. Assign Google Spreadsheet profile as Data Source


2. Select the source spreadsheet from available files fetched by plug-in
3. Click Visual Editor link to select columns to display

Connecting To Data Sources

Page 212

4. Select data to be displayed

1. Select the fields for your element
2. Optionally select the Aggregate functions to apply to each selected field
3. Optionally add Where criteria
4. Optionally add Group By criteria
5. Click Save

Connecting To Data Sources

Page 213

5. Validate Plug-in Command

Click Validate Plug-in Command

Connecting To Data Sources

Page 214

Sourcing Data from Graphite

Connecting To Data Sources

Page 215

Establish Connectivity to Graphite


An Administrator can use the process described in this article to create a new Plug-in Data
Source that is required to allow Elements to fetch data from Graphite to create a visualization in
Metric Insights.

1. Go to Data Source editor

Connecting To Data Sources

Page 216

2. Add New Data Source

3. Select "Plug-in" as Data Source Type

Connecting To Data Sources

Page 217

4. Select 'Graphite' from Plug-in picklist and input meaningful Name

5. Plug-in Data Source Editor will open

1. Click on gear icon to set host parameter.

Connecting To Data Sources

Page 218

6. Add host parameter to profile

After providing "host" parameter, your Plug-in Data Source will be ready to use in creation of new
Elements.

Connecting To Data Sources

Page 219

Sourcing Data from IBM CoreMetrics

Connecting To Data Sources

Page 220

Establish connectivity to IBM Coremetrics


If you are creating metrics or reports with data from IBM Coremetrics and you have not yet
established a connection to IBM Coremetrics from Metric Insights, you must first establish this
connectivity. IBM Coremetrics is a plug-in, so connecting to this source requires that you define a
new Plug-in Connection Profile.

1. Open Plugin Connection Profile Editor

Start by creating your new metric or report and then clicking the List icon next to the 'Data Source'
field.

Connecting To Data Sources

Page 221

1.1 Add New Data Source

Click on the 'Add New Data Source' button to define the new Data Source

2. Specify 'Plug-in' as the new data source type

Select the 'Plug-in' option

Connecting To Data Sources

Page 222

3. Specify the name and credentials for the new data source

Next, perform the following steps to define your new Data Source:
1. Pick 'IBM Coremetrics' as the Plug-in
2. Specify a name for your new Plug-in
3. Specify your IBM Coremetrics Username
4. Provide the password for the specified username

Then click 'Save'

Connecting To Data Sources

Page 223

4. Enter your Coremetrics Client ID

When you click the Save button in the previous step, you'll be taken to the Plug-in editor. You'll
need to provide your Coremetrics Client ID here. To do so, click the gear icon next to the 'Value'
field.

5. Enter your Coremetrics Client ID (Step 2)

Enter your Coremetrics Client ID and click the 'Save' button

6. Connecting the Data Source to your Elements


You are now ready to connect your new data source to a metric or report

Connecting To Data Sources

Page 224

Sourcing Data from Marketo

Connecting To Data Sources

Page 225

Establish Connectivity to Marketo


An Administrator can use the process described in this article to create a new Plug-in Data Source
that is required to allow Elements to fetch data from Marketo to create a visualization in Metric
Insights.

1. Go to Data Source list

Connecting To Data Sources

Page 226

2. Add New Data Source

3. Select "Plug-in" as Data Source Type

Connecting To Data Sources

Page 227

4. Select 'Marketo' from Plug-in picklist and input meaningful Name

Save

5. Plug-in Data Source Editor will open

1. Click on gear icon to set Endpoint parameter


2. Click on gear icon to set Secret parameter

Connecting To Data Sources

Page 228

3. Click on gear icon to set User ID parameter.

6. Add Endpoint parameter to profile

7. Add Secret parameter to profile

8. Add User ID parameter to profile

Connecting To Data Sources

Page 229

9. Save Marketo Plug-in

Connecting To Data Sources

Page 230

Sourcing Data from Mixpanel

Connecting To Data Sources

Page 231

How to Collect Data using Mixpanel Plug-in


The Mixpanel plug-in allows you to retrieve information from its tool that tracks "events" on
websites and segments them by such dimensions as source, campaign, medium, and keyword. Using
the Metric Insights plug-in allows you to use this data to create visualizations easily using our
robust Metrics and Reports.

1. Assign the Data Source to your Element

1. Select a Mixpanel Plug-in


2. Enter the fetch Command
3. Validate Plug-in Command

1.1

1. Enter the Substitution date parameter used to validate the command

Connecting To Data Sources

Page 232

Click Select

2. Review Validation Results

1. Check to ensure that data was fetched during the Validation process
2. Use the Calendar to choose the start date for data collection
Click Collect Data

Connecting To Data Sources

Page 233

2.1 Check fetch results

1. Ensure that values were collected so that a Chart can be generated


2. Consider impact of any fetch feedback or error messages
When you have finished configuring your Metric:
Click Enable & Publish to display your Metric Viewer

Connecting To Data Sources

Page 234

Sourcing Data using OLAP

Connecting To Data Sources

Page 235

Establish Connectivity to OLAP


An Administrator can use the process described in this article to create a new Plug-in Data
Source. It is required to allow Elements to fetch data using OLAP.

1. Go to Data Source editor

Connecting To Data Sources

Page 236

2. Add New Data Source

3. Select "Plug-in" as Data Source Type

Connecting To Data Sources

Page 237

4. Select 'OLAP' from Plug-in picklist and input meaningful Name

1. Input Username
2. Provide Password
3. Click the Save button

Connecting To Data Sources

Page 238

5. Plug-in Data Source Editor will open

1. Click on gear icon to set URI parameter.

Connecting To Data Sources

Page 239

6. Provide URI Parameter

1. Provide URI parameter.


2. And Save.
Your OLAP Plug-in is ready to use.

Connecting To Data Sources

Page 240

How to Collect Data using OLAP Plug-in


This article will show you how to create an Element (Report in this case) using an OLAP plug-in as
a Data Source. It assumes that you have already established connectivity to OLAP.

1. Add new Element

Connecting To Data Sources

Page 241

2. Choose OLAP plug-in as a Data Source for new element

After providing basic Report Information, configure Report Calculation settings.


1. Select OLAP plug-in as a Data Source.
2. Input your Plug-in Command.
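For illustration only, if your OLAP source accepts MDX (an assumption; the cube, measure, and hierarchy names below are hypothetical), a command might look like:
SELECT [Measures].[Sales Amount] ON COLUMNS, [Date].[Month].Members ON ROWS FROM [Sales]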

Connecting To Data Sources

Page 242

3. Publish your Element

1. Validate your Plug-in Command. Upon successful validation you will see sample results
under the 'Validate' button.
2. If you want to continue configuring the Element, click the Save button and keep
configuring.
3. When configuration of the Element is finished, Enable & Publish it.

Connecting To Data Sources

Page 243

Sourcing Data from Omniture

Connecting To Data Sources

Page 244

Establish Connectivity to Omniture


Omniture is an online marketing and web analytics tool offered by Adobe Systems. As of 2012,
Adobe began the process of retiring the Omniture name, since former Omniture products were
integrated into the Adobe Marketing Cloud.
Omniture is designed to collect usage statistics across product lines. Creating a plug-in to
Omniture allows data collected therein to be fetched by Metric Insights through the plug-in
connector and used to populate Metrics and Reports.
PREREQUISITE
You must be an Administrator to establish connectivity to Omniture plug-in

1. Create Omniture plug-in

1. Find Data Sources item in the Admin menu

Connecting To Data Sources

Page 245

2. Add New Data Source

1. Click Add New Data Source to add new plug-in

3. Set type of new Data Source

1. Set Plug-in type of data source


Click Next Step

Connecting To Data Sources

Page 246

4. Select Omniture type in the drop-down and input meaningful Name

5. Plug-in Data Source Editor

1. Enter Username
2. Set Password

Connecting To Data Sources

Page 247

6. Save Omniture plugin

7. Check your Adjustments

Using the Report Editor:

Connecting To Data Sources

Page 248

1. Set Data Source - Omniture Reporting plugin
2. Input a simple command in the Plug-in Command text box
3. Press Validate
If the command is validated successfully, you will see a Sample Result Set

Connecting To Data Sources

Page 249

Sourcing Data from QlikView

Connecting To Data Sources

Page 250

Getting ready for using Qlik with Metric Insights


This article gives you a quick overview of what you'll need before you can use your Qlik application
with Metric Insights. This includes both QlikView and Qlik Sense applications.

1. Setup a Metric Insights server


Metric Insights is a web application that runs on a real or virtual server, and it can be hosted in the
cloud or it can be hosted on premise in your private network.

1.1 Host in cloud


We can host the Metric Insights application for you in the Amazon cloud.
Alternately, you can run Metric Insights in your private Amazon cloud. If you host in Amazon, then
you will need an EC2 instance.
Both options work well when running a trial of the application. More details are included in
Deployment options.

1.2 Host on premise


You also have the option of running Metric Insights on premise. We have an OVA file that you can
launch in VMware. All you need is a dedicated machine with VMware installed. This option works
well when running a trial of the application. More details are included in Deployment options.
Specific details and limitations on running Metric Insights on VMware or any virtualization
environment are documented here.
For production installation, you can also run Metric Insights on bare metal. This requires special
install scripts to set up, depending on the configuration of your bare metal environment.

2. Access to Qlik server


2.1 QlikView
For QlikView you will need access to the QlikView server. Metric Insights requires you to install a
small agent on the QlikView server. This agent communicates with QlikView via its API and returns
data to Metric Insights.
A port (8443) will need to be opened for communication back from the agent on the QlikView
server to Metric Insights.
The agent is a Java / .NET program that is a few MB in size. It is referred to as a Remote Data
Collector and we have detailed instructions on how to install it.

Alternate Approach

Connecting To Data Sources

Page 251

An alternate approach is to host the Metric Insights agent on another Windows server in the same
network as QlikView. In this configuration the agent communicates with QlikView server over ports
80 and 443 (http/https), and also port 4747 (using "qvp" - the QlikView protocol).
A port (8443) will need to be opened for communication back from the agent on the Windows
server to Metric Insights.

Collecting data
Finally, once set up, pulling data from QlikView is easy. You configure a data connection in Metric
Insights, then begin collecting data.

2.2 Qlik Sense


For Qlik Sense, Metric Insights accesses the Qlik Sense server via the same ports and protocol
that your users use to access Qlik Sense in a web browser - port 80 (http) and 443 (https). In addition, Metric
Insights accesses the Qlik Sense server via port 4747 for communication via the Qlik Sense WebSocket
API, and port 4244 for NTLM authentication.
For explanation of the Qlik Sense server ports refer to Qlik Sense documentation.

Connecting To Data Sources

Page 252

Establish Connectivity to QlikView


This article describes the process of creating plug-in Data Source to connect to QlikView. This
Data Source will allow data from existing QlikView objects to be used in building elements using
Metric Insights tools.
PRE-REQUISITE
Your Metric Insights instance must be configured to support QlikView.

1. Go to Data Sources editor

Connecting To Data Sources

Page 253

2. Add new Data Source

3. Set type of new Data Source

Connecting To Data Sources

Page 254

4. Define the basics

1. Select Qlikview from the plug-in pick list


2. Enter Unique Name for your plug-in
3. External reports fetch method: Choose "Automatically" and Metric Insights will
automatically discover all Qlikview objects on Qlikview server and list them all. Choose
"Manually" if you want to manually establish the list of Qlikview objects from which you
want to pull data. You can either enter them one at a time or in bulk via upload from a csv
file.
4. Enter the Username and Password required for this Plug-In. The Username must be in the same
format that your QlikView server uses for authentication. For example, with Active Directory
you typically include the Domain, like this: <Domain>\<Username> (e.g., corp\edun)
5. Save

Connecting To Data Sources

Page 255

5. Edit the Data Source's Parameters

The Parameters section is where you specify how to connect to QlikView. The parameters include:
1. Objects In Containers - specify the types of QlikView objects to pull data from (e.g.,
graph, table, text)
2. Objects In Sheets - specify the types of QlikView objects to pull data from (e.g., graph,
table, text)
3. Path to server folder - if pulling from the file system then specify which folder contains the
.qvw files. Not used if pulling via 'server'
4. QVW File Filter List (Comma-Separated) - narrow the list of QlikView documents. Wild
card (*) is allowed
5. URI Scheme - how to access the QlikView Server (either http or https)
6. server - the host name of the QlikView server (e.g., localhost, qlikview.metricinsights.com,
...). Leave blank if pulling .qvw documents via the file system
7. Click through server - the URL to use for creating UI links back to your QlikView server
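For example, a typical set of parameter values when pulling through the QlikView server might be (a sketch; the server name and file filter are hypothetical):
URI Scheme: https
server: qlikview.metricinsights.com
Objects In Sheets: graph, table, text
QVW File Filter List (Comma-Separated): Sales*.qvw
Path to server folder: (leave blank when pulling via the server)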

Connecting To Data Sources

Page 256

5.1 Enter the 'server' address

1. Click the gear icon to edit server parameter.


2. Enter server address. If Remote Data Collector runs on the QlikView server then enter
'localhost'; otherwise if it runs on a different machine then enter the QlikView server name
(e.g., qlikview.metricinsights.com)
3. Save
Optional: You can leave the server parameter blank. If you do then the plugin pulls QVW data
from the file system instead of going through QlikView server. See next section for instructions.

5.2 Enter the 'server' address - pull .qvw document data from file system

Pull QVW data from the file system instead of going through QlikView server
1. If you leave the 'server' parameter blank then the plugin pulls QVW data from the file system
instead of going through QlikView server.

Connecting To Data Sources

Page 257

2. For this case, you will want to specify the location of the .qvw files via the Path to server folder
parameter.

6. Configure Remote Data Collector for your QlikView server


To fetch data automatically you have to configure Remote Data Collector. General instructions on
setting up Remote Data Collectors can be found here.

7. Set user credentials when pulling .qvw file data via file system

Pull QVW data from the file system instead of going through QlikView server
If you specified the 'server' parameter with the QlikView server name, then you can skip this step.
If you left the 'server' parameter blank, so that you will pull .qvw document data via the file system,
then you will want to do this step.
You specify the user credentials in the Remote Data Collector. This user is used for accessing
the .qvw documents on the file system.
On the machine where the Remote Data Collector is installed, open the 'Metric Insights Daemon'
Windows Service
1. Set the User in Log On menu. The default user is the local system account. You can enter a
different user here. You will then need to restart the Windows Service for it to take effect.

Connecting To Data Sources

Page 258

How to Collect Data using QlikView Plug-in


This article will show you how to create an Element (Report in this case) using a QlikView plug-in
as a Data Source. It assumes that you have already established connectivity to your QlikView
server.

1. Add new Element

Connecting To Data Sources

Page 259

2. Choose QlikView plug-in as a Data Source for new element

After providing basic Report Information, configure Report Calculation settings.


1. Select QlikView plug-in as a Data Source.
2. Select the Object you want to fetch data from.
3. Input your Plug-in Command manually, OR
4. Use Visual Editor.

Connecting To Data Sources

Page 260

3. Qlik Query Builder is called by Visual Editor link

You can select fields in the Included Fields section, and filter or sort the selection using the
Filter on and Sort By options.
You can also select a date field from the list, or construct your own with the Construct Date Field option.

Connecting To Data Sources

Page 261

4. Publish your Element

1. Validate your plug-in command.


2. Upon successful validation, Save to see the Sample Result Set.
3. If your Sample Result Set is satisfactory, Enable & Publish the report

Connecting To Data Sources

Page 262

Sourcing Data using RSS

Connecting To Data Sources

Page 263

Establish connectivity to RSS


1. Go to Data Source Editor

Connecting To Data Sources

Page 264

2. Add New Data Source

3. Select "Plug-in" as data source type

Connecting To Data Sources

Page 265

4. Select 'RSS' from Plug-in picklist

5. Additional fields will be displayed for entry

1. Enter a unique name for your Data Source.

2. Enter the number of data collection threads that can be run concurrently for this Data
Source.
3. Save.

Connecting To Data Sources

Page 266

6. Add the RSS Feed Url you want to fetch data from

1. Click the Gear icon to specify the RSS Feed Url

2. Save changes

Connecting To Data Sources

Page 267

Collect data from RSS


This article will show you how to create a Metric or Internal Report using an RSS feed as a data source.
It assumes that you have already established connectivity to the RSS feed you want.

1. Create a new element based on your RSS plug-in data source

Connecting To Data Sources

Page 268

2. Create the plug-in command

1. Specify plug-in command
2. Click on 'Validate Plug-in Command' button
3. Review results of validation
4. Click on 'Collect data' link to collect data

Notes on Syntax
A query builder will be available shortly to help you build queries. These are the fields that are available:
'pubDate by Day', type 'datetime'
'Number of records', type 'numeric'
'title', type 'text'
'link', type 'text'
'description', type 'text'
'category', type 'text'
'pubDate', type 'datetime'
'guid', type 'text'
Here is an example of how to return all fields:
{
"select":["*"]
}
Here is an example of how to select the number of articles that contain a string of text
(value: your search term(s) here) along with the publication date (pubDate):
{
"select":["pubDate","COUNT(*)"],
"where":[{"column":"title","condition":"contains","value":"say"}],
"group":["pubDate GRANULARITY DAY"]
}
Separate multiple where conditions with commas:
{
"select":["pubDate","COUNT(*)"],
"where":[{"column":"title","condition":"contains","value":"Amazon"},
{"column":"description","condition":"contains","value":"customers"}],
"group":["pubDate GRANULARITY DAY"]
}
The values are not case sensitive; e.g., Amazon and amazon will return the same result. The
"contains" condition treats multiple terms as an exact phrase match. For the title
Amazon increases customers worldwide you can include
{"column":"title","condition":"contains","value":"Amazon increases"}
but an inexact phrase such as
{"column":"title","condition":"contains","value":"Amazon customers"}
will not work. To do this you need to create separate conditions:
{
"select":["pubDate","COUNT(*)"],
"where":[{"column":"title","condition":"contains","value":"amazon"},
{"column":"title","condition":"contains","value":"customers"},
{"column":"description","condition":"contains","value":"netflix"}],
"group":["pubDate GRANULARITY DAY"]
}

3. After successful data collection, enable and publish your element

1. Review results of data collection

2. Enable and publish your element

Connecting To Data Sources

Page 271

Sourcing data from Salesforce

Connecting To Data Sources

Page 272

What is the difference between SalesForce plug-in and SalesForce SOQL plug-in?

There are two ways to get data from SalesForce - using the SalesForce plug-in or the SalesForce SOQL
plug-in. The differences between these plug-ins are described below.

1. SalesForce plug-in
SalesForce plug-in allows you to create elements using Visual editor.
The main distinctions of using SalesForce plug-in are:
You don't need to know SOQL (Salesforce Object Query Language) to create plug-in
commands;
You can use Visual editor;
BUT, you need to have existing reports in SalesForce as a source.
See more information here

2. SalesForce SOQL plug-in


SalesForce SOQL plug-in allows you to create elements by adding plug-in commands manually.
The main distinctions of using SalesForce SOQL plug-in are:
You can easily create new element - just type-in your plug-in command;
You don't need additional set up before creating new element;
BUT, you have to know SOQL (Salesforce Object Query Language) to create plug-in
commands.
See more information here

Connecting To Data Sources

Page 273

Establish connectivity to Salesforce


An Administrator can use the process described in this article to create a new Plug-in Data
Source that is required to allow Elements to fetch data from SalesForce to create a visualization in
Metric Insights.

1. Create Salesforce plug-in

Find Data Sources item in Admin menu

2. Add New Data Source

Connecting To Data Sources

Page 274

3. Set type of new Data Source

Set plug-in type of data source and press Next Step button

4. Select Salesforce type in the drop-down and input meaningful Name

Connecting To Data Sources

Page 275

5. Plug-in Data Source Editor

1. Enter Username
2. Set Password
3. Click the Gear icon for the token parameter

Connecting To Data Sources

Page 276

6. Add token parameter to profile

You will need a 'security token' in order to use the Salesforce plugin. You can get it from the
Salesforce browser interface. Instructions are available in Salesforce.

7. Save Salesforce plugin

Connecting To Data Sources

Page 277

8. Check your Adjustments

Go to the Metric Editor
1. Set Data Source - SalesForce plugin
2. Input a simple command in the text box
3. Press Validate
If the command is validated successfully, you will see a message showing how many records are returned.

Connecting To Data Sources

Page 278

How to collect data from Salesforce


1. Create an element using Salesforce plug-in

2. Select type of the element and use wizard for creation

Connecting To Data Sources

Page 279

3. Provide main information

Set Measurement interval (1) and Measure of (2) for metric. Then press Collect data (3).

Connecting To Data Sources

Page 280

4. Prepare metric for Data Collection

1. Find SalesForce plug-in in Data Source list


2. Set Trigger
3. Select Report from drop-down list
4. Input Plug-in Command manually or using Visual Editor

Connecting To Data Sources

Page 281

5. SalesForce Query Builder is called by Visual Editor link

You may select a Field and set an Expression for it; Fetch options are set below the table of fields

Connecting To Data Sources

Page 282

6. Validate plug-in command

7. Enter Last Measurement Time to specify what data points should be fetched

Connecting To Data Sources

Page 283

8. Collect Data

1. If validation finishes successfully, you will see how many records were fetched; the information
message is displayed in green. If an error occurs during validation, the error message is
displayed in red.
2. Set the time point from which data should be collected.
3. Press Collect Data


9. Fetch is completed

You will get a message indicating how many values were recorded. Press the Next button to generate a Preview


10. Metric creation is finished

Check "Make Visible" flag is set to true and click Enable metric to add tile on your Home Page


Sourcing data from Salesforce SOQL


Establish connectivity to Salesforce SOQL


An Administrator can use the process described in this article to create a new Plug-in Data
Source that is required to allow Elements to fetch data using SalesForce SOQL to create a
visualization in Metric Insights.

1. Go to Data Source editor


2. Add New Data Source

3. Select "Plug-in" as Data Source Type


4. Select 'SalesForce SOQL' from the Plug-in picklist and input a meaningful Name


5. Plug-in Data Source Editor will open

1. Click the gear icon to set the token parameter. After you click the icon, the system will automatically
request a SalesForce SOQL token, and upon successful validation the token will appear in the
"Value" column. See our article on viewing and updating your SalesForce token here.
2. Then Save the SalesForce SOQL plug-in.


Setting up Salesforce OAuth


In order to use the Salesforce SOQL plug-in, you will need to log in to the Salesforce browser
interface and set up an OAuth connection. This article summarizes the key steps. Detailed
instructions can be found in the Salesforce support area. (See Creating a Connected App.)
Note: The 'regular' Salesforce plugin does not require an OAuth connection. You can set up a
connection using that plugin simply by inputting your Salesforce username and password, plus
your 'security token' (which is essentially an extra password, obtainable from the Salesforce
browser interface).

1. Login to Salesforce

1. Go to the "Setup" menu:


2. Use the App Setup Section in the Left Sidebar

1. Choose "Apps" in the Create Apps sub-section of App Setup


2. Click the "New" button in the Connected Apps section


3. Client ID and Client Secret

When you click "Edit" you will see your Consumer Key (ID) and your Consumer Secret. You need
to add your SalesForce Client ID and Client Secret to the const.php file
(/var/www/iv/engine/config/const.php):
define('SALESFORCE_OAUTH2_CLIENT_ID',
'3MVG9y6x0357Hlefte5xC92_J.LmfhKJ0Z_...5JydLTMG5bh6eosdfWERZjDOMbc_2FWeBclda5gD6');
define('SALESFORCE_OAUTH2_CLIENT_SECRET', '79282882...');
Note: These values are client-specific.


4. Enable Metric Insights to Refresh Your Token

In the API (Enable OAuth Settings) section:


1. Add "Perform requests on your behalf at any time (refresh_token, offline_access)" to
Selected OAuth Scopes.
2. Save


How to collect data from Salesforce SOQL


This article will show you how to create an Element using a SalesForce SOQL plug-in as a data
source. It assumes that you have already established connectivity to SalesForce SOQL.

1. Add new Element


2. Choose SalesForce SOQL plug-in as a Data Source for the new element

1. Choose SalesForce SOQL as your Data Source.
2. Input your Plug-in Command (see the example below).
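A Plug-in Command for this Data Source is written in SOQL. As an illustration only (it assumes the standard Opportunity object and its standard fields exist in your Salesforce org), a command might look like:
SELECT CloseDate, COUNT(Id) FROM Opportunity WHERE StageName = 'Closed Won' GROUP BY CloseDate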


3. Publish your new element

1. Validate your plug-in command


2. Then Collect data. You will see the total number of collected records under the 'Validate'
button.
3. Enable & Publish your Metric.


What is the difference between SalesForce plug-in and SalesForce SOQL plug-in?
There are two ways to get data from SalesForce - using the SalesForce plug-in or the SalesForce SOQL
plug-in. The differences are detailed below.

1. SalesForce plug-in
The SalesForce plug-in allows you to create elements using the Visual editor.
The main distinctions of the SalesForce plug-in are:
You don't need to know SOQL (Salesforce Object Query Language) to create plug-in
commands;
You can use the Visual editor;
BUT, you need to have existing reports in SalesForce as a source.
See more information here

2. SalesForce SOQL plug-in

The SalesForce SOQL plug-in allows you to create elements by adding plug-in commands manually.
The main distinctions of the SalesForce SOQL plug-in are:
You can easily create a new element - just type in your plug-in command;
You don't need additional setup before creating a new element;
BUT, you have to know SOQL (Salesforce Object Query Language) to create plug-in
commands.
See more information here


Sourcing Data from Tableau


Hosting choices for Tableau Server


If you have not already decided how you are hosting the Metric Insights application then this article
can give you some guidance.
You have several options for hosting the Metric Insights application and pulling data from your
Tableau Server. You can host Metric Insights in the Amazon EC2 cloud, or you can host it in VMware.
You can also host it on your own bare-metal hardware, but this article does not cover that option.

1. Metric Insights in our cloud


Hosting Metric Insights in our Amazon EC2 is the easiest method.

1.1 Tableau Server open to internet


If your Tableau server is open to the internet for web browser access, then this architecture choice
is the best.
Metric Insights pulls your Tableau data directly from Tableau server via your web login credentials.
No extra ports for Tableau server need to be opened, no extra software needs to be installed. Just
follow the instructions in your Metric Insights application for setting up the connection to your
Tableau server.

1.2 Install remote data collector


If your Tableau server is not open to the internet for web browser access, then you can still use
this architecture choice.
You just install our remote data collector on a server in your network that can access your Tableau
server. Typically you install this on your Tableau server, but you are not limited to that machine.
Any server that has access to your Tableau server is fine.
The remote data collector is a small piece of Java code (a few MB in size) with just enough functionality
to do the following:
1. Get job requests from the Metric Insights server (HTTPS POST via port 8443)
2. Fetch the data from Tableau server (HTTPS GET/POST via port 80/443)
3. Post the result back to the Metric Insights server (HTTPS POST via port 8443)
The remote data collector is just a simplified web browser in this respect. It makes web requests to
the Metric Insights server; it makes web requests to your Tableau server.
Just follow the instructions in your Metric Insights application for setting up the connection to your
Tableau server with a remote data collector.

2. Metric Insights in your hosted cloud


Hosting Metric Insights in your Amazon EC2 is an easy method too.


Once hosted, Metric Insights pulls your Tableau data directly from your Tableau server via your
web login credentials. Just follow the instructions in your Metric Insights application for setting up
the connection to your Tableau server.

3. Metric Insights on VMware


Hosting Metric Insights in VMware is also an option.
Once loaded, Metric Insights pulls your Tableau data directly from your Tableau server via your
web login credentials. Just follow the instructions in your Metric Insights application for setting up
the connection to your Tableau server.


Hosting choices for Tableau Online


If you have not already decided how you are hosting the Metric Insights application then this article
can give you some guidance.
You have several options for hosting the Metric Insights application and pulling data from Tableau
Online.

1. Metric Insights in our Cloud


Hosting Metric Insights in our Amazon EC2 is the easiest method.
Under this configuration, Metric Insights pulls your Tableau data directly from Tableau Online via
your Tableau Online login credentials. Just follow the instructions in your Metric Insights
application for setting up the connection to your Tableau Online data.

2. Metric Insights in your hosted Cloud


Hosting Metric Insights in your Amazon EC2 is an easy method too.
Under this configuration, Metric Insights pulls your Tableau data directly from Tableau Online via
your Tableau Online login credentials. Just follow the instructions in your Metric Insights
application for setting up the connection to your Tableau Online data.


Establish Connectivity to Tableau Server


This article describes the process of connecting to Tableau Server in order to use Tableau reports
as data sources in Metric Insights. If your Tableau Server is accessible from your Metric Insights
server over the network, this is most likely the best solution. General instructions for establishing
a Data Source based on a plug-in can be found here. Specific settings for this approach follow.
Alternative Approaches:
If Tableau is inaccessible because it is behind a firewall or for some other reason, then you
will probably need to connect via a remote data collector.
If you are connecting to Tableau Online, then follow the instructions for establishing
connectivity to Tableau Online.

1. Create a Tableau plug-in data source

1. Access the Data Sources list page
2. Click Add New Data Source
3. Plug-in: Select "Tableau"
4. Use Remote Data Collector: Set to "No". NOTE: If you need to choose "Yes", then
follow instructions on how to connect via a remote data collector.
5. External reports fetch method: Choose "Automatically" and Metric Insights will
automatically discover all Tableau reports on Tableau server and list them all. Choose


"Manually" only if you use a Data Collector or if you have a large list of reports and only
want a few of them. You can either enter them one at a time or in bulk via upload from a
csv file.
6. Save the new Data Source
NOTE: You can always access the Data Source Editor by clicking its Name link in the Data
Sources grid

2. Edit the Data Source's Parameters

Once you have added a Data Source, the Editor opens to allow you to complete additional
settings. Page down to the Parameters Section

2.1 Enter the name of your Tableau server

1. Click the Edit (gear) icon on the "Tableau server" row and
2. Enter the name of your Tableau server on the pop-up
3. Save to place your entry in the Value column


2.2 Enter your Site ID

If your Tableau server has only one site or has multiple sites and you want to pull data from the
Default site, skip this step. Otherwise, if you want to pull data from one of the other (not the
Default) sites configured on a multi-site Tableau server:
1. Click the Edit icon on the "Site ID" row
2. On the pop-up, enter the Site ID. NOTE: In the example above we are pulling data from
the "metricinsights" site configured on the Tableau server.
3. Save the Parameter
IMPORTANT: If your Tableau server has multiple sites, and you want to pull data from more than
one site, then you will need to configure a Tableau Data Source for each of the sites from which
you will extract data.

2.3 Enter Project List


Optionally, enter the Project(s) that your Tableau reports are in. This parameter only applies if you
chose Automatically for External reports fetch method. This parameter will limit the Tableau
reports (views) that are discovered and listed to only those that belong in the Projects on Tableau
server.

2.4 Enter Authentication Method if using Tableau trusted server authentication


Optionally, enter the value trusted for the plugin parameter 'authMethod' if you want to use Tableau
trusted server authentication. This parameter will only work if your Tableau server is set up for
trusted authentication. Also see Using SSO on Tableau Server. You will not need to enter a
password if using this method. If you want more information on Trusted Authentication in Tableau
Server then refer to the documentation at
http://onlinehelp.tableausoftware.com/v8.2/server/en-us/trusted_auth.htm


3. Add Tableau Reports

If you chose Manually for External reports fetch method:


1. Navigate to the Tableau Reports List section
Use one of the methods below to create a list of Reports to be used as sources for data

3.1 Add Tableau reports individually

1. Click Add New External Report


2. Enter the Report ID and Name
3. Save


NOTE: The format for Report ID is "workbook/view" as illustrated in the example above with the
entry "Finance/Taleof100Start-ups"

3.2 Upload a CSV file

1. Click Load From CSV File


2. Browse to choose the comma separated values file that contains the list of Tableau
reports
3. Import
NOTE: The format for the Report ID column in the CSV file is "workbook/view"
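As an illustration only (the exact header row may differ in your version), a CSV file listing two Tableau reports might look like:
Report ID,Name
Finance/Taleof100Start-ups,Tale of 100 Start-ups
Finance/Homepriceandsupply,Home price and supply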


3.3 Determine the Tableau Report ID

To confirm the value you entered for your Tableau reports, sign in to your Tableau server and
navigate to the view that contains your Tableau report. The "workbook/view" part of the url as seen
in the address bar of your web browser contains the value to enter for your Tableau report.
In this example, the "workbook/view" is "Finance/Taleof100Start-ups" (outlined in Red above).
This is the value that you enter as your Tableau Report ID


3.4 Review the list of Reports

Once you have entered or uploaded at least one Tableau Report, the External Reports grid
contains a row for each one


Establish Connectivity to Tableau via a Remote Data Collector

This article describes the process of using a Remote Data Collector to connect to a Tableau
server that is not accessible from your Metric Insights server, usually because it is behind a
firewall. This connection allows you to use Tableau reports as Data Sources in Metric
Insights.
General instructions on setting up data sources based on plug-ins can be found here. The
specific process and settings that apply to using a Remote Data Collector are explained below.
ALTERNATIVE OPTIONS:
If your Tableau Server is accessible over the network from your Metric Insights server, you
can connect to Tableau Server directly.
If you are connecting to Tableau Online, then follow the instructions for establishing
connectivity to Tableau Online.

1. Create a new Tableau plug-in Data Source

1. Access the Data Sources list page


2. Click Add New Data Source


3. Plug-in: Select "Tableau"


4. Use Remote Data Collector: Set to "Yes"
5. External reports fetch method: Choose "Automatically" and Metric Insights will
automatically discover all Tableau reports on Tableau server and list them all. Choose
"Manually" if you want to manually establish the list of Tableau reports from which you want
to pull data. You can either enter them one at a time or in bulk via upload from a csv file.
6. Save
NOTE: You can always access the Data Source Editor by clicking its Name link in the Data
Sources grid

2. Edit the Data Source's Parameters

Once you have added a Data Source, the Editor opens to allow you to complete additional
settings. Page down to the Parameters Section

2.1 Enter the name of your Tableau server

1. Click the Edit (gear) icon on the "Tableau server" row


2. Tableau Server: Enter the name
3. Save


2.2 Enter your Site ID

If your Tableau server has only one site or has multiple sites and you want to pull data from the
Default site, skip this step. Otherwise, if you want to pull data from one of the other (not the
Default) sites configured on a multi-site Tableau server:
1. Click the Edit icon on the "Site ID" row
2. On the pop-up, enter the Site ID value. NOTE: In the example above we are pulling data
from the "metricinsights" site configured on the Tableau server.
3. Save
IMPORTANT: If your Tableau server has multiple sites, and you want to pull data from more than
one site, then you will need to configure a Tableau Data Source for each of the sites from which
you will extract data.

2.3 Enter Project List


Optionally, enter the Project(s) that your Tableau reports are in. This parameter only applies if you
chose Automatically for External reports fetch method. This parameter will limit the Tableau
reports (views) that are discovered and listed to only those that belong in the Projects on Tableau
server.


3. Add Tableau Reports

If you chose Manually for External reports fetch method:


1. Navigate to the Tableau Reports List section
Use one of the methods below to create a list of Reports to be used as sources for data

3.1 Add Tableau reports individually

1. Click Add New External Report


2. Enter the Report ID and Name
3. Save


NOTE: The format for Report ID is "workbook/view" as illustrated in the example above with the
entry "Finance/Taleof100Start-ups"

3.2 Upload a CSV file

1. Click Load From CSV File


2. Browse to choose the comma separated values file that contains the list of Tableau
reports
3. Import
NOTE: The format for the Report ID column in the CSV file is "workbook/view"


3.3 Determine the Tableau Report ID

To confirm the value you entered for your Tableau reports, sign in to your Tableau server and
navigate to the view that contains your Tableau report. The "workbook/view" part of the url as seen
in the address bar of your web browser contains the value to enter for your Tableau report.
In this example, the "workbook/view" is "Finance/Taleof100Start-ups" (outlined in Red above).
This is the value that you enter as your Tableau Report ID


3.4 Review the list of Reports

Once you have entered or uploaded at least one Tableau Report, the External Reports grid
contains a row for each one

4. Configure a Remote Data Collector for your Tableau server


Configure a Remote Data Collector for your Tableau server. General instructions on setting up
Remote Data Collectors can be found here. You can add a new Remote Data Collector either by
choosing the appropriate link under the Admin menu or alternatively, you can click the "Add New
Remote Data Collector" button in the Plug-in Data Source editor.


4.1 Add Tableau data source created in Step 1 to your Remote Data Collector
The instructions on setting up Remote Data Collectors will instruct you to add a new Plug-in
Connection Profile to your remote data collector. Please use the Tableau Plug-in Data Source that
you created in Step 1.

5. Install the Remote Data Collector on your Tableau server


Log in to Metric Insights from your Tableau server, then download the Insightd Zip File, extract the
contents, and install Insightd. General instructions on installing Insightd on Windows servers can
be found here.


6. Start the Metric Insights Daemon

Once you have installed Insightd on your Tableau server, you will need to start the Metric Insights
Daemon. Click the Start button and then open the Services panel.

6.1 Find the Metric Insights Daemon and start the service

Find the Metric Insights Daemon and start the service. Now you're ready to collect data from your
Tableau server!


Establish connectivity to Tableau Online


This article describes how to connect to Tableau Online in order to use reports from Tableau
Online as Data Sources in Metric Insights.
General instructions on setting up data sources based on plug-ins can be found here. The
specific process and settings that apply to using Tableau Online are explained below.
ALTERNATIVE APPROACHES:
If you are connecting to Tableau Server, then follow the instructions for establishing connectivity,
either directly or via a remote data collector.

1. Create a new Tableau plug-in Data Source

1. Access the Data Sources list page
2. Click Add New Data Source
3. Plug-in: Select Tableau
4. Use Remote Data Collector: Set to No since it is not allowed for Tableau Online


5. External reports fetch method: Choose "Automatically" and Metric Insights will
automatically discover all Tableau reports on Tableau server and list them all. Choose
"Manually" if you want to manually establish the list of Tableau reports from which you want
to pull data. You can either enter them one at a time or in bulk via upload from a csv file.
6. Save
NOTE: You can always access the Data Source Editor by clicking its Name link in the Data
Sources grid

2. Enter the Data Source's Parameters

Once you have added a Data Source, the Editor opens to allow you to complete additional
settings. Page down to the Parameters Section

2.1 Identify the Tableau Server

1. Click the Edit (gear) icon on the "Tableau server" row


2. Tableau Server: Enter the name NOTE: At present, the Name of Tableau Online is
https://online.tableausoftware.com
3. Save


2.2 Enter your Site ID

1. Click the Edit icon on the "Site ID" row


2. On the pop-up, enter the Site ID value. NOTE: In the example above we are pulling data
from the Metric Insights site configured on Tableau Online.
3. Save
IMPORTANT: If you have multiple Tableau Online sites, and you want to pull data from more than
one site, then you will need to configure a Tableau Data Source for each of the sites from which
you will extract data.

2.3 Enter Project List


Optionally, enter the Project(s) that your Tableau reports are in. This parameter only applies if you
chose Automatically for External reports fetch method. This parameter will limit the Tableau
reports (views) that are discovered and listed to only those that belong in the Projects on Tableau.

3. Add Tableau Reports

As indicated above, Tableau Online does not support automatic retrieval of the Reports List. So for
your choice of Manually for External reports fetch method:


1. Navigate to the Tableau Reports List section


Use one of the methods below to create a list of Reports to be used as sources for data

3.1 Add Tableau reports individually

1. Click Add New External Report


2. Enter the Report ID and Name
3. Save
NOTE: The format for Report ID is "workbook/view" as illustrated in the example above with the
entry "Finance/Taleof100Start-ups"


3.2 Upload a CSV file

1. Click Load From CSV File


2. Browse to choose the comma separated values file that contains the list of Tableau
reports
3. Import
NOTE: The format for the Report ID column in the CSV file is "workbook/view"

3.3 Review the list of Reports

Once you have entered or uploaded at least one Tableau Report, the External Reports grid
contains a row for each one


3.4 Determine the Tableau Report ID

To confirm the value you entered for your Tableau reports, sign in to Tableau Online and navigate
to the view that contains your Tableau report. The "workbook/view" part of the url as seen in the
address bar of your web browser contains the value to enter for your Tableau report.
In this example, the "workbook/view" is "Finance/Taleof100Start-ups" (outlined in Red above). This
is the value that you enter as your Tableau Report ID


How to collect data from Tableau


This article will show you how to create a Metric or Internal Report using a Tableau report as a
data source. It assumes that you have already established connectivity to your Tableau server.

1. Create a new element based on your Tableau plug-in data source

Create a new element based on your Tableau plug-in data source:


1. Choose Tableau as your Data Source
2. Choose the specific Tableau report that you want to use


2. Use the Visual Editor to create your Plug-in Command

Click the Visual Editor link to bring up a list of available fields for your Metric or Report. (Once you
know the syntax and field names, you can just type them directly into the plug-in command box.)

3. Publish your new element


Validate your plug-in command, collect data, and then publish your new element!


How to create an External Report from Tableau


This article will show you how to create an External Report that is linked to a report from your
Tableau server.

1. Add a 'Tableau' External Report Type

Create a new External Report element. Then:


1. Open the Report Type drop-down menu and select 'Add New Report Type'
2. Give your new report type a name. We've used 'Tableau' here but any name that users can
recognize will work.
3. Your new report will include an image (for the dashboard tile and the preview pane). If you
want the image to be dynamically updated based on the underlying Tableau report, select
the Tableau Plug-in Data Source that you created earlier. Otherwise, you can leave the
'Image Source Plugin' field empty.
Note: The 'Icon URL' field can be left empty.


2. Choose a viewing option for your Tableau external report

Choose whether to show the report in the Metric Insights Viewer or in an External Web Page.

3. Provide a URL for your Tableau source report and upload an image
You'll need to provide a URL for your Tableau report so that Metric Insights can find it. You'll also
need an image for the home page tile and the preview pane. The next step will depend upon
whether you want the image to be 1) static; or 2) dynamically updated based on the underlying
Tableau report.

3.1 Enter a URL and upload a .png file if you want to use a static image

If you want to use a static image for your report, then:


1. Select the 'No' option next to 'Automatically Refresh Report Image'.
2. Enter the URL where your Tableau source report can be found.
3. Click 'Upload Image File'.


4. Enter the name and location of your Image File.


Note: The image file can be a simple screen-capture photo of your Tableau report.

3.2 Select your Tableau source report if you want a dynamically generated
image

If you want to use a dynamic image for your report, then:


1. Select the 'Yes' option next to 'Automatically Refresh Report Image'.
2. Choose how often you want to refresh the image.
3. Select the Tableau Plug-in Data Source.
4. Choose the Tableau source report that you want your External Report to link to.
5. The External Report URL will be generated automatically based on your other inputs. If
you like, you can modify the URL by appending a question mark (?) followed by any filter
or parameter settings (see the example below). In the example above, the URL specifies that the Tableau source
report's County filter shall be set to King county.

When you have entered all of your selections, click 'Collect Image'.
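As an illustration only (hypothetical server, workbook, and view names), a generated URL with a filter appended in this way might look like:
https://tableau.example.com/views/Finance/HomeSales?County=King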

4. Save your new Tableau report


Finish creating your new Tableau External Report by clicking 'Save'.
Note: If you chose 'Show report in External Web Page' in Step 3, you will not see your new report
in the Metric Insights viewer when you are finished. If you want to see your new report, you can go
to the Home page and click the tile for your new report. Alternatively, you can open the Elements
List under the Admin menu on any page.


Getting Started with Push Intelligence for Tableau


This 'mini-jumpstart-guide' is designed to help you get the most out of your Tableau investment by
using Metric Insights' alerting and email-bursting capabilities.

1. Create initial users

Once you've connected to Tableau and created some basic content in Metric Insights, you'll want
to create some users. The fastest way to do that is to select the Users link in the Admin menu and
then click the Add New User button at the bottom of the page.


1.1 Configure new user

When you click Add New User, you'll be prompted for some basic information about the new user.
Enter the necessary items and click the Save button.


1.2 Create users by uploading CSV file

You can also create new users by uploading a CSV file. The required fields are Username, Email,
Use LDAP to authenticate, Firstname, and Lastname. (The import page includes a link where you
can obtain a sample CSV file in the expected format.)
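As an illustration only (the sample CSV from the import page is authoritative for the exact header names and order), a file defining two users might look like:
Username,Email,Use LDAP to authenticate,Firstname,Lastname
jsmith,jsmith@example.com,N,John,Smith
mjones,mjones@example.com,Y,Mary,Jones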

2. Set up automatic alerts


Now that you've created some content and some users, it's time to connect the two. One way to
do that is with automatic alerting.
Metric Insights enables you to alert users automatically when certain types of events are recorded
in your data. The first step is to set up one or more alerts; you can learn about that here. Once
you've created some alerts, users can opt-in to receive them. (Note: New users will be opted-in by
default.)

3. Set up email digests


Another way to push Tableau reports to your users is by 'email bursting'. As with automatic
alerting, this is essentially a two-step process.
The first step is to place some Tableau reports into a 'favorites' folder and then 'share' that folder
with some users; you can learn about shared favorites here. Once you've shared a 'favorites'
folder with other users, they can opt-in to receive daily email digests of the elements in that folder;
you can learn about that here. (Note: New users will be opted-in by default.)


4. Administer user preferences

New users will automatically be opted-in to receive alerts and email digests. Any administrator can
subsequently edit any user's preferences. Just select the Users link in the Admin menu and then
click the link for the user whose preferences you want to edit.


Best practices for Tableau


This article describes how to get the best out of your integration with Tableau

1. Publishing dashboards and worksheets


In Tableau Desktop you have a workbook that is composed of dashboards and
worksheets. A dashboard is like a canvas where you drop in one or more worksheets. You
can also add other graphical elements and controls to the dashboard.

Within a given workbook in Tableau Desktop, both dashboards and worksheets can be
published to Tableau Server, where they then become views in Tableau Server.

2. How Metric Insights gets data from Tableau Server


Worksheets: For Metric Insights to work, you must publish the underlying worksheets that
make up the dashboard; it is from each of these worksheets that Metric Insights pulls data.
It does so via the view url of Tableau Server using CSV export. E.g.,
https://tableau.example.com/views/workbookName/sheetName?format=csv

Dashboards: Pulling the data from a published dashboard does not work very well. It only
gives the data from the first (alphabetic order) worksheet in the dashboard.

3. Best practices
You will probably want to publish dashboards for your users to see and use.
But you will also want to make the underlying data available so that Metric Insights can pull it. So you
must also publish your worksheets.
1. OPTION 1: Publish all worksheets in same workbook: One approach is to publish all
the worksheets in addition to the dashboards under a given Tableau Desktop workbook.
Then if you don't want your users to see the worksheets you deny permissions on the
worksheets to all users. See section below for instructions.
2. OPTION 2: Publish all worksheets in separate workbook: Another approach is to
publish your one workbook from Tableau Desktop to two workbooks in Tableau Server.
The first workbook has just the dashboard views in Tableau Server that your users access.
And the second workbook has just the worksheet views in Tableau Server that only Metric
Insights accesses. See section below for instructions.


4. OPTION 1 - Publish all worksheets in same workbook


For the first approach described above, you publish the worksheets with the dashboards, but you
make the worksheet views on Tableau Server available only to Metric Insights, and not visible to
your users.
This section describes how to do that.

4.1 Tableau Desktop - Publish all worksheets in same workbook as dashboards
First, in Tableau Desktop, publish all worksheets in addition to the dashboards in the same
workbook on Tableau Server.
Note, if you choose "hide sheet" option in Tableau Desktop, then the worksheet is not an option
under Publish to Server to publish as a view. So you must "unhide sheet" before you can publish
the worksheet.


4.2 Tableau Server - Select worksheets

Then on Tableau Server, select the worksheet Views that you don't want your users to see, and
select Permissions link at top of screen to edit permission for the selected views.

4.3 Tableau Server - Deny view access permissions

1. Next, click the X button to remove access to all the users. This approach works if you are
signed in with the credentials that Metric Insights uses for pulling data.
2. An alternate approach is to select the edit link to the right of your workbook.


4.4 Tableau Server - Deny view access permissions

If you chose to Edit the permissions instead of Removing the permissions in the previous step,
then follow these instructions.
Select Deny for View permissions for any or all of your users.
Do not deny view access for the Tableau user account that Metric Insights will use.
In this example, we deny view for All Users. We do that because we are currently signed in with
the Tableau user account that we use in Metric Insights for pulling the data from the views.

5. OPTION 2 - Publish all worksheets in separate workbook


As a different approach, you publish your dashboards in one workbook on Tableau Server, and
your worksheets in a separate workbook. And you make the worksheet views on Tableau Server
available to Metric Insights but not visible to your users.

5.1 Tableau Desktop - Publish dashboards to workbook


First, on Tableau Desktop publish your dashboards to the workbook on Tableau Server as you
would normally do.


5.2 Tableau Desktop - Publish all worksheets to separate workbook

Then, on Tableau Desktop choose to publish your worksheets to a separate workbook on Tableau
Server.
Note, if you choose "hide sheet" option in Tableau Desktop, then the worksheet is not an option
under Publish to Server to publish as a view. So you must "unhide sheet" before you can publish
the worksheet.


5.3 Tableau Desktop - Add / Edit permission

When publishing this workbook with worksheets, choose Add / Edit permissions to select Deny for
View permissions to any or all of your users.
If you are not signed in as the same Tableau user account that Metric Insights uses, then make
sure that user has view permission too.

5.4 Tableau Desktop - Publish all worksheets to separate workbook


Finally, publish this workbook. Now only the user that published this workbook with the worksheets
can access it, along with any additional users you gave View permission to.


Getting latest data from Tableau


If you are using a live data connection for your Tableau Server views, then Metric Insights might
not see the latest information. This is because Tableau Server is caching the older data.
Also, sometimes when you publish updated workbooks to Tableau Server, Metric Insights does not
see the latest changes in Tableau Server. This is the result of Tableau Server caching older
information. This can occur when adding new columns to the Tableau view but you do not see
them in Metric Insights.
In either case, the best practice is to set the Caching parameter to refresh more often. This
ensures that Metric Insights sees the latest information in the Tableau Server views.

1. Set Caching to 'Refresh more often'

If using a live data connection, or publishing an updated version of your Tableau workbook, then
setting the Caching to "Refresh more often" will ensure that Metric Insights always sees the latest
information.
Set Caching to "Refresh more often":
1. On the computer running Tableau Server, run the program: Configure Tableau Server
2. Navigate to Data Connections menu tab
3. Under Caching section, select 'Refresh more often'


For more information see Tableau article on 'Data in a Tableau Server View is Out of Date':
http://kb.tableausoftware.com/articles/issue/view-out-of-date


Embed database credentials in Tableau


If you are using a live data connection for your Tableau Server views, then Metric Insights might
not be able to pull data. Tableau server might return an HTTP 406 error from a data pull (CSV
export), or an HTTP 500 error from an image pull (PNG export).
In this case, the best practice is to Embed Database Credentials for your data source when
you publish your Tableau workbook.

1. Embedding Database Credentials in Tableau Server


By default, views connected to live data require users to log in to the data source with a database
username and password. If you discover that Metric Insights is unable to pull data from these
views then the problem is most likely the live data is requiring a separate set of credentials to
access. You can get around this if you configure Tableau Server to embed these database
credentials, so that Metric Insights can pass through the login process and go directly to the view.
See http://kb.tableausoftware.com/articles/knowledgebase/embedding-database-credentials-tableau-server for information on how to embed database credentials in Tableau Server.

2. Web browser message when Tableau Server requires password to access View

One way you will know that you should embed database credentials for a view is when you access
that view for the first time with web browser in Tableau Server. You get a message that prompts
you to enter the database credentials.


3. Confirming an embedded database credentials problem


You can typically confirm that you need to embed database credentials in Tableau server by
running a simple test from your web browser.
Manually attempt to pull the data (CSV) or image (PNG) via your web browser. Just add
?format=csv or ?format=png to the end of the url in your web browser address bar and hit return. If
nothing is returned, then with your web browser's network traffic inspection turned on, you should see
an HTTP 406 or 500 error returned.
https://tableau.metricinsights.com/views/Finance/Homepriceandsupply?format=csv
https://tableau.metricinsights.com/views/Finance/Homepriceandsupply?format=png
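If you prefer to script this check, the following is a minimal sketch (not part of the product) that requests the CSV export and reports whether a 406/500 error came back. The credentials are placeholders; use the same Tableau account that Metric Insights uses for this Data Source.
import requests

# Hypothetical view URL - substitute your own server, workbook, and view names
VIEW_URL = "https://tableau.metricinsights.com/views/Finance/Homepriceandsupply"

# Request the CSV export with the same Tableau credentials Metric Insights uses
response = requests.get(VIEW_URL + "?format=csv", auth=("tableau_user", "tableau_password"))

if response.status_code in (406, 500):
    print("HTTP %d returned - the view likely needs embedded database credentials"
          % response.status_code)
else:
    print("No 406/500 error; %d bytes returned" % len(response.content))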


Handling Date/Time columns in Tableau


Tableau allows you to publish worksheets with various ways to show dates and times. Because of
how Metric Insights pulls data from Tableau Server, it is important to understand how it pulls and
processes the Tableau date/time fields. This article gives you some best practices for handling the
date/time columns in Tableau views.

1. How Metric Insights pulls data from Tableau


First, it is important to understand how Metric Insights pulls data from Tableau. Metric Insights
pulls data via the view url of Tableau Server using CSV export. E.g.,
https://tableau.example.com/views/workbookName/sheetName?format=csv

Metric Insights then infers the data type of each column in the exported CSV by interrogating the
values. For Metric Insights to interpret a column as containing date/time values, it needs to see
that the values conform to a particular date/time format.
Note: if some of the values in a column are in date/time format, but other values are not, then
Metric Insights will type the whole column as a Text field rather than a Date field. To prevent this
from happening you will need to ensure that all values in the column are of date/time format.

2. The various date/time formats that Metric Insights looks for from
Tableau
This is just a sample list of the date/time formats that Metric Insights looks for in the CSV file
exported from Tableau Server.
"yyyy-MMMMM-dd HH:mm:ss",// 2013-August-01 01:23:45
"yyyy-MM-dd HH:mm:ss", // 2012-01-01 01:23:45
"MM-dd-yyyy",
// 01-01-2012
"yyyy-MM-dd",
// 2012-01-01
"yyyy-MM",
// 2012-01 but not 2012-2013
"M/d/yyyy h:m:s a",
// "4/15/2013 1:00:00 AM" for hourly or minute Tableau
"MM/dd/yyyy",
// 01/01/2012
"MM/dd/yyyy HH:mm:ss", // 01/01/2012 01:23:45
"MMMMM d, yyyy h:mm a", // "April 15, 2013 1:00 AM" for minute Tableau
"MMMMM d, yyyy h a",
// "April 15, 2013 1 AM", "April 15, 2013 12 PM" for hourly Tableau
"MMMMM dd, yyyy",
// January 01, 2012
"MMMMM dd,yyyy",
"dd MMM yyyy",
// "05 Nov 2013" for Tableau
"MMMMM, yyyy",
// January, 2012
--- also catches January, 12 -> January 0012
"MMMMM,yyyy",
"MMMMM yyyy",
// January 2012
--- also catches January 12 -> January 0012
"MMM, yyyy",
// Jan, 2012
--- also catches Jan, 12 -> Jan, 0012
"MMM,yyyy",
"MMM yyyy",
// Jan 2012
--- also catches Jan 12 -> Jan 0012


"yyyy-MMM-dd",
// 2012-FEB-01
Oracle style. 12-FEB-12 -> Jan 12, 0012
"EEE MMM dd HH:mm:ss z yyyy", // Wed Aug 21 15:50:21 UTC 2013

2.1 Metric Insights needs at a minimum the Year and Month for a Date/Time
field
Metric Insights requires at a minimum a Year and a Month in the Date/Time field. In the previous
list of formats you can see that Year and Month are required. The following is a list of the date/time
parts that are needed as you expand into days, hours and minutes.
1. Year, Month
2. Year, Month, Day
3. Year, Month, Day, Hour
4. Year, Month, Day, Hour, Minute

3. How you know Metric Insights recognized a Tableau field as a Date/Time
When you use the button to Validate your Tableau plug-in command query in the Element editor
for a Metric or a Report, a sample result set is displayed and columns that are interpreted as date/
time will display in YYYY-MM-DD hh:mm:ss format. For example,
2014-06-13 23:52:00

If you see a column value in this displayed format then you know that Metric Insights has
interpreted the column as a Date.


3.1 Date/Time recognized in Visual Editor (Query Builder)

Use the Visual Editor link in the Element Editor (Metric or Report) to bring up the Query Builder.
The Query Builder screen shows the column interpreted as date/time with Type DATE.

3.2 Date/Time recognized in Metric

Use the button to Validate your Tableau plug-in command query in the Metric editor, and the
sample returned data shows the column interpreted as date/time in YYYY-MM-DD hh:mm:ss
format


3.3 Date/Time recognized in Report

Use the button to Validate your Tableau plug-in command query in the Report editor, and the
sample returned data shows the column interpreted as date/time in YYYY-MM-DD hh:mm:ss
format

3.4 In Report editor Data Columns section, the Display Format tells you that it
is a Date

For Reports, you can also use the Save button and then explore the Display Format for the
column. For a Date column it will give you Date/Time formats to choose.

4. Workarounds when Tableau column is not recognized as Date/Time

If the Date/Time column(s) in the Tableau view are in a format that Metric Insights is not able to
interpret as a Date/Time, then you have several options.


4.1 Publish your Tableau worksheet so that the Date/Time columns conform to
one of the date/time formats that Metric Insights understands
This might be an iterative approach, but based on the examples provided earlier in this article you
can format your column in Tableau desktop as one of the recognized date/time formats and
publish to Tableau server.

4.2 Construct a Composite Date/Time field in the plugin request

If your date/time value is spread over several columns, then you can build a Composite date/time
column. In the above example, use the control in the Visual Editor to construct a Composite date
from the individual year, month, day columns in the Tableau view.

4.3 Use an Intermediate Report in Metric Insights, manipulate the Date/Time there

You can create a Report from the Tableau view, and then create other Metrics (or Reports) from
this newly created Report. Because the report saves the data pulled from Tableau in an underlying
SQL table, you can simply write SQL against the report and apply any SQL functions you want for
converting or formatting data from the report into date/time columns as appropriate. See the help
documents on making an Existing Report.


Sending a Tableau Workbook


Sometimes Metric Insights will ask you to export a Tableau workbook and send it. This is done to
help with best practices and sometimes to troubleshoot issues you may have. The
workbook includes the dashboards and worksheets.

Sending a Packaged Workbook


To send a "packaged workbook" (i.e., *.twbx) file, follow the instructions from Tableau:
http://kb.tableausoftware.com/articles/knowledgebase/sending-packaged-workbook
Note: If the workbook uses an external data source such as SQL, you will need to export the data
first, so that it can be included with the *.twbx file. Instructions for doing that are in the same
article. It's all straightforward.


Using SSO on Tableau Server


This article describes how you can use Single Sign On (SSO) from Metric Insights to Tableau
Server. You will need to configure Tableau Server for this functionality. Then in Metric Insights you
simply choose to use SSO for Tableau.
In Tableau Server you will configure Trusted Authentication. The steps for doing this are covered in
this article. Trusted authentication simply means that you have set up a trusted relationship
between Tableau Server and Metric Insights. When Tableau Server receives requests from
Metric Insights, Tableau assumes the credentials of the Metric Insights user. If you want more
detailed information on Trusted Authentication in Tableau Server then refer to the documentation at
http://onlinehelp.tableausoftware.com/v8.2/server/en-us/trusted_auth.htm

1. Username is same on Metric Insights and Tableau Server


To use SSO, each username created on Metric Insights must have an identical username in
Tableau Server. For example, if you sign in with username 'admin' in Metric Insights, then to be
able to SSO to Tableau server you must also have the username 'admin' in Tableau server. This is
true for all usernames that you want to have SSO ability for.
If your username in Metric Insights does not have an identical username in Tableau server, then
you do not get SSO to Tableau for this user.

2. Find IP address of Metric Insights instance


You only need to do this for versions of Tableau server 8.0 and prior. For Tableau server 8.1 and
beyond, you can use the fully qualified domain name of the Metric Insights instance.
For example, one way to find it is with ifconfig:
root@demo:~# ifconfig
eth0      Link encap:Ethernet  HWaddr
          inet addr:10.146.246.13

root@sandbox:~# ifconfig
eth0      Link encap:Ethernet  HWaddr
          inet addr:10.192.61.233

3. Configure Tableau Server for trusted authentication


Use the Tableau Server command line utilities to add the Metric Insights IP address (or fully qualified
domain name for Tableau server 8.1 and beyond) to the list of trusted hosts.
Using the Metric Insights IP address:


D:\Tableau\Tableau Server\8.1\bin> .\tabadmin.bat set wgserver.trusted_hosts "10.146.246.13, 10.192.61.233"
D:\Tableau\Tableau Server\8.1\bin> .\tabadmin.bat config
D:\Tableau\Tableau Server\8.1\bin> .\tabadmin.bat restart

Using Metric Insights fully qualified domain name:

D:\Tableau\Tableau Server\8.1\bin> .\tabadmin.bat set wgserver.trusted_hosts "mi-dev.mycompany.com, mi-prod.mycompany.com"
D:\Tableau\Tableau Server\8.1\bin> .\tabadmin.bat config
D:\Tableau\Tableau Server\8.1\bin> .\tabadmin.bat restart

4. Verify trusted authentication from Metric Insights for the user


From the Metric Insights server, run a command line utility to confirm trusted authentication to the Tableau
server. Use a username that exists on the Tableau server. For example, 'admin':
curl -k -dusername=admin https://tableau-test.metricinsights.com/trusted/
curl -3 -k -dusername=admin https://tableau-test.metricinsights.com/trusted/
curl -ssl -k -dusername=admin https://tableau-test.metricinsights.com/trusted/

In the curl request, supply the username (e.g., 'admin') and the url of Tableau server (e.g.,
https://tableau-test.metricinsights.com/trusted/). Any of the above curl request arguments should
work.
The above curl request returns -1 from Tableau server if not trusted, and returns non-negative
ticket value if trusted. For example,
Not trusted:
root@sandbox:~# curl -3 -k -dusername=admin https://tableau-test.metricinsights.com/trusted/
-1

Trusted:
root@sandbox:~# curl -3 -k -dusername=admin https://tableau-test.metricinsights.com/trusted/
Co_GwPDEfUzAuj1mDO2MH-FE

5. Configure Metric Insights for Tableau SSO


The following steps show how to configure Metric Insights for SSO to Tableau Server


5.1 Add External Report Type for Tableau SSO

Add an External Report Type for Tableau SSO. You can do this when adding a new External Report.
In the drop-down for Report Type, choose Add New Report Type. Then choose Tableau Single
Sign-On for Drill Down Authentication. If you already have an existing External Report Type
defined for Tableau, you can enable SSO authentication for all reports associated with this type by
modifying the Drill Down Authentication setting.


Sourcing data from Treasure Data


Installing the Treasure Data Plugin


To install the latest production release of the Treasure Data Plug-in:
1. Log into your Metric Insights server with an account that has root access
2. Run the command sudo /usr/local/bin/mi-deploy comp TreasurePlugin-Trunk tip


Establish connectivity to Treasure Data


This article describes how you can set up connectivity to Treasure Data.
PREREQUISITES:
Before you begin, be sure that:
1. The latest version of the Treasure Data Plug-in is installed on your Metric Insights system.
See Installing the Treasure Data Plug-in for more details.
2. You have an API key and a database name for connecting to Treasure Data. Please refer
to the Treasure Data online documentation for information on obtaining your API key.

1. Add Treasure Data plug-in to Data Sources

Find the Data Sources item in the Admin menu


2. Add New Data Source

3. Set the type of the new Data Source


4. Define the Treasure Data Connection Profile

1. Select 'Treasure Data' from the Plug-in drop-down list


2. Provide a meaningful name for this connection
3. Save


5. Set API key and database name

1. Click on the gear icon next to the API key column in the Parameters grid.

6. Enter your Treasure Data API Key


7. Click on Gear to Set the Database Name


8. Save the Profile

9. Check that the profile can be used

Go to the Metric Editor


1. Set Data Source - Treasure Data plugin
2. Input a command in the text box
3. Press Validate
If the command is validated successfully, you will see a message indicating how many records are returned.


How to collect data from Treasure Data


1. Start by Adding an Element

2. Add an Element using the Metric Creation Wizard


3. Provide basic information for the first metric to be created using Treasure Data as a source

1. Select the Measurement Interval that applies to your first metric (i.e. the aggregation
interval for your data)
2. Specify what this metric is measuring. If you do not see the measure that you want to use,
you can create one from this drop-down
3. Click on the Collect Data button


4. Enter the HiveQL in the Plug-in Command text box

1. Find the Treasure Data plug-in in the Data Source list
2. Set the Trigger
3. Input the Plug-in Command (see the example below)
4. After entering the HiveQL statement, click on the Validate Plug-in Command button
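As an illustration only (it assumes Treasure Data's www_access sample table is present in your database), a HiveQL Plug-in Command might look like:
SELECT TD_TIME_FORMAT(time, 'yyyy-MM-dd') AS day, COUNT(1) AS cnt FROM www_access GROUP BY TD_TIME_FORMAT(time, 'yyyy-MM-dd')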


5. Enter Last Measurement Time to specify what data points should be fetched

6. Collect Data

1. If validation finishes successfully, you will see how many records were fetched; the informational
message is displayed in green font. If any error occurs during validation, you will see an error
message in red font.
2. Set the point in time from which data should be collected.


3. Press Collect Data

7. Fetch is completed

You will get a message indicating how many values were recorded. Press the Next button to generate a Preview

8. Metric creation is finished

Check that the "Make Visible" flag is set to true and click Enable metric to add a tile to your Home Page


Sourcing Data from Uptimerobot


Establish Connectivity to Uptimerobot


An Administrator can use the process described in this article to create a new Plug-in Data
Source. This Data Source is required to allow Elements to fetch data using Uptimerobot.

1. Go to Data Source editor


2. Add New Data Source

3. Select "Plug-in" as Data Source Type


4. Select 'Uptimerobot' from the Plug-in picklist and input a meaningful Name


5. Plug-in Data Source Editor will open

1. Click on the gear icon to set the API Key parameter.
2. Provide the API Key.
3. Save.


6. Save Uptimerobot Plug-in

1. Save your changes.


How to Collect Data using Uptimerobot Plug-in


This article will show you how to create an Element using the Uptimerobot plug-in as a data source.
It assumes that you have already established connectivity to Uptimerobot.

1. Add new Element


2. Choose Uptimerobot plug-in as a Data Source for new element

1. Choose Uptimerobot as your Data Source.


2. Input your Plug-in Command.


3. Publish your new element

1. Validate your plug-in command.
2. You will see sample results under the 'Validate' button.
3. Save the Element to continue configuring.
4. When configuration of the Element is finished, Enable & Publish it.


Sourcing Data using a Web Service


Connecting to a Web Service


You can source metric and report data from any web service that returns information using a
JSON message format that conforms to Metric Insights Query Response format.

Architecture Overview

The diagram above illustrates graphically the Metric Insights (MI) Web Services architecture and
high-level process flow; the numbering on the chart corresponds to the list below:
1. The Web Service URL and parameter data is obtained from the MI database; if required
for the call, Web Service credential data (username and password) is also retrieved and
decrypted, if necessary
2. An HTTP POST request is made using the parameters described in the 'Web Service
POST Data' section below and uses the unencrypted credentials (that were retrieved in
Step 1) to authenticate to the Web Service, using HTTPS, if requested
3. The Web Service performs a data fetch process as defined by the developer of that service
4. The Web Service returns the results from the data fetch process to Metric Insights in JSON
format as described in the 'Data Returned from Web Service' section


5. MI parses JSON data and updates its database with the returned Metric data or Report
instance data
6. Data received from the Web Service is used by MI to generate new/updated tables and
charts according to the parameters defined for the Metric or Report associated with the
Web Service call

1. Adding a New Authentication


If you need to establish the values for the 'Authentication' field on an Element's Editor, use the
control to the right of the field's text display. There you add the URL for the source of your data. The
authentication process associates a Username and password with the URL as it constructs the
web service call.

2. Using Web Service POST Data


The system collects and passes the following information to the Web Service when performing the
data fetch command:

2.1 Web Service URL


The Web Service URL is provided by the Administrator via the Report or Metric Editor and is used
by the system to determine both the location of the Web Service and whether the fetch is to be
performed using HTTPS or HTTP. In addition to any information specified directly in the URL, the
table below contains items that are automatically appended to the list of HTTP POST parameters
as determined by the type of element:


2.2 Information Added to POST

The table above lists the information that is appended to the POST request made to the
Web Service URL.

2.3 Dimension Names

As shown in the table in the preceding section, the dimension_name information is passed to the
Web Service URL for Metrics and Reports that are dimensioned.


2.4 Dimension Values

A separate Web Service call is performed for every Dimension Value defined for the dimension.
Each individual 'Key Value' of that dimension is passed in the dimension_value parameter. This
value may be either an integer or text, depending upon the definition of the dimension.
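As an illustration (the dimension and its values here are hypothetical), if an Element is dimensioned by 'channel' with the Key Values 'retail' and 'corporate sales', two separate calls are made, each carrying one value in the dimension_value parameter. A sketch of the resulting POST bodies:

# One Web Service call is made per Dimension Value (dimension and values are hypothetical)
dimension_name = 'channel'
dimension_values = ['retail', 'corporate sales']

for dimension_value in dimension_values:
    post_data = {
        'element_id': 27,
        'measurement_time': '2011-05-10 00:00:00',
        'dimension_name': dimension_name,
        # The Key Value may be an integer or text, depending on the dimension definition
        'dimension_value': dimension_value,
    }
    print(post_data)   # each dictionary would be POSTed separately, as in the earlier sketch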

2.5 Date Formats from Metrics and Reports

The POSTed parameters shown above are re-formatted prior to substitution, based on the date
format mask specified in the Web Service credentials associated with the Insight Element:
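The format-mask table itself is not reproduced here, but the idea can be illustrated with a small, purely hypothetical example: if a Web Service expected dates as MM/DD/YYYY rather than the default YYYY-MM-DD HH:MM:SS, the measurement_time parameter would be re-formatted before being POSTed, roughly as follows. The mask syntax shown is Python's strftime, used only for illustration; it is not necessarily the syntax accepted on the Web Service credentials screen.

from datetime import datetime

# Value as stored by Metric Insights (MySQL-style date/time string)
mysql_value = '2011-05-10 00:00:00'

# Hypothetical 'Web Service' format mask: month/day/year
web_service_mask = '%m/%d/%Y'

parsed = datetime.strptime(mysql_value, '%Y-%m-%d %H:%M:%S')
print(parsed.strftime(web_service_mask))   # -> 05/10/2011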


2.6 Date Formats if the 'MySQL to Web Service Format' is not specified

3. Authentication Credentials
If Username and/or Password data is present for the credentials associated with the Insight
Element, these parameters are passed to the web service using HTTP Basic authentication.
NOTE: An Element's developer should understand that any authentication credentials the Web
Service itself requires in order to perform the data fetch, including Usernames and passwords
for other external services, must be managed by the Web Service, since this information is not
stored or processed by Metric Insights. Authentication and data collection processing performed
by the Web Service occurs outside of this system.
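On the receiving side, the Web Service simply sees a standard Authorization: Basic header and can validate it however it likes; the mod_python sample in section 9.2 does this in its authenhandler. As a framework-neutral sketch (assuming a WSGI-style environ dictionary and hypothetical credentials), the check amounts to:

import base64

EXPECTED_USER = 'mi_user'          # hypothetical credentials configured for this service
EXPECTED_PASSWORD = 'mi_password'

def is_authorized(environ):
    """Return True if the request carries matching HTTP Basic credentials."""
    header = environ.get('HTTP_AUTHORIZATION', '')
    if not header.startswith('Basic '):
        return False
    try:
        decoded = base64.b64decode(header[len('Basic '):]).decode('utf-8')
        username, _, password = decoded.partition(':')
    except Exception:
        return False
    return username == EXPECTED_USER and password == EXPECTED_PASSWORD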

4. Web Service Processing


The Web Service performs whatever processing is required to collect data for the Metric or Report.
Data can be sourced by the Web Service from internal corporate information resources as well as
from external systems hosted outside the corporate firewall. The Web Service is triggered by
Metric Insights based on the execution of the Data Collection Trigger associated with the
Element as well as when the Administrator performs the Web Service call through the Editor.


5. Data Returned from Web Service for Metrics

5.1 Data Returned for a Metric


5.2 Metric Example


If a Web Service uses the following SQL statement to generate a dataset:
Select measurement_time 'Calendar Date', sum(measurement_value) 'Total Sales'
From...

and the Result Set contains the following 2 rows:

[2011-01-01, 1234.10],
[2011-01-02, 5678.00]

The returned JSON would be:


{"header":[{"name":"Calendar Date","type":"DATE"}, {"name":"Total Sales","type":"DECIMAL"}],


data: [
[2011-01-01 00:00:00,1234.10]
[2011-01-02 00:00:00,5678.00]
]
}

The example above has been arranged for readability, but a Web Service JSON result set does not
have to be formatted this way and would typically look like the following example:
{"header":[{"name":"Calendar Date","type":"DATE"},{"name":"Total Sales","type":"DECIMAL"}],
"data":[["2011-05-10 07:00:00",4730661.59],["2011-05-11 07:00:00",4602004.14],["2011-05-12
07:00:00",4635604.11],["2011-05-13 07:00:00",4873962.11],["2011-05-14 07:00:00",4614745.
529999999],["2011-05-15 07:00:00",4699752.8],["2011-05-16 07:00:00",4774199.8],["2011-05-17
07:00:00",4793545.17],["2011-05-18 07:00:00",4529600.81],["2011-05-19 07:00:00",4605180.
539999999]]}

5.3 Returned Metric Data Validation


Metric Insights performs the following validation on the returned Metric data set before processing
the data to ensure that:
• The Web Service always returns data sets that include one numeric (type of integer or decimal) and one date/time value (type of date) per row
• If no date/time format is specified in the Web Service credentials for the Metric, all date/time column values conform to the date format of YYYY-MM-DD HH24:MI:SS
• If a date/time format is provided as part of the Web Service credentials for the Metric, all date/time column values conform to the specified format mask
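The rules above can be expressed in a few lines of code. The sketch below is not Metric Insights' internal validation logic, only an illustration of the same checks applied to a decoded Query Response payload; it assumes the default YYYY-MM-DD HH24:MI:SS format (i.e. no custom mask) and a hypothetical helper name.

from datetime import datetime

def validate_metric_payload(payload):
    """Check a decoded metric result against the rules listed above."""
    header = payload['header']
    types = [col['type'].upper() for col in header]
    # Exactly one numeric column and one date/time column are expected per row
    if sorted(types) not in (['DATE', 'DECIMAL'], ['DATE', 'INTEGER']):
        raise ValueError('Expected one DATE column and one numeric column, got %s' % types)
    date_index = types.index('DATE')
    for row in payload['data']:
        # With no custom mask, date/time values must be YYYY-MM-DD HH:MM:SS
        datetime.strptime(row[date_index], '%Y-%m-%d %H:%M:%S')
        float(row[1 - date_index])   # the numeric column must parse as a number
    return True

payload = {
    'header': [{'name': 'Calendar Date', 'type': 'DATE'},
               {'name': 'Total Sales', 'type': 'DECIMAL'}],
    'data': [['2011-01-01 00:00:00', 1234.10],
             ['2011-01-02 00:00:00', 5678.00]],
}
print(validate_metric_payload(payload))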

6. Data Returned from Web Service for Reports


Web Services that populate Reports must return the three JSON elements listed below:
1. Header column names
2. Column data types (DATE, DECIMAL, INTEGER, TEXT); note that the DATE data type is
used for date-time values as well as date-only values
3. Data Set values for each row

6.1 Expected JSON structure for a result set with four columns and two rows
{
  "header": [
    {
      "type": "DATE",
      "name": "Order Date"
    },
    {
      "type": "DECIMAL",
      "name": "US order Volume (US$)"
    },
    {
      "type": "DECIMAL",
      "name": "Intl order Volume (US$)"
    },
    {
      "type": "DECIMAL",
      "name": "Total order Volume (US$)"
    }
  ],
  "data": [
    [
      "2011-04-06 00:00:00",
      415037.54999999999,
      758473.73999999999,
      1173511.29
    ],
    [
      "2011-04-05 00:00:00",
      346160.52000000002,
      738350.46999999997,
      1084510.99
    ]
  ]
}
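For completeness, here is a minimal sketch (independent of the full samples in section 9) of how a Web Service written in Python might assemble and emit a structure like the one above; the column names and row values are copied (rounded) from the example:

import json

header = [
    {'type': 'DATE',    'name': 'Order Date'},
    {'type': 'DECIMAL', 'name': 'US order Volume (US$)'},
    {'type': 'DECIMAL', 'name': 'Intl order Volume (US$)'},
    {'type': 'DECIMAL', 'name': 'Total order Volume (US$)'},
]
data = [
    ['2011-04-06 00:00:00', 415037.55, 758473.74, 1173511.29],
    ['2011-04-05 00:00:00', 346160.52, 738350.47, 1084510.99],
]

# The body returned to Metric Insights is just this dictionary serialized as JSON
print(json.dumps({'header': header, 'data': data}))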

7. Error Reporting
Any errors encountered in Web Service processing are returned to Metric Insights so that the
Administrator can be notified of any problems. The following information must be included in the
JSON message (for both metric and report data fetches) when an error is encountered:
{
  "error": "<error string>"
}

7.1 Returned Report Data Validation


Any errors encountered in Web Service processing are returned to Metric Insights so that the
Administrator can be advised of any problems. The information that must be included in the JSON
message (for both metric and report data fetches) when an error is encountered is shown above.
Metric Insights logs (or displays in the Editor during validation) both error messages explicitly
returned in the JSON message and standard HTTP header error codes, e.g., 404, 503, 505.
For a data fetch to be considered successful, all of the following conditions must be satisfied:
• The web service call returns an OK HTTP status
• No error section exists in the returned JSON
• The returned JSON format adheres to the provided JSON specification
• All validation rules set for the Metric's or Report's data are satisfied

8. Remotely Invoking Web Service Calls


Feature coming soon!

9. Sample Code
The following sub-steps contain examples of Web Services for consumption of Metric and Report
data written in PHP and Python:

9.1 PHP Web Service example for collecting metrics data. Note the use of
last_measurement_time.
<?php
/**
 * @see Devx_Model
 */
require_once 'Devx/Model.php';

/**
 * DemoModel object
 *
 * @version 1.0
 * @package Insight
 */
class DemoModel extends Devx_Model
{
    public function getDemoData1($subst) {
        $cfg = Zend_Registry::get('config');
        $password = Custom_Model_External::decryptPassword($cfg->database->password);
        $params = array('host'     => $cfg->database->host,
                        'username' => $cfg->database->username,
                        'password' => $password,
                        'dbname'   => 'demo',
                        'port'     => 3306);
        try {
            $db = new Devx_Db_Adapter_Pdo_Mysql($params);
            $db->query('SET NAMES "utf8"');
            $sql = "select sum(amount), date(order_time)
                      from customer_order ord, customer_order_detail ord_line
                     where ord.order_id = ord_line.order_id
                       and ord.order_time > :last_measurement_time
                       and date(ord.order_time) < now()
                     group by 2";
            if (!isset($subst['last_measurement_time'])) {
                return array('error' => 'No substitution provided for \':last_measurement_time\' pattern');
            }
            // Substitute the last_measurement_time value POSTed by Metric Insights into the SQL
            $pattern = ':last_measurement_time';
            $sql = trim(str_ireplace($pattern, "'" . $subst['last_measurement_time'] . "'", $sql));
            $rows = $db->query($sql)->fetchAll();
            //print_r($rows);
            if (is_array($rows)) {
                // Return the result in the Metric Insights Query Response format (header + data)
                return array(
                    'header' => array(
                        array('name' => 'sum(amount)', 'type' => 'decimal'),
                        array('name' => 'date(order_time)', 'type' => 'date')
                    ),
                    'data' => $rows
                );
            } else {
                return array('error' => 'Unexpected data structure is returned');
            }
        } catch (Exception $ex) {
            return array('error' => $ex->getMessage());
        }
    }
}
?>

9.2 Python code example used to collect report data using a Web Service
#!/usr/bin/env python2.5
"""
Requirements: apache, mod_python, python2.5, MySQLdb
Sample of virtual host file:
<Directory /var/www/generator/>
    Options MultiViews
    Order allow,deny
    allow from all
    AddHandler mod_python .py
    PythonHandler webservice
    PythonAuthenHandler webservice
    AuthType Basic
    AuthName "Restricted Area"
    require valid-user
    AuthBasicAuthoritative Off
    PythonDebug On
    PythonOption mod_python.legacy.importer *
</Directory>
<VirtualHost *:80>
    DocumentRoot /var/www/generator/
    ServerName generator
    ServerAlias www.generator
</VirtualHost>
Needed modules: apache, util (from mod_python)
Local modules: simplejson
"""
import os
import sys
import datetime
import MySQLdb

path = os.path.abspath(os.path.dirname(__file__))
sys.path.append(path)

import simplejson
import logging
import logging.handlers
import os, tempfile
from datetime import date


class MLogger:
    def __init__(self, name):
        self._logger = logging.getLogger(name)
        self._logger.setLevel(logging.INFO)
        log_name = 'log-%s-.txt' % date.today()
        full_log_dir = '/var/www/generator/log/'  # os.path.join(os.path.split(os.path.split(os.path.split(os.path.abspath(__file__))[0])[0])[0], 'log')
        full_log_name = os.path.join(full_log_dir, log_name)
        try:
            os.chmod(full_log_name, 0777)
        except OSError:
            pass
        try:
            self._ch = logging.FileHandler(full_log_name)
        except IOError:
            tmp = tempfile.mkstemp(prefix='log_', dir=full_log_dir)
            self._ch = logging.FileHandler(tmp[1])
        self._formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s %(message)s", "%Y-%m-%d %H:%M:%S")
        self._ch.setFormatter(self._formatter)
        self._logger.addHandler(self._ch)

    def get_logger(self):
        return self._logger


"""
local testing data
"""
# main date format
datetime_format = '%Y-%m-%d %H:%M:%S'
date_format = '%Y-%m-%d'


def unformat_date(var):
    """
    unformat string to datetime
    """
    date = None
    if var:
        try:
            date = datetime.datetime.strptime(var, datetime_format)
        except:
            try:
                date = datetime.datetime.strptime(var, date_format)
            except:
                # cannot format it
                pass
    return date


def format_date(var):
    """
    unformat datetime to string
    """
    date = None
    if var:
        try:
            date = datetime.datetime.strftime(var, datetime_format)
        except:
            # cannot format it
            pass
    return date


"""
Login and password to access this script
"""
web_service_credentials = {'username': 'user',
                           'password': ''
                           #'password': 'U2FsdGVkX19z/09S2MlKaiqCS3YmkwcCnOPqnFkX1Yc='
                           }

reports = {27: {'data_fetch_command_sql':
                """
                SELECT calendar_date 'Order Date',
                       sum(if( country = 'United States', total_amount, 0)) 'US order Volume (US$)',
                       sum(if( country = 'United States', 0, total_amount)) 'Intl order Volume (US$)',
                       sum(total_amount) 'Total order Volume (US$)'
                  FROM daily_order_summary
                 WHERE calendar_date > date(%(measurement_time)s) - INTERVAL 60 DAY
                   AND channel = %(channel)s
                 GROUP BY 1
                 ORDER BY 1 DESC
                """,
                'measurement_time_fetch_command_sql':
                """
                SELECT DISTINCT calendar_date
                  FROM demo.daily_order_summary
                 WHERE calendar_date < date(now())
                   AND calendar_date > date(%(last_measurement_time)s)
                """
                }
           }

field_type = {
    0: 'DECIMAL',
    1: 'TINY',
    2: 'SHORT',
    3: 'LONG',
    4: 'FLOAT',
    5: 'DOUBLE',
    6: 'NULL',
    7: 'TIMESTAMP',
    8: 'LONGLONG',
    9: 'INT24',
    10: 'DATE',
    11: 'TIME',
    12: 'DATETIME',
    13: 'YEAR',
    14: 'NEWDATE',
    15: 'VARCHAR',
    16: 'BIT',
    246: 'NEWDECIMAL',
    247: 'INTERVAL',
    248: 'SET',
    249: 'TINY_BLOB',
    250: 'MEDIUM_BLOB',
    251: 'LONG_BLOB',
    252: 'BLOB',
    253: 'VAR_STRING',
    254: 'STRING',
    255: 'GEOMETRY'}

simple_field_type = {
    0: 'DECIMAL',
    1: 'INTEGER',
    2: 'INTEGER',
    3: 'INTEGER',
    4: 'DECIMAL',
    5: 'DECIMAL',
    6: 'TEXT',
    7: 'DATE',
    8: 'INTEGER',
    9: 'INTEGER',
    10: 'DATE',
    11: 'DATE',
    12: 'DATE',
    13: 'DATE',
    14: 'DATE',
    15: 'NVARCHAR',
    16: 'INTEGER',
    246: 'DECIMAL',
    247: 'TEXT',
    248: 'TEXT',
    249: 'TEXT',
    250: 'TEXT',
    251: 'TEXT',
    252: 'TEXT',
    253: 'NVARCHAR',
    254: 'NVARCHAR',
    255: 'TEXT'}

_NAME = 'channel'


class MysqlConnect(object):
    error = ''
    connection = None
    headers = []
    #headers_types = []
    result = None
    rows = []

    def __init__(self, *args, **kargs):
        self.info = {
            'host': 'localhost',
            'user': 'generators',
            'passwd': 'p0rtal',
            'db': 'demo',
            'port': 3306,
            'use_unicode': True,
            'charset': 'utf8'
        }
        if kargs.has_key('host'):
            self.info['host'] = kargs['host']
        if kargs.has_key('user'):
            self.info['user'] = kargs['user']
        if kargs.has_key('passwd'):
            self.info['passwd'] = kargs['passwd']
        if kargs.has_key('db'):
            self.info['db'] = kargs['db']
        if kargs.has_key('port'):
            self.info['port'] = int(kargs['port'])

    def connect(self):
        try:
            self.connection = MySQLdb.connect(*[], **self.info)
            return True
        except MySQLdb.Error, e:
            self.error = "%d %s" % (e.args[0], e.args[1])
        except Exception, e:
            self.error = e
        return False

    def close(self):
        if self.connection is not None:
            try:
                self.connection.close()
            except Exception, e:
                pass

    def query(self, query, params):
        try:
            cursor = self.connection.cursor(MySQLdb.cursors.Cursor)
            cursor.execute(query, params)
        except MySQLdb.Error, e:
            self.error = "%d %s" % (e.args[0], e.args[1])
            return False
        try:
            self.result = {'header': [{'name': header[0],
                                       'type': simple_field_type[header[1]]} for header in cursor.description],
                           'data': []}
            records = cursor.fetchall()
            for record in records:
                row = []
                for i, item in enumerate(record):
                    if self.result['header'][i]['type'] == 'DATE':
                        item = item.strftime(datetime_format)
                    else:
                        item = unicode(item)
                    row.append(item)
                self.result['data'].append(row)
            #self.json_result = simplejson.dumps(result)
            return True
        except Exception, e:
            self.error = e
            return False


def is_int(s):
    try:
        int(s)
        return True
    except ValueError:
        return False


def authenhandler(req):
    pw = req.get_basic_auth_pw()
    user = req.user
    if user == web_service_credentials['username'] and (
            (web_service_credentials['password'] and pw == web_service_credentials['password'])
            or not web_service_credentials['password']):
        return apache.OK
    else:
        return apache.HTTP_UNAUTHORIZED


def handler(req):
    """
    Binds handler routing
    """
    req.log_error('handler')
    req.content_type = 'application/json'
    #req.content_type = 'text/html'
    req.send_http_header()
    form = util.FieldStorage(req, keep_blank_values=1)
    process(form, req, ret_answer)
    return apache.OK


def ret_answer(ret, req):
    """
    Formats answer to json answer and returns to apache
    """
    #req.write(simplejson.dumps(ret, indent=4))
    req.write(simplejson.dumps(ret))
    return apache.OK


def print_answer(ret, req):
    """
    Print answer to stdout. For test purposes.
    """
    print simplejson.dumps(ret, 4)
    pass


def process(form, req, ret_answer):
    """
    Main routine
    """
    # empty answer dict
    #ret = {'error': ''}
    ret = {}
    # check for last_measurement_time field
    if 'measurement_time' in form:
        form['last_measurement_time'] = None
    log = MLogger('webservice')
    logger = log.get_logger()
    logger.info('before elem id checks')
    # check for element_id field
    if 'element_id' not in form:
        ret['error'] = 'ERROR. element_id is not set'
        ret_answer(ret, req)
        return
    # check if element_id is correct
    if not is_int(form['element_id']) or int(form['element_id']) not in reports:
        ret['error'] = 'ERROR. element_id is incorrect %s ' % form['element_id']
        ret_answer(ret, req)
        return
    element_id = int(form['element_id'])
    # get mysql connection
    outer_conn = MysqlConnect()
    if not outer_conn.connect():
        ret['error'] = "ERROR. Cannot connect to db: %s" % outer_conn.error
        ret_answer(ret, req)
        return
    if 'command' in form and form['command'] == 'get_measurement_times':
        if 'last_measurement_time' in form and form['last_measurement_time']:
            last_meas_time = unformat_date(form['last_measurement_time'])
        else:
            last_meas_time = datetime.datetime(1900, 1, 1, 0, 0, 0)
        if not last_meas_time:
            last_meas_time = datetime.datetime(1900, 1, 1, 0, 0, 0)
        query = reports[element_id]['measurement_time_fetch_command_sql']
        params = {'last_measurement_time': last_meas_time}
    else:
        # check for value substitution
        _value = ''
        if _NAME in form:
            _value = unicode(form[_NAME])
        if not _value:
            ret['error'] = "ERROR. _value is not specified"
            ret_answer(ret, req)
            return
        # check for names substitution
        _name = ''
        if '_name' in form:
            _name = unicode(form['_name'])
        if not _name:
            ret['error'] = "ERROR. _name is not specified"
            ret_answer(ret, req)
            return
        # check for measurement time
        #if 'measurement_time' in form and form['measurement_time']:
        #    meas_time = unformat_date(form['measurement_time'])
        if 'last_measurement_time' in form and form['last_measurement_time']:
            meas_time = unformat_date(form['last_measurement_time'])
        else:
            meas_time = None
        if not meas_time:
            ret['error'] = "ERROR. Measurement time is required"
            ret_answer(ret, req)
            return
        query = reports[element_id]['data_fetch_command_sql']
        params = {'measurement_time': meas_time, _NAME: _value}
    if not outer_conn.query(query, params):
        ret['error'] = "ERROR. Cannot execute query: %s" % outer_conn.error
        ret_answer(ret, req)
        return
    result = outer_conn.result
    if not result:
        ret['error'] = "ERROR. Source db returned empty result"
        ret_answer(ret, req)
        return
    ret['header'] = result['header']
    ret['data'] = result['data']
    ret_answer(ret, req)
    return


if __name__ == "__main__":
    """
    for testing from bash
    """
    form = {'element_id': 27, 'username': u'user', 'meas_time': '', '_value': u'corporate sales',
            '_name': u'channel', 'last_measurement_time': None, 'password': ''}
    if len(sys.argv) >= 3:
        form['command'] = 'get_measurement_times'
        form['last_measurement_time'] = sys.argv[2]
    elif len(sys.argv) >= 2:
        form['measurement_time'] = sys.argv[1]
    process(form, sys.stdout, print_answer)
else:
    from mod_python import apache, util
    directory = os.path.dirname(__file__)
