Connection with Qlik manual

It is possible to visualise data from ClockWise within Qlik, using a script in the Data Load Editor. This manual explains the steps to retrieve the data from ClockWise in a secure manner.

Creating an API client

First, you need to create an API client. This can be done in Configuration → Settings → OAuth/API clients. Click on 'Add OAuth client', give it a name (like 'Qlik'), and press 'Save'. Note: we will need the client ID later on, so keep this page open or save this ID somewhere.

Configuring the REST connectors

To create the connection, we need two(!) REST connectors in Qlik: one to retrieve the data, and one to manage the authentication in a safe manner.

The REST connector to retrieve the data

Within Qlik, create a REST connector with the following values:

  • URL: https://youraccountname.clockwise.info/api/v2/, where youraccountname is the name of your ClockWise account (often based on the company name)
  • Method: GET
  • Authentication Scheme: Anonymous (the script will manage the authentication)
  • Query headers: "Accept" "application/json"
  • Allow response headers: Turn on
  • Name: Create a useful name (like REST_data)
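This connector simply issues GET requests; the load script (further below) additionally follows the Link response header to fetch all pages of a report. As an illustration of that pagination mechanism, here is a small Python sketch (outside Qlik, with a made-up header value) of how the "next" page URL is extracted from such a header:

```python
def next_page_url(link_header):
    """Return the URL marked rel="next" in an HTTP Link header, or None."""
    for part in link_header.split(','):
        pieces = part.split(';')
        # The first piece is the URL in angle brackets; the rest are parameters like rel="next"
        url = pieces[0].strip().strip('<>')
        if any('rel="next"' in p for p in pieces[1:]):
            return url
    return None

# Illustrative header value, as a paginated REST API could return it:
header = '<https://acme.clockwise.info/api/v2/report/flat/hours?page=2>; rel="next", ' \
         '<https://acme.clockwise.info/api/v2/report/flat/hours?page=9>; rel="last"'
print(next_page_url(header))  # https://acme.clockwise.info/api/v2/report/flat/hours?page=2
```

The load script does the same with SubField and TextBetween, and stops when no rel="next" entry is left.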

The REST connector for authentication

Before we can create this connector, we need one additional step: we need an authorization code to retrieve the data for the first time. This first time, we exchange it for an access token and a refresh token; the first is used as a login code, the second can be used to retrieve new access tokens later. This prevents you from having to log in every time Qlik refreshes the data.

Warning! The authorization code is valid for one hour, so you have one hour between requesting the authorization code and retrieving the data for the first time. Otherwise, you have to request another authorization code and edit the REST connector afterwards.

Go in the browser to https://youraccountname.clockwise.info/api/v2/auth?response_type=code&redirect_uri=https%3A%2F%2Fexample.com&client_id=Client ID, where youraccountname is your ClockWise account name (as before), and Client ID is the client ID we obtained earlier. If all goes well, you are redirected to the ClockWise login page. Log in with your account (or the account of a dedicated data user). You are then redirected to "example.com/?code=authorization code". Save this authorization code.
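As an illustration of this step (the account name acme and client ID abc123 are made up), the authorization URL and the code in the redirect can be built and parsed like this in Python:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_auth_url(account, client_id, redirect_uri='https://example.com'):
    """Build the ClockWise authorization URL for the given account and client ID."""
    params = {'response_type': 'code', 'redirect_uri': redirect_uri, 'client_id': client_id}
    return f'https://{account}.clockwise.info/api/v2/auth?' + urlencode(params)

def code_from_redirect(redirect_url):
    """Extract the authorization code from the redirect URL."""
    return parse_qs(urlparse(redirect_url).query)['code'][0]

print(build_auth_url('acme', 'abc123'))
# https://acme.clockwise.info/api/v2/auth?response_type=code&redirect_uri=https%3A%2F%2Fexample.com&client_id=abc123
print(code_from_redirect('https://example.com/?code=SOME_CODE'))  # SOME_CODE
```

Note that the redirect_uri is URL-encoded, exactly as in the URL given above.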

Go back to Qlik. Create a new REST connector with the following values:

  • URL: https://youraccountname.clockwise.info/api/v2/token, where youraccountname is your ClockWise account name.
  • Method: POST
  • Request body: (fill in the different codes at the places indicated with asterisks, and remove the asterisks!)
  {
    "code": "*authorization code*",
    "grant_type": "authorization_code",
    "client_id": "*Client ID*"
  }
  • Query Headers: "Content-Type" "application/json", "Accept" "application/json"
  • Name: Create a useful name (like REST_token)
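The two request bodies that this connector will send (first with the authorization code, later with the refresh token, as done by the script below) look like this; sketched in Python with made-up placeholder values:

```python
import json

def code_exchange_body(authorization_code, client_id):
    """Body for the first call: exchange the authorization code for an access and refresh token."""
    return json.dumps({
        'code': authorization_code,
        'grant_type': 'authorization_code',
        'client_id': client_id,
    })

def refresh_body(refresh_token, client_id):
    """Body for later calls: exchange the refresh token for a new access token."""
    return json.dumps({
        'grant_type': 'refresh_token',
        'client_id': client_id,
        'refresh_token': refresh_token,
    })

print(code_exchange_body('*authorization code*', '*Client ID*'))
```

The first body is what you fill into the connector's Request body field; the second is what the script substitutes via WITH CONNECTION(BODY ...) on later reloads.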

Creating and configuring the script

Now open a Qlik application and go to a sheet. Open the Data Load Editor. If the application already existed, create a new script section.
Now copy and paste the code below into the editor (under the auto-generated localization variables):

// Macro for the parameters
SET GETFIELDS='PurgeChar($1Fields, ''"'')';

// ClockWise account parameters. These are necessary to build the connection to ClockWise:
//
// RESTName: The name of the REST connection to youraccount.clockwise.info/api/v2
// RESTToken: The name of the REST connection to youraccount.clockwise.info/api/v2/token
// accountName: The name of the ClockWise account (often based on your company name)
// client_id: The client ID of the API client. An administrator can create these at Configuration -> Settings -> OAuth/API clients.
SET RESTName=*REST_data*;
SET RESTToken=*REST_token*;
SET accountName=*youraccountname*;
SET client_id=*Client ID*;

// Parameters for retrieving the data.
// These are given as if they were in a URL (field=value&field2=value2&...)
//
// includeRates: 1 or 0. Denotes whether rate info is (1) or is not (0) retrieved alongside the other data.
//
// employeeParameters, resourceParameters, projectParameters and hourParameters: The parameters for the four tables.
// These variables are already filled with the field names and the sizes of the batches. DO NOT CHANGE THIS UNLESS YOU KNOW WHAT YOU ARE DOING!
// Extra field names can be added as a comma-separated list (starting with a comma) just before the &limit=10000 (and after the single quote).
// (e.g. ...&$(GETFIELDS(employee))&',my_employee_field,my_second_employee_field,!employee_name&limit=10000')
// Warning! Group names are not supported! Give the actual field names you want to retrieve (otherwise there will be a field named something like 'defaultfields' which is completely empty...)
// An exclamation mark negates the field.
// Furthermore, you can add extra parameters, for example {pathfield}_split, or allprojects, etc.
// For an overview, go to https://youraccountname.clockwise.info/api/v2/doc#hours-in-flat-format, with 'youraccountname' being your ClockWise account name.

SET includeRates=0;
SET employeeParameters='fields='&$(GETFIELDS(employee))&'&limit=10000';
SET resourceParameters='fields='&$(GETFIELDS(resource))&'&limit=10000';
SET projectParameters='fields='&$(GETFIELDS(project))&'&limit=10000';
SET hourParameters='fields='&$(GETFIELDS(hour))&'&limit=10000';

// Actual begin of script

// If we still have a refreshToken and accessToken, we preload them
if not isNull(refreshToken) and not isNull(accessToken) then
  Tokens:
  LOAD token, refresh INLINE [
    token refresh
    $(accessToken) $(refreshToken)
  ] (delimiter is spaces);
end if

// This subroutine creates a table with the given tablename, based on the fields given in query,
// loading data from the endpoint given by url, optionally renaming the fieldnames using a mapping table,
// and optionally renaming the id (to a duplicate name in another table) using renameID.
SUB getRecords(tablename, query, url, renameTable, renameID)

  // apiname -> actual ClockWise url
  let url='https://$(accountName).clockwise.info/api/v2$(url)';

  // Connect with the data REST connector
  LIB CONNECT TO $(RESTName);

  SET ErrorMode=0;

  // A retry-on-error policy is used to load the data. There are 3 attempts. If an attempt fails, a new access token is retrieved, and
  // the attempt is made again. After 3 attempts, we stop to avoid endless loops.
  FOR i=1 to 3

    DO
      if not isNull(accessToken) then
        // If we have an access token, we try to retrieve the given fields from the given url
        result:
        REPLACE SELECT "Link",
          (SELECT $(query)
          FROM (wrap on) "root" FK "__FK_root")
        FROM JSON "_response_header" PK "__KEY_response_header"
        WITH CONNECTION(URL "$(url)", HTTPHEADER "Authorization" "Bearer $(accessToken)");

        if ScriptError = 12 then // Probably an unauthorized error. Invalidate the access token.
          let accessToken=null();
          drop table result;
        else
          // No error; we look whether there are more pages to load (pagination)
          LET Link = peek('Link', 0, 'result');

          // We copy the data from the temporary table "result" to the actual table $(tablename), to get rid of the extra keys and the header.
          // If renameID is given, we rename the id here to the given name
          if not isNull(renameID) then
            $(tablename):
            LOAD id as $(renameID), *;
            LOAD $(query) RESIDENT result;
            DROP FIELD id FROM $(tablename);
          else
            $(tablename):
            LOAD $(query) RESIDENT result;
          end if

          // We parse the Link header to see if there is a "next" page
          Link:
          LOAD TextBetween(link, '<', '>') as new_url WHERE rel LIKE '*next*';
          LOAD SubField(link1, ';', 1) AS link, SubField(link1, ';', 2) AS rel;
          LOAD SubField('$(Link)', ',') AS link1 AUTOGENERATE 1;

          let url = peek('new_url', 0, 'Link');

          drop table Link;

        end if
      end if
      // If we got another url from the Link header, we loop again.
      // If there was no accessToken, we stop the loop execution to retrieve a new access token
    LOOP UNTIL isNull(url) or isNull(accessToken);

    if not isNull(accessToken) then
      if not isNull(renameTable) then
        // We rename the fields using the mapping table. To avoid concatenation problems, we do this outside the pagination loop
        RENAME FIELDS USING "$(renameTable)";
      end if
      drop table result;
      // We are done loading the data. Stop the retry loop and exit
      exit for;
    end if

    if isNull(accessToken) then // accessToken did not exist yet or was invalid

      LIB CONNECT TO $(RESTToken);

      SET ErrorMode=1; // If retrieving a new token is not successful, stop execution. A serious error occurred.

      if not isNull(refreshToken) then // Retrieve a new access token with the refreshToken

        set refreshString=[{""grant_type"": ""refresh_token"", ""client_id"": ""$(client_id)"", ""refresh_token"": ""$(refreshToken)""}];

        Tokens:
        REPLACE LOAD "access_token" as "token", '$(refreshToken)' as "refresh";
        SELECT "access_token"
        FROM JSON(wrap on) "root"
        WITH CONNECTION(BODY "$(refreshString)");

        LET accessToken = peek('token', 0, 'Tokens');
      else // Use the authorization code which was filled in in the REST connector

        Tokens:
        REPLACE SELECT "access_token" as "token", "refresh_token" as "refresh"
        FROM JSON(wrap on) "root";

        LET refreshToken = peek('refresh', 0, 'Tokens');
        LET accessToken  = peek('token', 0, 'Tokens');
      end if;

      // Connect with the data REST connector again,
      // and reset the error mode (so that the script won't halt when the new accessToken is somehow invalid)
      LIB CONNECT TO $(RESTName);
      SET ErrorMode=0;
    end if;

    // If we are here, we retrieved a new access token. We should retry getting the data, therefore loop again.
  NEXT i;

  // Reset the error mode. Ignoring errors was only needed for the SELECT step, to catch the unauthorized errors
  SET ErrorMode=1;

END SUB


// This subroutine parses the given GET parameters, to modify which fields are loaded.
// As a part of the parameters are generated using the fields, we need to compare the new fields with the fields we already have
SUB getFieldsFromParameters(parameters, fields)
  // Put the parameters in a table, so that they can be processed below
  ParameterTable:
  LOAD SubField(parameter, '=', 1) AS parameterName, SubField(parameter, '=', 2) AS parameterValue;
  LOAD SubField('$(parameters)', '&') AS parameter AUTOGENERATE 1;

  // Process 1: Get all the extra fields given in the parameters which were not already retrieved via the GETFIELDS macro.
  // When an exclamation mark is given, we need to exclude the field.
  // FieldNames contains the fields from the 'fields' variable
  FieldNames:
  LOAD TextBetween(fieldNames, '"', '"') as fieldNames;
  LOAD SubField('$(fields)', ',') AS fieldNames AUTOGENERATE 1;

  // ExtraFields contains the fields from the fields query parameter in the 'parameters' variable which do not exist in the fields variable
  ExtraFields:
  LOAD extraFieldNames WHERE NOT Exists(fieldNames, extraFieldNames); // Check which fields do not already exist in the FieldNames table
  NOCONCATENATE LOAD SubField(parameterValue, ',') as extraFieldNames RESIDENT ParameterTable WHERE parameterName = 'fields';

  // NewFields concatenates the extra fields to the already existing fields, unless they begin with '!'. In that case, they are excluded instead.
  NewFields:
  NOCONCATENATE LOAD fieldNames RESIDENT FieldNames WHERE NOT EXISTS(extraFieldNames, '!'&fieldNames);
  CONCATENATE (NewFields) LOAD extraFieldNames as fieldNames RESIDENT ExtraFields WHERE NOT Left(extraFieldNames, 1) = '!';

  // We now need to convert the "list" of field names into a string again
  FieldString:
  LOAD '"' & Concat(fieldNames, '","') & '"' as string RESIDENT NewFields;

  fields = peek('string');

  // Process 2: Process the split parameters to replace the '{pathlist}_path' field with '{pathlist}_\d+' fields.
  // We do this in a triple load statement: we find the number of columns we need to generate, we process the parameter to get {pathlist}_path out of
  // {pathlist}_split, and we create a mapping table from the _path fields to the newly generated _\d+ fields
  Parameters:
  MAPPING LOAD '"' & parameterBase & 'path"' AS oldFieldNames, '"'&Concat(newFieldName, '","')&'"' AS fieldNames GROUP BY parameterBase;
  LOAD parameterBase, parameterBase & IterNo() AS newFieldName WHILE IterNo() <= parameterCount;
  LOAD parameterName, Left(parameterName, Index(parameterName, '_split', -1)) AS parameterBase,
    IF(parameterValue='*', 9, SubStringCount(parameterValue, ',')+2) AS parameterCount
    RESIDENT ParameterTable WHERE parameterName LIKE '*_split';

  // We do a search/replace on the fields using the just generated mapping table
  fields = MapSubString('Parameters', fields);

  // Clean up
  DROP TABLES ParameterTable, FieldNames, ExtraFields, NewFields, FieldString;

END SUB

// The mapping tables, to rename fields which are duplicated in the different tables but are not meant as keys between the tables
EmployeesMap:
MAPPING LOAD old_employee, new_employee INLINE [
  old_employee new_employee
  hourly_sales_rate employee_hourly_sales_rate
  hourly_sales_rate_currency employee_hourly_sales_rate_currency
  hourly_sales_rate_id employee_hourly_sales_rate_id
  hourly_sales_rate_name employee_hourly_sales_rate_name
  hourly_sales_rate_start employee_hourly_sales_rate_start
  km_sales_rate employee_km_sales_rate
  km_sales_rate_currency employee_km_sales_rate_currency
  km_sales_rate_id employee_km_sales_rate_id
  km_sales_rate_name employee_km_sales_rate_name
  km_sales_rate_start employee_km_sales_rate_start
  unit_sales_rate employee_unit_sales_rate
  unit_sales_rate_currency employee_unit_sales_rate_currency
  unit_sales_rate_id employee_unit_sales_rate_id
  unit_sales_rate_name employee_unit_sales_rate_name
  unit_sales_rate_start employee_unit_sales_rate_start
  hourly_purchase_rate employee_hourly_purchase_rate
  hourly_purchase_rate_currency employee_hourly_purchase_rate_currency
  hourly_purchase_rate_id employee_hourly_purchase_rate_id
  hourly_purchase_rate_name employee_hourly_purchase_rate_name
  hourly_purchase_rate_start employee_hourly_purchase_rate_start
  hourly_wage employee_hourly_wage
  hourly_wage_currency employee_hourly_wage_currency
  hourly_wage_id employee_hourly_wage_id
  hourly_wage_name employee_hourly_wage_name
  hourly_wage_start employee_hourly_wage_start
  km_purchase_rate employee_km_purchase_rate
  km_purchase_rate_currency employee_km_purchase_rate_currency
  km_purchase_rate_id employee_km_purchase_rate_id
  km_purchase_rate_name employee_km_purchase_rate_name
  km_purchase_rate_start employee_km_purchase_rate_start
  unit_purchase_rate employee_unit_purchase_rate
  unit_purchase_rate_currency employee_unit_purchase_rate_currency
  unit_purchase_rate_id employee_unit_purchase_rate_id
  unit_purchase_rate_name employee_unit_purchase_rate_name
  unit_purchase_rate_start employee_unit_purchase_rate_start
] (delimiter is spaces);

ResourcesMap:
MAPPING LOAD old_resources, new_resources INLINE [
  old_resources new_resources
  resource_parent_id project_id
  hourly_sales_rate resource_hourly_sales_rate
  hourly_sales_rate_currency resource_hourly_sales_rate_currency
  hourly_sales_rate_id resource_hourly_sales_rate_id
  hourly_sales_rate_name resource_hourly_sales_rate_name
  hourly_sales_rate_start resource_hourly_sales_rate_start
  km_sales_rate resource_km_sales_rate
  km_sales_rate_currency resource_km_sales_rate_currency
  km_sales_rate_id resource_km_sales_rate_id
  km_sales_rate_name resource_km_sales_rate_name
  km_sales_rate_start resource_km_sales_rate_start
  unit_sales_rate resource_unit_sales_rate
  unit_sales_rate_currency resource_unit_sales_rate_currency
  unit_sales_rate_id resource_unit_sales_rate_id
  unit_sales_rate_name resource_unit_sales_rate_name
  unit_sales_rate_start resource_unit_sales_rate_start
  hourly_purchase_rate resource_hourly_purchase_rate
  hourly_purchase_rate_currency resource_hourly_purchase_rate_currency
  hourly_purchase_rate_id resource_hourly_purchase_rate_id
  hourly_purchase_rate_name resource_hourly_purchase_rate_name
  hourly_purchase_rate_start resource_hourly_purchase_rate_start
  hourly_wage resource_hourly_wage
  hourly_wage_currency resource_hourly_wage_currency
  hourly_wage_id resource_hourly_wage_id
  hourly_wage_name resource_hourly_wage_name
  hourly_wage_start resource_hourly_wage_start
  km_purchase_rate resource_km_purchase_rate
  km_purchase_rate_currency resource_km_purchase_rate_currency
  km_purchase_rate_id resource_km_purchase_rate_id
  km_purchase_rate_name resource_km_purchase_rate_name
  km_purchase_rate_start resource_km_purchase_rate_start
  unit_purchase_rate resource_unit_purchase_rate
  unit_purchase_rate_currency resource_unit_purchase_rate_currency
  unit_purchase_rate_id resource_unit_purchase_rate_id
  unit_purchase_rate_name resource_unit_purchase_rate_name
  unit_purchase_rate_start resource_unit_purchase_rate_start
  absent resource_absent
  cost_center_id resource_cost_center_id
  cost_center_name resource_cost_center_name
  cost_center_number resource_cost_center_number
  cost_unit_id resource_cost_unit_id
  cost_unit_name resource_cost_unit_name
  cost_unit_number resource_cost_unit_number
  end_date resource_end_date
  hours_budgetted resource_hours_budgetted
  ledger_number resource_ledger_number
  ledger_id resource_ledger_id
  ledger_name resource_ledger_name
  start_date resource_start_date
] (delimiter is spaces);
ProjectsMap:
MAPPING LOAD old_projects, new_projects INLINE [
  old_projects new_projects
  hourly_sales_rate project_hourly_sales_rate
  hourly_sales_rate_currency project_hourly_sales_rate_currency
  hourly_sales_rate_id project_hourly_sales_rate_id
  hourly_sales_rate_name project_hourly_sales_rate_name
  hourly_sales_rate_start project_hourly_sales_rate_start
  km_sales_rate project_km_sales_rate
  km_sales_rate_currency project_km_sales_rate_currency
  km_sales_rate_id project_km_sales_rate_id
  km_sales_rate_name project_km_sales_rate_name
  km_sales_rate_start project_km_sales_rate_start
  unit_sales_rate project_unit_sales_rate
  unit_sales_rate_currency project_unit_sales_rate_currency
  unit_sales_rate_id project_unit_sales_rate_id
  unit_sales_rate_name project_unit_sales_rate_name
  unit_sales_rate_start project_unit_sales_rate_start
  hourly_purchase_rate project_hourly_purchase_rate
  hourly_purchase_rate_currency project_hourly_purchase_rate_currency
  hourly_purchase_rate_id project_hourly_purchase_rate_id
  hourly_purchase_rate_name project_hourly_purchase_rate_name
  hourly_purchase_rate_start project_hourly_purchase_rate_start
  hourly_wage project_hourly_wage
  hourly_wage_currency project_hourly_wage_currency
  hourly_wage_id project_hourly_wage_id
  hourly_wage_name project_hourly_wage_name
  hourly_wage_start project_hourly_wage_start
  km_purchase_rate project_km_purchase_rate
  km_purchase_rate_currency project_km_purchase_rate_currency
  km_purchase_rate_id project_km_purchase_rate_id
  km_purchase_rate_name project_km_purchase_rate_name
  km_purchase_rate_start project_km_purchase_rate_start
  unit_purchase_rate project_unit_purchase_rate
  unit_purchase_rate_currency project_unit_purchase_rate_currency
  unit_purchase_rate_id project_unit_purchase_rate_id
  unit_purchase_rate_name project_unit_purchase_rate_name
  unit_purchase_rate_start project_unit_purchase_rate_start
] (delimiter is spaces);

// Unfortunately, you cannot load everything in one go, so we make a table containing the fields we can get from the API.
// The columns with '1' or '0' denote in which endpoint we should load each field. The "rates" column is controlled by the includeRates variable
Fields:
LOAD * INLINE [
fieldname                     Employees Projects Hours Resources rates
id                            1         1        1     1         0
customer_code_path            0         1        0     0         0
customer_id_path              0         1        0     0         0
customer_name_path            0         1        0     0         0
date                          0         0        1     0         0
day                           0         0        1     0         0
department_code_path          1         0        0     0         0
department_id_path            1         0        0     0         0
department_name_path          1         0        0     0         0
employee_id                   0         0        0     1         0
employee_name                 1         0        0     0         0
hours                         0         0        1     0         0
project_code_path             0         1        0     0         0
project_id_path               0         1        0     0         0
project_name_path             0         1        0     0         0
remark                        0         0        1     0         0
resource_id                   0         0        1     0         0
resource_parent_id            0         0        0     1         0
resource_status               0         0        0     1         0
weeknumber                    0         0        1     0         0
absent                        0         1        0     1         0
cost_center_id                0         1        0     1         0
cost_center_name              0         1        0     1         0
cost_center_number            0         1        0     1         0
cost_unit_id                  0         1        0     1         0
cost_unit_name                0         1        0     1         0
cost_unit_number              0         1        0     1         0
end_date                      0         1        0     1         0
hours_budgetted               0         1        0     1         0
internal                      0         1        0     0         0
ledger_id                     0         1        0     1         0
ledger_name                   0         1        0     1         0
ledger_number                 0         1        0     1         0
projectmanagers               0         1        0     0         0
project_type_inherited        0         1        0     0         0
reference_number              0         1        0     0         0
start_date                    0         1        0     1         0
unbillable                    0         1        0     0         0
authorization                 1         0        0     0         0
contract_end                  0         0        0     0         0
contract_hours_max            1         0        0     0         0
contract_hours_min            1         0        0     0         0
contract_hours_scope          0         0        0     0         0
contract_start                0         0        0     0         0
contract_type                 0         0        0     0         0
employee_email                1         0        0     0         0
employee_function             1         0        0     0         0
salary_number                 1         0        0     0         0
usergroup_id                  1         0        0     0         0
usergroup_name                1         0        0     0         0
hourly_sales_rate             1         1        1     1         1
hourly_sales_rate_currency    1         1        1     1         1
hourly_sales_rate_id          1         1        1     1         1
hourly_sales_rate_name        1         1        1     1         1
hourly_sales_rate_start       1         1        1     1         1
invoice_id                    0         0        1     0         1
invoice_number                0         0        1     0         1
km_sales_rate                 1         1        1     1         1
km_sales_rate_currency        1         1        1     1         1
km_sales_rate_id              1         1        1     1         1
km_sales_rate_name            1         1        1     1         1
km_sales_rate_start           1         1        1     1         1
unit_sales_rate               1         1        1     1         1
unit_sales_rate_currency      1         1        1     1         1
unit_sales_rate_id            1         1        1     1         1
unit_sales_rate_name          1         1        1     1         1
unit_sales_rate_start         1         1        1     1         1
hourly_purchase_rate          1         1        1     1         1
hourly_purchase_rate_currency 1         1        1     1         1
hourly_purchase_rate_id       1         1        1     1         1
hourly_purchase_rate_name     1         1        1     1         1
hourly_purchase_rate_start    1         1        1     1         1
hourly_wage                   1         1        1     1         1
hourly_wage_currency          1         1        1     1         1
hourly_wage_id                1         1        1     1         1
hourly_wage_name              1         1        1     1         1
hourly_wage_start             1         1        1     1         1
km_purchase_rate              1         1        1     1         1
km_purchase_rate_currency     1         1        1     1         1
km_purchase_rate_id           1         1        1     1         1
km_purchase_rate_name         1         1        1     1         1
km_purchase_rate_start        1         1        1     1         1
reverse_invoice_id            0         0        1     0         1
reverse_invoice_number        0         0        1     0         1
unit_purchase_rate            1         1        1     1         1
unit_purchase_rate_currency   1         1        1     1         1
unit_purchase_rate_id         1         1        1     1         1
unit_purchase_rate_name       1         1        1     1         1
unit_purchase_rate_start      1         1        1     1         1
hourstatus                    0         0        1     0         0
hourstatus_modified_at        0         0        1     0         0
hourstatus_modified_by        0         0        1     0         0
comute_km                     0         0        1     0         0
expenses                      0         0        1     0         0
export_ch1_id                 0         0        1     0         0
export_ch2_id                 0         0        1     0         0
pause                         0         0        0     0         0
start                         0         0        0     0         0
travel_costs                  0         0        1     0         0
travel_km                     0         0        1     0         0
units                         0         0        1     0         0
] (delimiter is spaces);

// For each of the four tables, we do the following:
// - We load the fieldnames from the Fields table based on the '1's and '0's of the corresponding field (and the includeRates variable)
// - We expand the macro of the parameters (this macro is based on the *Fields variable loaded just before) to load the GET-parameters.
// - We adjust the fields based on the GET parameters (the ones which were not generated by the macro)
// - We retrieve the records based on the fields we just calculated.
EmployeeFields:
NOCONCATENATE LOAD '"'&Concat(fieldname, '","')&'"' as fieldnames RESIDENT Fields WHERE Employees=1 AND ($(includeRates)=1 OR rates=0) GROUP BY Employees;
let employeeFields = peek('fieldnames', 0, 'EmployeeFields');
let parameters = $(employeeParameters);
call getFieldsFromParameters(parameters, employeeFields);
call getRecords('Employees', employeeFields, '/report/flat/employees?'& parameters, 'EmployeesMap', 'employee_id');

ResourceFields:
NOCONCATENATE LOAD '"'&Concat(fieldname, '","')&'"' as fieldnames RESIDENT Fields WHERE Resources=1 AND ($(includeRates)=1 OR rates=0) GROUP BY Resources;
let resourceFields = peek('fieldnames', 0, 'ResourceFields');
let parameters = $(resourceParameters);
call getFieldsFromParameters(parameters, resourceFields);
call getRecords('Resources', resourceFields, '/report/flat/resources?'& parameters, 'ResourcesMap', 'resource_id');

ProjectFields:
NOCONCATENATE LOAD '"'&Concat(fieldname, '","')&'"' as fieldnames RESIDENT Fields WHERE Projects=1 AND ($(includeRates)=1 OR rates=0) GROUP BY Projects;
let projectFields = peek('fieldnames', 0, 'ProjectFields');
let parameters = $(projectParameters);
call getFieldsFromParameters(parameters, projectFields);
call getRecords('Projects', projectFields, '/report/flat/projects?'& parameters, 'ProjectsMap', 'project_id');

HourFields:
NOCONCATENATE LOAD '"'&Concat(fieldname, '","')&'"' as fieldnames RESIDENT Fields WHERE Hours=1 AND ($(includeRates)=1 OR rates=0) GROUP BY Hours;
let hourFields = peek('fieldnames', 0, 'HourFields');
let parameters = $(hourParameters);
call getFieldsFromParameters(parameters, hourFields);
call getRecords('Hours', hourFields, '/report/flat/hours?'& parameters);

DROP TABLES Fields, EmployeeFields, ProjectFields, ResourceFields, HourFields, Tokens;

// Auto-generate derived fields based on the date fields, to allow analysis per year, month, etc.
[autoCalendar]:
DECLARE FIELD DEFINITION Tagged ('$date')
FIELDS
  Dual(Year($1), YearStart($1)) AS [Year] Tagged ('$axis', '$year'),
  Dual('Q'&Num(Ceil(Num(Month($1))/3)),Num(Ceil(NUM(Month($1))/3),00)) AS [Quarter] Tagged ('$quarter', '$cyclic'),
  Dual(Year($1)&'-Q'&Num(Ceil(Num(Month($1))/3)),QuarterStart($1)) AS [YearQuarter] Tagged ('$yearquarter', '$qualified'),
  Dual('Q'&Num(Ceil(Num(Month($1))/3)),QuarterStart($1)) AS [_YearQuarter] Tagged ('$yearquarter', '$hidden', '$simplified'),
  Month($1) AS [Month] Tagged ('$month', '$cyclic'),
  Dual(Year($1)&'-'&Month($1), monthstart($1)) AS [YearMonth] Tagged ('$axis', '$yearmonth', '$qualified'),
  Dual(Month($1), monthstart($1)) AS [_YearMonth] Tagged ('$axis', '$yearmonth', '$simplified', '$hidden'),
  Dual('W'&Num(Week($1),00), Num(Week($1),00)) AS [Week] Tagged ('$weeknumber', '$cyclic'),
  Date(Floor($1)) AS [Date] Tagged ('$axis', '$date', '$qualified'),
  Date(Floor($1), 'D') AS [_Date] Tagged ('$axis', '$date', '$hidden', '$simplified'),
  If (DayNumberOfYear($1) <= DayNumberOfYear(Today()), 1, 0) AS [InYTD] ,
  Year(Today())-Year($1) AS [YearsAgo] ,
  If (DayNumberOfQuarter($1) <= DayNumberOfQuarter(Today()),1,0) AS [InQTD] ,
  4*Year(Today())+Ceil(Month(Today())/3)-4*Year($1)-Ceil(Month($1)/3) AS [QuartersAgo] ,
  Ceil(Month(Today())/3)-Ceil(Month($1)/3) AS [QuarterRelNo] ,
  If(Day($1)<=Day(Today()),1,0) AS [InMTD] ,
  12*Year(Today())+Month(Today())-12*Year($1)-Month($1) AS [MonthsAgo] ,
  Month(Today())-Month($1) AS [MonthRelNo] ,
  If(WeekDay($1)<=WeekDay(Today()),1,0) AS [InWTD] ,
  (WeekStart(Today())-WeekStart($1))/7 AS [WeeksAgo] ,
  Week(Today())-Week($1) AS [WeekRelNo] ;

DERIVE FIELDS FROM FIELDS [resource_start_date], [resource_end_date], [date], [hourstatus_modified_at], [start_date], [end_date] USING [autoCalendar] ;

At the beginning of the script, there are a few variables which need to be changed for the connection to work properly. They are explained by the comments in the code, but we go through them here quickly as well.

  • RESTName (filled in: REST_data): Fill in the name of the REST connector which handles the data.
  • RESTToken (filled in: REST_token): Fill in the name of the REST connector which handles the authentication.
  • accountName (filled in: youraccountname): Fill in your ClockWise account name.
  • client_id (filled in: Client ID): Fill in the client ID of the API client you created earlier.
  • includeRates: When this variable is set to 1, rates are retrieved from ClockWise for the hours, resources, projects and employees.
  • employeeParameters: This variable contains the GET parameters for retrieving the employees (the part after the '?' in a URL). The standard fields are already given (via the '&$(GETFIELDS(employee))&' construction), as well as the size of the pages that are retrieved from the API. Do not change this limit parameter! You can add extra fields to retrieve from the API, or exclude fields, by giving a comma-separated list, starting with a comma, behind the GETFIELDS construction. By prepending a field name with an exclamation mark, it is excluded from the data. Example: fields='&$(GETFIELDS(employee))&',!employee_name,hourly_sales_rate_id&limit=10000. Moreover, extra parameters can be appended (for example department_name_split).
  • resourceParameters, projectParameters and hourParameters: These work the same way as employeeParameters.
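The combination rule for these field lists can be sketched in Python for clarity (merge_fields is an illustrative helper, not part of the Qlik script): extras are appended to the standard fields, and a leading exclamation mark removes a standard field instead.

```python
def merge_fields(default_fields, extra_fields):
    """Combine default fields with extras; a leading '!' excludes a default field."""
    excluded = {e[1:] for e in extra_fields if e.startswith('!')}
    added = [e for e in extra_fields if not e.startswith('!')]
    return [f for f in default_fields if f not in excluded] + added

print(merge_fields(['id', 'employee_name', 'employee_email'],
                   ['!employee_name', 'salary_number']))
# ['id', 'employee_email', 'salary_number']
```

This mirrors what the getFieldsFromParameters subroutine does with the FieldNames, ExtraFields and NewFields tables.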

By clicking "Load data", the data from ClockWise will be loaded.

Note: It is possible that an error like "Connector error: HTTP protocol error 401 (Unauthorized)" is printed. This means that Qlik is refreshing its access token. If this is the only error that is printed, you can ignore it.
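The retry behaviour behind this message can be sketched as follows (plain Python with stand-in functions; the real script does the equivalent with the two REST connectors and a limit of 3 attempts):

```python
def fetch_with_retry(get_page, get_new_token, token, attempts=3):
    """Try to fetch; on a 401 response, get a new access token and retry."""
    for _ in range(attempts):
        status, body = get_page(token)
        if status != 401:
            return body
        token = get_new_token()  # the 401 is expected here and can be ignored
    raise RuntimeError('still unauthorized after refreshing the access token')

# Stand-in function simulating one expired token followed by a valid one:
def fake_get(token):
    return (200, 'rows') if token == 'fresh' else (401, '')

print(fetch_with_retry(fake_get, lambda: 'fresh', 'expired'))  # rows
```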

The error message "HTTP protocol error 400" can mean that one of the ...Parameters variables is ill-formed, or contains a field which is not allowed in that report. Look at the error message for more details.

This error (HTTP protocol error 400) can also mean that too many fields are enabled. You can recognise this by the error message Illegal field identifier:
If it is impossible to configure ledgers, cost centers and cost units in ClockWise, the fields starting with cost_center_..., cost_unit_... and ledger_... are unnecessary, and the cause of the error. Change the '1's into '0's on those lines in the script.

Moreover, if includeRates is turned on and units cannot be billed, the fields starting with unit_ (unit_sales_... and unit_purchase_...) are superfluous. Remove them by changing the '1's into '0's.
Thirdly, kilometer purchase rates may not be available in ClockWise. Disable the km_purchase_... fields if that is the case.