Category Archives: Qlik Sense

A Journey to My First Published Extension

Like Karl Pover, I’m curious to learn more about writing Qlik Sense extensions and other opportunities to use the Sense APIs.   I’ve created some throwaway examples in class, usually working directly in the Sense/Extensions directory.

I reached the point where I wanted to get a bit more serious about the mechanics of writing and maintaining Sense code. This post discusses some of my journey in discovering and implementing a structure for writing and publishing Qlik Sense extensions. (For tutorials on writing the actual extension code, see Qlik Help or websy.io.)

First, my extension project.  I like the script export/import function in the QlikView script editor and have missed this function in Sense.  So I created an extension that provides buttons to Export and Import the script to and from text files.

Yes, it’s true — a “Visualization Extension” that visualizes nothing.

You can click “Export Script” to send the current script to a text file, and “Import Script”  to replace the current script from a text file of your choosing. You can also drop a text file on the Import button.  You can find the extension with download links here.

On to the focus of this post, which is “how-to”.

Naming Conventions

As the extension landscape expands, how many extensions named “Super Bar Chart” will be created and published on GitHub? We need a naming convention that allows everyone to co-exist and avoid collisions.  I adopted the common open-source naming convention of prefixing assets with my reversed domain name “com.qlikcookbook”.   So the formal name of my extension is:

com-qlikcookbook-ExportScript

All my files (js, css, etc.) carry this namespace prefix. The “name” property of the required .qext file provides a friendly name that will display in the Assets panel.  I chose the shorter “Export/Import Script” for the name.

An extension lives inside a larger application and must play well and share with others.  It’s not a good idea to name your HTML div “output”, as you may collide with others who use the same name.  I used the same prefix for any elements that may have scope outside my extension, specifically HTML ids and CSS class names. The Qlik Help has some recommendations on this topic.

Directory Structure

I looked at a number of different directory structures on GitHub that other folks had used for existing extensions.  I settled on the layout recommended by Stefan Walther in the documentation for his sense-go tool.  The design made sense, and I was also interested in using sense-go for building.

Loading External Libraries

I used a couple of external libraries in the project to handle the file download and the drag & drop function. Instead of referencing those libraries with HTML links, I learned how to use require.js, an integral tool in the coding patterns of Qlik Sense.  I also used require.js to load my CSS and HTML files. It’s a great tool.

Building and Deploying

The standard way of writing code is to keep your source in one location and then prep and package the files into an installation or runtime bundle.  There are many advantages to following this pattern.  We also want a way to automatically redeploy updated code to Sense Desktop or Server for testing, and to upload a release package when ready.  As a starter build & deploy process, I chose Stefan Walther’s sense-go tool.  In addition to automating the process, I found the task chain to be a good knowledge transfer from an experienced Sense developer.

Other Tools

Everyone has a favorite editor, and I tried a few on this project.  I found I liked VS Code best.

For managing the git repository,  I used GitHub Desktop.  You may like another tool or be a command line fan.

If you are starting out writing extensions, I hope these notes help give you some direction.

Note: This extension has been deprecated in favor of the DevTool extension, which provides the same functionality and more.

Cookbook Tools Updates

Just a quick note about some recent updates to the Tools available on QlikViewCookbook.com

  • QV Document Analyzer V3.5 
    • Added a new computed field, “Expression Table Count”, that identifies how many tables are involved in a given expression.  Expressions that use data from more than one table typically run slower than those with all data in a single table.
    • Added “Like Objects Count” attribute for Objects, identifying candidates for linked objects.
    • Bug fixes.
  • Copy Groups Utility V2 allows for copying groups within the same QVW.
  • Script Log Analyzer V1.6 can analyze reload logs from both QlikView and Qlik Sense, Desktop and Server versions.  The interface is available in four languages.

-Rob

Vizlib: Innovation and Agility

Summary:  As a provider of Qlik Sense visualization extensions, Vizlib promises to blend the innovation and agility of open source with the reliability of commercial software.

The pace of delivery for visualizations in Qlik Sense has been disappointing to some.  Thanks to the rich open APIs in Qlik Sense, much of the slack has been taken up by the community in the form of open source visualizations, as seen on branch.qlik.com.  There you can see some remarkable innovation and responsiveness to community requests.

Should you use open source visualizations?  An open source extension may provide just what you need in your project.  But you should consider the unsupported nature of open source and evaluate the risk and consequences of the visualization possibly failing at some point.

When I worked as  a Java developer  my team used the open source Eclipse IDE as our main tool.  We also used a dozen or more open source plugins.  As our plugin library grew, we found that testing and updating plugins between releases was taking an inordinate amount of time, sometimes making us afraid to upgrade the base Eclipse product.   We  turned to a vendor and purchased a bundled version of Eclipse with dozens of plugins tested, supported and verified. Problem solved.

Vizlib is a new company that promises to blend the innovation and agility of open source with the commercial support that many customers demand.  Vizlib is partnering with the best extension authors to produce a library of fully supported, high-function visualization extensions. Check out some of what they have published so far:

  • A Table object that delivers the functionality of  a QlikView table and much more.
    • Rich Formatting Options just like in Excel/QlikView
    • Conditional show & hide of columns
    • Dynamic Labels
    • Minicharts & Progress Bars

  • An Advanced Text Object that provides the full functionality of the classic QlikView text object (including actions!) plus additional capabilities like HTML code and icons.

To date Vizlib has released five visualizations with more on the way. Check out what’s available on the Vizlib download page.  A free trial license is offered that allows you to try any of the extensions in your own installation.  Take a look!

-Rob

Storing a Data Model in a Single QVD

Have you ever thought it might be interesting to store a  Qlik data model into a single QVD?  This can be useful in a number of cases such as:

  • Archiving (and retrieving) data models.
  • Overcoming the “single binary load” restriction.

QlikView Components (QVC) Version 11 introduced two new routines to do just that:

Qvc.ExportModel — Exports all tables of the current model into a single QVD.

Qvc.ImportModel — Import a data model created by Qvc.ExportModel.

Even if you don’t have QVC V11 installed, you can try Qvc.ExportModel right now using an http include.  Add these lines to any QlikView script (instructions for Qlik Sense are further down in this post).

$(Must_Include=https://github.com/RobWunderlich/Qlikview-Components/releases/download/v11.1/Qvc.qvs);
CALL Qvc.ExportModel;
CALL Qvc.Cleanup;

Mind the wrap: the Must_Include should be on one line. Using QVC requires that the Qvc.qvs library be included (usually at the beginning of the script), CALLing the Qvc routines you want, and CALLing the Cleanup routine at the end of your script.

Assuming this script is included in “Sales Dash.qvw”,  the default exported model QVD will be named “Sales Dash.qvd” in the same directory.

 

Now, to import this QVD model into another qvw, replace the CALL to ExportModel in the above sample with:

CALL Qvc.ImportModel('Sales Dash.qvd');

The original model will be reconstructed as individual tables.

Qvc.ExportModel has three optional parameters:

CALL Qvc.ExportModel(['qvddir'],['qvdname'],['addTimestamp']);

  • Parameter 1: String, optional. Relative or absolute directory where the model QVD will be stored. If relative, it follows the same rules as the STORE script statement.
  • Parameter 2: String, optional. Name for the model QVD. If omitted, the name of the QVW will be used. For example, if the QVW is “Sales.qvw”, the QVD will be “Sales.qvd”.
  • Parameter 3: String, optional. 1/0 (True/False). If True, a timestamp of the form _YYYYMMDDhhmmss will be appended to the QVD name. Default if omitted is False.
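
Putting the optional parameters together, a call might look like the following sketch; the directory and QVD name are placeholders for illustration.

// Hypothetical example: store the model in a "QVD" subdirectory,
// name it SalesModel.qvd, and append a timestamp to the name.
CALL Qvc.ExportModel('QVD', 'SalesModel.qvd', 1);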

 

Qlik Sense has no default path, so parameter #1, a lib:// location for the QVD, should be specified.  Alternatively, if a lib has been established with a DIRECTORY statement, parameter 1 can be omitted.

Qlik Sense will require a web file Connection for the http Must_Include.

(Screenshot: defining the web file connection)

After defining the web connection, and with an appropriate folder connection to store the QVD in, the Qlik Sense script would look like this:

$(Must_Include=lib://QvcWeb);
CALL Qvc.ExportModel('lib://QVData');
CALL Qvc.Cleanup;

 

That’s all there is to it!  If you are already using QVC, I hope you’ll find these routines a welcome addition to the library.  If you are new to QVC, explore more at QlikviewComponents.org.

-Rob

Thanks to Jörgen Peerik for raising the single-QVD export idea during a QVC class. 

Touchless Formatting

Summary: I show a scripting technique to assign display formats to loaded data without touching existing load statements. 

I coded in SAS for many years and always appreciated the FORMAT statement, which allows assigning a display format to a field independent of loading the field.

FORMAT OrderDate MMDDYY10.;  /* displays as MM/DD/YYYY */

In QlikView and Qlik Sense script there is an equivalent that is useful to be aware of.  It’s not a statement, but a little-known trick (so little known I’ve never seen anyone but me do it, although I’m sure others have thought of it).

// Load some dummy fields just to assign formats
TempFormatTable:
LOAD
 Date(0, 'MM/DD/YYYY') as OrderDate,
 Date(0, 'MM/DD/YYYY') as ShipDate,
 Num(0, '00000') as PostalCode,
 Num(0, '#,##0.00') as OrderTotal
AutoGenerate 1;

Facts: // Load the QVD
LOAD * FROM data1.qvd (qvd);

DROP TABLE TempFormatTable;  // Drop temp table

The formats assigned in the TempFormatTable will be inherited by any like-named fields in the QVD load.  I sometimes find this easier than adding formatting functions to the QVD LOAD statement because:

  • It maintains the optimized QVD load (see the contrast sketch after this list).
  • I can include a master list in the TempFormatTable. There is no error if a field doesn’t exist in the QVD.
  • It’s syntactically simpler.
  • I don’t touch the existing Load statement.
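
For contrast, here is a rough sketch of the alternative this avoids, using the field names from the TempFormatTable example above. Formatting inside the load itself means listing every field and giving up the optimized QVD read.

// The alternative avoided: formatting functions applied inside the load.
// Per-row functions prevent an optimized QVD load, and every field must be listed.
Facts:
LOAD
 Date(OrderDate, 'MM/DD/YYYY') as OrderDate,
 Date(ShipDate, 'MM/DD/YYYY') as ShipDate,
 Num(PostalCode, '00000') as PostalCode,
 Num(OrderTotal, '#,##0.00') as OrderTotal
FROM data1.qvd (qvd);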

I don’t always format this way, but there are a number of scenarios where the technique is useful. A common application is to change formats from one locale to another. For example, loading a QVD created in Europe (with European formats) and assigning US Date and Number formats.
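
As a sketch of that locale scenario (the field names and QVD name here are made up for illustration):

// Assign US display formats to fields arriving from a European-built QVD.
// The underlying values are unchanged; only the format tags differ.
TempFormatTable:
LOAD
 Date(0, 'MM/DD/YYYY') as InvoiceDate,
 Num(0, '#,##0.00') as InvoiceAmount
AutoGenerate 1;

Invoices:
LOAD * FROM EuropeInvoices.qvd (qvd);  // still an optimized load

DROP TABLE TempFormatTable;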

The technique works for any input source: SQL, QVD, xls, etc. It works in both QlikView and Qlik Sense.

You may not ever need this tip, but if you do, I hope it saves you some time and makes your coding easier.

-Rob

Want more Tips & Tricks? Join me at an upcoming Masters Summit for Qlik event in Johannesburg (6-8 Sept) or Austin (10-12 Oct).  In addition to our two days of core sessions, Bill Lay’s “Tips & Tricks” on Day 3 always teaches me something new.  

QV12 Timestamp Parsing

Have you noticed something new in QlikView 12 and Qlik Sense timestamp parsing? UTC timestamps are automatically understood.

(Note: the output displayed below uses the US date format set in the script as: SET TimestampFormat='M/D/YYYY h:mm:ss[.fff] TT';)

For example, the expression:

=Timestamp('20160504T142523.487-0500')

returns:

5/4/2016 7:25:23 PM

That is, the UTC offset of “-0500” is detected and the returned value is the UTC time, not the local time of 2:25:23 PM.

I can’t find anything in the help about this beyond an example for Timestamp#() that demonstrates the behavior but provides no detail.

This parsing functionality is particularly useful now that the QlikView Server logfiles use the UTC format for times.

I’m not sure yet if I like the automatic conversion to UTC time.  For example, apps like the QlikView Governance Dashboard now report Session Start or Event times in UTC time, not local time.

It’s nice that the “T” character is understood. If you want local time, it’s easy enough to drop the offset (“-0500”) as

=Timestamp(left('20160504T142523.487-0500', 19))

which returns

5/4/2016 2:25:23 PM

-Rob

Scoping Selections with Aggr()

Summary: Selections can be made in Calculated Dimensions, although the result may not always be what is expected or desired.   The Aggr() function can be used to control what field(s) get selected.

The technique discussed in this post applies to both QlikView and Qlik Sense.  The screenshots shown are from QlikView.  Some of the visuals are a bit different in Qlik Sense, but the idea and expressions demonstrated are the same.

A downloadable example to accompany this post is available here.

Consider a listbox created with an <Expression> value of:

=Customer & ' -- ' & Country

A listbox constructed this way is useful for providing additional context or an  additional search.

Selections made in that listbox will make underlying selections in both Customer and Country.

The user is probably  expecting a selection in Customer only.  To limit the selection to Customer, add an Aggr() function to the expression:

=aggr(
 Customer & ' -- ' & Country
 ,Customer)

Only the Customer field is listed in the Aggr() dimension, so selections will be made only in Customer.

A side effect of adding Aggr() is that gray (unassociated) rows no longer display.  We can fix that with a bit of Set Analysis.

=aggr(
 only({1<Customer={"*"}>} Customer & ' -- ' & Country)
 ,Customer)

Now the listbox looks and behaves as expected.

 

Another place you may need Aggr() to control selection intent is chart Calculated Dimensions.

Hovering over a Salesrep value in the chart below gives a contextual popup that identifies Manager and Hire Date associated with the Rep.

The column was created as a Calculated Dimension:

=SalesPerson
& chr(10) & 'Reports to ' & [Sales Manager]
& chr(10) & 'Hire Date ' & date(HireDate,'YYYY-MMM-DD')

Clicking Michelle in the chart correctly selects her name in SalesPerson, but it also makes unexpected selections in HireDate and Sales Manager.

I’m going to say that the dimension is “improperly scoped” and correct it by adding Aggr() to the Calculated Dimension.

=Aggr(
 SalesPerson
 & chr(10) & 'Reports to ' & [Sales Manager]
 & chr(10) & 'Hire Date ' & date(HireDate,'YYYY-MMM-DD')
 ,SalesPerson)

Selections will now be correctly limited to the “SalesPerson” field.

 

We’ve seen that Aggr() can narrow selections. We can widen selections as well.  This listbox will make selections in Customer, Country, SalesPerson and Year, even though only Customer is displayed in the listbox.

=aggr(
 only({1}
 Customer
 )
 ,Customer, Country, SalesPerson, Year)

 

We don’t have to include the display field in the selections.  In what I’ll call a “backdoor associative search”, this expression displays Customer but selects only the OrderID values associated with the Customer.

=aggr(
only({1}Customer )
,OrderID)

It’s usually a best practice to pre-create Calculated Dimensions in the script, when possible, for performance reasons. Returning to our first example, we might create a new field in the script as:

Customer & ' -- ' & Country AS CustomerAndCountry
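
In context, that might look something like the following sketch (the Customers table and QVD name are hypothetical):

// Derive the combined display field at load time
Customers:
LOAD
 Customer,
 Country,
 Customer & ' -- ' & Country AS CustomerAndCountry
FROM Customers.qvd (qvd);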

We can use the new field as a display value, but we want selections to be made in Customer.

=aggr(
 only({1}  CustomerAndCountry)
 ,Customer)

 

As a last example, we can create “bookmark”-like alternatives: either new fields linked into the data model or an advanced search at run time.

Here I’ve linked a hidden field named “Bookmark” into specific OrderIDs in the script.  I want selections to be reflected in the OrderID field.

=aggr(only({1}Bookmark), Bookmark,OrderID)
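
The script side of that might look something like the sketch below; the OrderID values and bookmark names are invented for illustration, and hiding the field is a separate step (for example, via a HidePrefix naming convention).

// Hypothetical: associate a Bookmark label with specific OrderIDs.
// The table links to the data model through the OrderID field.
Bookmarks:
LOAD * INLINE [
OrderID, Bookmark
10248, Big Deals 2016
10265, Big Deals 2016
10301, Follow Up
];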

Here is an advanced search that presents a listbox of Customers who have placed at least one order with a value >50K.

=aggr(
only({1<OrderID={"=sum({1}OrderAmount)>50000"}>}Customer )
,Customer)

Aggr() can be a “heavy resource consumer” and has the potential to slow down your application. Use it only when required, and in large applications either avoid it or benchmark its impact.  Calculated Dimensions can also be a source of slow performance; precalculate fields in the script when possible.

Download the example qvw for this post.

-Rob
