All posts by Rob Wunderlich

Help! My Qlik Cloud Reload Exceeds Quota!

If you are a Qlik Cloud (SaaS) customer you may have seen this dreaded message during script reload. Ack! Ack!

Your quota may be different from 25MB, but the problem remains the same. How do I modify this application to fit within my subscribed quota?

QSDA Pro V3.1 to the rescue! QSDA analyzes your app and provides recommendations and easy-to-use tooling to drop unused data and optimize the remaining data, potentially greatly reducing the size of your Qlik App.

QSDA needs a successfully loaded app for analysis, so our first step is to reload the app using the “Limited load” feature of the Qlik script debugger.

Click the Debug button in the Qlik script editor, check the Limited load option, accept the default of 100 rows and click “Run” to initiate the reload. The reload will come in under your quota. When the reload completes, key Ctrl-s to persist the data.

You don’t have a business-ready app, but this is enough for QSDA to scan all charts, expressions and master items to determine what data is actually required.

In QSDA Pro, we now select our Cloud connection and choose the app of interest, in this case “BikeShop Sales2”. Click the “Open” button to prepare the app for analysis. I will also uncheck the “Collect Calctime” option because chart calculation times are not meaningful with our limited data. The Description is optional; in this case I’ve entered “Limited Load”.

Click the “New Analysis” button to launch the QSDA Pro analysis. The analysis may take a minute or two to complete depending on the capacity of your tenant and the current Qlik Cloud load. When the analysis is complete press the “View” button to see the results.

In the Summary view we see there are 89 unused fields. Unused fields are fields that are not used in any charts, master items or expressions. These are fields that can be dropped at the end of the script without impacting the user experience in the app. The actual storage numbers (KiB) are not meaningful because we only loaded 100 rows. The bar chart is useful in that it indicates about 2/3 of our data is unused. Hope!

QSDA provides a Script Generator to generate a “DROP Fields …” script statement for the unused fields. Select the Script Generator from the Tools menu.

In the Script Generator > Drop Fields tab select the top checkbox to select all recommended fields. Press the “Copy” button to copy the generated Drop Fields statement to your clipboard.

Paste the Drop Fields statement at the end of your Qlik Script and reload.
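For reference, the generated statement is an ordinary Qlik Drop Fields statement. A minimal sketch with made-up field names (your generated list will reflect your own data model):

// Field names below are hypothetical -- QSDA generates the actual list from your app
DROP FIELDS [Order Comments], [Warehouse Code], [Legacy SKU], [Ship Via Description];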

Reload successful! Victory!

A subsequent QSDA analysis shows the app size has been reduced from the problematic 37MB to 15MB!

QSDA Pro can quickly and efficiently get your Qlik Apps below your Qlik Cloud quota by pruning unnecessary data. Don’t pay for more than you need.

In a follow-up post I’ll walk through some additional QSDA features to help you reduce your Qlik app footprint even further and improve chart response time to boot.

Download QSDA Pro and try it for yourself. Or reach out to us to learn more about license options or arrange a demo or POC for your team.

-Rob

Are you going to Qlik Connect? I’ll be at the Motio booth on the show floor ready to demo QSDA Pro or answer any questions you may have. Or just to say Hej 🙂


Formatting Intervals

In this post I’m going to show some tips for formatting intervals / durations in Qlik.

Datetimes in Qlik utilize a serial number that equals the number of days since December 30, 1899. The integer value represents the day and the fractional value represents the time of the day. The difference between two datetimes is an interval or duration.
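For example, with hypothetical timestamp fields StartTime and EndTime, a duration is just a subtraction:

// StartTime and EndTime are assumed timestamp fields; the result is a numeric interval measured in days
EndTime - StartTime as Duration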

Qlik provides the Interval function to format numeric interval values as dual strings for display. Interval() may be used both in charts and script. The Interval function accepts an optional format string that specifies the output format.

An example interval numeric value could be 3.3844097222245, three days and a few hours. The expression

Interval(3.3844097222245, 'hh:mm:ss')

will yield “81:13:33”. Note the 81 hours representing 3 days and 9 hours. We can exceed 24 hours because interval accumulates overflow in the largest unit. For example, we could show the duration as minutes using

Interval(3.3844097222245, 'm') 

yields “4873”

We can add the format character “D” (or “d”) to indicate days.

Interval(3.3844097222245, 'D hh:mm:ss')

will yield “3 09:13:33”.

Is it clear what this number string represents? The “:” may allow a user to intuit the hh:mm:ss. But what about the “3”? Can we make this clearer by adding some additional text to the output? (From now on I’ll use a field named “Duration” in my examples). Will this work?

Interval(Duration, 'D days hh:mm:ss')

No. We will get some funny looking output because the “d” and “s” in “days” will be interpreted as format characters.

The format pattern does not have a syntax to declare “this is a literal, not a format character”. We can work around this limitation by using a non-format character as a proxy and patching it up with a Replace function. I’ll use “a” to represent the word “days”.

Replace(Interval(Duration, 'd a, hh:mm:ss'), 'a', 'days')

You may have noticed that the values are now left justified. This is because Replace() returns a string, whereas Interval() returns a Dual value. We probably want to retain the Dual nature to allow for proper sorting. I’ll fix it by adding a Dual() function.

Dual(Replace(Interval(Duration, 'd a, hh:mm:ss'), 'a', 'days'), Duration)

How about if I want to show only days and hours?

The format string “d h” will show days and hours. Minutes and seconds will not display. However…the internal numeric value will remain unchanged. Generally not a problem, but may cause some duplicate lines if the value is used as a Dimension or filter. Because of this, I like to round the value to match my display.

Interval(Round(Duration, MakeTime(1)), 'd hh')

MakeTime(1) is the value of one hour. If you want to round to 4 hour segments use MakeTime(4). Instead of Round() you can also use Floor() or Ceil(). For example, to make 15 minute buckets use Floor(Duration, MakeTime(0,15))

Let’s put Round and Replace together to display clearly formatted days and hours. I’ll use “b” as a proxy for “hours”.

Dual(
  Replace(
  Replace(
   Interval(Round(Duration,MakeTime(1)), 'd a, h b')
  ,'a', 'days')
  ,'b', 'hours')
, Duration)

Almost there. I don’t like the “0 days”. How about we don’t display days if zero? Let’s make “days” conditional on Duration being >= 1.

Dual(
  Replace(
  Replace(
    Interval(Round(Duration,MakeTime(1)),
      if(Duration >= 1, 'd a, ', '') & 'h b')
  ,  'a', 'days')
  ,  'b', 'hours')
, Duration)

I like it. It has become a bit wordy. I may want to use it elsewhere in this app (or others), perhaps for a different field. I’ll make it reusable by assigning the expression to a Variable with Parameter. Add this statement to the load script.

Set FormatDuration = Dual(
  Replace(
  Replace(
    Interval(Round($1,MakeTime(1)), if($1 >= 1, 'd a, ', '') & 'h b')
  , 'a', 'days')
  , 'b', 'hours')
, $1);

“$1” is a placeholder for the fieldname. The chart expression is now simply:

$(FormatDuration(Duration))
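And it’s reusable. Assuming, for example, another duration field named RepairTime in the model, the same variable formats it too:

// RepairTime is a hypothetical duration field, used here only to show reuse
$(FormatDuration(RepairTime))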

Looking for more tips & tricks? Have some of your own to share? Join us in September at the Masters Summit for Qlik in Vienna. In addition to our focused hands-on workshops on migrating, managing and developing in Qlik Cloud, we’ll be sharing lots of tips and techniques gleaned from years of Qlik experience.

-Rob


QSDA Release 2.6 Brings Customized Flag Levels

QSDA Pro 2.6 is available now and a new feature is Flag Levels. Each flag type is assigned a level of Info, Warning or Error with a matching color. Color helps you focus on what’s important.

Flag Levels may be customized and refined for your requirements using the new Admin > Project Config page.

QSDA 2.6 also adds three new flags:

  • “Calculated Dimension”. The default level for this flag is Warning.
  • “No Application Thumbnail”. The default level for this flag is Warning.
  • “Missing Theme”. The default level for this flag is Error. A missing theme can occur when moving an App from one Qlik environment to another.

Want to learn more about QSDA Pro? Curious, or looking for a refresher on the capabilities and value QSDA Pro brings to your Qlik development team? Join me on 24 January 2024 for “Accelerate Your Qlik Development Through Automation”, where I’ll walk you through the features of QSDA Pro and show you how QSDA can help you produce faster, leaner and cleaner Qlik Apps.

-Rob


Charts from the 1900 Paris Exhibition

Summary: For the data viz nerds among us, I share my fascination with a groundbreaking data viz project at the 1900 Paris Exhibition.

In Tim Harford’s excellent book “The Data Detective: Ten Easy Rules to Make Sense of Statistics” the author mentions a set of infographics produced by W.E.B. Du Bois and assistants for the 1900 Paris Exhibition. The “Exhibition of American Negroes” is a powerful collection of photographs and data visualization that provide a strong counter to racist propaganda and ignorance of the time.

I encourage you to explore the entire contents of the exhibit, but in this post I’m going to highlight some of the data visualizations. You can browse the collection here; the charts are on the first two 100-entry grids.

Du Bois and his Atlanta University students made 63 hand-drawn diagrams. The data visualizations were created on presentation cardboard and conceived for a temporary installation at the fair. The original diagrams still exist and have been displayed in various museums.

The hand-drawn nature of the charts makes them feel very personal, and they tell their stories very well. I expect that the team was pushing the boundaries of data viz knowledge and technique of the time.

This multivariate diagram is one of my favorites.

Link to larger size at Library of Congress

There are charts covering many aspects of African American life — professions, geography, income, property — and importantly showing change over time. This is a chart comparing marital status of African Americans to Germans. Why Germans? I’ll let you research that.

Link to larger image at Library of Congress

Measured against our current standards for data viz, we could make some criticisms about choices of shape and scale, but I don’t see any dishonesty. One of the design choices I found amusing was what to do when the bar size exceeded the available space. Wrap it!

Link to larger image at Library of Congress

I hope you find this bit of data viz history as fascinating as I do. I’ve provided some additional reading links below. Again, you can browse the collection at the Library of Congress. I’m curious to hear which visualizations you find most interesting and why.

Additional Reading

An excellent overview of the sociological context and impact of the exhibition: “W.E.B. Du Bois in Paris: The Exhibition That Shattered Myths About Black America” by Jacqueline Francis and Stephen G. Hall

WEbDuBois.org: The 1900 Paris Exhibition index, a list of sources and materials related to the exhibition.


Masters Summit for Qlik Cloud

Masters Summit New Qlik Cloud Track!

The Masters Summit Team is developing all new courses for 2024. The new content is focused on:

  • Qlik Cloud (perfect for clients who are on or moving to cloud already, or evaluating a migration).
  • New capabilities in the platform, like AutoML, Automation, etc.

Before we get too far down the road, we want your input to gauge interest.

We are opening pre-registration for our 2024 conferences:

Vienna, Austria – October 02-04 Pre-register for Vienna now

Philadelphia, USA – November 04-06 Pre-register for Philadelphia now

If you or one of your colleagues would be interested in attending one of these Summits, and you believe you can get approval for the expense, we kindly ask you to pre-register now.

We will not be taking any payment at this time and you should not make any travel arrangements.

Why pre-register?

If you pre-register on or before January 15, 2024, you can attend the event for a specially-discounted price of $1800 (that’s a 25% discount off our usual early-bird price).

This is your opportunity to help shape the future of Masters Summit events and attend at a deeply discounted price. If we determine enough of you are interested in returning, based on pre-registration, we will move forward with our 2024 plans.

See the new Sessions below.

Managing your tenant

  • Setting up development and promotion flows
  • Approaches for managing data.
  • Monitoring your tenant

Migrating to Qlik Cloud

  • Planning your migration
  • Common pitfalls
  • Available tools, and how to use them.

Performance Tuning and Capacity Management

  • Best practices for performance and data scalability with the Qlik capacity licensing in mind.
  • Review opportunities in data modelling and visualizations to improve performance.
  • Review available tools for monitoring performance.

Real-world AutoML Applications

  • Understand machine learning predictive models and what business problems they can tackle.
  • Profiling data for ML, basic statistics, feature engineering, avoiding common pitfalls.
  • Evaluating and explaining your models using AutoML results
  • Deploying and monitoring your model

Enterprise Reporting

  • Building reports from existing sheets or objects
  • Output presentation quality reports in various formats including annotations.
  • Automating the distribution of reports to a variety of delivery channels.
  • Review existing report connector blocks.
  • Discover the self-serve options for subscription and alert-based reporting.

Qlik Automations

  • Automating Analytics Operations and Automation
  • Dynamic Actions and Business Process Integration
  • Going beyond: Qlik-CLI and the Qlik APIs

Qlik Cloud Integration

  • Understand what APIs are available, the necessary skill sets required to develop with them and how to correctly structure your project architecture.
  • See the art of the possible with a mix of real-world solutions and innovative demos.
  • Discover useful tools to help speed up development and reduce costs.
  • Get tips and tricks for the dark art of authorization and authentication in both client and server implementations.

We hope to see you in 2024.

Sincerely, The Masters Summit Team.


Using Attributes in QVD Files

Here’s a fun Friday post.

As I was updating my incremental load materials for the upcoming October/November Masters Summit for Qlik, I ran across an old note about creating “Attributes” in QVDs using Tags. Due to a bug in QVD generation I couldn’t implement the idea at the time. But the bug is long since fixed so here I present the general idea of “QVD Attributes” (my name, not an official Qlik thing).

We can assign Tags to Qlik Tables using the script TAG statement:

Tag TABLE TranData with 'MaxId=$(vMaxId)';
Tag TABLE TranData with 'Source=Salesforce';

If the table “TranData” is subsequently written to a QVD using the STORE statement, the tag will be included in the QVD file.

We can read tags directly from a QVD with script:

LOAD String%Table as Tag
FROM [lib://mylib/TranData.qvd]
(XmlSimple, table is [QvdTableHeader/TableTags/String]);

Note that we are reading only the XML header from the beginning of the file. This is very fast.

If we use a structured tag format like “key=value” we can create a generalized “Attribute” store in a QVD that can tell us something about the contents of that QVD. That could be useful when we want to use the QVD in a subsequent script.

How do we retrieve these tags in a useful and consistent way? I think loading them as variables would be a good approach. We can create a reusable subroutine to load all “key=value” tags from a QVD and generate variables named “Tag:key”. We then call the Sub, using the path to any QVD:

Call LoadTags('lib://QVDFiles/TranData.qvd');

We might utilize the new variables in something like an incremental Load.

SQL SELECT * From Trans Where Id > $(Tag:MaxId);

Or as a control to conditionally execute a script block.

If '$(Tag:Source)' = 'Salesforce' Then
// Use the Salesforce Load statement
EndIf

What is all this stuff? Subroutines? Preceding Load? Incremental Load? Tags? Join me at the Masters Summit for Qlik in Dublin or Orlando where we will teach you all about these topics and make you a scripting Master.

At the bottom of this post is a complete sample including the subroutine. You only need to change “QVDFiles” to your own lib to run this on your system.

-Rob

// Subroutine to Load Tagname=Value pairs from a QVD as "Tag:Tagname" Variables.
Sub LoadTags(_path)
    _TempTags:
    Load
        SubField(Tag, '=', 1) as TagName,
        SubField(Tag, '=', 2) as TagValue 
    ;    
    LOAD String%Table as Tag
    FROM [$(_path)] 
    (XmlSimple, table is [QvdTableHeader/TableTags/String])
    Where String%Table like '*=*';
    
    For _i = 0 to NoOfRows('_TempTags') - 1
        Let _vname = 'Tag:' & Peek('TagName', $(_i), '_TempTags');
        Let [$(_vname)] = Peek('TagValue', $(_i), '_TempTags');
    Next _i
    Set _vname=;
    Set _i=;
    Drop Table _TempTags;
End Sub
// END of Subroutine


// Generate some sample data
TranData:
LOAD  
	RecNo()*2 as Id,
	Rand() as Amount,
    'A' as Customer
AutoGenerate 6;

// Something we may want to record from the data is the max Id value, for later incremental load. 
TempId:
LOAD Max(Id) as MaxId Resident TranData;
Let vMaxId = Peek('MaxId');
Drop Table TempId;

// We can tag with $(variable) values or literal values. 
Tag TABLE TranData with 'MaxId=$(vMaxId)';
Tag TABLE TranData with 'Source=Salesforce';
// Some other tag that is not part of our name=value scheme.
Tag TABLE TranData with "ABC";
// STORE and DROP the QVD
Store TranData Into [lib://QVDFiles/TranData.qvd] (qvd);
Drop Table TranData;


// Call our subroutine to load tags as variables, from the disk QVD.
Call LoadTags('lib://QVDFiles/TranData.qvd');

// Demonstrate how we might use these tag values.
// In a SQL Where clause.
SET select = SQL SELECT * From Trans Where Id > $(Tag:MaxId);
Trace $(select);

// In a conditional to select a block of script.
If '$(Tag:Source)' = 'Salesforce' Then
	Trace Going to use the Salesforce load statement;
EndIf

Guest Speaker for Dublin Masters Summit

I’m pleased to announce that Brian Booden will join us at the October Masters Summit for Qlik in Dublin, Ireland. Brian is a seven (!) time Qlik Luminary / Partner Ambassador, co-host of The Data Mix, and a well-known Qlik and Analytics expert.

Brian will share some inside tips on using Qlik Application Automation and I expect a few interesting tricks. I’m hoping he shows off the capabilities of his WhatsApp for Qlik integration!

Masters Summit for Qlik is the premier Qlik Sense & QlikView advanced education event. In three packed days of lectures and hands on exercises, you’ll learn from experts and take your Qlik Dev skills to the next level in data modeling, scripting, expressions and visualization. We’ll also cover performance tuning and Qlik API Integration for beginners. You’ll come away with a wealth of new knowledge, code samples and documentation.

Find out what over 1500 of your colleagues have learned already.


Creating Time Groups in Your Data Model

I frequently create Date groups in my Qlik apps to facilitate selections like “Yesterday” or “Last Week”. Some years ago I published a pattern for creating Date groups in the Recipes section of QlikCookbook.com and it’s been a very popular download. I’ve responded to a few requests on Qlik Community looking to do the same thing for Time groups, so I decided to publish a recipe for Time Grouping as well.

The recipe demonstrates selecting ranges such as “Day Shift”.

Selecting by Time Group

How do we create ranges that map to our data? In the script, create a Ranges table that defines the range names and their start and end times.

Ranges:
LOAD
Range as Range,
Time#(RangeStart, 'hh:mm:ss') as RangeStart,
Time#(RangeEnd, 'hh:mm:ss') as RangeEnd
INLINE [
Range, RangeStart, RangeEnd
Midnight Shift, 00:00:00, 05:00:00
Early Morning, 05:00:01, 09:00:00
Daylight, 06:00:00, 17:59:59
Day Shift, 09:00:01, 17:00:00
Early Evening, 17:00:01, 20:00:00
Evening, 20:00:01, 23:59:59
12am-6am, 00:00:00, 05:59:59
6am-12pm, 06:00:00, 11:59:59
12pm-6pm, 12:00:00, 17:59:59
6pm-12am, 18:00:00, 23:59:59
];

Then use IntervalMatch to link the time field in our data (EventTime in this example) with the Range field.

JOIN (Ranges) IntervalMatch (EventTime) LOAD RangeStart, RangeEnd RESIDENT Ranges;
// No longer need RangeStart, RangeEnd, but may keep them for documentation or debugging.
DROP FIELDS RangeStart, RangeEnd;

There are additional notes provided in the downloadable example, but that’s it for the basic pattern. You can download the example here: Time Grouping.

You may have noticed I used Time#() to read the text time values. That makes it easy to write the range values as readable hh:mm:ss.

If you are generating the ranges using another technique like Interval#() or division, be mindful of rounding. A Qlik time value is represented numerically by the fraction of 1, a whole day. 0.5 is 12:00:00 PM, halfway through the day.

There are four methods I’m aware of to generate a Qlik time. For example, to generate the numeric time value corresponding to 12:00:05 AM:

Time#('12:00:05', 'hh:mm:ss')
Interval#('12:00:05', 'hh:mm:ss')
MakeTime(0,0,5)
5/86400

All four will generate a floating point number to represent the time. Will they produce exactly the same result? Results that would pass an equality test?

Time#() = Interval#() = MakeTime() = x/y?

For some inputs they are equivalent. For a significant number of inputs the odd man out is Interval#(). Given the 86,400 second values in a day, Interval#() will not match the others 36% of the time. Try sharing that interesting fact with your spouse at the breakfast table.
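If you want to verify this yourself, here is a minimal sketch of one way to test it (my own throwaway test, not part of the recipe download): generate every second of the day as hh:mm:ss text, parse it with both functions, and compare.

ParseTest:
LOAD
    TimeText,
    // 1 when both parsers return identical numeric values, 0 when they differ
    If(Time#(TimeText, 'hh:mm:ss') = Interval#(TimeText, 'hh:mm:ss'), 1, 0) as ParsersAgree
;
LOAD
    // All 86,400 seconds of a day rendered as hh:mm:ss text
    Text(Time((RecNo()-1)/86400, 'hh:mm:ss')) as TimeText
AutoGenerate 86400;

Avg(ParsersAgree) in a chart then shows the share of second values where the two functions agree.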

Join me at the Masters Summit for Qlik in Dublin or Orlando this fall to learn advanced scripting techniques for Qlik as well as inside tips. Our team of expert consultants will cover data modeling, set analysis, visualization techniques and Qlik integration as well as answer your “how do I” questions. Register now or get more information at the Masters Summit for Qlik website.

Happy Scripting
-Rob


QSDA Pro 2.5 Brings Extended Expression Validation

Summary: Set Modifier field names and Set Identifiers are not validated by the Qlik syntax check. QSDA Pro 2.5 validates everything and raises an “Unrecognized Name” flag for invalid names.

QSDA Pro syntax checks all expressions in a Qlik App using the Qlik syntax checker. You’re familiar with the Qlik syntax checker. It’s the message at the bottom of the expression editor that tells you your expression is “OK”.

The syntax checker is also good at telling you when the expression has an error, although it’s not always specific about the problem.

QSDA Pro, using the Qlik API version of syntax checker, tells you what is wrong with the expression:

The Qlik syntax checker has a significant limitation. It does not validate field names used in sets. Whether used in the API or the editor, the syntax checker will not raise an error if the field “EmpPurchase” does not exist in the data model.
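To illustrate the pattern (the measure field “Sales” below is just a placeholder, not from any specific app), the syntax checker will report an expression like this as OK even when EmpPurchase has been removed from the model:

// Set modifier intended to exclude employee purchases; Sales is a placeholder measure field
Sum({<EmpPurchase = {0}>} Sales)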

This lack of validation can be a serious problem if “EmpPurchase” is removed or renamed in the data model. The expression will still return a number but the exclusion of employee purchases will no longer be applied. That could be a very subtle but important fail.

Recognizing this limitation and its potential impact, QSDA Pro 2.5 validates all names used in sets and will raise a new Quality flag, “Unrecognized Name”, if the name is invalid.

Another place this validation is extremely useful is Chart Filters in Qlik SaaS. The chart will cheerily display that filter “Expression2 > 1” is being applied, even though field “Expression2” no longer exists in the data model.

But QSDA knows.

Ever use a bookmark as a set identifier? And then the bookmark got deleted or renamed?

I’ve used simple examples to demonstrate, but where this validation really shines is in longer expressions where it may be easier to overlook that something has gone wrong.

Yes, you need QSDA Pro. We all need QSDA Pro.

Learn more about QSDA Pro and download your own copy at https://easyqlik.com/qsda/

-Rob

What does QSDA stand for? Qlik Sense Document Analyzer. The combination of data model and sheets we know as an “App” in Qlik Sense was called a “Document” in QlikView. When I first created this tool in 2009 to help maintain QlikView, I called it “Document Analyzer”. When it came time to create a similar tool for Qlik Sense, I stuck with the “DA” name as the function and usefulness of “Document Analyzer” was well established in the Qlik community.


Motio Acquires QSDA Pro

I’m pleased to announce that I’ve joined forces with Motio, Inc., producers of the great DevOps tools Soterre and Gitoqlok. Motio has acquired my QSDA Pro product and I’m excited to combine our tools to provide a comprehensive platform for professional Qlik App development in Qlik Sense and Qlik SaaS. You can read the announcement here.

I will continue to lead the development of QSDA Pro, enhancing the product and bringing new superpowers to my customers in the integration with Soterre and Gitoqlok.

The QSDA Pro customer base is growing fast! To date hundreds of organizations have purchased QSDA Pro and are realizing the full benefits of the tool. More are joining every day and I’m excited to add Motio’s dedicated product support and admin teams to handle this growth.

You can continue to learn more and purchase QSDA Pro online.
