Omg, I can read Json in Qlik Sense

My latest article about inline tables created a round of wows about the fact that Qlik script natively reads JSON (JavaScript Object Notation) - not Jason :-). Recently, Qlik added script support for .parquet, but the common .json format is still not officially supported ... but ... you can read it. It is a bit tricky and slower for large files, and an import wizard that would create the script blocks for you is still missing.

Note: Qlik may still change the described behaviour of JSON import at any time.

The hassle starts when you click on Add data in the connection pane. The file dialog for the folder has no specific filter for .json files:

.json files are not shown under "All table files"

Choose "All files" and then select your .json file. Next you will see, that the wizard attempts to interpret it as csv. Pointless, but at least you get the FROM construct properly filled. Forget the rest, as we will replace it in a minute.

JSON is not listed as a file format yet. Just hit "Insert script".

Just hit "Insert script" and then put LOAD * in front of FROM, and instead of (txt ...) put (json):

data:
LOAD * 
FROM [lib://:Google_Drive/119UHVSR9_JjkT9o-mz2NAO1xyS1xU56f/01.json] 
(json);        

Not too complicated so far. If your .json file has only one root object (or one root array of objects), you already get the keys from the top level and you are done.

Now let's learn more about the parsing!

Simple JSON Object

A single JSON file containing only one object ...

{"key1": 123, "key2": "abc", "key3": true, "arr": [], "obj": {} }         

... will result in one line of data.

simple keys are loaded 1:1

  • Keys with simple values are loaded 1:1 into a Qlik Sense field
  • Arrays and sub-objects get a 16-character hash value instead (I will explain below); see the sketch after this list
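
A minimal load sketch (assuming the object above is saved as simple.json in a folder connection named DataFiles - both hypothetical names):

simple:
LOAD *
FROM [lib://DataFiles/simple.json] (json);

// expected: one row with the fields key1, key2, key3, arr and obj;
// key1, key2 and key3 hold the plain values, arr and obj hold 16-character hash values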

Multi-element JSON File

If the JSON file has multiple "rows" (objects in an outer array [ ... ]), you automatically get multiple rows in Qlik.

[ 
  {"key1": 123, "key2": "abc", "key3": true}
, {"key1": 234, "key2": "def", "arr": [5,6,7]} 
]        

The objects can have different keys. Common keys will concatenate into the same field, whereas keys missing in a given "row" will read "null" (unfortunately as text, not as Null()) in the result.

Note that "arr" gets a hash-value back because it is not a basic value

Now let's look at how to deal with sub-objects and arrays. For this, we need to understand the "table is" construct and the hash values.

  • The root element has the pseudo-table name "Root"
  • Every element in an array has an offset, starting at 0, even the outermost array. Hence, you get the first "row" above with "Root/0"

For the example I am loading the JSON within INLINE, but it can come from a file as well

  • An object or an array will come back as a hash value
  • To get the object or array contents itself, alter the "table is" construct to address this key, e.g. "Root/1/arr" in the above example
  • If it is an array, you need to use the pseudo key NoName to get the values, and the key itself to get the hash value that links it to its parent (see the sketch after this list)
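
As a sketch (again assuming the two-row example above is saved as multi.json in a hypothetical DataFiles connection), this is how the contents of the "arr" array of the second "row" could be addressed:

arr:
LOAD
    arr    AS %arr,      // the hash value, linking back to the parent row
    NoName AS arrValue   // the actual array elements 5, 6 and 7
FROM [lib://DataFiles/multi.json] (json, table is 'Root/1/arr');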

The good thing is, the whole import logic can simply be done by a recursive sub procedure: JSON in, data-model table(s) out. Await my next blog entry!

Importing JSON sub-arrays

That is fine for one row, but if you have a JSON file with many rows, you don't want to address every line individually with "Root/0/arr", "Root/1/arr", "Root/2/arr" ...

Assuming a .json file with this content

[ 
    {"name": "Alex"  , "age": 50, "kidsAge": [25,23,7,4] }
  , {"name": "Dhruv" , "age": 27, "kidsAge": [1] }
  , {"name": "Eyal"  , "age": 35, "kidsAge": [] }
  , {"name": "Chris" , "age": 49, "kidsAge": [17,19] }
]        

We will introduce a FOR loop now. Note that if an array is empty [] or is null, the LOAD command inside the loop would fail on that "row", so we set ErrorMode to 0 (= continue script upon error) and reset it to the default of 1 (= fail on error) after the NEXT command.

SET vFile = [lib://.../filename.json];

root: 
LOAD 
    name, age, kidsAge AS %kidsAge
FROM [$(vFile)] (json, table is 'Root');

SET ErrorMode = 0;
FOR v = 0 TO NoOfRows('root') - 1
    kidsAge:
    LOAD DISTINCT 
        kidsAge AS %kidsAge, 
        NoName AS kidsAge
    FROM [$(vFile)] (json, table is 'Root/$(v)/kidsAge');
NEXT v
SET ErrorMode = 1;        

The result is a nicely resolved 1:N relation.


The fieldname "kidsAge" also exists on the inner LOAD (next to "NoName") and it returns - yes - the same hashvalue as in the outer, first LOAD.

Importing JSON sub-objects

The last example is for a sub-object (instead of a sub-array). Similarly, you get hashes on the parent level and the content itself when you address it with the right "table is" position. A FOR loop will ease things again, plus some more logic.

In this .json file we have one row where "kids" is null, and the sub-object does not always contain the key "happy":

[ 
    {"name": "Alex"  , "age": 50, "kids": {"count": 4, "happy": true } }
  , {"name": "Dhruv" , "age": 27, "kids": {"count": 1, "happy": true } }
  , {"name": "Eyal"  , "age": 35, "kids": null}
  , {"name": "Chris" , "age": 49, "kids": {"count": 2 } }
]        

We can't count on auto-concatenate any more for the sub-objects (row 4 would end up in a different table than rows 1 and 2). Also, without error tolerance, the import of the third row would fail. So we go for this script, where we create an empty table for the "kids" and then explicitly concatenate into it. We have to use the LOAD * functionality, because the structure of the sub-objects is not identical throughout the main array. (Assuming you set vFile yourself.)

SET vFile = [lib://..../filename.json];

t_Root: 
LOAD   // outer load
    *
FROM [$(vFile)] (json, table is 'Root');

SET ErrorMode = 0;
t_Kids: LOAD * INLINE [kids]; // start empty table

FOR v = 0 TO NoOfRows('t_Root') - 1
    CONCATENATE(t_Kids)
    LOAD DISTINCT   // inner load
        *
    FROM [$(vFile)] (json, table is 'Root/$(v)/kids');
NEXT v
SET ErrorMode = 1;        

We now get two tables that are 1:1 related (or rather 1 : 0..1, to be precise, since one row has no kids record).


Note: the inner LOAD * also returns "kids" as a field, containing the hash that links back to the first table.
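
A sketch of the expected outcome with the sample data above (hash values not shown):

t_Root:  name, age, kids (hash)
t_Kids:  kids (hash), count, happy

Alex  -> count 4, happy true
Dhruv -> count 1, happy true
Eyal  -> no t_Kids row ("kids" is null, the inner LOAD is skipped thanks to ErrorMode 0)
Chris -> count 2, happy stays empty (the key is missing in that sub-object)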

The result can now easily be used in the client.

Uff, it can get tricky, but next week I will show you that all the manual steps (loop, field aliasing, concatenating) can be solved by a recursive SUB.

I hope, though, that Qlik will also finish a wizard that does this parsing on import, like we already have for JSON objects received via the REST connector or, likewise, for XML files.

Conclusion

Pros

  • A surprise and already a good starting point for importing .json natively
  • Everything can be imported using the "table is" directive

Cons

  • Quite some script logic needed to parse nested JSON objects
  • Can be slow
  • Still undocumented and may change in the future

Comments

Bolla Vinod

DATA WAREHOUSE & BUSINESS INTELLIGENCE (Qlik|Power BI|SSIS|GCP|BFSI Domain)

9 months ago

Nice to know
Alwyn Kotze

Son of Hennie and Dorette Kotze

10 months ago

This only returns the first line of the .json file and not all of it, unfortunately.
Andrew Yakibchuk

Driving Cloud & Infrastructure Excellence | Digital Transformation Consultant | COO at Crunch

1 year ago

Konrad Mattheis: That's great news! How does it work?
Youri Torchalski

Business Intelligence | QlikSense | QlikView | Qlik Nprinting | Transforming functional demand into technical solutions

1 year ago

Nice to know. Still a lot of scripting required, with potentially incorrect output if parsing is not done right. An alternative when not using the REST connector can also be to download the JSON files, either through an applicable web connector or through another kind of data feed, and use the General Web Connector to convert the JSON to XML. After that you can read the XML. It also requires a few steps, but I think it increases the chance of more reliable output.
