SailPoint IdentityNow - Custom data analysis and visualisation
Fernando de los Ríos Sánchez
Advisory Solutions Consultant @ SailPoint | Technical Sales Cycle Delivery
IdentityNow integration series
The intention of this series is not only to show you how to implement certain use cases that go beyond out-of-the-box functionality, but also to help you think outside the box. In this piece, I'll show you how to leverage the IdentityNow API as a data source so you can analyse and visualise your data beyond the capabilities of our search engine. We generally assume that these external solutions would increase the overall cost of a project significantly, but that's not necessarily true. I'll show you how I leveraged an online platform that is free to use for small projects like this one. You can even fork the project and adapt it to your needs. It doesn't take much effort to get up and running, and even less to maintain. In the corporate world, we could have leveraged whatever business intelligence tool was already available, and that's fine. As a consultant, the platform I used, ObservableHQ, can serve as a proof-of-concept tool at no cost.
Before I continue, I'd like to thank my colleagues Richard Guiot, Hind Chourak and Cristian Grau. We teamed up for the latest Hackathon and gave birth to this project. The intention was to uncover issues with certifications, access requests and permission management that could benefit from the introduction of our AI services: its recommendations feature greatly helps with certifications, and its role modeling feature helps for obvious reasons. On top of that, AI services include Access Insights, a historical analysis and data visualisation offering far more powerful than the integration described here (but it comes at a cost :)). If you're interested in our AI offering, check out this page.
Custom data analysis and visualisation
Let's cut to the chase. You can start using my public ObservableHQ notebook here. It's like an online Jupyter notebook; if that doesn't mean anything to you, think of it as a fancy online spreadsheet.
As with so many other integrations I've described, you'll need to have your Personal Access Token (PAT) ID and secret ready, plus your API URL. It needs to be an administrator's PAT for everything to work. Fear not: everything the notebook does is read data. Data is not persisted, your credentials are only cached in your browser (should you decide to save them), and you can check all the source code in the notebook itself. Notice the last few sections at the bottom, where most of the logic sits.
Once you fill in your details, you can click Connect to IdentityNow to start fetching data:
Notice you can store multiple instances to connect to, for your convenience. These are cached in your browser.
Feel free to explore the results, filter your campaigns and see whether you discover something you didn't know about your data. If you do, it means we didn't do too badly for a first attempt :). Now let's see how we did it, so you can adapt it to your needs.
Before we do, note that I can't go into detail about every ObservableHQ feature; you'll need to check the brilliant documentation they have. It takes some practice before everything starts to sink in.
Collecting input data
We're using a view, which is a visual representation of a value that, in turn, uses input widgets to collect the connection details.
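To give you an idea of what that view cell looks like, here's a minimal sketch using Observable Inputs. The field names and labels are illustrative, not necessarily the exact ones in the notebook:

```js
// Observable cell: a form view whose value is the connection configuration object.
viewof idnConfig = Inputs.form({
  apiUrl: Inputs.text({label: "API URL", placeholder: "https://{tenant}.api.identitynow.com"}),
  clientId: Inputs.text({label: "PAT client ID"}),
  clientSecret: Inputs.password({label: "PAT client secret"})
})
```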
This view will hold our connection details, but it's volatile, and we don't want to type all of this in every time we want to connect. To avoid that, we're using a fantastic module called localstorage. It's as simple as importing the module from an external notebook and linking it to the previous view (idnConfig), so it drives its contents.
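Conceptually, what that module does for us boils down to the browser's own localStorage API. A rough sketch, with an illustrative storage key, just to show the idea:

```js
// The notebook imports a community localstorage helper; this is the plain-browser equivalent.
const KEY = "idnConfig";                                       // illustrative storage key
const saved = JSON.parse(localStorage.getItem(KEY) ?? "{}");   // read saved details to pre-fill the form
localStorage.setItem(KEY, JSON.stringify(idnConfig));          // persist the details after the user edits them
```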
Connecting to IdentityNow
We're using another great module here: fetchButton. It fetches the access token we'll use to collect all the data we need. Think of it as our key to IdentityNow from now on.
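Under the hood this is a standard OAuth2 client-credentials exchange against the IdentityNow token endpoint. A minimal sketch of what the button triggers (the real notebook wraps this inside the fetchButton module):

```js
// Exchange the PAT client ID/secret for an OAuth2 access token (client-credentials grant).
async function getAccessToken({apiUrl, clientId, clientSecret}) {
  const url = `${apiUrl}/oauth/token`
    + `?grant_type=client_credentials`
    + `&client_id=${encodeURIComponent(clientId)}`
    + `&client_secret=${encodeURIComponent(clientSecret)}`;
  const response = await fetch(url, {method: "POST"});
  if (!response.ok) throw new Error(`Token request failed: ${response.status}`);
  const {access_token} = await response.json();
  return access_token;
}
```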
We will later use this key to create the different services that extend a common base class. For example, CampaignService extends IDNService, and we instantiate it using the context information described so far:
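The notebook's actual classes are a bit richer, but the shape is roughly this. The /v3/campaigns path is the documented endpoint for listing campaigns; everything else (method names, constructor arguments) is illustrative:

```js
// IDNService keeps the context (base URL + access token) and a generic GET helper;
// subclasses add the endpoints they care about.
class IDNService {
  constructor(apiUrl, accessToken) {
    this.apiUrl = apiUrl;
    this.accessToken = accessToken;
  }
  async get(path, params = {}) {
    const url = new URL(`${this.apiUrl}${path}`);
    Object.entries(params).forEach(([k, v]) => url.searchParams.set(k, v));
    const response = await fetch(url, {
      headers: {Authorization: `Bearer ${this.accessToken}`}
    });
    if (!response.ok) throw new Error(`GET ${path} failed: ${response.status}`);
    return response.json();
  }
}

class CampaignService extends IDNService {
  // List certification campaigns; the real notebook may add pagination and filters.
  getCampaigns() {
    return this.get("/v3/campaigns");
  }
}
```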
Using different services to collect data
We've built multiple services that encapsulate connection details, raw data collection, and data parsing/filtering. This keeps the notebook neater and gives you a reference for other services you may need:
Another thing to note: we've created object classes to properly represent the data we need, which can be conveniently built from the raw input coming from IdentityNow:
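For instance, a campaign could be represented by a small class like the one below. The field names follow the campaign object returned by the API; the real classes in the notebook may keep more or fewer fields:

```js
// Hypothetical data class: flattens the raw API payload into the shape the charts expect.
class Campaign {
  constructor(raw) {
    this.id = raw.id;
    this.name = raw.name;
    this.type = raw.type;
    this.status = raw.status;
    this.created = raw.created ? new Date(raw.created) : null;
    this.deadline = raw.deadline ? new Date(raw.deadline) : null;
  }
  // Convenience factory for the array returned by the campaigns endpoint.
  static fromList(rawCampaigns) {
    return rawCampaigns.map(raw => new Campaign(raw));
  }
}
```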
You can then collect them by calling the appropriate service method:
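Putting the pieces together in a notebook cell might look like this (cell and variable names are illustrative):

```js
// Observable cell: fetch raw campaigns and wrap them in our data class.
campaigns = {
  const service = new CampaignService(idnConfig.apiUrl, accessToken);
  return Campaign.fromList(await service.getCampaigns());
}
```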
Manipulating data
We've seen the process of collecting raw data from IdentityNow, processing it, and returning objects in the desired format. However, sometimes we want to filter these objects, aggregate them, or apply other transformations. For this purpose, there are multiple tools we can use:
There's a plethora of these D3 array methods, which you can check here. You can always use your own code to manipulate data; if you explore ours, you'll see we had to do so to process some data we couldn't handle otherwise.
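A few illustrative one-liners with the d3-array helpers, assuming d3 is required in the notebook and campaigns is the cell defined earlier:

```js
// Each line is its own notebook cell in practice.
byStatus = d3.group(campaigns, c => c.status)                    // Map: status → campaigns with that status
countByType = d3.rollup(campaigns, v => v.length, c => c.type)   // Map: type → number of campaigns
byCreation = d3.sort(campaigns, c => c.created)                  // campaigns sorted by creation date
```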
Visualising data
We've used basically three different tools here:
Fine-tuning charts is not simple, I must admit. It takes some time to get the hang of it, but that's the toll we must pay for this technology. There are dozens of examples all over the Internet, so don't worry too much if you don't get it right away.
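As a taste of what a chart cell can look like, here's a minimal Observable Plot sketch that counts campaigns per status as a horizontal bar chart. Plot is just one option; adapt the idea to whichever charting tool you prefer:

```js
// Observable cell: group campaigns by status and draw one bar per group.
Plot.plot({
  marginLeft: 80,
  x: {label: "Campaigns"},
  marks: [
    Plot.barX(campaigns, Plot.groupY({x: "count"}, {y: "status", fill: "status"})),
    Plot.ruleX([0])   // baseline at zero
  ]
})
```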
Conclusion
In this article, I've tried to explain the process and tools we used to create this project. It wasn't about the project itself but about the process, so you can adapt it to your needs and start from an initial structure that helps you achieve what you want.
Please let me know if you use it, whether it was helpful, and what you used it for.