
Unlocking the Power of Analytics: A Journey Through Spotfire, Power BI and Tableau

The world of analytics is rapidly evolving.

You must keep up with the latest tools and follow recent releases, at least if you want to accomplish anything and bring a minimum of innovation to the people around you. With so many things happening at the same time, you cannot follow everything; you must be selective and focus on the right technologies. The first step is to accept that no single tool can bring all the answers. The second step is to acknowledge that using the right tool from the start, and understanding both its core and minor capabilities, can elevate the experience of an entire organization. Many times, it can change the way we address a problem.

Initially I started with one tool, Spotfire, and progressed a lot in my understanding of data and analytics, to the point that I started to coach and advise colleagues about it. Whilst I was greatly enjoying my progress in this domain, I started to share some of my creations. This is when I began to face a certain resistance to change around me. The fundamental reason was that the tool I was using was not the only tool on the market. At the same time, I needed to explain the differences with other tools, which I did not know much about. I was happy to highlight some differences, but I was not convincing enough to effectively influence the people around me. Being a one-tool expert, I was not speaking the same language as other people who were also experts in a single analytics tool.

To address this gap, I decided to learn other tools and be more open to using different software for analytics. Within a few years, I became a specialist in both Power BI and Tableau, two behemoths of the BI world. I must admit that it was not a simple journey. As I said, I was once a "one-tool only" person, just like the other "one-tool BI experts". Learning a new BI software is a big investment that takes a lot of time. It is easier to think that learning another tool would be a waste of time, and to keep believing that the single tool you use today is the best in the world.

That is how I discovered why people didn't want to change, why they didn't want to switch or even try something new. Without exploring other tools, you stay in your comfort zone. It is easier that way. You are convinced that the tool you, and the company, are using is the best. Besides, this is a good reason for not learning, learning being perceived here as spending time on something that makes you think twice about what you have been doing until now. Sure of yourself, there is nothing more to learn. And since the world of BI and analytics is vast, you don't have to make any effort at all. Stay a "one-tool expert" and enjoy it. If you are challenged, you can always find your way out and mention that you are relying on "robust solutions" and "top-notch quality analytics", without really knowing what other tools can provide, or whether they are more robust or bring better quality. In other words, you will remain biased.

To move away from this, you will have to learn what other tools can provide and what their key features are. There are hundreds of them. In some cases, you will find that the main BI tools compete with each other, and sometimes that they complement each other. Amazing discoveries ahead. You will also have to learn about the differentiators. What makes one BI tool so different from another? Again, there are dozens of them. You can rely on the Gartner or Forrester reports, heavily shared and commented on, which give an overview of the market. However, are these IT reports practical enough for your daily challenges? Will you learn something new that you will be able to apply? Maybe not.

That is why in this review I will explain the main differentiators and limitations of the following tools:

  • Spotfire
  • Power BI
  • Tableau

What follows is not another comparison that tries to prove a point for one BI tool over another.

The purpose of this review is rather:

  • To help others with the transition from one tool to another by knowing the fundamental differences.
  • To explore new tools that you can add to your repertoire as a Data Analyst, Data Scientist, BI Specialist, BI Manager, or Data/Analytics Lead.
  • To create further awareness of the main capabilities of the different BI tools studied here.

Most of the comments that follow are therefore practical and are mainly addressed to professionals working in analytics. Nevertheless, their applicability extends beyond this specific audience, offering insights to anyone interested in data and analytics.

Additionally, these insights can also be beneficial for BI companies, giving them a better understanding of how their tools are perceived and used today. Without further ado, let's get started.

Spotfire, the All-in-One Powerhouse for Advanced Analytics and Data Magic

Let's start this review with Spotfire. The first thing you will notice when using Spotfire is the speed. From the very beginning, you can quickly analyze any dataset and extract pretty much all the insights without having to spend too much time in the backend of the software. It all depends on the dataset and the data quality of course, but after using it on many occasions, I have to say that the tool is extremely versatile. It adapts to pretty much any kind of dataset whilst remaining extremely stable over time. Having the data canvas (the analytics backend) next to the visuals (the analytics frontend) is a game-changer from the start.

The native filtering capability, including markings, is a power feature here. It is by default very dynamic, very handy, and extremely customizable. Given the amount of data we handle every day, a strong filtering option is important and gives an edge in interactivity. A huge plus: in most cases you do not have to create filters because they are already created for you, a considerable time saver when you open a dataset. On top of this, your filters are by default carried forward to the different pages and dashboards, an enormous plus for user interactivity right from the start.

What usually surprises people about Spotfire is not the filtering but rather the data wrangling. Whether or not you have the ugliest datasets on Earth, you will be able to process them within the same interface. No need to use a data prep extension or separate software: Spotfire does most of the job for you. You save time because you don't have to jump and flick through several windows to handle a dataset from A to Z. Spotfire is a time-saver.

The all-in-one interface makes it easy because you can build visuals along the way and fine-tune them over time very easily. It is hard to notice at first, but it becomes more obvious when using other software. The drag and drop of fields, combined with the resizing of visuals (available to front-end users as well), makes it very handy. Anyone can adapt the dashboard online, based on their screen settings. This is a huge thing, as user-friendliness is key in analytics and helps drive adoption of your dashboards. On top of that, you can use properties to interact with your data further by passing variables or text of any kind. Very flexible, it opens the door to scenario handling, dynamic scripting… Endless possibilities to interact with your data and push it to the next level of insight, all while still saving time.
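To make this more concrete, here is a minimal IronPython sketch of what passing a variable through a document property can look like. It assumes a document property named "Region" (typically driven by a text area property control), a script parameter `viz` pointing to a visualization, and a `[Region]` column; these names are placeholders of mine, not anything Spotfire prescribes.

```python
# Minimal sketch: reuse a document property inside a visual (names are placeholders).
# `Document` is available implicitly in the Spotfire IronPython script context.
from Spotfire.Dxp.Application.Visuals import VisualContent

region = Document.Properties["Region"]            # value set by a property control
vc = viz.As[VisualContent]()                      # generic API of the visualization parameter
vc.Title = "Overview - " + str(region)            # echo the property in the visual title
vc.Data.WhereClauseExpression = '[Region] = "' + str(region) + '"'  # limit the visual's data
```

Attached to a property change or a button, a few lines like these are enough to make a whole page react to a single user input.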

One word on geo-analytics: fantastic! Whether it is geocoding data with GPS coordinates, coloring countries, regions or districts on a map, using several layers, or creating polygons from your datapoints, Spotfire stands out. In addition, one can incorporate scripts (see below) to unleash the AI beast and visualize on a map whatever you ever dreamed of. The fact that one can easily combine and manage multiple layers of data and make them interact with each other is definitely a plus. When we think about analytics today, we cannot exclude geo-analytics, and here we have a very strong case for building powerful analytics solutions from A to Z.

I cannot write about Spotfire without mentioning the scripting possibilities: one can compute simple or very complex calculations without experiencing the computer slowness that every analyst fears. This is extremely useful and can also be a time-saver. However, because it is less known and a bit more complex, people tend to forget about it. I think it is a pity, because there is so much automation that can be brought in through scripting. Let us point out that Spotfire is mostly built with C#, and IronPython can access the C# API. Basically, it means that you can interact with almost any Spotfire element thanks to IronPython. Very convenient and very valuable. Cherry on the cake: Spotfire is regularly enhanced to integrate data functions out of the box through flyout options, which makes it even more approachable for newcomers and non-coders. R and Python aficionados will love the TERR console and the native Python data functions for experimenting with and improving their scripts.
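To illustrate how approachable a native Python data function can be, here is a small sketch. It assumes an input parameter `sales` registered as a Table (delivered to the script as a pandas DataFrame) and an output parameter `summary` registered as a Table; the column names are placeholders chosen for the example.

```python
# Sketch of a native Python data function: aggregate an input table into an output table.
import pandas as pd

# `sales` arrives as a pandas DataFrame because the input parameter is registered as a Table.
summary = (
    sales.groupby("Region", as_index=False)["Amount"]
         .sum()
         .rename(columns={"Amount": "TotalAmount"})
)
summary["RefreshedAt"] = pd.Timestamp.now()   # small audit column showing when the function ran
# Spotfire maps `summary` back to a data table that visuals can use immediately.
```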

In my opinion, Spotfire could progress on several inter-connected topics: overall brand awareness, learning materials and user interaction. Regarding brand awareness, Spotfire is still playing the role of Tom Thumb in the world of BI. Pushing awareness a little further and daring to invest more, wisely, in overall communication could be a good idea. A little bit of SEO/SEM, sponsoring and affiliation wouldn't hurt at this stage.

Regarding learning, the content is less organized than for Power BI and Tableau. There is an Enablement Hub which is amazing, but it could be better organized. Anyone new landing on these learning pages could feel overwhelmed, for good reason. Newcomers are indeed lacking a little bit of support. There is no properly organized learning path or learning journey, and I think this is a big miss. Sharing tips about Python scripts is fine, but it is typically not what newcomers in BI need first. Newcomers need content fast and need their questions answered fast. If they don't find what they need when starting to use a tool, they soon consider another one for which a solution to their problem is available... Also, a strong learning platform is a great way to push brand awareness. At the time of writing, there seems to be a positive development, as a new Spotfire website is seeing the light of day. Definitely something to look at more closely in the coming months.

About user interaction, or rather customer interaction, I believe there is also room for improvement. If I have an issue with something, I can raise it through the online community and this usually works fine. Most of the time I get an answer from experienced people, and this is truly magical. However, one little thing bugs me, and it concerns the concept of sharing ideas to suggest improvements. The idea of an online suggestion box is not bad at all: anyone can express themselves and share ideas for improvement. However, over time, thousands of ideas have popped up and many of them seem to be discarded by default, with several thousand sitting in the "Future Consideration" category. Probably less than 5% will see the light of day. Of course, it is impossible to meet every customer's needs and make them all happy by considering every suggestion, some of which are contradictory. But I would prefer to hear that directly, rather than hoping that somebody, someday, picks a few ideas out of thousands.

Power BI, The Business Intelligence Platform For Your Team

Power BI is impossible to ignore because it is part of the Microsoft Power Platform. Though very rudimentary and somewhat limited in terms of visualizations when it was launched, it has added capabilities over the years that make it more and more promising.

When you open Power BI for the first time, you may be intimidated by the number of menus and panes. Don't panic! This is something Microsoft Office users are already familiar with. Power BI is a bit like an extension of Power Pivot and Access, with the possibility of adding visuals (which are far better than the Access charts, for those who remember them) and sharing them in the cloud via the Power BI service.

You will notice it from the start: Power BI is well integrated into the suite of Microsoft apps. You can create a OneDrive workspace, publish your analytics in Teams and import (more or less seamlessly) data from Microsoft Lists or a SharePoint folder. You can also consider Dataverse via a shared Power Apps app in Teams. The more you use Microsoft products, the more you will see how well Power BI is integrated and how well it responds to the needs of the business and the needs of your teams.

Once data is loaded for your analysis, you will have the option to use Power Query and clean your data a little. Power Query is indeed quite efficient at cleaning and scratching the untidy surface of your datasets; sometimes you will even be able to reorganize your data in a better way thanks to pivoting. Keep in mind, however, that it won't perform extremely well if you have massive and complex data cleaning to do. Again, it will depend on how your data is structured from the start.

Before working on your visuals, you may want to have a look at the data model. Here we are in the Access-like part of Power BI. As with its remote ancestor, you will be able to connect tables with one-to-many and one-to-one relationships. It also comes with new features, like establishing inactive relationships or applying security filters. From there you can also define how your data source is set up (DirectQuery, Dual or Import), and you will be able to create hierarchies, as you could with Spotfire and Tableau. It is a convenient menu for organizing your data tables and how they interact with each other. On certain occasions it can provide more flexibility than Spotfire and Tableau, and at times you will be stuck. The good thing is that it pushes you to get structured and to become familiar with data modeling concepts like the star schema or the snowflake schema.

Once your data is loaded, prepped up and correctly linked, you can go to the report view and start arranging a few visuals. Like Spotfire, you can work directly on the same worksheet with several visuals. Not being a big fan of how Microsoft sets up the overall layout, I really needed some time to find my way. Some of the most interesting menus are really hard to find, whilst some totally useless ones (at least for me) sit on top. Alright, let's assume you want to build a bar chart. You go to the visual menu and, surprise! It is not one bar chart option that you get out of the box but six! Eight if you include composite charts (bar and line chart together). Absolutely crazy and very effective: you just have to click and tadaa, your visual appears! You won't have to scratch your head about reversing axes, horizontal or vertical; Power BI has already done it for you. Of course this means less flexibility, but I think it is great that Microsoft addressed this topic head-on. Bar charts are indeed the most popular visuals. The native visual gallery also contains small gems, like a dynamic funnel that I find pretty useful in some cases.

Power BI also enhances the traditional visuals by adding a few AI-driven visuals on top, to spot the invisible in your data. One therefore has the luxury of using a Key Influencers chart (with its lollipop-style display), a decomposition tree and a Q&A visualization. I wouldn't say that one makes huge discoveries thanks to these visuals, but they can certainly help you better understand what lies behind, what is underlying, and perhaps where a problem is coming from. To that extent, I find the decomposition tree particularly well suited.

Another big strength of Power BI regarding the choice of visualizations is the third-party visual marketplace, which can really help out in some situations. Since I'm focusing on native features I won't delve too much into these, but it is surely something to consider one day if you are running out of visuals. Watch out: not all of these third-party visuals are free.

After placing a few visuals on your dashboard-to-be, you will realize that you need to make extra calculations or create some new columns. Here again, remember that for data wrangling you will need to go through Power Query most of the time, but not always. Sometimes you will just need an additional measure, and that can be done very easily from the report view. Sometimes you will need a custom column added to your data model, and for that you can use the Power Query Editor. It will depend on how you want the data to appear and how you want to interact with it. One thing is sure: you won't be able to do much without learning the basics of the DAX language. Thanks to DAX, you will be able to perform advanced calculations and take advantage of the filter context. Extremely powerful, especially when used inside the matrix visualization, DAX will give you the additional flexibility that is sometimes missing.

There are of course limitations, as with all BI tools. If you are fond of maps and into spatial analytics, Power BI will not respond to all your needs. Drawing a map with a few countries will be fine, but going further in granularity might be either time-consuming or unsuitable design-wise. It is not a strength and should be used sparingly. It is actually pretty archaic and can detract from your work, so be very cautious with it. Another thing I mentioned earlier is the overall menu and layout. I dream of right-clicking on my visual of interest and getting all my options nicely sorted out, but no. At the moment, I have to go into this maze on the right of my screen and click my way to the third menu of the fifth section at the bottom, for the twentieth time of the day, to do what I want. A bit tedious over time. I also wish I could customize some visuals a bit more, with more useful options, and have a better grip on my labels when they are placed on a chart…

Power BI is designed to do BI, and within the world of BI I would say it leans more towards reporting than analytics. One can define and share KPIs effectively in different ways. Power BI marvelously handles small datasets that need to be quickly shared with a team in Teams, for instance. It is made for that. One can of course do a bit of data discovery, and it will do a good job most of the time, but it is not made for it. I believe this is important to mention because, as a project grows, you will sometimes need more and more advanced analytics, with true capabilities to support it in the long run. This is why it is so important to assess the complexity of the datasets and the degree of customization needed from the start. And, of course, to keep your end users happy and enjoying your solutions.

Tableau, The Ready-Made Solution for Data Analysis

Tableau visuals usually stand out first for their design: neat and classy, almost austere. The minimalist interface makes them nice and shiny. Very appealing and very tempting. No need to spend too much time on design at first. Whatever visual you choose will look like the most fashionable thing on your screen. Even if you don't have much to show, it will look nice once displayed in Tableau. This is a plus in today's world, where storytelling is key. Tableau Desktop will for sure accompany you well on this journey... provided you don't have too much data.

It is not just the design that makes it attractive at first. As its name rightfully indicates, despite Tableau being an American company, "tableau" is French for "table". And it happens that Tableau is extremely good at handling tables! Anyone confident with Tableau will show you a neat table with appealing icons. It is a strength, and Tableau's possibilities on that front are very strong. You can perform table calculations across, down, across then down, down then across, and manage calculations at pane or cell level… Great when you don't know how to code and don't want to scratch your head for too long. Lots of possibilities to investigate. If you mostly have tables to display in one or two views, Tableau is your tool.

Another thing: pretty much anyone can feel like an analytics expert within a few seconds. Tableau Desktop is good at building confidence because most elements are available out of the box. The ready-made spirit. Draw a bar chart, change it to a line chart, add reference lines: all this within a few clicks. You can even post your work on Tableau Public and share it with the masses. Every dashboard looks amazing there. Consistent color palettes, amazing charts, happy people. Until you realize that most of these dashboards are basically hosted on a single page with limited data and limited interactions. It of course depends on the definition we give to dashboards. In Tableau, a dashboard is a superposition of worksheets, meaning visuals. It does not necessarily have to be interactive.

Groups and sets are a good example of how practical analytics can look in Tableau. Convenient and relatively easy to implement, you can use and abuse these features to enhance your interaction with your data and enrich your dashboards with straight-to-the-point visuals that update dynamically based on what is selected. With groups you can simply regroup different elements of a column, whilst with sets you define a condition for belonging (or not) to a subset of the data. Very handy and extremely useful for answering complex questions. Knowing that one can define fixed or dynamic sets, and combine sets together, it offers lots of possibilities to answer very complex questions in just a few clicks. Groups and sets really are instrumental to properly mastering Tableau.

Where Tableau is both strong and weak at the same time is in the making of visuals. Unlike Spotfire or Power BI, one cannot build a dashboard composed of several visuals right away when opening a Tableau workbook. One must first create a page for each single visual, and only then gather them into one dashboard. On one side, it brings a lot of focus to the visual you are building: one full page to pimp it up with tons of options. Almost overwhelming. Tableau visuals are probably more customizable than those of Spotfire or Power BI. The cost is, of course, the time to build these pages, but also the time dedicated to maintaining your dashboards. Every time you want to change something in your dashboard, you will first have to go to the page of that visual. This best exemplifies the micro-management of visuals. If you already have eight or ten dashboards (each combining at least two visuals), it means you have at least sixteen or twenty pages, one for each visual. Going from one page to another may become a pain over time, and clicking through over and over becomes more painful still. Think about it when you have a large analysis involving several pages. In exchange, your visuals will look pretty. Some will see this as a powerful thing, others will do everything to avoid it. It depends on your style.

The first difficulties will appear with large and complex datasets. When loading big datasets, I close my other Tableau dashboards and breathe deeply. Tableau tends to crash easily, even though my laptop shows no sign of fatigue. I have noticed it several times already: keeping several Tableau workbooks open in different windows considerably increases the risk of a crash. Once that is overcome, the main challenge will be the handling and preparation of large datasets. If you have just a few basic unions to make between tables it will go fine; more complex transformations, however, will require Tableau Prep… Tableau Prep offers good possibilities for handling datasets, but it brings along the weakness of segmenting your activity. You will not be able to handle everything in Tableau Prep, and you will not handle everything in Tableau Desktop. As a result, if you have several Tableau dashboards open and need to amend the data transformation, you will have to open several Tableau Prep files, make your changes and launch the workflows again. Any time you have to make a change that restructures the data, you will have to go through Tableau Prep. The positive side, however, is that you will be able to perform slightly more advanced transformations than in Tableau Desktop and bring in some automation, producing new files (in Tableau's own extract format) or scheduling refreshes and publishing, which can be quite nice, I have to say.

I briefly mentioned storytelling at the beginning. Tableau is great at combining text and visuals. To tell a story and guide your peers, it is a great feature that is probably a bit underrated.

Though I was easily drawn into the tool by the promising visuals and the amazing marketing around it, I soon realized that the trendy look of the visuals comes with a trap: the micro-management of visuals. As if the advanced design features were somehow meant to compensate for the shortcomings in data handling. At some point you will want a different way of displaying your visuals and to brand them a little; to achieve that, you will need to spend hours on design alone. Hours spent exclusively on tiny design details, hundreds of clicks, making your tables and charts look the way you would like, because the design options are like a maze, not necessarily straightforward.

In my opinion, Tableau gives too many "locked-in" options for changing visuals. For me, the bottom line is mostly the size of the data and the story I have to tell with the data that I have. Using large datasets dynamically with advanced features and performing heavy calculations is currently a bottleneck, because it means a degraded experience for me and my end users.

Tableau is definitely a good tool and should be chosen if you don't have more solid options. Having said that, my best advice is to assess what your datasets look like and whether you need to perform lots of transformations. If the main expectation is to summarize small or mid-sized datasets in a few bar charts and tables, if your stakeholders are happy with the design, and if you are not planning to connect too many data tables together inside Tableau, you can just go ahead. If we are talking about bigger datasets, more customization, more data transformations and scripting, I would probably consider another tool for the time being.

Final Word

At times, I have heard that certain business intelligence (BI) tools come with a steeper learning curve than others. I tend to disagree. On multiple occasions, I have observed individuals mastering any of these tools in just a few weeks, effortlessly using complex features without any issue. It depends on the engagement one is able to put in, right from day one. Let's be clear here: you can't force somebody to use a tool.

The other important thing to remember is that none of these tools is fundamentally bad. Each of them, used correctly, can convert your data into gems. The only thing you have to keep in mind is that, depending on your purpose, your audience (and its habits), your datasets and your skills, you might not be able to handle everything in one tool.

In my case, I need a workhorse to support big data handling, to use features that don't exist anywhere else, and to perform advanced analytics with scripting. For that it's a no-brainer: I use Spotfire. With Power BI I can publish online and easily share the reporting I build in Teams. If my audience is not into data and analytics, is quite involved in Teams and is only interested in consuming basic KPIs, Power BI comes to mind first. If I need to perform particular calculations on a table and display it nicely, using groups and sets, I would rather consider Tableau. Each tool is different, and each one serves a different purpose.

The best advice I can give here is to try them out; there is absolutely nothing to lose!

#DataScience #Data #Tableau #PowerBI #Spotfire #Microsoft #Analytics #BI #DataAnalytics #BigData

#BusinessIntelligence #DataDriven #AnalyticsInAction #TechInnovation #DigitalTransformation

Hoang Nguyen Huy

Reporting Specialist | Data Analyst | Python

2 months ago

I can use both Spotfire and Power BI well. Honestly, it is a trade-off between their functions. While it has a much larger community and a strong starting point, Power BI also has its own big pool of visualizations (which Spotfire is catching up with through mods, but is still far behind). But the main point is that the Power Query M language for data transformation feels quite dated for large-scale* datasets, and it becomes extremely hard when you start your own calculations with DAX at the row-context level. On the other hand, Spotfire is extremely easy for experienced BI users to pick up, and stays steady even on larger-scale* datasets, but the limits of visualization and customization bring up some regrets for dashboard building. (I also used IronPython and data functions a lot for automation in Spotfire.) *Tested on ~30M rows and transformations, without any pandas support, on both platforms.


How does the release of Microsoft Fabric change this review?

Matthew Callahan

Sr. Principal Clinical Data Scientist - Solution Owner, Clinical Analytics Services at MaxisIT

1 year ago

Excellent overview of the foremost BI tools in data analytics. Spotfire's strengths have never been so succinctly and accurately articulated. Thank you so much for this.

Catherine Sirven

LifeHub animation. Innov4Ag program at Bayer

1 year ago

Wow!!! This article is just excellent! Thank you so much Vincent Thuilot for sharing your thoughts. What I find amazing is the combination of technical detail and mindset considerations. The "why" is given the most important place and the "how" is just there to serve the main objective... It is a perfect example of how digital should enlighten users, not dazzle them... I can't recommend reading the complete article enough, and I will share it broadly with my learner community! THANK YOU!!!

Andrew Berridge

Senior Principal Data Scientist, Spotfire

1 year ago

Hey Vincent - nice article! One small correction... You say that Spotfire is mostly built with IronPython - this isn't correct. It's mostly built with C#, and you can access the C# API using IronPython.
