MeasureCamp Stockholm 2024 Takeaways
Alban Gérôme
Founder, SaaS Pimp and Automation Expert, Intercontinental Speaker. Not a Data Analyst, not a Web Analyst, not a Web Developer, not a Front-end Developer, not a Back-end Developer.
Some of you are eagerly waiting for my post-MeasureCamp notes. Thank you very much for your support! Considering the eight hours it takes me on average to empty my brain, I realise the daunting task facing me, especially after returning from a Georgian restaurant on the outskirts of Stockholm for lunch. With its multicultural life, Stockholm offers many options for trying new cuisines; it might come second only to London. MeasureCamp Stockholm took place yesterday for the fourth time in the city, and I have been fortunate to attend every year so far. I hope it continues, but I need more students for my Javascript Senpai online course! Without further ado, let's start with my takeaways.
Yours truly and Victoria Smith: Running a Session, Moi?
This was a repeat of the session I ran at MeasureCamp London. With Vicky co-presenting, I could run it more like I had planned in London, i.e. going around the table rather than me and a few others leading the discussion while most remained silent. When running a session of that nature, you realise that a slight bias makes attendees value sessions where they gain external knowledge over sessions where they learn about themselves. This is by no means a reproach. I knew the attendance would be small for such a topic, which would help Vicky: not only had she never given a session before, but I also didn't tell her she would be co-presenting until the last minute.
Her apprehension didn't last long once she realised I had booked the smallest room and that five people attended, not 50. Besides, the session format, i.e. going around the table, affords the advantage of needing no deck or preparation. An essential aspect of such a session is to help people realise that they are not the only ones terrified of public speaking. It also helps them realise that what's blocking them is far from unique; they are often surprised to find that they share a specific blocker with others. Sometimes, one attendee will describe what they all go through more eloquently than the rest. Here are the blockers we identified:
Facing an unknown audience comes with its own set of challenges. If public speaking terrifies you, you may be introverted. Introverted does not mean shy. Introverts find networking events draining, whereas extroverts find such events energising. Imagine being an introvert on a stage with all eyes on you. Introverts recharge their batteries in solitude. That's exactly how I was at my first MeasureCamps. At one point, I had to take a break to recharge mentally. However, the more MeasureCamps I attended and the more people I talked to, the more slowly my energy depleted, eventually reaching neutral and then giving me more energy, as if I were a bit more extroverted. For me, it takes knowing about 20% of the people in the room to feel that way. I assume that this number varies for everyone.
Calibrating the content to the audience is about mitigating the risk that your audience does not understand you. The typical case is you being very technical, but your audience is anything but. Conversely, you might cover the basics of a topic while presenting to an audience of experts. How do you avoid putting all that effort into preparing and presenting for nothing? How do you find the balance? Let's consider the what, the why, and the how of your talk, i.e. what you tried to achieve and the results, why it matters, and how you achieved it. With an audience of laypeople, I would focus on the what and the why and drop the how. All three are helpful to an audience of experts, but they would appreciate and expect a focus on the how.
The lack of professional experience was another blocker cited by a few attendees. I interpret this as the infamous imposter syndrome. Imposter syndrome makes people feel they do not deserve to be where they are in their careers. They are convinced that far more experienced people could do their jobs and that their getting the job instead was some mistake. In the context of public speaking, they feel someone else would be the better speaker. There will always be people with more public speaking experience, but they had to start from nothing, just as you do. From a substance point of view, i.e. the content of the presentation, getting and keeping the attention of experts can feel like a challenge. Remember that these experts were once someone like you. Self-talk, i.e. how we talk to ourselves, is often harsher than what we imagine others think of us. All these experts were once a little rough around the edges, and for many of them, you will remind them of themselves when they had accrued similar experience.
Finding an interesting topic is a blocker I found particularly relevant to me. The topics find me rather than the other way around. In my first few weeks working for Barclays Bank in London, our developers turned the login pages of the online banking section into a single-page application (SPA). The developers should have communicated with us about that change, but suddenly our page views dropped, except for the first web page of the login process. I learned the Javascript framework the developers used and created a few mock-up web pages showing how to make them work with our digital analytics tool. A few months later, I registered a web domain, got a web server with a hosting firm and created new mock-up pages for myself. That became my first MeasureCamp talk.
Dealing with Q&A sessions was another interesting blocker mentioned. Imagine the blood draining from your face when someone asks you a question you cannot answer! Eddie, who mentioned that blocker, shared how he learned to deal with it: he throws the question back to the audience! It is much better than simply admitting that you don't know the answer: by asking the audience, someone will often have a sensible answer for you and everybody else.
A few value preparation as a mitigation strategy. I have seen some keynote speakers who learned their presentations by heart. That is all well and good when your memory is infallible, but we are only human. Preparation feels like a fragile crutch when you realise you skipped a key point in your presentation. At the London session, Ton Wesseling recommended this: nobody else knows you forgot that key point; panicking is guaranteed to make them realise your terrifying secret. Finish your current point and then introduce the point you forgot. It's more important to come across as authentic, genuine and relatable than in control.
One blocker we did not hear from the attendees is the fear of missing out (FOMO). While you run your own talk in one slot, another talk might be happening that you wished you had attended but no longer can. I recommend reaching out to the speaker, asking them to share their deck and arranging a one-to-one conference call where they give you the lowdown on their presentation.
Although understanding one's blockers is critical, mitigating them may not be sufficient. From personal experience, I felt like I was leeching off the people who volunteered to present; in other words, I felt guilty, which motivated me to start presenting at my third MeasureCamp. What if we all had a switch inside us overriding all our blockers? Some famous singers have, or had, this larger-than-life persona on stage while being very shy in private. Freddie Mercury, Prince, and Jim Morrison come to mind. They found their switch, turning them into someone else for a few hours, only to return to their true selves when the curtain closed.
I highly recommend reading my notes about the same session at MeasureCamp London six weeks ago, as they provide further insight. Thanks again, Vicky, for going along with my scary plan and giving me the session idea last year.
Steen Rasmussen: Decision vs Data
As digital analysts, we can only add business value if we get buy-in, i.e., get the stakeholders to make changes based on our recommendations. Most commonly, our recommendations fail to have an impact. Is it because our stakeholders are stupid or evil? Hanlon's Razor suggests that the former is far more likely. Our world may seem evil sometimes, but that is more likely the consequence of accumulated stupidity.
According to Steen, our field faces a crisis of faith. Only 1 in 3 CEOs trust data, and even we practitioners do not trust much of the data we collect, yet we trust our recommendations. We can argue over the best terminology: are companies data-driven, data-informed, or data-inspired? What about data-deceived? The deception may result from data cherrypicking and confirmation bias rather than outright data falsification or fabrication.
We should not start by asking the stakeholders what they want to track because their answer will be "everything", which breaches GDPR Article 5 against data hoarding. When you track everything, you need to maintain everything, said Steen. You also need the headcount to analyse all that data because data left unanalysed is the definition of hoarded data. The more data you collect, the more you enable data cherrypicking, i.e. selecting the data that supports your point of view while blaming poor data quality for the data that does not. Data cherrypicking is not the only root cause of data deception. The more data you collect, the greater the risk of spurious correlations, i.e. false causality. Your company would start taking actions that fail to produce results or backfire.
According to Steen, we should see a healthy balance of tech, process and people, with digital delivering business value at the intersection of all three. However, with data tool vendors promising tools that require little staff and little training, we see companies spending their budget on tech, leaving process and people as an afterthought, and no intersection between the three. Outside digital analytics, too, there is a widespread but largely ignored need to focus on people: without their buy-in, the last-mile problem of taking action on actionable insights never gets solved. The focus on tech spending is something they lament as well.
What if we sought to support business decisions instead of asking what the stakeholders want to track? Companies invested in digital analytics to improve their business. What are they trying to improve right now that we can support? We need rules on how we collect data, how we share the raw data and with whom we share it. We need data governance.
Businesses engage in what Steen calls one-door projects, i.e. Big Bang changes that you can't unwind. Putting the toothpaste back in the tube after jumping on it is hard. Digital analytics provides an alternative: a two-way door in the shape of the plan-do-check-act (PDCA) loop, which should be familiar to all A/B testing maestros out there. We push changes onto a small test group and measure its uplift over the control. If the uplift is positive, significant, and greater than the error margin, we roll out the changes to everybody. If it's inconclusive or fails, we try a new testing hypothesis instead and accumulate small gains, which translate into tangible business value over one, two or more years.
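To make the "check" step of that loop concrete, here is a minimal sketch of one common way to gauge whether an uplift is significant: a two-proportion z-test on hypothetical visitor and conversion counts. It is only an illustration of the PDCA idea, not Steen's method or anyone's production code.

```python
from math import sqrt

# Hypothetical numbers: visitors and conversions for the unchanged
# control group and the small test group that received the change.
control_visitors, control_conversions = 10_000, 480
test_visitors, test_conversions = 1_000, 65

p_control = control_conversions / control_visitors  # baseline conversion rate
p_test = test_conversions / test_visitors           # conversion rate with the change
uplift = (p_test - p_control) / p_control           # relative uplift

# Two-proportion z-test: is the difference larger than chance alone explains?
p_pooled = (control_conversions + test_conversions) / (control_visitors + test_visitors)
standard_error = sqrt(p_pooled * (1 - p_pooled) * (1 / control_visitors + 1 / test_visitors))
z = (p_test - p_control) / standard_error

print(f"Relative uplift: {uplift:+.1%}, z = {z:.2f}")
if z > 1.96:  # 1.96 is the two-sided 95% critical value
    print("Positive and significant: consider rolling out to everybody.")
else:
    print("Inconclusive: keep the two-way door open and try a new hypothesis.")
```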
Steen also laments a diffusion of responsibility over data collection. When it fails, identifying the failure is everybody's job, not just the data analytics team's. You can't spend all day, every day, watching metrics, seeking and reporting anomalies in real time. When it is everybody's responsibility, nobody feels responsible, because everybody assumes that someone else has probably already reported it.
Steen concluded his talk with a bone-chilling number: according to Deloitte, CMOs intend to slash digital analytics budgets by 25% because we have yet to deliver business value.
Matt Gershoff thinks it is neither malice nor stupidity but rather perverse incentives that underlie the business's resistance to data. According to Matt, understanding the incentive structure, i.e. what will help stakeholders get their bonus, is key. Steen agrees.
Steen's talk resonates strongly with the AAAA framework for digital analytics excellence, which I presented at WAW Copenhagen about eight years ago. AAAA stands for Auditable, Accessible (i.e. understandable and shared selectively), Actionable and, not an adjective I know, Acceptance, which is what we ultimately seek from the stakeholders. Auditable includes reproducibility audits, which I also covered in one of my newsletter articles, and aims to address the trust deficit that Steen witnesses. Ultimately, the AAAA framework seeks to reforge the Gartner Hype Cycle into a completely different shape, mostly by recognising that it is the composite of at least two, probably three, curves, the first of which deserves a haircut.
Piotr Gruszecki: Tech Stack For Data Wrangling & Analysis - At Scale
Piotr works in Conversion Rate Optimisation (CRO), where data is needed well beyond A/B testing. The field works much like no-win, no-fee lawyers: no uplift, no pay. His work has three major components, from the most important to the least: workflow, pipes, and tech stack.
Workflow:
There's the myth, i.e. the sales pitch, and the reality
There's variety in data formats:
Data comes with layers - not to be confused with data layers. Each layer pertains to the degree of rawness:
Analysts should not have access to layer 1 data.
Pipelines:
Piotr recommends Airflow for managing data pipes. There are soft and hard pipe crashes: a soft crash means a flow keeps running but pushes contaminated data, so Piotr adds basic checks of his own on top of Airflow to handle pipe crashes.
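As an illustration only, here is a minimal sketch of guarding against soft crashes in an Airflow pipeline: an explicit data-quality task that fails loudly when contaminated data shows up instead of letting it flow downstream. The DAG, task names and checks are hypothetical, not Piotr's actual setup (Airflow 2.4+ assumed).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder for pulling raw data from a source system.
    return [
        {"session_id": "abc", "revenue": 12.5},
        {"session_id": None, "revenue": -3},  # contaminated row for the demo
    ]


def quality_check(ti, **context):
    rows = ti.xcom_pull(task_ids="extract")
    bad = [r for r in rows if r["session_id"] is None or r["revenue"] < 0]
    if bad:
        # Raising turns a silent soft crash into a visible task failure.
        raise ValueError(f"{len(bad)} contaminated rows, refusing to load them")


with DAG(
    dag_id="cro_daily_load",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    check_task = PythonOperator(task_id="quality_check", python_callable=quality_check)
    extract_task >> check_task
```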
Stack:
Piotr is a fan of Quarto, which sounds similar to Jupyter Notebooks. Quarto is good at blending code and descriptions. It also has a low learning curve and gives you a link to share with your stakeholders, which beats emailing huge PowerPoint decks. Reproducibility is key in Piotr's work.
Snowflake is an essential tool in Piotr's stack. Snowflake provides a Copilot LLM function to generate SQL code from natural language.
Piotr prefers Shiny over Power BI but has no choice but to use what his clients use.
Piotr believes that managers are to blame for silos: data amounts to little more than ammunition for meeting showdowns, so what else could you expect but data silos?
Third-party data is dead; deal with it. Focus on zero-party data (asking the customer directly) or first-party data (inferring behaviour from your own website data).
However, when things change so fast, aiming to master one or more tools is a fool's errand. Tools change, tools come and go, so it's wiser to focus on mastering business logic.
As a final remark, Piotr added that 90% of the data he deals with is unstructured.
John Hansfeldt: Building a Chrome Extension: From Concept to Launch
That one was a bit of a walk down memory lane, as I did a session about two Chrome extensions I built over five years ago, i.e. "Duckface" and the "3rd Viscount of Duckface". One of them brute-forced the detection of what's tracked on the page for Adobe Analytics by firing click and change events on every DOM element on the page and capturing the network requests, linking the two. As it scanned the page, it made sounds of ducks having fun on a pond and ended with quack-quack-quack-quack when done. But I never dared get it approved on the Chrome Web Store like John did.
John's journey started with using a collection of Google Chrome extensions to QA various tags, each vendor providing their tool. Since that was too tedious, John began to do his QA off the raw network requests in DevTools, which could probably drive you blind and mad, and maybe not in that order. So, John decided to write his own Chrome extension to bring all the functionality he started with under one single extension. That took him eight months despite not having a developer background. AI helped a lot.
There are a few files without which you can't create an extension:
If your extension will show a popup window, you will also need the following:
Testing can happen locally on your machine and costs nothing. But if you want fame and prestige, you must submit your extension to Google so they can offer it to the general public. That costs $5 for up to twenty extensions. Each review can take up to 30 days. Google will expect promotional images and copy, which John generated efficiently with ChatGPT.
John found Google's reviews rather picky, especially regarding privacy rules, which require linking to your privacy policy from your page. The manifest.json file requires specifying the permissions your extension needs. Any seemingly superfluous permission will get your extension rejected.
I was intrigued by how John captures network requests in his extension, since the onBeforeRequest function is no longer part of Manifest V3 due to privacy concerns. However, John explained that it has been replaced with something else that does the same job. I gave up extension development because extensions are blocked by default at work, and capturing network requests was no longer possible for me under Manifest V3. Instead, I focused my efforts on monkey patching the various Javascript methods that generate HTTP network requests, via Tampermonkey, which I am now porting to a better-supported Node-TypeScript-Playwright stack.
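For flavour, here is a minimal sketch of that request-capture idea using Playwright, shown with its Python binding for brevity (my own port uses Node and TypeScript). The URL filters and the target page are hypothetical, not my actual QA setup.

```python
from playwright.sync_api import sync_playwright

# Hypothetical substrings that mark analytics beacons worth logging.
ANALYTICS_HINTS = ("/b/ss/", "google-analytics.com", "collect")


def log_request(request):
    # Fires for every HTTP request the page makes, including tracking beacons.
    if any(hint in request.url for hint in ANALYTICS_HINTS):
        print(request.method, request.url)
        if request.post_data:
            print("  payload:", request.post_data[:200])


with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.on("request", log_request)    # no monkey patching or extension needed
    page.goto("https://example.com")   # hypothetical page under QA
    page.wait_for_timeout(3000)        # give late beacons time to fire
    browser.close()
```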
Jonas Velander - A session about best uses for AI vs alternatives
The hype around large language models (LLMs) is enormous right now, and many business stakeholders are under the false impression that they are a magic bullet for everything, when machine learning (ML) and even venerable regression analysis deliver better results. In this session, we discussed what LLMs are good for, what they are bad for, and what works better. I will try to rank them below:
My personal experience with AI is rather diverse:
Carl Franzon: Are we heading towards another data hoarding epidemic in the AI gold rush?
Companies face the temptation to track everything just in case, then lay a chatbot or a large language model (LLM) on top of that mass of data.
I felt we should address the elephant in the room: GDPR Article 5 says data hoarding is a breach. Some attendees insightfully remarked that the general public is still rather in the dark about the kind of tools at our disposal. Right now, trust may be high after GDPR. What if the knowledge about session replay tools were more widespread? We could face a backlash.
AI is not a magic bullet. Feeding AI with all the data we can collect can lead us to miss the forest for the trees. Perhaps life is too transient for AI, which relies on a deterministic flow of events, as someone added.
Another participant shared a remarkable story of how they collected weather data and leveraged it to optimise the bidding on paid search terms. The savings were significant for their company.
Emi Olausson Fourounjieva - leveraging AI to drive personal development
Emi shared the belief in the CXO world that companies compete on the volume of data they capture. Letting the competition capture more data and identify trends nobody else saw can lead to significant market share gains at your company's expense.
Closer to the ground where we are, we know it is fallacious reasoning. The more data you capture, the more you facilitate data cherrypicking, confirmation bias, spurious correlations, GDPR Article 5 breaches, and degradation of your ESG scores, to name only a few. We learned last year that all forms of AI combined require as much electricity as Japan. Where are we now? Japan, plus South Korea? The more data you collect, the more cooling you need - water. I wrote some time ago about what is technically collectable, legally permitted and morally defensible. With each level, the scope of data you should collect gets ever smaller.
Echoing what Steen said in the morning, instead of tracking everything, we need to ask more questions:
Emi remarked that the bigger the company, the deeper you must dig.
I brought up the topic of getting buy-in. Writers in some business periodicals report this enduring lack of buy-in and cultural resistance. The latest annual report by NewVantage Partners shows companies seeing less business value in data, with figures lower than we have seen for several years. The impact of AI as the latest hype is clear, and decision-makers are hitching their wagons to the AI train rather than the data train of a few years ago.
A lot of that cultural resistance comes from data contradicting the official narrative. The stakeholders only value data supporting their point of view. If AI supports them and data contradicts them, they will embrace the former and despise the latter. The truth probably lies closer to the idea that contradicting data adds nuance to the official narrative rather than overturning it altogether, as if describing the other side of the same coin. As analysts, we offer only impartial data, without agenda or prejudice.
We often hear that getting buy-in requires excellent communication skills. However, trust plays an important role. As Steen mentioned in the morning, our field faces a trust deficit. You can be the best communicator, but without trust, it's all for nought. A less talented but trusted communicator could have more impact. A participant shared the story of a brilliant colleague who failed to sway the business. Emi summarised this by saying that mindset is crucial.
Another participant offered adaptability as a key soft skill, i.e. keeping an open mind. Some may ignore AI, but like the man who stopped reading the Reader's Digest when he read there that smoking kills, rather than stopping smoking, AI will also impact those who ignore it. Someone added nuance and said an open mind without critical thinking is dangerous. One should not chase the latest shiny fad blindly. By chasing AI, there is a tangible risk of losing knowledge and the ability to maintain what we automate.
Emi told us something that an American psychologist who works with eight- and nine-figure CEOs told her: over 95% of all decisions are emotionally driven. Empathy and emotional intelligence matter, even when we produce ever more data to support decisions. Jeff Bezos has an interesting take on decision-making: focus on the immutable things we know and use them as an anchor for your decisions.
If personal confidence comes with a halo effect that spills over into professional confidence, how does AI impact this? The fear of losing your job to AI can erode that personal confidence. Can AI develop feelings, too? What are our responsibilities when such an AI exists, which, according to Emi, could happen within the next five years?
Last but not least, stort tack till (a big thank you to): Chris Beardsley, Philip Bromley, Lisa Lindh Risberg, Celine Derkert, Ashit Kumar, Josefin Kjellbris, Mikael Malmgren and Anna Forssén.
Thanks to Curamando, Conversionista!, Eidra, Simmer, Conductrics INC, Piwik PRO, Amplitude and all the sponsors I forgot! Thanks to Simon Dahla and Madeleine Hellsén.
And I am only getting started. Thanks to: Jessica Loveng, Valentina Mikhaylevskaya, Hanna Larsson, Åke Rosvall, Jonathan Altgård, Mari Ahava, Simo Ahava, Petri Mertanen, Karoliina "Liina" K., Lotta Holm, Mikko Piippo, Peter Meyer, Gunnar Griese, Caroline Vidal, Johan Strand, Ezequiel Boehler, Nathaniel Weiss, Noemi Boldizsar, Erika Bohlin, Marcin Pluskota, Lasse Hoffmann, Jon Su and everybody I haven't mentioned. You're awesome! It's almost 1 a.m. I think I can schedule that article for publication and go to bed.
#MeasureCamp #MeasureCampSTHLM #DigitalAnalytics #WAWCPH #CBUSWAW