Evolution of Digital Marketing Data Analytics to Google’s Machine Learning
[Header image copyright Pixabay.com]


While many companies are deep into creating and managing digital marketing campaigns across advertising platforms such as Google Ads (formerly AdWords), Display ads, YouTube and video ads, Yahoo ads, Microsoft Bing ads, Amazon ads, and the major social media platforms, the Machine Learning career path has been quietly developing alongside them. In my digital marketing and data analytics journey, I've identified four aspects of this migration into Machine Learning analytics. Why is this happening? My view is that internet privacy is a growing concern for consumers, so the digital marketing industry will need to adjust to changes in consumer interest. Many countries, and many states in the USA, are moving to adopt data governance policies that no longer allow consumers to be tracked throughout their online journey. In response, many companies now notify online customers of cookie consent and tag all events within their own websites to capture user ID information while the consumer is on the site. To make use of historical data from consumers who have visited their websites, many companies are also moving away from expensive server farms and uploading their enterprise datasets to a cloud Software as a Service (SaaS) platform such as Google's BigQuery, Snowflake, Microsoft Azure, or one of the many other options out there. The advantage is that they are only charged when running SQL queries against specific datasets about their online consumers, and they can use the resulting lead lists to retarget those consumers in email campaigns and other more focused campaigns.

Here they are:

Step #1 (Migration to GA4): A Data Analyst will take a company's website, or portfolio of websites, off the legacy Google Universal Analytics 360 e-commerce reporting system and migrate it to the new Google Analytics 4 system. Many of the features that companies large and small loved about Google Universal Analytics 360 and Google's Data Studio dashboards are still there in Google Analytics 4. Companies that used Google Universal Analytics 360 and want to keep their Google Analytics reports without any gap in data collection from their website will need to transition to Google Analytics 4 at a minimum. Because companies are working with a wide variety of website platforms, from Shopify, WordPress, and Magento to legacy code and many others, the transition to GA4 is not going to be that simple. That is why the deadline for migrating from UA 360 to GA4 has been extended from July 2023 to July 2024.

[Image: Migration from UA to GA4 extended until July 2024]

Once the Google Tag Manager scripts are added in the correct places in the theme.liquid code, immediately after the opening <head> tag and the opening <body> tag, the website's analytics will continue to collect e-commerce data without interruption after Google Universal Analytics 360 is phased out and replaced by Google Analytics 4. The new UI for Google Analytics 4 is more streamlined, with consolidated sections that break out the Life Cycle and User reports compared to the previous Google Universal Analytics 360. Below is the new user interface; you can see the difference. If desired, a Data Analyst can connect their Google Analytics 4 data source to Tableau or Microsoft Power BI to build reporting from it, or they can use any of the templates that Google already provides in the rebranded Google Data Studio, which is now Looker Studio. Tableau, Microsoft Power BI, and Google's Looker Studio offer similar customization features in their visualization reports, painting the picture of how large enterprise datasets are performing as KPIs. Over time these e-commerce websites have gathered so much data that Big Data is pushing the digital marketing industry into cloud computing and machine learning, and pushing Data Analysts to become more multi-dimensional: not only communicating technical details to non-technical people, but also embracing light computer programming and cloud database extraction skills.

[Image: UI/UX of GA4 vs. GUA 360]

Also, Google Analytics 4 comes with pre-built event tracking for the most common events, but that does not include e-commerce tracking. If you want the Monetization section to populate, you need to tag e-commerce Data Layers in Google Tag Manager. Tealium iQ is another tagging system for enterprise websites that does the job, along with Adobe Experience Platform's Adobe Launch, which lets Data Analysts generate custom JavaScript code and tag the Data Layers and various events on a website. One caveat about Adobe Analytics (AEP) is that it, too, is evolving into a more robust system: Adobe is discontinuing the Reports & Analytics features within AEP on December 31, 2023, and moving everything to Analysis Workspace. Please take note of AEP's announcement below when deciding how your company wants to track events on your websites using AEP, and make the transition to getting familiar with Analysis Workspace.

[Image: Adobe's Reports & Analytics end-of-life announcement]

As you can see, there is a combination of pre-built and custom URL and UTM tags (events) in Google Analytics 4.

[Image: Preset GA4 tags and custom GTM tags shown in GA4 reporting]

Step #2 (Tagging in GTM): However, instead of tracking e-commerce event activity with Category, Action, and Label as in Google Universal Analytics 360, every click and scroll activity within a website is now a tagged event. All e-commerce activities such as add_to_cart, view_cart, begin_checkout, add_shipping_info, add_payment_info, items, value, currency, tax, coupon, and purchase, along with many other e-commerce datasets, are tagged in Google Tag Manager in the data layer element of tags. Within Google Tag Manager, the Data Analyst can still tag URLs, UTMs, and other landing page metrics as before, and assign potential currency values to the leads submitted on those landing pages if they want to. When the Data Analyst begins to tag the Data Layers for e-commerce, they quickly find out that not all website platforms are created equal for e-commerce tagging, which matters if the Monetization section of Google Analytics 4 reporting is going to populate with real-time revenue metrics. For this article, I am sticking to Google products to keep things easy to follow, so I use the free Google Tag Manager to tag all events and e-commerce data layers on a website. Other tools that do the same job are Adobe Experience Platform and Tealium iQ.


Regardless of the tagging system you use, this is really important to note: MAKE SURE YOU DOCUMENT IN A SPREADSHEET THE NAMES OF EVERYTHING YOU TAG (tags, triggers, variables, parameters, custom JavaScript code, etc.) SO THAT IT IS EASIER TO RUN SQL QUERIES BY COPYING AND PASTING THOSE NAMES INTO THE CODE. Otherwise, you will likely spend days on end tracking down how particular e-commerce data layers and events were tagged and named so that you can reference them in BigQuery. If you don't document the tags in a Google Sheets or Excel file, you will spend endless hours tracing names through the source code, inspecting, debugging, and running sample SQL queries. Why is this important? Cloud SaaS services like BigQuery charge based on how much data a query has to scan across your enterprise cloud storage to locate the fields and metrics you want, whether for reporting or for data modeling in Machine Learning. A Data Analyst can make the mistake of running SELECT *, which is not specific enough to extract the exact tables, triggers, and variables. The query then scans everything, which creates a large bill, say a couple of thousand dollars depending on the enterprise dataset. Multiply that by dozens of queries per day and you are in the negative. I ALSO RECOMMEND KEEPING GOOGLE SHEETS/EXCEL SPREADSHEETS OF YOUR SQL CODE AND HOLDING TEAM POW-WOW MEETINGS ON HOW TO IMPROVE THE CODE FOR SPECIFIC RESULTS BEFORE RUNNING IT. That way, you are wise in how you spend the marketing budget on SQL queries. Just a suggestion.
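To make the cost difference concrete, here is a minimal sketch in BigQuery Standard SQL. The project and dataset names (my-project, analytics_123456) and the date range are placeholders for whatever your GA4 export is actually called; the point is the contrast between an unbounded SELECT * and a column- and date-scoped query.

```sql
-- Anti-pattern: scans every column of every daily export table,
-- so the on-demand bill reflects the full size of the dataset.
SELECT *
FROM `my-project.analytics_123456.events_*`;

-- Better: name only the columns you need and bound the date range with
-- _TABLE_SUFFIX so BigQuery scans (and bills) far less data.
SELECT
  event_date,
  event_name,
  user_pseudo_id
FROM `my-project.analytics_123456.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20230401' AND '20230407'
  AND event_name = 'add_to_cart';
```

The documented tag and event names from your spreadsheet are exactly what you paste into the event_name filter above.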


Also, many companies use a variety of website content management systems to run their e-commerce functions. Depending on the digital marketing team and its stakeholders, they may be running their online commerce on Shopify, WordPress, Magento, hand-coded HTML5/CSS, or any of a wide variety of website platforms. For some of these platforms, pre-built data layer plug-ins with the necessary JavaScript already exist and will sync up with Google Tag Manager and Google Analytics 4. For others, the Data Analyst will have to insert a custom Data Layer script into the code so that the e-commerce tagging populates in Google Tag Manager. Google Tag Manager itself is the same either way: you still have to set up the Tags, Triggers, and Variables. See below for what that looks like on the website I am working on.

[Image: Custom tags set in GTM, shown in GA4 reporting]
[Image: Custom triggers set in GTM, shown in GA4 reporting]
[Image: Custom variables set in GTM, shown in GA4 reporting]

In the preview mode of Google Tag Manager, you can watch real-time e-commerce data come in as a consumer adds items to their shopping cart and begins the purchase, because those Data Layer e-commerce metrics populate in Google Tag Manager. This is valuable analytics for any digital marketing team: they can then see the funnel report in Google Analytics 4, spot where consumers dropped out, and design incentives that pop up during check-out to entice consumers to complete the purchase. The rest of the data can be reported on to determine what the various digital marketing teams can do to prospect these low-hanging leads via e-mail marketing and re-targeted advertising campaigns. At this stage, there is already plenty of value in the data analytics for deep-dive reporting.

[Image: E-commerce Data Layer collects user ID activity in real time and feeds GA4 reporting]
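Looking ahead to Step #3, that same drop-off analysis can be reproduced once the GA4 data is exported to BigQuery. This is a minimal sketch, assuming the standard GA4 BigQuery export schema; the project and dataset names and the dates are placeholders, and it simply counts distinct users at each e-commerce step rather than enforcing a strict session-level sequence.

```sql
-- Rough funnel: distinct users who fired each e-commerce event in one week.
SELECT
  event_name,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM `my-project.analytics_123456.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20230401' AND '20230407'
  AND event_name IN ('view_item', 'add_to_cart', 'begin_checkout', 'purchase')
GROUP BY event_name
ORDER BY users DESC;
```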

Step #3 (Optional: BigQuery): This is perfect for enterprise companies with big data to wrangle. BIG NOTE: Data Analyst teams can break the department budget by running random queries all day without pre-planning the SQL code they want to execute first. Be wise, not careless; strategy over SQL query quantity. I ALSO RECOMMEND KEEPING GOOGLE SHEETS/EXCEL SPREADSHEETS OF YOUR SQL CODE AND HOLDING TEAM POW-WOW MEETINGS ON HOW TO IMPROVE THE CODE FOR SPECIFIC RESULTS BEFORE RUNNING IT. REMEMBER, THE COMPANY IS CHARGED PER QUERY BASED ON HOW MUCH DATA THE QUERY SCANS TO EXTRACT THE SPECIFIC INFORMATION YOU WANT FROM YOUR ENTERPRISE DATASET. The more specific and well-written the SQL, referencing the correct tables and fields that you documented in your Google Tag Manager tagging spreadsheet, the better the result and the more cost-effective the query. I wish BigQuery and the other cloud platforms made the cost of running a query more prominent on screen so the analyst can refine the code before running it; BigQuery's console does display an estimate of how much data a query will process before you run it, and it is worth checking every time, much like seeing the price before you buy in a pay-to-play model. That way, you are wise in how you spend the marketing budget on SQL queries.
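For the team pow-wow itself, one hedged way to ground the discussion is to review what recent queries actually scanned. BigQuery exposes job metadata through INFORMATION_SCHEMA, so a sketch like the one below (the region qualifier is an assumption; point it at the region where your data lives) can rank last week's most expensive queries in the project.

```sql
-- Review last week's most expensive queries in this project (US multi-region).
SELECT
  user_email,
  LEFT(query, 120) AS query_snippet,
  ROUND(total_bytes_billed / POW(1024, 3), 2) AS gib_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
ORDER BY total_bytes_billed DESC
LIMIT 20;
```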


What Google offers such companies is the Google Cloud Platform (GCP). This is attractive to medium-sized and enterprise companies because they can shift budget away from managing large server farms and hardware and have Google Cloud Platform store their big data instead. Then, when they want to mine particular datasets, such as user IDs for online customers who failed to purchase, or any other useful e-commerce list, all they need their Data Analyst to do is run SQL queries through the BigQuery user interface. Data Analysts should be mindful of the SQL scripts they run in BigQuery, because BigQuery is a SaaS (software as a service) product: if they run overly general scripts, BigQuery charges based on how much of the company's cloud dataset has to be scanned to extract the result. Refining the SQL script to be more specific scans less data and is more cost-effective. I recommend following the best practices outlined in Google's Coursera online courses, which include example SQL scripts for running cost-effective queries. The example below shows the different e-commerce Data Layer metrics that your Data Analyst can pull from the cloud database; it is an enterprise dataset from Google's Coursera Qwiklabs.

[Image: Data mining sales-lead lists with SQL queries on the GTM tags in BigQuery (GCP) saves on server farms for storing big data and provides insights for optimizing digital marketing campaigns]
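As one hedged sketch of that kind of lead-list pull, assuming the standard GA4 BigQuery export schema (the project and dataset names and dates below are placeholders), a query like this returns users who began checkout during a week but never purchased:

```sql
-- Users who began checkout in the window but have no purchase event.
WITH checkout_starters AS (
  SELECT DISTINCT user_pseudo_id
  FROM `my-project.analytics_123456.events_*`
  WHERE _TABLE_SUFFIX BETWEEN '20230401' AND '20230407'
    AND event_name = 'begin_checkout'
),
purchasers AS (
  SELECT DISTINCT user_pseudo_id
  FROM `my-project.analytics_123456.events_*`
  WHERE _TABLE_SUFFIX BETWEEN '20230401' AND '20230407'
    AND event_name = 'purchase'
)
SELECT c.user_pseudo_id
FROM checkout_starters AS c
LEFT JOIN purchasers AS p USING (user_pseudo_id)
WHERE p.user_pseudo_id IS NULL;
```

The resulting ID list is exactly the kind of audience described above for e-mail marketing and re-targeted campaigns.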

Now your query results can populate Google's Looker Studio reporting, so your digital marketing teams can sharpen their qualitative and quantitative analysis and run higher-ROI advertising across the many Google ads platforms and other digital channels such as email marketing and social media marketing. Below is a Google BigQuery report on user activity in an education account.

[Image: The same template/custom reports in Google Looker Studio (Data Studio 2.0)]

Many of the BigQuery reports in Looker Studio are templated to give you an idea, but your Data Analyst can customize them just as before in Looker Studio (formerly Data Studio). The image below is a dataset I am working on in BigQuery and Looker Studio.

[Image: Customizing visualization funnel reports for BigQuery datasets in Google's Looker Studio]

Step #4 (Optional: Google Machine Learning): Creating predictive analytics models with deep learning in Google's Machine Learning stack is a little further out for most, since many companies are still in the middle of migrating to Google Analytics 4 and tagging e-commerce in Google Tag Manager, but some enterprise businesses are accelerating fast into this phase of digital marketing analytics, sales, and lead generation, finally building predictive e-commerce models and reports on Google's Machine Learning platform. Before machine learning via Google Cloud's BigQuery platform, the digital marketing profession would evaluate bounce rates, abandoned check-out carts, refunds, and other e-commerce factors and scratch their heads wondering: what if we could predict, with a higher percentage of accuracy, the rate of return buyers who'll buy, or the drop-off rates on different campaigns, so that we could optimize our digital marketing campaigns to be more precise and cost-effective? Well, Google has done it with BigQuery and cloud database storage. Now, medium and large companies with the huge datasets their websites have gathered over time can create predictive models with a high percentage of accuracy. Writing effective SQL queries in BigQuery provides the data analytics insights. Going further, the Data Analyst can run SQL queries to 1. ingest and preprocess the data, 2. build a training dataset in BigQuery Machine Learning, 3. create a predictive model in BigQuery Machine Learning, 4. evaluate the predictive model, and 5. make e-commerce predictions with higher accuracy rates. Below is an example of SQL queries making a prediction with a 92% accuracy rate on what percentage of online visitors will buy on a future return visit.

[Image: Coursera example of CREATE MODEL, EVALUATE MODEL, and PREDICT MODEL SQL queries that I ran in BigQuery, which your Data Analyst would run to see the percentage of return visitors who'll buy a product from the site]
[Image: Coursera example of a SQL query I ran in BigQuery to predict the rate of return customers on a website who'll buy something]
[Image: Coursera dataset I ran in BQML]

Once a predictive model is run in BigQuery, you can export the results to Google Sheets, Looker Studio, or other reporting tools to measure the potential ROI of predictive modeling on the percentage of customers who will buy on a return visit to the website. With this predictive modeling against the company's enterprise database in GCP, a Data Analyst can predict potential revenue for future quarters and fiscal years with high accuracy.
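For readers who want to reproduce something similar, here is a minimal sketch of that BQML workflow against the public Google Analytics sample dataset. The destination dataset name bqml_demo is a placeholder, and the feature choices follow the widely used BigQuery ML quickstart rather than the exact lab shown above.

```sql
-- 1. Train a logistic regression model: did the session end in a transaction?
CREATE OR REPLACE MODEL `bqml_demo.purchase_model`
OPTIONS (model_type = 'logistic_reg') AS
SELECT
  IF(totals.transactions IS NULL, 0, 1) AS label,
  IFNULL(device.operatingSystem, '') AS os,
  device.isMobile AS is_mobile,
  IFNULL(geoNetwork.country, '') AS country,
  IFNULL(totals.pageviews, 0) AS pageviews
FROM `bigquery-public-data.google_analytics_sample.ga_sessions_*`
WHERE _TABLE_SUFFIX BETWEEN '20160801' AND '20170630';

-- 2. Evaluate the model on a held-out month of sessions.
SELECT *
FROM ML.EVALUATE(MODEL `bqml_demo.purchase_model`, (
  SELECT
    IF(totals.transactions IS NULL, 0, 1) AS label,
    IFNULL(device.operatingSystem, '') AS os,
    device.isMobile AS is_mobile,
    IFNULL(geoNetwork.country, '') AS country,
    IFNULL(totals.pageviews, 0) AS pageviews
  FROM `bigquery-public-data.google_analytics_sample.ga_sessions_*`
  WHERE _TABLE_SUFFIX BETWEEN '20170701' AND '20170801'));

-- 3. Predict which countries' visitors are most likely to purchase.
SELECT
  country,
  SUM(predicted_label) AS predicted_purchases
FROM ML.PREDICT(MODEL `bqml_demo.purchase_model`, (
  SELECT
    IFNULL(device.operatingSystem, '') AS os,
    device.isMobile AS is_mobile,
    IFNULL(geoNetwork.country, '') AS country,
    IFNULL(totals.pageviews, 0) AS pageviews
  FROM `bigquery-public-data.google_analytics_sample.ga_sessions_*`
  WHERE _TABLE_SUFFIX BETWEEN '20170701' AND '20170801'))
GROUP BY country
ORDER BY predicted_purchases DESC
LIMIT 10;
```

Swapping the public sample tables for your own GA4 export and documented event parameters turns this sketch into the return-visitor prediction described above.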

As of April 2023, most companies are at Step #1 or Step #2, tagging e-commerce across their websites on different platforms. Google is teaching courses on this to train as many Data Analysts, Data Engineers, and Data Scientists worldwide as possible to catch up to the Machine Learning tools it is building out for digital marketing, e-commerce, and real-world problem-solving with predictive modeling. Many Data Analysts will soon be creating predictive models using Machine Learning and BigQuery; below is a Google Coursera example of what that looks like. Beyond the Data Analyst's space in Big Data are the Data Engineers and Data Scientists, who use all of the above Google tools along with Dataprep and TensorFlow for advanced predictive data modeling and analytics.

[Image: Google Coursera lab work example for predictive modeling of taxi rates]

Ok, now you are in the know. Just as traditional marketing merged into digital marketing, digital marketing is merging with data analytics, and data analytics is now taking digital marketing and e-commerce management into Machine Learning with BigQuery. This space is evolving, and I hope that Machine Learning develops a UI/UX (user interface/user experience) that is friendly enough for many more people to adopt. When computers first entered the scene, we typed commands on a blue screen; adding images and containers to websites made engagement far more user-friendly, and the earlier iterations of Google's various digital marketing management systems have likewise become more user-friendly over time. I predict that the scripts that run in the BigQuery and Machine Learning platforms will also become more user-friendly, which will encourage faster industry adoption. Stan Lee, the legendary Marvel Comics writer and editor, put it this way in Spider-Man: "With great power comes great responsibility." The Data Analysts, Data Engineers, and Data Scientists who enter and mature in the Machine Learning career path should also strive to be individuals of business ethics and integrity, compassionate character, and moral responsibility to humanity. AI and machine learning are tools that the human creator creates with. Choose wisely...

[Image copyright Pixabay.com]
