Here’s how merchants can use experimentation:
1. Run smoke tests to gauge real buying intent before selling new products.
2. Test and personalize product recommendations, up-sells, and cross-sells, including placement, display, algorithm, and filtering rules.
3. Experiment with promotions, pricing, and discounts, and see how they impact sales of particular categories or products.
Experimentation goes way beyond optimizing copy and website pages.
Speero
Internet Publishing
Austin, Texas · 6,602 followers
We help product and marketing teams make better decisions with world-class CRO, CXO, and Experimentation programs.
About us
Speero is an experimentation agency focused on helping product and marketing teams make better, faster decisions using A/B testing and, more broadly, experimentation programs. We are for mid-sized and enterprise organizations looking to build and scale CRO and experimentation. Speero, formerly CXL agency, was founded in 2011 by Peep Laja, the #1 most influential conversion rate optimization expert in the world. We generally serve lead-gen and ecommerce, with clients such as ADP.com, MongoDB, Codecademy, Serta-Simmons, Native Deodorant, Miro.com, and others. Speero has offices in London (UK), Tallinn (Estonia), and Austin, Texas (USA), but we're more and more fully remote!
- Website
- https://speero.com/
- Industry
- Internet Publishing
- Company size
- 11-50 employees
- Headquarters
- Austin, Texas
- Type
- Partnership
- Founded
- 2011
- Specialties
- Conversion Optimization, Customer Experience, CX, CRO, Experimentation, User Research, Optimization, A/B Testing, and Analytics
Locations
Speero employees
Updates
The year ends soon. How do you quickly analyze your experimentation program and plan improvements for 2025, increasing testing velocity and impact in the process? It all starts with analyzing the main pillars of your program:

- People and skills: do you have the right technical staff, project managers/product owners, data analysts, developers/engineers, and UX/UI teams, and how do you train them?
- Strategy and culture: do you have buy-in, evangelize experimentation, collaborate through a Center of Excellence, personalize and target, connect business goals to experimentation strategies, and manage test complexity?
- Data and tools: how do you invest in testing tools, run tests, track data and KPIs, research, and create a knowledge base?
- Process and governance: how do you generate hypotheses, prioritize and roadmap experiments, manage the program, QA, pre-test, analyze, and communicate test results?*

*This is just Speero's way of measuring a program's maturity; you can use your own.

When you understand your starting points and where you are, you can improve your program for more test velocity and impact. Imagine your ideal experimentation situation, then set a scope that can deliver practical results and move the organization toward that ideal. Last tip: sanity-check the whole scope of improving your program. You need to ensure the original scope is realistically achievable in your organization; reformulate if you need to.

PS: We have a free experimentation program audit that lets you analyze all the gaps in your program across all of its aspects. Link in the comments.
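To make that self-assessment concrete, here is a minimal sketch of turning pillar ratings into a starting point; the 1-5 scale and the scores below are our illustrative assumptions, not Speero's actual audit scoring:

```python
# Hypothetical maturity self-assessment: rate each pillar 1-5
# (1 = ad hoc, 5 = fully mature). The scores below are made up.
PILLARS = {
    "People and skills": 3,
    "Strategy and culture": 2,
    "Data and tools": 4,
    "Process and governance": 2,
}

def summarize(scores: dict[str, int]) -> None:
    avg = sum(scores.values()) / len(scores)
    weakest = min(scores, key=scores.get)  # lowest-scoring pillar
    print(f"Overall maturity: {avg:.1f}/5")
    print(f"Start improvements with: {weakest} ({scores[weakest]}/5)")

summarize(PILLARS)  # Overall maturity: 2.8/5; start with Strategy and culture
```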
How many times have you had trouble adding user research to your experimentation program on an ongoing basis? Categorize all research initiatives into one of three categories, and you can plan your research better. Speero's Research Objectives Blueprint covers the research types you can (and ideally should) incorporate into your experimentation programs on an ongoing basis, and it helps you plan that research more effectively. At its core, the blueprint maps out three main research types:

1) Exploratory research: evaluating and exploring the overall user experience. Its goal is to identify barriers to conversion and key areas of opportunity, and to generate new ideas.
2) Focused research: digging deeper into specific issues, themes, topics, or opportunities. Sometimes you need to understand a problem better before you can fix it; this is where focused research helps.
3) Validation research: gathering data to validate concepts or solutions. As the blueprint shows, A/B testing can serve as a validation method, but you can use other validation methods too.

In the blueprint, we've mapped out typical research methods for each research type or objective. Keep in mind that some methods, like usability studies, are quite flexible, and you can use them effectively for different research objectives. We've also mapped out the typical cadence of research. The real cadence will depend on how quickly your company can act on research recommendations; factors like design and dev capacity, or traffic for A/B tests, may be outside your hands. Typically, though, Speero would expect one or two large exploratory research projects every year. You can download the Research Objectives Blueprint from our website (link in comments). Send a DM on LinkedIn to Emma Travis or Martin P. with any feedback, suggestions, or questions about the blueprint. Big thanks to blueprint creator Emma Travis!
By Peep Laja: "After hundreds of demos over the last few years, I've concluded that most B2B SaaS marketing leaders are uneducated about how qualitative research works and what specifically it's useful for. They're quietly skeptical about the sample sizes. They've been trained to turn to analytics or attribution tools instead. Here are 10 things I usually tell them about qualitative:

1. It's about the 'why,' not the 'what.' Numbers tell you what's happening. Qualitative research tells you why it's happening. Without the why, you're guessing. It's the most insightful type of information there is. Once you understand why your ICPs do this or that, it has massive implications for how you go to market.
2. Find what you didn't even know to ask. Open-ended questions are magic. People will volunteer information you didn't even know to ask about. You will learn things you didn't even know were prevalent. Sometimes seemingly superficial things make your target customers choose your competition instead.
3. Nail your messaging. You can't write messaging that resonates if you a) don't know your ICP's top pain points, and b) don't know the exact words they use to describe their problems. (It's rarely the typical jargon you see on websites.)
4. Small sample, big insights. With qualitative, the concept to know is insight saturation. In most cases, 15 people are already enough to reach saturation, getting the maximum insights possible. A tight, vetted survey panel beats a bloated data set of randoms. You don't need to spend hundreds of thousands of dollars for 1,000 responses to learn what matters.
5. Test fast, pivot faster. You don't need a huge dataset to figure out if your messaging fails, or if you lack a compelling reason to buy. Replace one big annual study with lots of small ones to stay agile.
6. It's how you innovate. People hate new ideas until they see them. Qualitative lets you refine the crazy stuff instead of killing it.
7. Focus on what actually matters. Most marketing data is noise. Qualitative gets you the signal: the real problems your ICP cares about.
8. Revenue comes from relevance. If you don't understand your ICP, you'll waste time and money building stuff they don't need or want. Talk to them, fix it.
9. Say no to fraud. Qualitative surveys with verified B2B participants keep the fraud out. You get real answers from real people. 99% of what you get from those large-scale panel companies is garbage for B2B.
10. Fast feedback = fast results. You can get meaningful insights in a day or two, and adjust course. Speed is one of the only true competitive advantages."

Original post by Peep Laja, from this week's Speero's best take of the week, where we celebrate and elevate the opinions and takes of experimentation experts.
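To illustrate the insight-saturation idea in point 4, here is a toy simulation of our own (not from Peep's post): it assumes a fixed pool of 30 distinct insights, each with some probability of surfacing in any single interview, and counts how many distinct insights you have heard after each interview:

```python
import random

# Toy saturation model. The pool size (30) and prevalence range are
# arbitrary assumptions for illustration; real saturation depends on
# your ICP, your questions, and how varied the audience is.
random.seed(42)
PREVALENCE = [random.uniform(0.05, 0.6) for _ in range(30)]

def saturation_curve(n_interviews: int) -> list[int]:
    seen: set[int] = set()
    curve = []
    for _ in range(n_interviews):
        for idx, p in enumerate(PREVALENCE):
            if random.random() < p:  # insight surfaces in this interview
                seen.add(idx)
        curve.append(len(seen))
    return curve

curve = saturation_curve(25)
for n in (5, 10, 15, 20, 25):
    print(f"after {n:2d} interviews: {curve[n - 1]} distinct insights")
# The curve flattens: most insights surface well before interview 15,
# which is the saturation effect described above.
```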
The client wanted to expand into the U.S. So we copy-tested (via Wynter) with the client's U.S. ICP to explore their emotions and perceptions of the client's website claims.

Objective: find out which elements build confidence and resonate with the U.S. audience.

Plan: one test run via Wynter, 30 panelists, 3 feedback areas.

Results:
- Overall, the page received a positive response for explaining the value proposition and what the client offers. However, the page did not clearly explain the client's uniqueness.
- The header did a good job of presenting the offering; reviewers said it was clear and concise with the bullet points. There were mixed reactions to the form in the header: some reviewers disliked the premature form and wanted to learn more before filling it in, while others liked being able to fill it in quickly, since the header was clear enough for them.
- Reviewers mentioned that the product description section lacked detailed information about features, integration, pricing, and implementation timelines. The copy within the thumbnails was too disorganized to read fully.
- Reviewers also mentioned they didn't recognize most of the awards, which made them doubt their legitimacy. They wanted more customer testimonials and reviews from bigger brands.

Recommendations:
- Test competitor-focused value props
- Test condensing/simplifying the product description section
- Test showing awards/testimonials
Marketing often focuses its experimentation on increasing conversion rates and the initial purchasing experience. But this doesn't account for the whole customer journey. Here's where marketing can use experimentation beyond simple conversion optimization:

- Test which products, creative, copy, and promotions work best for different customer segments.
- Test ad creatives or messaging before spending more on media buying.
- Test retention initiatives and loyalty programs to improve long-term retention KPIs.
- Test different email lengths, types, copy, and designs.

Focus on the whole journey, not just the initial purchase.
Here’s how you can use Speero’s experimentation audit to analyze your program’s strengths, gaps, and capabilities so you can increase testing velocity and impact.*

*You can also narrow it to analyze the same aspects of a department or a team inside your org.

After you complete it (it takes 10 minutes), you get a holistic view of your organization’s capability to understand, act on, and measure each step of your visitor’s or customer’s online lifecycle. You will also be able to develop a clearly prioritized roadmap of follow-up activities.

- Are you missing tools to measure user behavior?
- What are your team’s skill gaps?
- Is experimentation siloed?

When you understand where your organization stands and what your starting points are, you can improve your program for more test velocity and impact. Imagine your ideal experimentation situation, then set a scope that can deliver practical results and move the organization toward that ideal. Last tip: sanity-check the whole scope of improving your program. You need to ensure the original scope is realistically achievable in your organization; reformulate if you need to. The point isn’t to stick to an SOW, but to provide as much value as possible within it. Link for the audit in the comments.
To run experimentation programs, you need tools. Yet you need to keep the budget lean. Here's how to have both. In the XOS Tools Blueprint, we share the software categories you need when planning your resources, activities, and budget, and how they fit into Speero's Experimentation Maturity Curve. A BIG reminder: these are NOT the only tools and solutions available in the market, just the most common ones you might come across when doing your own research.

Assessment and Integration (Speero's Maturity Pillar)
- Qualitative data: Wynter, Hotjar, Fullstory, Userlytics, Usertesting, Maze.
- Quantitative data: GA4, Adobe Analytics, Amplitude.

Planning and Process (Speero's Maturity Pillar)
- Knowledge management: Airtable, Notion, Effective Experiments, Confluence.
- Test planning: Speero's AB Test Calculator, Figma, Adobe XD, VS Code, Sublime, Atom, Tampermonkey, GitHub, BrowserStack.

Test and Learn (Speero's Maturity Pillar)
- Experiments: Adobe Target, Optimizely, Convert, VWO, Kameleoon, AB Tasty.
- Reporting: Looker Studio, Tableau, BigQuery, GSheets/Excel.

Decision and Action (Speero's Maturity Pillar)
- Knowledge share: Slides, Docs, Canva.
- CMS: WordPress, Shopify, Webflow.

Tools that Connect the Pillars
- Alignment: Slack, Loom, Zoom, Google Chat.
- Automation: Zapier.
- Task management: ClickUp, Asana, Jira, Trello, Monday.com.

With the XOS Tools Blueprint you get a bird's-eye view of all the tool categories you need, plus enough detail to start your own search. Links for the blueprint and our calculator are in the comments. Thanks to the blueprint owner, Carlos Trujillo!
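As a flavor of what the test-planning calculators in this list do, here is a minimal sketch of a textbook two-proportion sample-size estimate; this is a standard approximation, not the actual logic of Speero's AB Test Calculator, and the baseline rate and lift in the example are made up:

```python
from statistics import NormalDist

def sample_size_per_variant(base_rate: float, rel_lift: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Visitors needed per variant to detect a relative lift `rel_lift`
    over `base_rate` (two-sided z-test approximation, illustrative)."""
    p1, p2 = base_rate, base_rate * (1 + rel_lift)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_b = NormalDist().inv_cdf(power)          # power threshold
    pooled = (p1 + p2) / 2
    n = ((z_a * (2 * pooled * (1 - pooled)) ** 0.5
          + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p2 - p1) ** 2)
    return int(n) + 1

# e.g. 3% baseline conversion, aiming to detect a 10% relative lift:
print(sample_size_per_variant(0.03, 0.10))  # ≈ 53,000 visitors per variant
```

Numbers like this make it obvious why available traffic constrains how many tests you can realistically run.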
The client’s copy concentrated heavily on their team and its ability to help, and they wanted to continue focusing on this brand image. But there were many FUDs (fears, uncertainties, and doubts) around the brand itself. The client had limited brand awareness, and the website gave the impression that the client was a distributor rather than a manufacturer. This limited brand recognition and failed to highlight the specialized (manufactured) nature of its products. Less clarity means leads may overlook the offering, lowering sales.

We wanted to improve trust and reliability by showcasing the products to leads. So we ran a branding workshop with the client to discuss how to present them to prospects and to identify what a hesitant prospect may need to know. They originally wanted to create sub-brands, but our direction was that the main brand needed strengthening before creating new ones. The workshop’s outcome was a series of statements to strengthen the company and product offering. We ran a test to confirm this.

Our hypothesis: if we redesign the homepage hero blade to showcase the products they sell, along with benefit statements that address customer pain points, we will improve brand perception for new customers, and this will increase product inquiries.

The outcome? A win! The specialized product showcase, combined with a brand-focused presentation, significantly enhanced brand perception and increased all key metrics:
- MQLs increased by 58.62% at 97% CTBC (chance to beat control).
- Email captures increased by 83.31% at 98% CTBC.

The main difference in CTA engagement is that both hero CTAs are now getting clicks. Implementing this test would generate an estimated $150K in revenue over a 6-month period. Next step: implement the variant.
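For readers unfamiliar with CTBC, here is a minimal sketch of how such a number can be estimated with a Beta-Binomial Monte Carlo; the conversion counts below are hypothetical, chosen only to roughly match the reported MQL lift, and real testing tools differ in priors and method:

```python
import random

def chance_to_beat_control(conv_a: int, n_a: int, conv_b: int, n_b: int,
                           draws: int = 100_000) -> float:
    """Estimate P(rate_B > rate_A) under Beta(1, 1) priors.
    Illustrative only; not the tool or data used in this test."""
    wins = 0
    for _ in range(draws):
        rate_a = random.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rate_b = random.betavariate(1 + conv_b, 1 + n_b - conv_b)
        wins += rate_b > rate_a
    return wins / draws

# Hypothetical: control 29/1700 MQLs vs. variant 46/1700 (a ~58.6% lift)
print(f"CTBC: {chance_to_beat_control(29, 1700, 46, 1700):.0%}")  # ≈ 97%
```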
Original post by Emily Anderson: "My workshops got 10x better when I started doing this: understanding that people's brains work differently. We spend so much time understanding users' needs, but we often forget about the people we work with.
→ What do they need?
→ How do they work best?
→ How can we be more inclusive?
→ What could they struggle with?
The truth is, workshops can create pressure and anxiety. Thinking of ideas against a timer, in a tool you've never used, then presenting to the group when you can barely draw a stickman? No thanks. People can spend the whole time panicking about whether their idea looks "good" rather than actually having space to ideate and let the ideas flow. (I've definitely felt embarrassed to present ideas I didn't think were "good enough.") Our brains are all beautifully unique, so our workshops should support that. That means ideation shouldn't be a one-size-fits-all approach. Instead, we can encourage people to ideate in whatever format suits them:
→ Draw ideas on paper and take a photo
→ Write ideas down on post-it notes
→ Create scrappy wireframes
→ Use screenshots of apps/websites as references
Anything that helps people communicate their ideas! The best workshops are the ones where everyone feels confident to share their ideas and can be heard."

Original post and picture by Emily Anderson, from this week's Speero's best take of the week, where we celebrate and elevate opinions and takes from experimentation, growth, and data experts.