
Effective Digital Marketing today requires cross-functional support. Gone are the days when different departments all stayed in their lanes, focusing only on the piece of the puzzle they were directly responsible for. Modern organisations take a more horizontal approach to providing value to their customers, creating internal collaboration across traditionally siloed functions.
Eleanna Smpokou has spent much of her career focused on split (A/B) testing. We talked to her about her experience and her approach to building collaborative teams that can respond quickly to the results of this testing.
Eleanna Smpokou | Interview
A chemical engineering graduate who started out doing project management work in Greece and Oman, Eleanna switched gears in 2013 and began her career in Digital Marketing in London. She spent four years in managerial positions at various digital marketing agencies before entering the world of product as Product Performance Manager and later Experimentation Product Owner at HelloFresh. There she developed the company’s experimentation process and worked with Product Managers to drive higher customer retention. Soon after, she joined IDAGIO, a Berlin-based classical music streaming service, as a Senior Product Manager for Growth. Most recently, she embarked on a new adventure with Cobalt, a platform offering Pentest as a Service.
Hi Eleanna, thanks for joining us. Can you start by telling us a little about your career path?
Of course! So I was in digital marketing for quite a few years, but before that, I actually studied chemical engineering, followed by an MRes in Plant Chemical Biology. I have quite an affinity for numbers, which I think always lent itself nicely to digital marketing.
I started my career in project management before transitioning over to digital marketing. I eventually moved to HelloFresh as a Product Performance Manager. I initially focused on their referral programs but soon got the opportunity to work on A/B testing. Back then we were mostly running simple A/B tests to drive incremental improvements on the conversion funnel.

We also started doing retention-related A/B testing—for instance, trying to convince a person who starts with a trial box to stay for more and more boxes. Of course, the physical part of this (the box itself) is key, but I was focused on the digital experience: how can we improve the experience of existing subscribers with our website and apps, and what metrics should we use to measure success? What makes things more complicated when it comes to retention for a subscription service like HelloFresh is that you need to run experiments for a much longer period of time to understand whether a change in a user flow can drive a statistically significant improvement in metrics such as the average number of boxes per subscriber.
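To make the statistics side concrete, here is a minimal sketch of how a retention comparison like this could be evaluated once an experiment has run long enough. The data, the metric and the 5% significance threshold are purely illustrative assumptions, not HelloFresh’s actual setup; a Welch’s t-test is just one common way to compare the means of two groups.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Illustrative data only: boxes ordered per subscriber over the experiment window.
# In a real experiment these would come from assignment and order logs.
control = rng.poisson(lam=3.1, size=5000)    # existing subscriber flow
treatment = rng.poisson(lam=3.2, size=5000)  # new flow being tested

# Welch's t-test on the mean number of boxes per subscriber.
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

lift = treatment.mean() - control.mean()
print(f"control mean:   {control.mean():.3f} boxes")
print(f"treatment mean: {treatment.mean():.3f} boxes")
print(f"lift: {lift:+.3f} boxes, p-value: {p_value:.4f}")

# Illustrative decision rule: roll out only if the lift is positive
# and statistically significant at the 5% level.
if lift > 0 and p_value < 0.05:
    print("Statistically significant improvement detected")
else:
    print("No significant improvement detected")
```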

That work was basically a deep-dive into retention-focused A/B testing involving a lot of data analysis and statistics; I must admit it somehow took me back to my days of chemical engineering and research! These were great times to be part of the Product Analytics team at HelloFresh, as we were growing very fast and the whole company was embracing this experimentation culture more and more. We went from basically just me trying to convince engineers to make some “tiny, I swear it’s going to be simple!” changes, which amounted to 2-3 experiments a month, to a team of 17 Product Analysts coordinating over 200 experiments in less than a year! I also had the pleasure of working on an initial idea for an in-house experimentation platform, which exposed me to the technical aspects of experiment setup, a whole new world for me!
How do you even go from 2-3 experiments a month to 200 in less than a year?
I was responsible for the experimentation process across the company—we had to align the different teams to work together and ensure we were all using the same language across product analytics, product management, design and engineering. There were a few components to what we did:
- Education: Across the company, teams need to understand why you’re doing A/B testing in the first place, why it’s valuable and should be the cornerstone of product development. Very often, you launch new features or make changes but you have nothing to compare them to—so you don’t even know if the change was for the better. A/B testing is crucial to validate your assumptions and drive the desired business outcomes.
- Communication: Product Designers and Engineers want to know how their work is contributing value and driving change, and if it isn’t, that is also a useful learning: our assumption has not been confirmed, so let’s discuss what we’ll do next. It’s important to keep stakeholders and your team informed on how the experiment is progressing.
- Set up your experimentation process: Map out your plan from concept to design, to monitoring and analysis, to results and next steps. When we talk about design, we’re not only referring to the actual UX design of the new flow but to the experiment setup itself: for instance, the number of users that need to be assigned to each test group, the period of time we should keep tracking performance, user groups that need to be excluded, which platforms the test should run on, and so on. You need to involve people from data analytics and data science in the process, or you risk misinterpreting the results or designing a bad experiment in the first place.
- Pipeline: Building out a testing pipeline is crucial if you’re running multiple tests at the same time. If there’s not enough traffic or tests are clashing with each other, prioritise the initiatives to be tested. Product Managers are typically responsible for this prioritisation, as they are the ones who put together the business case and know which tests have a stronger potential to move the needle or make more sense to start with from a RICE (Reach – Impact – Confidence – Effort) perspective (see the scoring sketch after this list).
- Roll-out Plan: Have a plan in place for successful tests that can be rolled out to wider audiences. For example, you might have only tested on iOS in the DACH market. How do you tackle other platforms and geographies? Is it safe to assume this will work there as well or should you test separately?
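To make the RICE prioritisation from the pipeline step above a little more concrete, here is a small scoring sketch. The initiatives, numbers and scales are made up for illustration and are not drawn from any real backlog.

```python
from dataclasses import dataclass

@dataclass
class Initiative:
    name: str
    reach: float       # e.g. users affected per quarter
    impact: float      # e.g. 0.25 minimal, 0.5 low, 1 medium, 2 high, 3 massive
    confidence: float  # 0.0-1.0: how sure we are about reach and impact
    effort: float      # e.g. person-months

    @property
    def rice(self) -> float:
        # RICE score = (Reach x Impact x Confidence) / Effort
        return self.reach * self.impact * self.confidence / self.effort

# Hypothetical backlog items, purely for illustration.
backlog = [
    Initiative("Simplify checkout step", reach=40_000, impact=1.0, confidence=0.8, effort=2),
    Initiative("New onboarding survey", reach=15_000, impact=2.0, confidence=0.5, effort=3),
    Initiative("Redesign pricing page", reach=60_000, impact=0.5, confidence=0.9, effort=1),
]

# Run the highest-scoring experiments first when traffic is limited.
for item in sorted(backlog, key=lambda i: i.rice, reverse=True):
    print(f"{item.name:<25} RICE = {item.rice:,.0f}")
```

Sorting by score gives a rough starting order for the testing pipeline; in practice it would be weighed against available traffic and clashes between running experiments, as described above.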
Can you tell us more about IDAGIO and the audience it serves?
Idagio is the leading classical music streaming service, with a catalogue of over 2 million tracks from just under 2,000 labels and rights holders. We have surpassed 2 million app downloads and we’re available in over 190 countries, including Japan, which launched earlier this year.

The focus on classical music comes from our founders, Till and Christoph. Till has 20 years’ experience in the industry as an artist manager, producer and concert promoter. Christoph is the tech-savvy one, a digital and startup guru. He also founded the music-streaming service Simfy back in 2006, at the age of just 22.
They both recognised that despite there being a lot of music streaming apps, there’s never been one specifically tailored to the needs of classical music fans. These fans are different in the way they search for and listen to music. For example, it’s not just the piece and the composer that are important, but also the conductor, the ensemble, the soloists… They want to be able to search by instrument, classical music genres and periods. And they also want very high sound quality, which could be of less importance in other music genres.
So, Idagio was born, to offer these features along with expert curation from a team of specialists as well as external partners.

In terms of our audience, we basically categorise our users into two big groups: aficionados and enthusiasts. Aficionados know exactly what they are after and most of the time have vast classical music collections at home. Enthusiasts, on the other hand, are here to build their knowledge—they love recommendations and to be taken on a journey.
What did your role specifically entail?
I joined as a product manager focusing specifically on growth and worked on initiatives such as the onboarding flow for the native apps, the re-design of our landing page, referral programmes and gift voucher options. Perhaps the most challenging and rewarding project was Idagio Live, which launched in April: a new online space giving classical artists a platform where they can stay connected to their audiences during the difficult times of COVID. It started with a few artists talking about their work and engaging with the community. In May, we started facilitating online concerts. Tickets were sold on the platform and people could attend virtually. We included a little Q&A session at the end, which went down really well—the audience would submit their questions and the artist would respond there and then. Today, we’re proud to call this the Global Concert Hall, where classical music lovers can attend live online concerts by their favourite artists wherever they are around the world.

Given that IDAGIO is still pretty young, what’s the organisational setup like? Is it a case of “all hands on deck”?
We’ve basically moved away from having dedicated units, teams set in stone. Instead, we have “swimlanes”; each one represents a different product focus, for example new user activation, premium subscriber experience, or the concert experience. Product Managers are assigned at the swimlane level, but Designers and Engineers can move around as required, depending on the project and their expertise. This requires regular alignment meetings to plan and allocate resources for the upcoming “cycles”. Our product process is inspired by Basecamp’s Shape Up. This has allowed us to be faster and more efficient, as well as more adaptable to changes in the overall landscape (a big example being COVID and how it affected music events). Finally, the Product Management, Product Design and Engineering leads facilitate collaboration and foster alignment and consistency across the product team, ensuring we’re all working together towards the set product vision.