Playing with Patterns: Data Science for Digitising Events
The future of events lies in harnessing data science, predictive analytics, and AI to optimise engagement. As the volume of data generated by digitised events grows, understanding and leveraging it can transform how we plan, execute, and enhance participant experiences.
The future of events will unfold in the digital arena when we link data with predictive analytics and artificial intelligence, using science both to design for and deliver high engagement and to turn event data into value in its own right.
We are witnessing an unprecedented explosion of data. According to a study by IDC, the total volume of data created, captured, copied, and consumed worldwide is forecast to grow from 64.2 zettabytes in 2020 to an astonishing 180 zettabytes by 2025. This data deluge is especially evident in digitising events – in-person events, webinars, virtual conferences, or online workshops. The data that can be linked to a single registration alone – a sign-in, a click, a download, online behaviour – adds to these waves upon waves of data.
On one hand, it presents an opportunity, an untapped vein of insights waiting to be discovered. On the other, it poses a significant challenge for data specialists, tech teams, and management personnel: How can we make sense of this vast and varied data to draw meaningful conclusions, deliver actionable insights and make data-driven decisions? The answer lies not in the traditional confines of structured data analysis but in pattern recognition's playful and creative realm, a skill long associated with human intelligence and creativity.
In Digitising Events, this means tapping into participant behaviour and interactions to plug the gaps in our operations and empower ourselves to respond to emerging trends. We can now use predictive modelling to predict attendance down to an individual level, even for a particular seminar, without the need for participants to register for everything individually. With such information, we can address questions well before an event goes live: Will this webinar have an interactive audience? What is the likelihood of a sponsor having a positive experience? Which participants will have a positive return on time invested?
Rethinking our approach to data with a data science mindset will solve real-world problems, like modelling participant behaviour in advance of events so we can address issues before they happen.
For many events companies, data work is disproportionately focused on structured data and keeping it clean, not on unlocking value. We believe that access to advanced predictive modelling, behavioural analytics, and machine learning models will make data a core value proposition for events and place it at the centre of decision-making and product development.
Data as a New Language for Digitising Events
We've long understood data in the conventional sense: a nice structured box to slot names and addresses into. Yet some of the most valuable data points come at you in an unstructured avalanche that is impossible to organise this way, and events businesses have traditionally largely ignored them. For many, Google Analytics is not part of the solution but something to dip into to generate reports. We must start to mine this data for insight and make it the core of smart decision-making, leveraging unique event analytics as proposed by the IAEK Framework.
How do we find the golden insights hidden in Digitised Events in a world drowning in data?
Structured data is the familiar kind – registration fields, ticket types and session selections that sit neatly in rows and columns. Unstructured data, by contrast, lacks a pre-defined data model or is not organised in a predefined manner, so it can seem daunting. This type of data is typically text-heavy and may contain dates, numbers, and facts. Examples from a well-digitised event might include social media posts, video and audio recordings of sessions, and discussion threads. This data is often captured and stored in data lakes or object storage and can then be processed with tools like natural language processing (NLP) to extract valuable insights.
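To make that concrete, here is a minimal sketch of pointing one such NLP tool at unstructured event text, assuming NLTK's VADER sentiment analyser; the sample posts are hypothetical.

```python
# A minimal sketch of mining unstructured event text, assuming NLTK's VADER
# sentiment analyser. The sample posts are hypothetical.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download

posts = [
    "The keynote on predictive analytics was brilliant!",
    "Audio kept dropping in the breakout room, very frustrating.",
    "Great networking session, met some genuinely useful contacts.",
]

sia = SentimentIntensityAnalyzer()
for post in posts:
    score = sia.polarity_scores(post)["compound"]  # -1 (negative) to +1 (positive)
    print(f"{score:+.2f}  {post}")
```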
Finally, semi-structured data contains aspects of both. It's not as rigid as structured data, but it contains tags or other markers to enforce hierarchical groups of records and fields within the data. Examples include XML, JSON files, or email data. In digital events, chat logs, email communications, questions at sessions or user-generated tags could fall into this category.
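As a small illustration of how semi-structured data can be put to work, the sketch below flattens a hypothetical JSON chat-log export into rows a downstream analytics pipeline could consume; the field names are illustrative assumptions.

```python
# A minimal sketch of handling semi-structured data: a hypothetical JSON
# chat-log export is flattened into (user, session, tag) rows.
import json

raw = """
[
  {"user": "A-102", "session": "Opening Keynote", "ts": "2024-05-01T09:12:00",
   "message": "Will the slides be shared afterwards?", "tags": ["slides", "keynote"]},
  {"user": "B-311", "session": "Data Science Clinic", "ts": "2024-05-01T10:05:00",
   "message": "How do you handle sparse attendance data?", "tags": ["modelling"]}
]
"""

rows = []
for entry in json.loads(raw):
    for tag in entry.get("tags", []):  # the tags are the markers that give the data its structure
        rows.append((entry["user"], entry["session"], tag))

print(rows)
```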
Collectively, these different types of data provide a rich trove of information about participant behaviour and interest at a Digitised Event. They offer a wealth of insights, from preferences and interaction patterns to feedback and discussion themes, all of which hold significant value in understanding and enhancing the participant experience and will take us away from traditional event analytics. This diversity and richness underlines why it's crucial to shift our perspective and view data in all its forms as valuable building blocks rather than baggage.
Instead of focusing on taming this data tsunami, we focus on learning its language, understanding its nuances, and ultimately unveiling the intricate patterns hidden within. This shift in perspective doesn't negate the complexities of data; instead, it equips us with a more effective way to approach and derive value from it.
Data Science: The Key To Leveraging Digitised Events
When we talk about digitising events, we mean a highly interconnected digital wrapper that facilitates each participant's engagement with and experience at an event. We don't do this just to make the customer journey better or to have integrated marketing campaigns. We do this because we can get deeper insights if we have access to more data, which is aligned with the goal of increasing IAEK for each participant.
Treat data not as a relentless storm but as a treasure trove of signals. The future of events hinges on this shift in perspective.
Data science is our guide, providing the principles and methodologies to solve this complex problem. For our seasoned players, this realm is not unfamiliar. Still, we aim to illuminate its significance in the specific context of Digitising Events – how honing our pattern recognition skills gives our business predictability, stability and actionable insight before it's too late.
Breaking Down The Analytics Process
Analytics can broadly be thought of as four interrelated stages, each serving a distinct purpose: descriptive, diagnostic, predictive, and prescriptive.
Descriptive analytics answers the 'what happened' question. It involves examining historical data to identify patterns and trends. It might mean analysing the number of attendees, the average session duration, and the most popular features or assessing the overall customer experience through feedback forms.
A good example of this would be the post-event report compiled by most event teams.
Diagnostic analytics moves us a step further, investigating 'why did it happen'. It identifies correlations and patterns to determine causality. For instance, did a specific keynote speaker result in higher attendance? Did certain content consistently underperform? Or was a technical glitch responsible for participant drop-offs on webinars?
While both descriptive and diagnostic analytics have value, they usually tell us things when it's too late, especially for live events.
Predictive analytics, as the name suggests, is all about forecasting 'what might happen'. Using machine learning and statistical algorithms allows us to predict future behaviour based on past data. In Digitising Events, it can predict trends like participant engagement levels, popular topics, or potential technical issues.
Predictive analytics is used in The DiG, for example, to predict attendance using a logistic regression machine learning model.
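A minimal sketch of what such an attendance model might look like, assuming scikit-learn; the features and data are illustrative, not the actual model used in The DiG.

```python
# A minimal sketch of predicting individual seminar attendance with logistic
# regression, assuming scikit-learn. Features and data are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Per-participant features: [past sessions attended, email clicks, topic-match score]
X = np.array([
    [5, 3, 0.9],
    [0, 1, 0.2],
    [2, 0, 0.6],
    [7, 5, 0.8],
    [1, 2, 0.3],
    [4, 4, 0.7],
])
y = np.array([1, 0, 0, 1, 0, 1])  # 1 = attended the seminar, 0 = did not

model = LogisticRegression().fit(X, y)

# Probability that a new registrant turns up to this particular seminar
new_participant = np.array([[3, 2, 0.75]])
print(model.predict_proba(new_participant)[0, 1])
```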
This is where the real power of pattern recognition comes into play, allowing us to construct predictive models that guide audience acquisition and investment – models that are relevant, actionable and deliver a competitive edge.
Prescriptive analytics builds on these predictive models and offers advice on 'what should we do'. By simulating different scenarios, it provides recommendations on the optimal course of action. For example, it could suggest the best time to schedule an event to maximise attendance, the best format for a given IAEK outcome, and the optimum time frame to activate a key channel in your digital marketing and help optimise audience acquisition.
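To illustrate the 'what should we do' step, here is a minimal sketch of prescriptive analytics as scenario simulation: candidate start times and reminder schedules are scored against a toy attendance model and the best combination is recommended. The response surface is a hypothetical stand-in for a fitted predictive model.

```python
# A minimal sketch of prescriptive analytics: simulate candidate scenarios and
# recommend the best one. The attendance function is a hypothetical stand-in
# for a fitted predictive model.
def predicted_attendance(start_hour: int, reminder_days_before: int) -> float:
    """Toy response surface: mid-morning slots and roughly 3-day reminders do best."""
    time_effect = max(0.0, 1.0 - abs(start_hour - 10) * 0.1)
    reminder_effect = max(0.0, 1.0 - abs(reminder_days_before - 3) * 0.15)
    return 500 * time_effect * reminder_effect  # expected live attendees

scenarios = [(hour, days) for hour in range(8, 18) for days in range(1, 8)]
best_hour, best_days = max(scenarios, key=lambda s: predicted_attendance(*s))
print(f"Recommended: start at {best_hour}:00, remind {best_days} days before "
      f"(~{predicted_attendance(best_hour, best_days):.0f} expected live attendees)")
```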
Recognising these different types of analytics allows us to tailor our approach depending on our needs, whether we're looking to understand past performance, diagnose service design issues, predict future trends, or determine strategic actions. The ability to turn raw data into actionable insights, informed decisions, and leveraged opportunities drives the overall success of Digitised Events, embodying the essence of business sanity. So, how do we put this data science into action for data-driven, smart decision-making?
Data Science and Digitising Events in Practice
Stepping away from individual tools and techniques, which we already elaborated on in our previous article, let's examine the unique interplay of these capabilities in the context of Digitising Events.
Seeing strategies unfold in the real world adds a layer of practicality to our game of pattern recognition. Let's consider two case studies showcasing how these data science principles, tools, and techniques have been creatively applied to digital events.
Beyond the numbers and algorithms, data tells a story—a narrative of patterns, behaviours, and possibilities critical for the advancement of Digitised Events.
Case Study: Maximising Engagement at a Conference With Data Science
One of the challenges of delivering virtual events is that most are still orchestrated using the old attention-by-default assumptions. We know that the old way of best-guess sessions and solution demos is creating diminishing returns on attendance and engagement, driving more and more on-demand attendance as opposed to live attendance.
By leveraging behavioural data, participant questions from webinars and customer reviews – together with customer sentiment analysis and structured customer data appended to prioritisation scorecards – we can rapidly develop a live content matrix as a scaffold for the event. Advanced analytics let this matrix tune into the real, in-the-moment needs of your target participants.
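One way such a prioritisation scorecard might be assembled is sketched below, assuming pandas; the topics, signals and weights are hypothetical.

```python
# A minimal sketch of a prioritisation scorecard feeding a live content matrix,
# assuming pandas. Topics, signals and weights are hypothetical.
import pandas as pd

signals = pd.DataFrame({
    "topic": ["Predictive analytics", "Sponsorship ROI", "Hybrid formats"],
    "webinar_questions": [42, 11, 27],        # questions asked on each topic
    "review_sentiment": [0.62, 0.15, 0.48],   # mean sentiment score, -1 to +1
    "account_priority": [0.9, 0.7, 0.5],      # from structured customer data
})

weights = {"webinar_questions": 0.4, "review_sentiment": 0.3, "account_priority": 0.3}

# Normalise each signal to 0-1, then blend into a single priority score
for col, weight in weights.items():
    signals[col + "_norm"] = signals[col] / signals[col].max()
signals["priority"] = sum(w * signals[c + "_norm"] for c, w in weights.items())

print(signals.sort_values("priority", ascending=False)[["topic", "priority"]])
```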
Subject matter experts can then use this live matrix to write solutions and knowledge paragraphs aligned to it, and large language models can be used to generate session abstracts for the event.
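As a rough sketch of that step, the example below drafts one abstract with the OpenAI Python client (version 1 or later); the model name, prompt and source paragraph are placeholders rather than the tooling used in the case study.

```python
# A minimal sketch of drafting a session abstract with a large language model,
# assuming the OpenAI Python client (>= 1.0). Model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

topic = "Using predictive analytics to forecast live attendance"
knowledge_paragraph = (
    "Logistic regression over past registration and engagement data lets "
    "organisers estimate per-seminar attendance before the event goes live."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You write concise conference session abstracts."},
        {"role": "user", "content": f"Topic: {topic}\nSource material: {knowledge_paragraph}\n"
                                    "Write a 60-word session abstract."},
    ],
)
print(response.choices[0].message.content)
```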
We go one step further: as session abstracts are published, artificial intelligence and machine learning techniques continually augment them based on behavioural interactions, improving and fine-tuning the content.
Utilising real-time analytics, we monitor participant interactions throughout the event, quickly identifying popular sessions and high-traffic periods. Simultaneously, machine learning algorithms analyse this real-time data, identifying patterns and feeding them back to a presenter dashboard to be incorporated into upcoming sessions.
Using NLP tools on chat logs in the app and discussion threads, we further identify hot topics and common queries which can be added on the fly. This allows the organisers to dynamically adjust session content and FAQs, providing real-time responses to participants' interests and concerns. The result: record engagement levels and highly positive participant feedback.
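A minimal sketch of that kind of hot-topic mining, using nothing more sophisticated than keyword counting after stop-word removal; the chat messages are hypothetical.

```python
# A minimal sketch of surfacing hot topics from chat messages: simple keyword
# counting after stop-word removal. The messages are hypothetical.
from collections import Counter
import re

STOP_WORDS = {"the", "a", "an", "is", "are", "to", "of", "and", "in", "on", "with",
              "our", "for", "we", "you", "it", "this", "that", "be", "can", "how", "do"}

messages = [
    "How do we integrate the attendance data with our CRM?",
    "Is the CRM integration covered in the afternoon session?",
    "Can we export engagement data to the CRM automatically?",
]

tokens = []
for msg in messages:
    tokens += [w for w in re.findall(r"[a-z]+", msg.lower()) if w not in STOP_WORDS]

print(Counter(tokens).most_common(5))  # 'crm' and 'data' rise to the top here
```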
Case Study: Optimising Information Exchange in “The 150 in 2”
"The 150 in 2" presents a unique challenge: to create a significant shift in domain understanding, information exchange, and actionable outcomes in a condensed, online event for 150 people instead of a 2-day in-person event. With such a concentrated timeframe and audience size, every interaction counts, making data science crucial for optimisation.
Using prescriptive analytics, we can determine how to optimise and tune events to maximise information exchange and knowledge gain. By analysing user profiles, past participation data, and survey responses, the system identifies key areas of interest and knowledge gaps, as well as potential connections between participants. This information allows us to tailor the content on the fly to address those knowledge gaps.
During the event, real-time analytics would come into play, monitoring participant interaction, engagement levels, and feedback in real-time. This data would feed into predictive models that forecast participant behaviour, enabling organisers to adjust the event flow dynamically. For instance, if a particular discussion generates high engagement, the schedule could be tweaked to extend that session, maximising the value for participants.
To facilitate impactful information exchange, data science is used to generate 'dynamic clusters' of participants inspired by Open Space Technology, the format/method developed by Harrison Owen in 1985.
By analysing real-time interaction data with machine learning, the system identifies participants with synergistic interests or complementary expertise and forms them into breakout groups or discussion threads.
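A minimal sketch of how such dynamic clustering might work, assuming scikit-learn's k-means over per-participant interest vectors; the interest scores are hypothetical.

```python
# A minimal sketch of forming 'dynamic clusters' of participants with k-means,
# assuming scikit-learn. Interest scores are hypothetical.
import numpy as np
from sklearn.cluster import KMeans

# Each row: one participant's interest scores for
# [AI, sponsorship, hybrid formats, data privacy]
interests = np.array([
    [0.9, 0.1, 0.2, 0.8],
    [0.8, 0.2, 0.1, 0.9],
    [0.1, 0.9, 0.8, 0.2],
    [0.2, 0.8, 0.9, 0.1],
    [0.5, 0.5, 0.4, 0.6],
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(interests)
for participant, group in enumerate(kmeans.labels_):
    print(f"Participant {participant} -> breakout group {group}")
```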
Post-event, prescriptive analytics give an instantaneous assessment of the event's effectiveness, going beyond structured participant feedback to examine session engagement data and behavioural signals around defined CVIs.
This strategic application of data science, tailored to a format like "The 150 in 2", demonstrates how analytics can optimise the event experience, drive understanding, facilitate information exchange, and produce actionable outcomes in a condensed time frame.
What the case studies show is that data science is only useful if you tune it to a clear outcome, as defined by the IAEK Framework™, to drive and increase participation at events.
The Power of Playing with Patterns for Digitised Events
Future events will hinge on how their individual data oceans are channelled into artificial intelligence and machine learning models that are continuously refined through deep learning, enhancing decision-making around event formats, flows and architecture.
As we draw this exploration to a close, let's revisit the challenge we started with: the monumental task of making sense of the vast, varied, and rapid-fire data generated when we Digitise Events. We've journeyed from the traditional view of data as an intimidating deluge to a new perspective where data is an intricate puzzle waiting to be solved, its value unlocked beyond traditional outcomes like refining your marketing channel and directed to where the true value lies: maximising participation and connecting people.
In this game of "playing with patterns", we've learned to view data science not merely as a technical discipline but as a creative act. We've examined how different types of data generated in Digitised Events serve as fundamental building blocks, and how the techniques of data science – regression analysis, logistic regression and other statistical methods – act as tools in our new playbook. The case studies illuminated how these principles can be applied to real-world scenarios, transforming event experiences.
Embracing this perspective of data science can lead to invaluable insights, more effective strategies, and enhanced participant experiences in Digitised Events. As we continue to explore, remember that data is not just numbers and facts; it's a story waiting to be told, a pattern waiting to be discovered. The keys to unlocking these patterns lie in the strategic application of data science in Digitising Events.