How AI Power Is Going to Impact Events: Joe Colangelo & Adam Malik UNSCRIPTED Podcast
By Adam Malik
In this podcast, I hosted Joe Colangelo, CEO of Bear Analytics. We had a great time exploring the transformative role of AI, particularly large language models (LLMs), in the events industry. Joe and I have both been implementing and experimenting with use cases for LLMs in events, which means this podcast shares lessons grounded in real-world applications.
But before I go any further, one of the questions I asked Joe was about the hype of GPTs, and I would 100% echo his answer.
“So this stuff is, it's nothing short of incredible and it's only getting better.” - Joe Colangelo
One of the big wins is the efficiency and accuracy of LLMs in processing vast amounts of data, which is crucial for creating personalised and engaging experiences for all participants.
While this is true, we also touched on the fact that if we are to enable all this potential, it is incredibly important to have a robust data strategy. Without it, leveraging AI to its full potential becomes challenging. Joe shared some innovative use cases, such as personalised participant journeys, which really highlight the practical applications of AI.
We also examined the balance between data collection and user experience. Both of us agreed that it is important to justify the data we collect by providing improved experiences in return, a point that also came up in another podcast with Tamar Beck.
In this article, I wanted to explore some of the key points from our podcast discussion and ‘delve’ a bit deeper—just kidding.
AI: The New Event Management Powerhouse
Joe and I reflected on how large language models (LLMs) like ChatGPT can be transformational. Joe rightly pointed out that while LLMs alone might not have specific domain knowledge, they become invaluable when trained on industry-specific content.
One example he suggested is feeding the exact transcripts and even PowerPoint slides from previous events into an LLM to create a proprietary knowledge store. This could allow organisers to query past content and even enable prospective audiences to understand what speakers like Tim Ferriss or Scott Galloway or, indeed, any of your speakers have said on particular subjects in the past.
Organisers have long dreamed of leveraging their digital archive, but this has almost always failed because turning PowerPoint decks and transcripts into information, and as a consequence knowledge, is a lot of work.
For that reason, Joe and I were in total agreement that with this technology so cheaply and readily available, one of the most exciting opportunities lies in leveraging the vast multimedia archives that event organisers have accumulated over the years.
In the past, these archives ended up as underutilised download portals that no one wanted to pay for. As LLMs become more multi-modal, you can now decode entire videos, giving organisers the ability to create segmented, searchable, and monetisable content. This not only enhances the value for participants but also opens new revenue streams for B2B event producers, especially if you go further and integrate your webinar content into this industry knowledge pool, which has the potential to create real business value from the data.
We also explored another use case, and Joe shared an example of using AI to streamline CRM data. By employing a simple prompt, he managed to categorise tens of thousands of job titles in just 90 minutes—a task that would have been laborious and time-consuming a few years ago.
This creates value because job titles come in many shapes and sizes, and participants are keen to state them however they like. With LLMs, job titles like C.E.O, CEO, Entrepreneur in Residence, and so on become easier to categorise and derive value from.
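To see why the long tail of free-text titles defeats hand-written rules, here is a minimal Python sketch (the lookup table and categories are invented for illustration):

```python
import re

# A hand-written lookup only covers the variants you anticipated.
CANONICAL = {
    "ceo": "C-Suite",
    "chief executive officer": "C-Suite",
    "cto": "C-Suite",
    "entrepreneur in residence": "Founder / Investor",
}

def categorise(title: str) -> str:
    # Collapse punctuation and case so "C.E.O" and "CEO" match.
    key = re.sub(r"[^a-z ]", "", title.lower()).strip()
    return CANONICAL.get(key, "Uncategorised")

print(categorise("C.E.O"))          # C-Suite
print(categorise("Growth Hacker"))  # Uncategorised: the gap an LLM fills
```

A lookup like this only catches variants someone thought to list in advance, which is exactly the gap an LLM closes at scale.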
AI can be set up quickly to handle repetitive tasks like the one described above, completing in under an hour a job that once took weeks. Teams can then focus on the more strategic aspects of event management.
Combining all this, AI's ability to identify patterns and provide personalised content recommendations based on unstructured data is a game-changer, making events more relevant and engaging for attendees.
“Cleaning your data is not a data strategy” - Adam Malik
While these examples are achievable today, I did highlight the importance of a robust data strategy to fully activate AI's potential. While LLMs can, for example, clean and structure data, the foundation needs to be solid. I believe it also needs to integrate tools like GA4, since tracking online and onsite behaviours can create a comprehensive matrix of attendee interests and actions.
This data can then be utilised to create value for all participants.
As Joe puts it:
“You actually need to do the math for the LLM and then add the context of what was done, and then it'll go through and explain it and put it into word form. Maybe build a chart, build a table the way that you want it to. And you will feel that magic come back.” - Joe Colangelo
Unlocking Efficiency: The Real-World Applications of LLMs
Going back to Joe’s point about categorising job titles using an LLM: in the past, this task was laborious and required multiple people and extensive industry knowledge. With the LLM, Joe was able to categorise tens of thousands of job titles in about 90 minutes using a simple prompt.
When sharing how he created this result, Joe shared the concept of "few-shot prompting," a technique that enhances the model's performance by providing a few examples within the prompt. For instance, when categorising job titles like CTO, CSO, and CIO under the "C Suite," adding a few examples like "CEO equals Chief Executive Officer" can drastically improve accuracy.
What is Few-Shot Prompting?
This method, supported by platforms like OpenAI, lets you get much better results from a little extra input and makes a significant difference in output quality.
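As a rough sketch of what assembling a few-shot prompt looks like in practice (the example pairs, category names, and prompt wording here are my own illustration, not Joe's exact prompt):

```python
# Example pairs shown to the model before the real queries.
EXAMPLES = [
    ("CEO", "C-Suite"),
    ("Chief Executive Officer", "C-Suite"),
    ("Marketing Manager", "Marketing"),
]

def build_prompt(titles: list) -> str:
    # A few worked examples, then the titles we actually want categorised.
    shots = "\n".join(f"Title: {t} -> Category: {c}" for t, c in EXAMPLES)
    queries = "\n".join(f"Title: {t} -> Category:" for t in titles)
    return (
        "Categorise each job title into one category.\n"
        f"{shots}\n{queries}"
    )

print(build_prompt(["C.S.O", "CIO"]))
```

The assembled prompt would then be sent to an LLM API; the handful of example lines is what steers the model towards consistent, accurate categories.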
The ability to quickly and accurately categorise data, extract meaningful context, and personalise experiences can revolutionise how businesses operate. As Joe pointed out, the ease and speed of these models mean we're only scratching the surface of their potential.
While we did not explicitly discuss it on the podcast, in my view, the deep personalisation of participant journeys around events with trained models beyond just logistical knowledge is where the real magic will come.
Leveraging AI with a Solid Data Strategy
First of all, data is just a foundation, and as Joe puts it:
"You need to think about where the strength currently lies... you really have to do a great job of contextualizing the numbers that you're feeding into the LLM" - Joe Colangelo
Stripping it down, I believe the two key areas that will give event owners a competitive advantage are creating a proprietary knowledge store and utilising multimedia—these two enable nearly all the use cases Joe and I discussed.
Inspired by this episode of UNSCRIPTED, here is my mini checklist of what your AI-ready data strategy should include. Most importantly, you should have a very clear Customer/Commercial Value Outcome (CVO) to guide your efforts, especially if you are non-technical.
Your AI-Ready Data Strategy Checklist
If the aspiration is to activate AI in a way that creates business value, then your teams will need an understanding of these areas to a greater or lesser degree. How you execute them is up to you. I am sure the team at Bear Analytics would be more than happy to help.
Training and Fine-Tuning LLMs for Maximum Impact
In the podcast, we discussed the importance of training and fine-tuning large language models (LLMs). Joe and I explored the necessity of feeding these models specific data to enhance their performance.
As mentioned previously, simple things like the few-shot prompting examples Joe shared can have an immense impact; however, training goes beyond that. What is more exciting is that in the space of 12 months, the ability to train and fine-tune has become much less technical, and this trend is likely to continue.
On this topic, Joe mentioned the use of vector databases to create 'memories' for these models, enabling them to provide more contextually accurate responses. This ongoing training, from both end-users and developers, helps refine the model's output, making it more reliable, better tailored to specific needs, and grounded in industry knowledge.
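To make the 'memories' idea concrete, here is a toy Python sketch of the retrieval step. A real system would use an embedding model and a vector database; the snippets and three-dimensional vectors below are made up purely for illustration:

```python
import math

# Toy 'memory' store: each past-session snippet with a pre-computed embedding.
MEMORIES = [
    ("Keynote on event analytics", [0.9, 0.1, 0.0]),
    ("Panel on sponsorship sales", [0.1, 0.8, 0.2]),
    ("Workshop on AI personalisation", [0.7, 0.2, 0.6]),
]

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def recall(query_vec, k=1):
    # Return the k snippets most similar to the query, to ground the LLM's answer.
    ranked = sorted(MEMORIES, key=lambda m: cosine(query_vec, m[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(recall([0.8, 0.1, 0.5]))  # ['Workshop on AI personalisation']
```

The recalled snippets are prepended to the user's question, which is how the model appears to 'remember' your event's past content.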
Personalisation in Events: The AI Revolution
For the longest time, personalisation at events has been driven by structured data, which, in my view, has hindered it. A significant part of our discussion was devoted to the potential for AI to personalise the event experience. Joe and I explored how analysing attendance data and profiling attendees can significantly tailor content and messaging.
Joe highlighted the struggle to balance timely content with the audience acquisition marketing cycle. With AI, organisers can map who attends, their demographic data, and the metadata that defines them. This enables a rapid quantification of content types and tags, which can then be used to personalise individual profiles effectively based on unstructured and behavioural data.
Joe was particularly excited about AI's predictive capabilities. He described a scenario where data from various interactions—whether attending a session, downloading a white paper, or participating in a webinar—can be used to predict and recommend future content.
For example, if someone has shown interest in analytics and AI, they could receive personalised recommendations for sessions at upcoming events that align with these interests. This dual use of data, both informative and predictive, allows for a much more tailored messaging strategy.
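A simple sketch of this kind of content-based recommendation (the session names, tags, and the attendee's interest history are all invented for illustration):

```python
# Tags inferred from past interactions: sessions attended, downloads, webinars.
HISTORY = {"analytics", "ai"}

# Upcoming sessions, each labelled with topic tags.
SESSIONS = {
    "Scaling Event Data Pipelines": {"analytics", "engineering"},
    "Generative AI for Organisers": {"ai", "events"},
    "Sponsorship Negotiation 101": {"sales"},
}

def recommend(history: set, sessions: dict) -> list:
    # Rank sessions by how many tags overlap the attendee's history;
    # drop anything with no overlap at all.
    scored = {name: len(tags & history) for name, tags in sessions.items()}
    return [n for n, s in sorted(scored.items(), key=lambda kv: -kv[1]) if s > 0]

print(recommend(HISTORY, SESSIONS))
```

In practice, an LLM would do the tagging of unstructured content; the ranking step itself can stay this simple.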
We also discussed the historical challenges of personalisation in events. Previously, the lack of integrated data flow made it difficult to create a truly personalised experience. I pointed out how structured data often misses the mark, as people's job titles don't always reflect their current interests or needs. By integrating on-site behaviour with online and webinar interactions and using tools like GA4 to pull raw data into BigQuery, we can create a comprehensive matrix that significantly enhances personalisation.
This matrix could allow organisers to tag and structure data effectively, enabling a more personalised approach to live events.
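A toy sketch of such a matrix, built from a merged event stream (the field names and channels are illustrative; a real pipeline would read GA4's raw export from BigQuery):

```python
from collections import Counter, defaultdict

# Merged signals per attendee across online, onsite, and webinar channels.
EVENTS = [
    {"attendee": "a1", "channel": "web",     "topic": "ai"},
    {"attendee": "a1", "channel": "onsite",  "topic": "ai"},
    {"attendee": "a1", "channel": "webinar", "topic": "analytics"},
    {"attendee": "a2", "channel": "web",     "topic": "sales"},
]

def interest_matrix(events):
    # attendee -> topic -> interaction count, regardless of channel.
    matrix = defaultdict(Counter)
    for e in events:
        matrix[e["attendee"]][e["topic"]] += 1
    return matrix

m = interest_matrix(EVENTS)
print(m["a1"].most_common(1))  # [('ai', 2)]: a1's strongest interest
```

Once signals from every channel land in one structure like this, tagging, segmentation, and personalised messaging all become straightforward queries over it.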
The Great Registration Form Debate: Less is More or More is Better?
We inevitably found ourselves in a spirited debate about the ideal approach to registration forms - it seems no discussion of data and events is complete without this examination.
Joe firmly believes that collecting detailed data during registration is a golden opportunity that should not be missed. He argues that live events are perfect for gathering accurate first-party data, as attendees are less likely to provide incorrect information, such as their job titles. This accuracy is invaluable for tailoring the event experience and ensuring that the collected data serves a meaningful purpose.
Joe's stance is rooted in the belief that attendees are willing to provide detailed information if they perceive a direct benefit. He points out that people are generally happy to share their data if it enhances their experience. This means that if organisers can demonstrate how the collected data will be used to improve the event, attendees are more likely to comply. For instance, if a registration form explained that the data would be used to personalise content recommendations or networking opportunities, attendees might appreciate the extra steps involved.
While I agreed with this point, I highlighted that if organisers (as is often my experience) ask for detailed information but fail to utilise it to enhance the attendee experience, it can lead to frustration.
My feeling is that the value exchange is important. We must ensure that each piece of data collected clearly benefits the attendee and is not just there as a tick box for the event sponsors. If they demand a data point, they, too, have a duty of care to use it to maximise the participant's ROTI (Return on Time Invested).
We did agree that while simplifying registration forms can reduce barriers to entry, it might also mean missing out on valuable data that could significantly improve the event experience. The key is to strike a balance between ease of registration and the richness of the data collected to drive actionable value.
“And here's why: because we as humans have proven time and time again that we will gladly hand over our information if you dazzle us on the experience. The experience side, to your point, has been waning; it has not been delivered upon.” - Joe Colangelo
Envisioning AI's Future in Event Experiences
Towards the end of the podcast, I asked Joe where, if he had a blank cheque, he would invest his time and effort. In his own words:
“..can we create unique attendee journeys that are akin to when you go to the museum and you put your headset on and it explains to you about the different exhibits and things like that. The delivery would probably be through your phone, and you would be able to not just wayfind but truly have, like, 'today I want to look for new business partners'.”
We agreed that the core of enabling this transformation lies in the data already being collected. The challenge isn't gathering new data but leveraging the existing data more intelligently. In fact, Joe mentioned that no new innovation is required in data collection; instead, the focus should be on using AI to interpret and act on this data.
Joe and I touched on the potential for AI to continually improve event experiences through feedback and learning. By incorporating participant feedback and interaction data, AI systems can refine their recommendations and itineraries for future events. This continuous improvement loop means you retain learning in your organisation as your talent moves on, and each event institutionally gets better at meeting the needs and preferences of all participants.
Joe eloquently described the perfect storm surrounding generative AI, which reminds us why the pace of change is so rapid.
“But here you have not just the software and the hardware experts, you have the capital allocators moving as quickly as possible. And you also have, you can argue, customers willing to adopt maybe faster than any tool set, technology, however you want to codify it, in history.” - Joe Colangelo
I want to thank Joe for giving up his time, joining me on this episode of UNSCRIPTED, and sharing his experiences and insights. I encourage you to check out Bear Analytics and the work they are doing to help you digitise your event.