Planners are often perfectionists, and they certainly get to exercise that tendency when it comes to delegates’ experiences at the convention. The experience is so multifaceted that it always seems to leave room for improvement. Macro-level aspects include the quality of the education, exhibit floor, networking opportunities, destination and lodging. Micro-level aspects include everything from meeting room temperature to the supply of water and coffee to the amount of “movement time” allotted between sessions. Staying attuned to attendees’ needs and desires year after year allows their experience to be continually refined, which in turn supports attendance numbers.
A case in point is the Electronic Retailing Association, which has done “quite a bit of tweaking” to its D2C Convention over the years “to make it easier for people to connect” with one another, remarks Scott D. Oser, marketing director for the ERA. Through its post-convention surveys and onsite feedback from attendees, the ERA gradually learned that “it’s very much a connecting crowd as opposed to a learning crowd,” he explains. “Education is a big part of it, but interestingly for the attendees that’s not really one of the high-ranking things that they’re looking for; they’re looking for deals and networking. We kept hearing over and over again that they wanted spaces where they could sit down and do deals. And there really would be no way for us to know that without watching what’s going on, talking to them and listening to what they’re telling us.” The ERA has even instituted a “Deal Makers Club” at its smaller Great Ideas conference as a result of that feedback. “Instead of your traditional show floor we have tabletops where people can have appointments with the different companies,” Oser says.
Gathering feedback, whether during or post event, is thus no mere record-keeping activity or formality; it is a means of identifying areas for improvement that the association intends to address, leading to more ROI for attendees or simply a better hospitality experience. That includes F&B service: “Because we’re a medical association, our attendees are very attuned to being healthy, so we get a lot of complaints about the lack of healthy choices in the convention center,” notes Pamela Ballinger, CMP, senior director, meetings and exhibits, American Association for Cancer Research. “We take those comments to the next city, sit down with the convention centers and caterers, and say, ‘This is a very health-conscious group, you’ve got to provide vegetarian options, don’t offer fried foods, etc.’ ”
It also includes logistics: “In New Orleans, we had a lot of rooms that were filled to capacity and that was a big complaint,” says Ballinger. “So going into DC (for the AACR Annual Meeting 2017) we’re looking into overflow options.”
With the goal of improvement in mind, it is best to reframe the complaints that appear on surveys as opportunities for productive change. That reframing can be psychologically helpful for a planner, since most of the comments tend to be negative. “Many times the people that submit (the surveys) and take the time to do them are those that really have something to complain about, honestly,” observes Ballinger. “I think that’s human nature. So, one of the things meeting planners have to develop is thick skin.”
Heather M. Seasholtz, CMP, director, meetings and events with Talley Management Group, effectively confirms this phenomenon based on survey responses to the conventions she manages overall: “I’d say it’s 40/60, 40 percent positive. But I think people are starting to make more of an effort to put in positive comments as well. I’m seeing a lot of simple ‘good job’ and ‘really liked xyz’ type of information now.”
Unfortunately, the response rate on post-event surveys tends to be low. “If you get a 10–15 percent return, that’s pretty good,” says Ballinger. “Some associations that I’ve worked with tie the evaluation to attendees’ getting their CME credit, and you get a better response if you can do that.” Clearly, that tactic will not be available to many associations, so the survey will need to be made as convenient for attendees as possible in order to maximize response rate. Part of that is brevity: fewer questions that focus more directly on what the association identifies as the key areas of the convention experience.
“Some of our clients are removing some of the logistical type questions and really focusing on the education, because we know that’s what’s driving them to come to the meetings,” says Seasholtz. “You can just add more questions about the content on top of the logistical, but then you risk making it too long and losing the attention of the attendee.” What counts as brief? For Seasholtz, “anything that’s within that five- to seven-minute range is something people will respond to. When you start hitting 10+ minutes it just takes away from their work, and workload isn’t going away for anybody. I know I’ve cut myself off from surveys that just start taking too long. It’s nice to have that bar at the top that tells them how far along in the process they are, the percent of questions done.” An indication at the outset of approximately how long the survey will take also is helpful.
In 2011, SurveyMonkey conducted a telling study on the relationship between survey length and the time spent completing it. Based on a random sample of roughly 100,000 surveys that were 1–30 questions in length, the study found that as the number of questions increases, the average amount of time respondents spend answering each question decreases. In addition, survey abandon rates increased for surveys that took more than seven to eight minutes to complete, with completion rates dropping anywhere from 5 percent to 20 percent. Thus, assuming that the ideal is to have attendees give well-considered replies as opposed to rushed replies, the number of questions should be streamlined as much as possible.
It’s recommended that the post-event survey be sent out the last day of the meeting, or within the ensuing week, so that the memory of the experience is still fresh. “We give them a time limit of one to two weeks to (complete it) because after that point, the memory is lost. We also send out one or two reminders,” says Seasholtz.
As for format, Likert-type scale questions (which gauge respondents’ attitudes along a rating scale) deliver better data than yes/no questions, and are more convenient for respondents than open-ended questions. However, there is value in allowing participants to write in comments and express themselves in more detail. “Have a place to put comments but don’t make them required,” says Ballinger. “If somebody rates something a 1 on a 1-5 scale with 5 being the best, I would certainly ask why: For example, ‘If you’ve rated this less than 3 provide a comment.’ If somebody rates something at a 1, I suspect they have a really compelling reason and they will put the comment in.” Participants even can be given the option to express their satisfaction/dissatisfaction on a more personal level. “We do ask, ‘Would you like somebody from the association to contact you regarding your survey?’ ” Seasholtz says.
While the traditional post-event survey is generally regarded as an indispensable metric, there is less agreement on the post-session survey. The viability of these surveys will depend on the association, its goals and its audience. If the assessment of the educational component of the convention is especially critical, post-session surveys can be desirable because they deliver feedback based on attendees’ freshest memory of the session, including the quality of the speaker, usefulness of the information, fulfillment of objectives and other aspects.
“Generally we get about a 40 percent response rate” on the evaluations at the session, says Barbara Licht, director, educational meetings and conferences at the American College of Physicians. “We hand the surveys out and there’s somebody in the back of the room collecting them. They are brief, just asking several key questions, using bubble replies.” That is backed up by a post-conference evaluation that is “totally targeted to assessing outcomes achieved and the change in learner’s competence, performance and patient outcomes as a result of the content,” Licht explains. “The surveys are sent to a sample of the participants and they’re asked to indicate the practice gaps that they came into the meeting with and the sessions they attended to address those gaps. And then two months later they’re sent a follow-up survey asking what changes did they make based on the information they received.”
But some planners feel that post-session surveys — whether administered on paper, via email or through a meeting app — would be taxing to their busy attendees. “With the number of sessions our attendees go to, they’d feel bombarded,” says Ballinger. Oser has a similar sentiment. “More people are trying to get instant feedback,” he observes. “But we have to see how feasible the instant survey would be because our attendees are booked all day if you look at their calendars. My fear is that that email would get ignored, and we’d get even less response on that. So we’d spend that time to get less worthwhile feedback.” One option to encourage instant feedback without bombarding attendees is the web survey accessible on kiosks stationed in the meeting or exhibit areas, although it can be expected that the participation rate on these will not be especially high.
Apart from surveys, social media has come to prominence as another metric for evaluating the attendee experience. “We’ve definitely seen an uptick in use of social media,” says Seasholtz. “We use it more as an immediate feedback versus post-event tool.” The feedback ranges from comments about a speaker or session to alerts about logistical issues such as an AV problem, bathroom failure or a wet floor at the convention center. Monitoring social media outlets can be done by volunteers or, in the case of the American Association for Cancer Research, by the PR/communications department. “They put together a general report that’s for everybody and you can pull it up from our general data storage system,” says Ballinger. “If there’s any issue regarding logistics that’s coming out, somebody’s tweeting about a problem in a room, they would let us know immediately.”
Some groups just tend to be heavy social media users, and that makes it a very effective feedback tool. “A couple of the groups that I work with are heavy Twitter users, so we do a lot of following and engaging through that,” says Vanessa Mobley, CMP, senior meetings manager, Association Management Center. “We’re able to see the trends and fix the things that we need. We just had a meeting in Chicago with one of those groups and unfortunately the Wi-Fi wasn’t good, but we heard about it, and I was able to reach out to the IT team at the hotel and bump up the bandwidth, and then we were better for the rest of the meeting. But (in contrast) I have a group of rehab nurses that is just not that active on social media, so we depend more on the post-conference survey for them.”
The Electronic Retailing Association’s delegates are not very active on social media as it concerns the convention, but the ERA is aiming to change that, in part because it is another way to gauge their experience. “I had always been told that our people are not on social media, and I don’t agree with that,” says Oser. “We don’t live in ‘Field of Dreams’ where if we build it they will come. You can’t just throw up a LinkedIn or Facebook page or start tweeting and all of a sudden thousands of people are going to flock to it. So we need to do more to promote it, and up until very recently we had not promoted our social media, but we just hired a consultant to help us with that. And we’ve noticed that just by trying to promote it our increases have been huge, going from 100 to 1,000 followers, and the engagement has been that much higher literally within three weeks.”
That interaction is going to be “another source of data for us,” he continues. “I don’t want to kill us with data, but maybe there are some people who are not filling out the evaluation, and they’re not really verbal when it comes to talking to our people. Maybe they’re more comfortable with social media, and we don’t know that since we never tapped into it. (The feedback from those attendees) might reinforce that we’re doing the right thing, but if it’s contradictory to what we’re hearing and it’s not just a few people who are (very actively) tweeting, then we’ll have to figure out what to do with the information.”
Many of the ERA’s attendees are in fact “verbal” when it comes to giving feedback onsite, and that is yet another valuable channel. “We get a lot of word-of-mouth feedback. Our people are very good at telling us what they like and what they do not like,” says Oser. “We have a couple of salespeople on staff who handle exhibit sponsorships and they’re always talking to the exhibitors, sponsors and people on the show floor about what their thoughts are and what’s going on.” Particularly given that the post-event survey response rate is a little over 10 percent, “we’re really making an effort onsite and before and after to talk to people about what their expectations are, what they’re thinking while they’re there and what they felt when it finished,” he adds.
The attendee experience is indeed multifaceted, and its measurement can be as well. Apart from surveying, social media monitoring and collecting onsite verbal feedback, Seasholtz points to numerous other helpful metrics. “We need to not just look at those (three) options as success measures, but also website hits. Are we getting increased traffic to our website? I take that as positive feedback. Registration data: Are you a previous year attendee? That goes back to them being satisfied. Financial information: Is our sponsorship up? Is our registration up? We also have to talk to our exhibitors: How is the traffic? Yet another source of information is our vendors: An AV team or decorator is walking the floor just like we are. What are they overhearing? Those are areas where we need to look outside of the box, outside of the survey, to determine what the attendees’ satisfaction is.” AC&F