Wouldn’t life be easier if the team working on your university’s virtual student recruitment events only had to learn one set of safeguarding tools, and if those tools were designed with the safety of under-18s in mind?

In 2020 we all added the phrase “Zoom bombing” to our mental dictionaries. For university marketers organising virtual events, this was merely the tip of a much larger safeguarding iceberg.

As lockdown hit, student recruiters rapidly built large-scale events around existing tools such as Zoom, Microsoft Teams, Facebook Live and GoToWebinar, to name a few examples from a much wider field.

Each of these came with its own set of safeguarding strengths and weaknesses, defined by the target markets and marketing goals of its parent company.

During our conversations with university marketing and events teams we have identified three significant challenges shared by the organisers of student recruitment marketing and events around the world:

  • Resourcing: universities are hard-pressed to meet the resource-hungry moderation demands of large-scale events like open days.
  • Control: when running events and marketing aimed at under-18s, universities need certainty of sufficient privacy and security to ensure the safety of those attending.
  • Complexity: university staff and participants in open events and marketing have been expected to learn a multitude of new tools and platforms in the years since lockdown hit. Staying on top of so many at the same time is a tough ask.

The difficulties university teams have experienced have two fundamental causes:

Firstly, many of the platforms HE teams have been forced to fall back on provide only one slice of the full suite of interactivity needed for student recruitment and marketing activity. The obvious result is that universities end up having to learn and resource multiple tools for multiple purposes rather than a single tool with all the functionality in one place.

Secondly, many of these platforms are business-focussed, solving user issues for customers primarily working in a (home-)office environment. While safeguarding is still a high priority for these B2B solutions, it doesn’t carry the same level of priority and control as many universities need for under-18 outreach and access purposes.

While all the larger events and engagement platforms have launched significant user safety and security upgrades during the pandemic, not all of them have solved the problems specific to university outreach and student recruitment.

The mission of Union Spaces is to simplify and improve the experience of student recruitment marketing and events for applicants and university staff alike. Because our senior staff come from a higher-education background, we believe we can achieve this in ways that understand and serve the needs of higher-education users.

In the area of safeguarding and moderation this means ensuring the highest levels of safety and security for your events and marketing audiences, while reducing the number of tools your staff are expected to learn and use.

For more detail on our five layers of safeguarding technology, and how they save you time and maximise security, please read on. Or if you’d like to see how it all works on a live site, please book a demo with us – we’d love to show you!

Book your Union Spaces demo


Our approach in depth

Our approach has three basic tenets: bake safety and security in by design, automate wherever possible, and provide a “white glove” service to support your staff where necessary. In practice, this leads to five “layers” of safeguarding, privacy and moderation technology on the Union Spaces platform, ensuring a positive, cheerful, inclusive and secure experience for applicants, and the minimum cognitive load for your staff:


1. Privacy by design

The first step towards ensuring privacy for under-18 users of Union Spaces is to restrict the use of real names on the platform. By default we never collect real names from applicant or pre-applicant users unless a university client specifically requests this. At registration we collect an email address (never displayed), a private username and a display nickname only.

For all other areas of data collection, the user has complete control over the privacy and visibility of their data to all other site users, via the ‘Edit personal information’ settings in their profile.
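To make this concrete, here is a minimal sketch of the idea in Python. It is an illustration only, not our production data model: the class and field names are invented for the example.

```python
# Illustrative sketch only -- not the production Union Spaces data model.
from dataclasses import dataclass, field
from enum import Enum


class Visibility(Enum):
    """Who may see a given profile field."""
    ONLY_ME = "only_me"
    ALL_MEMBERS = "all_members"


@dataclass
class Profile:
    email: str      # collected at registration, never displayed
    username: str   # private login identifier, never displayed
    nickname: str   # the only name other users ever see
    # Optional extra fields default to the most private setting.
    field_visibility: dict[str, Visibility] = field(
        default_factory=lambda: {"hometown": Visibility.ONLY_ME}
    )

    def visible_to_others(self) -> dict[str, str]:
        """Return only the data another site user is allowed to see."""
        return {"nickname": self.nickname}


profile = Profile("a.student@example.com", "a_student_01", "StarGazer")
print(profile.visible_to_others())  # {'nickname': 'StarGazer'}
```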


2. Safety by design

Building a relationship of trust between members of a community takes time. For that reason we enable university clients to regulate the types of social interaction members can access, based on their profile type. Example social interactions include starting new discussions, replying to discussions, private messaging, posting to group activity feeds and uploading media to the platform.

Profile types are updated through the course of the applicant journey, either via CRM integration or a manual admin process. A user might start out as a ‘Pre-Applicant’ and change through the cycle to ‘Applicant’ and then ‘Offer Holder’. At each of these stages, as the relationship with the university develops, we can enable more freedom for community members, as specified per client. The sketch below shows one possible example.
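The interaction names here and the freedoms granted at each stage are invented for the illustration; the real matrix is configured per client.

```python
# Hypothetical mapping of profile types to permitted interactions.
# These defaults are invented for illustration; the real matrix is
# configured per client.
ALL_INTERACTIONS = {
    "start_discussion", "reply_to_discussion", "private_message",
    "post_to_feed", "upload_media",
}

PERMISSIONS = {
    "pre_applicant": {"reply_to_discussion"},
    "applicant": {"reply_to_discussion", "post_to_feed", "start_discussion"},
    "offer_holder": ALL_INTERACTIONS,  # full access late in the cycle
}


def can(profile_type: str, interaction: str) -> bool:
    """True if this profile type may perform the interaction."""
    return interaction in PERMISSIONS.get(profile_type, set())


assert not can("pre_applicant", "private_message")
assert can("offer_holder", "private_message")
```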


3. Reporting and blocking tools for applicants

We have adopted a best-practice approach to developing reporting and blocking functionality, using principles that have become commonplace on online social platforms. Importantly, these are the reporting and blocking tools that prospective students will have come to expect and feel comfortable with, thanks to their widespread adoption on major platforms such as Instagram, Facebook, Snapchat, Twitter and TikTok.

All users can block any other user via their profile, meaning that user cannot message them and any posts from them will be invisible to the blocking user.

Once a user has been blocked a set number of times (twice, for example), their account is automatically suspended and they can no longer log into the platform.

Similarly, all public user posts carry a ‘Report’ button for users to flag offensive content to admins and moderators, and similar account-suspension rules apply where a user’s posts have been reported multiple times. All offensive-content reports are automatically communicated to client site moderators and/or admins, to enable further action in line with our acceptable use policies.
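The sketch below shows the shape of this threshold logic; the specific counts are placeholders rather than production values.

```python
# Sketch of the automatic-suspension thresholds described above.
# The counts are placeholders, not production values.
BLOCK_THRESHOLD = 2    # distinct users blocking this account
REPORT_THRESHOLD = 3   # offensive-content reports against its posts


class Account:
    def __init__(self, user_id: str):
        self.user_id = user_id
        self.blocked_by: set[str] = set()
        self.report_count = 0
        self.suspended = False

    def record_block(self, blocker_id: str) -> None:
        self.blocked_by.add(blocker_id)
        self._check_suspension()

    def record_report(self) -> None:
        # Every report is also forwarded to the client site's
        # moderators/admins for follow-up action.
        self.report_count += 1
        self._check_suspension()

    def _check_suspension(self) -> None:
        if (len(self.blocked_by) >= BLOCK_THRESHOLD
                or self.report_count >= REPORT_THRESHOLD):
            self.suspended = True  # account can no longer log in
```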


4. Automatic profanity filter

Union Spaces can filter and replace unwanted words, making your community more inclusive for all. It allows you to easily censor words in activity update posts, activity comments, private message subjects/content and more.

We have a starting list of blocked words which clients can review and enlarge if desired.
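The mechanics are simple word replacement. A minimal sketch, assuming a regular-expression approach and an invented blocklist:

```python
import re

# Minimal word-replacement filter. The blocklist here is a placeholder;
# clients review and extend their own starting list.
BLOCKED_WORDS = {"badword", "worseword"}

PATTERN = re.compile(
    r"\b(" + "|".join(map(re.escape, sorted(BLOCKED_WORDS))) + r")\b",
    re.IGNORECASE,
)


def censor(text: str, replacement: str = "****") -> str:
    """Replace any blocked word with a placeholder, case-insensitively."""
    return PATTERN.sub(replacement, text)


print(censor("This BadWord should go."))  # "This **** should go."
```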


5. Moderation controls

In the context of the privacy and safety by design outlined above, our goal has been to enable community connection while minimising the need for high-effort community moderation on the part of client universities. For clients who want a managed moderation service, we are happy to discuss incorporating one at a competitive rate over and above the core subscription price.

Union Spaces enables post-moderation (as opposed to pre-moderation) of user comments and activity, where moderators can flag, delete or report activity and content direct from the activity feed itself.
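The sketch below illustrates the post-moderation pattern: content goes live immediately, and moderators act on it afterwards, straight from the feed. The class and method names are illustrative, not our actual API.

```python
# Illustrative post-moderation sketch; names are not the real API.
from dataclasses import dataclass, field


@dataclass
class FeedItem:
    author: str
    text: str
    flagged: bool = False   # marked for moderator review
    deleted: bool = False   # removed from public view


@dataclass
class ActivityFeed:
    items: list[FeedItem] = field(default_factory=list)

    def post(self, author: str, text: str) -> FeedItem:
        """Post-moderation: content is public as soon as it is posted."""
        item = FeedItem(author, text)
        self.items.append(item)
        return item

    def visible(self) -> list[FeedItem]:
        return [i for i in self.items if not i.deleted]


def moderate(item: FeedItem, action: str) -> None:
    """Actions available to moderators directly from the feed."""
    if action == "flag":
        item.flagged = True
    elif action == "delete":
        item.deleted = True


feed = ActivityFeed()
item = feed.post("StarGazer", "Hello everyone!")
moderate(item, "delete")
print(len(feed.visible()))  # 0
```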

Our recommended service levels are:

  • During live events: moderators monitor a complete feed of all site activity and flag, delete or report inappropriate content immediately after it has been made public.
  • Between live events: moderators review onsite activity for inappropriate content once daily at a specified time, including at weekends.

Chris Smith - Union Spaces

About Chris

Chris Smith is CTO and Co-Founder of Union Spaces, a community-led technology that helps universities design virtual and hybrid student recruitment strategies.
