
OMG! Is That Real?

January 2017

By Margo Vanover Porter

Social media alert systems can help institutions remain vigilant, minimizing incidents on campus and threats to students, faculty, and staff.

“I’m not worthy. I want to kill myself. There is no reason for me to live.”

Those sentences, if written by a student on Twitter, would trigger preventive action at higher education institutions with social media alert systems, says Sonny Provetto, a licensed clinical social worker in Burlington, Vt., and a consultant for law enforcement.

“These are phrases that we’ve given the alert system to pick up on,” he says. “For those universities with such a program, the computer would then follow those phrases to see the person’s activity in terms of negative behavior and self-harm, and alert the university.”

Provetto, who is a subject matter expert for Social Sentinel Inc., a company that develops social media threat intelligence alert solutions for college campuses, K–12 schools, and public safety industries, explains that institutions with alert systems can use sophisticated algorithms to zero in on specific phrases and key words that might be used in social media. Filters, which rely on geo-fencing to restrict coverage to a contained geographic area, typically target actions that threaten violence, involve controlled substances, or indicate self-harm.
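
To make the idea concrete, here is a minimal sketch of how a phrase list combined with a geo-fence might be applied to a post. Everything in it is hypothetical: the phrases, categories, and campus bounding box are illustrative assumptions, not Social Sentinel's actual terms or methods, which rely on far more sophisticated language analysis.

```python
# Minimal sketch of a phrase-list plus geo-fence filter (illustrative only).
# The phrases, categories, and campus bounding box below are hypothetical.

CAMPUS_BOUNDS = {"lat": (30.43, 30.45), "lon": (-84.31, -84.28)}  # example box

WATCH_PHRASES = {
    "self_harm": ["i want to kill myself", "no reason for me to live", "i'm not worthy"],
    "violence":  ["bring a gun to", "shoot up the"],
    "drugs":     ["selling pills"],
}

def within_geofence(lat, lon, bounds=CAMPUS_BOUNDS):
    """Return True if a geotagged post falls inside the campus bounding box."""
    return (bounds["lat"][0] <= lat <= bounds["lat"][1]
            and bounds["lon"][0] <= lon <= bounds["lon"][1])

def classify_post(text, lat=None, lon=None):
    """Return the alert categories a post matches (empty list if none)."""
    if lat is not None and lon is not None and not within_geofence(lat, lon):
        return []  # outside the contained geographic area, so ignore
    lowered = text.lower()
    return [category for category, phrases in WATCH_PHRASES.items()
            if any(phrase in lowered for phrase in phrases)]

print(classify_post("There is no reason for me to live", 30.44, -84.30))
# -> ['self_harm']
```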

“The alert system works on phrases, words, and concepts based in psychology,” he says. “Students who are depressed sometimes lose touch with their self-esteem and with reality. They have found themselves in despair or have [such] a level of helplessness going on in their lives [that] they might put such feelings on social media.”

Adopt New Tools

David L. Perry, assistant vice president and chief of police, Florida State University (FSU), Tallahassee, believes that institutions should consider a variety of tools to protect their campuses.

“Thirty years ago, we were still using our banana-shaped phones with a long cord, and students had to ask permission to use the phone after 9 p.m.,” Perry says. “Social media wasn’t a concern. However, you can’t throw your hands up in the air and act as if there’s nothing you can do to stay current or be aware of trends. Today, we have to build safety procedures to address inappropriate uses of social media.”

FSU adopted an alert system two years ago, following a 2014 incident in which an active shooter on campus injured three students, one of whom was paralyzed from the waist down. The gunman, who drew his weapon and got into a shootout with campus officers, was killed. “Officers responded within two minutes and stopped him from hurting anyone else,” Perry says.

Perry feels confident that a social media alert system could have avoided the panic that needlessly spread in the shooting’s aftermath.

“We saw that with the absence of a tool, we were not able to capture misinformation and social media posts that were adding to the fear and confusion, causing parents who were thousands of miles away to be overly concerned with safety on our campus,” he says. “With [such a] tool, we would have been able to detect the barrage of inaccurate information and quickly put out updates to address inaccurate concerns.”

In a nod to potential student privacy concerns, FSU officials ran the idea by the student council before purchasing the alert system. “We respect and appreciate student privacy,” he says. “Before the university even considered using a threat alert tool, we vetted it with our student government.” He emphasizes that student government members enthusiastically supported the tool’s implementation and even offered to help pay for it.

At FSU, the algorithms focus on words and phrases that involve personal safety, including weapons, drugs, violence, hate speech, and crimes against law enforcement officers—any actions that could jeopardize the safety of the campus community.

“In our society, people often lash out on social media talking about the things that they might do, including posting pictures of weapons,” Perry says. “They use it to threaten or intimidate—or sometimes to tip off their actions.” He thinks that the system has been particularly successful prior to protests and special events.

Occasionally, a social media post may not require immediate intervention but calls for long-term monitoring. Perry cites this hypothetical example: A student posts a message saying, “I just purchased three guns, I’ve done a walk-through of the building, and I know the firing rate it would take to empty my magazine in the confined area.”

Believe it or not, this post, by itself, is not illegal, he says, “but definitely would lead a reasonable person to believe that a threat could be emanating.”

No system is perfect; despite best efforts, each will occasionally alert officials to harmless situations, such as a student who writes: “The party was a bomb. It was a blast. It went off with a bang.”

“We see those from time to time, particularly after weekends and home games,” Perry says. “The system catches one or two of the key words and part of the algorithm, but we are able to quickly eliminate it. We dismiss those.”
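
A brief sketch illustrates why posts like this trip the filter: a naive keyword match has no sense of context. The alert-word list here is hypothetical.

```python
# Why "The party was a bomb" trips a naive keyword filter (illustrative only).
# The alert-word list is hypothetical.
import re

ALERT_WORDS = {"bomb", "blast", "bang"}

def flagged_words(text):
    """Return the alert words found in a post, ignoring punctuation and case."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return ALERT_WORDS.intersection(tokens)

post = "The party was a bomb. It was a blast. It went off with a bang."
print(flagged_words(post))  # all three words match, yet the post is harmless slang
```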

Watch for Farewells

Provetto encourages business officers and administrators to investigate the advantages of social media alert systems. “You really need to pay attention to this stuff,” says the former police officer. “You can’t afford not to have a safety net like this for your students. We’ve had great success with some of the words and phrases given to the software program and have been able to intervene in several situations.”

He cites the example of a student who posted the sentence, “I’m not worthy.”

“The computer picked up on the phrase and began following the posts, picking up messages around bulimia, starvation, and anorexia,” he says. “An intervention got this person the help needed. If a college can intervene even one time in a whole year to save a young person, the system has proven its value. To the person and family [of the individual] whose life is saved by intervention, the system becomes priceless.”

The alert system can also watch for farewells or for students who start giving away prized possessions. “Although they might not say it to a close friend or confidant, students may put a ‘goodbye’ to their 468 connections on Facebook. They may write a farewell letter. The system would alert if, for example, they say, ‘I’m giving away my collection of baseball cards.’ Who would do that?”

The alert systems can also pick up on terms used in bullying, such as “You’re stupid and ugly. Why don’t you go home? Nobody likes you. Nobody wants you here.”

Unfortunately, Provetto says, students say mean, hateful things to each other all the time. “The computer can look at the number of times these hurtful messages are sent from one IP address, which would indicate to us that someone is a tormentor. When hiding behind the anonymity of a keyboard or phone, people say things that they would never say face to face.”
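
A simple sketch of that idea: count hostile posts per source and flag any source that repeats. The phrases, threshold, and (ip, text) data format are assumptions made for illustration; a real system would also have to account for shared or dynamic IP addresses.

```python
# Sketch of flagging a possible tormentor by counting hostile posts per source.
# The phrases, threshold, and (ip, text) data format are illustrative assumptions.
from collections import Counter

HOSTILE_PHRASES = ["you're stupid", "nobody likes you", "nobody wants you here"]
TORMENTOR_THRESHOLD = 3  # repeated hostility from one address suggests a pattern

def is_hostile(text):
    """Return True if a post contains any of the watched hostile phrases."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in HOSTILE_PHRASES)

def find_tormentors(posts):
    """posts: iterable of (source_ip, text) pairs; return IPs at or above the threshold."""
    counts = Counter(ip for ip, text in posts if is_hostile(text))
    return [ip for ip, n in counts.items() if n >= TORMENTOR_THRESHOLD]
```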

Limit Access to Information

Both security experts caution that the systems alert on key words, phrases, and concepts that may appear on social media; no one is sitting and monitoring individuals’ activity. “We try to give these concepts, which are based on human behavior, to the alert system so that it can intervene and prevent young people from going down a path where they will harm themselves or other people,” Provetto says.

Due to the sensitivity of messages and concern for student privacy, Perry encourages institutions to keep posts discovered during an alert on a need-to-know basis. He suggests limiting access to investigators, senior law enforcement administrators, and those who might be asked to take the most appropriate action when a post needs follow-up.

Provetto also advises institutions to look for an alert system that “meets their needs and can become a force multiplier—a tool that will tip the scales in their favor when trying to be on the front end of dangerous or threatening situations, or to identify students who may be in the process or position to commit self-harm.”

MARGO VANOVER PORTER, Locust Grove, Va., covers higher education business issues for Business Officer.

