Technical solutions to technical problems
Parents, extended relatives, Internet service providers and technology providers can all play a role in shaping how children use technology, Holt said.
Apps that limit how much time children spend online, and other easy-to-use parental controls, may help, Holt said. Other tools could let parents better shield their children from certain content and help them report bullying.
Scientists at the Massachusetts Institute of Technology are working on an even more automated solution. They want to set up a system that would give bullying victims coping strategies, encourage potential bullies to stop and think before posting something offensive, and allow onlookers to defend victims, said Henry Lieberman of the MIT Media Lab.
Lieberman's students Birago Jones and Karthik Dinakar are working on an algorithm that would automatically detect bullying language. The research group has broken down the sorts of offensive statements that commonly get made, grouping them into categories such as racial/ethnic slurs, intelligence insults, sexuality accusations and social acceptance/rejection.
While that doesn't cover every potentially offensive statement that could be made online, MIT Media Lab scientists have built a knowledge base of about 1 million statements. They have also considered how context can turn a sentence offensive: "you look great in lipstick and a dress," for instance, reads very differently when directed at a boy.
The idea is that if someone tries to post an offensive statement, the potential bully would receive a message such as "Are you sure you want to send this?" and some educational material about bullying may pop up. Lieberman does not want to automatically ban people, however.
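The detection-and-prompt flow described above might be sketched roughly as follows. The four categories come from the article, but the phrase lists, function names and matching logic are illustrative assumptions, not MIT's actual algorithm, which uses machine learning rather than simple phrase lookup:

```python
# A toy sketch of category-based bullying-language detection with a
# reflective prompt instead of an automatic ban. The categories are
# from the article; the example phrases are invented for illustration.
BULLYING_CATEGORIES = {
    "racial/ethnic slurs": ["go back to your country"],
    "intelligence insults": ["so stupid", "total idiot"],
    "sexuality accusations": ["you're so gay"],
    "social acceptance/rejection": ["nobody likes you", "you have no friends"],
}

def detect_categories(message):
    """Return the bullying categories whose phrases appear in the message."""
    text = message.lower()
    return [
        category
        for category, phrases in BULLYING_CATEGORIES.items()
        if any(phrase in text for phrase in phrases)
    ]

def intervention_prompt(message):
    """If the message looks like bullying, return the reflective prompt
    the article describes; otherwise let it through unchallenged."""
    if detect_categories(message):
        return "Are you sure you want to send this?"
    return None
```

Note that, per Lieberman's point, a match triggers a question and educational material rather than a block: the goal is to get the poster to reflect, not to ban them.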
"If they reflect on their behavior, and they read about the experience of others, many kids will talk themselves out of it," he said.
Lieberman and colleagues are using their machine learning techniques on the MTV-partnered website "A Thin Line," where anyone can write in their stories of cyberbullying, read about different forms of online disrespect, and find resources for getting help. The researchers' algorithm tries to detect the theme or topic of each story, and match it to other similar stories. They're finding that the top theme is sexting, Lieberman said.
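Matching a new story to similar ones could work along these lines. The researchers use machine-learning topic detection; this sketch substitutes a simple bag-of-words cosine similarity as a stand-in, and all function names and sample stories are invented for illustration:

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts for a story."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two term-count vectors (0.0 to 1.0)."""
    dot = sum(a[term] * b[term] for term in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

def most_similar_story(new_story, archive):
    """Match a newly submitted story to the most similar archived one."""
    vec = vectorize(new_story)
    return max(archive, key=lambda story: cosine_similarity(vec, vectorize(story)))
```

Grouping stories this way is also how a theme like sexting could surface as the most common topic: the largest cluster of mutually similar stories is the top theme.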
"We're trying to find social network sites that want to partner with us, so we can get more of this stuff out into the real world," Lieberman said.
Turley and Rigal, who is now a freshman at Columbia University, are currently promoting the idea of having a "bully button" on Facebook so that people can formally report cyberbullying to the social network and have bullies suspended for a given period of time. They haven't gotten a response yet, but they're hopeful that it will take off.
In the meantime, Turley is feeling a lot safer in school than he used to.
"Times have changed definitely, where people are becoming slowly more aware," he said. "At my school, at least, I'm seeing a lot less bullying and a more acceptance overall. People just stick to their own."