Blackburn’s Quest: Assistant professor of computer science investigates the darkest corners of the internet


Life in the digital age has added one more certain thing to the old saying about death and taxes: People are going to be fools on the Internet.

Whether it’s an anonymous troll questioning your parentage or a propaganda campaign run by a foreign power, the signal-to-noise ratio on social networks has deteriorated considerably in recent years. And that’s not even mentioning the hate-mongers, conspiracy theorists and outright liars who want their distorted views to become yours.

Assistant Professor Jeremy Blackburn, a faculty member in the Computer Science Department at Watson College, has been researching “bad actors” online for more than 10 years. That journey has taken him to dark places where others fear to tread, but he hopes that by shining a light there, we can begin to understand how to fix them.

“I don’t think the problems are new. These are basic human issues,” Blackburn says. “What’s different is that they’ve become socio-technical problems rather than purely social problems. The internet doesn’t make people bad; it just makes them worse and lets them find other people who are also bad.”

FROM GAMING TO SOCIAL MEDIA

Blackburn first became interested in computers while growing up in Florida, connecting with users around the world through massively multiplayer online role-playing games (MMORPGs) such as “Ultima Online.” Players adopted sword-and-sorcery avatars for quests to conquer kingdoms and battle monsters.

Because Blackburn and his friends were handy with programming code, they sometimes found ways to cause chaos. Once, his “clan” built a virtual house in front of a key entry point and shot arrows from inside at approaching players. Another trick, which landed them in the game’s “prison,” was killing a character and stealing the blueprints for a new type of building that was in beta testing.

Yeah, they weren’t exactly angels.

“If you did this stuff in person during a Dungeons & Dragons game, you might get a punch in the mouth,” Blackburn says with a laugh. “But the fact that it’s virtual allowed for a whole new level of mischief.”

Like many teens who love to code, Blackburn headed to college – in his case, the University of South Florida (USF) in Tampa – intending to design computer games. His interests then shifted to the underlying technologies that make shared games possible, such as distributed systems, which spread an application’s components across multiple computers.

For his doctoral dissertation – also at USF – he revisited the idea of bad behavior online by studying cheating in internet games, and that pointed a direct path to the type of research he is doing today.

While earning his degrees, Blackburn worked for more than a decade in the private sector, including as a senior developer at test-prep company Boson Software and as a software architect at his own company, Pallasoft. He also spent three years as an associate researcher at Telefonica Research in Barcelona, Spain.

His time in academia – first at the University of Alabama at Birmingham and now at Binghamton University – has coincided with the proliferation and influence of mainstream platforms such as Facebook and Twitter as well as niche apps like Telegram, Parler, 4chan, and Gab.

“Things have moved away from blogs and similar sites over the past 10 years,” he says. “People want interactive social media – they want to be able to engage with each other rather than just shouting from a podium.”

In our polarized society, however, these back-and-forth interactions can become downright unpleasant.

TROLL TRACKING

Blackburn is the co-founder of the International Data-Driven Research Laboratory for Advanced Modeling and Analysis (iDRAMA), which includes more than two dozen professors, doctoral students and industry researchers from around the world.

In various configurations, iDRAMA members have studied almost every social media platform, from dominant players like Twitter to white-supremacist havens like Gab and 4chan. The one they avoid is Facebook, where data collection has become increasingly unreliable.

The iDRAMA lab has published research on QAnon, the rise of anti-Asian and anti-Semitic sentiment, the use of manipulated news images (also known as “fauxtography”), cyberbullying, misogyny, state-sponsored disinformation campaigns and more.

It’s a roundup of the worst humanity has to offer, and sometimes its targets strike back. A recent 4chan post, for example, claimed that Blackburn is “a Hamas recruiter,” and he has received some disturbing threats over the years. (Fortunately, nothing has come of them.)

Blackburn fosters an atmosphere of camaraderie among his students and peers, welcoming open conversations so that no one feels overwhelmed by the hate they encounter online.

“If you don’t look at the content, you can’t really do research on it,” he says, “but if you look at the content too much or too deeply – if you stare into the abyss a little too long – you might fall in. It’s a hard line to walk, and I’ve certainly had some failures along the way.”

Gianluca Stringhini, an assistant professor at Boston University and co-founder of the iDRAMA lab, praises Blackburn’s willingness to push the boundaries of traditional computer science methods.

“When Jeremy and I started working together, we realized that studying these emerging socio-technical issues required techniques that didn’t really fall within established research methods in our fields,” Stringhini explains.

“Five years later, we combine computer networking, security, graph analysis, psychology and other disciplines to paint a comprehensive picture of weaponized information online. Few researchers would be comfortable doing this, but Jeremy has a unique outlook and is not afraid to break with research standards.”

RETURNING THE ROCKS

Earlier this year, Blackburn received a five-year, $517,484 National Science Foundation CAREER award for his project “Towards a Data-driven Understanding of Online Sentiment.” The CAREER award supports faculty who have the potential to serve as academic role models.

At the heart of the project is designing a better way to train machine-learning systems – which do most of the content moderation on social media platforms – to judge the offensiveness of images used in memes.

Currently, artificial intelligence software tries to determine whether a particular image is bad on its own, but Blackburn wants to borrow a trick from online gaming by presenting the software with two images and asking which one is worse. The process is similar to the “matchmaking” systems that place players into groups of similar skill, rather than against people who are “1,000 times better or worse than you.”

“Instead of looking at images in isolation and passing judgment on that individual content, it’s more like ordering them,” he says. “We don’t learn whether something is racist or not – we learn what is more racist. Who knows what we’ll find, but we’re confident it will lead to something interesting.”
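The matchmaking analogy above can be sketched with an Elo-style rating update, the same basic mechanism many game matchmaking systems use to rank players from pairwise outcomes. This is an illustrative sketch only, not Blackburn’s actual model; the image names, the starting rating of 1000 and the K-factor of 32 are all assumptions chosen for the example.

```python
def elo_update(r_a, r_b, a_is_worse, k=32.0):
    """Update Elo-style ratings after one pairwise comparison.

    r_a, r_b: current "offensiveness" ratings of images A and B.
    a_is_worse: True if an annotator judged A the more offensive image.
    Returns the updated (r_a, r_b).
    """
    # Expected probability that A is judged worse, given current ratings.
    expected_a = 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))
    score_a = 1.0 if a_is_worse else 0.0
    # Winner gains what the loser gives up, so total rating is conserved.
    r_a += k * (score_a - expected_a)
    r_b += k * ((1.0 - score_a) - (1.0 - expected_a))
    return r_a, r_b

# Start every image at the same baseline and fold in annotator judgments.
ratings = {"img_a": 1000.0, "img_b": 1000.0, "img_c": 1000.0}
comparisons = [("img_a", "img_b"), ("img_a", "img_c"), ("img_b", "img_c")]
for worse, better in comparisons:
    ratings[worse], ratings[better] = elo_update(
        ratings[worse], ratings[better], a_is_worse=True
    )

# Sorting by rating yields the "more offensive than" ordering.
ranked = sorted(ratings, key=ratings.get, reverse=True)
```

After these three comparisons, “img_a” (judged worse twice) ends with the highest rating and “img_c” the lowest – an ordering, not an absolute verdict, which is the point of the approach.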

Blackburn admits that he and his colleagues at iDRAMA sometimes debate whether their research helps the internet’s bad actors dig in deeper and evade future detection. Maybe if they didn’t turn the rocks over, the creatures underneath would just stay there and never come out.

As a computer scientist, however, Blackburn believes that learning more will be an important step in stemming what has become a political and social threat. He argues that it is also a public health crisis: online hatred affects our mental well-being, and misinformation about COVID-19 has resulted in more deaths and hospitalizations.

“We have this incredibly powerful, world-changing technology that’s been around for less than a generation,” he says. “I hope we will provide the knowledge and tools to become more resilient and less susceptible to this type of behavior, and to begin to find ways to actively address it.”
