How 2 iSchool professors are undertaking the ‘most complicated experiment’ in gravitational physics
The National Science Foundation recently awarded two Syracuse University professors a $1 million grant to develop a citizen science system for a project being hailed as “the most complicated experiment ever undertaken in gravitational physics,” according to the grant abstract.
The term “citizen scientist” was coined only recently, said Kevin Crowston, a distinguished professor of information science in the School of Information Studies who is co-investigating the project, INSPIRE: Teaming Citizen Science with Machine Learning to Deepen LIGO’s View of the Cosmos, with Carsten Østerlund, associate professor and director of Information Management and Telecommunications and Network Management at the iSchool.
The project will support the Advanced Laser Interferometer Gravitational-Wave Observatory, or aLIGO, according to the grant abstract.
The aLIGO uses laser beams to monitor the distance between precisely arranged mirrors, according to the MIT news website.
According to Einstein’s theory of general relativity, when a gravitational wave passes by, the distance between the mirrors should change very slightly. The aLIGO can detect changes in distance smaller than one-thousandth the diameter of a proton, according to the MIT news website.
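To get a rough sense of that scale, a back-of-the-envelope calculation (assuming a proton diameter of about 1.7 x 10^-15 meters and aLIGO’s roughly 4-kilometer arms, figures that are illustrative rather than official specifications) looks like this:

```python
# Rough scale of aLIGO's sensitivity (illustrative figures, not official specs).
proton_diameter_m = 1.7e-15               # approximate diameter of a proton, in meters
displacement_m = proton_diameter_m / 1000  # one-thousandth of a proton's diameter
arm_length_m = 4_000                       # each aLIGO arm is roughly 4 kilometers

# "Strain" is the fractional change in arm length the detector must resolve.
strain = displacement_m / arm_length_m
print(f"displacement to detect: {displacement_m:.1e} m")
print(f"strain: {strain:.1e}")
```

The result, a fractional length change on the order of 10^-22, is why the experiment is considered so difficult.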
Crowston and Østerlund are trying to figure out a way to determine which detected events, or changes in the distance between the mirrors, are caused by noise on Earth as opposed to gravitational waves from somewhere in the galaxy, Crowston said. The signal that comes off the detector is basically a sound wave, which is plotted as an image for inspection.
What makes this an innovative citizen science project is that it will use machine learning as well as people, Crowston said. Computers are trained to identify the different images, but they cannot recognize all of them. This is where the citizen scientists come in: the images the computers do not recognize are given to people.
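The triage Crowston describes, where the machine labels what it is confident about and routes uncertain images to human volunteers, can be sketched roughly like this (the function names, threshold, and glitch labels are illustrative, not the project’s actual pipeline):

```python
# Illustrative sketch of confidence-based triage: the classifier keeps images
# it is confident about and routes uncertain ones to human volunteers.
def triage(images, classify, threshold=0.9):
    """classify(image) -> (label, confidence); returns (auto_labeled, for_humans)."""
    auto_labeled, for_humans = [], []
    for image in images:
        label, confidence = classify(image)
        if confidence >= threshold:
            auto_labeled.append((image, label))
        else:
            for_humans.append(image)
    return auto_labeled, for_humans

# Toy stand-in for a trained classifier: confidence is stored with each image.
fake_classify = lambda img: (img["label"], img["conf"])
images = [{"label": "blip", "conf": 0.97}, {"label": "whistle", "conf": 0.55}]
auto, humans = triage(images, fake_classify)
```

In this toy run the confident “blip” image is labeled automatically, while the low-confidence “whistle” image would be shown to citizen scientists.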
“The human brain is very, very good at pattern recognition, and if you can pull out images of the shapes of these glitches … you can get humans to classify these glitches,” said Duncan Brown, an associate professor of physics at SU.
Humans can classify as well as do pattern matching for the environmental behaviors surrounding the glitches, Brown said. They may also be able to figure out what kinds of glitches occur most often or correlate them with things going on in auxiliary channels to see why a glitch is occurring, he said.
The citizen scientists are being recruited through the Zooniverse, which its website describes as a platform for people around the world to contribute to real discoveries in science. The Zooniverse is completely web-based and currently boasts more than 1 million members.
Over time, the feedback from the citizen scientists will help to make the computer programs more accurate, Crowston said. The images that the computer is fairly certain about can then be given to new citizen scientists and be used as a way to train them.
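The feedback loop Crowston describes can be sketched as follows: images the model is nearly certain about serve as training examples for new volunteers, and volunteer labels on uncertain images are folded back into the model’s training data. All names and thresholds here are hypothetical illustrations, not the project’s actual code:

```python
# Hypothetical sketch of the human-machine feedback loop.
def split_gold_examples(classified, gold_threshold=0.95):
    """classified: list of (image, label, confidence) tuples from the model.

    High-confidence images can be shown to new volunteers as training
    examples; the rest stay in the queue for classification."""
    gold = [(img, label) for img, label, conf in classified if conf >= gold_threshold]
    uncertain = [img for img, label, conf in classified if conf < gold_threshold]
    return gold, uncertain

def fold_in_volunteer_labels(training_set, volunteer_labels):
    """Volunteer classifications extend the training set, so the machine
    classifier becomes more accurate over time."""
    return training_set + volunteer_labels

# Toy example: one confident image, one uncertain one.
classified = [("img_a", "blip", 0.99), ("img_b", "whistle", 0.60)]
gold, uncertain = split_gold_examples(classified)
training_set = fold_in_volunteer_labels(list(gold), [("img_b", "whistle")])
```

The design choice here mirrors the article: confident machine labels train new people, and people’s labels retrain the machine.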
The Zooniverse has templates that Crowston and Østerlund are tweaking and adding extensions to, Østerlund said.
The main concern is that most projects on the Zooniverse do not have a training component to them, Crowston said.
“We’re thinking that people will need to get some kind of training of what glitches look like and what the process looks like, and so we’re trying to decide some kind of a training regime,” Crowston said.
A version of the system is available now, but users would already have to know what glitches are and how they appear, Crowston said. He added that he and Østerlund hope to have a version ready for regular users by the summer.
Published on January 24, 2016 at 9:42 pm
Contact Stacy: sfern100@syr.edu | @StacyFernandezB