That is why a team of Harvard Medical School researchers has decided to attack this issue from an entirely new angle. Rather than building a mountain range of proteomic data one grain of sand at a time, they have developed a computer program that can take on the responsibility of assembling such a gargantuan model.
"Through incorporating principles of engineering, we've developed a language that can describe biology in the same way a biologist would," says Jeremy Gunawardena, director of the Virtual Cell Program in Harvard Medical School's department of systems biology. "The potential here is enormous. This opens the door to actually performing discovery science, to look at things like drug interactions, right on the computer."
These findings will be published in the July 23 issue of the Journal of the Royal Society Interface.
Aneil Mallavarapu, Matthew Thomson, Benjamin Ullian, and Jeremy Gunawardena
Department of Systems Biology, Harvard Medical School, Boston MA
Mallavarapu wrote the program, called Little b, in LISP, a language widely used in artificial intelligence research. LISP is famous among computer scientists for its ability to write code that, in turn, writes code, enabling a programmer to derive new mini-languages.
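The "code that writes code" idea is not exclusive to LISP, though LISP macros do it most naturally by operating on source syntax directly. As a rough, hedged sketch of the concept, here is a hypothetical Python analogue (the function name, rate law, and kinase/substrate names are all invented for illustration and are not Little b's actual representation):

```python
def make_kinase_rule(kinase, substrate):
    """Generate, compile, and return a new function from a source-code
    template -- a crude analogue of a Lisp macro expanding into code."""
    src = (
        f"def {kinase}_phosphorylates_{substrate}(level):\n"
        f"    # hypothetical rate law: activity scales linearly with kinase level\n"
        f"    return 0.5 * level\n"
    )
    namespace = {}
    exec(src, namespace)  # the program compiles the code it just wrote
    return namespace[f"{kinase}_phosphorylates_{substrate}"]

# The program has now written a brand-new function for us:
rule = make_kinase_rule("MEK", "ERK")
print(rule.__name__)  # MEK_phosphorylates_ERK
print(rule(2.0))      # 1.0
```

A Lisp macro would do this at the level of syntax trees rather than strings, which is what makes deriving whole mini-languages, such as Little b, practical.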
"LISP isn't like typical programs, it's more like a conversation," says Gunawardena. "When we input data into Little b, Little b responds to it and reasons over the data."
For example, Gunawardena's lab works on kinases, a kind of protein that transfers phosphate groups to other proteins in order to regulate their activity. While this property is common to all kinases, there is a great deal of variety in how particular kinases carry it out. Little b captures this basic property of kinases as an abstraction.
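The article does not show Little b's internal representation, but the abstraction it describes, one shared behavior with many kinase-specific variations, can be sketched in Python (all class names, rate laws, and parameters here are hypothetical illustrations, not Little b's design):

```python
from abc import ABC, abstractmethod

class Kinase(ABC):
    """The abstraction: every kinase phosphorylates a substrate."""

    def phosphorylate(self, substrate_level):
        # shared behavior, delegating to a kinase-specific rate law
        return self.rate(substrate_level)

    @abstractmethod
    def rate(self, substrate_level):
        """How a particular kinase carries it out -- this varies."""

class MassActionKinase(Kinase):
    """One variation: rate proportional to substrate level."""
    def __init__(self, k):
        self.k = k
    def rate(self, s):
        return self.k * s

class MichaelisMentenKinase(Kinase):
    """Another variation: saturating Michaelis-Menten kinetics."""
    def __init__(self, vmax, km):
        self.vmax, self.km = vmax, km
    def rate(self, s):
        return self.vmax * s / (self.km + s)

print(MassActionKinase(0.5).phosphorylate(2.0))        # 1.0
print(MichaelisMentenKinase(1.0, 1.0).phosphorylate(1.0))  # 0.5
```

A system that reasons over such an abstraction can assemble models of new kinases from the shared behavior, filling in only the parts that differ.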
Here, the researchers demonstrated how they were able to interact with Little b to build complex models of kinase activity, using Little b as a kind of scientific collaborator, and not simply a passive tool.
"This language is stepping into an unknown universe, when your computer starts building things for you," says Gunawardena. "Your whole relationship with the computer becomes a different one. You've ceded some control to the machine. The machine is drawing inferences on your behalf and constructing things for you."
"The next step is to create an interface that's easy to use," says Gunarwardena. "Think of web page development. Lots of people are creating web pages with little or no knowledge of HTML. They use simple interfaces like Dreamweaver. Once we've developed the equivalent, scientists will be able to use our system without having to learn Little b."