Systems biology is a new field in the molecular life sciences: new as molecular biology was in the fifties and as cell biology was in the seventies. In my definition, systems biology is a science that aims to elucidate the general principles governing the emergence of biological function from the interactions of the components of living systems. Biological function is defined as everything an organism needs to survive, both from moment to moment and under the various stresses that its ancestors were subjected to in evolution. The components are the biological macromolecules, largely (although not completely) encoded by the genome, or higher-order aggregates of such components. Indeed, biology appears to be organized in a modular fashion. This is clear in the sense of structure, with examples such as catalytic units (enzymes), confinement units (membranous vesicles) and units of inheritable information (chromatin). It is perhaps less clear in the sense of units of function, such as metabolic pathways, endocytosis and division. Yet, if only to make understanding by the human mind possible, systems biology also aims to understand cell function in terms of these well- and ill-defined higher-order modules.

An academically quite interesting spin-off of the genome sequencing effort has been the verification of what had long been suspected: life requires a minimum magnitude. Living organisms do not come much smaller than 1 μm³. Here, a living organism is defined as an organizational unit that does not require other living organisms for its continued existence; this excludes viruses and artificial life. It is an interesting exercise to estimate the minimum size of life, as the estimation itself brings home several aspects that are essential to it. Because any physical object undergoes damage, be it from cosmic radiation, from predation or from simple diffusion of components, it needs to engage in maintenance processes. According to the second law of thermodynamics, such processes dissipate Gibbs energy; hence, life needs a way to obtain Gibbs energy from its environment. It must do this at a comparatively high rate to compete with other processes (often those of other living organisms attempting the same). Because ‘dead’ (i.e. merely physical-chemical) processes that harvest Gibbs energy in one form or another are leaky, have a low stoichiometry and are slow at ambient temperature, life needs catalysts that couple (photo-)chemical reactions to Gibbs energy-fixing reactions (such as the synthesis of ATP or the generation of an electrochemical potential difference for protons). The capture of photon free energy is simplest if the photon induces the movement of an electron in space. Part of the corresponding electric energy can then be captured if that movement is (partly) across a membrane closed to ion permeation, and if the electron can recombine with a proton and reduce a transmembrane carrier molecule such as ubiquinone. In molecular terms, the simplest way to make such a stable membrane is a closed bilayer of bipolar molecules, which in a world dominated by carbon, oxygen, phosphate and hydrogen is a phospholipid bilayer.
The membrane also serves to keep the catalysts together, but it requires yet more catalysts, this time for the transport of food across that membrane. Clearly, versatile catalysts are needed, and these should be encodable. Hence proteins are necessary, with the encoding done by a relatively inert, readily replicable molecule. Lipids, DNA and proteins would have to be synthesized from whatever forms of carbon, hydrogen, oxygen and nitrogen are present in the environment, which requires metabolic pathways. These syntheses need to be catalyzed by enzymes. The simplest way to carry out metabolism is in a modular fashion (i.e. one reaction at a time), employing a series of standard reactions (i.e. dehydrogenases, isomerases, transferases, lyases and, where free energy is needed, ligases). Accordingly, a pathway leading from glucose to glycerol (a building block of phospholipids) involves some ten such steps (i.e. ten enzymes). Doing the sums, one readily estimates a need for some 150 reactions (i.e. some 150 proteins and 150 genes) to sustain life. The actual minimum genome size is 300 genes; our sums therefore bring us to the right order of magnitude.
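The back-of-the-envelope sum above can be made explicit. The following sketch is purely illustrative: the breakdown into categories and the per-category reaction counts are assumptions chosen to reproduce the roughly 150-reaction estimate, not measured inventories.

```python
# Illustrative back-of-envelope estimate of a minimal gene set.
# The per-category reaction counts below are assumptions made for
# the sake of the arithmetic, not experimentally determined numbers.
categories = {
    "central carbon metabolism (e.g. glucose -> glycerol, ~10 steps)": 10,
    "other biosynthetic pathways (lipids, amino acids, nucleotides)": 80,
    "transport of nutrients across the membrane": 20,
    "transcription, translation and replication machinery": 30,
    "free-energy transduction (e.g. ATP synthesis)": 10,
}

total_reactions = sum(categories.values())
# One enzyme, hence roughly one gene, per reaction:
print(f"estimated minimal gene set: ~{total_reactions} genes")
```

Comparing this ~150-gene estimate with the actual minimum genome size of some 300 genes confirms that the reasoning lands in the right order of magnitude.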

More important, however, is the realization that life has a minimum threshold of complexity. Life consisting of only three processes is impossible. Consequently, one cannot reduce an organism of 300 genes to 100 subsystems of three genes each, understand each of those subsystems as one-hundredth of life, add up the hundred partial understandings and thereby understand life: life is a systems property. And, as the minimum number of required genes is more than 100, the understanding of life somehow requires the simultaneous understanding of more than 100 processes.

Elsewhere (http://www.systembiology.net/philosophy/) we deal with the more philosophical aspects of the molecular understanding of life in terms of systems biology, and with the crucial aspect of interdisciplinary teaching (http://www.febssysbio.net/). Here, we should like to emphasize that in practice systems biology requires several new approaches, which have been late to enter molecular cell biology. One of these is the ability to test hypotheses through well-defined quantitative experimentation. In physics and chemistry, the paradigm is that of the formulation of a hypothesis, which is then made to predict the effect of a well-defined experiment. That experiment should consist of the effect of a nonnatural perturbation of the system under study. Probably owing to a lack of possibilities to perturb living cells in well-defined ways, cell biologists have focused more on observing and reporting correlations and on interpreting these in terms of cartoons representing qualitative mechanisms. Hard, quantitative attempts at Popperian falsification have been rare in cell biology.

Thanks to the pioneering work of Jensen and others, there is now a viable possibility to perturb living systems at quite specific points, quantitatively. In fact, the method that these authors and others have been developing makes full use of the molecular organization of life, where almost every process is encoded by a well-defined piece of DNA. Hence, by putting the expression of that piece of DNA under the control of a tuneable promoter, each process of the living cell can be perturbed, making many systems biology hypotheses testable, in principle. In a paper in this issue, Jensen and colleagues review the present state of this methodology.
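To illustrate why such tuneable perturbations are informative, here is a minimal toy model, not the actual methodology reviewed by Jensen and colleagues: a two-step pathway S → X → P in which the level e2 of the second enzyme is titrated, as a tuneable promoter would allow. All rate laws and parameter values are assumptions for illustration only.

```python
# Toy model of perturbing one step of a pathway by titrating the
# expression level e2 of its enzyme. All parameters are assumed.

def v1(x, S=5.0, V=10.0, K=1.0, Keq=10.0):
    """First step, reversible: slows down as the intermediate x builds up."""
    return V * (S - x / Keq) / (K + S + x)

def v2(x, e2, kcat=1.0, K=1.0):
    """Second step, proportional to the tuned enzyme level e2."""
    return e2 * kcat * x / (K + x)

def steady_state_flux(e2, lo=0.0, hi=50.0):
    """Bisect for the intermediate concentration where v1 == v2."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if v1(mid) > v2(mid, e2):
            lo = mid  # intermediate still accumulating below the crossing
        else:
            hi = mid
    x = 0.5 * (lo + hi)
    return v2(x, e2)

fluxes = {e2: steady_state_flux(e2) for e2 in (1.0, 2.0, 4.0, 8.0)}
for e2, J in fluxes.items():
    print(f"e2 = {e2:4.1f}  ->  flux J = {J:.3f}")
```

Doubling e2 yields less than double the flux: as the perturbed enzyme's level rises, control over the systemic flux shifts to the other step, which is exactly the kind of quantitative, falsifiable prediction such perturbations allow one to test.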

Because systems biology will ultimately depend on the simultaneous understanding of hundreds of processes, and because the human mind can deal with only a few such processes at a time, we need help. Help is provided by computation. For a long time, theoretical biology has served the purpose of making mathematical models that show that (or how) certain principles might work. Rarely, however, do these models show that the principles do work: because the models were descriptive, or formulated in terms of immaterial, higher-order concepts, or equipped with unrealistic kinetics, they cannot serve to test hypotheses. Precise molecular models are therefore needed, describing the functioning of biological systems in terms of experimentally determined properties of the components of those systems. Only if the predictions made on the basis of the properties of the components correspond with the behaviour of the system, importantly without enforcing such correspondence by adjusting (fitting) the experimentally determined parameter values, can the functioning of the system be understood and that aspect of systems biology be considered to have succeeded. Snoep and colleagues are among those who have championed this approach of making precise computer replicas of intracellular networks engaged in metabolism, gene expression and signal transduction. Not only have they made some of these themselves, they have also built a repository of such replicas, called silicon cells (see http://www.siliconcell.net). This repository is ‘live’ in that the models are accessible through the world wide web and can be used for experimentation in silico. Scientific journals use the facility in their reviewing process. Snoep describes all of this in a paper in this issue.
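In the spirit of such precise kinetic replicas, though far simpler than any real silicon cell, the following sketch integrates assumed component rate laws forward in time; the systemic steady state then emerges from the component properties alone, without any fitting. The rate laws and parameters are invented for illustration.

```python
# A toy "silicon cell": component rate laws with independently assumed
# parameters are integrated forward, and the systemic behaviour (the
# steady state of the intermediate X) emerges without fitting.

def simulate(x0=0.0, dt=0.001, steps=200_000):
    """Euler integration of dX/dt = v_in(X) - v_out(X)."""
    x = x0
    for _ in range(steps):
        v_in = 2.0 * (10.0 - x)        # assumed supply kinetics
        v_out = 5.0 * x / (1.0 + x)    # assumed Michaelis-Menten drain
        x += dt * (v_in - v_out)
    return x

x_ss = simulate()
print(f"predicted steady-state intermediate: {x_ss:.3f}")
```

The test of such a model is then whether this prediction, made purely from component properties, matches the independently measured systemic behaviour.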

Because intracellular processes are nonlinear, their behaviour depends on their operating point. It is therefore of crucial importance that the component processes in living cells be characterized precisely and under in vivo conditions. Tomita and colleagues have set up a vast and highly significant research programme along these lines: they have measured many processes in the living cell and coupled the results to modelling. In this issue, they give an extensive account of this programme, which is highly characteristic of what is important for systems biology.
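A one-line calculation shows why the operating point matters. For a Michaelis-Menten rate v = Vmax·S/(Km + S), the scaled sensitivity (elasticity) of the rate to its substrate works out to Km/(Km + S), so the same enzyme behaves almost first-order far below Km and almost zero-order far above it.

```python
# Elasticity of a Michaelis-Menten rate v = Vmax*S/(Km+S) with respect
# to its substrate: d(ln v)/d(ln S) = Km/(Km+S); Vmax cancels out.

def elasticity(S, Km=1.0):
    return Km / (Km + S)

for S in (0.01, 1.0, 100.0):
    print(f"S/Km = {S:6.2f}  ->  elasticity = {elasticity(S):.3f}")
```

A rate law characterized at the wrong operating point, for instance at a substrate concentration far from the in vivo one, therefore misrepresents how the enzyme actually responds inside the cell.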

When testing molecular hypotheses concerning the molecular basis of the functioning of a living system quantitatively, it is of course important to be able to measure the functioning of that system equally quantitatively. One important testable set of functions is the set of steady-state rates of all the processes (i.e. the fluxes). By measuring the fluxes that flow into and out of the system, and by using maps of the network, so-called ‘flux analysis’ can deduce many of those steady-state fluxes. Not all the fluxes can be determined in this way, as biological networks can often accomplish certain functions in more than one way, for instance at the cost of different amounts of free energy. Using additional assumptions, such as that of maximum thermodynamic efficiency, some approaches go even further. Borodina and Nielsen contribute a paper to this issue that deals with this type of approach (i.e. with fluxes through biochemical reaction networks).
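The logic of flux analysis can be shown on a toy branched network (an assumed example, not one from the Borodina and Nielsen paper): a steady-state mass balance on each internal metabolite turns measured exchange fluxes into deduced internal ones.

```python
# Toy flux analysis on an assumed network:
#
#   A --v1--> B --v2--> C
#             B --v3--> D
#
# Measured exchange fluxes: substrate uptake v1, by-product secretion v3.

v1_measured = 10.0   # uptake of A (assumed measurement)
v3_measured = 4.0    # secretion of D (assumed measurement)

# Steady state on B: production equals consumption, i.e. v1 = v2 + v3
v2 = v1_measured - v3_measured
print(f"deduced internal flux v2 = {v2}")
```

Had C also been reachable via a second route from B, the balance on B would contain two unknowns in one equation, and the split between the routes would remain undetermined without extra assumptions (such as maximum thermodynamic efficiency) or additional data such as isotope labelling, which is precisely the situation described above.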

Interacting molecules are important in systems biology, and many such interacting molecules define a network. Hence, networks of molecules are important. However, because everything may influence everything else in such networks, and because those influences can involve genetic, biochemical, biophysical, mathematical and other processes, their understanding benefits from intensive integration of knowledge and expertise from a wide variety of disciplines. Accordingly, systems biology benefits greatly from the networking of scientists. This is why worldwide alliances of systems biologists are forming, such as the International E. coli Alliance, the Yeast Systems Biology Network (YSBN) and the Tyrosine Kinase Consortium (see http://www.systembiology.net). In this issue, Hohmann provides an account of the recent activities of the YSBN.

And then, what more is needed? Well, success stories! There have been quite a few already, but one that has come close to paying off in true dollars has been the systems-biology-mediated optimization of lysine production by Stephanopoulos and colleagues. Koffas and Stephanopoulos have written a nice account of this for this issue. I am quite sure that there are more stories to be told of systems biology paying off in industry, but not all such stories are being told... What remains to be developed are stories about personalized medicine made possible by systems biology, about dynamic drug dosing with drug combinations, and about differential network-based drug design. Well, an issue of this journal is only an issue, and it gets full. Let us just engage in this new discipline of systems biology. Let us not be slowed down any longer by discussions of what systems biology is, or is supposed to be; it has been properly defined by systems biologists [1]. After all the defining, and after trying to get funding streams up and running, we now have some systems biology in action. Let us continue to go for it, and for its results, as shown in this issue. There is no need for systems biologists to be the last to leave and switch off the light...