The purpose of this page is to give an accurate, yet concise overview of Robert Rosen’s definition of complexity, its criteria, and a hint at its ramifications. The latter constitutes the bulk of Rosen’s investigations into complexity, and it cannot be easily surveyed here.
Rosen’s definition of “simple” and “complex” systems
“A system is simple if all its models are simulable. A system that is not simple, and that accordingly must have a nonsimulable model, is complex.”
Conceptual steps toward Rosennean complexity:
The following provides a very brief sketch of one way of approaching Rosennean complexity. I chose this sequence because I believe the steps I have laid out also roughly approximate the historical development of Rosen’s concept of complexity, as I perceive it through the sequence of his books. Thus, this layout serves two purposes at once. For the sake of brevity, I have relegated quotes and elaborations to the footnotes.
Modeling is the act of establishing congruence between the elements and entailment structures of two systems (namely, between an object system and a model of the system). 
The possibility of “complexity”, in an intuitive sense, arises when a system acts in unexpected ways; that is, in ways that do not match the predictions of our models. 
This notion of complexity is therefore rooted in a comparison of a system with its model(s).
When a single dynamical description is capable of successfully modeling a system, then the behaviors of that system will, by definition, always be correctly predicted.  Hence, such a system will not have any “complexity” in the sense above, in that there will exist no unexpected or unanticipated behavior.
Conversely, systems that require multiple partial dynamical descriptions - no one of which, or combination of which, suffices to successfully describe the system - are complex, in the sense above. 
Another way of connoting this distinction is to say that simple systems will always have a single “largest model”. 
Therefore, the precise relation that circumscribes the category of simple (non-complex) systems is this: all the models of the system are what Rosen calls simulable  (i.e., Turing-computable [16,17]).
We have arrived at Rosen’s definition:
“A system is simple if all its models are simulable. A system that is not simple, and that accordingly must have a nonsimulable model, is complex.”
So, we can see that Rosen’s definition provides a precise characterization of our intuitive notion of complexity. The property of simulability of models is one that not only delineates the realm of simple systems, it also carries with it a multitude of corollary characteristics - some of which are mentioned further below. These characteristics have profound consequences for the study of systems in general, and for science, specifically.
A subtle, yet important, point to note is that complex systems are defined essentially in terms of what they are not (i.e., simple systems). This is done by specifying as minimal a criterion as possible (having at least one nonsimulable model) in order to provide a precise, yet unrestrictive, distinction between simple and complex. The realm of simple systems is well understood to the extent that all their models are restricted to Turing-computability, and thus the characteristics of simple systems are likewise restricted. [20a] In contrast, complex systems are, in a sense, open-ended: there is no apparent upper limit to how complex a system can be, and therefore no apparent limit to what characteristics such systems might have. Accordingly, complexity itself is defined by Rosen in an open-ended fashion. [20b]
Alternate phrasings of Rosennean Complexity
Taken out of its context, Rosen’s definition of simplicity/complexity can be misunderstood or misinterpreted. Here I have attempted to create other “standalone” definitions of Rosennean simplicity/complexity that I consider more self-explanatory, while still remaining true to Rosen’s definitions.
A system is categorized as simple if all its models are Turing-computable. A system that is not simple, and that accordingly must have a Turing-incomputable model, is categorized as complex.
Rosennean simplicity is a relational property that a system possesses if all the models of the system are Turing-computable.
Rosennean complexity is a relational property that a system possesses if at least one of the models of the system is Turing-incomputable.
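The canonical example of a Turing-incomputable function is the halting function. The sketch below renders the classical diagonal argument in Python as my own illustration (not anything from Rosen); the name `candidate_halts` is hypothetical, standing in for any proposed decider, and “looping forever” is simulated by recording a flag so that the demonstration itself terminates.

```python
# Diagonal-argument sketch: no total function can correctly decide halting.
# `candidate_halts` is a hypothetical decider; it must answer consistently
# for the same program.  "Looping forever" is simulated by a flag.

def misjudged_program(candidate_halts):
    """Build a program g that any proposed halting-decider gets wrong.

    `candidate_halts(p)` claims to return True iff the zero-argument
    callable `p` would halt.
    """
    behavior = {}
    def g():
        if candidate_halts(g):
            behavior["halts"] = False  # decider says "halts", so g loops
        else:
            behavior["halts"] = True   # decider says "loops", so g halts
    g()                                # observe what g actually does
    return candidate_halts(g), behavior["halts"]

# Whatever a decider answers about g, its verdict contradicts g's behavior:
verdict, actual = misjudged_program(lambda p: True)
assert verdict != actual
```

Since every candidate decider is wrong about the program constructed from it, no Turing machine computes the halting function; a model requiring such a function is, in Rosen’s terms, nonsimulable.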
Degrees of Rosennean complexity
Just as a system is deemed simple or complex based on its set of models, so too can degrees of complexity be discussed. Within the category of complex formal systems, we can, at least in principle, compare two complex formal systems in the following way: if one system is inadequate to successfully model all the entailments in the second system, then we can say that the second system is more complex than the first system.  Note that this is a relative criterion, and not an absolute one.
A sidenote on Turing Machines
There is sometimes confusion concerning what a Turing machine is or is not capable of computing. This often arises through the use of phrases such as “more powerful” when referring to certain kinds of computing devices, which seem to imply a larger scope of computability but in actuality refer only to speed or efficiency of computation. In fact, everything from Cray computers to cellular automata to neural nets to quantum computers is unable to compute anything outside the (theoretical) scope of a Turing machine. Likewise, adding “stochastic” devices, multiple read/write heads, etc., to a Turing machine does not allow the enhanced machine to compute anything beyond what was already considered computable. [22,23,24]
Turing-computability encompasses the realm of purely rote or algorithmic processes. Such processes are completely syntactic, as they constitute merely symbol manipulation. In the realm of mathematical systems, such systems that are bereft of semantic elements and entirely describable by rote processes are called formalizable.
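To make “purely rote symbol manipulation” concrete, here is a minimal Turing machine simulator in Python. The machine and its rule table are my own illustrative sketch (not drawn from Rosen’s texts); this particular rule table increments a binary number written on the tape.

```python
# A minimal Turing machine simulator: every step is a syntactic
# lookup-and-rewrite, with no reference to what the symbols mean.

# Rules: (state, symbol) -> (symbol to write, head move, next state).
# This machine increments a binary number, head starting on the
# rightmost digit.
INC_RULES = {
    ("inc", "1"): ("0", "L", "inc"),   # carry: 1 -> 0, keep moving left
    ("inc", "0"): ("1", "L", "halt"),  # absorb the carry and stop
    ("inc", "_"): ("1", "L", "halt"),  # ran off the left end: new digit
}

def run_tm(rules, tape, state="inc", halt="halt", blank="_"):
    """Apply the transition rules by rote until the halt state is reached."""
    cells = dict(enumerate(tape))      # tape as a sparse map of cells
    head = len(tape) - 1               # start on the rightmost symbol
    while state != halt:
        write, move, state = rules[(state, cells.get(head, blank))]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

assert run_tm(INC_RULES, "1011") == "1100"   # 11 -> 12
```

Any process that can be cast in this form, a finite rule table applied step by step, is algorithmic, entirely syntactic, and hence simulable in Rosen’s sense.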
Characteristics of simple vs. complex systems
Complex systems have remarkably different characteristics than simple systems. It is the investigations into these characteristics that Rosen pursued in order to better understand complex systems, particularly as they related to biological organisms. Here is a very brief summary of some of the differences between the two categories:
| Simple System | Complex System |
| --- | --- |
| Fully predicative | Contains impredicativities |
| Fully fractionable | Contains non-fractionable aspects |
| Has a single largest syntactic model | Has no single largest syntactic model |
| Has no complex models | Has complex and simple models |
| Has computable models | Has noncomputable and computable models |
| Has no closed loops of entailment | Has closed loops of entailment |
| Fully syntactic | Has semantic aspects |
| Synthesis is the inverse of analysis | Synthesis generally distinct from analysis |
| Epistemology coincides with ontology | Epistemology generally distinct from ontology |
Simple Systems as constrained complex systems
It is not unusual for people to imagine complex systems as being, in some sense, rarer (or nongeneric) than simple systems. Often this notion is tied up with the erroneous idea that it somehow takes many simple pieces put together to make a complex system. However, if we view simple vs. complex systems in other terms, we will see that it is, in fact, simple systems that are nongeneric. We will do this by considering the way in which simple systems are complex systems that have been burdened with constraints. It is enlightening to examine the comparative list of characteristics above and reword it as follows:
| Complex System | Simple System |
| --- | --- |
| Contains impredicative and predicative structures | Only predicative structures allowed |
| Contains at least some non-fractionable aspects | Only fractionable structures allowed |
| Exceeds having a single largest syntactic model | Confined to a single largest syntactic model |
| Has complex and simple models | Only simple models allowed |
| Has noncomputable and computable models | Only computable models allowed |
| Has closed loops, and can have linear chains, of entailment | Only linear chains of entailment allowed |
| Has semantic and syntactic aspects | Only syntactic elements allowed |
| Synthesis can vary from analysis | Synthesis must be the inverse of analysis |
| Epistemology can vary from ontology | Epistemology must coincide with ontology |
Seen in this way, it becomes readily apparent that simple systems are a very restricted subset of systems, and that they are exceedingly constrained in comparison to complex systems. As such, simple systems require an inordinately rare (and rarefied) set of circumstances to exist. It is thus specious to presume that we can envision the world as composed of simple systems, or that we can understand the world via simple models alone.
Ramifications of Rosennean Complexity for Physics
Turing-computability encompasses the class of recursive functions. As it turns out, the formalism of state-based Newtonian physics is just such a recursive formalism.  What this entails is that state-based Newtonian physics (and by extension, relativistic physics and quantum mechanics as well ) are all within the realm of Turing-computability. The result is that these versions of physics are essentially adequate for modeling only simple systems; and conversely, are inadequate for modeling complex systems. 
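The recursive, state-based character of such a formalism can be made concrete: the entire description reduces to a rule, state(n+1) = f(state(n)), applied by rote, which is exactly what makes it simulable. Below is a minimal sketch of this recursion for a falling body; the integration scheme, step size, and numerical values are my own illustrative choices, not anything prescribed by the texts cited here.

```python
# Newtonian dynamics as a recursion rule governing state succession:
# the whole model is one function f iterated on a state (position, velocity).

def step(state, dt=0.01, g=-9.8):
    """One application of the recursion rule: (x, v) -> (x', v')."""
    x, v = state
    v = v + g * dt      # velocity update from the force law
    x = x + v * dt      # position update from the new velocity
    return (x, v)

state = (100.0, 0.0)    # dropped from 100 m, initially at rest
for _ in range(100):    # iterate the rule: pure state succession
    state = step(state)
# after 1 s, close to the analytic x = 100 - 9.8/2 = 95.1 m
```

Because the model is nothing but a computable function iterated on encoded states, it falls squarely within Turing-computability, which is the sense in which such physics can capture only simple systems.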
The notion that Newtonian-style physics is capable of describing physical reality is thus a tacit claim that the universe is simple, in Rosen’s sense. A more explicit version of this claim derives from Church’s Thesis. This (unprovable) thesis, based on Church’s lambda calculus, states essentially that if a function is effectively calculable in an intuitive sense, then the function is Turing-computable. This thesis has been widely exaggerated over the years to include not just “calculable” processes but any type of effective process - including anything in the physical world. Such “strong” or “material” versions of Church’s Thesis are explicit restatements of what is built into the formalism of Newtonian mechanics. A corollary of this thesis is that reductionism is a valid universal program in science.
However, the attempt to equate effectiveness with rote algorithm fails to include even most of mathematics, to the extent that most of mathematics is not formalizable. This is another way of understanding the consequences of Gödel’s Incompleteness theorems in Number Theory. So, Church’s Thesis already fails if it is extended only slightly beyond a very limited notion of “effective process”. In physics, the failure of material versions of Church’s Thesis shows up, for example, in the inability to solve N-body problems in closed form or by reductionistic methods. At root, the idea that the physical universe adheres to some variant of Church’s Thesis is no more than an errant belief. Along with casting off that belief, so too is abandoned any hope for the program of reductionism as a universal strategy in physics, since reductionism relies upon a hypothesis of universal fractionability, a feature which is restricted to simple systems.

A challenge in creating a physics for complex systems is to create a formalism that lies outside the realm of simulable recursive functions, and that is therefore not state-based and not fully computable in the Turing sense. The methodology of relational biology [34,35,36] and the relational models created by Rosen’s mentor, Nicolas Rashevsky, provide an important avenue. In Essays on Life Itself, Rosen outlines some further approaches to creating a “new physics”. The upshot is that we need not discard our Newtonian physics completely just because of its inability to fully capture complexity; rather, we need to enlarge physics. As Rosen says, “It does not say that we learn nothing about complex systems from simple models; it merely says we should widen our concept of what models are.”
References & Footnotes
EL: Rosen, R. 1998. Essays on Life Itself. Columbia University Press
LI: Rosen, R. 1991. Life Itself. Columbia University Press
AS: Rosen, R. 1985. Anticipatory Systems. Pergamon Press
FM: Rosen, R. 1978. Fundamentals of Measurement. Elsevier Science
Davis, M. 1982. Computability and Unsolvability. Dover.
Rashevsky, N. 1960. Mathematical Biophysics: Vol. II. 3rd edition. Dover.
 EL p. 292
 Judith Rosen, daughter of Robert Rosen and heir to his works, suggested the term “Rosennean”. My use of the phrase “Rosennean complexity” is not one Rosen ever used in his published writings, to my knowledge; however, I feel it is useful as a way to explicitly denote Rosen’s specific notion of complexity.
 EL p. 324: “I have been, and remain, entirely committed to the idea that modeling is the essence of science and the habitat of all epistemology.”
 EL p. 158-160; LI Ch. 3
 AS p. 421-422: “If a system surprises us, or does something we have not predicted, or responds in a way we have not anticipated; if it makes errors; if it exhibits emergence of unexpected novelties of behavior, we also say that the system is complex. In short, complex systems are those that behave counter-intuitively.”
 A relational property of X is a property based on a specified relation between X and something that is external to X. For example, “is next to” or “larger than” are relational properties. As far as I know, Rosen himself did not explicitly use the term “relational property” in describing complexity; however, I believe the term properly describes the situation. (see footnote 8, and AS p. 83)
 FM p.112
 AS p. 322: “This approach to complexity is novel in several ways. For one thing, it requires that complexity is not an intrinsic property of a system nor of a system description. Rather, it arises from the number of ways in which we are able to interact with the system.”
 FM p. 112: “Therefore, a system is simple to the extent that a single description suffices to account for our interactions with the system; it is complex to the extent that this fails to be true.”
 AS p. 424: “Hence, another characteristic feature of complex systems; they appear to possess a multitude of partial dynamical descriptions, which cannot be combined into one single description.”
 EL p. 336: “This property of possessing a largest model is a hallmark of simple systems or mechanisms.”
 LI p. 162-163
 LI p. 203-212
 EL p. 336: “It [possessing a largest model] is basically what allows a whole system to be expressible as software (i.e., as program and data) to an extraneous symbol-processor or mathematical machine, and hence to make the system simulable.”
 EL p. 292, 303
 LI p. 192: “if f is simulable, then there is a Turing machine T such that, for any word w in the domain of f, suitably inscribed on an input tape to T, and for a suitably chosen initial state of T, the machine will halt after a finite number of steps, with f(w) on its output tape.”
 EL p. 305
 EL p. 292, 306
 Note that we do not in general say that “a complex system is Turing-incomputable”. ‘Computability’ as a criterion is applied to the models, not to the system.
 EL p. 292
[20a] Intuitively, there is no way to make a simulable model into a nonsimulable model by altering or enlarging the set of algorithmic steps finitely. The resulting program is still algorithmic, and thus, still simulable.
[20b] A mundane analogy: consider the categorization of certain highway vehicles as “oversized”. This means that they exceed some defined legal dimensional (size) limit. (Note that “oversized” is thus a relational property.) Vehicles that are not categorized as oversized are constrained to specific well-defined limits of certain size characteristics. Conversely, a vehicle that has been categorized as “oversized” may exceed the legal limit by a little or a lot, and may do so in one or more size characteristics. “Oversized” states only a minimum criterion, and is therefore essentially an open-ended categorization.
 LI p. 9-10: “Let us suppose we can treat it [Number Theory] as a model…and ask what it can model. The question immediately arises: are there other mathematical formalisms too rich in entailment to be captured by Number Theory, and hence more complex than it? The answer, of course, is yes; in fact, we can iterate this process, obtaining more and more complex formalisms, indefinitely.”
 Davis, M. 1982.
 EL p. 35-37
 LI sec. 4I
 LI p. 104-105
 LI p. 102: “This is a fateful situation. Once we have partitioned the ambience into a system and its environment, and (following Newton) once we have encoded system into a formalism whose only entailment is a recursion rule governing state succession, we have said something profound about causality, and indeed about Natural Law itself. In brief, we have automatically placed beyond the province of causality anything that does not encode directly into a state transition sequence. Such things have become acausal, out of the reach of entailment in the formalism, and hence in principle undecodable from the formalism.”
 EL p. 304, 325
 EL p. 280: “If, and only if, a system is simple, which means that all of its models are computable or simulable, then this set of all models becomes a reductionist paradise - otherwise, not.”
 EL p. 78
 EL p. 52
 EL p. 324
 Rashevsky, N. 1960. Ch. 28-35
 LI Ch. 5, 10, 11
 EL p. 259-260
 EL p. 26-28
 EL p. 325