The Psychogenesis of Knowledge and Its Epistemological Significance

Cite as: Piaget J. (1980) The psychogenesis of knowledge and its epistemological significance. In: Piattelli-Palmarini M. (ed.) Language and learning: The debate between Jean Piaget and Noam Chomsky. Harvard University Press, Cambridge MA: 23–34. Available at https://cepa.info/3062


Curator’s note: Page numbering follows the 2006 reprint in: Beakley B. & Ludlow P. (eds.) The philosophy of mind: Classical problems/contemporary issues. Second edition. The MIT Press, Cambridge MA: 741–749.

Fifty years of experience have taught us that knowledge does not result from a mere recording of
observations without a structuring activity on the part of the subject. Nor do any a priori or innate
cognitive structures exist in man; the functioning of intelligence alone is hereditary and creates
structures only through an organization of successive actions performed on objects. Consequently, an
epistemology conforming to the data of psychogenesis could be neither empiricist nor preformationist, but
could consist only of a constructivism, with a continual elaboration of new operations and structures.
The central problem, then, is to understand how such operations come about, and why, even though they
result from nonpredetermined constructions, they eventually become logically necessary.
Empiricism
The critique of empiricism is not tantamount to negating the role of experimentation, but the “empirical”
study of the genesis of knowledge shows from the outset the insufficiency of an “empiricist”
interpretation of experience. In fact, no knowledge is based on perceptions alone, for these are always
directed and accompanied by schemes of action. Knowledge, therefore, proceeds from action, and all action
that is repeated or generalized through application to new objects engenders by this very fact a
“scheme,” that is, a kind of practical concept. The fundamental relationship that constitutes all
knowledge is not, therefore, a mere “association” between objects, for this notion neglects the active
role of the subject, but rather the “assimilation” of objects to the schemes of that subject. This
process, moreover, prolongs the various forms of biological “assimilation,” of which cognitive
assimilation is a particular case, as a functional process of integration. Conversely, when objects are
assimilated to schemes of action, there is a necessary “accommodation” to the particularities of these
objects (compare the phenotypic “accommodations” in biology), and this accommodation results from
external data, hence from experience. It is thus this exogenous mechanism that converges with what is
valid in the empiricist thesis, but (and this reservation is essential) accommodation does not exist in a
“pure” or isolated state, since it is always the accommodation of an assimilatory scheme; this
assimilation therefore remains the driving force of cognitive action.
These mechanisms, which are visible from birth, are completely general and are found in the various
levels of scientific thought. The role of assimilation is recognized in the fact that an “observable” or
a “fact” is always interpreted from the moment of its observation, for this observation always and from
the beginning requires the utilization of logico-mathematical frameworks such as the setting up of a
relationship or a correspondence, proximities or separations, positive or negative quantifications
leading to the concept of measure – in short, a whole conceptualization on the part of the subject that
excludes the existence of pure “facts” as completely external to the activities of this subject, all the
more as the subject must make the phenomena vary in order to assimilate them.
As for the learning processes invoked by the behaviorist empiricists on behalf of their theses, Inhelder,
Sinclair, and Bovet have shown that these processes do not explain cognitive development but are subject
to its laws, for a stimulus acts as such only at a certain level of “competence” (another biological
notion akin to assimilation). Briefly, the action of a stimulus presupposes the presence of a scheme,
which is the true source of the response (which reverses the SR schema or makes it symmetrical [S ⇄ R]).
Besides, Pribram has demonstrated a selection of inputs existing even at the neurological level.
Preformation
Is it necessary, then, to turn in the direction of the preformation of knowledge? I will return later to
the problem of innateness and will limit myself for the moment to the discussion of the hypothesis of
determination. If one considers the facts of psychogenesis, one notes first the existence of stages that
seem to bear witness to a continual construction. In the first place, in the sensorimotor period
preceding language one sees the establishment of a logic of actions (relations of order, interlocking of
schemes, intersections, establishment of relationships, and so on), rich in discoveries and even in
inventions (recognition of permanent objects, organization of space, of causality). From the ages of 2 to
7, there is a conceptualization of actions, and therefore representations, with discovery of functions
between covariations of phenomena, identities, and so forth, but without yet any concept of reversible
operations or of conservation. These last two concepts are formed at the level of concrete operations
(ages 7 to 10), with the advent of logically structured “groupings,” but they are still bound to the
manipulation of objects. Finally, around the age of 11 to 12, a hypothetico-deductive propositional logic
is formed, with a combinatorial lattice, “sums of parts,” algebraic four-groups, and so on.
However, these beautiful successive and sequential constructions (where each one is necessary to the
following one) could be interpreted as the progressive actualization (related to factors such as
neurological maturity) of a set of preformations, similar to the way in which genetic programming
regulates organic “epigenesis” even though the latter continues to interact with the environment and its
objects. The problem is therefore to choose between two hypotheses: authentic constructions with stepwise
openings onto new possibilities, or the successive actualization of a set of possibilities existing from the
beginning. First, let us note that the problem is similar in the history of science: are the clearly
distinct periods in the history of mathematics the result of the successive creations of mathematicians,
or are they only the achievement through progressive thematizations of the set of all possibilities
corresponding to a universe of Platonic ideas? Now, the set of all possibilities is an antinomic notion
like the set of all sets, because the set is itself only a possibility. In addition, today’s research
shows that, beyond the transfinite number “kappa zero” (which is the limit of predicativity), some
openings into new possibilities are still taking place, but are in fact unpredictable since they cannot
be founded on a combinatorial lattice. Thus, either mathematics is a part of nature, and then it stems
from human constructions, creative of new concepts; or mathematics originates in a Platonic and
suprasensible universe, and in this case, one would have to show through what psychological means we
acquire knowledge of it, something about which there has never been any indication.
This brings us back to the child, since within the space of a few years he spontaneously reconstructs
operations and basic structures of a logico-mathematical nature, without which he would understand
nothing of what he will be taught in school. Thus, after a lengthy preoperative period during which he
still lacks these cognitive instruments, he reinvents for himself, around his seventh year, the concepts
of reversibility, transitivity, recursion, reciprocity of relations, class inclusion, conservation of
numerical sets, measurements, organization of spatial references (coordinates), morphisms, some
connectives, and so on – in other words, all the foundations of logic and mathematics. If mathematics
were preformed, this would mean that a baby at birth would already possess virtually everything that
Galois, Cantor, Hilbert, Bourbaki, or Mac Lane have since been able to realize. And since the child is
himself a consequence, one would have to go back as far as protozoa and viruses to locate the seat of
“the set of all possibilities.”
In a word, the theories of preformation of knowledge appear, for me, as devoid of concrete truth as
empiricist interpretations, for the origin of logico-mathematical structures in their infinity cannot be
localized either in objects or in the subject. Therefore, only constructivism is acceptable, but its
weighty task is to explain both the mechanisms of the formation of new concepts and the characteristics
these concepts acquire in the process of becoming logically necessary.
Reflective Abstraction
If logico-mathematical structures are not preformed, one must, in contrast, go far back to discover their
roots, that is, the elementary functioning permitting their elaboration; and as early as the sensorimotor
stages, that is to say, much before language, one finds such points of departure (though without any
absolute beginning, since one must then go back as far as the organism itself; see the section on the
biological roots of knowledge). What are the mechanisms, then, that provide the constructions from one
stage to the other? The first such mechanism I will call “reflective abstraction.”
It is, in fact, possible to distinguish three different kinds of abstraction. (1) Let us call “empirical
abstraction” the kind that bears on physical objects external to the subject. (2) Logico-mathematical
abstraction, in contrast, will be called “reflective” because it proceeds from the subject’s actions and
operations. This is even true in a double sense; thus we have two interdependent but distinct processes:
that of a projection onto a higher plane of what is taken from the lower level, hence a “reflecting,” and
that of a “reflection” as a reorganization on the new plane – this reorganization first utilizing, only
instrumentally, the operations taken from the preceding level but aiming eventually (even if this remains
partially unconscious) at coordinating them into a new totality. (3) We will speak finally of “reflected
abstraction” or “reflected thought” as the thematization of that which remained operational or
instrumental in (2); this phase thus constitutes the natural outcome of (2) but presupposes in addition a
set of explicit comparisons at a level above the “reflections” at work in the instrumental utilizations
and the constructions in process of (2). It is essential, therefore, to distinguish the phases of
reflective abstraction, which occur in any construction at the time of the solution of new problems, from
reflected abstraction, which adds a system of explicit correspondences among the operations thus
thematized.
Reflective and reflected abstractions, then, are sources of structural novelties for the following
reasons: In the first place, the “reflecting” on a higher plane of an element taken from a lower level
(for example, the interiorization of an action into a conceptualized representation) constitutes an
establishment of correspondences, which is itself already a new concept, and this then opens the way to
other possible correspondences, which represents a new “opening.” The element transferred onto the new
level is then constituted from those that were already there or those that are going to be added, which
is now the work of the “reflection” and no longer of the “reflecting” (although initially elicited by the
latter). New combinations thus result which can lead to the construction of new operations operating “on”
the preceding ones, which is the usual course of mathematical progress (an example in the child: a set of
additions creating multiplication).[Note 1] As a rule, all reflecting on a new plane leads to and
necessitates a reorganization, and it is this reconstruction, productive of new concepts, that we call
“reflection”; yet well before its general thematization, reflection comes into action through a set of
still instrumental assimilations and coordinations without any conceptual awareness of structures as such
(this is to be found all through the history of mathematics). Finally, reflected abstraction or
retrospective thematization becomes possible, and although it bears only on preconstructed elements, it
naturally constitutes a new construction in that its transversal correspondences render simultaneous that
which was until now elaborated by successive longitudinal linkings (compare, in scientific thought, the
thematization of “structures” by Bourbaki).
Constructive Generalization
Abstraction and generalization are obviously interdependent, each founded on the other. It results from
this that only inductive generalization, proceeding from “some” to “all” by simple extension, will
correspond to empirical abstraction, whereas constructive and “completive” generalizations in particular
will correspond to reflective and reflected abstractions.
The first problem to be solved, then, is that of the construction of successive steps that have been
established in the preceding paragraphs. Now, each one of them results from a new assimilation or
operation aimed at correcting an insufficiency in the previous level and actualizing a possibility that
is opened by the new assimilation. A good example is the passage from action to representation due to the
formation of the semiotic function. Sensorimotor assimilation consists only of assimilating objects to
schemes of action, whereas representative assimilation assimilates objects to each other, hence the
construction of conceptual schemes. Now, this new form of assimilation already was virtual in
sensorimotor form since it bore on multiple but successive objects; it was then sufficient to complete
these successive assimilations by a simultaneous act of setting into transversal correspondence before
passing to the next level. But such an action implies the evocation of objects not presently perceived,
and this evocation requires the formation of a specific instrument, which is the semiotic function
(deferred imitations, symbolic play, mental image which is an interiorized imitation, sign language, and
so on, in addition to vocal and learned language). Now, sensorimotor signifiers already exist in the form
of cues or signals, but they constitute only one aspect or a part of the signified objects; on the
contrary, the semiotic function commences when signifiers are differentiated from what is thereby
signified and when signifiers can correspond to a multiplicity of things signified. It is clear, then,
that between the conceptual assimilation of objects between themselves and semiotization, there is a
mutual dependence and that both proceed from a completive generalization of sensorimotor assimilation.
This generalization embeds a reflective abstraction bearing on elements directly borrowed from
sensorimotor assimilation.
Likewise, it would be easy to show that the new concepts inherent in the levels of initially concrete,
then hypothetico-deductive operations proceed from completive generalizations as well. It is thus that
concrete operations owe their new abilities to the acquisition of reversibility, which has already been
prepared by preoperative reversibility; but the reversibility, in addition, requires a systematic
adjustment of affirmations and negations, that is to say, an autoregulation which, by the way, is always
working within the constructive generalizations (I will return to the subject of autoregulation in the
section on necessity and equilibration). As for the hypothetico-deductive operations, these are made
possible by the transition from the structures of “groupings” devoid of a combinatorial lattice (the
elements of which are disjoint) to the structures of the “set of components” embedding a combinatorial
lattice and full generalization of partitions.[Note 2]
These last advances are due to a particularly important form of constructive generalizations, which
consist of raising an operation to its own square or a higher power: thus, combinations are
classifications of classifications, permutations are seriations of seriations, the sets of components are
partitions of partitions, and so on.
Finally, let us call attention to a simpler but equally important form, which consists of generalizations
by synthesis of analogous structures, such as the coordination of two systems of reference, internal and
external to a spatial or kinematic process (the 11- to 12-year-old level).
The Biological Roots of Knowledge
What we have seen so far speaks in favor of a systematic constructivism. It is nonetheless true that its
sources are to be sought at the level of the organism, since a succession of constructions could not
admit of an absolute beginning. But before offering a solution, we should first ask ourselves what a
preformationist solution would mean biologically; in other words, what a priorism would look like after
having been rephrased in terms of innateness.
A famous author has demonstrated this quite clearly: it is Konrad Lorenz, who considers himself a Kantian
who maintains a belief in a hereditary origin of the great structures of reason as a precondition to any
acquisition drawn from experience. But as a biologist, Lorenz is well aware that, except for “general”
heredity common to all living beings or major groups, specific heredity varies from one species to
another: that of man, for instance, remains special to our own particular species. As a consequence,
Lorenz, while believing as a precondition that our major categories of thought are basically inborn,
cannot, for that very reason, assert their generality: hence his very enlightening formula according to
which the a prioris of reason consist simply of “innate working hypotheses.” In other words, Lorenz,
while retaining the point of departure of the a priori (which precedes the constructions of the subject),
sets aside necessity which is more important, whereas we are doing exactly the opposite, that is,
insisting on necessity (see the next section), but placing it at the end of constructions, without any
prerequisite hereditary programming.
Lorenz’s position is therefore revealing: if reason is innate, either it is general and one must trace it
back as far as the protozoa, or it is specific (species-specific or genus-specific, for instance) and
one must explain (even if it is deprived of its essential character of necessity) through which mutations
and under the influence of which natural selections it developed. Now, as research stands at present,
current explanations would be reduced for this particular problem to a pure and simple verbalism; in
fact, they would consist of making reason the product of a random mutation, hence of mere chance.
But what innatists surprisingly seem to forget is that there exists a mechanism which is as general as
heredity and which even, in a sense, controls it: this mechanism is autoregulation, which plays a role at
every level, as early as the genome, and a more and more important role as one gets closer to higher
levels and to behavior. Autoregulation, whose roots are obviously organic, is thus common to biological
and mental processes, and its actions have, in addition, the great advantage of being directly
controllable. It is therefore in this direction, and not in mere heredity, that one has to seek the
biological explanation of cognitive constructions, all the more so since, by the interplay of regulations
of regulations, autoregulation is eminently constructivist (and dialectic) by its very nature.[Note 3]
It is understandable, therefore, that while fully sympathizing with the transformational aspects of
Chomsky’s doctrine, I cannot accept the hypothesis of his “innate fixed nucleus.” There are two reasons
for this. The first one is that this mutation particular to the human species would be biologically
inexplicable; it is already very difficult to see why the randomness of mutations renders a human being
able to “learn” an articulate language, and if in addition one had to attribute to it the innateness of a
rational linguistic structure, then this structure would itself be subject to a random origin and would
make of reason a collection of mere “working hypotheses,” in the sense of Lorenz. My second reason is
that the “innate fixed nucleus” would retain all its properties of a “fixed nucleus” if it were not
innate but constituted the “necessary” result of the constructions of sensorimotor intelligence, which is
prior to language and results from those joint organic and behavioral autoregulations that determine this
epigenesis. It is indeed this explanation of a noninnate fixed nucleus, produced by sensorimotor
intelligence, that has been finally admitted by authors such as Brown, Lenneberg, and McNeill. This is
enough to indicate that the hypothesis of innateness is not mandatory in order to secure the coherence of
Chomsky’s beautiful system.
Necessity and Equilibration
We still have to look for the reason why the constructions required by the formation of reason become
progressively necessary when each one begins by various trials that are partly episodic and that contain,
until rather late, an important component of irrational thought (non-conservations, errors of
reversibility, insufficient control over negations, and so on). The hypothesis naturally will be that
this increasing necessity arises from autoregulation and has as its counterpart the increasing, parallel
equilibration of cognitive structures. Necessity then proceeds from their “interlocking.”
Three forms of equilibration can be distinguished in this respect. The most simple, and therefore the
most precocious, is that of assimilation and accommodation. Already at the sensorimotor level, it is
obvious that in order to apply a scheme of actions to new objects, this scheme must be differentiated
according to the properties of these objects; therefore one obtains an equilibrium aimed at both
preserving the scheme and taking into account the properties of the object. If, however, these properties
turn out to be unexpected and interesting, the formation of a subscheme or even of a new scheme has to
prove feasible. Such new schemes will then necessitate an equilibration of their own. But these
functional mechanisms are found at all levels. Even in science, the assimilation between linear and
angular speeds involves two joint operations: common space-time relationships are assimilated while one
accommodates to these nonetheless distinct notions; similarly, the incorporation of open systems into
general thermodynamic systems requires differentiating accommodations as well as assimilations.
A second form of equilibrium imposes itself between the subsystems, whether it is a question of
subschemes in a scheme of action, subclasses in a general class, or subsystems of the totality of
operations that a subject has at his disposal, as for example, the equilibration between spatial numbers
and measurement during calculations in which both can intervene. Now, since subsystems normally evolve at
different speeds, there can be conflicts between them. Their equilibration presupposes in this case a
distinction between their common parts and their different properties, and consequently a compensatory
adjustment between partial affirmations and negations as well as between direct or inverted operations,
or even the utilization of reciprocities. One can see, then, how equilibration leads to logical
necessity: the progressive coherence, sought and finally attained by the subject, first comes from a mere
causal regulation of actions of which the results are revealed, after the fact, to be compatible or
contradictory; this progressive coherence then achieves a comprehension of linkings or implications that
have become deducible and thereby necessary.
The third form of equilibration relies upon the previous one but distinguishes itself by the construction
of a new global system: it is the form of equilibration required by the very process of differentiation
of new systems, which then requires a compensatory step of integration into a new totality. Apparently,
there is here a simple balance of opposing forces, the differentiation threatening the unity of the whole
and the integration jeopardizing the necessary distinctions. In fact, the originality of the cognitive
equilibrium (and, by the way, further down in the hierarchy, also of organic systems) is to ensure,
against expectations, the enrichment of the whole as a function of the importance of these
differentiations and to ensure their multiplication (and not only their consistency) as a function of
intrinsic (or having become such) variations of the totality of its own characteristics. Here again one
clearly sees the relationship between equilibration and progressive logical necessity, that is, the
necessity of the terminus ad quem resulting from the final integration or “interlocking” of the systems.
In summary, cognitive equilibration is consequently “accretive” (majorante); that is to say, the
disequilibria do not lead back to the previous form of equilibrium, but to a better form, characterized
by the increase of mutual dependencies or necessary implications.
As for experimental knowledge, its equilibration admits, in addition to the previous laws, of a
progressive transfer (passage) from the exogenous to the endogenous, in the sense that perturbations
(falsifications of expectations) are first nullified or neutralized, then progressively integrated (with
displacement of equilibrium), and finally incorporated into the system as deducible intrinsic variations
reconstructing the exogenous by way of the endogenous. The biological equivalent of this process (compare
“from noise to order” in von Foerster)[Note 4] is to be sought in the “phenocopy,” as I have endeavored
to interpret and to generalize this notion in a recent paper.[Note 5]
Psychogenesis and History of Science
As Holton said, one can recognize certain convergences between psychogenesis and the historical
development of cognitive structures;[Note 6] this is what I will attempt to define in an upcoming work
with the physicist Rolando Garcia.
In some cases, before seventeenth-century science, one can even observe a stage-by-stage parallelism. For
instance, in regard to the relationship between force and movement, one can distinguish four periods: (1)
the Aristotelian theory of the two motors with, as a consequence, the model of antiperistasis; (2) an
overall explanation in which force, movement, and impetus remain undifferentiated; (3) the theory of
impetus (or élan), conceived by Buridan as a necessary intermediary between force and movement; and (4) a
final and pre-Newtonian period in which impetus tends to be conflated with acceleration. Now, one notes a
succession of four very similar stages in the child. The first is that in which the two motors
remain rather systematic as residues of animism, but with a large number of spontaneous examples of
antiperistasis (and this often occurs in very unexpected situations, and not only for the movement of
projectiles). During a second stage, an overall notion comparable to “action” intervenes and can be
symbolized by mve, in which m represents the weight, v the speed, and e the distance covered. During a
third period (ages 7 to 10), the “impetus” in the sense of Buridan’s middle term spontaneously appears,
but with, in addition, the power of “passing through” motionless intermediaries by passing through their
“interior” when a movement is transmitted through their mediation. Finally, in a fourth phase (around the
age of 11 to 12), the first inklings of the notion of acceleration appear.
For larger periods of history, obviously one does not find any stage-by-stage parallelism, but one can
search for common mechanisms. For instance, the history of Western geometry bears witness to a process of
structuration whose steps are, first, a centration by Euclid on simply intrafigural relationships, then a
construction of interfigural relationships with Cartesian coordinate systems, and then finally a
progressive algebrization by Klein. Now one finds, on a small scale, a similar process in children, who
naturally begin with the “intrafigural,” but who discover around their seventh year that in order to
determine a point on a plane, one measurement is not sufficient, but two are necessary, and they must be
orthogonally arranged. After this “interfigural” stage (which is necessary also for the construction of
horizontal lines) follows what we can call the “transfigural” stage, in which the properties to be
discovered cannot be read off a single diagram, but necessitate a deduction or a calculation (for
example, mechanical curves, relative motions, and so on).
Now, these analogies with the history of science assuredly speak in favor of my constructivism.
Antiperistasis was not transmitted hereditarily from Aristotle to the little Genevans, but Aristotle
began by being a child; for childhood precedes adulthood in all men, including cavemen. As for what the
scientist keeps from his younger years, it is not a collection of innate ideas, since there are tentative
procedures in both cases, but a constructive ability; and one of us went so far as to say that a
physicist of genius is a man who has retained the creativity inherent to childhood instead of losing it
in school.
Notes
1. Considering the number of these additions and not only their result.
2. Let us recall that completive generalization is a constructive process essential in mathematics: for
example, the transition from groupoids to semigroups, then from there to monoids, then to groups, to
rings, and to fields.
3. It is true that autoregulation is in part innate, but more in terms of functioning than in terms of
structures.
4. H. von Foerster, “On Self-organizing Systems and Their Environments,” in Self-organizing Systems, ed.
M. Yovits and S. Cameron (Elmsford, N.Y.: Pergamon Press, 1960). https://cepa.info/1593
5. J. Piaget, Adaptation vitale et psychologie de l’intelligence: Sélection organique et phénocopie
(Paris: Hermann, 1974).
6. G. Holton, Thematic Origins of Scientific Thought (Cambridge, Mass.: Harvard University Press, 1973),
p. 102.


Publication curated by Alexander Riegler
