Interfaces, Impaired Merge, and the Theory of the Frustrated Mind
Peter Kosta, University of Potsdam, Germany
Diego Krivochen, University of Reading, UK
Abstract: In this paper we will reanalyze the basic structure-generation operation in Minimalist syntax, namely, Merge. We will present a new approach to generation in a wide sense, focusing on the Faculty of Language (FLN) and its interfaces, C-I and S-M, building on work by Krivochen (2011a et seq.), Kosta & Krivochen (2014), and Krivochen & Kosta (2013). Our main claim is that there is nothing more to “syntax” than Merge and, going further, that the so-called “syntactic component” of the Faculty of Language is not the only computational system in the mind-brain: as a consequence, the term “syntax” has wider scope than generally recognized. By doing so, we attempt to sustain the claim that Merge is a third-factor, principled operation (Chomsky, 2005). We will analyze and decompose some current proposals (Chomsky, 2005, 2007; Adger, 2011; Hornstein & Pietroski, 2009; Boeckx, 2010) and other theoretically possible options to rule out any stipulation and, optimally, arrive at a Radically Minimalist definition of Merge: a system-neutral, non-stipulative definition that can capture the properties of recursion all throughout the mind-brain (understood as a complex physical system), while retaining descriptive and explanatory adequacy and going beyond, towards justification in both biological and computational terms. Furthermore, we will try to show that: (1) FLN only comprises External Merge (both monotonic and non-monotonic, see Uriagereka, 2002), and only humans can make use of the finite means provided by sound-meaning interfaces to weakly generate potentially infinite pairs Exp = (π, λ) with information interpretable by the performance systems. (2) FLB displays the major human language-specific property, namely, the linking of sound to meaning (and vice versa) in an intensional way, via unbounded computational manipulation of symbolic representations. While the former might be shared with non-human species (e.g., higher primates), the latter is hypothesized to be human-specific, related to the structure of the genome. (3) Only humans’ computational capacities, dynamically ranging from finite-state grammars (Markovian processes) to Turing-computability, allow full recursion (i.e., not only head-tail recursion or so-called true recursion, but also inter-sentential recursion via referential dependencies and cross-clausal connectives: conjuncts and disjuncts, Quirk et al., 1985), and
are thus able to account for the whole range of sound-meaning relations and the internal structure of each interface. In the second part of the paper, some problems of molecular genetics research on language impairments and speech disorders will be discussed, namely molecular genetics methods with special reference to Specific Language Impairment (SLI) and verbal dyspraxia caused by mutations, deletions, insertions or duplications of the FOXP1 or FOXP2 genes (Watson, 2003). The crucial question we try to address is whether the computational system responsible for the so-called language faculty in the narrow sense (FLN, Hauser, Chomsky & Fitch, 2002) is language-specific or shares any of its properties with other, independent cognitive modules, both in terms of computational properties and neurological bases. Some new findings in neurolinguistics seem to point in the direction that at least some types of language impairments are not grammar-specific but rather caused by abnormal development of the brain structures that constitute the procedural memory system, where computations are performed in real time. Thus, we would expect SLI to limit the computational power of FLN to simpler grammars (e.g., Markovian, context-free) and to impoverish interface-mapping operations, thus “getting in the way” of sound-meaning relations and forcing alternative ways to materialize structured meaning.

Keywords: Merge, Generation, Interfaces, Language Design, Molecular genetics, FOXP2

1. On Merge (and its interpretative consequences)

In this section we will characterize the operation Merge as we assume it, and describe the consequences it has for the design of the faculty of language. Moreover, we will briefly compare and contrast the properties of Merge in humans and non-humans. In this paper we will adopt a “mirror view” of the usual justification for Merge: we will not take the syntacticocentric stance that “Merge is a function, thus, it takes arguments” (e.g., Collins & Stabler, forthcoming); instead, we will argue that objects merge in a free, a priori unbounded manner, in order to be interpretable for the interface systems. The advantages of such a view include the elimination of the need for a trigger for Merge within the syntactic component (e.g., a feature to be checked, as in Wurmbrand, forthcoming), which would imply the introduction of yet another element into the theory: all triggers are replaced by the notion of interface interpretability, and Agree-constrained Merge, by a freely-applying structure-building operation. Let us consider an example, starting from the following lexical array, in turn a proper subset of the mental lexicon:

1) Array = {√love, T}
Of course, this is a sub-array, and we could not generate a full derivation from it. However, it will serve our present purposes. Roots, being semantically underspecified (see, e.g., Panagiotidis, 2013), are not interpretable at LF: they denote neither a sortal entity nor an entity anchored in time. Thus, Merge is a last resort to save the derivational step from collapsing at the interfaces (notice how our notion of last resort differs from that of Wurmbrand, forthcoming): the application of a structure-building algorithm is justified by interface necessity:

2) Merge (T, √love) = {T, √love}

The operation is not trivial: Merge has created a structural dependency between both elements, which is to be read at the semantic component as the specification of the time coordinates at which the extension of the root is to be interpreted. A semantic substance extending in time is read off as an event, and this little piece of derivation is thus partially legible at LF. Unlike mainstream Minimalism, we do not need to appeal to the existence of an Edge Feature (Chomsky, 2008) or similar intra-theoretical tools to make an element “mergeable”: all symbolic elements are mergeable a priori, the interfaces deciding whether they are in fact merged for the purposes of a particular derivation. Moreover, there is a strong theorematic claim with respect to the nature of those elements:

3) An element enters a linguistic derivation if and only if it can be exhaustively decomposed into semantic and phonological features.

Such a claim has two consequences:
– There are no formal features (EPP, Wh-, Case), insofar as either they are not necessary (EPP, for instance: see Krivochen & Kosta, 2013 for extensive discussion) or they can be subsumed under interface configurations (e.g., Case, as in the system developed in Krivochen, 2011, 2012).
– There are derivations that are not linguistic. Linguistic derivations are characterized by the convergence of a structure-building algorithm and two external components, responsible for sound and meaning (a claim traceable back to Aristotle in ancient times, and to Saussure in modern times). However, the absence of any of those components, or of an interface between any two of them, does not preclude the existence of derivations as sequences of applications of structure-building / mapping algorithms.
The second consequence, which derives from the specification “a linguistic derivation” in (3), in turn leads us to locate the specificity of human language not in UG, whatever its content might be (Krivochen & Kosta, 2013: 59 ff.), but in the convergence of the three systems, call them Syn, Sem, and Phon. Given this scenario, let us exemplify the possible intersections between systems:

4) Sem ∩ Phon = interjections, animal calls (e.g., vervet monkey calls, see Demers, 1988)
Sem ∩ Syn = conceptual structures (see Fodor, 1970; Jackendoff, 2002; Mateu Fontanals, 2005; Uriagereka, 2012, 2014)
Syn ∩ Phon = musical capacity, structured birdsong (Uriagereka, 2012: Chapter 6)
Syn ∩ Sem ∩ Phon = natural language

Interestingly, only the intersection of Syn, Sem, and Phon gives rise to natural language, which is exclusively human. However, the rest of the partial intersections are shared with non-human species, including conceptual structures: this claim is based on recent experiments in which non-primates, like certain species of crows, have displayed the capacity to plan actions with a goal and to establish a hierarchy between needed actions; see Taylor et al. (2010); Taylor & Clayton (2012). If we adopt a wide definition of “syntax” that does not take language as primary and the rest of the mind’s faculties as parasitic on it (despite evolutionary traits), as Chomsky (2009: 26) explicitly claims when taking “visual arrays as lexical items” (a position for which there is no biological or neurological evidence), then syntax equals a structure-building function, call it concatenation, that applies throughout the mind, in more than one faculty. Planning ahead is a syntactic mental operation in which events are subjacent: an event eₙ cannot take place unless the subject somehow creates the conditions for it to occur, at eₙ₋₁. Subjacent dependencies between events in the time continuum are clearly syntactic; even if they do not display full Turing-computability, the mental grammar in charge of simple event planning is at least Markovian, with eₙ depending entirely locally on eₙ₋₁ and determining in turn the characteristics of the system at eₙ₊₁. With respect to isolated sound-meaning relations, no syntax intervening, we can cite as an example the calls of vervet monkeys, as summarized by Demers (1988: 320), who also includes other species we will not deal with here for space reasons:
Table 1: Sound-meaning correspondences in vervet monkeys (Demers, 1988: 320). [Table not reproduced here.]
The signals in Table 1 display sound-meaning relations, but no syntax, since they appear in isolation (i.e., without meaningful context) and are triggered by elements of the context (i.e., there is no planning, thus no conceptual structure). Phon ∩ Sem, signs without combination or hierarchy, is thus attested in non-human communication. This said, the derivational dynamics in real time we assume are the following (a minimal computational sketch is given after example (6) below):

Merge (α, β, …, n) = {α, β, …, n}
Analyze_IL {α, β, …, n} [is {α, β, …, n} fully interpretable by the interface level IL?]
(Transfer {α, β, …, n} to IL if Analyze_IL results in convergence at IL)

Contrary to Hauser, Chomsky & Fitch (2002) and much related work, we hold that language specificity does not rely on the characteristics of its structure-building algorithm, which is, in our view and that of many other researchers, shared with other faculties (e.g., mathematics, music, event organization, among others; see Jackendoff & Lerdahl, 2004; Levy, 2010; Dehaene, 1999 for references and extensive discussion of each domain). On the contrary, specificity is given by the interaction between three computationally independent systems, as an emergent property of language as a complex system (see Boccara, 2002, for discussion of the properties of complex systems at an introductory level). That interaction, which gives rise to a dynamic system, requires bidirectional information flow (contra traditional Minimalism’s unidirectional information flow from the syntax to the interfaces, inspired by Fodorian modularism; see Fox, 1999 for an early minimalist proposal of semantics-to-syntax information flow), such that the S-M and C-I components have access to the syntactic derivation and evaluate each output of structure-building operations, in tune with the interface-driven trigger for Merge we have developed above.

2. Computational Complexity and Biolinguistic Considerations

The architecture we have begun to sketch above leads us to reconsider the role of Sem and Phon in linguistic derivations. Syn consists of a single generative operation but no substantive elements; that is why we will focus on the role of the substantive components. Unless empirical evidence forces us otherwise, we will assume that both mental components attempt to maximize the conservation of lexical and structural information throughout a derivation (cf. Lasnik, Uriagereka & Boeckx, 2005; Emonds, 1976), such that both Sem and Phon aim to minimize information loss. In a claim related to recent developments in Survive Minimalism (Stroik & Putnam, 2013), and thoroughly developed within Radical Minimalism (Krivochen, 2013a, b), we will describe this situation as an anti-entropic approach to derivations: the successive application of structure-building operations should increase the informational load for the interfaces and/or minimize entropy through redundancy (see Shannon, 1948 for discussion), overtly manifested in morphological agreement:
5) Haec bonae puellae (Latin)
   these.F.PL.NOM nice.F.PL.NOM girls.F.PL.NOM

However, there are less obvious ways of introducing redundancy in a linguistic derivation so that the operation Transfer does not involve information deletion. For instance, overt Top fronting is redundant with a configuration in which a certain constituent is given phonological prominence, as in (6):

6) a. I love MARY (Top in situ)
   b. Mary, I love (fronted Top)
   a'. Amo A MARÍA (Spanish)
   b'. A María, la amo
   a''. Ein Buch hat er gelesen (German)
   b''. Er hat ein Buch gelesen
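To make the derivational dynamics above concrete, the following minimal Python sketch implements free Merge plus an interface-driven Analyze/Transfer cycle. The lexicon, the feature inventory, and the toy interpretability test are our own illustrative assumptions, not part of any cited formalism; the point is only that no syntax-internal trigger is needed: structure is built freely and evaluated by the interfaces.

```python
# A minimal sketch of Merge -> Analyze_IL -> Transfer (assumptions ours).
LEXICON = {
    "√love": {"sem": {"root"}},   # semantically underspecified root
    "T":     {"sem": {"time"}},   # procedural element: temporal anchoring
}

def merge(*objs):
    """Free, a priori unbounded, n-ary Merge: plain set formation."""
    return frozenset(objs)

def features(obj):
    """Collect the semantic features of a (possibly nested) object."""
    if isinstance(obj, str):
        return LEXICON[obj]["sem"]
    return set().union(*(features(o) for o in obj))

def analyze_lf(obj):
    """Toy Analyze_IL for LF: a root is legible only if its extension
    is anchored in time (the {T, √love} case discussed above)."""
    fs = features(obj)
    return "root" in fs and "time" in fs

def transfer(obj):
    """Transfer the smallest fully interpretable unit as soon as possible."""
    if isinstance(obj, frozenset):
        for sub in obj:
            if not isinstance(sub, str) and analyze_lf(sub):
                return sub        # a smaller convergent unit is preferred
    return obj if analyze_lf(obj) else None   # None = no convergence yet

step = merge("T", "√love")        # 2) Merge(T, √love) = {T, √love}
print(analyze_lf("√love"))        # False: the root alone collapses at LF
print(analyze_lf(step))           # True: an event reading is available
print(transfer(step) == step)     # True: the whole set is the minimal unit
```

Note that transfer targets the smallest convergent unit, anticipating the “transfer the smallest fully interpretable unit as soon as you can” principle developed in section 3.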
Each language, needless to say, favors different strategies for displacement: the LF-PF tension is not displayed equally in all natural languages. In this sense, we believe Chomskyan phase theory is incorrect: according to Chomsky, once a phase head is merged, its complement undergoes transfer to both interfaces, whereas the derivational dynamics we have proposed allow different interpretative systems to access the working area separately, to request (or not) that certain operations be applied, and each system decides when to take what it can minimally read. PF minimally legible units do not need to coincide with LF units, which is essential in our real-time computational proposal. Entropy (i.e., information lost as the time variable increases its value) is thus minimized through inflectional morphology, but also through syntactic configuration, as (5) and (6), respectively, show. The relevant question at this point is “what kind of information is to be conserved?” Following the proposal made in Krivochen (2013a, b), in turn based on Uriagereka (2012), Taylor et al. (2011), Jackendoff (2002), among others, we claim that semantic information is conserved throughout the derivation. Neurocognitive evidence has been provided in favor of the existence of conceptual structures, which help organize sensorial information as well as make it fit with previously available information, by establishing relations between them (figure-ground among the most important, but by no means the only one). Those relations, insofar as entities are related and located in (perceptual, conceptual) hierarchies, are syntactic in a strict sense. That is, there is a structure-building algorithm, but those structures are neither linguistic, nor need they be: language enters the scene only when:
7) a. There is no one-to-one relation between entities of the phenomenological world and verbal expressions
   b. There are sound-meaning relations, such that phonological exponents materialize structure
   c. Computational complexity can go beyond strict linearity in recursively enumerable types of dependencies

(7a) is crucial, insofar as natural languages are never triggered by a stimulus-response mechanism: a human can see a snake and shout “there’s a snake!”, or anything else he wants, or just do nothing. Contrarily, the vervet monkey has no choice but to respond to the visual stimulus “snake” with a particular call. A one-to-one relation implies, in this case, that there is a subjacency relation between events in the phenomenological world and linguistic expressions, such that:

8) e → Exp

Where e is an event in the phenomenological world, → is an antisymmetric relation that we can identify as “triggers”, and Exp is an unstructured signal conveying some fixed semantic content (e.g., “climb up a tree”), which cannot be tampered with. The event is in these cases both a necessary and a sufficient condition for the expression to be uttered. On the contrary, as we claimed in the introduction, the human computational system interfaces with both a sound system and a meaning system, which is in itself biologically unique. Moreover, the computational power of the structure-building operation in humans goes well beyond strictly linear, subjacent representations of events in the hic et nunc, reaching context-sensitive dependencies, two levels higher in the Chomsky Hierarchy (see Chomsky, 1956):
Figure 1: The Chomsky Hierarchy (CH)
Chomsky formulated the hierarchy in the form of a theorem (1956: 143):

9) “THEOREM 1. For both grammars and languages, type 0 ⊇ type 1 ⊇ type 2 ⊇ type 3”
Where Type 0 corresponds to Turing machines; Type 1, to Linear Bounded Automata; Type 2, to Push-Down Automata; and Type 3, to Markov models.

Despite recent arguments (Watumull, 2012; Watumull et al., 2014) in favor of a uniform Turing-machine mind, considerations of the computational complexity of natural languages (Joshi, 1985) as well as memory limitations in biological systems lead us to claim that the upper bound for natural languages, at least within the CH, is to be found at (mild) context-sensitivity, a position we have argued for in Krivochen (2014). (Footnote 1: This clarification is highly relevant insofar as it is possible that conceptual structures are beyond Turing-computation, as suggested by Uriagereka (2012: 7) and developed in Krivochen (2013a, b), without resorting to the notion of hypercomputation, but rather to quantum models of cognition.) In any case, the relevant matter is that human language displays discontinuous dependencies, which cannot be captured by a purely linear system, a Markov model, or a Lindenmayer grammar, as we can see in (10) (condition (7c)):

10) a. What_i did you think that Mary claimed to have lost t_i? (Wh-interrogative)
    b. Either [S1] or [S2] (discontinuous dependency between [either] and [or], cf. *‘Either S1 nor S2’)
    c. (…) weil er [den Patienten]_i [ohne PRO vorher e_i zu untersuchen] t_i operierte (scrambling + parasitic gap)

As Chomsky (1957) succinctly argued, dependencies of the kind in (10) cannot be accounted for via finite-state models (corresponding to Markov processes) insofar as these have no memory at all, and cannot “see” or influence anything beyond the immediately next derivational step (thus, long-distance dependencies are not formulable as Markov processes). Locality conditions of the kind developed in transformational grammar as well as in non-transformational models (HPSG, LFG, among others) also argue against a pure finite-state approach to natural languages. However, finite-state grammars of the kind Σ, F (where Σ is a set of initial strings, and F a set of terminal strings), with no transformational component (and equivalent to a pure Markov model), are not absent in biological systems, as Prusinkiewicz & Lindenmayer (1990) show with the development pattern of the bacterium Anabaena catenula. Consider the following derivation, where a and b represent cytological states of the cell, and l and r indicate cell polarity:

11) ω: ar
    p1: ar → al br
    p2: al → bl ar
    p3: br → ar
    p4: bl → al

Notice that the form of the grammar is simply Σ → F, that is, “rewrite Σ as F”, very much in line with the first phrase structure models (Chomsky, 1957, 1965). The same kind of grammar, always with the
added assumption that all rules apply simultaneously (Prusinkiewicz & Lindenmayer, 1990: 3), is also useful to provide a descriptive generative (i.e., explicit) procedure for amino acids. We will take only Phenylalanine (Phe) and Leucine (Leu) (Smith, 2008):

12) Grammar: Σ, F, where F = 3 (that is, Σ is always rewritten as three terminals)
    Phe → U3; U2C (i.e., UUU, UUC; U = uracil, A = adenine, G = guanine)
    Leu → U2A; U2G (i.e., UUA, UUG)

While this does not mean that RNA’s structure is itself finite-state, it does show that the descriptive limits of Markov models go well beyond what was initially considered in traditional transformational grammar. In other words, the fact that an object X is modelable by means of a grammar G does not necessarily mean that X has the computational complexity of G: X can be more complex than G (as in the case of DNA) or simpler (as in the case of morpho-phonology, whose rules are context-sensitive, e.g., /a/ → [ã] / _[+nasal], but whose structure is finite-state, as argued by Idsardi & Raimy, in press; Uriagereka, 2012; Krivochen, 2013a, b). Moreover, if a link with linguistic structure is found, this provides a further basis for a new approach to the biolinguistic enterprise, from the point of view of the formal grammars that can be used to model different aspects of the biological-physical world.
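The grammar in (11) is directly implementable. The following sketch is our own illustration (Prusinkiewicz & Lindenmayer present the formalism, not this code): all rules apply simultaneously at each step, and each rewrite consults nothing beyond the current symbol, which is what gives the system its finite-state character. The dictionary at the end restates (12) in the same Σ → F format.

```python
# L-system (11) for Anabaena catenula: a/b = cytological state,
# l/r = cell polarity; every rule applies simultaneously at each step.
RULES = {
    "ar": ["al", "br"],   # p1
    "al": ["bl", "ar"],   # p2
    "br": ["ar"],         # p3
    "bl": ["al"],         # p4
}

def develop(axiom, steps):
    filament = [axiom]                       # ω: ar
    for _ in range(steps):
        # one simultaneous pass of all rules per developmental step
        filament = [new for cell in filament for new in RULES[cell]]
    return filament

print(develop("ar", 1))   # ['al', 'br']
print(develop("ar", 2))   # ['bl', 'ar', 'ar']
print(develop("ar", 3))   # ['al', 'al', 'br', 'al', 'br']

# (12) in the same format: each amino acid rewrites as a fixed set of
# three-terminal strings.
CODONS = {"Phe": ["UUU", "UUC"], "Leu": ["UUA", "UUG"]}
print(CODONS["Phe"])      # ['UUU', 'UUC']
```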
2.1 What exactly is a dynamical frustration?

Binder (2008) claims that the main characteristic of dynamical systems (which sometimes appear very different at a surface level; take heart rate and climatic conditions as examples) is that they display a fundamental tension between global and local tendencies, which is called a dynamical frustration. These frustrations arise, for example, in a triangular lattice under the condition that all three spins be antialigned (that is, aligned in opposite directions) with respect to one another: three neighboring spins cannot be pairwise antialigned, and thus a frustration arises (Moessner & Ramirez, 2006: 25-26; Binder, 2008: 322). Binder goes even further, positing that a complex system where no frustration arises will “either settle to equilibrium or grow without bounds” (2008: 322). Equilibrium, in this particular case, is not balance, which could be, in principle, desirable: we are not referring to an optimal organization of matter or distribution of forces, but to a complete lack of activity (for instance, a pendulum which, after a while, stops moving altogether). Crucially, a system can display opposing tendencies at different scales: local and global tendencies may result in a frustration as well. A relevant example is the well-known Lorenz attractor (13a, b) (which describes the behavior of a chaotic system), where there are opposing clockwise and counter-clockwise tendencies in the 3-D phase space (13b is taken from Binder, 2008: 322):
13) [Figure: the Lorenz attractor; (a) 3-D phase-space view, (b) 2-D projection (from Binder, 2008: 322)]
In the 2-D figure (13b) we can see opposing tendencies between “stretching out and folding”, resembling centripetal and centrifugal forces pulling the attractor in opposite directions. As the attractor “grows” in each dimension, centrifugal forces can informally be said to prevail. This kind of frustration, exemplified by chaotic systems, is called a geometrical frustration. There is another kind of frustration, which arises when global and local tendencies are opposing in nature: so-called scale frustration. Binder’s example is a clockwise global tendency with local counter-clockwise cycles, as in (14):

14) [Figure: a global clockwise cycle containing local counter-clockwise cycles]
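The opposing tendencies in (13) can be reproduced numerically. The sketch below is our own illustration (not Binder’s figure): it integrates the Lorenz system with its classic parameters using a simple forward-Euler scheme, and the resulting trajectory neither settles into equilibrium nor grows without bounds, which is exactly the frustrated behavior described above.

```python
# Forward-Euler integration of the Lorenz system (parameters assumed:
# the classic sigma = 10, rho = 28, beta = 8/3).
def lorenz_trajectory(steps=10000, dt=0.001, state=(1.0, 1.0, 1.0)):
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
    x, y, z = state
    points = []
    for _ in range(steps):
        dx = sigma * (y - x)      # local pull toward the diagonal
        dy = x * (rho - z) - y    # global stretching term
        dz = x * y - beta * z     # folding back toward the attractor
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        points.append((x, y, z))
    return points

traj = lorenz_trajectory()
# The orbit stays bounded but never repeats or comes to rest:
# no equilibrium, no unbounded growth.
print(traj[-1])
```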
As we have argued in Krivochen (2013a, b, 2014), this kind of frustration might be essential when considering the cognitive processes that take place in language processing, both production and interpretation, as a core property of derivations, particularly phrase structure and the syntax-semantics interface in relation to “counter-cyclic” phonological finite-state computations. What is more, we have argued in those works, and will argue here (going deeper into the line of Uriagereka, 2012: Chapter 6; in press), that the concept of frustration is also of key importance when trying to situate language in relation to other cognitive capacities. Summarizing much discussion in the works mentioned above, the mind displays, at a global level, an architectural frustration, a tension between finite-state processes and higher levels of computation (say, mildly context-sensitive dependencies). While some systems work with only one of those kinds of grammars (e.g., interjections are purely Markovian, and so are sound sequences without a tonal center, as in whole-tone scales), others, like human language, resort to both
types of computations, sometimes co-existing in the same phrase marker. This claim leads us to the second kind of frustration, the one displayed between semantic-conceptual structures and materialization possibilities: a local derivational frustration. Spell-Out can then be understood as the dynamical Markovization of structure, as argued for in Idsardi & Raimy (in press), Uriagereka (2012, 2014) and Krivochen (2013a, b, 2014): that is, the conversion of computationally complex, structured symbolic representations into linear strings of sounds necessarily subsequent to one another (the linear character of the linguistic signifié noticed by Saussure). It is crucial to point out that we are not dealing with two qualitatively different kinds of frustrations; they just vary in scale: one is found all throughout mental computations, the other, more specific, is at the heart of linguistic derivations. With the “syntactic component” reduced and simplified to a single structure-building operation applying cross-modularly (see Krivochen & Kosta, 2013, in press; Krivochen, to appear, for extensive discussion and empirical support for the idea that structure-mapping operations, such as Move-α or Copy, can be eliminated in favor of Displacement-as-Remerge-from-NUM approaches; see also Stroik & Putnam, 2013 for a feature-driven alternative to remerge operations), the role of the interfaces becomes crucial; in fact, as Marantz (1995: 380) points out, the burden of explanation had already begun to shift from a highly specified computational component to the interpretative interfaces and their legibility conditions in the mid-90s. (Footnote 3: …before being displaced again in the first years of the twenty-first century, as feature checking / valuation operations took the field by storm and the interfaces faded into the background once again. Even today, it is hard to find an explanatory article in generative linguistics which is focused on the role of the interfaces and not on intra-theoretical, stipulative syntactic mechanisms. A very welcome initiative is the book by Kosta et al. (eds.) (in press), which includes some of the most interesting interface-driven studies so far.) Thus, architectural Markov processes are manifested in linguistic derivations at each PF-cycle, whereas architectural high-level computations are to be found in semantic cycles, which might or might not coincide with phonological cycles. Such a dynamic proposal, which avoids theoretical and empirical problems posed by fixed Chomskyan “phase theory” (to the point of requiring stipulative operations of phase extension / phase sliding, cross-linguistically determined, to account for conflicting extraction data; see Gallego, 2010; den Dikken, 2006), constitutes, we think, a more realistic model of the computational processes involved in natural language. A problem we will address now is ‘what happens when language is impaired?’. We will attempt to provide a programmatic answer to this crucial question from the perspective of a mind displaying dynamical frustrations as a core property of linguistic derivations.
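Before turning to impairment, the notion of Spell-Out as “dynamical Markovization” can be made concrete with a toy sketch (our own illustration, under the assumption that phrase markers are binary-branching nested tuples): flattening erases hierarchy, so that each element of the output string depends only on its linear neighbors.

```python
def markovize(node):
    """Flatten a hierarchical phrase marker (nested tuples) into a
    linear string: after this step, only adjacency relations remain."""
    if isinstance(node, tuple):
        return [leaf for child in node for leaf in markovize(child)]
    return [node]

# A PF-cycle spelling out {T, {√love, Mary}}: hierarchy is lost, and
# semantic (LF) cycles need not coincide with this PF cycle.
phrase = ("T", ("√love", "Mary"))
print(markovize(phrase))   # ['T', '√love', 'Mary']
```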
3. What is ‘specifically impaired’ in SLI?

Specific Language Impairment (SLI) is a condition characterized by late-emerging and protracted language acquisition relative to age expectations, without intellectual disability, autism diagnosis, hearing loss, or other obvious contributing conditions. The prevalence is estimated at 7% of 6-year-old children (Tomblin, 1997).
The impairments involve both receptive and expressive language and include late talking and deficits in grammar, vocabulary and discourse (Leonard, 1998). Twin-based heritability of between 0.50 and 0.97 has been reported for measures of SLI (Bishop, 2002; De Thorne, 2006), particularly in populations which sought therapy (Bishop et al., 2008a). Family aggregation studies document increased risk for SLI among siblings and parents of affected children: twenty-two percent of nuclear family members of SLI probands are reported with a positive history, compared to 7% in control families (Rice et al., 1998), with a similar range of affectedness across studies (Tomblin et al., 1989; Tallal et al., 2001). Recent linkage studies from the SLI Consortium of Great Britain (SLI Consortium, 2002, 2004) report genome-wide linkage screens of quantitative measures of language that implicate chromosomes 16q (SLI1) and 19q (SLI2). A follow-up study (Falcaro et al., 2008) confirmed linkage to chromosomes 16 and 19 in a subset of the SLI Consortium full sample. For example, the SLI Consortium studies used the Clinical Evaluation of Language Fundamentals (CELF) test (Semel, 2003). Two other, more narrowly defined phenotypes are of interest: one is an index of morphosyntax in the domain of tense-marking (TNS), and the second is performance on non-word repetition tasks (NWR). Tense-marking and non-word repetition performance have been identified as strong candidates for clinical markers (Tager-Flusberg et al., 1999). Significant heritability in twins is reported for NWR and TNS (Bishop et al., 2006), with a tense-marking task originally developed in the lab of Rice as an experimental precursor to a standardized test (the Rice/Wexler Test of Early Grammatical Impairment, 2001).

The reader will notice that we have characterized the structure-building algorithm, call it Merge, as a simple n-ary combinatory device (see also Marantz, 1995: 380) constrained only by the legibility conditions of the systems it interfaces with. This means that it is unlikely that the generative procedure itself is impaired, insofar as SLI patients often do not exhibit impairment in syntactically based capacities, like mathematics or music, or even non-linguistic inferences (for discussion of the latter, see Schaefer, 2003). The SLI literature often equates “grammatical processing” with “syntactic processing”, which is a methodological mistake from our point of view: grammar includes not only the combinatorial mechanism, but also the interfaces and the substantive elements involved in neurocognitive computations, whereas syntax refers only to the former (thus, arithmetic and geometrical knowledge in the brain is syntactic and, most importantly, independent of language, as Grinstead et al., 2004 argue). Other studies refer to “syntactic SLI” when dealing with issues that correspond to Spell-Out patterns or semantic issues (e.g., binding theory, establishment of dependencies between constituents, among others; see Levy & Friedmann, 2009 for reference to ‘syntactic SLI’). However, there seems to be some consensus with respect to the fact that whenever dependencies are affected, impairment does not affect all of them equally, as Novogrodsky & Friedmann (2010) show. Moreover, what is usually referred to as ‘syntactic’ is, in all cases we have reviewed,
expressible exclusively in interface terms, not in terms of an impaired structure-generating engine. Consider, for example, the following extract from Shalom (2003: 418): “A widely accepted syntactic marker for SLI in many languages is the extended use of infinitives in syntactic contexts where finiteness is obligatory.” While the process can be described as the vocabulary insertion of a distributionally underspecified infinitival form in the place of conjugated forms, under a separationist framework (e.g., Distributed Morphology; see Halle & Marantz, 1993 and much related work), it is not at all clear that the process is by nature ‘syntactic’, even under the wide sense of ‘syntax’ we have been handling here: the structure-building algorithm is not affected; the materialization of terminal nodes is. Similar considerations about relativizing the ‘syntactic SLI’ terminology (or straightforwardly replacing it) can be made, under our view of what ‘syntax’ comprises, when facing quotes like the following:

“When English-speaking children with SLI are matched on mean length of utterance with typically developing younger children (around 3 years old), the children with SLI make significantly more errors in tense production, even when performance on other inflections does not differ significantly between the two groups. Similarly, omitted or incorrect grammatical morphemes are a clinical hallmark of agrammatism” (Shalom, 2003: 418).

Also, as pointed out by Shalom (2003) and van der Lely (2003), children with SLI typically interpret passive sentences such as “The man is eaten by the fish” as if active, that is, as “The man is eating the fish”: theta-roles are not assigned in a local relation to certain functional heads (as in the highly componential theory of Hale & Keyser, 2002, also adopted by Chomsky, 2004, or the simpler variant elaborated in Krivochen, 2012 and extensively applied in Trejo, 2013), but read off transparently (i.e., following a one-to-one relation between positions and roles) from word order. It is not clear why this should be ‘syntactic’, insofar as it is not structure building that is impaired (as it is free, blind, and unbounded), but at most the interface procedures (in computational terms, protocols) that assign an interpretation to the structure. This is an essential point, since we will try to develop the following claim:

15) Language Impairment is a specific form of Mapping Impairment

What does this mean? It states that structure-generating algorithms cannot be impaired, because there are no substantive elements to them; but interface procedures (which contain instructions as to how to manipulate transferred elements and, in our particular ‘invasive interfaces’ model, also determine
which minimal syntactic objects are fully interpretable by the relevant system) can. (Footnote 4: Representational accounts of SLI also claim that the computational system itself is not affected, but that semantic features are. Our proposal is also compatible with opposite models, which locate the source of SLI in the computational component, since those accounts (e.g., van der Lely, 2005) claim that “Children with SLI use the most economic structure, which is according to this model the least complex one” (Marinis, 2011). The ‘SLI-as-mapping’ hypothesis is compatible with these accounts insofar as we assume a computationally minimal generative algorithm, unlike mainstream generative grammar.) Moreover, in the specific case of natural language, ‘mapping’ is defined as the resolution of the derivational frustration we mentioned above, such that each materialized linguistic stimulus is the best solution to the tension between what the speaker wants to say, that is, the information he wants to convey (information that is syntactically structured), and the means he has to externalize that meaning, which implies a structural flattening from hierarchy to linear dependencies. A further thesis we will argue for is that the simplest solution to mapping impairment is uniform mapping (where no structural rearrangements at the interfaces are necessary, e.g., Q-raising at LF), and that is what we find in many instances of SLI. This claim deserves more development: notice that, if PF and LF tendencies are contrary (as would follow from a frustrated approach to language architecture), Transfer to LF and Transfer to PF apply separately. In other words, tone units, for instance, do not need to coincide with propositions: a syntactic object SO_X is transferred to an interface level IL if and only if SO_X does not contain information unusable for that system and there is no smaller SO_Y (properly contained in SO_X) such that it is transferrable, ceteris paribus. In a word: transfer the smallest fully interpretable unit to the relevant interface as soon as you can. Of course, the size of this minimally interpretable unit varies in impaired patients, and according to the type of impairment. Data taken from the CHILDES corpus show that children aged 4;4 diagnosed with some form of SLI tend to produce concatenated-word responses (as opposed to fully-fledged syntactic constituents) when presented with elicitation tests:

16) Mother: Is she your brother or your sister?
    Child: She sister

Notably, this kind of concatenated, finite-state organized expression was also found in Genie, a famous case of a girl who had been isolated from linguistic stimuli from 20 months to 13;6 years of age (thus lacking linguistic stimulus during the so-called critical period). As Curtiss (1988) argues, once provided with adequate stimulus, Genie managed to gain a command of lexis and semantics (including complex semantic structures) comparable to what would be expected of a girl her age. However, morphology was definitively impaired, lacking inflection in both nominal and verbal domains (see below for examples of SLI patients with the same symptoms). The level of reduction of morphological complexity, as well as the absence of subordination and functional elements, points to the lexicon and the PF-branch of the derivation, but not to the narrow syntax. This is not alien to our theory, nor is the fact that SLI children can acquire some complex morphosyntactic forms with time: since language is a complex system (in the technical sense; see Boccara, 2002 for a general introduction, and Krivochen, 2012 for discussion), we expect it to change, adapting mapping routines dynamically and thus “overcoming” previous shortcomings: SLI children can in fact improve their syntax-semantics-
phonology mapping. There is a significant effect on a child’s ability to communicate. Although some of the problems appear to resolve with age, other difficulties persist. Recent evidence has shown that children who are late talkers are at higher risk for continued language problems. If the condition is not resolved by school age, language impairments are likely to remain into adolescence and adulthood (cf. Johnson et al., 1999; Tomblin et al., 2006; Rice et al., in press; Kosta, Krivochen, Peters & Radeva-Bork, in prep.). Observations about SLI are frequently based on the capacity to respect word order, display inflectional morphology, and properly use functional categories (e.g., Curtiss, 1988), which, we argue, does not prove that there is an impairment of syntax per se. WSR maintain that in finite contexts in adult English, verbs obligatorily carry person, number and tense features, the phi-features agreeing with their subject (arguably a morphophonological condition rather than a syntactic one, under our assumptions). So, in sentences like those in (19) below, the italicised verb forms will carry the parenthesized features:

(19) (a) Daddy plays with me [plays = present-tense, third-person, singular-number]
     (b) Mummy and Daddy play with me [play = present-tense, third-person, plural-number]
     (c) Daddy played with me [played = past-tense, third-person, singular-number]
     (d) Mummy and Daddy played with me [played = past-tense, third-person, plural-number]

(20) (a) -s = [present-tense, third-person, singular-number]
     (b) -d = [past-tense]
     (c) -ø = [any other set of features]

(21) Daddy plays with me [plays = present-tense, third-person, singular-number]
(22) Daddy play with me [play = third-person, singular-number]
(23) Daddy play with me [play = present-tense]

As noted earlier, both ND and SLI children frequently omit the auxiliaries BE/HAVE/DO in finite contexts, hence producing auxiliaryless sentences such as:

(24) (a) Daddy snoring / Daddy naughty (omission of is)
     (b) Daddy gone out (omission of has)
     (c) Daddy not like cabbage (omission of does)

WSR posit that the function of such auxiliaries is to encode tense and agreement properties. They further assume that (just like main verbs) auxiliaries can optionally be underspecified for tense and/or
agreement in finite contexts in child grammars, and that when HAVE/BE/DO are underspecified for one or more of their tense/agreement features, they are given a null phonetic spell-out (i.e., are ‘silent’). So, in an adult sentence such as Daddy is snoring, the auxiliary BE will be fully specified for tense and agreement features, as in (25) below:

(25) Daddy BE snoring [BE = present-tense, third-person, singular-number]

Accordingly, the auxiliary BE will be spelled out in the corresponding 3Sg present tense form is. But suppose that a child marks the auxiliary BE for agreement but not for tense, as in (26) below:

(26) Daddy BE snoring [BE = third-person, singular-number]

In such a case, the (tense-)underspecified auxiliary BE will be given a null spell-out, so that the resulting sentence is Daddy ø snoring, where ø is a null exponent of BE. Likewise, if the child specifies the auxiliary for tense but not for agreement, as in (27) below, the (agreement-)underspecified auxiliary will again receive a null spell-out:

(27) Daddy BE snoring [BE = present-tense]
Although (overt forms of) the non-modal auxiliaries BE/HAVE/DO encode both tense and agreement features, WSR posit that modal auxiliaries always encode tense, since they inflect for present/past tense (cf. pairs such as will/would, shall/should, can/could, may/might) and never carry the third person singular agreement inflection -s. However, studies on modal verbs indicate that, although historically those phonological forms were indeed inflectionally related, today they are different forms indicating not tense, but different degrees of possibility. So, for the lexical entry for CAN, the corresponding (context-sensitive) L-grammar is (28):

(28) √CAN → can [+present] / [+realis]; could [-present] / [-realis]
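The underspecification account in (19)-(27) and the L-grammar in (28) both amount to competition between more and less specified exponents. The following sketch is our own schematic rendering; the feature sets and the specificity-ordered search are illustrative assumptions, not CBG’s or WSR’s implementation.

```python
# Toy vocabulary insertion with underspecification, cf. (20).
AGR_ITEMS = [
    # exponent, features it requires (most specific listed first)
    ("-s", {"present", "3rd", "sg"}),
    ("-d", {"past"}),
    ("-ø", set()),                    # elsewhere item
]

def spell_out(features):
    """Insert the most specific exponent compatible with `features`.
    An underspecified T node (as in (21)-(23)) simply matches less."""
    for exponent, required in AGR_ITEMS:
        if required <= features:
            return exponent
    return "-ø"

print(spell_out({"present", "3rd", "sg"}))  # -s  (fully specified, (19a))
print(spell_out({"3rd", "sg"}))             # -ø  (tense underspecified, (22))
print(spell_out({"present"}))               # -ø  (agreement underspecified, (23))

# (28): a context-sensitive choice of modal form keyed to [±realis].
def spell_out_can(realis):
    return "can" if realis else "could"

print(spell_out_can(True), spell_out_can(False))  # can could
```

On this rendering, an underspecified node simply fails to match the fully specified exponents and falls through to the null elsewhere item.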
All these cases are perfectly compatible with our view of SLI as a mapping impairment, in which there is no possibility of spelling out some features in the T node, and thus simpler (for some, defective) alternatives are used, the measure of simplicity being different in each type of impairment. In this respect, de Villiers (2003) offers an interesting problematization:

“The final question concerns what is meant by impairment in specific language impairment: That is, is there a part of the grammar that has gone away or is missing? Or, is the grammar intact, with all the principles, parameters, categories, and properties of universal grammar (UG), but delayed in maturation, with that delay having clearly demarcated consequences? This is an essential debate, and where the prospect of locating a gene becomes especially tantalizing. It is also where crosslinguistic work becomes essential.” (2003: 427)

As shown in the data above, it might be more accurate to talk about “delay” than about “impairment” when it comes to materializing combination, even though there are cases in which linguistically
meaningful combinations never occur despite the complexity of the conceptual structures underlying such linguistic instantiations (as in the case of Genie); this, together with cases of children who (albeit to different extents) master basic stem-desinence configurations (as in (19)), forces us to relativize the sense in which ‘impairment’ is meant and defined. Genetic factors are not minor, as de Villiers suggests, but they have to be approached with caution; as Lasnik & Uriagereka (2011: 100) say:

“Though we have no reason to question the accuracy of the genetic, brain-imaging, and developmental studies as such, when it comes to mapping a very specific point mutation of the FOXP2 gene onto the differential activation of brain areas (with respect to normal subjects), we do question the interpretations and conclusions that we find in the technical literature.”

Watson (2003) classifies FOXP2 as a regulating gene, that is, a gene which is in charge of controlling genes determining the synthesis of proteins that, in turn, form areas of the brain related to language processing (production and understanding). Skepticism with respect to the ‘biolinguistic’ interpretations to which molecular genetic findings (but, of course, not molecular genetics itself) are subjected in the linguistic literature is, we think, justified. Jenkins (2011) points out that asymmetry is a fact in biological systems (brain lateralization, for example), and a plausible biolinguistic line of inquiry is to find out whether it has arisen from symmetry or from some other independent source. This is of special interest since the physical world seems to be ruled by very deep principles of symmetry, from string theory to relativity. According to Jenkins, asymmetry would arise as a result of the self-organization of biological units, which is a very interesting thesis, especially because of its apparent incompatibility with deeper physical claims; but, once again, just as string theory came to fill the gap between the symmetric Universe that relativity needed and quantum leaps at the Planck scale, there could be a deeper solution for this. In our framework, the dynamic and complex character of biological systems at all levels (i.e., from molecular genetics to neurocognitive structures) can be approached from a ‘frustrated’ perspective, with the appropriate caveats with respect to the scale of the relevant frustration (recall that Binder’s 2008 original formulation refers to lattices involving electrons whose spins cannot be aligned): there is no principled objection to the application of the concept of dynamical frustration at the level of molecular biology.

4. Molecular Genetics and SLI

Using a database of predominantly monozygotic twins with speech and language disorders, and other relatives in the same family, we will first discuss which molecular genetics methods can be applied to the identification and study of specific genes, including FOXP1, FOXP2, CNTNAP2, ATP2C2, CMIP and lysosomal enzymes, and how these methods of molecular genetics-based research can provide insight into the aetiology of speech, sound and language disorders.
4.1 Subjects and methods
Subjects: A total of 250 participants, including 100 probands, 100 siblings, and 50 parents and other relatives, will be drawn from an ongoing longitudinal study of Specific Language Impairment. The study will have to be approved by the institutional review boards at the University of Potsdam and HUB/Charité and by external reviewers/experts. Appropriate informed consent will be obtained from the subjects. At present, we have 26 probands (12 male, 14 female), mean ages 3;5 to 8;10 across variables, ascertained from preschool speech pathology caseloads, followed by assessment to meet the requirements of the study. (Footnote 5: The following speech therapy centers in Berlin and Potsdam supplied us with screening data and material: Dipl. Patholinguistin Veronika Riegel, Krankengymnastik und Logopädie, Provinzstr. 45-46, 13409 Berlin-Reinickendorf; Dipl. Patholinguistin Franziska Starke, Praxis für Logopädie, Rudolf-Breitscheid-Str. 162, 14482 Potsdam; and Dr. Barbara Lindemann (Fachärztin für Hals-Nasen-Ohrenheilkunde, Fachärztin für Sprach-, Stimm- und kindliche Hörstörungen, Potsdam). We received data from children of 3;3-8;5 with problems of agreement, case and word order. All data are used here with the kind permission of the parents.)

The presently preferred method of molecular genetic analysis is next-generation sequencing, which is established at the Institute for Molecular Genetics at the Charité, Berlin. This makes it possible to analyze even small families with only two affected children who suffer from speech problems. With this method, (a) the whole exome of a person or a family can be examined simultaneously for point mutations, deletions, insertions or duplications. Plus: with this method it is possible to examine (b) the whole genome (all genetic information, that is: the exome plus all non-coding regions, ca. 95%, about 3 × 10^9 base pairs) of a person. The application of this method represents a tremendous advance over linkage analysis. In addition, we will apply the classical methods such as coupling (or linkage) analysis and association studies; a more accurate description of the standardized tests used to assess speech and language disorders in the syntactic and phonetic range will be given in the written version of this paper.
4.2 Analyzing some data from SLI data banks (English and German)

English vs. German SLI children (Clahsen, 1991; Clahsen et al. = CBG, 1997)

Clahsen, Bartke and Göllner (henceforth CBG) use data from English and German children with SLI to argue that SLI involves a selective grammatical deficit. They claim that both the English and the German SLI children in their study have far more problems with the marking of agreement than with that of tense (both on main and on auxiliary verbs). Against the background of the MP in Chomsky (1995), tense features of auxiliary verbs and main verbs are interpretable (i.e., they contribute to determining the meaning of sentences), while agreement features of verbs are not. CBG come to the conclusion that SLI children have particular problems with the acquisition of uninterpretable features, that is, features that cannot be read by either the semantic or the morpho-phonological interface, thus having a purely formal nature.
Methods used: The English data for CBG’s study come from elicitation tasks performed on a group of 9 SLI children in the age range 10;00-13;01. In one task, the children were prompted to produce 3Sg present tense forms (using the prompt Every morning, my mum...), and in another, past tense forms (in a story-telling task using the prompt Once upon a time...). Both elicitation tasks were carried out twice for each child, at intervals of a year. On the basis of the responses the children produced, they were scored for their percentage of correct marking of case on subjects and tense/agreement on verbs. CBG report that the English children in their study achieved relatively high scores on the past tense elicitation task (overall, correctly inflecting 76% of main verbs and 89% of auxiliaries for past tense in obligatory contexts), but much lower scores on 3Sg present tense forms (overall, correctly inflecting 49% of main verbs and 35% of auxiliaries in obligatory contexts). They also report that the children achieved a 100% correct score on nominative case-marking in obligatory contexts (all 217 of their subjects being assigned nominative case, according to CBG).

Conclusions drawn from CBG’s English SLI study

The fact that the English SLI children achieved relatively high scores on past tense forms but much lower scores on 3Sg present tense forms leads CBG to conclude that the reason for the poorer performance on 3Sg present tense forms is that the latter encode not only tense but also agreement in (person and number) phi-features with the subject. More specifically, they posit that SLI children have far greater problems in marking agreement (viz., between subject and verb) than in marking tense: since past tense forms generally encode only tense, the SLI children achieve high scores on these (76% for main verbs and 89% for auxiliaries); since 3Sg present forms encode not only present tense but also agreement, the SLI children achieve much lower scores on these (49% on main verbs and 35% on auxiliaries). Why should SLI children have greater problems in marking agreement on verbs than in marking tense on verbs? CBG argue that this is a consequence of the tense features of verbs being interpretable and their (person/number) agreement features being uninterpretable, and of the particular problems which uninterpretable features pose for SLI children. The main problem with this account is that interpretability is not a primitive notion (it is rather connected to valuation in the Chomskyan account) and, moreover, a feature is not interpretable or uninterpretable per se, but stipulatively, depending on the category on which it appears: thus, [tense] is interpretable on T, but not on D; conversely, person-number features are interpretable on D but not on T. To this day, there is no principled account of the notions of valuation or even ‘formal feature’ (let alone a sound biological correlate for them; see Kosta & Krivochen, 2011): the empirical results must be re-evaluated, in our opinion.
In a case study, Clahsen (1985; 1991 et seq.) used a parametrized grammatical theory of language acquisition to study verb placement and inflection in ND and SLI children and adults. The ages of the 17 SLI children ranged between 3;2 (Julia 1) and 9;6 (Anja 1), with 10 male and 7 female probands. The analyses aim to define the impairments that are present in dysphasia: person and number inflection on the finite verb and case markings are instances of grammatical agreement. Their common feature is that the morphological form of the lexical item is determined by other elements in the clause.

Table 2: Case markers

Child (age) | Acc. contexts: total | Acc. | Neutr. | Dat. contexts: total | Dat. | Neutr. | Gen. | Case-marked pronouns | Other
Anja 1 (9;6) | 5 | 0 | 5 | 1 | 0 | 1 | 0 | 0 | 0
Anja 2 (10;8) | 7 | 1 | 6 | 8 | 5 | 1 | 0 | 3 | 2
Andreas 1 (7;0) | 2 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0
Andreas 2 (7;1) | 2 | 0 | 0 | 3 | 0 | 0 | 0 | 5 | 3
Klaus 1 (4;6) | 7 | 2 | 5 | 2 | 0 | 1 | 0 | 0 | 4
Klaus 2 (4;7) | 6 | 5 | 1 | 2 | 0 | 1 | 0 | 3 | 3
Julia 1 (3;2) | 1 | 0 | 1 | 1 | 0 | 0 | 4 | 0 | 1
Julia 2 (3;3) | 1 | 0 | 1 | 6 | 0 | 5 | 1 | 1 | 0
Patrick (4;5) | 18 | 15 | 3 | 1 | 0 | 1 | 0 | 10 | 3
Sven (7;4) | 5 | 3 | 2 | 3 | 0 | 3 | 0 | 3 | 2
Stefan (4;8) | 6 | 0 | 2 | 11 | 6 | 3 | 0 | 12 | 4
Jonas 1 (6;6) | 7 | 0 | 7 | 0 | 0 | 0 | 0 | 0 | 1
Jonas 2 (7;7) | 8 | 0 | 7 | 5 | 3 | 2 | 0 | 2 | 0
Petra 1 (3;8) | 4 | 0 | 4 | 1 | 1 | 0 | 0 | 1 | 1
Petra 2 (3;11) | 2 | 0 | 2 | 9 | 2 | 7 | 0 | 2 | 5
Petra 3 (4;7) | 11 | 1 | 9 | 7 | 7 | 0 | 0 | 8 | 0
Wolfgang (4;5) | 10 | 0 | 10 | 2 | 0 | 1 | 0 | 1 | 0
Table 2 contains a quantitative summary of the use of case markings by the children under investigation. The analysis of the acc. and dat. markers is dependent on the syntactic context. The frequencies show how many cases require acc. and dat. forms, in how many cases acc. and dat. are used, and in how many cases the child uses case-neutral forms (Nom. also counting as one), e.g., ich baue ein mast (instead of ich baue einen Mast, “I build a mast”). Case markers which could not be categorized are in the last column. Syncretic forms are also considered. The table shows that all children use case markings that differ from the case-neutral form. In some children’s data we find a preference for the dative form (Stefan, Petra 1/2), in others only the accusative (e.g., Klaus 1, 2, Patrick). Only the data of Anja 2 and Petra 3 show both acc. and dat. forms. From the profiles we can also see that genitive markers (except in five instances for Julia) are not used. We can see that in stages II and III of the grammar acquisition of normally developing (ND) children, NPs are still unmarked; instead, children use case-neutral markings (cf. Clahsen, 1991: 153-161).

4.3 On the design of the empirical investigations (Kosta, Krivochen, Peters, in prep.)
Selecting the children in our case study

In order to ensure that all features under investigation were present, we used screening based on the aforementioned production and reception tests. Thus, we received a priori a case history file in which demographic information, as well as the available diagnostic results, were entered.
These files were completed for each child with the information from the logopedist or speech therapist. Thus, the research study includes children who have difficulties with the normal acquisition of syntax and morphology without having hearing impairments or mental handicaps (e.g., MR, Down syndrome, or a lesion of the FOXP1 gene; cf. Horn et al., 2010).
Most studies on Stage II clearly show that children already have access to the most important word classes, both of the open class (RI) and of the closed class (LL). But they also show that the grammatical patterns (such as agreement) do not yet correspond to the target language. In some recent studies, this is explained by so-called feature blindness, that is to say: even if there is access to the class of functional words (e.g., determiners as heads of DPs in German), their labels (or features) are not yet valued because their “interpretation” remains obscure or invisible for the children at that Stage II of the cognitive development of grammar. We will analyze some examples from German data based on the study of Clahsen (1991: 38 ff.). But before we can do so, here are our statements and hypotheses (following RM): as nominal elements are the most frequent roots, we find more sortal entities than eventive entities. We avoid the terms “noun” and “verb” because categories play no role in the syntactic component, as it is only generative (not interpretative) and blind to all characteristics but format, either structural or ontological (Krivochen, 2011a: 16-17):

“Ontological format refers to the nature of the entities involved. For example, Merge can apply […] to conceptual addresses (i.e., roots) because they are all linguistic instantiations of generic concepts. With ontological format we want to acknowledge the fact that a root and a generic concept cannot merge, for example. It is especially useful if we want to explain in simple terms why Merge cannot apply cross-modularly: ontological format is part of the legibility conditions of individual modules. Structural format, on the other hand, refers to the way in which elements are organized. […] Only binary-branched hierarchical structures are allowed in the human mind. The arguments are conceptual rather than empirical […]: Merge optimally operates with the smallest non-trivial number of objects. Needless to say, given the fact that ontological format is a necessary condition for Merge to apply (principled because of interface conditions, whatever module we want to consider), the resultant structures will always consist of formally identical objects.”

Categories, as such, are epiphenomenal results of the local relation between a root, defined as a precategorial linguistic instantiation of an a-categorial generic concept from C-I (see Taylor et al. for neurocognitive evidence), and a procedural node, specified enough distributionally and semantically to generate a “categorial interpretation” at C-I. This said, we can analyze our data:

(29) a. diese tuhl (dieser Stuhl) (M.) (= this chair)
     b. diese eis… (dieses Eis) (M.) (= this ice cream)
This said, we can analyze our data:

(29) a. diese tuhl (dieser Stuhl) (M.) (= this chair)
b. diese eis (dieses Eis) (M.) (= this ice cream)
c. meine auto hoch (mein Auto) (M.) (= my car is driving up)
d. eine rad (ein Rad) (D.) (= a wheel)
e. ein Buch (= ein Buch) (D.) (= a book)

(We follow Clahsen (1982), repeated in Clahsen (1991: 38), for notation and abbreviations of the names: M = Mathias, D = Daniel, J = Julia.)

The fact that children at this age fail to recognize the doubling pattern inside a DP between the determiner head (Det) and the nominal complement material with respect to ϕ-features seems to be independently confirmed by data from anomic aphasia, where the agreement pattern was recognized only in cases where agreement was triggered DP-internally via a D head, whereas Hebrew-speaking test persons did not recognize Gender where it is not marked via agreement (bare NPs).

5. Evidence from Normal Language Development (NLD)

Previous attempts to explain primary language acquisition as part of a behavioral model of simple learning strategies (Skinner's behaviorism) have proven untenable from today's perspective. The study of child language has now reached a level at which the simplistic imitation model of the behaviorists is replaced by the far more explanatory model of an innate endowment which matures with cognitive development in a speech community, leaving room for the cultural aspects of language acquisition. In order to gain access to the nature of the computational processes related to certain grammatical properties, two phenomena are studied here on the basis of experimental L1 acquisition data from Russian, German and English: early subjects and agreement. The relevant 'features' (i.e., interpretative cues) are not given by a stipulative UG, whose content is at best unclear, but by the interfaces themselves, providing a language with the possibility of materializing the external argument or not by default (notice that even English allows truncated subjects, as in the case of the advertisement 'got milk?'). The simplest option is thus: do not materialize X if there is any other cue to recover the semantic contribution of X, morphology being one of the major cues in this respect (either full paradigms, as in Spanish or Italian, or zero morphology, as in Chinese). It is legitimate to predict that operations between the interfaces will pose problems for children (given the architectural frustration we mentioned), since they need to be able to compute knowledge from and at both interfaces, resulting in delay, or at least optionality, in the acquisition of the interface
properties. At the same time, the computation of single lexical subjects is not associated with interface properties, and we can expect them to be unproblematic and to emerge at the onset of acquisition. The objectives of this part of the study were (schematically): 1) to establish whether null subjects and lexical subjects are simultaneous or successive processes in acquisition, and 2) to examine whether there are significant computational differences in the acquisition of a single-interface process as compared to a multiple-interface process. Predictions:
Since syntax-discourse interface features emerge late, the prediction that interface phenomena such as Internal Merge are characterised by delay and optionality in child grammar is confirmed. Children's non-adult linguistic performance is traceable to the load of computations at the level of two interfaces, syntax and discourse. Thus, syntactic properties are fully acquired, whereas interface properties trigger residual optionality effects in the early stages. Furthermore, since null subjects appear long after the acquisition of Agreement morphology, it is legitimate to assume a modular nature of language, with syntax present from around 1;8 and the interface between discourse and syntax not fully matured until after age 4;0.
This prediction is a natural product of the view that there is no Movement as an independent operation, but only External Merge of tokens (Krivochen & Kosta, 2013; Krivochen, to appear): if External Merge of tokens of a single type is driven by semantic necessities (as argued in the cited works), then we need no independent argument to account for the impairment of displacement operations if the discourse-syntax interface is acquired late ontogenetically: if an operation is not interface-legitimated, in our model, it simply cannot be proposed (thus entailing a strong version of an interface-driven, so-called restrictivist, framework); a schematic rendering of the token-based view is given after examples (30) below.

Problems with Parameters. Earlier stages of language development in so-called non-null-subject languages (like German, English, French, Danish or Dutch) seem to exhibit, between the second and third year of life, precisely such cases of zero subjects (less frequently, zero objects), even though these zero entities are not permitted in the target language (the grammar of adults):

(30) a. Se blomster har. (Jens, 2;2)
'look flowers have/has' = look, (I/you/she/we) have/has flowers
b. Tickles me. (Adam, 3;6)
c. Mange du pain. (Grégoire, 2;1)
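As a minimal sketch of the tokens-over-copies view (our illustration; the structure and names are invented for expository purposes), 'displaced' elements can be modeled as distinct tokens of one lexical type, each externally merged where it is interface-legitimated:

    from itertools import count

    _ids = count()

    def token(lex_type):
        # A token is a (type, identity) pair: same lexical type, distinct occurrence.
        return (lex_type, next(_ids))

    # Two independent External Merge operations; no Movement and no copies:
    wh_base = token("what")   # merged in object position (thematic interpretation)
    wh_edge = token("what")   # merged at the left edge (discourse interpretation)
    structure = ("C", wh_edge, ("T", ("you", ("buy", wh_base))))

    assert wh_base[0] == wh_edge[0]   # one lexical type...
    assert wh_base != wh_edge         # ...two distinct, independently merged tokens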
Early multi-word utterances hardly deviate from the syntax of the target language (TL) in terms of the position of heads and complements: in head-initial languages (English, French, Russian), the complements of a lexical or functional head are expected to follow the head V of the
VP, while in head-final languages (German, Japanese, Turkish) the reverse word order is expected. The verbs and nouns exhibit all properties and features of the TL (inflection on nouns and verbs), and children show subject and object pro-drop (ECs) in the same positions as the TL. The following dialogue is an example of an early language stage of a Russian girl (CHI; 2;7.29) that shows clear preferences for head-initial structures (31):

(31)
*MOT: a obaz'jane, rasskazhi.
about-the-monkey, tell (me)
*CHI: [IPSpec obez'jany, obez'jany [VP zhivut [PP v zooparke]]].
monkeys-Nom.Pl, monkeys-Nom.Pl live-3Pl.Pres in Zoo-Prep.Sg
*MOT: v zooparke, da?
in Zoo, yes?
*CHI: [e] prygali.
(they) jumped-3Pl.Past
*MOT: oni [VP [PP na maSHIny]FOK [VP prygali]].
they on cars jumped
*MOT: a chto eshcho delali?
and what (did they) do, too?
*CHI: [IPSpec [e] [VP sideli i [VP [DP peCHEN'je]FOK [VP [e] eli]]]].
(they) sat-3Pl.Past and cake-Acc.Sg (they) ate-3Pl.Past
*EVA: kto im dal pechen'je?
who them gave cake
*CHI: [IP mama [VP dala (e) pechen'je]].
mother-Nom.Sg gave-3Sg.Past (them) cake-Acc.Sg
*MOT: a drugije morkovku eli.
and others carrot ate
*CHI: [CP kto-to [IP I° dal [vP obez'jane [VP t banan]]]].
someone gave monkey banana
*MOT: banan?
banana?
*CHI: a pechen'je Tanino.
and cake (is) Tanya's

(Source: CHILDES, Tanja corpus; participants: CHI = Tanja (Target_Child, 2;7.29), EVA = Eva (Investigator), MOT = Mother; last edited 9-JUN-1999 by William Snyder. Phrase-structure bracketing = P.K.)
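The transcript follows CHILDES CHAT conventions (main tiers prefixed with *SPEAKER:). As a rough sketch of how child utterances can be extracted from such data for word-order analysis (the helper name and mini-transcript below are ours, purely illustrative):

    def main_tiers(lines, speaker="CHI"):
        # Yield the main-tier utterances of one speaker from CHAT-style lines,
        # e.g. "*CHI: obez'jany zhivut v zooparke." -> "obez'jany zhivut v zooparke."
        prefix = "*" + speaker + ":"
        for line in lines:
            if line.startswith(prefix):
                yield line[len(prefix):].strip()

    transcript = [
        "*MOT: a obaz'jane, rasskazhi.",
        "*CHI: obez'jany, obez'jany zhivut v zooparke.",
        "*CHI: prygali.",
    ]
    for utterance in main_tiers(transcript):
        print(utterance)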
The fact that the child selects the unmarked SVO word order does not exclude that other word-order variants are also possible in Russian, as long as the change in word order is a "drastic interface effect", an operation triggered by the need to achieve optimal Relevance at either (or both) of the interfaces. These changes are either lexically driven, following the attested stages of lexical learning from less to more complex syntactic structures and properties (unergatives > transitives > unaccusatives > causatives > passives), or, in other cases, the acquisition of word order is due to factors such as focus-background, topic-comment or theme-rheme information structure; in other words, these changes are usually driven by the communicative perspective (information structure) operating on the syntax. As is apparent from the examples above, the child at the age of just 2;7 already begins to acquire the rules of information structure. These differences arise even without the fixation of different parameters (selection options) to which children would need to assign a value. Children search, on the basis of their linguistic experience (the phonological input), for the "rules" or patterns of a specific grammar (call it an I-grammar), in order to accommodate them to the SM and C-I interfaces of this I-grammar (although we must insist that RM argues against the mental reality of a "grammar" in the ST/GB sense, striving instead to uncover the computational and mathematical/physical basis of cognition, including language). These rules are acquired only step by step, in a process that is hypersensitive to initial conditions (following Krivochen 2011a, 2013a, b), in which a small change in the input can result in a major change in the output; a standard illustration of this notion is sketched below. From this perspective, natural languages are, in the technical sense, emergent properties of SM and C-I requirements and, ultimately, emergent properties of neural networks considered as complex, non-linear systems.
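As a standard textbook illustration of sensitivity to initial conditions (our example; it is not a model proposed in the works cited), consider the logistic map:

\[
x_{n+1} = r\,x_n(1 - x_n), \qquad x_n \in [0,1].
\]

For $r = 4$, two trajectories whose initial conditions differ by a tiny $\delta_0$ diverge on average as $|\delta_n| \approx |\delta_0|\,e^{\lambda n}$, with Lyapunov exponent $\lambda = \ln 2 > 0$: a small change in the input (here, $x_0$) yields a large change in the output after only a few iterations.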
References:

Baltin, Mark (2003) Is Grammar Markovian? Invited forum lecture to the Korean Association of English Language and Linguistics (Korea University, Seoul). http://linguistics.as.nyu.edu/docs/IO/2637/KASELL-paper.pdf [Consulted on 09/02/2014]
Binder, Phillip (2008) Frustration in Complexity. Science 320. 322-323.
Chomsky, Noam (1956) Three models for the description of language. IRE Transactions on Information Theory 2. 113-124.
Chomsky, Noam (1957) Syntactic Structures. The Hague: Mouton.
Chomsky, Noam (1959) On Certain Formal Properties of Grammars. Information and Control 2. 137-167.
Chomsky, Noam (1963) Formal Properties of Grammars. In Luce, R. D., R. R. Bush & E. Galanter (eds.) Handbook of Mathematical Psychology. New York: John Wiley & Sons. 323-418.
Chomsky, Noam (1965) Aspects of the Theory of Syntax. Cambridge, Mass.: MIT Press.
Chomsky, Noam (2008) On Phases. In Freidin, R., C. P. Otero & M. L. Zubizarreta (eds.) Foundational Issues in Linguistic Theory: Essays in Honor of Jean-Roger Vergnaud. Cambridge, Mass.: MIT Press.
Collins, Chris & Edward Stabler (forthcoming) A Formalization of Minimalist Syntax. http://ling.auf.net/lingbuzz/001691
Curtiss, Susi (1988) Abnormal language acquisition and grammar: Evidence for the modularity of language. In Hyman, L. & C. Li (eds.) Language, Speech, and Mind: Festschrift for Vicky Fromkin. New York: Routledge.
Dehaene, Stanislas (1999) The Number Sense: How the Mind Creates Mathematics. Oxford: OUP.
Den Dikken, Marcel (2006) Phase Extension: Contours of a theory of the role of head movement in phrasal extraction. Theoretical Linguistics 33(1). 1-41.
Emonds, Joseph (1976) A Transformational Approach to English Syntax. New York: Academic Press.
Everett, Dan (2013) The Shrinking Chomskyan Corner: A Final Reply to Nevins, Pesetsky, Rodrigues. Ms. http://ling.auf.net/lingbuzz/000994
Fox, Danny (1999) Economy and Semantic Interpretation. Cambridge, Mass.: MIT Press.
Frege, Gottlob (1891) Funktion und Begriff. Jena: Hermann Pohle.
Gallego, Angel (2010) Phase Theory. Amsterdam: John Benjamins.
Grinstead, John, Jeff MacSwan, Susi Curtiss & Rochel Gelman (2004) The independence of language and number. Ms. http://linguistics.ucla.edu/people/curtiss/2004%20%20The%20independence%20of%20language%20and%20number.pdf
Halle, Morris & Alec Marantz (1993) Distributed Morphology and the Pieces of Inflection. In Hale, K. & S. J. Keyser (eds.) The View from Building 20. Cambridge, Mass.: MIT Press. 111-176.
Idsardi, William & Eric Raimy (in press) Three types of linearization and the temporal aspects of speech. In Biberauer, T. & I. Roberts (eds.) Principles of Linearization. Berlin: Mouton de Gruyter.
Jenkins, Lyle (2011) Biolinguistic Investigations: Genetics and Dynamics. In Di Sciullo, A.-M. & C. Boeckx (eds.) The Biolinguistic Enterprise: New Perspectives on the Evolution and Nature of the Human Language Faculty. Oxford: OUP. 126-134.
Kosta, Peter (1992) Leere Kategorien in den nordslavischen Sprachen. Zur Analyse leerer Subjekte und Objekte in der Rektions-Bindungs-Theorie. Frankfurt am Main (Habilitationsschrift, 679 pp.).
Kosta, Peter (2013) Early Subjects and Patterns of Agreement and Word Order in L1-Acquisition: A Radically Minimalist Perspective. In Kempgen, S., M. Wingender, N. Franz & M. Jakiša (eds.) Deutsche Beiträge zum 15. Internationalen Slavistenkongress, Minsk 2013 (= Die Welt der Slaven. Sammelbände/Sborniki 50). München/Berlin/Washington, D.C.: Verlag Otto Sagner. 161-174.
Kosta, Peter (in press) Biolingvistika (Biolinguistics). In Karlík, Petr et al. (eds.) Nový encyklopedický slovník češtiny. Praha: Lidové noviny.
Kosta, Peter & Diego Krivochen (2011) Review of Di Sciullo, Anna Maria & Cedric Boeckx (eds.), The Biolinguistic Enterprise: New Perspectives on the Evolution and Nature of the Human Language Faculty (Oxford: Oxford University Press, 2011). International Journal of Language Studies (IJLS) 6(4). 154-182.
Kosta, Peter & Diego Krivochen (2014) Flavors of Movement: Revisiting the A/A′ Distinction. In Kosta, Peter, Steven L. Franks, Teodora Radeva-Bork & Lilia Schürcks (eds.) Minimalism and Beyond: Radicalizing the Interfaces. Amsterdam: John Benjamins. 251-282.
Kosta, Peter, Diego Gabriel Krivochen, Teodora Radeva-Bork & Peter Robinson (in prep.) Biological and genetic foundations of the language genome (faculty) based on language disorders and impairments: the case of Merge, Agreement, Case, Number, Gender, Word Order and Empty Categories. Ms.
Krivochen, Diego (2013a) A Frustrated Mind. Ms. Under review. http://ling.auf.net/lingbuzz/001932
Krivochen, Diego (2013b) On the Necessity of Mixed Models: Dynamical Frustrations in the Mind. Ms. Under review. http://ling.auf.net/lingbuzz/001838
Krivochen, Diego (2014) On Phrase Structure Building: How Your Theory of Labeling Gives Away Your Theory of Mind. Ms. Under review. https://www.academia.edu/6062796/2014__On_Phrase_Structure_building_How_your_theory_of_labeling_gives_away_your_theory_of_mind
Krivochen, Diego (to appear) Tokens vs. Copies: Displacement Revisited. To appear in Studia Linguistica.
Krivochen, Diego & Peter Kosta (2013) Eliminating Empty Categories (= Potsdam Linguistic Investigations 11). Frankfurt am Main: Peter Lang.
Lasnik, Howard (2011) What Kind of Computing Device is the Human Language Faculty? In Di Sciullo, A.-M. & C. Boeckx (eds.) The Biolinguistic Enterprise: New Perspectives on the Evolution and Nature of the Human Language Faculty. Oxford: OUP. 354-365.
Lasnik, Howard & Juan Uriagereka (2011) A Geneticist's Dream, a Linguist's Nightmare: The Case of FOXP2. In Di Sciullo, A.-M. & C. Boeckx (eds.) The Biolinguistic Enterprise: New Perspectives on the Evolution and Nature of the Human Language Faculty. Oxford: OUP. 100-125.
Lasnik, Howard, Juan Uriagereka & Cedric Boeckx (2005) A Course in Minimalist Syntax. Oxford: Blackwell.
van der Lely, Heather K. J. (2005) Domain-specific cognitive systems: Insight from grammatical specific language impairment. Trends in Cognitive Sciences 9. 53-59.
Levinson, Stephen (2013) Recursion in pragmatics. Language 89(1). 149-162.
Levy, Hagar & Naama Friedmann (2009) Treatment of syntactic movement in syntactic SLI: A case study. First Language 29(1). 15-49.
Levy, Simon (2010) Becoming recursive: Toward a computational neuroscience account of recursion in language and thought. In van der Hulst, H. (ed.) Recursion and Human Language. Berlin: De Gruyter. 371-392.
Marantz, Alec (1995) The Minimalist Program. In Webelhuth, G. (ed.) Government and Binding Theory and the Minimalist Program. Oxford: Blackwell. 349-381.
Marinis, Theodoros (2011) On the nature and cause of Specific Language Impairment: A view from sentence processing and infant research. Lingua 121(3). 463-475.
Nevins, Andrew, David Pesetsky & Cilene Rodrigues (2009) Pirahã exceptionality: a reassessment. Language 85(2). 355-404.
Novogrodsky, Rama & Naama Friedmann (2010) Not all dependencies are impaired in Syntactic-SLI: Binding in children with a deficit in Wh-movement. In BUCLD 34 Proceedings.
Prusinkiewicz, Przemyslaw & Aristid Lindenmayer (1990) The Algorithmic Beauty of Plants. Berlin: Springer. [Electronic edition: 2006]
Quirk, Randolph, Sidney Greenbaum, Geoffrey Leech & Jan Svartvik (1985) A Comprehensive Grammar of the English Language. London: Longman.
Shalom, Dorit (2003) Understanding SLI: A Neuropsychological Perspective. In Levy, Yonata & Jeannette Schaeffer (eds.) Language Competence Across Populations: Toward a Definition of Specific Language Impairment. London: Lawrence Erlbaum. 413-424.
Smith, Ann (2008) Nucleic acids to amino acids: DNA specifies protein. Nature Education 1(1). 126.
Stroik, Thomas & Michael T. Putnam (2013) The Structural Design of Language. Oxford: OUP.
Taylor, Alex, Douglas Elliffe, Gavin R. Hunt & Russell D. Gray (2010) Complex cognition and behavioural innovation in New Caledonian crows. Proceedings of the Royal Society B: Biological Sciences 277(1694). 2637-2643.
Taylor, Alex & Nicola Clayton (2012) Evidence from convergent evolution and causal reasoning suggests that conclusions on human uniqueness may be premature. Behavioral and Brain Sciences 35(4). 241-242.
Trejo, Malena (2013) Prolepticidad de los Nombres Propios: un estudio en 'Historia Apollonii Regis Tyri'. Licentiate Thesis, Universidad Nacional de La Plata.
Uriagereka, Juan (2002) Multiple Spell-Out. In Uriagereka, J. Derivations: Exploring the Dynamics of Syntax. London: Routledge. 45-65.
Uriagereka, Juan (2012) Spell-Out and the Minimalist Program. Oxford: OUP.
Uriagereka, Juan (2014) Regarding the Third Factor: Arguments for a CLASH model. In Kosta, Peter, Steven L. Franks, Teodora Radeva-Bork & Lilia Schürcks (eds.) Minimalism and Beyond: Radicalizing the Interfaces. Amsterdam: John Benjamins. 387-417.
de Villiers, Jill (2003) Defining SLI: A Linguistic Perspective. In Levy, Yonata & Jeannette Schaeffer (eds.) Language Competence Across Populations: Toward a Definition of Specific Language Impairment. London: Lawrence Erlbaum. 425-447.
Watson, James D. (with A. Berry) (2003) DNA: The Secret of Life. New York: Alfred A. Knopf.
Wurmbrand, Susi (2014) The Merge Condition: A syntactic approach to selection. In Kosta, P. et al. (eds.) Minimalism and Beyond: Radicalizing the Interfaces. Amsterdam: John Benjamins. 139-177.