tag:blogger.com,1999:blog-76906859052072260012024-03-28T13:30:18.504-04:00Ordinary Working GrammarianA blog about natural language syntax, fieldwork and life.Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.comBlogger283125tag:blogger.com,1999:blog-7690685905207226001.post-90918499933827602422024-01-23T02:23:00.004-05:002024-01-23T02:23:29.633-05:00On Implicit Arguments and Logophoricity (Angelopoulos and Collins, poster for NELS 2024)<p>By combining Collins’s (to appear) theory of implicit arguments with Charnavel’s (2019) theory of exempt anaphora, we explain crosslinguistic variation in the distribution of exempt anaphors.</p><p><a href="https://www.dropbox.com/scl/fi/agy42jc14m53fibo4orwp/Poster_NELS_2024-5.pdf?rlkey=l1ec4fqulb9j4l2tgymyna4ap&dl=0" target="_blank"> Poster</a></p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-32813484946210727692024-01-03T00:24:00.011-05:002024-01-05T12:45:48.615-05:00 How to Syntax 2 (Adverbs with Attributive Adjectives)<p>This is the second of a series of blog posts showing how I think about a syntax problem when I first notice it. For the first installment, see:</p><p><a href="https://ordinaryworkinggrammarian.blogspot.com/2023/08/how-to-syntax-i.html" target="_blank">How to Syntax I (the now that-Construction)</a></p><p>I will occasionally choose phenomena that I notice, and talk about them in an informal fashion, breaking down the process of preliminary syntactic exploration. That is, I am just thinking off the top of my head (brainstorming), with few or no revisions. Ideally, I will give myself a time period of three hours maximum to prevent polishing. The focus of the discussion will be on process. I am not trying to come up with a polished analysis. 
</p><span><a name='more'></a></span><p>Of course, if people suggest references for me to look at, I will look at them later, but that would be a second stage of thought, not the preliminary exploration.</p><p><b>Data Capture</b></p><p>How does a syntactician find data?</p><p>In this particular case, I caught myself writing the following sentence in a recent book review that I posted on my blog:</p><p>(1)<span style="white-space: pre;"> </span>A significant part of the book concerns sometimes deeply personal events in Stohl’s life outside of the office,</p><p>I wrote the sentence in writer mode, not in linguist mode. But once I had written the sentence, and I was looking it over, something strange caught my eye: the use of the adverb <i>sometimes </i>modifying the adjective phrase <i>deeply personal</i>. The sentence is completely natural to me, and it raises the following issues: (a) How is the temporal adverb interpreted? (b) What is the temporal adverb modifying? (c) How is (1) related to other uses of temporal adverbs such as (2):</p><p>(2)<span style="white-space: pre;"> </span>The events described were sometimes deeply personal.</p><p>In (2), <i>deeply personal</i> is being used as a predicate adjective phrase, not an attributive adjective phrase.</p><p><b>Basic Combinatorics</b></p><p>Before answering these questions and trying to understand the theoretical implications of a sentence like (1), it is first necessary to try to get some kind of basic understanding of the combinatorial possibilities of the construction. I call this Basic Combinatorics. For example, in (1) the adjective phrase is complex, involving <i>personal</i> modified by <i>deeply</i>. 
It is also possible to find examples with a simple adjective:</p><p>(3)<span style="white-space: pre;"> </span>Her sometimes/often/frequently impenetrable comments really bother me.</p><p>The examples also correspond to sentential uses of the same adverbs:</p><p>(4)<span style="white-space: pre;"> </span>The comments were sometimes/often/frequently impenetrable.</p><p>The example in (3) also differs from the example in (1) in that (3) has a possessor <i>her</i>, and (1) does not have a possessor.</p><p>Another example, due to Richard Kayne, is:</p><p>(5)<span style="white-space: pre;"> </span>that frequently obnoxious friend of theirs</p><p>Once again, this corresponds to the sentence:</p><p>(6)<span style="white-space: pre;"> </span>Their friend was frequently obnoxious.</p><p>In all these cases the adverb must precede the adjective:</p><p>(7)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>* Her impenetrable sometimes comments really bother me.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>*that obnoxious frequently friend of theirs.</span></p><p>But no such constraint holds for clauses:</p><p>(8)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Her comments were impenetrable sometimes.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>?That friend of theirs was obnoxious frequently.</span></p><p>So far, we have examples with the adverbs <i>sometimes, often and frequently</i>. These adverbs are all of a particular class of temporal adverbs specifying the frequency of an event (quantifying over times). </p><p>So, the next question is: Are any other kinds of adverbs possible? </p><p>Since this is a question about adverbs, it helps to have a resource like Cinque 1999 available (Adverbs and Functional Heads, Oxford University Press). 
It is especially useful to have a searchable version of this book. That will help in searching for different kinds of adverbs to test.</p><p>What about the evaluative adverb <i>unfortunately</i>?</p><p>(9)<span style="white-space: pre;"> </span> He regretted his unfortunately inappropriate remarks.</p><p>This sentence seems acceptable, and has the following clausal counterpart:</p><p>(10)<span style="white-space: pre;"> </span>His remarks were unfortunately inappropriate.</p><p>What about the epistemic adverb <i>probably</i>?</p><p>(11)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>?I did not happen to hear the probably excellent lecture.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>?He regretted his probably inappropriate remarks.</span></p><p>These examples do not seem quite as natural to me as those with the other adverb classes. Since they are not completely unacceptable, I suspect that better examples with the same adverbs could be found. At this point, an internet search may be useful. 
The sentential counterparts to these are fine:</p><p>(12)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>His lecture was probably excellent.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>His remarks were probably inappropriate.</span></p><p>Consider now evidential adverbs:</p><p>(13)<span style="white-space: pre;"> </span>I was surprised by his obviously/clearly/apparently inappropriate comments.</p><p>Compare to the clausal counterparts:</p><p>(14)<span style="white-space: pre;"> </span>His comments were obviously/clearly/apparently inappropriate.</p><p>Temporal adverbs like <i>yesterday</i> are completely unacceptable:</p><p>(15)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>*I did not happen to hear the yesterday excellent lecture.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>*He regretted his yesterday inappropriate remarks.</span></p><p>This is so even though the sentential counterparts to these examples are OK:</p><p>(16)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>His lecture was excellent yesterday.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>His remarks were inappropriate yesterday.</span></p><p>One difference between <i>yesterday</i> and the other adverbs is position. 
<i>Yesterday</i> cannot be used clause-internally (only clause-initially and clause-finally):</p><p>(17)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>*His lecture was yesterday excellent.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>*His remarks were yesterday inappropriate.</span></p><p>So, a potential generalization might be:</p><p>(18)<span style="white-space: pre;"> </span>Only clause-internal adverbs can occur with attributive adjectives.</p><p>Note (18) does not say that all clause-internal adverbs can occur with attributive adjectives. Rather, if an adverb occurs with an attributive adjective, then it is one that can appear clause-internally. I have no idea if (18) is an accurate generalization. But by formulating the generalization, you open up the possibility of testing it with further examples down the line.</p><p>Generalization (18) seems to be related to the generalization illustrated in (7). Only clause-internal adverbs can occur with adjectives, and when they do, they must precede the adjective.</p><p>But a clear generalization is the following:</p><p>(19)<span style="white-space: pre;"> </span>Not all sentential adverbs can occur with attributive adjectives.</p><p>I have just briefly gone through a few cases. An outstanding research question is what classes of adverbs can occur with attributive adjectives.</p><p>Can two adverbs be used at the same time? How about:</p><p>(20)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>His unfortunately sometimes inappropriate remarks caught me off guard.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>His unfortunately frequently obnoxious friend was also invited.</span></p><p>These seem a bit awkward, but basically OK. 
So the generalization is:</p><p>(21)<span style="white-space: pre;"> </span>An attributive adjective may occur with multiple adverbs.</p><p><b>Paraphrase and Entailment</b></p><p>Let’s consider the interpretation of the relevant examples. We will probe this question using paraphrase and entailment. Consider first:</p><p>(22)<span style="white-space: pre;"> </span>Her sometimes/often/frequently impenetrable comments really bother me.</p><p>This sentence seems to be equivalent to the following with an appositive relative clause:</p><p>(23)<span style="white-space: pre;"> </span>Her comments, which are sometimes/often/frequently impenetrable, really bother me.</p><p>Similarly, consider again:</p><p>(24)<span style="white-space: pre;"> </span>that frequently obnoxious friend of theirs</p><p>This phrase seems to be equivalent to the appositive relative:</p><p>(25)<span style="white-space: pre;"> </span>that friend of theirs who is frequently obnoxious</p><p>In neither case is a restrictive relative clause interpretation possible. For example, the following continuation seems difficult:</p><p>(26)<span style="white-space: pre;"> </span>?Her frequently impenetrable comments bother me, but the other ones do not.</p><p>Similarly, the following seems difficult:</p><p>(27)<span style="white-space: pre;"> </span>?My sometimes unpleasant cat likes my often amusing cat.</p><p>These remarks lead to the following open question:</p><p>(28)<span style="white-space: pre;"> </span>When a sentential adverb is used with an attributive adjective, is the interpretation always appositive (non-restrictive)?</p><p><b>Theoretical Significance</b></p><p>What is the possible theoretical significance of these empirical observations? 
In this section, I am not proposing to give a polished analysis, just a few pointers to what path one could take in developing the topic.</p><p>The adverbs that we have been looking at are usually taken to modify clausal projections such as TP or AspP (or the clausal projections in Cinque 1999). But in the relevant examples, they appear together with attributive adjectives in a DP, not a clause:</p><p>(29)<span style="white-space: pre;"> </span>his sometimes inappropriate comments</p><p>It seems unlikely that <i>sometimes</i> is modifying <i>inappropriate</i> directly here:</p><p>(30)<span style="white-space: pre;"> </span>his [AdjP sometimes inappropriate] comments</p><p>Rather, <i>sometimes</i> is a quantifier quantifying over times, and should undergo QR to adjoin to a sentential constituent (as other quantifier phrases do). </p><p>To complicate the picture a bit, the following does not seem unacceptable to me:</p><p>(31)<span style="white-space: pre;"> </span>Often inappropriate though his comments are….</p><p>Even though <i>often inappropriate</i> is moving as a constituent, the question is still how <i>often</i> can modify <i>inappropriate</i> directly. My suspicion is that a clausal constituent larger than an AdjP is moving in (31).</p><p>Putting aside (31) for the moment, I suggest that in the example in (29), the string <i>sometimes inappropriate</i> is really a reduced appositive relative:</p><p>(32)<span style="white-space: pre;"> </span>His comments, which are sometimes inappropriate, really bother me.</p><p>The reason why various sentential adverbs can be used with attributive adjectives is that attributive adjectives can be reduced appositive relatives with a hidden clausal structure. 
I put the conclusion in the strongest possible way as follows:</p><p>(33)<span style="white-space: pre;"> </span>All attributive adjectives (both restrictive and non-restrictive) are reduced relatives.</p><p>This is a well-known analysis of attributive adjectives in generative grammar, most recently promoted by Kayne 1994.</p><p><b>Conclusion</b></p><p>I have discussed some methods of preliminary syntactic exploration in this blog post. A summary of the methods is given here:</p><p>(34)<span style="white-space: pre;"> </span></p><p>a.<span style="white-space: pre;"> </span>Data Capture</p><p>b.<span style="white-space: pre;"> </span>Basic Combinatorics</p><p><span style="white-space: normal;">c.<span style="white-space: pre;"> </span>Formulating Generalizations</span></p><p><span style="white-space: normal;">d.<span style="white-space: pre;"> </span>Formulating Open Questions</span></p><p><span style="white-space: normal;">e.<span style="white-space: pre;"> </span>Paraphrase and Entailment</span></p><p>This is only the very first step in cataloguing the methods syntacticians use in preliminary exploration. I hope to be able to post many similar blog posts filling out this list in the future.</p><p><b>Addendum (Wednesday, January 3, 2024)</b></p><p>After posting to my blog, I received two personal communications. Since the blog post is not meant to be a published or polished paper, I do not make any corrections. But I add the comments here, since they are very relevant:</p><p>Cinque (2010: 57) notes: "If modification by (speech act, epistemic, etc.) clausal adverbs is diagnostic of the presence of a clausal constituent, one should expect only indirect modification adjectives, which enter a reduced relative clause, to allow them." </p><p>In that source, he gives the following examples:</p><p>a. These are frankly unacceptable conditions</p><p>b. This is a probably favourable situation</p><p>c. 
This is a certainly important contribution</p><p>He also notes (personal communication) that "These sentential adverbs are only compatible with predicative adjectives (those derivable from relative clauses), but not with direct modification adjectives."</p><p>The source is: <br />Cinque, Guglielmo. 2010. The Syntax of Adjectives. MIT Press, Cambridge.</p><p>Richard Kayne (personal communication) gives me the following examples of restrictive relatives with sentential adverbs:</p><p>a. The only frequently obnoxious friend of theirs that I've met personally is John Smith.</p><p>b. Any even sometimes unpleasant cat is in danger of being given away.</p><p><br /></p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-17028925477787730932023-12-24T09:17:00.003-05:002023-12-24T09:17:32.985-05:00Two Abstracts for: The Cambridge Handbook of the Minimalist Program (forthcoming, Grohmann and Leivada eds.)<p>I am very proud of these two little squibs, both of which are foundational. Erich Groat and Daniel Seely are two of the deepest thinkers about the foundations of minimalist syntax out there, and I am honored to have been able to work with them. Both of these papers follow closely on earlier results of mine, including Collins 2002 ('Eliminating Labels') and Collins and Stabler 2016 ('A Formalization of Minimalist Syntax'). 
I am glad that they are finally going to see the light of day in Grohmann and Leivada's eagerly anticipated handbook.</p><p><span></span></p><a name='more'></a><p></p><p><b>Distinguishing Copies and Repetitions</b></p><p><b>Chris Collins, New York University</b></p><p><b>Erich Groat, University of Nottingham</b></p><p><b>Abstract:</b></p><p>In this paper, we will consider a number of proposals for distinguishing copies and repetitions. We show that each proposal faces serious difficulties. In particular, any solution to the issue of distinguishing copies and repetitions consistent with minimalist aims must meet the following criteria: (a) no operations other than Merge should be used to build structure, (b) nothing beyond lexical items and the structures built from them by Merge should be interpreted by the interfaces, and (c) the definition of Merge should not be made more complex than Merge(X,Y) = {X,Y}. No current proposal satisfies all of these criteria. We conclude that no adequate proposal exists in minimalist syntax for distinguishing copies and repetitions.</p><p><b>Keywords:</b></p><p>copies, repetitions, chains, occurrences, multi-dominance, phases, phase-level memory</p><p>(https://ling.auf.net/lingbuzz/003809)</p><p><b><br /></b></p><p><b>Labeling without Labels</b></p><p><b>Chris Collins, New York University and</b></p><p><b>T. Daniel Seely, Eastern Michigan University</b></p><p><b>Abstract:</b></p><p>We argue in this vignette that Chomsky’s 2013, 2015 Problems of Projection (PoP) is entirely consistent with the label-free syntax initiated by Collins 2002, and further explored in Seely 2006, in deriving the effects of labels from independently motivated principles. Although PoP refers to labels and a ‘labeling algorithm’, in fact, there are no labels in Chomsky’s framework. 
Rather, PoP offers a novel, and elegantly simple, answer to the question of how the effects of labels can be derived without appeal to any special (and stipulative) label-projection mechanism. With respect to categorial identification of a syntactic object at the interfaces, label information in PoP is provided by (i) irreducible features of lexical items, and (ii) 3rd factor, hence freely available, Minimal Search. Merge, the most fundamental operation of the narrow syntax, is thereby simplified by eliminating the label-projection component, allowing just binary set formation: Merge(X, Y) = {X, Y}.</p><p><b>Keywords: </b></p><p>labels, labeling algorithm, label-free syntax, minimal search, simplest Merge, 3rd factor, interfaces, Strong Minimalist Thesis</p><div>(https://ling.auf.net/lingbuzz/005486)</div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-63820526144465795842023-12-24T08:24:00.011-05:002023-12-24T13:32:31.542-05:00Chomsky and Me Too: Review of Stohl 2023<p>(https://www.orbooks.com/catalog/chomsky-and-me/)</p><p>In this blog post, I review Bev Stohl’s memoir ‘Chomsky and Me’ (2023, OR Books) from my personal point of view, as a graduate student who attended the MIT Department of Linguistics from 1988 to 1993. To complement Stohl’s perspective, I describe some of my own experiences in the department, studying with Chomsky.</p><span><a name='more'></a></span><p>Stohl was Chomsky’s administrative assistant at MIT from March 1993 to June 2017. Through his writings, interviews, talks and correspondence, Chomsky developed into a world-wide clearinghouse of political activism. When 9/11 took place, people called him from all over the world asking for interviews to make sense of it all (pg. 50). Countless numbers of people have filed through his office, including frequent film crews, hoping to get advice from him. 
Over one three-day weekend alone, they received over one thousand e-mail messages (pg. 178). For 24 years, Stohl was the gatekeeper for this vast and chaotic enterprise. Every single person had to go through her to get to Chomsky. Much of the book is about the constant challenges she faced in fulfilling this duty.</p><p>Stohl writes in an entertaining, easy-to-read style. The bottom line is that I really enjoyed this book. I especially recommend it to linguists, who will learn about a side of Chomsky beyond his linguistics persona.</p><p>The memoir portrays a human side of Chomsky that would otherwise be impossible to know about. It describes Chomsky as a mild-mannered, somewhat absent-minded professor, who loves to spend time with his grandchildren, and to go boating at Cape Cod during the summers. Stohl developed a warm, caring relationship with Chomsky and Halle, who both went to her mother’s funeral (pg. 208). Within the day-to-day turmoil of Chomsky’s schedule, they maintained a friendly, light-hearted banter. The picture painted is of a team consisting of Stohl and Chomsky and a few others working as a cohesive unit. When she retired, Chomsky said of her (pg. 300):</p><p>“When you walk into our offices over there [he gestured toward our suite], you will find a warm and welcoming atmosphere. Bev created that.” A short pause followed. “Difficult day-to-day things happened in that office. I know it couldn’t have been easy for her, taking care of two old men. There were some very difficult times, but Bev was there, and she did it all well.”</p><p>A significant part of the book concerns events in Stohl’s life outside of the office, before and during her time with Chomsky. These parts of the book help us understand how she came to write the book she did. How did Stohl come to work at MIT? How did she get the job as Chomsky’s assistant? In spite of the intense pressures associated with the position, how did she manage to keep it and thrive for 24 years? 
What parts of Chomsky’s work resonated with her family background? What parts of the job did she enjoy the most? What experiences in her life led up to her decision to write a memoir? How exactly did she start writing the memoir? What role did her blog play in the process? All of these questions are addressed directly or indirectly in the memoir.</p><p>Stohl’s descriptions of Chomsky and the MIT campus are precious to me, jogging old memories. For example, at one point she describes Chomsky’s unique hand gestures (pg. 177):</p><p>I hadn’t realized that I held a view on Noam’s personal sign language until I shared it. “I think he is physically moving and organizing his thoughts with his hands. He pushes both hands to the left or the right, palms splayed slightly down, as if to say, ‘Let’s put this thought over here,’ or ‘That refers to this group.’ He may come to center, fingertips of both hands pointing inward toward his chest, then fanning all fingertips forward and out to the sides.” I illustrated each movement as I spoke. “It has always looked to me like his thoughts are in the space in front of him, and he’s illustrating them as he speaks.”</p><p>We had graduate students from all over the world, including North America, Africa, Europe, India, China and Japan, and all of them were using Chomsky’s unique hand gestures to some extent. Just by watching somebody speak, you could tell if they were an MIT linguistics student by the way they moved their hands. In retrospect, it was hilarious. We were subconsciously imitating him, even down to his hand gestures.</p><p>Stohl also gives a number of harrowing descriptions of Building 20, the long-time home of the MIT Department of Linguistics, before it moved to the Stata Center.</p><p>“Building 20 was put up in 1942 as a temporary structure for radiation research during World War II, to be torn down at war’s end. Yet here it stood, or leaned, nearing collapse. 
Wood shingles had cracked and fallen during our scorching summers and icy winters, and the whitewashed walls had faded to gray. Each time the door to our suite flew open, a knob-sized hole in the wall poured a layer of white powder onto the floor. Asbestos dust, we later learned. This was my new home.”</p><p>This building was a hive of linguistic activity during my time there. Maybe because of its dilapidated condition, we seemed to thrive in it. We would enter the building by climbing metal stairs on the outside of the building. Were those stairs the fire escape? The floors were wooden, and I could always hear people approaching from far down the hallway. But the physical condition of the building did not bother me at all.</p><p>Given Chomsky’s central role in linguistic theory since the mid-50s, it might be a bit surprising that linguistics plays such a minor role in Stohl’s memoir. The linguists and linguistic issues make at most guest appearances, flashing briefly in and out of the powerful day-to-day flow of political activism. </p><p>Here is a nearly complete list of the linguistic anecdotes in the memoir: There was a brief description of Chomsky’s interview with Ann Makepeace who filmed a documentary about the revival of the Wampanoag language (pg. 47). Neurolinguistics makes an appearance during a trip to Italy (pg. 181): “Neurolinguists were studying the brain–language connection by hooking up electrodes to the exposed brains of people before surgery, noting which parts of the brain lit up when answering questions. The patients were conscious, with exposed brains.” Gene Searchinger’s interview with Chomsky (pg. 43) for his series ‘The Human Language’ was mentioned briefly. Somewhat more extensive coverage was given of the behind-the-scenes activity of Michel Gondry’s wonderful film about Chomsky ‘Is the man who is tall happy?’ (pg. 
280).</p><p>I did not walk away from Stohl’s memoir with any additional insight into this period of history (1993-2017) in the field of syntax, the field that Chomsky is closely associated with in linguistics. Nor did Chomsky’s interactions with other linguists (letters, notes, calls, e-mail messages, meetings, debates, public lectures, class lectures, phone conversations, interviews, manuscripts) play a significant role in the memoir. Luigi Rizzi and Adriana Belletti make a cameo appearance, joining Chomsky at an opera in Pavia, Italy (pg. 180). A few other syntacticians (Jim Huang, Andrea Moro) make brief cameo appearances as well. The word ‘syntax’ appears in the memoir only twice, and the word ‘minimalist’ only once. The terminology of syntactic theory does not appear at all. </p><p>Since Stohl is not a linguist, there is no way that she could have understood those themes and reported on them accurately. She herself admits her lack of knowledge about linguistics (pg. 31): </p><p>“From my newbie vantage point, the linguistics side of things seemed innocuous, but also murky. I could define phonology and syntax, but knew little more about linguistics. Standing outside Noam’s office, I’d overheard him asking an advisee, ‘Can you say in Icelandic: There have many men baked cakes?’ These peculiar discussions made me wonder what it was linguists did. Years later, I still wasn’t sure.”</p><p>Another reason for the lack of coverage of linguistics may be the way Chomsky worked in the two worlds: political activism versus linguistics. Although he would meet regularly with the linguistics and philosophy students in the department, the vast majority of the external visitors to the department must have been activists, not linguists. These were the celebrities and the people arriving with film crews. 
They would be the people who would catch Stohl’s attention and often give her problems in the office (moving furniture around, running over time, etc.).</p><p>On a related note, Stohl (pg. 289) gives a career tabulation of Chomsky’s writings of over 80 political books and over 30 linguistics books. That ratio might also be taken as an approximation for the amount of time, effort and thought Chomsky spent on the two disciplines during his entire career (73% political activism, 27% linguistics and philosophy). And that percentage seems to have tilted even more in favor of activism over time (see Stohl 2015).</p><p>Now, about my relationship with Chomsky.</p><p>I began my undergraduate education at the University of Minnesota in 1981, and transferred to MIT midway through my sophomore year. Chomsky’s work was prominent at MIT then, often being discussed in my courses. In the summer after my sophomore year, I read ‘Reflections on Language’ (1975, Pantheon Books). I know I read it, because my copy of the book still has copious margin notes and underlining. My notes make reference to Piaget and Artificial Intelligence, showing what was on my mind at that time. I was struggling to understand what Chomsky was saying.</p><p>Then in the summer after my junior year, I read through Andrew Radford’s Transformational Syntax (1981, Cambridge University Press) in its entirety and worked through all the exercises with a friend. The book was a very clearly written presentation of Chomsky’s theory of syntax. I remember staring in awe at the text of the book. The combination of formal reasoning with psychological themes resonated very deeply with me. I was immediately hooked.</p><p>I returned to MIT as a graduate student in the Department of Linguistics, where I studied from Fall 1988 to Spring 1993. Stohl and I overlapped for a few months, although to be honest I do not remember her from that time. Rather, I remember that Chomsky’s administrative assistant was Jamie Young. 
Stohl took over from Young in 1993, after interviews involving Halle and Young, but not including Chomsky (pg. 29).</p><p>At the beginning of each semester, I walked into Chomsky’s office and arranged my meetings with Young for the semester. All the grads would do this, so his schedule quickly filled up. I also attended his Fall and Spring seminars. </p><p>Chomsky taught two courses per year. In the Fall he taught 24.958 ‘Linguistic Structure’, always scheduled on Thursday afternoons from 2:00 to 4:00. In the Spring he taught 24.957 ‘Introduction to Linguistic Theory at an Advanced Level’. This was a philosophical course examining the foundations of syntactic theory. They were listed as follows in the 1986-1987 MIT Course Catalogue.</p><p>24.957 Introduction to Linguistic Theory at an Advanced Level (A)</p><p>Prereq.: Permission of Instructor</p><p>G(2)</p><p>3-0-9</p><p>Discussion of conceptual and methodological issues: goals of linguistic theory and its place in the study of thought and behavior; descriptive and explanatory theories; the nature, use and acquisition of language compared with other cognitive systems; relations of form, meaning and language use. Examinations of theories of transformational generative grammar as they have evolved and are now being pursued: theory of base, transformations, semantic interpretation of formal structures, logical form and conditions on the form and functioning of rules.</p><p>N. A. Chomsky</p><p>24.958 Linguistic Structure (A)</p><p>Prereq.: 24.952 or 24.957</p><p>G(1)</p><p>3-0-9</p><p>Current work on topics in syntax and semantics. </p><p>Permission of instructor required.</p><p>N. A. Chomsky</p><p>Chomsky was not at MIT in the Fall of 1988, presumably because he had just had back surgery that summer (see Chomsky and Kelman 2021: 113). So, Richard Larson took over his seminar that semester, and taught the beginnings of his VP-shell theory. 
Larson’s theories fed directly into Ken Hale and Jay Keyser’s work on argument structure, and all this work formed a critical part of Chomsky 1995 (see below).</p><p>From my time in Togo in the Peace Corps (1985-1987), I had become obsessed by serial verb constructions, and I wanted to start to develop my thinking about them. So, for my paper for the Spring 1989 seminar, I wrote about Mark Baker’s paper on serial verb constructions (1989. Object Sharing and Projection in Serial Verb Constructions, Linguistic Inquiry 20). While Baker’s paper ingeniously captured the object sharing property of serial verbs, it did so at the expense of a drastic weakening of X’-Theory (so that verbal projections could have two heads). There was a serious restrictiveness issue lurking in Baker’s account: requiring projections to have exactly one head drastically limits the total number of syntactic structures available to a language learner, and Baker’s account gave up that restriction. In syntactic theory, there is an ever-present tension between theoretical restrictiveness and empirical coverage.</p><p>In retrospect, I should have focused on the important restrictiveness issue, but instead I detailed my thoughts on a number of small technical issues in Baker’s paper. I clearly remember that Chomsky’s main comment at the very top of my paper was “Pretty terse + hard to follow”. This is an exact quote, since I still have a scan of the paper today. Some other comments on the paper were: “I don’t follow the argument.”, “Sounds plausible, but you haven’t really given an argument.”, “I don’t see why this follows.”, “There is something wrong with the reasoning here.”, “I don’t see this.”, “Not a very strong argument.” He had obviously read the paper very carefully, and he did not find it convincing. I was completely devastated. This was one of my first intellectual interchanges with Chomsky, and I had apparently failed. 
After spending days thinking of detailed responses to all of his remarks and jotting them down on the paper under his remarks, I realized he was exactly right in all of his comments. The paper was indeed terse and hard to follow. I needed to raise my game to the next level. That was a learning experience for me, and I appreciate his honesty in replying to my paper.</p><p>As for his Fall syntax seminar, it was invariably packed with students and faculty both from MIT and surrounding schools (e.g., Harvard, UMASS, UCONN, and many others). Chomsky always focused on the structure of syntactic theory, not just solving empirical problems, but laying out what the theory should look like. He freely took questions in class, so the atmosphere was electric. People were raising their hands, bringing up counter-examples or noting theoretical consequences. He had developed a system whereby he would lecture for two hours, then for the third hour only students in the class could attend a Q and A period. As I recall, there were never any handouts, nor was there a syllabus. He just lectured from his notes, and wrote extensively on the blackboard. The feeling I got was that he was developing most of the material as the semester went on, which I found exciting. From that time on, I have always associated the oncoming chill of the Fall air with the intellectual excitement of attending Chomsky’s Fall lectures.</p><p>His seminal paper ‘Some Notes on Economy of Derivation and Representation’ (first published in MITWPL in 1989) had already started circulating as a draft in 1988, and was creating a buzz amongst the students. In that paper, he introduced the notion of economy of derivation and used it to reanalyze parts of Jean-Yves Pollock’s famous paper on verb movement in French and English (1989. Verb Movement, Universal Grammar, and the Structure of IP. Linguistic Inquiry 20, 365-424). The notion of economy was to become a central component of the minimalist program. 
</p><p>The years of my graduate education (1988 to 1993) correspond roughly with the beginning of minimalist syntax, which I would date from the publication of Chomsky 1989 (‘Some Notes’, MITWPL 10) to the publication of Chomsky 1995 (‘The Minimalist Program’, MIT Press). I call this period ‘early minimalism’. It was fascinating to see how the ideas developed over those years. The minimalist program did not just appear overnight, but it took years of pondering the data and hammering away at the theory before taking form.</p><p>All of the extremely smart graduate students focused their attention on helping him to work out his new theory. We were engaged in a common project, having long and intense discussions amongst ourselves to try to understand it. Since the evolving minimalist program involved quite a few conceptual shifts from past versions of generative syntax (e.g., Merge, the Copy Theory of Movement, Bare Phrase Structure, eliminating D-Structure and S-Structure, eliminating government, economy conditions on derivations and representations, the notion of convergence), it was an intellectually exhilarating time period. All the old analyses had to be rejected or completely revised. It felt like the doors had been opened wide, leading to vast empirical and theoretical vistas. It was a time of great opportunity for the students, who were writing insightful papers and theses exploring the framework. Chomsky’s preliminary steps in minimalist syntax were published in ‘The Minimalist Program’ (MIT Press, 1995).</p><p>I suspect that the development of minimalism was one of the reasons that I found a prestigious job immediately after graduate school (Cornell University, 1993). People in other universities wanted to know what was going on, and when I arrived, they wanted me to teach seminars explaining the new developments. I was the best person to help them, because I had just spent years deeply immersed in these new ideas at MIT. 
Arriving at Cornell, I also interacted with the philosophers there, who were very knowledgeable about linguistic theory and enthusiastic about minimalism. In the years after graduation, I was invited all over the world to talk about minimalist syntax, including China, France, Germany, Holland, Japan, Norway, South Africa and Spain. Apparently, a very small bit of Chomsky’s magic had rubbed off on me.</p><p>The excitement in the air in the late 1980s and early 1990s paralleled the excitement at the time that Government and Binding (Principles and Parameters) was being developed in the fall of 1979, around ten years earlier. According to David Pesetsky (quote from Macfarquhar 2003, The New Yorker):</p><p> “It felt like a revolution,” he says. “It was very exciting. Suddenly there were questions that you could ask that hadn’t been asked before, and real answers to questions that people had been asking before. And to be a student here at the time was an incredible privilege. In a sense, it was a cheat. Because it was just very, very easy to say something interesting that no one had ever said before. You could be a celebrity!”</p><p>Curiously, when minimalism rolled around, Pesetsky was critical of it. He was not at all on board. I remember Pesetsky making critical comments during his own seminars and also once right after Chomsky’s lecture. Other students from my cohort experienced similar staunch opposition to minimalism. Perhaps Pesetsky was just being conservative theoretically, which is not a bad thing. After all, theoretical innovations should be explored and tested empirically before they are adopted. But it must have been difficult to be a syntax faculty member at MIT and to be faced with the intellectual tidal wave of Chomsky’s ideas.</p><p>It is important to highlight Howard Lasnik’s role in the development of early minimalism. Lasnik was one of Chomsky’s closest collaborators in syntax around then, co-authoring chapter 1 of Chomsky 1995. 
Lasnik always attended Chomsky’s Thursday lecture. In the hours leading up to the lecture he was available in Building 20 for meetings with students. I tried to meet with him as much as possible. During these meetings, we would talk about my own work, but we would also talk about what Chomsky was developing in his seminar. He was a key figure in my syntax education. His students at the University of Connecticut consistently produced some of the most important theses on minimalist syntax.</p><p>My appointments with Chomsky were intense. I recall jokingly telling a classmate that meetings with him resembled a scene in Star Trek where Captain Kirk and Spock are fighting (with the suspenseful background orchestra music), and clearly Chomsky was Spock. It is not that Chomsky was aggressive in any way; he was just very focused as a person. He did try to end each appointment with a positive word or two, saying something like “Good work!”. See also Stohl’s memoir (chapter 4, pg. 40) for a description of a student meeting.</p><p>For Chomsky the issue was always how your work fit into syntactic theory, in this case minimalism. Other people had very different styles. With Howard Lasnik, the focus was on the logic of the argument, which is why years later his students put together a volume of papers called ‘Step by Step’ (2000, MIT Press). David Pesetsky would pepper you with questions about alternative possible analyses, dubbed by the students in my cohort the ‘Pesky Set’ (that set of pesky alternative analyses that you need to consider before finding the best analysis). Ken Hale had a vast knowledge of cross-linguistic data at his fingertips, not just of Indo-European languages, but also of Native American and Australian languages. He would often help you make important connections to related phenomena in other languages. To be able to train with these four excellent syntacticians was an incredible privilege. 
</p><p>For my dissertation (Topics in Ewe Syntax 1993), Chomsky was a member of my committee, along with Ken Hale (Chair) and David Pesetsky. Since Ken Hale was a high-caliber field linguist, I thought he would be a more appropriate choice as Chair. But I met just as frequently with Chomsky. He was especially interested in what I had to say about successive cyclic movement (movement broken down into smaller steps) and economy of derivation. He played an important role in advising me on my first refereed journal publication (1994. Economy of Derivation and the Generalized Proper Binding Condition. Linguistic Inquiry 25.1, 45-61). Although this paper was published in 1994, I had written it under Chomsky’s supervision while at MIT in 1993. I believe this paper was one of the very first papers to show how economy conditions apply to syntactic derivations, helping to explain a complicated empirical phenomenon.</p><p>Since graduating in 1993, I have kept in regular contact with Chomsky in various ways. </p><p>For my first study leave from Cornell, I flew from Ithaca every Thursday to his syntax seminar. I would wake up at 5:00 in the morning to catch my flight, arrive at Logan Airport, take a taxi into Cambridge and spend the day in meetings at Au Bon Pain in Kendall Square with MIT students and colleagues. These were great meetings, having a coffee and croissant, and debating syntactic theory in the warmth of a hidden corner of a café. After Chomsky’s lecture, I would take the plane back from Logan and arrive late at night in Ithaca, opening the door of the house when the kids were already fast asleep, syntax ideas coursing through my brain.</p><p>In 2009, a few years after Carol Chomsky passed away, I gave a talk on my paper ‘A Formalization of Minimalist Syntax’ (2016, Syntax) at the MIT Syntax-Semantics Reading Group. I was happy to see many of my old teachers there, including Wayne O’Neil and David Pesetsky. 
Chomsky did not come to the talk, but I had made an appointment with him afterwards. This was also the first time that I remember meeting Stohl, since she was there to let us into his office in the Stata Center.</p><p>Because my wife had never met him, I decided to take her along to introduce her. My wife is a nurse, so she and Chomsky spent about half of my meeting talking about medical care and the importance of nurses. He emphasized the wonderful care that Carol had received from nurses during her illness. He seemed happy to pass the time talking to my wife about health care in America. Finally, as calmly as I could manage, I asked my wife if I could please discuss my syntax paper, which was the sole reason for my visit to Cambridge that day. We had a productive conversation where he once again expressed his long-standing skepticism about formalization in syntactic theory. I quizzed him on various proposals in the paper. When Stohl appeared in the doorway, we got up to go. As usual, my appointment with Chomsky had given me lots to think about. </p><p>Since graduation, I have kept in contact with Chomsky on a regular basis through e-mail. I write messages to him to discuss various topics. If I hear he has a new paper out, I write to him to get a copy. I always send him papers of mine that I think he will be interested in. Since 1993, I have exchanged hundreds of e-mail messages with him. And he has never once failed to respond. These discussions could be quite intense, lasting several weeks. From what I know about other syntax colleagues, they too have had very productive professional relationships with Chomsky after graduating, involving intense and lengthy e-mail discussions about syntactic theory. 
In this, we are beneficiaries of Chomsky’s willingness to engage with people, one of the important running themes of the memoir.</p><p>Most recently, I did two closely related interviews with him for my blog, one in 2021 about formal semantics, and one in 2022 about language and thought:</p><p>‘A Conversation with Noam Chomsky about Formal Semantics’</p><p>(https://ordinaryworkinggrammarian.blogspot.com/2021/06/a-conversation-with-noam-chomsky-about.html)</p><p>‘A Conversation with Noam Chomsky about Language and Thought’</p><p>(https://ordinaryworkinggrammarian.blogspot.com/2022/03/a-conversation-with-noam-chomsky-about.html)</p><p>These interviews were just natural extensions of e-mail correspondence that we were already having. After a few dozen e-mail messages, I would ask Chomsky: “Can I post this discussion on my blog?”, and he agreed, with no restrictions at all. I did let him read the interviews to make corrections, but except for typos he did not change anything.</p><p>As is clear from this blog post, Chomsky’s influence on my intellectual development has been profound. In the 1950s, he created generative grammar, the framework which captivated me as an undergraduate, and led me to graduate school in linguistics. Then, in the early 1990s he developed the minimalist program, which I have been working on ever since. Because of his importance in my life, I was quite eager to read Stohl’s memoir. Although the memoir does not touch on the development of syntactic theory, and says little about linguistics more generally, I was richly rewarded with detailed descriptions of Chomsky’s day-to-day work in the office, and a much better understanding of Chomsky as a human being.</p><p><b>Acknowledgments</b></p><p>Thanks to Akira Watanabe for sending me a scan of his notes on Chomsky’s lectures (1989-1991).</p><p><b>Selected References</b></p><p>Gammie, Duncan. 2022. Bev Stohl – Chomsky’s Assistant. 
Dunk Tank [Podcast].</p><p>(https://dunctank.podbean.com/e/bev-stohl-chomsky-s-assistant/)</p><p>Genova, Evelisa. 2023. Confidence and Creativity with Noam Chomsky’s Assistant Bev Stohl. Stories of Life and Love [Podcast].</p><p>(https://podcasters.spotify.com/pod/show/evelisa/episodes/Confidence-and-Creativity-with-Noam-Chomskys-Assistant-Bev-Stohl-e1t988f)</p><p>Hawkins, John. 2023. Interview with Bev Boisseau Stohl: Chomsky and Me. OpEdNews.com.</p><p>(https://www.opednews.com/articles/Interview-with-Bev-Boissea-Critical-Thinking_Interviews_Noam-Chomsky-230322-723.html)</p><p>Lydon, Christopher. Beverly Stohl on Noam Chomsky’s Soul. SoundCloud.</p><p>(https://soundcloud.com/radioopensource/beverly-stohl-on-noam-chomskys-soul)</p><p>Stohl, Bev. LinkedIn Profile.</p><p>(https://www.linkedin.com/in/bev-stohl-01505728/)</p><p>Stohl, Bev. Bev Stohl’s Stata Confusion [Blog].</p><p>(https://bevstohl.blogspot.com/)</p><p>Stohl, Bev. 2015. What It’s Like to Be Noam Chomsky’s Assistant. Chronicle of Higher Education (December 18, 2015).</p><p>(https://uat.brightspot.chronicle.com/article/what-its-like-to-be-noam-chomskys-assistant/)</p><p>Stohl, Bev. 2020. Mamma’s gonna buy you a mocking bird. Stethoscopes and Pencils.</p><p>(https://stethoscopesandpencils.com/2020/11/12/mamas-gonna-buy-you-a-mocking-bird/)</p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-65424463059692508652023-12-22T09:03:00.001-05:002023-12-22T09:03:27.297-05:00Harvard Society of Fellows Application (September 2 1992)<div><a href="https://www.dropbox.com/scl/fi/wg57ax0yvo5oj68gnuooh/Society-of-Fellows-cleaned-copy.pdf?rlkey=e58o2hyytwdgjl7fxsj5nxbj1&dl=0" target="_blank">Application (September 2 1992)</a></div><div><br /></div>At the end of graduate school, I applied for Harvard’s Society of Fellows. 
It was basically a three-year period where one could do any research one wanted, and interact with all kinds of very smart people. I knew Chomsky had been a fellow nearly forty years earlier, and that this fellowship allowed him the intellectual room to write his masterpiece The Logical Structure of Linguistic Theory (LSLT, of which his dissertation is a chapter). I would have loved following in his footsteps. My application was all about economy of derivation and trying to develop it in various ways. I applied for the position, and told Morris Halle, who said to me something like: “No, you will never get it.” I was hurt by that comment, but he was just being realistic. I did not get the fellowship. <div><br /></div><div>Several MIT linguistics students subsequently went on to get the fellowship.</div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-1239287231295146142023-11-06T03:52:00.005-05:002023-11-06T03:57:03.812-05:00Two Abstracts for ACAL55 on Kalahari Khoe<p>Here are two related abstracts that I submitted for ACAL55 with co-authors. ACAL stands for Annual Conference on African Linguistics.</p><p>They both concern the classification of Khoe-Kwadi languages (Central Khoisan). The first deals specifically with Tshila, which has not been very well classified before. The second deals with the structure of the Kalahari Khoe subgroup of Khoe-Kwadi, arguing that it should be divided into northern and southern Kalahari Khoe. The methodology of the second paper is based on the Bantu linguistics paper by Marten, Kula and Thwala 2007. </p><p>As of the posting date (November 6, 2023), neither abstract has been either accepted or rejected.</p><p><br /></p><p>Batchelder-Schwab, Andre and Chris Collins. 2023. Classification of Tshila. </p><p>Abstract submitted to ACAL55. 
<a href="https://www.dropbox.com/scl/fi/o166ceqlemz1vx6sx5ymc/ACAL55-Tshila.pdf?rlkey=a72cme0dgb3n0jnqw2p56028z&dl=0" target="_blank">Abstract.</a></p><p><br /></p><p>Collins, Chris and Anne-Maria Fehn. 2023. Parameters of Morphosyntactic Variation in Kalahari Khoe.</p><p>Abstract submitted to ACAL55. <a href="https://www.dropbox.com/scl/fi/70cm0qmf6thiew4hyzyzq/ACAL55-Parameters.pdf?rlkey=trqq53icn7i9wl5a10c4slr0d&dl=0" target="_blank">Abstract.</a></p><p><br /></p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-5171140965120144522023-11-06T02:46:00.002-05:002023-11-06T02:46:41.708-05:00On implicit arguments and logophoricity (NELS abstract, Angelopoulos and Collins 2023)<p>This abstract was accepted as a poster at NELS 54 (2024). Empirically, it documents differences between exempt anaphora in Greek and English. It accounts for those differences by postulating a deep connection between logophoricity and implicit arguments in the sense of Collins 2023 (forthcoming, MIT Press). </p><p>If you are unable to download the abstract, let me know.</p><p><a href="https://www.dropbox.com/scl/fi/naaeblzsm9vze566cyngh/nels-9e7aa256-05cd-4f54-8ad8-e504d5b3bc76-1.pdf?rlkey=9u91dup4hz9qgkzqo66svacqp&dl=0" target="_blank">Abstract: On implicit arguments and logophoricity</a></p><p><br /></p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-17404405637917391672023-10-14T10:46:00.007-04:002023-10-14T10:47:26.690-04:00Possible Seminar Topics 2024-2025<p>The following is a list of possible seminar topics for 2024-2025. That is, each item below represents a different seminar topic. I need to choose one of them.</p><p><b>1.</b></p><p><b>Quotative Inversion</b></p><p>We will begin the semester reviewing the literature on quotative inversion written in the wake of Collins and Branigan (1997). 
We will try to systematically enumerate all known syntactic properties of quotative inversion in English. We will then develop a more modern account based on remnant movement and smuggling. Connections to related phenomena, such as locative inversion and subject-object inversion in Bantu, will be explored. Students will be encouraged to look at quotative inversion cross-linguistically for course papers and presentations.</p><p><b>2.</b></p><p><b>Merge, MERGE and Workspaces.</b></p><p>Recent work on the foundations of minimalism, by Chomsky and others, has focused on the role of the workspace in syntactic derivations. In this seminar, we will review work on workspaces, starting with Collins and Stabler 2016. We will evaluate Chomsky’s arguments for MERGE over Merge. Emphasis will be on developing empirical predictions of the various theoretical formulations. Depending on the interest of the students, other possible topics may include labelling and copies versus repetitions.</p><p><b>3.</b></p><p><b>Morphology as Syntax: Spelling out Syntactic Structure</b></p><p>In this seminar, we will review various proposals in the morphology literature for spelling out syntactic structure, including proposals based on spans, DM (Vocabulary Insertion), Nanosyntax and MaS (Collins and Kayne 2023). We will discuss the theoretical foundations of each of these approaches. Then we will present various case studies from the literature, and compare them.</p><p><b>4.</b></p><p><b>Topics in Argument Structure</b></p><p>The purpose of this seminar will be to investigate a small range of topics (e.g., adjectival passives, unaccusatives) from the perspective of the Merge-based approach to argument structure developed in Collins 2023 (see also Collins 2005). In the first two weeks, we will review the main results of Collins 2023, and then quickly branch off into unknown territory. 
The topics investigated will be decided jointly by the participants in the seminar.</p><div><br /></div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-30916724480674647822023-10-11T04:26:00.002-04:002023-10-11T04:33:10.712-04:00Reply by Kenneth Wexler ("On Realizing External Arguments")<p>The following post is a reply by Prof. Kenneth Wexler to Chris Collins' blog post reviewing Koring et al. ("On Realizing External Arguments", Linguistic Inquiry, forthcoming).</p><p>The article is found here:</p><p><a href="https://direct.mit.edu/ling/article-abstract/doi/10.1162/ling_a_00520/117701/On-Realizing-External-Arguments-A-Syntactic-and?redirectedFrom=fulltext" target="_blank">Koring et al. ("On Realizing External Arguments")</a><br /></p><p>The review is found here:</p><p><a href="http://ordinaryworkinggrammarian.blogspot.com/2023/10/review-of-on-realizing-external.html" target="_blank">Review of Koring et al. (forthcoming)</a><br /></p><span><a name='more'></a></span><p>I am grateful for Chris Collins’ review of Koring, Reuland, Sanger and Wexler (in press, henceforth referred to as OREA), which considers our attempt to integrate acquisitional and judgmental evidence in helping to construct some portion of linguistic theory. I will start by totally agreeing with Chris’ observation that the fact that two different approaches, with two very different data sets, converge on the same conclusion – that the external argument of short (i.e. no audible external argument) verbal passives is syntactically projected – lends strong support to the conclusion. It was one of the points of our paper that that was in fact the case. And Chris’ well-known 2005 paper and recent book (in press, MIT Press) also reach this conclusion. 
Yet they argue in a completely different way.</p><p>In the penultimate paragraph of his review, Chris writes that OREA would have been “strengthened considerably” had we noted how its conclusions decided between “current theories of the passive,” namely between Bruening (2013), who argued that the implicit external argument of the verbal passive is not syntactically projected, and theories in which the argument is syntactically projected. But I believe that’s exactly what we did do, though perhaps in a different form. We argued against Bruening’s view that verbal and adjectival passives project exactly the same functional structure. We argued that a major difference is that verbal passives have a syntactically projected EA and adjectival passives don’t. Bruening claims neither does.</p><p>At one point in the reviewing history of OREA, a reviewer thought that too much was devoted to a critique of Bruening’s conclusions, which we had carried out since that is a current approach that gets attention, and we thought that it had to be wrong. In response, we greatly shortened our discussion. Possibly too much. Nevertheless, it is obvious from our conclusions that our results disagree with Bruening’s theory of the verbal passive. We stressed that the EA in verbal short passives is syntactically projected.</p><p>Chris argues that the implicature analysis of the disjointness effect in short passives has a problem. Following Fox and Katzir’s grammatical implicature model, which unites implicatures and focus theory, we argue that the reference of the implicit external argument in a short verbal passive is determined to some extent by implicatures. In the sentence Mary was nominated EA, the theory says to substitute other salient constituents (there are other alternatives, not relevant here) for the EA, and if this new sentence is strictly stronger logically than the original sentence, then that new sentence is negated. 
So Mary was nominated implies that it’s not the case that Mary nominated Mary. This is the disjointness effect.</p><p>Chris argues that we could substitute any DP x that is somehow in the context into the EA position and derive the implicature that it’s not the case that x nominated Mary. In particular, in our experiments, we could have done just that. So when we show a picture of Homer washing Bart and another one of Bart washing Homer, and have the kid choose a picture as the meaning of Bart was washed, not only is “It’s not the case that Bart washed Bart” derived, but so is “It’s not the case that Homer washed Bart.”</p><p>However, despite the fact that early in the experiment kids were made familiar with the characters (usually in a minute or so), there is no reason to take any character who hasn’t been mentioned in the sentence as salient. Clearly, it’s the mentioned character who is salient when we create alternatives. So no such incorrect implicature is made.</p><p>Chris argues that OREA’s argument against a Principle B analysis of the missing disjointness in verbal passives in children is wrong. At first, I thought that Principle B might be the explanation. But with further thinking, it didn’t seem to pan out. We followed (our footnote 11) Bhatt and Pancheva (2006) in accepting that the implicit external argument is an existentially bound variable and thus would not allow coreference. Chris accepts this claim for existentially bound implicit EAs but argues that there are different kinds of bound variables besides existentially bound ones. See his in-press book, chapters 2 and 4. One of them is pro. If the argument were pro, Principle B would prevent coreference with the surface subject of the passive, yielding disjointness. 
If children were “more forgiving” of Principle B errors than adults, then they would violate disjointness, yielding our experimental results.</p><p>I have attempted to understand Chapter 2’s arguments about the different types of implicit arguments, and I just haven’t had enough time to unpack them, so I want to be careful here. In fact, I quite admire Chris’ dogged and detailed search for exactly what type of entity the implicit argument is; it is very sophisticated and grapples with much data. I’m not sure how we know that the relevant implicit EA in our examples is pro, not an existentially bound variable. Furthermore, one thought I have had is that if in fact pro is bound by the subject, not just co-referential with it, then experimental results starting with Wexler and Chien (1985) and Chien and Wexler (1990) through Thornton and Wexler (2000) and many more papers in several languages and many labs show that bound variables do not induce much in the way of errors in children (e.g. every duck pointed to her can’t mean that every duck pointed to herself, for children as well as adults). So if this binding of pro is true variable binding, we wouldn’t get the errors in disjointness in children. Now it may be that coreference rather than binding could be involved with pro, in which case my argument against it is wrong, and the typical “binding” (that is, referential) errors would induce the disjointness error. It’s my lack of understanding, my fault I am sure, that prevents me from coming to a more explicit conclusion here.</p><p>Chris goes on to argue that the advantage of his Principle B proposal is that it wouldn’t require such a “drastic difference” between the way children and adults represent verbal passives. And in fact, the perfect result would obtain; children learn what the verbal passive is on their first hearing one.</p><p>Several comments are worth making. 
First, we have no idea how it is that a child, before knowing how a form of words is syntactically decomposed, takes a sentence (string of words? partially analyzed?) in the input and maps it into a derivation. One might think, well, the child knows from the context that the first DP is the theme rather than the agent, so it can’t be the typical active sentence. Fair enough; I have long argued that some semantic information from context is necessary to allow development of grammar (Wexler and Hamburger, Wexler and Culicover, many other papers). But I have also pointed out how difficult it is to know what information is available and reliable, and how it works. Even with many years of useful work on syntactic bootstrapping, we don’t have as yet a good substitute for semantic information. We do want the data to be what Chomsky calls “epistemologically available” to the child. The simpler and more surface transparent the better. But let’s say that the child without passive grammar knows enough about thematic roles and has enough cognitive capacity to figure out that the thematic role of the surface subject in a short verbal passive is a theme/patient, not an agent. So not the external argument. Perhaps, though this is even more difficult to know, they even know that the sentence is intended as eventive rather than stative, e.g. perhaps it includes a progressive, so that it’s not an adjectival passive (though consider that even this is problematic given what we’ve seen about get-passives). Can the kid really tell from context whether the verb is eventive or whether it’s an aux (like get) with a resultant-state adjectival passive? Even if the child can figure out the verb is in fact eventive, couldn’t the sentence be a topicalization? We know that it’s not topicalization because we have judgmental data about all sorts of phenomena, about how passives and topicalizations work differently. But the child has no such access, by a long shot. 
So there’s not a perfect result. This is unknown.</p><p>That doesn’t mean that we must believe that the child represents verbal passives as adjectival passives. (Not that it’s actually a drastic difference between the representations of the adult and child for verbal passives; rather, the child simply thinks that verbal passives are ungrammatical.) I’ve suggested, provocatively to be sure, in my Development of Phases paper in MITWPL, that the child is a more perfect minimalist grammarian than the adult, who seems to deviate in certain ways, in particular in allowing defective phases. In general the theme of much of my work in acquisition is that children have more tightly constrained UG than adults.</p><p>It's not a prima facie argument that tells us that kids analyze verbal passives as adjectival. Rather, it’s years of empirical research. The starting point for our research for OREA was an attempt to find yet another empirical test of the conclusion reached first in Borer and Wexler (1987) that children performed well on a subclass of verbal passives (especially activity verbs versus subject experiencer verbs) because they analyzed these verbs as adjectival passives. The syntax of verbal passives was taken to be ungrammatical for young children, but the adjectival interpretation allowed the children to at least compute the theta roles correctly, despite considering verbal passive syntax as ungrammatical. Children from ages 4 to 5 performed well on these verbal passives, but only several years later on subject experiencer passives (despite doing well on subject experiencer actives).</p><p>Over the years, there has been a great deal of empirical evidence supporting the conclusion that children before about 4 or 5 parse verbal passives as adjectival and very little counterevidence. This has even extended to direct interpretive experiments, where the children very often interpret verbal passives as adjectival, but not vice-versa. 
The languages in the experiments were Catalan and Spanish, in which the verbal and adjectival passives are both auxiliary + participle, but are not homophonous, unlike English, since the auxiliaries for verbal and adjectival passives differ.</p><p>We thought that Dutch would be another language in which to test the adjectival analysis assumption without homophony. The auxiliaries of the verbal and adjectival passives differ. But we wanted to test yet another, very different, phenomenon, the disjointness effect in verbal passives, which doesn’t hold in adjectival passives. We predicted that kids at a young enough age would violate disjointness because they had misanalysed verbal passives as adjectival. It’s always a useful idea to find totally different types of phenomena to support a theoretical position.</p><p>Our thinking, of course, would not have been accepted by Bruening’s analysis. He argued that disjointness holds or doesn’t hold in either verbal or adjectival passives, and that when it appeared that disjointness held, it was for reasons other than the existence of an implicit external argument, which we took as central. So we argued against Bruening, since it seemed to us that his article was getting attention. </p><p>Perhaps we should have referenced the literature that argued for the existence of a syntactically projected external argument in verbal passives. But the point seemed to us to have been accepted in the literature for more than 70 years. In (1a) (examples like this from Kratzer) there is simply a strongly felt intuition that there is an external argument: there is an agent or instrument that is securing the mountaineer. In (1b), which is adjectival (the non-progressive present tense verbal passive is ungrammatical or, perhaps, infelicitous, in English), there is no such intuition.</p><p>(1)<span style="white-space: pre;"> </span></p><p>a. The mountaineer is being secured</p><p>b. 
The mountaineer is secured</p><p>Of course, there is a disjoint reference intuition in (1a) but not (1b). In (1a) we strongly feel that somebody or something other than the mountaineer is doing the securing. In (1b) we don’t – it could be the mountaineer herself who is doing it. <i>Secure</i> is not in the class of verbs that Bruening claims are the only ones that show the relevant effects – the verbs which alternate between transitive and intransitive, like <i>dress</i>, <i>wash</i>. In fact, we would claim that the effects exist for all activity verbs, the ones that Kratzer shows all have a resultant-state adjectival passive. One more example.</p><p>(2)<span style="white-space: pre;"> </span></p><p>a. Mary is being established (in her law practice)</p><p>b. Mary is established (in her law practice)</p><p><i>Establish</i> is not an alternating one- or two-argument verb, but native speakers have a strong intuition that in (2a), the person establishing Mary is not Mary, but in (2b), it might very well be (probably the preferred reading, in fact, although of course (2b) is ambiguous as to whether or not the establisher is Mary).</p><p>These judgments have been accepted as so strong that they are used in baby syntax or even in intro to linguistics or even in discussions with the non-linguistic general public to illustrate the syntactic and semantic role of arguments for which there are no words, phonetically empty arguments. We took this position as simply classical and didn’t reference (much of?) the literature. Perhaps it would have been better to do so. We did say that the strong intuition was foundational.</p><p>OREA reports two experiments (one Dutch, one English) which show that young kids in fact violate disjointness – 3-year-olds are completely at chance, displaying no knowledge. This is yet further strong evidence for our analysis, on which disjointness follows from the syntactic projection of the external argument in verbal but not adjectival passives. 
The young kids have no external argument in their comprehension of the passive – it is adjectival for them. At the same time, the experimental evidence shows yet again, and with totally different phenomena, that the classical and already amply confirmed result that kids take verbal passives as adjectival is true.</p><p>The major empirical counterargument to kids’ considering verbal passives as adjectival is Steve Crain’s study (with somebody else, was it Janet Fodor?), which elicited verbal passives fairly easily from young children. They didn't notice one property of their data that was immediately obvious (they provided many examples on their handout at the time, a BUCLD presentation, probably printed someplace by now), and which I immediately pointed out after their talk. Most of the elicited passives were <i>get-</i>passives, like (3), not <i>be</i>-passives.</p><p>(3)<span style="white-space: pre;"> </span> </p><p>Luke Skywalker got zapped. </p><p>Much judgmental data would be most consistent with the conclusion that these <i>get</i>-passives use the aux <i>get </i>meaning something like <i>become</i> (though with different aspectual/time properties) with a resultant-state adjectival passive participle complement, meaning, as Kratzer has demonstrated: the state such that there was a previous time t at which an event of zapping took place on Luke. So there was a transition of Luke into a state such that there was an event of zapping Luke that took place earlier. We might think that (3) is verbal because there is an activity going on, a transition, but it’s not a verbal passive, simply a transition into a state.</p><p>Let’s test disjointness in <i>get</i>-passives. Since the complement is adjectival there is no reason for disjointness to hold. For comparison let’s use the same verb as in (2).</p><p>(4)<span style="white-space: pre;"> </span> </p><p>Mary is getting established (in her law practice)</p><p>Disjointness is not required in (4), strongly contrasting it with (2a). 
In fact, the preferred out-of-the-blue reading of (4) is that Mary herself did the establishing. Of course, there might be another establisher. Adjectival passives simply don't have a disjointness requirement; they are ambiguous as to whether or not the external argument is disjoint.</p><p>Given this very brief summary, with most data not even mentioned, we have very good evidence to support the claim that children often analyze verbal passives as adjectival. So the suggestion that children represent verbal passives identically to adults is not an advantage; it actually contradicts established data.</p><p>Here we have to be very clear. We cannot give precedence to any one type of data on conceptual grounds. As Chomsky has often written, data don’t come with labels saying, “I am useful data, pay attention” or, “I am data that doesn’t have to be paid attention to.” In particular, as Chomsky has argued, there is no prima facie reason to give pride of place to traditional judgmental data of adults, which of course has been by far the most frequent type of useful data in generative linguistics, supplemented more and more frequently by experimentally induced judgments of grammaticality, meaning, felicitousness, etc. Child data fits in in exactly the same way; it’s another type of data. And if we want to obtain explanatory adequacy and feasibility, we will want to understand child data. If the data look good that verbal passives in kids are often analyzed as adjectival passives, then we have to understand this fact; it’s not an advance to dismiss it, it’s a loss.</p><p>In the course of my career, I and my colleagues have often ended papers by pointing out what we’ve learned about the nature of UG from child data. Not just about child grammar but about UG. Others have done this as well. Of course adult judgmental data is hugely important. But it doesn’t hold pride of place if there are also other data that are useful. 
I continue to hope, perhaps too optimistically, that this doesn’t have to be said in every paper, that it will just be as natural in pursuing linguistic theory as the use of adult judgmental data. In OREA we argue for exactly that – that the kid data tells us a good deal about UG, not just when it confirms that children know everything that there is to know about a piece of UG, but even more when we have to assume that some pieces are missing. Child grammar is a different system to test, one that helps break down components and yields natural experiments.</p><p>Chris mentions at the conclusion of his review that he leaves to future work a minimalist merge derivation for adjectival passives. This would be a pleasure to see, especially given how useful and stimulating his very careful and very detailed merge analysis of verbal passives is. OREA doesn’t attempt as detailed a merge analysis, for verbal or adjectival passives, but does suggest some merge properties that account for the difference, for example a difference in timing for certain operations. Language acquisition results clearly establish that the needed operations for verbal passives arise considerably later in development than for adjectival passives. Given the theories on offer for the delay of verbal passives (especially the Universal Phase Requirement (UPR)), it is clear that there has to be some fundamentally different syntax between verbal and adjectival passives, in particular involving the relation between the underlying and surface positions of the theme. In studies of semantic properties of resultant-state adjectival passives together with their syntax, it is perhaps mostly concluded that the theme of adjectival passives does not originate in object position, but either directly in subject position or some close-by position; see, e.g., Embick, David. 2004. On the Structure of Resultative Participles in English. Linguistic Inquiry 35(3), 355-392. I look forward with great anticipation to what Chris’ endeavors will yield. 
If they take into consideration clear results in language acquisition, so much the better.</p><div><br /></div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-87172542378942655042023-10-08T10:43:00.001-04:002023-10-08T10:43:12.868-04:00Review of "On Realizing External Arguments" by Koring et al. (to appear, Linguistic Inquiry)<p>Review of “On Realizing External Arguments: A syntactic and implicature theory of the disjointness effect for passives in adult and child grammar” by Loes Koring, Eric Reuland, Nina Sangers and Ken Wexler. (to appear, Linguistic Inquiry)</p><p>(https://direct.mit.edu/ling/article-abstract/doi/10.1162/ling_a_00520/117701/On-Realizing-External-Arguments-A-Syntactic-and?redirectedFrom=fulltext)</p><span><a name='more'></a></span><p>This article (henceforth OREA) argues that the implicit external argument of the short passive is syntactically projected. That conclusion converges very nicely with a similar conclusion in Collins 2005, and Collins 2023 (MIT Press, forthcoming). While OREA’s argument is based on a disjointness implicature, Collins 2005, 2023 is based on principles A and B of the binding theory, the distribution of Helke expressions (e.g., ‘on my own’) and the distribution of secondary predicates. It is striking that two completely different sets of data yield the same theoretical conclusion, namely that the implicit argument in the short passive is syntactically projected. Such converging results (based on different data and different methodologies) should give one confidence in the theoretical conclusion.</p><p>In this blog post, I will outline the argument in OREA, and point out an issue that is unclear to me. 
I will also offer an alternative way to understand their child language data.</p><p>Consider the following pair of sentences. </p><p>(1)<span style="white-space: pre;"> </span></p><p>a.<span style="white-space: pre;"> </span>John was seen.</p><p>b.<span style="white-space: pre;"> </span>John was seen by somebody other than John.</p><p>For adult speakers it is generally the case that whenever (1a) is true, so is (1b). But (1b) seems like an implicature instead of an entailment, since it can be canceled:</p><p>(2)</p><p>John was seen, namely by John himself looking through a security camera.</p><p>OREA derives the implicature in (1b) using the syntactic theory of implicatures in Fox and Katzir (2011). As I understand them, the basic assumptions are as follows: </p><p>(3)<span style="white-space: pre;"> </span></p><p>a.<span style="white-space: pre;"> </span>In the short passive, there is a syntactically present EA (‘external argument’)</p><p>b.<span style="white-space: pre;"> </span>The implicit EA is existentially bound (pg. 47).</p><p>c.<span style="white-space: pre;"> </span>The EA generates implicatures.</p><p>d.<span style="white-space: pre;"> </span>The implicatures are negated stronger alternatives.</p><p>e.<span style="white-space: pre;"> </span>The stronger alternatives are formed by replacing the EA by contextually salient constituents.</p><p>Now consider (1a) to see how this works. By (3a,b), the representation of (1a) is something like the following (see Collins 2005, 2023 for a similar assumption):</p><p>(4)<span style="white-space: pre;"> </span></p><p>John was seen EA.</p><p>By (3c,d,e), the following alternative is negated, assuming ‘John’ is contextually salient:</p><p>(5)<span style="white-space: pre;"> </span></p><p>John was seen by John.</p><p>(4) and the negation of (5) together entail (1b), which is the disjointness implicature.</p><p>My concern with this argument is the following. 
By the assumptions in (3), any replacement of EA by a contextually salient proper name should be negated, since replacement by a proper name will always yield a stronger alternative. But (6a) entails (6b):</p><p>(6)<span style="white-space: pre;"> </span></p><p>a.<span style="white-space: pre;"> </span>John wasn’t seen by John, Bill, Mary….</p><p>b.<span style="white-space: pre;"> </span>John wasn’t seen by anybody.</p><p>But (6b) contradicts (4). So (6b) is a contradictory implicature. </p><p>One could argue that in (4) only ‘John’ is contextually salient, and so only ‘John’ can be substituted for the implicit EA. But the authors clearly state that the children are familiarized with all the characters in the experiment at the beginning of the session (pg. 17): “Each experimental session started with a warm-up phase in which the child was familiarized with all the pictures, the four characters, and the verbs used.” So all the relevant characters should have been contextually salient, and available for substitution. </p><p>To patch this up, some more specific notion of contextual salience is needed. Perhaps this issue was discussed in Fox and Katzir (2011), which I have not yet consulted. But it seems like a big enough issue that the authors should have addressed it in the paper itself.</p><p>This discussion brings up a different possibility for understanding the lack of the disjointness implicature in child language. OREA notes that in sentences like (7) there is no disjointness implicature for child speakers:</p><p>(7)<span style="white-space: pre;"> </span>Bart was washed.</p><p>In other words, children are happy to use sentences like (7) even if Bart washes himself. From this OREA concludes that passives in child language lack a syntactically projected EA. 
Lacking such an EA, the disjointness implicature in (3) does not go through (since the calculation of the disjointness implicature requires a syntactically projected implicit EA, see 3e above).</p><p>OREA argues that the reason why the child passive lacks a syntactically projected EA is that the child passive is actually an adjectival passive, which always lacks a syntactically projected EA, even in adult language. As they grow older, children acquire the distinction between the two kinds of passive.</p><p>An alternative is that children do in fact have a syntactically projected EA in (7), but it is the following, where the implicit EA is a null pro.</p><p>(8)<span style="white-space: pre;"> </span>Bart1 was washed pro1.</p><p>On this theory, the difference between children and adults is that children are more forgiving of principle B violations, and so allow representations such as (8) (on such a difference, see footnote 11 of OREA).</p><p>In fact, Collins 2023 shows that the implicit EA in the short passive has three distinct syntactic realizations: (a) generic implicit argument (progen), (b) existential implicit argument (proun), (c) definite implicit argument (prodef). The implicit argument in (8) would fall under case (c): definite implicit argument.</p><p>OREA considers the possibility that the disjointness effect in the passive is caused by a principle B effect, but they reject the possibility in the following terms (pg. 6):</p><p>(9)</p><p>“We will now consider the disjoint reference effect in verbal passives in more detail. Is it the result of a syntactic violation, such as condition B? The answer is negative since it can be cancelled,…”. 
</p><p>They add the following in footnote 11:</p><p>(10)</p><p>“In order for children to allow such a representation would then mean that, for children, as opposed to adults, passivization does not involve existential quantification.”</p><p>The problem with (9) and (10) is that they fail to recognize that there are actually three different types of implicit arguments involved in the short passive in English. When the implicit argument is definite, it triggers a principle B effect, as noted in Collins 2023 (chapter 4). When the implicit argument is existential, it triggers a disjointness implicature, as noted by the authors. Both explanations are needed to account for the data.</p><p>The advantage of my proposed principle B alternative is that it would not require such a drastic difference between the way children and adults represent passives. Rather, principles of UG (such as the Theta-Criterion) would apply at the beginning of acquisition and ensure the child acquires passive syntax (corresponding to the adult passive syntax) immediately upon hearing it for the first time (see Collins 2023, chapter 4 for relevant discussion and the ‘wedge’ argument).</p><p>A potential problem with my account is the following, noted by OREA in footnote 11:</p><p>(11)</p><p>“Furthermore, the developmental pattern we observed differs from the developmental pattern observed in Condition B environments. Children allow illicit coreference up till the age of six (Chien and Wexler 1990), whereas in the present experiment most 4- and 5-year-olds did not display any difficulties.”</p><p>It is not clear to me how to account for this difference in developmental pattern under my suggested principle B alternative. However, the gap between age six and ages four to five does not seem all that great to me. It may be that something about the passive is enabling them to correctly process principle B effects. 
Alternatively, they may be learning, like adults, that the dominant interpretation of the passive is existential/generic, as opposed to definite (see Collins 2023, chapter 4 for discussion). If so, they may cease to freely use the definite pro implicit argument as they get older.</p><p>While I criticize OREA in various ways above, I do agree that there is a disjointness implicature when the implicit EA is existential. How exactly to capture this implicature remains unclear to me. It is not really clear to me that a syntactically projected EA is needed to capture the disjointness implicature. In any case, the syntactic evidence in Collins 2005, 2023 for a syntactically projected EA in the short passive is already very compelling.</p><p>A minor criticism of OREA relates to an ongoing disagreement in the field as to whether the implicit EA in the passive needs to be syntactically projected. Collins 2005 and 2023 argue that the implicit EA of the short passive is syntactically projected. Bruening 2013 takes a contrary position. The paper would have been strengthened considerably if it had noted how its conclusions decided between current theories of the passive.</p><p>A challenge, not picked up in this review, is to develop an account of adjectival passives in the Merge-based framework of Collins 2023. I leave this to future work.</p><p><b>References</b></p><p>Bruening, Benjamin. 2013. By Phrases in Passives and Nominals. Syntax 16, 1-41.</p><p>Collins, Chris. 2005. A Smuggling Approach to the Passive in English. Syntax 8, 81-120.</p><p>Collins, Chris. To Appear. Principles of Argument Structure: A Merge-Based Approach. MIT Press, Cambridge.</p><p>Fox, Danny, and Roni Katzir. 2011. On the characterization of alternatives. 
Natural Language Semantics 19(1), 87-107.</p><div><br /></div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-17559580672667073312023-10-01T05:19:00.001-04:002023-10-01T05:19:14.197-04:00 Cua September 2023 Fieldwork by the Numbers<p>From September 16, 2023 to September 30, 2023, we (Nikos and I) did fieldwork on Cua, an endangered Khoe-Kwadi language spoken in southeastern Botswana. We did the fieldwork in Diphuduhudu, which is to the west of the Molepolole-Lephephe road. One goal was to write a rough draft of a paper on the remarkable Cua pronominal system. Another goal was to introduce Nikos to fieldwork on Khoisan linguistics. We accomplished these goals. </p><span><a name='more'></a></span><p>This expedition was my third field trip of 2023 (actually the third field trip in three months: July, August, September 2023) funded by a four-year NSF grant to train students in doing fieldwork on Khoisan languages:</p><p>https://nsf.gov/awardsearch/showAward?AWD_ID=1760980&HistoricalAwards=false</p><p>In the following summary, I give the numbers characterizing our research for the September 2023 work on Cua: </p><p>1.</p><p>13 speakers, organized into five teams.</p><p>2.</p><p>1,832 sound files of lexical items, phrases and sentences (no new oral texts).</p><p>3. </p><p>162 notebook pages of grammar. 14 notebook pages of oral text.</p><p>4. </p><p>111 new lexical items (entered into FLEx)</p><p>(for a total of 1166 lexical items).</p><p>5. </p><p>A complete rough draft of a syntax paper on Cua pronouns (18 single spaced pages):</p><p>"Three Modes of Pronominal Interpretation in Cua"</p><p>6. </p><p>81 photos</p><p>7. 
</p><p>2 minutes, 40 seconds of rough transcription of an oral text.</p><div><br /></div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-40812719690095867912023-09-07T03:48:00.000-04:002023-09-07T03:48:04.101-04:00Summer Reading 2023<p>Continuing with the tradition I started last summer, I post my summer reading for 2023. These books cover roughly the time period between June 1, 2023, and August 31, 2023. If anybody wants to discuss these books with me, let me know.</p><span><a name='more'></a></span><p>As you can see, Deon Meyer remains a big theme, mainly because I am in southern Africa and I can easily find his books. He is a very good crime writer, who knows how to tell an absorbing story. I am getting close to reading him out (as I have done for other crime writers in the past). For some reason, I tend to get drawn to crime novels. </p><p>I am trying to make sure that each batch of books I read contains at least one of the classics. This time I chose Hemingway, and I was not disappointed. Old Man is an intense psychological account of an old man's battle with a fish. It starts out a bit slow, but once he starts engaging with the fish, it is captivating.</p><p>The Frankl book was recommended by my wife. Half of it is about his memories of being in concentration camps during the Holocaust. It is very grisly and tough to read. It makes you wonder what kind of depths we as humans are able to sink to (whether there is any bottom limit). The other half is about his theory of psychology (logotherapy), in part inspired by his experiences during the Holocaust. 
I was not really interested in his theory (which is a bit like psychoanalysis), but I read it through because it was all in one book.</p><p>I put an (*) next to my favorites for the summer.</p><p>For background on my reading habits:</p><p>https://ordinaryworkinggrammarian.blogspot.com/2022/08/summer-reading-2022.html</p><p>1.(*)</p><p>Frankl, Viktor. 2006 (1946). Man's Search for Meaning. Beacon Press.</p><p>https://en.wikipedia.org/wiki/Man%27s_Search_for_Meaning</p><p>https://www.amazon.com/Mans-Search-Meaning-Viktor-Frankl-ebook/dp/B009U9S6FI</p><p>2.(*)</p><p>Hemingway, Ernest. 2022 (1952). The Old Man and the Sea. Vintage Books, London.</p><p>3.</p><p>Haig, Matt. 2023. The Humans. Canongate Books.</p><p>4.(*)</p><p>Meyer, Deon. 2018. Fever. Hodder and Stoughton.</p><p>5.</p><p>Littlewood, Fran. 2023. Amazing Grace Adams. Penguin Random House.</p><p>6.</p><p>Meyer, Deon. 2011. Trackers. Hodder and Stoughton.</p><p>7.</p><p>Paolini, Christopher. 2023. Fractal Noise. Tor.</p><p>More of a psychological exploration than anything else. What happens to humans psychologically under extreme stress.</p><p>8.</p><p>Meyer, Deon. 2018. The Woman in the Blue Cloak. Hodder and Stoughton.</p><p>9.(*)</p><p>Bear, Greg. 1987. The Forge of God. Legend Books.</p><p>This is a very good sci-fi book. Curiously I had read it before (maybe decades ago). I did not realize that until I started rereading, and could remember some of the individual scenes. But I enjoyed rereading.</p><div><br /></div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-2207209338149449512023-09-05T02:31:00.001-04:002023-09-05T02:31:04.664-04:00Tshila 2023 Fieldwork by the Numbers<p>From August 24, 2023 to September 4, 2023, we (Andre and I) did fieldwork on Tshila, an endangered central Khoisan language spoken in southeastern Botswana. We did the fieldwork in Kaudwane, which is on the edge of the Khutse game reserve. 
The goals of the fieldwork were to collect at least 500 vocabulary items, at least one full pronoun chart (30 pronouns) and some basic grammatical information. We accomplished these goals. For context, we want (a) to set Andre up for future fieldwork on Tshila, and (b) to start the community thinking about developing an orthography.</p><span><a name='more'></a></span><p>This work is part of a four-year NSF grant to document Cua and Tshila and to train students in doing fieldwork on Khoisan languages:</p><p>https://nsf.gov/awardsearch/showAward?AWD_ID=1760980&HistoricalAwards=false</p><p>In the following summary, I give the numbers characterizing our research for summer 2023 work on Tshila: </p><p>1.</p><p>9 native speaker consultants (4 women, 5 men), organized into five teams.</p><p>2.</p><p>3,457 sound files of lexical items, phrases and sentences (no oral texts).</p><p>3. </p><p>143 notebook pages of grammar and lexicon.</p><p>4. </p><p>565 lexical items (entered into FLEx).</p><p>5. </p><p>120 pronouns (subject, object, possessor, subjunctive)</p><p>6. </p><p>163 photos</p><div><br /></div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-23204328920622339072023-08-20T04:23:00.009-04:002023-09-11T05:27:07.070-04:00How to Syntax 1 (the now that-Construction)<p>This is the first of a series of blog posts showing how I think about a syntax problem when I first notice it. I will occasionally choose phenomena that I notice, and talk about them in an informal fashion, breaking down the process of preliminary syntactic exploration. That is, I am just thinking off the top of my head (brainstorming), with few or no revisions. The focus of the discussion will be on process. I am not trying to come up with a polished analysis. 
Of course, if people suggest references for me to look at, I will look at them later, but that would be a second stage of thought, not the preliminary exploration.</p><span><a name='more'></a></span><p><b>Now That</b></p><p>In English, the following construction is used:</p><p>(1)<span style="white-space: pre;"> </span>Now that I have finished, I will leave.</p><p>I will call the phrase [now that I have finished], the <i>now</i>-phrase for short, and the construction the <i>now that</i>-construction. I have never read anything written about this construction in my career. I also did not try to look it up in the standard sources (e.g., JSTOR). Lastly, I have done no internet searches so far trying to confirm or disconfirm any of the data points below.</p><p>How did I find the construction? I have had it on the back burner for a few weeks, but I can't honestly say that I remember the exact moment I noticed it as being interesting. But the way I find most of my constructions is by a kind of monitoring/observing of the ambient language around me. There are periods when I monitor what I am saying or hearing or reading, and analyze it a bit. In that way, I stumble across lots of interesting things. Most of my papers on English syntax/semantics come from that kind of process.</p><p><b>Basic Combinatorics</b></p><p>The first stage in preliminary exploration is what can be called basic combinatorics (or alternatively, basic experimentation). Just try deleting, inserting, replacing and permuting various words and morphemes in the construction to get a feel for how it is put together.</p><p>For example, it seems the <i>that</i> is obligatory. Consider (2) (a case of <i>deletion of a word</i>):</p><p>(2)<span style="white-space: pre;"> </span>Now I have finished I will leave.</p><p>While (2) is good, it seems to have a structure very different from (1). Rather, (2) seems to be two separate sentences. 
So, the punctuation should be:</p><p>(3) <span style="white-space: pre;"> </span>Now, I have finished. I will leave.</p><p>But in (1), there seems to be only one sentence, not two. For example, consider the punctuation:</p><p>(4)<span style="white-space: pre;"> </span>Now that I have finished. I will leave.</p><p>The punctuation in (4) makes the <i>now</i>-phrase stand on its own as a sentence, but that is impossible. </p><p>The generalization that <i>that</i> cannot be omitted in (1) makes one try to think of other constructions where <i>that</i> can and cannot be omitted. I call this process searching for syntactic parallels (or for related constructions). The search for syntactic parallels is a very important part of syntactic exploration and can lead to many different kinds of empirical questions and testable hypotheses. </p><p>For example, in manner relative clauses and temporal relative clauses the <i>that</i> is freely omitted:</p><p>(5)<span style="white-space: pre;"> a. </span>The way (that) John treats me is unacceptable.</p><p> b. The day (that) I saw John, it was raining.</p><p>But in other relative clauses the <i>that</i> cannot be omitted (e.g., subject relatives). So right away we have a question about how (1) is related to various relative clause constructions. </p><p>Another combinatoric generalization is that the <i>now</i>-phrase can appear in different positions (permutation of phrases). In (1), it appears initially; in (6) it appears finally:</p><p>(6)<span style="white-space: pre;"> </span>I will leave now that I have finished.</p><p>But even though the now-phrase can appear either initially or finally, it cannot be broken up:</p><p>(7)<span style="white-space: pre;"> </span>*Now, I will leave, that I have finished.</p><p>So, the <i>now</i>-phrase seems to be a constituent with an obligatory <i>that</i>. 
</p><p>Other temporal words (different from <i>now</i>) do not seem to allow the same kind of construction (<i>replacement of a word</i>):</p><p>(8)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>*Tomorrow that I finish, I will leave.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>*Yesterday that I finished, I left.</span></p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>c.<span style="white-space: pre;"> </span>*When that you finish, you will leave.</span></p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>d.<span style="white-space: pre;"> </span>*Then that I have finished, I will leave.</span></p><p>Even modification of <i>now</i> seems impossible (<i>insertion of a word</i>):</p><p>(9)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Right now, I finished.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>*Right now that I finished, I will leave.</span></p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>c.<span style="white-space: pre;"> </span>Just now, I finished.</span></p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>d.<span style="white-space: pre;"> </span>*Just now that I finished, I will leave.</span></p><p>And even synonyms do not allow the <i>that</i>-clause. Assuming <i>now</i> and <i>at this moment</i> are synonymous, consider:</p><p>(10)<span style="white-space: pre;"> </span>a. 
<span style="white-space: pre;"> </span>At this moment (= now), it is raining.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>*At this moment (=now) that I have finished, I will leave.</span></p><p>Another basic combinatorial fact is that <i>now</i> and <i>now</i>-phrases do not always have the same syntactic distribution:</p><p>(11)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Now is the best time to leave.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>??Now that I have finished is the best time to leave.</span></p><p>(11b) sounds very awkward. The generalization seems to be that <i>now</i> as a subject cannot be replaced by a <i>now-</i>phrase. The unacceptability of (11b) seems to be related to similar facts with subordinate adverbial clauses:</p><p>(12)<span style="white-space: pre;"> </span>*Since I have now finished is the best time to leave.</p><p>(12) constitutes a syntactic parallel between <i>now</i>-phrases and subordinate adverbial clauses.</p><p>Ultimately, a syntactic analysis will have to account for these basic combinatorial facts. In a final paper, it is not enough just to notice a bunch of stuff. But crucially, the analysis will be successful to the extent it provides explanations for these facts.</p><p><b>Paraphrase and Entailment</b></p><p>All the above facts bring up the issue of the syntactic and semantic relation between now and the <i>that</i>-clause in a <i>now</i>-phrase. Faced with this issue, the first thing is to try to find a paraphrase of (1) to illuminate the structure and meaning. 
This stage of investigation is called paraphrase and entailment.</p><p>Here is one attempt:</p><p>(13)<span style="white-space: pre;"> </span>Since I have finished, I will leave.</p><p>(13) has roughly the same interpretation as (1), but (13) seems to be lacking the temporal implication that (1) has. In other words, there seems to be the following implication:</p><p>(14)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Now that I have finished, I will leave.<span style="white-space: pre;"> </span>(implies)</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>I have now finished.</span></p><p>The same implication does not seem to hold for (15) (similar remarks hold for given that):</p><p>(15)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Since I have finished, I will leave.<span style="white-space: pre;"> </span>(does not imply)</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>I have now finished.</span></p><p>In other words, in (15a), maybe you finished a while ago, not now.</p><p>Here is another example illustrating the same fact:</p><p>(16)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Now that I am over 18, I can vote.<span style="white-space: pre;"> </span>(implies)</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>I just turned 18.</span></p><p>(17)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Since I am over 18, I can vote.<span style="white-space: pre;"> </span>(does not imply)</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>I just turned 18.</span></p><p>Here is another attempt at paraphrasing (1):</p><p>(18)<span style="white-space: 
pre;"> </span>Immediately after finishing, I will leave.</p><p>The problem here is that an inference like (19) still does not go through:</p><p>(19)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Immediately after finishing, I will leave.<span style="white-space: pre;"> </span>(does not imply)</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>I finished now.</span></p><p>If one says (19a), there is no implication of finishing now.</p><p>One approach that seems to work better is the following:</p><p>(20)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Since I have now finished, I will leave.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>Given that I have now finished, I will leave. </span></p><p>These seem very close to (1) in interpretation. The paraphrases in (20) are close enough to (1) that they provide preliminary hypotheses about what the structure of (1) is. </p><p>Let me clarify that I am not claiming that paraphrases are in all cases the correct syntactic analysis. But used judiciously, they provide clues as to what the correct syntactic analysis may be. The basic assumption is that meaning/interpretation is derived from syntactic structure. So if the meanings/interpretations are the same, there might be a similar syntactic structure. 
This is not a logically rigorous conclusion, but rather a way to generate a testable hypothesis.</p><p>The basic idea is that <i>now</i> is moved to the front of the clause and something like <i>since</i> or <i>given </i>is silent:</p><p>(21)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>now SINCE I have <now> finished, I will leave.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>now GIVEN that I have <now> finished, I will leave.</span></p><p>This analysis leaves open many questions. For example, are there other contexts where SINCE and GIVEN have null occurrences in English? Another question is this: If <i>now</i>-preposing triggers a silent SINCE or GIVEN, why can't there be a silent SINCE or GIVEN with other temporal expressions:</p><p>(22)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Since I finished yesterday, I will leave.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>Given that I finished yesterday, I will leave.</span></p><p>(23)<span style="white-space: pre;"> </span>*Yesterday that I have finished, I will leave. </p><p><span style="white-space: normal;">An alternative analysis to the one in (21) is that (1) is some kind of relative clause construction where the head noun is <i>now</i>. Adopting the head-raising analysis, the structure would be as in (24):</span></p><p>(24)<span style="white-space: pre;"> </span>now that I have <now> finished, I will leave.</p><p><span style="white-space: normal;">But the same criticism leveled against (21) holds here: why can't the construction work with other temporal expressions such as <i>yesterday</i>? </span></p><p><b>Movement Diagnostics</b></p><p>Once we reach the level of a specific analysis as in (21) or (24), we can ask more detailed questions involving well-known syntactic diagnostics. 
If <i>now </i>undergoes movement (as in (21) or (24)), does it show movement diagnostics? Can the movement be successive cyclic? Consider the following:</p><p>(25)<span style="white-space: pre;"> </span>Now that I said that John has finished, I will leave.</p><p>The question is what the interpretation of this sentence is. There are at least two possibilities:</p><p>(26)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Since I have now said that John has finished, I will leave.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>Since I have said that John has now finished, I will leave.</span></p><p><span style="white-space: normal;">It is clear to me that the only interpretation of (25) is (26a), not (26b). This suggests that if <i>now</i> undergoes movement in the analysis of <i>now that</i>-constructions, it cannot undergo long-distance successive-cyclic movement.</span></p><p><b>Conclusion</b></p><p>I have discussed some methods of preliminary syntactic exploration in this blog post. A summary of the methods is given here:</p><p>(27)<span style="white-space: pre;"> </span>a.<span style="white-space: pre;"> </span>Basic Combinatorics</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b.<span style="white-space: pre;"> </span>Searching for Syntactic Parallels</span></p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>c.<span style="white-space: pre;"> </span>Paraphrase and Entailment</span></p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>d. <span style="white-space: pre;"> </span>Movement Diagnostics</span></p><p>This is only the very first step in cataloguing the methods syntacticians use in preliminary exploration. 
I hope to be able to post many similar blog posts filling out this list in the future.</p><p><b>Acknowledgments:</b> Thanks to Richard Kayne and Paul Postal for discussion of this construction.</p><div><br /></div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-42575505396355160132023-07-31T05:37:00.006-04:002023-07-31T05:37:55.596-04:00 Summer 2023 Fieldwork by the Numbers<p>During June and July of 2023, we (Olivia and I) did fieldwork on Cua, an endangered central Khoisan language spoken in southeastern Botswana. We did the fieldwork in Gaborone (one week) and Diphuduhudu (one month). The goals were to finish a rough draft of the grammatical sketch, bring the number of lexical entries up to at least 1,000, and to continue working on oral texts. We were successful in these goals.</p><span><a name='more'></a></span><p>This work is part of a four-year NSF grant to document Cua and to train students in doing fieldwork on Khoisan languages:</p><p>https://nsf.gov/awardsearch/showAward?AWD_ID=1760980&HistoricalAwards=false</p><p>In the following summary, I give the numbers characterizing our research for summer 2023. </p><p><br /></p><p>1. 3,959 sound files of lexical items, phrases and sentences </p><p>(for a total of 9,791 sound files for 2019, 2022, 2023).</p><p>2. 292 notebook pages of grammar, lexicon and oral texts.</p><p>3. A rough draft of a complete grammatical sketch (25 chapters, 85 pages).</p><p>4. Two rough draft transcribed oral texts </p><p>(for a total of four rough draft transcribed oral texts).</p><p>5. 237 new lexical items </p><p>(for a total of 1,058 lexical items in FLEx).</p><p>6. 
605 new photos </p><p>(for a total of 1,437 photos for 2019, 2020, 2023).</p><p><br /></p><p><b>Previous Years:</b></p><p><b>2019:</b></p><p>https://ordinaryworkinggrammarian.blogspot.com/2019/08/summer-2019-fieldwork-by-numbers.html</p><p><b>2022:</b></p><p>https://ordinaryworkinggrammarian.blogspot.com/2022/08/summer-2022-fieldwork-by-numbers.html#more</p><p><br /></p><p><br /></p><p><br /></p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-40256567383606922912023-06-23T02:58:00.002-04:002023-06-23T02:58:48.197-04:00MaS3 (Morphology as Syntax 3)<p><b> Morphology as Syntax 3</b></p><p><a href="https://old.linguistlist.org/issues/34/34-1995.html?fbclid=IwAR1A1GOY79r2tJlvO0Dm6_rCwKQAsYlOHFZDJXQ43sMOJjMYUz1iD1-vRAQ">LinguistList Announcement</a></p><p>Short Title: MaS3</p><p>Date: 15-Sep-2023 - 16-Sep-2023</p><p>Location: Université du Québec à Montréal, Canada</p><p>Contact: Tom Leu</p><p>Contact Email: leu.thomas@uqam.ca</p><p>Linguistic Field(s): Linguistic Theories</p><p><b>Meeting Description:</b></p><p>The purpose of this workshop is to investigate the relationship between morphology and syntax, and in particular to investigate the extent to which morphological generalizations can be accounted for in terms of purely syntactic operations and conditions. Can morphology and syntax be unified under purely Merge-based theories with the same principles? 
If so, what does this tell us about the type of syntactic theory we should pursue?</p><p><b>Invited speakers (in alphabetical order):</b></p><p>Léna Baunaz (University of Geneva)</p><p>Vicki Carstens (University of Connecticut)</p><p>James Crippen (McGill University)</p><p>Peter Kondwandi Msaka (University of Malawi)</p><p>Neil Myler (Boston University)</p><p>Sandhya Sundaresan (SUNY Stony Brook)</p><p>Peter Svenonius (University of Tromsø)</p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-76544605726371303152023-06-21T03:11:00.009-04:002023-06-21T03:11:49.457-04:00Papers on Negation and Quantifiers<p>Here is a series of five papers I have written recently concerning negation, quantifiers and negating quantifiers. They are all syntax/semantics interface papers, with a substantial semantic component. The constant theme throughout is the application of the basic framework of Collins and Postal 2014 to various phenomena. </p><span><a name='more'></a></span><p>The relevant principle from Collins and Postal 2014 is the following:</p><p><br /></p><p>If X has a semantic type ending in <t>, then</p><p>NEG takes X with semantic value: λP1…λPn […]</p><p>And returns Y with semantic value: λP1…λPn ¬[…]</p><p><br /></p><p>I only include in this list single-authored papers dealing with the syntax/semantics interface. I have a series of other papers on the syntax of negation with Francis Blanchette, Paul Postal and Elvis Yevudey. Let me know if you need any of these papers. I would be happy to send them to you, and to discuss them with you.</p><p><b>2023<span style="white-space: pre;"> </span>Negating Gradable Adjectives. 
Natural Language Semantics.</b></p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>(https://rdcu.be/debuX)</span></p><p><b>Abstract:</b> In this short paper, I analyze the syntax and semantics of the prefix un- with gradable adjectives like unhappy and compare it to the syntax and semantics of not. Within the framework of Collins and Postal 2014, I propose that un- and not have the same semantics but negate different constituents, accounting for differences in interpretation.</p><p><b>2022<span style="white-space: pre;"> </span>Scalar Modifiers of Quantifier Phrases. Glossa 7(1), 1-20.</b></p><p>(https://www.glossa-journal.org/article/id/6162/)</p><p><b>Abstract:</b> This paper analyzes the syntax and semantics of scalar modifiers of quantifier phrases in expressions like almost every student, absolutely every student and nowhere near every student. The semantics is based on scales (positive and negative) of generalized quantifiers.</p><p><b>2020<span style="white-space: pre;"> </span>Outer Negation of Universal Quantifier Phrases. Linguistics and Philosophy. (https://doi.org/10.1007/s10988-019-09269-4)</b></p><p><b>Abstract:</b> This paper discusses two ways of negating DP quantifier phrases. In one way, NEG modifies the quantifier D directly with the structure [[NEG D] NP] (inner negation). In the other way, NEG modifies the whole DP with the structure [NEG DP] (outer negation). I give evidence based on negative polarity items that negated universal quantifier phrases like not every student involve outer negation (contra Hoeksema 1986, 1987).</p><p><b>2017<span style="white-space: pre;"> </span>A Scope Freezing Effect with Negated Quantifier Phrases. Natural Language Semantics 25, 315-327.</b></p><p><b>Abstract:</b> I document a scope freezing effect found with negated quantifier phrases (distinct from the scope freezing effect discussed in Collins 2016a). 
In a sentence with a negated quantifier phrase of the form [NEG DP1], no quantifier phrase DP2 can take scope between NEG and DP1. I show how this scope freezing effect can be explained in terms of the analysis of negated quantifier phrases given in Collins and Postal (2014) and Collins (2016a).</p><p><b>2016<span style="white-space: pre;"> </span>Not even. Natural Language Semantics 24, 291-303.</b></p><p><b>Abstract: </b>This paper proposes an analysis of the semantics of even that is consistent with the assumptions about the syntax and semantics of negation in Collins and Postal (Classical NEG raising, MIT Press, Cambridge, 2014). First, I review the distribution of negation, showing how negation may modify quantificational expressions where it gives rise to scope freezing effects. Second, I discuss the fact that even-phrases can be modified by negation, as in Not even John is there. On the basis of this fact, I argue that even is a quantifier. Lastly, I show that my data provides new empirical support for the assumption that there are two kinds of even, depending on the role played by focus in the scalar presupposition.</p><div><br /></div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-78311634418374906802023-06-14T02:28:00.005-04:002023-06-14T02:39:56.697-04:00Guest Blog Post by Pauline Jacobson on Constituent Structure <p><b>Losing sight of the forest through the trees: </b></p><p><b>Remarks on constituent structure and constituent structure tests </b></p><p>Pauline Jacobson</p><p>Program in Linguistics</p><p>Brown University</p><p><a href="https://www.dropbox.com/s/i220gsb32bc7lch/Jacobson%20-%20guest%20blog%20%28new%29.pdf?dl=0">Original .pdf Version</a></p><p>This is a guest blog which I was invited to write by Chris Collins. I am very grateful to Chris for the opportunity to write up and post these thoughts. 
Needless to say, he doesn't necessarily agree with or endorse anything in these remarks. </p><span><a name='more'></a></span><p><b>1. Introduction</b></p><p><span style="white-space: normal;">It has become common - seemingly almost mandatory - for both Introductory Linguistics books and Introductory Syntax books to lay out a battery of tests for constituency [1] - usually some combination of 'movement' (or: ability to appear at the front); clefts; ability to appear alone as a fragment (usually as an answer to a question); 'replacement by a proform' [2]; deletion; and conjunction. I have not done an exhaustive survey of Introductory books, but am basing this on three fairly current, commonly used Introductory Linguistics books: Fromkin et al. 2000 (reprinted 2009); Ohio State Dept. of Linguistics Language Files 12th edition (Dawson and Phelan eds.) 2016; and O'Grady et al. 2017, plus five current introductory syntax books: Carnie 4th edition 2021; Adger 2003 (reprinted 2012); Larson 2010; Koeneman and Zeijlstra 2017; and Kim and Sells 2008. I take this to be a reasonable sample. </span></p><p>The purposes of these remarks are as follows. Sec. 2 points out that the majority of these books commit - sometimes indirectly and by implicature (often via the exercises) - the fallacy of concluding that if something does not pass the test it is not a constituent. Sec. 3 discusses a bigger picture goal. This is to argue that putting such an emphasis on trees (where the tests for constituency are a part of that emphasis) obscures the more important project of trying to model speakers' 'knowledge' of the 'grammar'. (In the case of all of these books, the primary language being studied is English, although some have more cross-linguistic coverage as well.) And if that knowledge has something to do with 'constructing trees', that is only as a byproduct of speakers knowing the combinatory rule system. 
It is true that under a certain set of assumptions trees are a reasonable representation of (part of) that knowledge when applied to a given sentence, but in fact it is only under a particular set of assumptions about what grammars are that it even makes sense to talk about 'the tree' for a given (unambiguous) sentence. [3] And this is often not the assumption made by the end of these books (particularly Introductory Syntax texts). Of course initial oversimplifications are almost necessary, but the inconsistencies (or the need to revise) are generally never pointed out. Sec. 4 suggests that there are more general ways to discover the rule system - even at an elementary level. Moreover, while the standard 'tests' themselves are not without interest, it would be better to explore just what they are tests for (and why most of them are weak tests). Sec. 5 discusses my own reason for being interested in the way this is presented. As a proponent of a particular version of Categorial Grammar, I believe that a given (unambiguous) sentence can be put together in many different ways and so has many different 'constituent structures' (as noted in fn. 3, most theories actually are committed to that in a few cases, but Categorial Grammar allows for far more ways to put together an unambiguous sentence). And this flexibility is supported by what I think is the most reliable of the 'tests', which is conjunction. For example, this view means that in "Lee thought that someone had stolen the peanut butter", there is an analysis by which "thought that someone" is a 'constituent' - as witnessed by conjunctions such as "Lee thought that someone and was determined to find out just who had stolen the peanut butter". But many linguists gasp at the claim that "thought that someone" could possibly be thought of as a constituent: after all, it passes none of the other tests! 
It seems to me that that automatic 'gasp reaction' is the result of overinterpreting the other tests - as will be shown, lots of other standard constituents pass only the coordination test. The overemphasis on trees has another unhappy consequence for an idea explored in some of the Categorial Grammar and related literature. This is the idea that strings can combine by infixation rather than just being adjacent (see, e.g., Bach 1979 for a CG view, and Pollard 1984 for an HPSG view). If so, then trees in any case are not the appropriate representation for some sentences. Yet the mere idea of an infixation operation is generally dismissed out of hand - purely because of the axiom that we need to represent things as trees! Sec. 6 concludes with a brief note on the interaction of syntax and semantics and the unstated assumptions which underlie its role in 'figuring out trees' - assumptions that I actually agree with but that, in fact, the model of semantics presented in some of these books does not agree with. </p><p>A major disclaimer before continuing. While the remarks below are critical of parts of the books noted above, I do not intend these remarks to be critical assessments of any of these books overall. It might sound at times in this blog that I am almost doing a review of Syntax (and Introductory) textbooks. But that is not my intent - and if I make critical comments about a small part of one book that should not be taken as a 'review' of the rest of the book. In fact, I myself have at times used all three of the introductory linguistics books cited here (in various editions) in Intro classes and have used or will use parts of some of the syntax texts as supplements to my own notes. I am just focusing on the way constituent structure is taught and the non-negotiable and sometimes mistaken assumptions that ensue from that. </p><p><b>2. 
Beware the Persistent Fallacy</b></p><p>I begin by documenting the surprising fallacy which persists - sometimes outright, sometimes only by implicature (especially in exercises) - in most of the books I consulted. As noted above, this is the fallacy of taking the constituent structure tests not just as sufficient tests for constituency but also as necessary ones. Thus one often sees arguments to the effect that if some string shows a certain behavior it is a constituent. Fair enough, but from there some of these books either directly or indirectly conclude/suggest that if the string doesn't pass that test it is not a constituent. That conclusion is not warranted: my favorite example to illustrate this fallacy is the relative clause "which Sally recommended" in (1): </p><p>(1)<span style="white-space: pre;"> </span>I read the book which Sally recommended.</p><p>This passes none of the tests except coordination and perhaps deletion[4], but is uncontroversially a constituent. The same holds for "ate an apple" in (2a) and "Sandy ate an apple" in (2b); these pass none of the tests except coordination (note: untensed VPs do pass some of these tests but tensed VPs do not, and CPs pass some but Ss without the "that" do not):</p><p>(2)<span style="white-space: pre;"> </span>a. Lee thinks that Sandy ate an apple. (e.g. *Ate an apple, Lee thinks that Sandy)</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b. Lee thinks that Sandy ate an apple. (e.g., *Sandy ate an apple, Lee thinks that) </span></p><p>But these facts are rarely noted. For example, while O'Grady et al 2017 do not come right out and present the tests as necessary and sufficient (they give pronoun replacement, movement and coordination - pp. 178-9), the lack of pointing out that many constituents fail these tests would - I would think - certainly suggest this to a student. 
After all, it is very common for speakers to pragmatically enrich 'if' statements to 'if and only if' statements, and so with no disclaimers to the contrary a student is likely to draw the conclusion that anything which fails a given test is not a constituent. And indeed, the exercises (6) and (7) (p. 212) in O'Grady et al - while not literally committing to the view that these are 'if and only if' tests - would, in my opinion, suggest this to most students, especially given the lack of any cautionary comments. (The reader can judge for themselves; their language is: "Apply the substitution test to determine which of the bracketed sequences in the following form constituents" (and similarly in the next exercise for movement).) We find the same problem in the introductory syntax text by Kim and Sells 2008 on pp. 19-22, where four of the usual tests are featured: clefts, substitution by a proform, fragment answers, and coordination. While the examples they give that fail the test are standardly not considered constituents (but see my remark above and in Sec. 5), Kim and Sells make no mention of the fact that sometimes standard constituents can also fail the tests - for other reasons. And their exercise 3 on p. 33 again at least suggests that these tests are 'if and only if' tests for constituency. The discussion and exercises in Carnie 2021 are similar. On pp. 91-93 he presents four tests (treating the cleft test as a subcase of movement) but nowhere warns students that these are only sufficient and not necessary tests. And the relevant exercise on p. 102 will, I suspect, lead students to conclude that certain things are not constituents on the basis of these tests; the language again asks students to use the tests to determine "whether the bracketted strings in each of the following sentences is a constituent". The discussion in Fromkin et al 2000 also conveys the idea that if something cannot, e.g., move by Topicalization then it is not a constituent (see especially p. 153). 
</p><p> Some of the books are more careful to point that the these tests are only sufficient conditions. But ironically, some of the very same books go right on to give an exercise which at best implicates and at worst downright presupposes that anything failing the test is not a constituent. A particularly blatant example of this is found in Language Files 2016. The seemingly obligatory parade of tests happens on pp. 217-219. In discussing the 'answers to question' tests, the book makes the outright mistake of taking the inability of something to answer a question to show that it is not a constituent (see Sec. 4 below for discussion of this test). But at least in the Clefting section the book is careful to warn students against the fallacy: "So if a cleft is ungrammatical, it doesn't necessarily imply that the displaced expression does not form a constituent" (this is accompanied by an example of a bad cleft with something that is generally acknowledged to be a constituent). But then - exactly 20 pages later (p. 238) - we find the following exercise "Use the cleft test to show that "a highly motivated student" is not a constituent in this sentence". (The sentence in question contains "a highly motivated student of mine") Woops!! The rest of the questions in that exercise also seem to presuppose that the tests are necessary and sufficient conditions for constituency. </p><p>The discussion in Koeneman and Zeijlstra is refreshing in that they not only are careful to point out (pp. 48-49) that the tests are just one-way tests, but also they go on to explicitly demonstrate the fallacy in general: "If it rains, you get soaked. But if you are soaked, it does not automatically follow that it is raining. Other factors may have caused this ...." (p. 48). But what happens in the exercises? On p. 51 we find the following: "Determine with the use of tests whether the italicized strings form a constituent." 
Now maybe my criticism is nitpicky; it may well be that this is simply poorly worded (note that this is similar to the crucial wording in Carnie 2021). Given the warning earlier, I assume that what the book intends to ask (or should be asking) is something more like - "Determine with the use of tests whether there is evidence that the italicized strings form a constituent". That wording would make clear that the answer in some cases will be "no - there is no evidence". But the actual wording at least suggests that for each string one is supposed to give an answer of 'yes' or 'no' to the question of whether something is a constituent; I certainly suspect that many students would take it that way. Here's a potential way to gauge the intent of the books' exercises. Imagine substituting in a clear constituent for the italicized string in any of these exercises - i.e., imagine an exercise that also included in the list a sentence like: "I saw the student who graduated last year" and where "who graduated last year" is what is italicized in the exercise. If one really wanted to stress to a student that failure on the relevant test just means there is no evidence, it would be useful to have such a case in the exercises. But I have seen no book that includes something like that in the usual constituent structure tests exercises. I have a similar quibble with a small section of the discussion in Larson on pp. 109-110, although it needs to be pointed out that he does correct the misleading inference later. But initially he presents the tests as necessary and sufficient conditions: this is explicit on p. 109 where he gives Principle 4 (P4) "If a string of words can be dislocated, then it is a constituent". He then gives "Bart gave Maggie to Lisa" and follows this with the bad dislocation *"Maggie to Lisa Bart gave", suggesting that this and another ill-formed case are bad because they "violate Principle 4". 
Of course these sentences can't possibly violate Principle 4 as it is clearly stated only as an 'if' condition. A violation is possible only if one mentally strengthens it to 'if and only if', as I suspect most students would do when told that there is a violation. In fact, the book goes on to say "it follows under P4 that ... "Maggie to Lisa" cannot [be dislocated]". Of course no such thing follows from P4, since it is (correctly) simply an 'if' condition. The subsequent exercise further reinforces the invitation for a student to make the standard fallacy mistake. To be fair, the misleading parts here are temporary; Larson later (pp. 136-139) carefully points out that the tests are only 'if' tests and not 'if and only if' tests, giving a detailed exposition of the very fallacy. And, refreshingly, he even takes one of them ('replacement by a proform'), shows a case where it fails for a clear constituent, and discusses why it fails (something rarely discussed - see Sec. 4 below). Moreover, Larson (personal communication) says that this method is a design feature of the book as a whole: hypotheses are presented and then revised. So we might just have a disagreement about effective exposition, but this case does not boil down to presenting a hypothesis that is later revised - that, indeed, is a normal way to do science. Rather, here the book simply presents mistaken reasoning, and then later undoes it (without direct reference to the earlier discussion). The only book of those I consulted that was both careful to point out that the tests are only sufficient conditions and did not have misleading exercises is Adger 2003. His exercises do not ask students to use the tests to determine whether something is a constituent (which invites a yes/no answer), but rather to use them to show that two sentences have different constituencies. </p><p><b>3. 
The broader issue: Losing sight of the forest through the trees</b></p><p>There is a bigger picture consequence to the heavy emphasis in beginning textbooks (and other places) on trees - where the seemingly obligatory section on constituent structure tests is a symptom of this. Namely, it makes it easy to lose sight of the broader project: that of modeling speakers' (unconscious) knowledge of the 'grammar' of their language, and - pedagogically - how we discover that. (In the bulk of the tree discussions in these books the language at issue is English, though some books contain tree discussion for other languages too.) That knowledge consists of the combinatory rules/principles; a tree for a given sentence is nothing more than the representation of a proof of its well-formedness according to a particular system of rules - this is in fact conveyed to some extent in some of the discussions - especially those that begin with phrase structure rules - but my impression is that this point is lost in some of the other discussions. (Note, incidentally, that a tree can also be the representation of the compositional semantics, but see the discussion in Sec. 6; I will basically ignore the role of the semantics until then.) To the extent that speakers in some sense 'intuitively know' what is the right tree for a given sentence, that is the result of knowledge of the rules/combinatorics/principles defining each local bit of the tree as well formed. And crucially, a tree (as opposed to a sequence of trees or some other object) appropriately represents a sentence only under the assumption that the grammar proves something well-formed via (only) a set of context free phrase structure rules, which can well be listed in highly schematic form in the 'mental grammar'. Indeed this is the view initially taken in almost all books (though almost always under other terminology), and it is a perfectly reasonable way to begin. 
But adopting additional operations (e.g., movement, 'replacement by a proform', deletion - or, in some theories - infixation rules) ultimately undermines the idea that an unambiguous sentence has a (single) tree representation, rather than a sequence of trees. </p><p>A context free phrase structure rule is one that specifies that an expression of some category A is well formed in terms of a sequence of other expressions (including possibly single words) that are adjacent and occur in a particular order. It is conventional to write such rules as, for example: A --> B C. This is a binary rule, with only two daughters, but a context free phrase structure rule in general can have any number of symbols (including just one) on the right side of the arrow, and it can also have single words on the right side of the arrow. The rule written as A --> B C can equivalently be described in other ways, such as "A well-formed expression of category A can (NB: not 'must') consist of an expression of category B followed by one of category C". Or: "An A may (NB: not 'must') consist of a B followed by a C". Or: "If there is a string (expression) of category B followed by a string (expression) of category C, then the whole thing is a well-formed expression of category A." Finally, one can also recast this information as a local tree consisting of mother node A and two daughter nodes B and C (where B precedes C). These are all equivalent; none is more 'formal' than any of the others - they are merely different notations. But only if the grammar contains nothing but statements of this kind (albeit possibly in quite generalized form) is a single tree an appropriate representation of a sentence. (See fn. 3 and the discussion in Sec. 5 for the point that a given sentence - even an unambiguous one - can have more than one possible tree; by saying 'single tree' here I again mean to oppose this to a sequence of trees.) 
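</p><p>To make the idea that a tree is just the representation of a proof concrete, here is a small sketch of my own (the particular rules, category labels, and Python encoding are purely illustrative assumptions, not anything from the books discussed here): a grammar is a finite set of context free rules, and a tree is well formed just in case every local tree in it instantiates some rule.</p>

```python
# A grammar as a set of context free rules: (mother, (daughter, ...)).
# Daughters may be category labels or (for lexical rules) words.
RULES = {
    ("S",   ("NP", "VP")),
    ("NP",  ("Det", "N")),
    ("VP",  ("V", "NP")),
    ("Det", ("the",)),
    ("N",   ("flea",)),
    ("N",   ("supermarket",)),
    ("V",   ("resembles",)),
}

def licensed(tree):
    """A tree is a tuple (Mother, child, ...), where each child is either a
    word (a string) or a subtree. The tree is a proof of well-formedness iff
    every local tree (a mother plus its daughters) instantiates some rule."""
    mother, *children = tree
    labels = tuple(c if isinstance(c, str) else c[0] for c in children)
    return ((mother, labels) in RULES
            and all(isinstance(c, str) or licensed(c) for c in children))

tree = ("S",
        ("NP", ("Det", "the"), ("N", "flea")),
        ("VP", ("V", "resembles"),
               ("NP", ("Det", "the"), ("N", "supermarket"))))
print(licensed(tree))  # True: each local tree instantiates a rule
```

<p>Nothing in the tree goes beyond the record of which rules applied where; it is a representation of the proof, not an independent object of the theory.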
</p><p>As suggested above, this is not to say that the hypothesis that a sentence can be represented by a tree commits one to a theory of grammar consisting simply of a list of phrase structure rules. As far as I know, no current theory maintains that the grammar is simply a list of these rules - all theories collapse the rules into generalized schemas, possibly with additional principles restricting what actually instantiates these schemas. The most general such schema (and hence in need of many additional principles to restrict the actual instantiations of this) is - in current parlance - (External) "Merge". Leaving aside definitions involving set formation or other more complex operations (such definitions are never used in any introductory book that I have seen) and leaving aside "Internal Merge" for the moment, there are various definitions; I take as point of departure the one in Adger 2003. This defines Merge as a "constituent building operation" and is illustrated by a local tree with Z (a variable over node labels) as mother and Y and X as daughters (again, these are meant to be any node label). Adger goes on to note that in theory Merge could join three or more objects but begins with the hypothesis that it joins only two, hence this immediately limits the possible phrase structure rules (equivalently, local trees) that instantiate "Merge" to only binary ones. Note then that the Merge rule schema could equally well be written as Z --> Y X, where these are any node labels. Incidentally, Adger also intends for this to be a schema without order, just a schema saying the two combining expressions are adjacent. Since X, Y, and Z here are all variables over node labels, the above is of course equivalent to Z --> X Y, but the (presumed) intent of his discussion is that even when more information is put in there about the node labels as in, say, an instance of merging NPs and VPs to give S, the general schema would allow for either order of the daughters. 
The actual order would be fixed by other principles. But that still leaves us in the land of schemas over phrase structure rules: merging NP and VP to give S just abbreviates the two cases of S --> NP VP and S --> VP NP. Additional principles specify that only the former exists in English. This idea was made explicit in Generalized Phrase Structure Grammar (Gazdar et al. 1985) which made a distinction between Immediate Dominance rules and Linear Precedence rules. For example, there would be a rule schema VP --> V, NP (the comma is intended to indicate no order) and a separate head-first principle governing how this schema is instantiated. </p><p>Merge is by no means the only phrase structure rule schema that is commonly used - it is merely the most general (and hence the most in need of additional principles to predict what can actually instantiate it in any given language). A theory that begins with X-bar theory (or uses that to limit the possible instantiations of Merge) is also using (generalized) phrase structure rules, for the X-bar rules are also phrase structure rule schemas. (It is odd that X-bar theory is often presented as an alternative to 'phrase structure grammar'. For sure it is an alternative to a by-now straw-man theory in which each individual phrase structure rule is listed, but it nonetheless embodies the claim that there are phrase structure rules: X-bar theory is simply a theory of the actual ones used in 'constructing' - i.e., proving well formed - expressions in a language.) Note that X-bar theory is also not incompatible with Merge; it is just a way of further refining the set of possible rules instantiating generalized binary "Merge". The X-bar schemas themselves are also still too general. For example, what category or categories can instantiate 'Spec' position depends on what instantiates the sister of Spec (the same holds for the other two rules in the classic X-bar schemas), and so additional principles are still needed. 
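</p><p>The ID/LP factorization mentioned above can be made concrete with a toy sketch of my own (the rule inventory and the Python encoding are illustrative assumptions only): Immediate Dominance rules license unordered sets of daughters, and Linear Precedence statements filter the admissible orders.</p>

```python
from itertools import permutations

# Immediate Dominance: mother -> unordered set of daughters
ID_RULES = {"VP": {"V", "NP"}, "S": {"NP", "VP"}}

# Linear Precedence: (A, B) means A must precede B among sisters
LP = {("V", "NP")}

def admissible_orders(mother):
    """All orderings of the ID rule's daughters consistent with every
    LP statement that mentions two of the sisters."""
    def ok(order):
        return all(order.index(a) < order.index(b)
                   for a, b in LP if a in order and b in order)
    return sorted(order for order in permutations(ID_RULES[mother]) if ok(order))

print(admissible_orders("VP"))  # [('V', 'NP')]: the LP statement excludes ('NP', 'V')
print(admissible_orders("S"))   # both orders: no LP statement yet constrains NP and VP
```

<p>This is only a toy rendering, but it shows the division of labor: the schema VP --> V, NP is stated once, with order fixed by a separate principle; S remains unordered until a further LP statement is added. 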
Other theories have different generalized rule schemas for phrase structure rules; Generalized Phrase Structure Grammar (Gazdar et al. 1985) was a highly worked out theory along these lines. And many versions of Categorial Grammar, for instance, have categories of the form A/RB and A/LB (the notation differs among different authors), where an expression of category A/RB is something that combines to its right with an expression of category B to give a larger expression of category A. This is simply equivalent to having a generalized rule: A --> A/RB B. Similarly for the 'left slash'. These two rules, incidentally, are not meant as schemas that need to be restricted further (unlike a general schema A --> B C); anything which instantiates these categories can combine in the appropriate way. Interestingly, as pointed out above, some versions of CG also have infixation rules, which are therefore not compatible with tree representations; we return to that point below. (See Bach 1979 and many works since; for further discussion see Jacobson 2014). </p><p>In any case, the discussion here is intended to make the point that if one begins with the assumption that the only combinatory principle is binary (or n-ary) external "Merge" (i.e., a generalized rule of the form Z --> X Y), X-Bar theory, or some other phrase structure rule schema(s), then indeed, a tree is an appropriate representation of the combinatorics proving a particular sentence well-formed. But an overemphasis on the idea that the fundamental way to represent sentences is by a tree (rather than a sequence of trees and/or some other object) obscures the fact that at the end of the day a theory with only phrase structure rule schemas is not the theory assumed in most of these texts. 
In fact, the minute one introduces 'movement', 'deletion' or 'replacement by a proform' as a test for constituency one is already committed to tree-altering operations, and one is implicitly endorsing the view that using a single tree to represent many sentences is incorrect. But this is not pointed out. Might/Should not a perceptive student be confused? If movement is recast as Internal Merge the same basic point about a tree holds. After all, combinatory rules that manipulate the internal structure of one of the input items (by possibly deleting or silencing it) also don't have good (honest) single tree representations. This is because there is a difference between the internal structure at the input and the output of the constituent containing the internally merged item (even if the difference is nothing more than the addition of a feature suppressing the phonology of that constituent). Sure, it can be drawn in various ways in a tree-like fashion - as can be multiple levels with the common use of arrows showing a movement path - but it should be clear that with these devices it makes little sense to talk about the constituent structure of a sentence that has undergone movement. </p><p>A separate point is that the insistence on trees as the primary object of study (rather than the rule system which may or may not yield trees as the best representation) makes a theory with - for example - infixation rules as part of the combinatorics (see Bach 1979 and much Categorial Grammar work since) completely unfathomable to students since it doesn't lend itself to tree representations. (Not that the books discussed here endorse infixation, but there's no reason the idea should provoke such incredulity other than the fact that an infixation operation can't be represented in a tree.) Is it completely out of bounds to ask whether "look up" is a constituent in "Lee looked the information up"? 
The very inability to have 'discontinuous constituents' in tree representations puts an infixation combinatorial process 'off the table' and hence leads to multiple levels as the only solution for this type of case. The inability to even contemplate anything but movement here is simply an artifact of overinterpreting the primacy of trees and not seeing them just as representations of proofs according to one type of rule system. </p><p><b>4. Back to the constituency tests - what are they useful for, and what are useful tests?</b></p><p>To say some string is a 'constituent' means that there is some category A such that the grammar proves that the string in question is a well-formed member of category A (where that category figures in the statement of other rules). Leaving aside semantic intuitions, the heart of determining the existence of such categories centers on distributional facts (some of the standard 'tests' are simply special instances of this more general point). The clearest arguments for such categories do not rest simply on distributional facts, but on distributional facts which can only be described by a recursive rule system. As a demonstration of these points for a perfectly simple case, take one way we can determine that there is a single category (call it NP) for the expression "the large flea", and that is the same category that we find in "the craziest idea I have ever heard", "the potato head statue in front of the supermarket", "Grumpy the Dwarf", "wine in a plastic bag", "the idiotic belief that Napoleon is president", etc. It is fairly easy to show this at the outset by beginning with a description of what sorts of things can go before and after a verb like "resemble". 
Whatever the ultimate details turn out to be of the rule system that permits a certain set of strings to be before "resemble" and a set to be after "resemble", it becomes quickly clear that the 'before' set and the 'after' set are the same, although some combinations require a bit of pragmatic imagination. If we tried having a set of rules to predict what is in the 'before' set and a separate set of rules for the 'after' set, an obvious generalization is missed - and the grammar (no matter how the rest of the details go) becomes more complex for having to define that set twice. Note that tests such as 'movement' can be thought of in exactly the same way. Rather than relying on an axiom 'only constituents can move', one can easily teach this in such a way as to make it clear why that is likely to be true and hence what that test is telling us. Assuming here - for the sake of discussion - that the account of 'dislocated' constituents involves movement, one can merely posit a rule moving things (it doesn't even need to be written out), and note what it would take to describe the set of things that can move. If we were to list a large group of strings the rule would be hugely complex (and in fact impossible, given the point about recursion below), but if there were a small set of categories that all the moveable strings belong to, only that set needs to be named in the description of the movement process. This also makes clear that movement is not an 'if and only if'. Most of the other tests are similarly ultimately just special cases of arguments from simplicity. </p><p>The final nail in the coffin of not defining a single category for, e.g., all the strings that can appear before and after "resemble" is the existence of recursion. One can easily show this in the NP case in virtue of sentences like "The potato head statue in front of the supermarket resembles the wine that resembles the craziest idea I have ever heard". 
(Enough repetition of "resemble" can seem weird, but the same point can be made substituting in "is similar to" for "resemble".) Or: "The potato head statue that resembles Mr. Peepers resembles the wine that resembles Grumpy the Dwarf". The existence of recursion seals the deal: there is no way to describe (whatever the full form of the grammar is) the set of things that can go before and after "resemble" without invoking a single category and having the grammar define a well-formed set of strings of that category. Then one can just use that category in the statement of other rules. </p><p>The most general case of recursion comes from coordination - since almost all categories can coordinate. Thus there is a generalization that for (almost) every category X, strings of the form X and X are well-formed, and distribute just like other strings of category X. This could be the result of a phrase structure rule schema (possibly itself derived from something more general) of the form X --> X and X (for X a variable over categories). Similarly for "or". There are other options: if only binary rules are allowed this obviously cannot be right, but alternatively this could be broken down into two binary schemas. See Munn 1993 and Jacobson 2014 for two different (but somewhat converging) ways to do this, stated within two different theories. [5] Which of these is correct is not relevant to the immediate points here (and in either case the exact schemas in question might themselves just be instantiations of much more general schemas). Assuming that the binary solution is correct - giving structures of the form [X [and X]] and [X [or X]] - we can note that there still are a few categories which do not coordinate, as we don't find things like *"Lee [[or Sandy] and [or Jo]]" and *"Lee [[and Sandy] or [and Jo]]". 
This means that failure to pass the coordination test is - unfortunately - still not definitive proof that something is not a constituent, but the vast majority of categories do pass, including the three 'holdouts' mentioned above (relative clauses, tensed VPs, and Ss). </p><p>Many textbooks are reluctant to place too much emphasis on the coordination test - not for the reason that many categories fail the test (as noted above, almost every category does pass the test) - but rather the opposite. Under "standard" assumptions, lots of things that are taken not to be constituents do allow coordination. In other words, there are many instances of what is often taken to be nonconstituent coordination (or structures which are the result of Across the Board movements). Examples include those in (3), and it is easy to construct many others. </p><p>(3)<span style="white-space: pre;"> </span>a. Mary loves and John hates studying compositional semantics.</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>b. Cap'n Jack restaurant served clams on Monday and lobster on Tuesday.</span></p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>c. Lee bought and Sandy cooked clams on Monday and lobster on Tuesday. </span></p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>d. Lee believes and suspects that everyone else also believes that the earth is flat.</span></p><p>We return to these in Sec. 5.</p><p>But coming back to the usefulness (or not) of the standard 'tests', it could be illuminating when these are presented to have some discussion as to why some clear constituents fail them - what are they actually testing for? I am not sure that the answer to that is clear in all cases, but at least partial answers are available for some. Take, for example, the case of fragment answers to questions. 
Leaving aside some complications due to NP and PP mismatches (see Jacobson 2016 for discussion), it is roughly the case that the category of the fragment answer must match the category of the question word (or, in some cases, of the larger Pied Piped expression containing the question word). This matching requirement follows in various different ways. Under the ellipsis view of fragment answers in Merchant 2004 it follows because the fragment itself is moved and must match the moved wh-phrase of the question. In the view sometimes called 'direct matching' of Groenendijk and Stokhof 1984, Ginzburg and Sag 2000, and Jacobson 2016 there is a question/answer discourse unit, and something counts as an 'answer' to the question only if the category of the question word or phrase matches that of the fragment answer. (In that case, the semantics puts the two together in a particular way so as to get the relevant inference.) But either way, the possible categories for fragment answers will be limited by the categories for question words. </p><p><span style="white-space: normal;">It is thus not surprising that, for example, a relative clause cannot stand as a fragment answer to a question because there is no way to question that relative clause. We would expect "who handed in all of the homeworks" to be a good answer to a question only if there were some question word - call it "whiprop" - whose syntactic category was the same as that of a relative clause and where (4) was a good question:</span></p><p>(4)<span style="white-space: pre;"> </span>*Whiprop did you pass every student?</p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>meaning: what is the property such that you passed every student that had</span></p><p><span style="white-space: normal;"><span style="white-space: pre;"> </span>that property.</span></p><p>If we had such a question, then we would expect "who handed in all the homeworks" to be a good answer. 
So the space of fragment answers is bounded by what question words/phrases we have. Of course this raises another question: why do we have the question words (or phrases, in Pied Piping cases) that we do? That is an interesting question, but it only indirectly bears on the issue of constituency. One can and should ask similar questions for each of the tests. </p><p><b>5. Why is this blog so long? The agenda of a Categorial Grammarian</b></p><p>My interest in the role of constituent structure and constituent structure tests is driven in part by a specific agenda. This is to lend plausibility to, and ward off a common objection to, some of the interesting results and techniques in Categorial Grammar - a framework rarely taught and arguably underappreciated. This is not the place to give a tutorial on CG nor on many of its facets, but just to highlight the one regarding constituency. (I do wish to note that I think the main advantage of CG is that it provides a beautiful and smooth fit between the syntax and the compositional semantics, but since this is not a piece about semantics I leave it to the interested reader to explore that further; see Jacobson 2014 for a relevant textbook.) The one bit of background needed here is the idea that syntactic categories are encodings of distributional facts. There is a small set of primitive categories, and a recursive definition of other categories which basically encode what argument an item takes (in general - what will be its sister) and what will be the result (the mother category). As noted in Sec. 3, this then allows extremely general phrase structure rule schemas. Thus an expression of category A/RB is something that wants a B on its right to give an A, and A/LB wants a B to its left to give A. (There might also be categories which say that the expression is a circumfix - i.e., that it will take something as an infix - and categories which say that something is an infix.) 
The directional features on the slashes are not listed item by item, but given by general rules (see Jacobson 2014 Chapter 6 for discussion). </p><p>There are many versions of Categorial Grammar, but following a tradition advanced in the 1980s by, e.g., Steedman 1987, Dowty 1988 and others, the syntactic (and semantic) combinatorics are such that many unambiguous sentences have many ways to be put together (giving a single meaning). See also related work in the Type Logical Grammar tradition. Thus a simple sentence like "Lee loves bananas" can be put together in the usual way: "loves" and "bananas" combine to form a constituent (VP - or, in CG terms, S/LNP), which then combines with the subject "Lee". But there are other operations according to which "Lee" and "loves" can first combine (three possibilities for this are outlined in Jacobson 2014) such that "Lee loves" is a constituent which is expecting to find an NP on its right to give an S. The way the semantics and syntax work together means there is no problem getting the right argument structure: "Lee loves" denotes the set of things that Lee loves (while "loves bananas" in the other analysis denotes the set of lovers of bananas). This automatically allows for right node raising cases like that in (3a) above under the assumption that two expressions of the same category can conjoin to give a third of that category. Dowty 1988 showed that the same types of techniques allow "clams" and "on Monday" to combine to become something wanting a transitive verb to its left to give a VP - i.e., (S/LNP)/L((S/LNP)/RNP).</p><p>A common reaction to the idea that "Lee loves" can be a constituent is complete disbelief: does the person advocating this have the slightest idea of what we should have learned in Introductory Linguistics/Syntax? 
Don't we have lots of evidence that in "Lee loves bananas", "loves bananas" is a constituent - i.e., that there is some category (VP) and the grammar proves "loves bananas" as a well-formed member of that category? Yes of course we do - such evidence comes from distributional facts including recursion and is often presented early on in a syntax text. (See, e.g., Larson 2000, who nicely lays out this evidence.) But careful consideration reveals that the evidence only shows that the sentence has one such analysis - not that that is the only analysis! That this is all it shows is almost never pointed out. Continuing with this point, take the claim that "do so" is a VP proform (I have questioned that in fn. 2 but that is irrelevant here; for the sake of discussion let us assume that the textbook wisdom on that is correct.) All that shows is that there is a proform of some category and that "loves bananas" can be analyzed as a well-formed expression of that category. None of this shows that there is not also another (semantically equivalent) analysis by which "Lee loves" is an expression of some other category - albeit one with limited distribution. </p><p>Only if we make the assumption that each unambiguous sentence has just one analysis does it follow that "Lee loves" cannot also be a constituent. And I think that this is another assumption that is at least implicitly made in many books, and which becomes sufficiently ingrained that students (and many others) rarely even notice that showing that "loves bananas" can be a constituent hardly means that that is the only analysis. This assumption creeps in when, for example, a particular tree is shown in which some sequence of words is not a constituent, from which it is concluded that that sequence is not a constituent in the sentence in question. 
All that follows is that the sequence is not a constituent under the analysis shown by that particular tree, but it doesn't mean there aren't other ways that the grammar proves the sentence well-formed. In fact, we return now to the point made in fn. 3 - it seems to me that any theory really does have to be committed to the view that some sentences have more than one analysis even under a single meaning. Given "Roses are red and violets are blue and irises are white", it would take a lot of extra work and machinery - under almost any conceivable view of coordination - to block two analyses of this (which are semantically equivalent). The two analyses are: (i) [[roses are red and violets are blue] and [irises are white]] and (ii) [[roses are red] and [violets are blue and irises are white]]. (Whether the rules for coordination are binary or not makes no difference here.) If we substitute in "or" for one of the occurrences of "and" we find that the fact that there are two analyses is a happy result, since "Roses are red or violets are blue and irises are white" is ambiguous in just the way the two possible bracketings predict. It just happens that the semantics of "and" is associative, and so no ambiguity happens to arise in the case where there are two occurrences of "and". So as far as I can see, all or at least most theories really do allow for the possibility of an unambiguous sentence having two analyses (as I say - it would take a lot of extra machinery to preclude this), and the notion that "Lee loves bananas" has two analyses should not be such a shock. But it often is a shock, due to the common talk by which an unambiguous sentence has a (single) tree. I personally think that if trees were more clearly presented as the end result of a rule system with respect to a given sentence, it would be clearer at the outset that the rule system could work in two different ways to prove a sentence well-formed even under a single meaning. 
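</p><p>The point that one rule system can prove the same unambiguous sentence well formed in more than one way can be made concrete with a sketch of my own (the Python encoding of slash categories and the toy lexicon are illustrative assumptions; the combinators - application, type raising, and function composition - are the standard ones in the Steedman/Dowty-style systems cited above):</p>

```python
# Slash categories: ("R", res, arg) is res/Rarg (wants arg to its right);
# ("L", res, arg) is res/Larg (wants arg to its left).
NP, S = "NP", "S"
def r(res, arg): return ("R", res, arg)
def l(res, arg): return ("L", res, arg)

LEX = {"Lee": NP, "bananas": NP, "loves": r(l(S, NP), NP)}  # (S/LNP)/RNP

def apply_(f, a, direction):
    """Forward/backward application: A/RB + B => A, or B + A/LB => A."""
    if isinstance(f, tuple) and f[0] == direction and f[2] == a:
        return f[1]

def raise_np(cat):
    """Type raising: NP => S/R(S/LNP)."""
    if cat == NP:
        return r(S, l(S, NP))

def compose(f, g):
    """Forward composition: X/RY + Y/RZ => X/RZ."""
    if f[0] == "R" and g[0] == "R" and f[2] == g[1]:
        return r(f[1], g[2])

def conjoin(x, y):
    """The coordination schema X and X => X."""
    if x == y:
        return x

# Analysis 1 (the textbook one): "loves bananas" is a constituent, S/LNP.
vp = apply_(LEX["loves"], LEX["bananas"], "R")
assert vp == l(S, NP) and apply_(vp, LEX["Lee"], "L") == S

# Analysis 2: "Lee loves" is a constituent, S/RNP, by raising plus composition.
lee_loves = compose(raise_np(LEX["Lee"]), LEX["loves"])
assert lee_loves == r(S, NP) and apply_(lee_loves, LEX["bananas"], "R") == S

# And so "Lee loves" passes the coordination test (right node raising):
mary_hates = compose(raise_np(NP), r(l(S, NP), NP))   # "Mary hates", same category
assert apply_(conjoin(lee_loves, mary_hates), LEX["bananas"], "R") == S
```

<p>Both derivations terminate in the same category S (and, in a full fragment, the same meaning); nothing in the lexicon changed, only the order in which the combinatorics applied.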
</p><p>Returning to the predictions of Categorial Grammar (and related theories), depending on the exact version that one adopts one can come up with a grammar in which any substring of a given sentence is a 'constituent' and should therefore pass the coordination test. We noted above that the categories that can coordinate will have to have some limits; assuming a binary view of coordination, there are constituents that still cannot conjoin. But the flexibility of constituency combined with the assumption that only 'constituents' can conjoin leads to the possibly surprising prediction that almost any substring can pass the coordination test. And while it is easy to construct cases that appear to threaten this generalization, they often have other reasons for being bad. For example, Adger 2003 (p. 125) gives some examples of bad cases, but these can all be tweaked into similar sentences which are good. A bad example is *"Owly hated [the evil] and [the wise] eagle" - but with a bit of prosody and getting the right focus and other tweaking, we can construct the parallel good sentence: "Owly hated (both) THE EVIL and A WISE eagle" (caps here used to indicate contrastive stress). Granted, this prefers the addition of "both", but that should not affect the main point. Or his example (79) *"Owly hated the and loved the bat" is easily fixed to remove the strange repetition of "the" - giving "Owly hated two and loved three bats". If we take seriously that conjunction is a test for constituency and have various ways to put these together, these results then become unsurprising. </p><p>But wait says the skeptic again: what about all those standard constituent tests that we learned about in Intro (Linguistics or Syntax)? "Lee loves" passes almost none of them except coordination (ditto for all the 'nonstandard' constituents discussed above). 
Of course it is this very reaction (which I have literally heard) that prompts this blog in the first place. The reaction stems simply from bad teaching and the fallacy discussed at length in Sec. 2. As stressed at several points in this blog, plenty of 'standard' constituents also fail the tests. Once again, we would ultimately want an explanation for this (both for the case of "Lee loves" and for the case of the other constituents), but again we can discover this only by looking case by case to see what these tests are testing. Jacobson 2014 discusses this in greater detail: there it is shown that there is good reason why we would not expect to find an anaphor of category S/RNP (the category of "Lee loves"). The remarks there extend to predict that we would not find an anaphor of the category of something like "clams on Monday". And the limited distribution of these 'funny' constituents also follows from the way that the word order generalizations work. As to why "Lee loves" cannot be the answer to a question - this is the same as the earlier discussion about relative clauses: there is no question that this can answer. </p><p>There is another piece of evidence often used to determine constituent structure: c-command, as it plays a role in the statement of, e.g., Weak Crossover, the 'binding' of reflexives, the distribution of Negative Polarity items, etc. The claim that "Lee loves bananas" can be put together by first combining "Lee" with "loves" to give "Lee loves" wreaks havoc with standard notions of c-command. As such, this is another reason why the flexible constituent structure possibilities discussed above draw gasps: how can one possibly propose that there is an analysis by which "bananas" c-commands "Lee"? Won't this completely ruin all generalizations based on c-command?</p><p><span style="white-space: normal;">The answer is no. 
This is another great example of losing sight of the forest for the trees - and of taking the trees as having some primacy over a deeper understanding of the phenomena in question. All of the phenomena above should have something to do with the way the semantics is put together, and one might hope that any c-command generalizations are the result of something else. Put differently, it seems highly unlikely that the grammar contains, for example, a statement to the effect that a Negative Polarity item must be c-commanded by a downward entailing operator. If one takes the general line begun in Kadmon and Landman 1993, or many later variants of it, downward entailing environments are ones where semantic strength is reversed, and that remains the case regardless of the order in which things are put together. The hope is that the distribution of Negative Polarity Items follows from their meaning combined with the strength reversing property (and perhaps additional principles). Weak Crossover effects can also be the result of the way 'binding' works, where c-command need not be explicitly built into the grammar in any way. For one way to accomplish this, see the variable-free account of binding in Jacobson 1999 and subsequent work: the fact that there is no 'binding' relationship possible in *"Hisi mother called every fourth grade boyi" follows from the basic argument structure of "call" and the way that interacts with the operation that effects 'binding'. The order in which the combinatorics is put together makes no difference. </span></p><p><b>6. A note about semantics </b></p><p>It is also typical for introductory books to point to ambiguous sentences (or phrases) like "The spy saw the man with the binoculars" or "old dogs and cats" as a help in determining constituent structure, since in each case these should have two different trees. 
Such discussions generally appeal to common sense semantic intuitions as a way of constructing these trees - and thereby (the unspoken part) as a way of determining the possible rules that instantiate the generalized schema. This is fine, but I always find it striking that this general strategy assumes something about the syntax/semantics interface but never makes that assumption explicit - seeming to take it as common sense. But actually this view is inconsistent with the model of the syntax/semantics interface taken by the end of some of these books - at least those that mention rules producing a Logical Form which is the input to the interpretive procedure. (Those books include Adger 2003, although only in passing, Carnie 2021, and Koeneman and Zeijlstra 2017.) Notice that the apparent common sense observations about ambiguity only make sense under the assumption that the trees sanctioned by the initial p.s. rule schema (X-bar theory, Merge, or whatever) are what is interpreted! In fact, the simplest way to conceive of this would be not to have the syntax 'produce' trees which are then interpreted, but rather to adopt Direct Compositionality (yes, another agenda-driven plug), whereby each phrase structure rule (schema) is coupled with a semantic rule giving the meaning of the mother in terms of the meaning of the daughters. (Under this view, there is no level which is interpreted: interpretation proceeds in tandem with the syntactic building operations.) This is exactly the view of Montague 1970 and other works by Montague, and it is the view taken in Categorial Grammar, among other theories (Gazdar et al. 1985, to give just one example, also took this view). 
I personally think it is right, but generally no comment is made about how the syntax and semantics explicitly work together, and what 'trees' have to do with meaning.</p><p>In sum, then, if one is going to introduce ambiguity and more generally semantic intuitions as a way to justify a particular set of rules (i.e., a particular constituent structure), it would be helpful to make explicit exactly the assumptions about how the semantics works. I don't know that this is ever done, and in some cases these assumptions are discarded by the end of the books, but no comment is made on how that may or may not affect the initial 'evidence' given for constituent structure. </p><p><b>7. Conclusion<span style="white-space: pre;"> </span></b></p><p>My main goal has been to argue that much of the discussion about trees in so many introductory books (including the semantic discussions, constituent structure, the role of c-command, etc.) potentially obscures the details of the model of grammar being assumed, the project of modelling that, and the search for deeper explanations as to why things work the way they do. This has led to certain ideas about syntax being treated as 'non-negotiable' truths when indeed they should certainly be up for discussion. Not having embarked on the difficult project of writing a syntax textbook myself, I should - and to some extent do - feel loath to criticize others for what is a hugely difficult territory to navigate, since obviously oversimplifications are needed at the outset. Nonetheless, the result of ignoring these issues is that the problems I am addressing here go beyond just the teaching of syntax itself, for some of the mistaken or questionable conclusions that result have become very deeply ingrained in the field as a whole. </p><p><b>Footnotes </b></p><p>[1] Incidentally, my impression is that trotting out the battery (or some subset) of constituent tests was not always de rigueur in introductory books but has become so only in the last few decades. 
To check this intuition (albeit not very systematically) I looked at four earlier syntax texts - Baker 1978, Soames and Perlmutter 1979, van Riemsdijk and Williams 1986, and Napoli 1993. Sure enough, none of these books do this, insofar as I could find. Baker does discuss the 'proform' replacement test towards the end of his book, but only as a way of discussing that phenomenon itself, not as a means of testing for constituents. He assumes a certain constituent structure, and uses that to argue for a specific conclusion about the conditions for 'one-replacement'. </p><p>[2] I think the 'replacement by a proform' test is problematic, especially in the case where "do so" is treated as a single 'proform' that 'replaces' certain VPs. Does it make sense to call "do" here part of a 'proform'? Is it not the same main verb "do" that takes NP but not VP complements: "do nothing", "do something", "do several things", "What he had done was he took out the garbage" (specificational sentence with free relative in precopular position), "What he had done was stupid" (predicational sentence with free relative in subject position)? In all such cases the complement or the 'missing' material (trace, gap, whatever) is arguably an NP. If this is the same verb "do", then "so" is the proform (not "do so"). It does remain unclear what the category of "so" is, since it does not behave like an NP (note: *"Lee changed the lightbulb, and Sandy either did so or nothing" vs. "...and Sandy either did so or did nothing"). Nonetheless, given that there is a main verb "do" with the same semantic restrictions as the "do" that occurs with "so", it is reasonable to think they are the same verb. Thus note: *"Lee knew the answer and Sandy did so too", alongside *"Lee did several things, including know the answer". If this is main verb "do", then "so" is the anaphor; it is also a CP anaphor, but with rather limited distribution ("think so", "hope so", but not *"regret so", etc.). 
Thus the "do so" construction is complex to analyze, but in any case calling "do so" a 'proform' that 'replaces' a VP is questionable.</p><p>[3] By saying that books convey the idea that an unambiguous sentence is represented by a (single) tree, I mean this simply in opposition to the representation of a sentence as a sequence of trees, or some other object. I know of no theory that actually requires an unambiguous sentence to necessarily have only one possible analysis. Whatever one adopts as the fine-grained analysis of conjunction, a case like "Roses are red and violets are blue and lilies are white" will automatically have two possible bracketings which - under most assumptions of how the semantics works - will have the same truth conditions. This will be discussed more in Sec. 5. The key point here is that what is generally conveyed is that a sentence has as its analysis a single tree, as opposed to a sequence of trees or some other object.</p><p>[4] Thus Collins 2015 proposed that relative clauses can delete. </p><p>[5] The rules could be such that the 'structure' assigned to a string of the form X and X is [[X and] X], or the rules could end up analyzing this as [X [and X]]. Both Munn 1993 and Jacobson 2014 opt for rules of the latter type, as these would follow from other general facts about English word order. </p><p><b>References</b></p><p>Adger, D. 2003. Core Syntax: A Minimalist Approach. Oxford: Oxford University Press. </p><p>Bach, E. 1979. "Control in Montague Grammar". Linguistic Inquiry 10, 515-531.</p><p>Baker, C.L. 1978. Introduction to generative transformational syntax. Englewood Cliffs, NJ: Prentice-Hall. </p><p>Carnie, A. 2021. Syntax: A Generative Introduction, 4th edition. Malden MA: Wiley Blackwell. </p><p>Collins, C. 2015a. Relative Clause Deletion. In Ángel J. Gallego and Dennis Ott <span style="white-space: pre;"> </span>(eds.). 50 Years Later: Reflections on Chomsky's Aspects. Vol. 77 of MIT Working Papers in Linguistics, Cambridge, MA. 
57-69.</p><p>Dawson, H. and M. Phelan (eds.) and Dept. of Linguistics, Ohio State University, 2016. Language Files: Materials for an Introduction to Language and Linguistics, 12th edition. Columbus: Ohio State University Press. </p><p>Dowty, D. 1988. "Type Raising, Functional Composition, and Non-Constituent Conjunction". In R. Oehrle, E. Bach, and D. Wheeler (eds.), Categorial Grammars and Natural Language Structures. Dordrecht: Reidel, 153-197.</p><p>Fromkin, V. et al., 2000. Linguistics: An Introduction to Linguistic Theory. Malden MA: Blackwell Publishing.</p><p>Gazdar, G., E. Klein, G.K. Pullum and I. Sag 1985. Generalized Phrase Structure Grammar. Oxford: Basil Blackwell.</p><p>Ginzburg, J. and I. Sag 2000. Interrogative Investigations: The form, meaning, and use of English interrogatives. Stanford CA: CSLI Publications. </p><p>Groenendijk, J. and M. Stokhof 1984. Studies in the semantics of questions and the pragmatics of answers. Amsterdam: University of Amsterdam Dissertation. </p><p>Jacobson, P. 1999. "Toward a Variable Free Semantics". Linguistics and Philosophy 22, 117-185.</p><p>Jacobson, P. 2014. Compositional Semantics: An Introduction to the Syntax/Semantics Interface. Oxford: Oxford University Press.</p><p>Jacobson, P. 2016. "The Short Answer: Implications for Direct Compositionality and Vice-Versa". Language 92, 331-375. </p><p>Kadmon, N. and F. Landman 1993. "Any". Linguistics and Philosophy 16, 353-422.</p><p>Kim, J-B and P. Sells 2008. English Syntax: An Introduction. Stanford: CSLI Publications. </p><p>Koeneman, O. and H. Zeijlstra 2017. Introducing Syntax. Cambridge (UK): Cambridge University Press. </p><p>Larson, R. 2010. Grammar as Science. Cambridge MA: MIT Press.</p><p>Merchant, J. 2004. "Fragments and ellipsis". Linguistics and Philosophy 27, 661-738. </p><p>Montague, R. 1970. English as a formal language. In B. Visentini (ed.), Linguaggi nella Società e nella Tecnica. Milan: Edizioni di Comunità, 189-224. </p><p>Munn, A. 1993. 
Topics in the syntax and semantics of coordinate structures. Doctoral dissertation, University of Maryland, College Park.</p><p>Napoli, D.J. 1993. Syntax: Theory and Problems. Oxford: Oxford University Press.</p><p>O'Grady, W. et al., 2017. Contemporary Linguistics: An Introduction, 7th edition. Boston/New York: Bedford/St. Martin's Press.</p><p>Pollard, C. 1983. Generalized Phrase Structure Grammars, Head Grammars, and Natural Language. Ph.D. Dissertation, Stanford University. </p><p>van Riemsdijk, H. and E. Williams 1986. Introduction to the Theory of Grammar. Cambridge MA: MIT Press. </p><p>Soames, S. and D. Perlmutter 1979. Syntactic Argumentation and the Structure of English. Berkeley CA: University of California Press. </p><p>Steedman, M. 1987. "Combinatory grammars and parasitic gaps". Natural Language and Linguistic Theory 5, 403-439.</p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com2tag:blogger.com,1999:blog-7690685905207226001.post-24297727614821786292023-05-04T14:58:00.001-04:002023-05-04T14:58:35.296-04:00Tse Ke Di Ratang Le Tse Ke Sa Di Rateng Kwa Metseng Ya Botswana <p>This is a translation into Setswana of an earlier blog post. I would like to thank Seabelo John for helping me with this translation.</p><p><b>Likes and Dislikes of Living in Rural Botswana</b></p><p>Tse Ke Di Ratang Le Tse Ke Sa Di Rateng Kwa Metseng Ya Botswana </p><p><b>(with the help of Seabelo John)</b></p><p>(ka thuso ya ga Seabelo John)</p><p><b>These likes and dislikes are based on my experiences living in Diphuduhudu (local language: Cua) and Mokgenene (local language: Sasi) in rural Botswana during my linguistic fieldwork. 
</b></p><p>Dilo tse ke di ratang le tse ke sa di rateng di remeletse mo maitemogelong a me fa ke nna kwa Diphuduhudu (teme: Secua) le kwa Mokgenene (teme: Sesasi) e le metse ya Botswana jaaka ke tlhotlhomisa diteme.</p><p><b>The locations I stayed at are called ‘settlements’, meaning that they were created as places where the ‘Basarwa’ (Khoisan population) could move to for access to amenities (e.g., school, clinic). </b></p><p>Mafelo a ke neng ke nna mo go one a bidiwa bodulo/magae, go raya mafelo a Basarwa ba ka yang teng go bona ditlamelo jaaka dikolo le dikokelwana. </p><p><b>I have no experience living in other kinds of rural villages in Botswana.</b></p><p>Ga ke na maitemogelo a go nna kwa metseng e mengwe ko Botswana. </p><p><b>I am grateful to the people of these villages for their friendship and for supporting my research. </b></p><p>Ke leboga banni ba metse e ka gore ba na le botsanalo gape ba rotloetsa ditlhotlhomiso tsa me.</p><p><b>Just to be clear, I love staying in the village and doing fieldwork. 
</b></p><p>Ke batla gore batho ba tlhaloganye gore ke rata go nna mo metseng ke dira ditlhotlhomiso.</p><p><b>The likes and dislikes are presented in no particular order.</b></p><p>Dilo tse ke di ratang le tse ke sa di rateng ga di a tlhomaganngwa ka tsela epe.</p><p><br /></p><p><b>Likes</b></p><p>Dilo tse ke di ratang</p><p><b>1.<span style="white-space: pre;"> </span>Seeing the stars and the Milky Way at night.</b></p><p>Go bona dinaledi le mokgatšha wa dinaledi bosigo.</p><p><b>2.<span style="white-space: pre;"> </span>Doing fieldwork all day long.</b></p><p>Go dira ditlhotlhomiso tsa puo letsatsi lotlhe.</p><p><b>3.<span style="white-space: pre;"> </span>Discovering something cool about Cua/Sasi.</b></p><p>Go tlhaloganya se sentle (se se kgatlhang) ka dipuo tsa Secua kgotsa Sesasi.</p><p><b>4.<span style="white-space: pre;"> </span>Interacting with my elderly consultants.</b></p><p>Go buisana le bagolo ba ba nthusang.</p><p><b>5.<span style="white-space: pre;"> </span>Hot fresh baked bread from the bakery.</b></p><p>Borotho jo bo molelo jo bo tswang kwa lebentleleng la borotho/maapeelong a borotho.</p><p><b>6.<span style="white-space: pre;"> </span>No e-mail, no TV, no Internet.</b></p><p>Ga go na melaetsa ya maranyane, setshwantso sa motshikhinyego, le enthanete mo motseng.</p><p><b>7.<span style="white-space: pre;"> </span>Giving clothes to children, and seeing them smile and laugh.</b></p><p>Go neela bana diaparo, le go ba bona ba nyenya ba bo ba tshega.</p><p><b>8.<span style="white-space: pre;"> </span>Listening to Stadium play his guitar.</b></p><p>Go reetsa Stadium a tshameka katara ya gagwe.</p><p><b>9.<span style="white-space: pre;"> </span>Stadium’s song: ‘On the other side of the river, there is no honey.’</b></p><p>Pina ya ga Stadium: ‘Kwa boseja ga noka, ga gona dinotshi’.</p><p><b>10.<span style="white-space: pre;"> </span>Hearing and practicing Cua/Sasi.</b></p><p>Go utlwa le go ikatisa Secua le SeSasi.</p><p><b>11.<span style="white-space: 
pre;"> </span>Hearing and practicing Setswana.</b></p><p>Go utlwa le go ikatisa Setswana.</p><p><b>12.<span style="white-space: pre;"> </span>Using my solar panel for electricity.</b></p><p>Go dirisa maranyane a me a a dirisang letsatsi go fetlha motlakase.</p><p><b>13.<span style="white-space: pre;"> </span>Seeing six red bars (charged) on the inverter.</b></p><p>Go bona invetara (sekaedi sa motlakase) e/se na le masedi a mahibidu a le marataro. </p><p><b>14.<span style="white-space: pre;"> </span>Sitting around a fire at night talking.</b></p><p>Go ora molelo, le go bua le ditsala bosigo.</p><p><b>15.<span style="white-space: pre;"> </span>Eating motogo (sorghum porridge) in the morning.</b></p><p>Go ja motogo phakela.</p><p><b>16.<span style="white-space: pre;"> </span>Taking walks in the village after work, talking to people.</b></p><p>Go tsamaya ka dinao mo motseng morago ga tiro (fa re tšhaisa), </p><p>le go bua le batho ba motse.</p><p><b>17.<span style="white-space: pre;"> </span>Taking a hot bucket shower, in my office.</b></p><p>Go tlhapa ka metsi a molelo mo emereng mo ofising ya me.</p><p><b>18.<span style="white-space: pre;"> </span>Playing Morabaraba (advanced tic-tac-toe) in the sand/on the ground.</b></p><p>Go tshameka Morabaraba (kgotsa Mhele/Mohele) mo mmung.</p><p><b>19.<span style="white-space: pre;"> </span>Waking up to the rooster crowing.</b></p><p>Go tsoga mokoko o lela.</p><p><b>20.<span style="white-space: pre;"> </span>Locating and buying local products (e.g., tomatoes, vegetables, eggs)</b></p><p>Go batla le go reka dijo tsa mo gae jaaka ditamati, merogo, le mae.</p><p><b>21.<span style="white-space: pre;"> </span>Driving to the nearest town for provisions.</b></p><p>Go kgweeletsa kwa motseng wa mabapi go batla dithoto.</p><p><b>22.<span style="white-space: pre;"> </span>Absolute quiet in the middle of the night.</b></p><p>Ga go na modumo gotlhelele bosigogare.</p><p><b>23.<span style="white-space: pre;"> </span>Breathing the fresh 
country air.</b></p><p>Go hema pefo e phepha ya motse.</p><p><b>24.<span style="white-space: pre;"> </span>Watching the sunrise and sunset over the desert.</b></p><p>Go bona tlhabo ya letsatsi le phirimo ya letsatsi mo sekakeng.</p><p><b>25.<span style="white-space: pre;"> </span>Living outside my comfort zone.</b></p><p>Go nna go sena manobonobo.</p><p><br /></p><p><b>Dislikes</b></p><p>Tse ke sa di rateng</p><p><b>1.<span style="white-space: pre;"> </span>Flies everywhere, all the time.</b></p><p>Go na le dintsi tse dintsi gongwe le gongwe, letsatsi le letsatsi.</p><p><b>2.<span style="white-space: pre;"> </span>Sand on my office floor.</b></p><p>Go na le motlhaba fa fatshe mo ofising yame.</p><p><b>3.<span style="white-space: pre;"> </span>Walking on sand.</b></p><p>Go tsamaya ka dinao mo motlhabeng.</p><p><b>4.<span style="white-space: pre;"> </span>Getting sand in shoes.</b></p><p>Go tsenwa ke motlhaba mo ditlhakong</p><p><b>5.<span style="white-space: pre;"> </span>Thorns in sand.</b></p><p>Mitlwa mo motlhabeng.</p><p><b>6.<span style="white-space: pre;"> </span>Seeing people living in abject poverty.</b></p><p>Go bona batho ba ba senang madi gotlhelele.</p><p><b>7.<span style="white-space: pre;"> </span>Seeing widespread alcoholism.</b></p><p>Go bona batho ba ba nwang bojalwa thatha.</p><p><b>8.<span style="white-space: pre;"> </span>Using one small room for office, bedroom, and shower.</b></p><p>Go dirisa kamore e le nngwe e nnye e le ofisi, kamore ya borobalo, </p><p>le kamore ya botlhapelo.</p><p><b>9.<span style="white-space: pre;"> </span>Using the outdoor latrine.</b></p><p>Go dirisa ntlwana ya boithomelo ya kwa ntle.</p><p><b>10.<span style="white-space: pre;"> </span>The smell of the outdoor latrine.</b></p><p>Monko wa ntlwana ya boithomelo ya kwa ntle.</p><p><b>11.<span style="white-space: pre;"> </span>Walking to the latrine in the dark in the morning.</b></p><p>Go ya kwa ntlwaneng ya boithomelo mo lefifing phakela.</p><p><b>12.<span 
style="white-space: pre;"> </span>Chickens defecating on my porch.</b></p><p>Dikoko di ithomela mo setupung same.</p><p><b>13.<span style="white-space: pre;"> </span>Goats drinking from my buckets.</b></p><p>Dipodi di nwela metsi mo diemereng tsame.</p><p><b>14.<span style="white-space: pre;"> </span>Cloudy days when the solar panel does not work well.</b></p><p>Malatsi a maru a thibileng fa maranyane a letsatsi a sa bereke sentle.</p><p><b>15.<span style="white-space: pre;"> </span>Seeing three bars (not charged) on the inverter.</b></p><p>Go bona invetara (sekaedi sa motlakase), e na le masedi a le mararo.</p><p><b>16.<span style="white-space: pre;"> </span>No electricity (no fans or heaters).</b></p><p>Ga go na motlakase (ga go na fene, ga go na hitara).</p><p><b>17.<span style="white-space: pre;"> </span>No washing machine.</b></p><p>Ga go na motšhine o o tlhatswang diaparo.</p><p><b>18.<span style="white-space: pre;"> </span>No indoor running water.</b></p><p>Ga go na metsi mo ntlong.</p><p><b>19.<span style="white-space: pre;"> </span>Getting skin cancer from being in the sun.</b></p><p>Go tsenwa ke kankere ya letlalo fa ke nna mo letsatsing.</p><p><b>20.<span style="white-space: pre;"> </span>Getting out of my blankets in the frigid morning air.</b></p><p>Go tswa mo dikobong fa go le tsididi phakela.</p><p><b>21.<span style="white-space: pre;"> </span>Chickens crowing during recording.</b></p><p>Dikoko di lela fa ke gatisa batho.</p><p><b>22.<span style="white-space: pre;"> </span>Children crying during recording.</b></p><p>Bana ba lela fa ke gatisa dipolelo tsa Batho.</p><p><b>23.<span style="white-space: pre;"> </span>Worrying about my house in Gaborone getting robbed.</b></p><p>Go belaela gore ntlo yame ko Gaborone e tla thubiwa.</p><p><b>24.<span style="white-space: pre;"> </span>Living outside my comfort zone.</b></p><p>Go nna go sena manobonobo.</p><p><br /></p>Chris 
Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-12965761081663316462023-05-04T12:05:00.003-04:002023-05-04T12:10:16.530-04:00Implicit Arguments at WCCFL 41 (2023): Handout and Posters<p> Here are three posters and one talk handout to be presented at WCCFL 41 at Santa Cruz. </p><p>All of these presentations concern the syntax of implicit arguments.</p><p><br /></p><p><a href="https://www.dropbox.com/s/eo0jhx92eymem30/Sulemana%20WCCFLE%20%283%29%20%28poster%29.pdf?dl=0" target="_blank">Buli</a></p><p style="background-color: white; color: #222222; font-family: "Times New Roman", Times, FreeSerif, serif; font-size: 15.4px;">Passive without Morphology: A Case for Implicit Arguments</p><p><span style="background-color: white; color: #222222; font-family: "Times New Roman", Times, FreeSerif, serif; font-size: 15.4px;">Abdul-Razak Sulemana, University of Ghana, Legon</span></p><p><br /></p><p><a href="https://www.dropbox.com/s/0lxqq847z6ka1s2/Gotah_Final_WCCFL_41%20%28poster%29.pdf?dl=0" target="_blank">Ewe</a></p><p style="background-color: white; color: #222222; font-family: "Times New Roman", Times, FreeSerif, serif; font-size: 15.4px;">The Syntax of Ewe (Tongugbe) Nya-Constructions</p><p><span style="background-color: white; color: #222222; font-family: "Times New Roman", Times, FreeSerif, serif; font-size: 15.4px;">Selikem Gotah, New York University</span></p><p><br /></p><p><a href="https://www.dropbox.com/s/w89tnrvgnloqha7/Nikos%20WCCFL_Talk%20%282%29.pdf?dl=0" target="_blank">Greek</a></p><p style="background-color: white; color: #222222; font-family: "Times New Roman", Times, FreeSerif, serif; font-size: 15.4px;">On the Syntactic Status of Implicit Arguments in UG: Greek as a Case Study</p><p><span style="background-color: white; color: #222222; font-family: "Times New Roman", Times, FreeSerif, serif; font-size: 15.4px;">Nikos Angelopoulos, Chris Collins, Dimitris Michelioudakis and 
Arhonto Terzi</span></p><p><br /></p><p><a href="https://www.dropbox.com/s/jn8ykf05kcuddwv/storment_wccfl%20%283%29%20%28poster%29.pdf?dl=0" target="_blank">Spanish</a></p><p><span style="background-color: white; color: #222222; font-family: "Times New Roman", Times, FreeSerif, serif; font-size: 15.4px;">Two is Better than One: A Number Mismatch with Deficient Implicit Arguments</span></p><p><span style="background-color: white; color: #222222; font-family: "Times New Roman", Times, FreeSerif, serif; font-size: 15.4px;">John David Storment, Stony Brook</span></p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-81556009519954382682023-04-06T09:01:00.002-04:002023-04-06T09:01:44.331-04:00MA Scholarship: Khoisan Languages at MIT<p>This is a legitimate opportunity for those of you who are interested in promoting Khoisan languages in Botswana, South Africa, Namibia, Angola and Zimbabwe. Please take a look at the description and see if it is right for you. 
Linguistics background is not necessary.<br /> </p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEjbn-Ip2yLkwJUWTdvEKktKuUYM48W53IMkawojTasub7klPGuIzXBcQuEVCDnzubSjaF01t_XsRKJESq9CjfjAMdqigr7WgAKCiegKvUz8FzMHUK_PNEBMcWaLlr5VRgfLANxLe4OGukNl-OkvTXZrr0ILwnsGHbZFpTlIIzH1mi1GJfaLqpJcDUwf-A" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="1595" data-original-width="1224" height="602" src="https://blogger.googleusercontent.com/img/a/AVvXsEjbn-Ip2yLkwJUWTdvEKktKuUYM48W53IMkawojTasub7klPGuIzXBcQuEVCDnzubSjaF01t_XsRKJESq9CjfjAMdqigr7WgAKCiegKvUz8FzMHUK_PNEBMcWaLlr5VRgfLANxLe4OGukNl-OkvTXZrr0ILwnsGHbZFpTlIIzH1mi1GJfaLqpJcDUwf-A=w461-h602" width="461" /></a></div><p></p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-66773078675233949552023-04-05T15:07:00.002-04:002023-04-06T09:01:54.884-04:00Chomsky Keio-EMU Lectures (March 2023)<p> <a href="https://www.youtube.com/playlist?list=PLWXQYx-RCmeP7B2UtIA8OJsvAF-xvjDuZ" target="_blank">Keio-EMU Lectures (March 2023)</a><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEiQTHLaoQ0QmbebEFipEbAVh1_nVto1uUrL_ovzMnLC8Z_YkAZnfzNc_0YNvLwmk5wJZI07vXgqcgC4EoRGnGtR27RsPULC_suzAwIZ7U5Krj3jPGYSaPl2pY_HuHIeI1OWePT4RbymLTaUchCR4XWHpUd7AcOJG69Lncc2NonOSQuGMMM5kEi7PWIrJw" style="margin-left: 1em; margin-right: 1em;"><img data-original-height="3508" data-original-width="2481" height="640" src="https://blogger.googleusercontent.com/img/a/AVvXsEiQTHLaoQ0QmbebEFipEbAVh1_nVto1uUrL_ovzMnLC8Z_YkAZnfzNc_0YNvLwmk5wJZI07vXgqcgC4EoRGnGtR27RsPULC_suzAwIZ7U5Krj3jPGYSaPl2pY_HuHIeI1OWePT4RbymLTaUchCR4XWHpUd7AcOJG69Lncc2NonOSQuGMMM5kEi7PWIrJw=w453-h640" width="453" /></a></div><p></p>Chris 
Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-81943796527008410642023-03-07T14:50:00.005-05:002023-03-07T14:56:12.778-05:00Internet Searches as a Tool in Syntactic Research (third version)Internet searches have turned out to be a revolutionary tool in syntactic research. In this blog post, I outline the general methodology.<div><br /></div><div><a href="https://www.dropbox.com/s/8eeql5g8n8szm4b/Internet%20Searches%20%28March%207%202023%29.pdf?dl=0" target="_blank">Internet Searches</a><br /></div>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-57651635915265748382023-03-03T11:19:00.003-05:002023-05-04T12:08:30.780-04:00Implicit Arguments at WCCFL 41 (2023): Abstracts<p>The following abstracts concerning implicit arguments have been accepted to WCCFL 41:</p><span><a name='more'></a></span><p><a href="https://www.dropbox.com/s/1etqu1ekghfjxrd/Sulemana.WCCFL41.pdf?dl=0" target="_blank">Buli</a></p><p>Passive without Morphology: A Case for Implicit Arguments</p><p>Abdul-Razak Sulemana, University of Ghana, Legon</p><p><br /></p><p><a href="https://www.dropbox.com/s/xvy13ouwtlbuqa1/Gotah%20WCCFL%20Abstract.pdf?dl=0" target="_blank">Ewe</a></p><p>The Syntax of Ewe (Tongugbe) Nya-Constructions</p><p>Selikem Gotah, New York University<span><br /></span><span></span></p><p><br /></p><p><a href="https://www.dropbox.com/s/5ku9sh3zbhcsutc/Angelopoulos%20WCCFL%20%282%29.pdf?dl=0" target="_blank">Greek</a><br /></p><p>On the Syntactic Status of Implicit Arguments in UG: Greek as a Case Study</p><p>Nikos Angelopoulos, Chris Collins, Dimitris Michelioudakis and Arhonto Terzi</p><p><br /></p><p><a href="https://www.dropbox.com/s/6czegaokep1eaz5/Storment%20wccfl_abstract.pdf?dl=0" target="_blank">Spanish</a></p><p>Two is Better than One: A Number Mismatch with Deficient Implicit Arguments<br />John David Storment, 
Stony Brook</p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0tag:blogger.com,1999:blog-7690685905207226001.post-13407570622562436672023-02-28T09:03:00.006-05:002023-02-28T09:03:37.349-05:00On the Syntactic Status of Implicit Arguments in UG: Greek as a Case Study (Accepted WCCFL 41 2023 Abstract)<p> The abstract below is a joint effort by our team, including (in alphabetical order):<br />Niko Angelopoulos, Chris Collins, Dimitris Michelioudakis and Arhonto Terzi.</p><p><a href="https://www.dropbox.com/s/wi7kasearjetbln/WCCFL%20%282%29.pdf?dl=0" target="_blank">On the Syntactic Status of Implicit Arguments: Greek as a Case Study</a></p><p><br /></p>Chris Collinshttp://www.blogger.com/profile/17999530032394474839noreply@blogger.com0