author    | JJ | 2024-03-17 22:24:56 +0000
committer | JJ | 2024-03-17 22:24:56 +0000
commit    | 6b05a56d5aa2cac4c964a0a9a9987a273c413dc5
tree      | 33d9e94e9b8554228263a3b15e87ef9fc642219a /linguistics
parent    | 13f77d4144a358df475cad6fdfbfcc75d4cf288e
meow
Diffstat (limited to 'linguistics')
-rw-r--r-- | linguistics/syntax.md | 100
1 file changed, 69 insertions, 31 deletions
diff --git a/linguistics/syntax.md b/linguistics/syntax.md
index 13e3dba..6590911 100644
--- a/linguistics/syntax.md
+++ b/linguistics/syntax.md
@@ -13,7 +13,11 @@ They can be considered to form an overarching **morphosyntactic** theory.

## summary

-> Be warned! These notes are incomplete and almost certainly somewhat inaccurate. Proceed at your own risk.
+These notes are ordered in a way that I feel builds upon itself the best. This is not the order in which topics were covered in my syntax class, nor in my textbook. My syntax class covered Agree before Move, and my textbook deeply intertwined Merge with X'-theory and Move with Agree: I think both of them suffered a little pedagogically for that.
+
+Certainly, all of syntax cannot be taught at once. Yet the desire to generalize and apply what one has learned to real-world examples is strong, and it is extraordinarily difficult to teach syntax in a way that builds upon itself naturally. This is my best attempt, but it will fall flat in places: when it does, I do recommend either skipping ahead or being content with temporarily (hopefully) not knowing what's going on.
+
+Be warned! These notes are incomplete and almost certainly somewhat inaccurate. Proceed at your own risk.

- History of Syntax
- A wrong approach: Phrase Structure Rules

@@ -61,48 +65,24 @@ They can be considered to form an overarching **morphosyntactic** theory.

## basic ideas

+### constituency
+
+### heads, specifiers, and complements
+
## notation

So far, we've been discussing syntax and giving examples using somewhat informal notation. We now formalize this notation.

-It cannot be emphasized enough that notational conventions are *just that*: notational conventions. There's nothing stopping us from exclusively using X'-notation or exclusively using bare phrase structure, and syntactic concepts are *not* tied to any specific notation. I will pretty much exclusively bare phrase structure going forth as I like it a whole lot more.
-
### X'-theory

**X'-theory** (x-bar theory) is a notation originally put forth by Chomsky...

```forest
[XP
  [X (head)]
  [Y (complement)]]
```

```forest
[XP
  [Y (complement)]
  [X (head)]]
```

```forest
[X
  [Y_X (left adjunct)]
  [X (head)]]
```

```forest
[X
  [X (head)]
  [Y_X (right adjunct)]]
```

...

### Bare Phrase Structure

**Bare Phrase Structure** (BPS) is a more modern notation that does away with much of the notational cruft of X'-theory. Instead of bar levels and distinctions between bar levels and "phrases", we simply put the *formal features* of our lexicon in the chart itself and only indicate the *types* of phrases. Whether or not a phrase has yet to close (previously indicated by a bar) is now indicated by whether there are any unsatisfied selectional requirements on the phrase label.

-As such, we may represent phrases with the
-
**Head-Initial Phrases**

![`[X [X_Y (head)] [Y (complement)]]`](head-initial.png)

@@ -191,18 +171,61 @@ The lexicon and structure are blended in bare phrase structure. This is useful, ...

-### Lexical Entries
-
-...

+It cannot be emphasized enough that notational conventions are *just that*: notational conventions. There's nothing stopping us from exclusively using X'-notation or exclusively using BPS, and syntactic concepts are *not* tied to any specific notation. I will pretty much exclusively use BPS going forth as I like it a whole lot more.
+
+### lexical entries
+
+We have stated that Bare Phrase Structure pulls aspects of the lexicon directly into the syntax tree.
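For instance, a head-initial determiner phrase such as *the cat* might be sketched in this notation roughly as below. The example and its features are only illustrative (not an analysis these notes commit to): a D head whose f-features select an N is merged with that N, leaving no unsatisfied selectional requirements on the resulting D label.

```forest
[D
  [D_N the (head)]
  [N cat (complement)]]
```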
+But what is a lexicon?
+
+A **lexicon** is a language speaker's internal collection of lexical entries. But what is a lexical entry?
+
+What exactly a lexical entry contains is up to some debate. The English language consists of (significantly) upwards of 400,000 words. How can humans hold that much information in their minds and retrieve it so quickly? This is biologically interesting, and there are arguments from those fields about what such entries should and should not contain. For our purposes, we will focus entirely on analysis, and ignore biological motivations. We treat a **lexical entry** as containing the following information about an associated morpheme:
+- phonetic features (**p-features**): how the word is pronounced
+  - With our focus on syntax, we shall simply consider this the standard written representation of the morpheme. But it should really be written in IPA.
+- formal features (**f-features**): the type of the morpheme and what types it selects, if any
+  - These are often written directly on our tree in BPS. Most often they are simply the types of the arguments, but heads can select for much more granular features, e.g. -tense or +animacy.
+- semantic features (**s-features**): the role of the entry and its arguments in the sentence
+  - Not all lexical entries have s-features. For tense/aspect/etc. heads, these are the appropriate tense/aspect/etc. values. For verbs, these are typically *theta roles* (which we shall address later).
+
+## Minimalism
+
+[Minimalism](https://en.wikipedia.org/wiki/Minimalist_program) is a *program* that aims to reduce much of the complexity surrounding syntactic analysis. While our theories may end up providing adequate analyses of natural languages, this is not enough. Phrase structure rules, too, were *adequate*: yet we rejected them for their sheer complexity. If we can explain what we observe in a simpler framework, *we should adopt that framework*. Many of the modern advancements in syntactic analysis have come out of Minimalism: bare phrase structure, in particular.
+
+As with most Chomskyan theories, Minimalism has a *strong* focus on the natural language faculty. A core thesis is that *"language is an optimal solution to legibility conditions"*. I don't find this interesting, so I won't get into it, and will instead focus on the definitions and usage of the basic operations rather than the motivation for them.
+
+Modern Minimalism recognizes three *basic operations*: <span style="font-variant: small-caps;">Merge</span>, <span style="font-variant: small-caps;">Move</span>, and <span style="font-variant: small-caps;">Agree</span>. Everything we will discuss falls into one of these camps.

## Merge

+<span style="font-variant: small-caps;">Merge</span>(α, β) is a function that takes in two arguments of type α and β and outputs a single node of either type α or β.
+
+Merge is *the* fundamental underlying aspect of syntax, and arguably of language as a whole. Compositionality, headedness, movement (in some camps), and a whole lot more can be considered to be rolled into it.

### projection

### selection

### silent heads

+Why are proper names Ds? Why is it possible to say either *I moved the couches* or *I moved couches*, but only possible to say *I moved the couch* and not *I moved couch*? Why is the infinitive form of a verb identical to the present tense in some cases?
+
+These inconsistencies can all be addressed by one (controversial) concept: the idea of *silent morphemes*, invisible in writing and unpronounceable in speech. We represent such morphemes as ∅, and so may write the earlier strange sentence as *I moved ∅-couches*.
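Under this analysis, *∅-couches* is built by the same <span style="font-variant: small-caps;">Merge</span> step as *the couches*: a determiner head, here silent, selects an N. The sketch below is only illustrative; the feature on the silent D mirrors the hypothetical lexical entries tabulated just after it.

```forest
[D
  [D_N ∅ (silent head, +plural)]
  [N couches (complement)]]
```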
+
+...
+
p-features | f-features
-----------|-----------
the | $D_{N}$
a | $D_{N (-plural)}$
∅ | $D_{N (+plural)}$

p-features | f-features | s-features
-----------|------------|-----------
will | $T_{D,V}$ | future
-ed | $T_{D,V}$ | past
∅ | $T_{D,V}$ | present
to | $T_{D,V}$ (-tense) | infinitive

+
## Move

### head movement

...

@@ -219,6 +242,21 @@

### binding

+How do pronouns work?
+
+...
+
+The theory of binding operates under three fundamental principles.
+- **Principle A**: an anaphor must be bound in its domain.
+- **Principle B**: a pronoun must be free in its domain.
+- **Principle C**: an r-expression may never be bound.
+
+Our principles imply various things. Principle A implies that:
+- a reflexive must be coreferential with its antecedent
+- the antecedent of a reflexive must c-command the reflexive
+- the reflexive and its antecedent must be in all the same nodes that have a subject
+
+
### raising and control

## References