A major goal of linguistics and cognitive science is to understand what class of learning systems can acquire natural language. Until recently, the computational requirements of language have been used to argue that learning is impossible without a highly constrained hypothesis space. Here, we describe a learning system that is maximally unconstrained, operating over the space of all computations, and is able to acquire many of the key structures present in natural language from positive evidence alone. We demonstrate this by providing the same learning model with data from 74 distinct formal languages which have been argued to capture key features of language, have been studied in experimental work, or come from an interesting complexity class. The model is able to successfully induce the latent system generating the observed strings from small amounts of evidence in almost all cases, including for regular languages (e.g., a^n and (ab)^n) and languages that might characterize the key dependencies in English "if–then" relationships.
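For readers unfamiliar with the formal-language notation, here is a minimal Python sketch, not the paper's model, illustrating what membership in the regular languages a^n and (ab)^n means, and what positive evidence for such languages looks like (all names here are illustrative):

```python
import re

# Illustrative membership tests for two regular languages named above.
def in_a_n(s: str) -> bool:
    """True iff s is 'a', 'aa', 'aaa', ... (a^n, n >= 1)."""
    return re.fullmatch(r"a+", s) is not None

def in_ab_n(s: str) -> bool:
    """True iff s is 'ab', 'abab', 'ababab', ... ((ab)^n, n >= 1)."""
    return re.fullmatch(r"(ab)+", s) is not None

# Positive evidence for a learner would be samples drawn from the language:
positive_a_n = ["a" * n for n in range(1, 4)]    # ['a', 'aa', 'aaa']
positive_ab_n = ["ab" * n for n in range(1, 4)]  # ['ab', 'abab', 'ababab']
```

A learner trained from positive evidence alone sees only strings like `positive_a_n`, never counterexamples, and must still converge on the generating system.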