An Entropy and Noisy-Channel Model for Rule Induction

Author: Silvia Radulescu
LOT Number: 607
ISBN: 978-94-6093-392-9
Pages: 291
Year: 2021
1st promotor: Sergey Avrutin
2nd promotor: Frank Wijnen
€39.00
Download this book as a free Open Access full-text PDF

Language learners not only memorize specific items and combinations of items; they also infer statistical patterns between these specific items (item-bound generalization) and form categories and generalized rules that apply to whole categories of items (category-based generalization). The mechanisms and factors that trigger and modulate rule learning are still largely underspecified.

The main goal of this dissertation is to propose and test an innovative entropy model for rule induction based on Shannon's noisy-channel coding theory. The main hypothesis of the entropy model is that rule induction is an encoding mechanism gradually driven by the dynamics between an external factor (input entropy) and an internal factor (channel capacity). Entropy measures the variability of the input, while channel capacity is the amount of entropy processed per second.
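
For reference, "input entropy" here corresponds to Shannon entropy; the formula below is the standard textbook definition, not a quotation from the dissertation, and the worked numbers are illustrative rather than the dissertation's actual stimuli. For items x occurring with probabilities p(x),

H(X) = − Σₓ p(x) log₂ p(x)   (in bits)

A set of n equally likely items thus has entropy log₂ n: four equiprobable items carry 2 bits, and doubling the set to eight adds only one more bit. Channel capacity, on the definition above, is then measured in bits per second, so a faster presentation rate raises the amount of entropy that must be encoded per unit time.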

The findings showed that as input entropy increases, the tendency to move from item-bound generalization to category-based generalization increases gradually. Results also showed that low input entropy facilitates item-bound generalization, not only rote memorization. In the case of non-adjacent dependencies, results showed that it is input entropy that drives rule learning, rather than the set size of the items, as previously claimed. Regarding channel capacity, the findings showed that a sped-up rate of information transmission leads to a stronger tendency toward category-based generalization. These findings provide evidence in favor of the entropy model. The dissertation also sketches the first joint information-theoretic and thermodynamic model of rule induction, proposing that the second law of thermodynamics and the constructal law can answer why and how rule induction happens.

 
