Toward a Modality-Neutral Theory: Reconstructing Hockett’s Design Features of Human Language Through Multimodal Critique

Authors

  • Xiaoxuan Yang, School of English Studies, Zhejiang Yuexiu University, Shaoxing 312030, China

DOI:

https://doi.org/10.63313/LLCS.9056

Keywords:

Hockett's design features, Modality-neutral theory, Multimodal critique, Cognitive plasticity, Language evolution, Digital semiotics

Abstract

Charles F. Hockett’s design features framework faces critical limitations when applied to non-vocal modalities. This study exposes its phonocentric bias (marginalizing sign languages), feature redundancy (e.g., non-unique specialization), and physical overemphasis (prioritizing transient signals over cognition). Through multimodal analysis of American Sign Language, emoji systems, and programming languages, we demonstrate Hockett’s inability to explain spatial syntactic hierarchies, viral cultural transmission, or AI-augmented generativity. The paper proposes a reconstructed modality-neutral theory centered on Multichannel Capacity (neural processing across sensory pathways), Hierarchical Combinatoriality (recursive generativity in ASL/code/emoji), Human-Machine Productivity (GPT-4’s co-creation dynamics), and Cross-Modal Translatability (Peircean semiotic resilience). Empirical validation includes ASL poetry’s visual poetics and pandemic health codes’ digital phatic communion. The model advances linguistic epistemology by reorienting evolution toward gestural protolanguage, dissolving artificial/natural language boundaries, and establishing AI accountability frameworks. Practically, it informs multimodal AI design and accessibility technologies, affirming language as a cognitive architecture for meaning-making beyond biological constraints.

References

[1] Amedi, Amir, et al. "The Occipital Cortex in the Blind: Lessons About Plasticity and Vision." Current Directions in Psychological Science, vol. 14, no. 6, 2005, pp. 306–311.

[2] Bach-y-Rita, Paul. "Sensory Substitution and the Human-Machine Interface." Trends in Cognitive Sciences, vol. 7, no. 12, 2003, pp. 541–546.

[3] Bender, Emily M., et al. "On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?" Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, 2021, pp. 610–623.

[4] Bergen, Benjamin K. Louder Than Words: The New Science of How the Mind Makes Meaning. Basic Books, 2012.

[5] Corballis, Michael C. "Left Brain, Right Brain: Facts and Fantasies." PLOS Biology, vol. 12, no. 1, 2014, e1001767.

[6] Danesi, Marcel. The Semiotics of Emoji: The Rise of Visual Language in the Age of the Internet. Bloomsbury Academic, 2017.

[7] Derrida, Jacques. Of Grammatology. Translated by Gayatri Chakravorty Spivak, Johns Hopkins University Press, 1976.

[8] Emmorey, Karen. Language, Cognition, and the Brain: Insights from Sign Language Research. Lawrence Erlbaum Associates, 2002.

[9] Evans, Vyvyan. The Emoji Code: How Smiley Faces, Love Hearts and Thumbs Up Are Changing the Way We Communicate. Picador, 2017.

[10] Floridi, Luciano. "Semantic Capital: Its Nature, Value, and Curation." Philosophy & Technology, vol. 31, no. 4, 2018, pp. 481–497.

[11] Frishberg, Nancy. "Arbitrariness and Iconicity: Historical Change in American Sign Language." Language, vol. 51, no. 3, 1975, pp. 696–719.

[12] Gallace, Alberto, and Charles Spence. In Touch with the Future: The Sense of Touch from Cognitive Neuroscience to Virtual Reality. Oxford University Press, 2014.

[13] Heller, Morton A. Touch, Representation, and Blindness. Oxford University Press, 2000.

[14] Knuth, Donald E. "Computer Programming as an Art." Communications of the ACM, vol. 17, no. 12, 1974, pp. 667–673.

[15] Kutas, Marta, and Kara D. Federmeier. "Thirty Years and Counting: Finding Meaning in the N400 Component of the Event-Related Brain Potential (ERP)." Annual Review of Psychology, vol. 62, 2011, pp. 621–647.

[16] Liddell, Scott K. Grammar, Gesture, and Meaning in American Sign Language. Cambridge University Press, 2003.

[17] MacSweeney, Mairéad, et al. "Neural Systems Underlying British Sign Language and Audio-Visual English Processing in Native Users." Brain, vol. 125, no. 7, 2002, pp. 1583–1593.

[18] Padden, Carol A. "The ASL Lexicon." Sign Language & Linguistics, vol. 1, no. 2, 1998, pp. 131–152.

[19] Sandler, Wendy. "The Phonological Organization of Sign Languages." Language and Linguistics Compass, vol. 6, no. 3, 2012, pp. 162–182.

[20] Stokoe, William C. Sign Language Structure: An Outline of the Visual Communication Systems of the American Deaf. Linstok Press, 1978.

[21] Stout, Dietrich, and Thierry Chaminade. "Stone Tools, Language and the Brain in Human Evolution." Philosophical Transactions of the Royal Society B, vol. 367, no. 1585, 2012, pp. 75–87.

[22] Sutton-Spence, Rachel. Analysing Sign Language Poetry. Palgrave Macmillan, 2005.

[23] Taub, Sarah F. Language from the Body: Iconicity and Metaphor in American Sign Language. Cambridge University Press, 2001.

[24] Tomasello, Michael. Origins of Human Communication. MIT Press, 2008.

[25] Tseronis, Assimakis. "Multimodal Argumentation: Beyond the Verbal/Visual Divide." Semiotica, vol. 228, 2019, pp. 37–59.

[26] Zhang, Wei, and Cheris Shun-ching Chan. "Health Code, Digital Surveillance and the Normalised Pandemic Routine in China." Surveillance & Society, vol. 21, no. 1, 2023, pp. 1–16.

[27] Zappavigna, Michele. "Ambient Affiliation and #bonding_icon." Social Semiotics, vol. 31, no. 2, 2021, pp. 157–174.

Published

2025-07-03

How to Cite

Toward a Modality-Neutral Theory: Reconstructing Hockett’s Design Features of Human Language Through Multimodal Critique. (2025). Literature, Language and Cultural Studies, 2(1), 17–31. https://doi.org/10.63313/LLCS.9056