I Am Paying Attention to My Original Work...I think
- Ramya Namuduri
- Feb 2, 2021
- 2 min read

It is undoubtedly important to pay attention to one’s own projects born out of one’s own initiative as results of one’s own interests. This translates to: I should dedicate time, effort and passion towards my Original Work. How did I come to this conclusion? I paid more attention to words like “own”, “project” and “attention” than to other words such as “to” and “is”. Actually, I am not a machine, so this comes to me naturally rather than as a process broken into clearly defined steps. If I were a machine, though, how would I ever understand complex human languages when I think in zeros and ones?
This week, I did in fact pay more attention to my Original Work: preprocessing and cleaning my text data and creating word embeddings - matrices that represent text with numbers. This also included replacing contractions with their expanded forms, because I am not sure whether “can’t” might otherwise be mistaken for “can”, which would change the meaning...right?
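To keep myself honest, here is a rough sketch of what I think that step looks like in code. The contraction list and the tiny random embedding lookup are placeholders I made up for illustration, not my actual pipeline (real embeddings are learned, but the shapes are the same).

```python
import re
import numpy as np

# A small illustrative map of contractions to their expanded forms.
CONTRACTIONS = {
    "can't": "cannot",
    "won't": "will not",
    "i'm": "i am",
    "it's": "it is",
}

def expand_contractions(text: str) -> str:
    """Replace each contraction with its expanded form."""
    pattern = re.compile("|".join(re.escape(c) for c in CONTRACTIONS), re.IGNORECASE)
    return pattern.sub(lambda m: CONTRACTIONS[m.group(0).lower()], text)

def embed(tokens, dim=8, seed=0):
    """Toy word embeddings: one random vector per unique token.
    Real embeddings are trained, but the result has the same shape:
    one row of `dim` numbers for each token."""
    rng = np.random.default_rng(seed)
    vocab = {tok: i for i, tok in enumerate(sorted(set(tokens)))}
    table = rng.normal(size=(len(vocab), dim))
    return np.stack([table[vocab[t]] for t in tokens])

text = expand_contractions("I can't stop paying attention")
tokens = text.lower().split()
X = embed(tokens)      # shape: (number of tokens, embedding dimension)
print(tokens)          # ['i', 'cannot', 'stop', 'paying', 'attention']
print(X.shape)         # (5, 8)
```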
To make tangible sense out of anything, we need context. Context can be explicitly stated or implied. It can be found within the sentence we are currently reading, in the ones before, or sometimes even in the ones after. That last case is difficult and a little cruel, forcing us to read on before we can fully understand. I often struggle with this myself, especially on reading comprehension tests, so I found it intriguing that there is a statistical way of describing the same process using probability, with results that are fascinatingly accurate. Through my research for my Original Work project, I came across several references to the research paper “Attention Is All You Need”, but I had never read it myself. This week, I finally read it as additional material on transformers and the wild world of encoders and decoders.
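The statistical idea, as I currently understand it, is surprisingly compact: each surrounding word gets a relevance score, and a softmax turns those scores into probabilities that sum to one, so “how much context does each word contribute” becomes a set of weights. A tiny sketch with scores I made up:

```python
import numpy as np

def softmax(scores):
    """Turn raw relevance scores into probabilities that sum to 1."""
    exp = np.exp(scores - np.max(scores))   # subtract the max for numerical stability
    return exp / exp.sum()

# Made-up relevance scores of four context words for the word being read.
scores = np.array([2.0, 0.5, 0.1, 1.2])
weights = softmax(scores)
print(weights)        # roughly [0.55 0.12 0.08 0.25] -- higher score, more "attention"
print(weights.sum())  # 1.0
```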
Coming back to the practical side, where I actually use this information for my Original Work project, I have reached the word embedding stage. Next, I need to create the Query, Key and Value matrices by multiplying the word embedding matrix by the respective weight matrices. However, I do not yet know what these weights are, and that is my goal for this week: to understand how they are assigned, what they are trained on, and whether they truly are not arbitrary (most likely they are not).
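For my own reference (and with the caveat that I am still figuring this out), here is a rough sketch of that step. The three weight matrices below are random placeholders, and the sizes (8-dimensional embeddings, 4-dimensional queries and keys) are numbers I picked just to show the shapes; in a real model the weights are learned parameters, updated during training, which is exactly the part I want to understand next.

```python
import numpy as np

rng = np.random.default_rng(42)

seq_len, d_model, d_k = 5, 8, 4           # tokens, embedding size, query/key size
X = rng.normal(size=(seq_len, d_model))   # word embeddings from the previous step

# The "weights": three projection matrices. Here they are random, but in a
# trained model they are learned from data, so they end up far from arbitrary.
W_Q = rng.normal(size=(d_model, d_k))
W_K = rng.normal(size=(d_model, d_k))
W_V = rng.normal(size=(d_model, d_k))

Q = X @ W_Q   # queries
K = X @ W_K   # keys
V = X @ W_V   # values

# Scaled dot-product attention from "Attention Is All You Need":
# softmax(Q K^T / sqrt(d_k)) V
scores = Q @ K.T / np.sqrt(d_k)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights = weights / weights.sum(axis=-1, keepdims=True)
output = weights @ V

print(Q.shape, K.shape, V.shape)   # (5, 4) each
print(output.shape)                # (5, 4): one context-aware vector per token
```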
Attention truly is all I need. Paying attention to small details, paying attention to questions, paying attention to contractions, paying attention to my project, which itself involves Attention. I am paying attention to my Original Work; however, I am not completely certain how to do it...I think.
