Category: Python
Python tutorials
Easy Papers: GlossBERT (Making sense of Word Sense Disambiguation)
Word Sense Disambiguation is the task of identifying the meaning of a focus/target word in the context of its sentence. Paper: https://arxiv.org/pdf/1908.07245.pdf
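GlossBERT frames WSD as sentence-pair classification over WordNet glosses; as a quick illustration of the task itself (not the paper's method), here is a minimal sketch using NLTK's classic Lesk baseline. The example sentence is made up.

```python
# Minimal WSD illustration using NLTK's classic Lesk baseline (not GlossBERT).
# Requires: pip install nltk, plus a one-time download of the WordNet corpus.
import nltk
from nltk.wsd import lesk

nltk.download("wordnet", quiet=True)

sentence = "I went to the bank to deposit my paycheck"
tokens = sentence.lower().split()

# Pick the WordNet sense of the target word "bank" that best overlaps the context.
sense = lesk(tokens, "bank")
print(sense, "-", sense.definition() if sense else "no sense found")
```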
You wouldn’t believe what AI can do today
Python News: What’s New From September 2022
In September 2022, the Python 3.11.0rc2 release candidate version became available for you to test and stay on top of Python’s latest features. This release is the last preview version before the final release of Python 3.11.0, which is scheduled for October 24, 2022. Python’s latest bugfix versions, including 3.10.7, have introduced breaking changes to cope with a security vulnerability that affects the str to int conversion and can leave you open to DoS attacks. As usual, the Python ecosystem […]
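For context, the breaking change mentioned above is a new default limit (roughly 4300 digits) on str-to-int conversion. The short sketch below, assuming Python 3.10.7+ or a 3.11 release candidate, shows how it surfaces and how the limit can be adjusted with sys.set_int_max_str_digits.

```python
# Sketch of the new int/str conversion limit (Python 3.10.7+ / 3.11 release candidates).
# Very long numeric strings now raise ValueError by default (limit is ~4300 digits).
import sys

huge = "9" * 10_000  # a 10,000-digit numeric string

try:
    int(huge)
except ValueError as exc:
    print("Conversion blocked:", exc)

# The limit is adjustable (or can be disabled with 0) if you trust the input.
sys.set_int_max_str_digits(20_000)
print(len(str(int(huge))))  # now converts fine -> 10000
```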
Building an on-device Recommendation system
This article presents an adaptive framework for training and serving an on-device recommendation model. The approach personalizes recommendations by leveraging on-device data and protects user privacy by never letting user data leave the device.
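The article's exact framework is not reproduced here, but one common way to serve such a model on-device is TensorFlow Lite. The sketch below is an assumption-laden illustration: the model file name (recommendation.tflite), its input (a padded sequence of recent item IDs), and its output (per-item scores) are hypothetical placeholders.

```python
# Hypothetical on-device inference sketch with TensorFlow Lite.
# Assumes a converted model file "recommendation.tflite" whose input is a
# fixed-length sequence of recent item IDs and whose output is per-item scores.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="recommendation.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Recent on-device activity (item IDs), padded to the model's expected length (assumed dtype/shape).
recent_items = np.array([[42, 7, 381, 0, 0, 0, 0, 0, 0, 0]], dtype=np.int32)

interpreter.set_tensor(input_details[0]["index"], recent_items)
interpreter.invoke()
scores = interpreter.get_tensor(output_details[0]["index"])

# Top-5 recommended item IDs, computed locally so the activity history never leaves the device.
print(np.argsort(-scores[0])[:5])
```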
Python — Basic to Advanced-Day1
Congratulations on starting your journey toward Python, Data Science, and AI/ML! Python is the essential first skill for any Data Science or AI/ML project, so let's make this journey a rewarding one together through learning and plenty of practice. Wishing you all the best. Let's begin!
Beginner’s guide to Twitter sentiment analysis with deep learning
The goal of this article (which happens to be my first Medium article 😊) is to share what I have learned while comparing the performance of three different deep learning models on the well-known NLP sentiment analysis task. We will be asking our models to predict positive (1) […]
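As a point of reference only (the three models compared in the article are not named in this excerpt), a minimal Keras binary sentiment classifier might look like the sketch below; the vocabulary size and sequence length are placeholder assumptions.

```python
# Minimal binary sentiment classifier sketch (positive=1, negative=0) in Keras.
# Not one of the article's three models; vocab size and sequence length are placeholders.
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE = 20_000   # assumed tokenizer vocabulary size
MAX_LEN = 64          # assumed padded tweet length

model = tf.keras.Sequential([
    layers.Embedding(VOCAB_SIZE, 64),
    layers.Bidirectional(layers.LSTM(32)),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # probability of positive sentiment (1)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.build(input_shape=(None, MAX_LEN))
model.summary()
# model.fit(padded_token_ids, labels, validation_split=0.1, epochs=3)  # with your own data
```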
Notes on Andrej Karpathy’s makemore videos. Part 2.
Below are my notes on Andrej Karpathy’s introductory video tutorial on language modeling. You can watch Andrej’s original presentation on YouTube. In Part 1, we worked on a bigram model that takes into account only the local context. This approach is […]
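For a concrete picture of that Part 1 starting point, here is a small count-based bigram sketch; the word list is a tiny placeholder standing in for the names dataset used in the videos.

```python
# Count-based character bigram model in the spirit of makemore Part 1.
# The word list below is a tiny placeholder for the names dataset used in the videos.
import random
from collections import defaultdict

words = ["emma", "olivia", "ava", "isabella", "sophia"]

# Count character bigrams, using "." as a start/end token.
counts = defaultdict(lambda: defaultdict(int))
for w in words:
    chars = ["."] + list(w) + ["."]
    for a, b in zip(chars, chars[1:]):
        counts[a][b] += 1

def sample_name():
    """Sample one name by walking the bigram distribution character by character."""
    out, ch = [], "."
    while True:
        next_chars = list(counts[ch].keys())
        weights = list(counts[ch].values())
        ch = random.choices(next_chars, weights=weights)[0]
        if ch == ".":
            return "".join(out)
        out.append(ch)

print([sample_name() for _ in range(5)])
```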
Similarity to Probability — Part I: Visual Word Embedding for OCR Post Correction
In this post, I will revisit in more detail our previous work that uses human-inspired likelihood revision, or similarity to probability [Blok et al. 2003], to re-rank or score any word or text fragment based on its semantic relation to an external context. We will use popular pre-trained semantic-similarity models (e.g., w2v, GloVe, fastText) to compute these relations.
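As a rough sketch of the building block involved, computing semantic relatedness between a candidate word and its surrounding context with pre-trained vectors might look like this; the GloVe model name and example words are illustrative choices, not taken from the post.

```python
# Sketch: word-to-context semantic relatedness with pre-trained GloVe vectors via gensim.
# Model name and example words are illustrative; the post may use w2v or fastText instead.
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")  # small pre-trained GloVe model (~66 MB download)

# Relatedness between two candidate readings and the surrounding context words,
# e.g. to re-rank OCR hypotheses given a financial context.
context = ["money", "deposit", "account"]
for candidate in ["bank", "music"]:
    score = vectors.n_similarity([candidate], context)  # cosine of mean vectors
    print(candidate, round(float(score), 3))
```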