My Recommendations to Learn Machine Learning in Production

For the last couple of months, I have been doing some research on machine learning (ML) in production, and I have shared a few resources about the topic on Twitter, ranging from courses to books. When it comes to ML in production, I have found some of the best content in books, repositories, and a few courses. Here are my recommendations for learning machine learning in production. This is not an exhaustive list, but I have carefully curated […]

Read more

Course Recommendations for Introductory Machine Learning

Before you jump into deep learning, I would strongly advise you to take a few introductory machine learning courses to get up to speed with fundamental concepts like clustering, regression, and evaluation metrics. Here is a thread with a few recent courses you can explore (this is a crosspost of a Twitter thread I published earlier this week): Elements of AI by the University of Helsinki. Note: I have taken many machine learning courses online. I do some courses for fun […]

Read more

My Recommendations for Getting Started with NLP

I have been studying natural language processing (NLP) since 2013, back when manual feature engineering was very popular in the world of machine learning. We have come a long way since then. For my Ph.D., I specialized in information retrieval and machine learning techniques, particularly how they apply to social computing and computational linguistics, while also developing approaches for efficient information extraction from large-scale text-based data. I am fortunate to have experience with classical machine […]

Read more

Learn About Transformers: A Recipe

Transformers have accelerated the development of new techniques and models for natural language processing (NLP) tasks. While they have mostly been used for NLP, they are now seeing heavy adoption for computer vision tasks as well. That makes the transformer a very important architecture to understand and be able to apply. I am aware that a lot of machine learning and NLP students and practitioners are keen on learning about transformers. Therefore, I am motivated to prepare and maintain a recipe […]
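As a quick taste of what the recipe builds toward, here is a minimal sketch of the scaled dot-product self-attention at the heart of the transformer. The toy dimensions and random weights are illustrative assumptions, not part of the recipe itself:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (n, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])         # (n, n) pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # attention-weighted values

# Toy usage with illustrative dimensions
rng = np.random.default_rng(0)
n, d = 5, 8                                         # sequence length, model dim
X = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)          # -> (5, 8)
```

Note that the (n, n) score matrix makes this quadratic in the sequence length, which is exactly the bottleneck that efficient-attention variants try to remove.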

Read more

10 Must-Read ML Blog Posts

I have been doing NLP/ML research for the last 6 years and have come across a lot of machine learning resources and papers. Today, I kept thinking about the machine learning, NLP, and deep learning blog posts (not papers) that have been transformational for me. In this blog post, I provide a short collection of a few high-impact blog posts that come to mind. This post was originally a Twitter thread. 1) The Unreasonable Effectiveness of Recurrent Neural […]

Read more

Python AI: How to Build a Neural Network & Make Predictions

If you’re just starting out in the artificial intelligence (AI) world, then Python is a great language to learn since most of the tools are built using it. Deep learning is a technique used to make predictions using data, and it heavily relies on neural networks. Today, you’ll learn how to build a neural network from scratch. In a production setting, you would use a deep learning framework like TensorFlow or PyTorch instead of building your own neural network. That […]
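To give a flavor of the from-scratch approach, here is a minimal sketch of a single-neuron network trained with gradient descent on toy data. The dataset, learning rate, and epoch count are illustrative assumptions, not the article's exact code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy, linearly separable data (an illustrative assumption)
rng = np.random.default_rng(42)
X = rng.standard_normal((100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w, b, lr = rng.standard_normal(2), 0.0, 0.5

for _ in range(1000):
    pred = sigmoid(X @ w + b)        # forward pass
    grad = pred - y                  # dLoss/dlogits for binary cross-entropy
    w -= lr * (X.T @ grad) / len(X)  # gradient descent step on the weights
    b -= lr * grad.mean()            # ... and on the bias

accuracy = ((sigmoid(X @ w + b) > 0.5) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Everything a framework does for you (automatic differentiation, layers, optimizers) is hidden in those few lines, which is why building it once by hand is so instructive.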

Read more

How to Convert DOCX to HTML With Python Mammoth

At some point in your software development path, you’ll have to convert files from one format to another. DOCX (used by Microsoft Word) is a pretty common file format, and sometimes we’d like to convert Word documents into HTML. This can easily be achieved with the Mammoth package: an easy, efficient, and fast library for converting DOCX files to HTML. In this article, we’ll learn how to use Mammoth in […]
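The basic conversion really is only a few lines. Here is a minimal sketch using Mammoth's convert_to_html function; the file names are illustrative assumptions:

```python
import mammoth  # pip install mammoth

# Convert a local .docx file to an HTML fragment
# ("document.docx" is an illustrative file name).
with open("document.docx", "rb") as docx_file:
    result = mammoth.convert_to_html(docx_file)

html = result.value          # the generated HTML string
for message in result.messages:
    print(message)           # any warnings produced during conversion

with open("document.html", "w", encoding="utf-8") as html_file:
    html_file.write(html)
```

Mammoth produces an HTML fragment rather than a full page, so you may want to wrap the output in your own template.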

Read more

Python Community Interview With Ewa Jodlowska

Today I’m joined by Ewa Jodlowska, executive director of the Python Software Foundation (PSF), the organization devoted to advancing open source technology related to the Python programming language. In this interview, we discuss how Ewa started her tech journey, how COVID-19 affected the PSF, plans for PyCon US 2021, her love of hiking and lifting weights, and much more. Ricky: Thank you for joining me, Ewa. You’ve been at the PSF for over nine years at this point, first as […]

Read more

Machine Translation Weekly 71: Explaining Random Feature Attention

Transformers are the neural architecture that underlies most current state-of-the-art machine translation, and natural language processing in general. One of their major drawbacks is the quadratic complexity of the underlying self-attention mechanism, which in practice limits the sequence length that transformers can process. There already exist some tricks to deal with that. One of them is locality-sensitive hashing, which was used in the Reformer architecture (see MT Weekly 27). The main idea was computing the […]
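To make the complexity argument concrete, here is a rough numpy sketch of the random-feature idea: softmax attention must materialize an (n, n) matrix, but approximating exp(q·k) with random Fourier features lets the keys and values be summarized once, bringing the cost down to linear in the sequence length. The feature map and dimensions below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def feature_map(x, W):
    """Random features phi(x) such that phi(q) . phi(k) ~= exp(q . k).
    Uses the random Fourier feature approximation of the Gaussian kernel;
    the exp(||x||^2 / 2) factor can overflow for large-normed inputs, so
    this sketch assumes queries and keys are kept small."""
    proj = x @ W.T                                       # (n, D) projections
    scale = np.exp((x ** 2).sum(-1, keepdims=True) / 2)
    return scale * np.concatenate(
        [np.sin(proj), np.cos(proj)], axis=-1) / np.sqrt(W.shape[0])

def rfa_attention(Q, K, V, num_features=256, seed=0):
    """Linear-time approximation of softmax attention."""
    W = np.random.default_rng(seed).standard_normal((num_features, Q.shape[-1]))
    phi_q, phi_k = feature_map(Q, W), feature_map(K, W)
    kv = phi_k.T @ V                      # (2D, d) key-value summary, built once
    norm = phi_q @ phi_k.sum(axis=0)      # softmax denominator per query
    return (phi_q @ kv) / norm[:, None]   # no (n, n) matrix anywhere
```

Because the key-value summary has a fixed size independent of n, memory and compute grow linearly with the sequence length instead of quadratically.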

Read more