Would recommend, but not for the faint of heart. Perhaps the most difficult class in the cs department. I took this during covid so I had the time to devote to this class to get a decent grade, but under normal circumstances this class would probably kill your social life. Gives you good fundamentals for attacking deep learning, although arguably a lot of the content is useless if deep learning is what you want to get into. Also, for the class competition everyone uses dl techniques, so be prepared for that. The homework is hard but very doable if you have the math background; you just have to understand what the problem is asking. Verma is a good lecturer but not very approachable imo. This class requires high skill in math and programming. If you don't have that, reconsider. If you do, get ready to lose your social life, although if you are good at math and programming you probably didn't have one to begin with.
Overall, I wouldn’t recommend unless you have a deep math and stats background. Prof Verma is a great lecturer and explains concepts very clearly. However, the exams and homeworks are so incredibly difficult and way beyond the scope of what is covered in lecture. The homeworks take exorbitant amounts of time and are nearly impossible to complete without attending office hours regularly.
I came in with both a stats and a CS background, so I don't think this course's difficulty for me resulted from a lack of preparation. That said, I found the difficulty of this class to often be absurd, and that's coming from a student with above a 4.0. The problem sets are interesting, but I found them to be so difficult that doing well on them was mostly a matter of having time to go to office hours for the solutions. Moreover, they are extremely time consuming, and I would argue excessively so for a 3-credit class. The midterm was not too bad, but Verma corrected that mistake with the final, which was quite literally the hardest exam I've ever taken in my life. With that said, the class is curved, so on net I guess that doesn't matter much. To end on a good note, lectures are good and I feel I learned a lot, even with some prior background in ML. Verma is a very good instructor when it comes to lecturing and explaining proofs and concepts. The only knock I have on the lecture component of the class is that we didn't get to HMMs, which were the part of the class I was most looking forward to.
Do not expect any empathy or understanding from Prof. Verma if you make even a small mistake--I got a 0 on a homework worth 10% of my final grade because I misread something and tried to turn it in literally 10 minutes late. The homeworks take forever to do, and the exams are pretty challenging. That being said, he is a good lecturer. However, the class is not very applied--we go through a lot of proofs of random things and code algorithms from scratch, so if you're looking for an applied class, do not take this one.
I don't see any reviews for how this class was virtually, so I'll go ahead and throw one out there. In short, depending on your math background, I'd say this class could range from pretty difficult to regularly feeling impossible. The tests are pretty hard (sub-50 medians), and the exam-caliber practice material is pretty limited. At best, you might be able to find old exams from other schools like CMU or UC Berkeley, which are good for seeing if you can apply what you learned, but I found Verma's exams to invariably be harder and impossible to thoroughly finish. If you've taken multivariable calc, linear algebra, and probability you'll be able to comprehend the symbols on the homework (which is often an important first step in solving the problem!), but beyond that, I think having taken some classes in real analysis and convex optimization would make the intuition much easier. The TAs were good and helped make the problem sets doable. The prerecorded lectures are nice, and Verma seemed helpful in his flipped-classroom office hour sessions. The difficulty of the online format is especially compounded by the fact that no group work is allowed on the homework, so you can't just do 2/5 problems and have your group mates do the rest. It seems the difficulty of the problems was scaled to compensate, but the class is inherently hard and there's only so much adjustment that can be made without compromising the integrity of the education. Another brutal aspect of this course is that the gap between the final and the last PSET is small because the exam happens on the last day of class, so there's no reading week to study for it. For reference, during the fall A term, the PSET was due on Monday and the exam window (there was a 48-hour period during which you could sit and submit the 2-hour exam) opened on Thursday. This meant you had most of 3 days to really dedicate to studying for the CUMULATIVE final without worrying about other stuff in the class.
Good luck if you have a bunch of other stuff going on in those 3 days. The programming can handily be completed in python and isn't a real focal point of the course; it's more focused on the theory, and you receive basically 0 help on any debugging or algorithmic design. If you don't know python, don't use this course as a way to try to learn it from scratch. If you know some basic python and are good at googling how to solve your problems, you'll be ok. Despite how hard this class is and how much past experience with the subject matter helps, I'd still recommend it to anyone willing to try really hard, because it's one of the few classes I've taken where I feel like I really learned something. I can now read papers on machine learning and have some remote idea of what's going on with the design, and I feel like I can use many existing tools with intent and knowledge of the advantages and drawbacks of each. While the class is curved in the end, don't take it if you're looking for an easy A. I think for the average undergraduate student who's taken the bare minimum prereqs listed on the course website, an A- and maybe even a B+ are great achievements here.
This is the best course I have taken among all courses from my undergraduate and graduate schools. This course is epic. It covers the foundations of machine learning. You will learn different machine learning algorithms from the ground up, supported by math and proofs. I had a weak math background before taking this class: only intermediate-level linear algebra and calculus, and intro-level probability, and I hadn't done math problems in three years. Before taking this course, I had also completed Andrew Ng's online course. Before enrolling, I thought I might struggle in this class, but in fact I survived well and really enjoyed my experience. First of all, this is not a relaxing course. It is challenging, but it is not a hard course! What I mean is that the homework and tests are challenging, but the course is amazingly organized and Verma is an extremely capable instructor. Even though this is a challenging course, I felt safe and comfortable sitting in class, even with a weak math background, because of Verma's teaching. Verma is very patient in teaching his topics, and he cares about his students learning with quality rather than quantity. He is a strict yet kind instructor. I could feel that he ardently cares about his class and his students. His teaching is perfect. I could understand everything taught in his class even with my weak math background. The pace of the class is slow and nobody would feel lost. The homeworks are challenging but interesting to solve. If you spend time on them you can always find solutions. TAs are also helpful on Piazza and in OH. The tests are also challenging. However, grades are generously scaled, and you might get an A- with 60% on the tests. I learned so many things from this course, and it helped me build a fundamental understanding of machine learning problems. If you really want to work in ML-related careers or research, this is the recommended intro-level course.
If you're hesitating over whether this course is too hard, I would say be brave and enroll. You will be fine (assuming you can get 70% right on homework 0, which is the precondition for enrolling in this class; 70% is not a hard target). Columbia University's tuition is expensive. If you want to spend your money in the right place, this is the course you should take.
I like Professor Verma so much that I decided to write a review: note that I'm biased. Prerequisites: usually linear algebra and prob-stats. But I personally wish I had learned more math, especially optimization-related math. Although you'll be fine without it, with that knowledge you'll definitely get more out of the class. Homework: up to three students in a group, usually due 2~3 weeks after being posted. Pick your teammates cautiously, because if you don't, you'll be the one doing most of the work, which means a lot of hours on homework. I usually went to at least 6 hours of OH every week and read/posted on Piazza to figure out the homework. But the homework problems are all interesting! And the professor and TAs are very helpful and responsive, both in person and on Piazza. Fortunately, we only had four graded problem sets, and the last one was optional. And no homework was due during spring break, which is nice. Exams: Professor Verma made it clear that you can usually pick some questions to finish; this is true. Given the time constraint of 75~80 minutes and 6 problems in total, I could usually only finish 4~5 problems, and not all of what I wrote would be correct. But there's a curve, so don't sweat too much. It's supposed to be hard! This class teaches various topics in machine learning, and I feel I actually know stuff about ML now. One thing I found very helpful: the additional notes he linked on his website for each topic covered. After lectures, I sometimes still had questions about the topics; if I read those notes, most of my questions got resolved. Make good use of them! Go to his OH and you'll learn a lot too! He is very approachable. TAs' recitations are also helpful. He's teaching a topics class on unsupervised learning next semester (during the summer too). Take his class if you can.
Prof Verma's Machine Learning class is a confusing mixture of being not technical enough and over-technical at the same time. He assumes knowledge of some advanced multivariate calculus, multivariate probability, and linear algebra without going through the techniques explicitly or in any systematic way, while the prerequisite classes cover nothing close to what he requires. This could be partly due to the fact that half the class are grad students, with the overall result that undergrads feel very left behind. At the same time, Prof Verma introduces the results of some advanced derivations, skipping over the math in class entirely with the reason that "it is too easy/simple/boring," but expects you to reproduce it on a test. He focuses on teaching the "intuition" behind the proofs, yet fails to understand that "intuition" comes from understanding, which is impossible for most students unless he goes deeper into the math. Another huge blind spot for the course is that Prof Verma does not give out solutions for his practice exams. This could seem forgivable, except that the homework questions are nothing like exam questions (problem sets are more like mini-projects while exam questions come out of nowhere), so students seeing questions of this type for the first time are forced to study without any guide or solutions. This, coupled with the reasons mentioned above, results in overall confusion before the exams. (In addition, no one does well on the actual exams.) The lack of solutions handed out caused some Piazza savagery in my section, which I feel was actually quite justified. I hope Prof Verma sees this and realizes that no learning happens if the correct answer is never given. Perhaps he believes in unsupervised learning, but its results can be unboundedly far from the global optimum.
As an example of how ridiculous the technical expectations for this class are: during class, Prof Verma regularly mentions Convex Optimization, some material from which is relevant to the class. I looked up Convex Optimization and it is E6616, a 6000-level course. If you're not prepared to deal with some 6000-level material that will not be explicitly taught to you, you may not want to take this Machine Learning course. I personally feel it is a shame, because the material is super interesting and relevant and could have been much better taught. A suggestion: more depth and less breadth could improve the class significantly. Overall, this class seems to reward prior knowledge rather than diligence over the course, which is something I have an issue with.
Professor Verma's ML section was definitely worthwhile. He's a clear lecturer and gave homework that really helped further develop the concepts we learned in class. But be warned that the homework was difficult and time consuming, though this is somewhat mitigated by the fact that he allows students to form groups of 3 to work on homework together. Tests were a mix of very easy questions and a couple of rather hard questions. Some of the more in-depth math was fortunately not tested on exams, but there were some difficult theoretical problems. Overall, particularly with the curve, the tests were definitely manageable. Personality-wise, Prof. Verma is great. He cracks jokes during lecture and he really cares about his students. His OH often seemed to extend past the allotted time so that he could continue to help students. His niceness was a bit of a drawback at times, in that some students would occasionally eat up a lot of lecture asking basic or unnecessarily involved questions and he wouldn't cut them off, but he got better at handling this as the semester progressed. Overall, this was a good course that'll teach you ML basics, but it will probably be one of the heavier-workload courses you'll take in a semester.