What's even more impressive is his GPA: 3.98/4.00 overall GPA, 4.00/4.00 CS GPA. Presumably he got a B in one GenEd course and A's in everything else.
His icons couldn't be copied, so I have replaced them with: = transformative, personally enjoyed
—————————————————————————————————————————————
Fall 2022
-
15-859 CC Algorithms for Big Data, David Woodruff
Woodruff is one of the giants in sketching and numerical linear algebra, having developed many of its most important algorithms. There is even a SciPy function implementing the Clarkson-Woodruff transform that is named after him.
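For a taste, here is a minimal sketch (my own, not from the course) of sketched least squares using SciPy's clarkson_woodruff_transform:

```python
# A minimal sketch of sketched least squares using SciPy's
# scipy.linalg.clarkson_woodruff_transform (my own illustration).
import numpy as np
from scipy.linalg import clarkson_woodruff_transform

rng = np.random.default_rng(0)
A = rng.standard_normal((10_000, 50))   # tall, skinny matrix
b = rng.standard_normal(10_000)

# Sketch [A | b] down to far fewer rows; sketch_size is a tunable parameter.
Ab = np.hstack([A, b[:, None]])
sketch = clarkson_woodruff_transform(Ab, sketch_size=500, seed=0)
A_s, b_s = sketch[:, :-1], sketch[:, -1]

# Solving the much smaller sketched problem approximates the full solution.
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sketch, *_ = np.linalg.lstsq(A_s, b_s, rcond=None)
print(np.linalg.norm(x_full - x_sketch))  # small
```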
His teaching is extremely clear, as he makes sure to justify and explain every step used in a proof. The analysis of many sketching algorithms is highly non-trivial, but Woodruff manages to explain it in a way that reads like a storybook. He cares deeply about the class and the students' learning, and one thing that still amazes me to this day is how he would consistently respond to my Piazza questions within two minutes, even on weekends. I even made a meme about it.
The homework problems are long but rewarding, and you will become intimately familiar with all sorts of linear algebra manipulations and properties.
One caveat is that the weekly lectures are three hours long with a 10-minute break in the middle. Given how dense the lectures are, this can be quite taxing, so bring snacks or caffeine if needed.
-
15-859 OO Randomness in Computation, Pravesh Kothari
I had limited exposure to most of the topics in this course (mostly from Theorist's Toolkit 15-751, Graduate Complexity Theory 15-855, and Graph Theory 21-484), such as spectral graph theory, expander graphs, derandomization, etc., and this course helped to solidify and reinforce my understanding. It also proved many results that those earlier classes did not have time to prove. Overall I felt that Pravesh is a great lecturer, and the topics covered are very interesting and applicable. The course was offered for the first time this semester, so there were a few rough edges (e.g., in the proof of Cheeger's inequality he initially did not want to state it in terms of the Laplacian of the graph, to avoid introducing new concepts and notation, but doing so ended up being more confusing than helpful), but overall it is quite a good class. The homework problems are reasonable and the workload is on the lighter side.
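As a small taste of the spectral graph theory material, here is a toy sketch (my own illustration, not course code) of the quantity that Cheeger's inequality controls: the second-smallest eigenvalue of the normalized Laplacian, which is tiny for a poor expander (a cycle) and large for an excellent one (a complete graph):

```python
# Illustration (my own): the second-smallest eigenvalue of the normalized
# Laplacian L = I - D^{-1/2} A D^{-1/2} governs expansion via Cheeger's
# inequality: lambda_2 / 2 <= phi(G) <= sqrt(2 * lambda_2).
import numpy as np

def lambda2(A):
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(A)) - D_inv_sqrt @ A @ D_inv_sqrt
    return np.sort(np.linalg.eigvalsh(L))[1]

n = 50
# Cycle graph: poor expander, tiny spectral gap.
cycle = np.zeros((n, n))
for i in range(n):
    cycle[i, (i + 1) % n] = cycle[(i + 1) % n, i] = 1

# Complete graph: excellent expander, large spectral gap.
complete = np.ones((n, n)) - np.eye(n)

print(lambda2(cycle), lambda2(complete))  # tiny vs. n/(n-1), about 1
```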
-
10-708 Probabilistic Graphical Models, Andrej Risteski
This class has a reputation for being one of the hardest ML classes, but I think it is actually an excellent class that is very well taught, so I hope this reputation does not discourage people interested in the content from taking it. The class can be divided into three modules: representation, inference, and learning. In the representation module, you will learn how joint distributions over several variables can be represented efficiently by various models, taking into account factors such as causal relationships. In the inference module, you will learn that sampling from such models is very hard in general (assuming P ≠ NP), and develop probabilistic ways of sampling from them, such as Markov chain Monte Carlo (MCMC) and variational inference methods. In the final module, you will learn how such models can be fitted to training data. Graphical models form the backbone of many modern machine learning techniques like generative adversarial networks (GANs) and diffusion models, and Andrej teaches all of these topics rigorously, providing a solid mathematical understanding of how they work, which is essential for keeping up to date with the state of the art in this field.
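To make the inference module concrete, here is a toy sketch (my own, not course material) of MCMC: a Gibbs sampler for a two-variable binary pairwise model p(x1, x2) ∝ exp(J·x1·x2):

```python
# Toy MCMC illustration (my own, not from the course): Gibbs sampling from
# p(x1, x2) ∝ exp(J * x1 * x2) with x_i in {-1, +1}.
import numpy as np

rng = np.random.default_rng(0)
J = 1.0
x = np.array([1, 1])
samples = []
for _ in range(10_000):
    for i in range(2):
        # Exact conditional: p(x_i = +1 | x_other) = sigmoid(2 * J * x_other)
        other = x[1 - i]
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * J * other))
        x[i] = 1 if rng.random() < p_plus else -1
    samples.append(x.copy())

samples = np.array(samples[1000:])  # discard burn-in
# Empirical P(x1 == x2) should approach e^J / (e^J + e^-J) ≈ 0.88 for J = 1.
print((samples[:, 0] == samples[:, 1]).mean())
```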
-
15-784 Foundations of Cooperative AI, Vincent Conitzer, Caspar Oesterheld, Tuomas Sandholm
This course covered many topics in AI that are typically not covered in a machine learning course, such as normal- and extensive-form games, various forms of equilibria in games, solving for such equilibria, learning in games (regret matching), decision theories, and mechanism design. In fact, a lot of it comes from economic theory.
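As a concrete taste of learning in games, here is a toy sketch (my own illustration, not course code) of regret matching in rock-paper-scissors, where the time-averaged strategies converge to the uniform Nash equilibrium:

```python
# Hedged sketch (my own): regret matching in rock-paper-scissors self-play.
import numpy as np

# Payoff of a row action against a column action, actions = (R, P, S);
# RPS is symmetric, so both players score via payoff[own_action, opp_action].
payoff = np.array([[ 0, -1,  1],
                   [ 1,  0, -1],
                   [-1,  1,  0]])

def current_strategy(regrets):
    positive = np.maximum(regrets, 0.0)
    return positive / positive.sum() if positive.sum() > 0 else np.ones(3) / 3

rng = np.random.default_rng(0)
regrets = [np.zeros(3), np.zeros(3)]
strategy_sums = [np.zeros(3), np.zeros(3)]
for _ in range(100_000):
    strategies = [current_strategy(r) for r in regrets]
    actions = [rng.choice(3, p=s) for s in strategies]
    for p in range(2):
        strategy_sums[p] += strategies[p]
        own, opp = actions[p], actions[1 - p]
        # Regret: what each action would have earned minus what was earned.
        regrets[p] += payoff[:, opp] - payoff[own, opp]

print(strategy_sums[0] / strategy_sums[0].sum())  # ≈ [1/3, 1/3, 1/3]
```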
The content was interesting, but I did not enjoy this course as much as I would have liked because the content was presented in a relatively hand-wavy way, trading depth for breadth. That said, the course was being offered for the first time, so it will probably improve in subsequent iterations.
-
10-617 Intermediate Deep Learning, Ruslan Salakhutdinov
Ruslan is one of the household names in the machine learning community (he co-invented the Dropout technique for preventing overfitting, which is now standard in neural network architectures), and I was very excited to be able to take this class with such a legend in the field. I really enjoyed his lectures, and he made many remarks about what was happening in the field when various techniques were being introduced, which really gives you a sense of how the field has evolved over the last few decades from a man who has seen and been through it all.
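For the curious, here is a minimal sketch (my own, in plain NumPy; framework implementations differ in the details) of the inverted-dropout variant that is standard today:

```python
# Minimal inverted dropout (my own sketch, not course code).
import numpy as np

def dropout(x, p_drop, rng, training=True):
    """Randomly zero activations with probability p_drop during training,
    scaling survivors by 1/(1 - p_drop) so expected activations match."""
    if not training or p_drop == 0.0:
        return x
    mask = rng.random(x.shape) >= p_drop
    return x * mask / (1.0 - p_drop)

rng = np.random.default_rng(0)
h = rng.standard_normal(8)
print(dropout(h, p_drop=0.5, rng=rng))
```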
However, I think the course infrastructure needs improvement. Some of the starter code for the assignments is quite poorly written and contains many inconsistencies and wrong or outdated documentation, which leads to a fair amount of frustration for students. One particularly annoying inconsistency was how the data formats in the starter code were transposed between Homework 1 and Homework 2. My guess is that someone tried to update the assignment but did not have time to fully go through and fix all the inconsistencies before it was released.
Many people ask whether they should take 11-485/785 (Introduction to Deep Learning), offered by the Language Technologies Institute (LTI), or this class, offered by the Machine Learning Department (MLD). The main difference is that 11-485/785 is more hands-on and practical (most assignments involve working on Kaggle datasets), whereas 10-417/617 is more theoretical.
-
10-703 Deep Reinforcement Learning and Control, Katerina Fragkiadaki
The first half of the course follows the standard Sutton and Barto textbook pretty closely, but the second half discusses topics and techniques that are relatively state-of-the-art (think within the last 3 years). As such, there is not really any reference material other than the papers those techniques are based on.
There are usually 2-4 compulsory readings (papers) assigned before every lecture. Unfortunately, I was a bad student and did not read them beforehand, so after the middle of the semester, once the content went beyond any standard textbook, I found it pretty hard to focus and understand what was going on in class, and took very little out of lecture. Eventually I had to rewatch the lectures after reading through the papers to be able to properly appreciate them. So if you are taking this class, please avoid my mistake and do your readings before lecture to save time in the long run!
The homework for this class is really fun as you get to implement reinforcement learning algorithms for agents in various OpenAI Gym environments. All assignments are done in groups of up to 3, so remember to grab a friend or two if you’re taking this class.
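For a flavor of the Gym workflow, here is a minimal random-agent loop (my own sketch using the newer Gymnasium fork's API; the classic OpenAI Gym API that the course used differs slightly, e.g. reset() returned only the observation):

```python
# Minimal random-agent loop in a Gym-style environment (my own sketch,
# Gymnasium API).
import gymnasium as gym

env = gym.make("CartPole-v1")
obs, info = env.reset(seed=0)
total_reward, done = 0.0, False
while not done:
    action = env.action_space.sample()  # stand-in for a learned policy
    obs, reward, terminated, truncated, info = env.step(action)
    total_reward += reward
    done = terminated or truncated
print(total_reward)
env.close()
```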
-
21-651 General Topology, Florian Frick
This class generalizes many concepts taught in an undergraduate analysis course from metric spaces in ℝⁿ to arbitrary topological spaces. It took some time for me to unlearn some things that I had implicitly assumed were always true: for example, in a metric space every sequence in a compact set contains a convergent subsequence, but this is no longer true in arbitrary topological spaces. Much of the content has connections and parallels to other, deeper areas of mathematics, which I found very beautiful.
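To illustrate, here is the standard counterexample (my addition, not from the course notes) of a compact space that is not sequentially compact:

```latex
% Standard counterexample (my addition): compact but not sequentially compact.
The product space $X = \{0,1\}^{[0,1]}$ is compact by Tychonoff's theorem.
Define $f_n \in X$ by letting $f_n(x)$ be the $n$-th binary digit of
$x \in [0,1]$. Given any subsequence $(f_{n_k})_k$, choose $x$ whose
$n_k$-th binary digit alternates between $0$ and $1$ in $k$; then
$(f_{n_k}(x))_k$ diverges, so $(f_{n_k})$ does not converge pointwise,
and hence does not converge in $X$.
```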
-
17-603 Communications for Software Leaders I, Dominick (Nick) Frollini
This course felt like an MBA class. It is a required class for Master of Software Engineering (MSE) students, and one thing I did not expect was how much the course was geared towards international students (e.g., there was considerable emphasis on what is appropriate for US customs and norms), which is understandable as most of the MSE students taking the class are international.
I found the segments on how to give oral presentations useful, especially the many tips and things to take note of when presenting. However, I don't think I gained as much from the other topics, such as those concerning written communication.
-
15-604 Immigration Course, Dave Eckhardt
Every Monday night, the entire MSCS cohort gathers for this course, where Dave talks about topics ranging from classes to grad school to things to do in Pittsburgh. It's usually pretty funny because Dave has a great sense of humor, but many of the sessions are not critically useful, so attendance does begin tapering off in the middle of the semester once people start getting busy with school.
Units: 90
This was a really heavy semester for me, mainly because I had to juggle 4 group projects (Deep Learning, Probabilistic Graphical Models, Cooperative AI, Algorithms for Big Data) simultaneously starting from the midpoint of the semester, and each of them was a significant undertaking worth anywhere between 30-50% of the final course grade. On top of that, homework from all of these classes, with the exception of Algorithms for Big Data, was also due concurrently.
Fortunately, all the projects turned out relatively well and I was pretty happy with them, but it definitely took a toll on my physical and mental health. Over the last two weeks of school, when all the projects were due, I ended up talking mostly to my project groupmates. It would probably be wise to learn from my mistake and make sure that you don't have too many classes with significant course projects on your schedule, to avoid such a situation.
Spring 2022
-
10-725 Convex Optimization, Yuanzhi Li
I did not enjoy 10-701, as it covered a lot of content but did not go into much detail on many topics; I felt like there was no true understanding and everything was very hand-wavy. That experience made me hesitant to take any other ML classes.
10-725 changed that for me, as Yuanzhi Li started from first principles and rigorously proved how many machine learning algorithms converge within some number of steps under various assumptions. For instance, just in the second lecture you will learn how gradient descent converges to within epsilon of the optimum in a number of steps that depends on the chosen learning rate, given smoothness assumptions on the convex optimization landscape. This is then extended to more complicated settings such as stochastic gradient descent, gradient descent with momentum (e.g., Adam), distributed gradient descent, and even quantum optimization.
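As an illustration of the flavor of these results (my own toy sketch, not course code): for an L-smooth convex function, gradient descent with step size 1/L satisfies f(x_k) - f* ≤ L·||x_0 - x*||² / (2k), which the script below checks numerically on a quadratic:

```python
# Toy check (my own) of the classic gradient descent convergence bound
# f(x_k) - f* <= L * ||x_0 - x*||^2 / (2k) for step size 1/L on the
# L-smooth convex quadratic f(x) = 0.5 * x^T Q x (here f* = 0, x* = 0).
import numpy as np

rng = np.random.default_rng(0)
Q = np.diag(np.linspace(0.1, 5.0, 20))  # positive definite, so f is convex
L = np.max(np.diag(Q))                  # smoothness constant = max eigenvalue

f = lambda x: 0.5 * x @ Q @ x
x = rng.standard_normal(20)
x0 = x.copy()
for k in range(1, 201):
    x -= (1.0 / L) * (Q @ x)            # gradient step with step size 1/L
    bound = L * np.dot(x0, x0) / (2 * k)
    assert f(x) <= bound + 1e-12        # the theoretical bound holds

print(f(x))  # near 0, the optimum
```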
Yuanzhi Li also understands that the students taking the class have very different learning objectives, so the homework contains a mix of required and bonus problems, where the bonus problems are usually significantly more challenging than the required ones but are tailored for people who really want to get deep into this stuff. The grading policy is extremely gentle: you essentially only have to score half the points on the required problems to get an A-. The late-day policy is also extremely generous (14 days), so it is quite a good class to take if you want some flexibility in your schedule.
-
15-751 A Theorist’s Toolkit, Ryan O’Donnell
This class aims to prepare students for doing theoretical computer science (TCS) research in the future. It covers a wide range of topics that frequently crop up in TCS research (see the playlist below for a taste), and the homework is challenging but fun. I had previously been exposed to a number of the topics in earlier courses, but still found many of the topics that were new to me, such as spectral graph theory and semidefinite relaxations, incredibly cool. It is usually only offered once every few years, so if you have any interest in doing TCS-related research, I would highly recommend that you take the class.
Recordings from a past offering of the course can be found in this YouTube playlist. It also includes recitation videos where Ryan goes through homework problems.
-
15-312 Foundations of Programming Languages, Jan Hoffmann
The programming portion of the homework is the most fun part of this class. You will get to write typecheckers and implement the dynamics of various well-specified languages, starting from the simple lambda calculus and going up to concurrent Algol (which is Golang-like). This class is great for appreciating language design and understanding the pitfalls and antipatterns that pervade most programming languages used in industry.
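To give a flavor (my own toy sketch in Python; the actual assignments are against precise language specifications), here is a typechecker for the simply typed lambda calculus:

```python
# Toy typechecker for the simply typed lambda calculus (my own sketch,
# not actual course code).
from dataclasses import dataclass

@dataclass(frozen=True)
class Arrow:        # function type: arg -> res
    arg: object
    res: object

@dataclass(frozen=True)
class Var:          # variable reference
    name: str

@dataclass(frozen=True)
class Lam:          # lambda x: ty. body
    name: str
    ty: object
    body: object

@dataclass(frozen=True)
class App:          # function application
    fn: object
    arg: object

NAT = "nat"         # a base type

def typecheck(ctx, e):
    if isinstance(e, Var):
        return ctx[e.name]
    if isinstance(e, Lam):
        return Arrow(e.ty, typecheck({**ctx, e.name: e.ty}, e.body))
    if isinstance(e, App):
        fn_ty = typecheck(ctx, e.fn)
        if not isinstance(fn_ty, Arrow) or typecheck(ctx, e.arg) != fn_ty.arg:
            raise TypeError("ill-typed application")
        return fn_ty.res
    raise TypeError("unknown expression")

# (lambda x: nat. x) applied to a variable of type nat
identity = Lam("x", NAT, Var("x"))
print(typecheck({"y": NAT}, App(identity, Var("y"))))  # nat
```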
-
21-355 Principles of Real Analysis I, Robin Neumayer
It is true that sections of math classes taught by different professors are run like entirely different classes, so your experience with any math class depends heavily on which professor you get. I am really glad that I took the class under Robin, whose passion for teaching and whose care and concern for students (in one of the first few lectures she took some time to have everyone introduce themselves so she could know everyone on a more personal level) made me really enjoy the class and the subject matter. It also helped that she followed Rudin pretty closely, so anything I was unclear about could be easily checked. Her weekly homework is of reasonable difficulty and quite fun.
The pace of the course under Robin is faster than most standard analysis classes; we covered up to Chapter 7 (Sequences and Series of Functions) in Rudin. I enjoyed the class so much that I wanted to take another analysis class in the future, which led me to take General Topology (21-651) the subsequent semester. If you are interested in taking analysis, I cannot recommend doing it with Robin enough.
-
21-484 Graph Theory, Wesley Pegden
The class was quite fast-paced, and many times I felt quite lost during the longer proofs (some of them took multiple classes to complete). I definitely had to review the content after each lecture to get a better sense of the proof techniques; however, this is mostly due to the complicated setup of many of the proofs. Wesley is a great teacher, and I really appreciate how he would re-explain the key ideas many times in different ways so that we could understand them better. His exams are reasonable, and he sends out a review sheet before each midterm where you are only expected to remember the statements of some of the more complicated theorems, instead of being able to reproduce their proofs fully.
The textbook for the class (Graph Theory by Diestel) is quite tersely written, as it is intended for a graduate audience, so definitely do not miss lectures!
-
15-819 Advanced Topics in Programming Language Theory, Robert Harper
The class covered many advanced topics in PL theory for which there aren't many publicly available resources, so when I got confused during lecture it was quite hard to find other materials online to supplement my learning. Very often I would end up on the nLab wiki, and I'm pretty sure anyone who has visited it knows that it is not the best place for beginners to understand anything.
The class is offered pass/fail to reduce stress, and homework is assigned weekly. You will be asked to redo questions that you get wrong, as Bob really wants to make sure that you understand the material. I think I generally had an okay understanding of what was going on in the class until the segment on logical frameworks, where it became very abstract. I still hope to rectify this gap in understanding someday when I have time, perhaps by playing around with Twelf.
-
16-385 Computer Vision, Matthew O’Toole
I took this class as a follow-up to Computer Graphics (15-462). Many of the computer vision techniques covered lean heavily on linear algebra, so if you were not already an expert at using the SVD and eigenvalues/eigenvectors for engineering, this course will definitely make you one. You will also get very comfortable with convolutions and some elementary Fourier analysis. A major part of the course also covers convolutional neural networks, which are the mainstay of many modern computer vision methods.
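As one example of this linear algebra flavor (my own sketch, not course code), the Eckart-Young theorem says the best rank-k approximation of a matrix comes from truncating its SVD, with spectral-norm error equal to the first discarded singular value:

```python
# SVD-based low-rank approximation of an image-like matrix (my own sketch).
import numpy as np

rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64))

U, s, Vt = np.linalg.svd(img, full_matrices=False)
k = 8
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # best rank-k approximation

# Eckart-Young: the spectral-norm error equals the (k+1)-th singular value.
print(np.linalg.norm(img - approx, ord=2), s[k])
```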
-
18-358 Introduction to Amateur Radio, Tom Zajdel (AI6CU)
This is a pass/fail mini-course, where the only requirement for passing the class is to obtain at least a Technician amateur radio license.
To be very honest, I was a pretty bad student in this course. Given the workload of my other classes, I skipped more than a few lectures and only studied extensively for the FCC licensing exam.
It was definitely a cool class, and pretty early on we were also provided handheld radios and taught how to operate them to listen in on Buggy Net over Spring Carnival. I was manning SSA's booth for a few hours then, and used the chance to hear the Buggy Net operators announcing the start of each race and giving live updates on when each buggy crossed a certain junction.
-
70-350 Acting for Business, Evelyn Pierce
Units: 93