Machine Learning Camp!

Update

It’s been a while since I created this blog. In that time, I haven’t done any substantive coding. Maybe that’s because I felt a bit burned out from the coding camp I attended during the summer (shortly before creating the blog). So, I decided to keep VS Code powered down and write this update post instead.

Camp

The camp was the “MCFAM Machine Learning Summer Camp 2021,” hosted by the University of Minnesota. Overall it was a great experience, and I would definitely recommend it to anyone interested in AI/ML, or even just in data analysis.

The first week focused on basic data handling (cleaning, exploration, analysis, etc.) and had a surprising emphasis on financial applications. This portion of the camp was well within my skill set, and I was learning a fair amount while staying pretty comfortable. At the end of the week, we split into teams, each of which chose a Kaggle project and worked on it for two days. My team chose the classic Titanic dataset. Even though the challenge wasn’t super difficult, I didn’t end up doing much of the coding because my teammates were much more proficient than I was. However, I still contributed what I could, and the project1 turned out very well.
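
If you’re curious what that week looked like in practice, here’s a rough sketch of the kind of cleaning and exploration involved. This is not our actual notebook, just an illustrative pass over the Kaggle Titanic training set (assuming train.csv has been downloaded):

```python
import pandas as pd

# Load the Kaggle Titanic training set (assumes train.csv is in the working directory).
df = pd.read_csv("train.csv")

# Quick look at missing values and basic statistics.
print(df.isna().sum())
print(df.describe())

# Simple cleaning: fill missing ages with the median and drop the sparse Cabin column.
df["Age"] = df["Age"].fillna(df["Age"].median())
df = df.drop(columns=["Cabin"])

# A first exploratory cut: survival rate by sex and passenger class.
print(df.groupby(["Sex", "Pclass"])["Survived"].mean())
```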

The second week dove into actual ML. The applications ranged from autonomous vehicles to data compression (surprisingly enough, the details of JPEG image compression are fascinating). I was able to grasp most of the abstract concepts fairly well, but the technical details were firmly out of my depth. This week also had fascinating guest speakers (Pratik Chaudhari and Chris Finlay were highlights) who presented some really inspiring projects and applications. The projects were obviously more challenging than the first week’s, well beyond what I could do on my own, and I felt like I was drinking from the proverbial firehose. The difficult thing about ML/AI is that even very simple programs need a deep stack of complex libraries to work, which makes debugging a nightmare. In addition, feeding the data into the model is tedious (we used OpenAI Gym), and making sense of what the model is actually doing requires either a pipeline to export the model’s gameplay into a video or a really good abstract understanding of the numerical reports the model generates (neither of which I had). However, with help from my instructors and classmates, I was able to solve these problems. I ended up finding a solution for the basic CartPole problem2.
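
For reference, the CartPole setup itself is pretty small. Here’s a minimal sketch of interacting with the environment through OpenAI Gym, with a random policy standing in for a trained model; it assumes the pre-0.26 Gym API that was current in 2021, and it is not my actual solution:

```python
import gym

# Create the classic CartPole environment.
env = gym.make("CartPole-v1")

for episode in range(5):
    obs = env.reset()  # older Gym API: reset() returns just the observation
    total_reward = 0.0
    done = False
    while not done:
        # Random policy as a stand-in; a trained model would map obs -> action here.
        action = env.action_space.sample()
        obs, reward, done, info = env.step(action)
        total_reward += reward
    print(f"episode {episode}: total reward = {total_reward}")

env.close()
```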

Up Next

I’ve been trying to get some experience with web scraping by researching edit histories on Wikipedia (more info soon), but progress has been slow. I’m now committed to another camp, basketball this time, but hopefully I can use it to help keep me on a schedule.
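
As a taste of what that looks like, here’s a rough sketch of pulling a page’s recent edit history through the MediaWiki API. This is just an illustrative snippet with a placeholder article title, not the actual project code:

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

# Ask the MediaWiki API for the 20 most recent revisions of a (placeholder) article.
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Python (programming language)",  # placeholder page title
    "rvprop": "timestamp|user|comment",
    "rvlimit": 20,
    "format": "json",
}

resp = requests.get(API_URL, params=params, headers={"User-Agent": "edit-history-sketch"})
resp.raise_for_status()

# Walk the response and print one line per revision.
for page in resp.json()["query"]["pages"].values():
    for rev in page.get("revisions", []):
        print(rev["timestamp"], rev["user"], rev.get("comment", ""))
```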

The camp also showed me the perks of using Jupyter Notebooks in Google Colab (at least for small projects), and I am now using that instead of VS Code.

Pro: I now have a functional text editor

Con: I can’t use synthwave-themed text and background anymore

But all jokes aside, the switch has been good so far. Here’s hoping the summer keeps getting better as it goes.

Peace.