Thoughts For September, 2025

Artificial Intelligence

For the foreseeable future it will be difficult to start a post like this without artificial intelligence at the forefront. I am currently learning more about it through a Udemy course that has been really informative, while reading a book on the subject in tandem. I am finding that AI is a little more nuanced than I first thought, and I've come up with a progression of understandings that ends with, you guessed it, MCP servers.

  • What I use AI for is more of a glorified search engine, which is what most people use it for too, though that's been changing (more on the MCP server stuff later). It's important to note that AI is not creating any information, but rather rehashing results gleaned from its training data, which is essentially the output of a web-crawling apparatus. If you ask ChatGPT a question about a certain subject, it is not divining any new information for you. In fact, it's most likely regurgitating some Stack Overflow post from 2014. I like to think of it as a highly abstract programming language sitting on top of an enormous bucket of web scrapes. I will add here, without expounding further, that I do not think general AI is around the corner; in fact, I don't think it will happen during our lifetime. The narrow AI being used right now needed three components to come to life: first, a solid mathematical basis, which it's had for decades (Markov chains came from pre-Soviet Russia); second, a way to tie it all together, which was the Google Brain transformer paper ("Attention Is All You Need"); and finally the hardware, which Nvidia provides (built off the back of gamers, you're welcome!). You don't have any of these three for general AI. Claims that it is around the corner rest more on the shock that narrow AI laid upon us than on anything concrete. In fact, I think the only way general AI will come to be is if AI creates it itself, or some brilliant anonymous person invents it in a Chinese lab powered by a secret fusion reactor. We just don't know. Sam Altman himself thinks it will take trillions of dollars just to get the hardware, and that still leaves the other two items I mentioned.
  • I think most people can understand how AI relates words to each other, but they also assume that same algorithm is applied to whatever else you ask it to do, which is not the case. If you ask ChatGPT to analyze some financial data for you, it is going to load that data into Python, fire up a pandas DataFrame, and then export the results back to you. Very helpful if you do not know Python or pandas, but not so helpful if you trust the AI not to import data into the wrong columns and produce inaccurate calculations (see the pandas sketch after this list). I've been burnt by the latter when trying to use it to calculate financial ratios on 10-Ks for a financial reporting project in one of my MBA classes.
  • I'm not sure people appreciate the cost of compute for these AI models. They are incredibly resource intensive, and much of the cash burn goes right into running them. People think AI is free, because right now it is (for a certain number of prompts), but behind the scenes it's a much different story. AI is not like Python, CSS, JavaScript, or some other coding language that is open source and free. Even though you can pluck any number of open-source models from Hugging Face, you still need to run the thing. According to Sam Altman, people just saying "thank you" to it is costing the company millions. I actually looked at running a very lightweight model on my own PC and quickly found I would need to upgrade to at least a 5090, which means another $2K into my machine. In fact, the quotes I've seen for a machine that can run some of the better models just for my own use come to around $20K, and that's before electricity costs. I've heard it said that it takes a bottle of water to cool one of these apparatuses for every five prompts, so you start to understand the massive cloud investments once this gets kicked out to scale.
  • AI coding is wonderful for quick prototyping, but I'm not sure what it looks like on larger codebases. Then again, that might not even be a problem anymore: with the march to microservices, it's possible that smaller and lighter weight is the way to go. You hear a lot about "vibe coders" using AI to build entire applications, but you don't really hear what those applications are. I have found that the more you code with it, the messier it can get, and the only thing you can do is test it to see if it works. Anyone can use it to build a simple HTML document with CSS (that's actually how I built the sites I donated to our local Scouting organization), but at scale I'm not sure yet, and I'm hearing mixed results. Sometimes you hear it's great, then a week later everyone is saying "this sucks." One thing is for sure: it is awesome for pumping out simple Python scripts and discovering new libraries to use. I use it a ton for that, and it's sped up my learning on the subject.
  • MCP servers are where the real action is right now. They're a lot like Python libraries: you install them, connect to them with a client, and put them to work (see the MCP server sketch after this list). I will have more on this as I work through the class, but what I'm seeing is kind of crazy. First off, they probably need to rename these to something else; "MCP server" is kind of confusing, because you have the server and the client. It's not so much a server as it is an AI bot in waiting, there to work through whatever problem you want solved. The connection between the two is really interesting: you actually use natural language to tell the bot what it should be doing. The setup I've seen on the Python side uses Fetch; I've seen the Node/JS side too, but I don't do much there, so I'll probably stick with Fetch. The exercise demonstrated to me used Playwright to go out, scrape web data, and print it back as Markdown. That's a pretty interesting tool, and the setup was almost instantaneous. Curious, though: are people going to stop developing these cool Python libraries and start focusing more on building MCP servers?
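
To make the pandas point above concrete, here is a minimal sketch of the kind of script an assistant tends to produce when asked to compute a financial ratio. The file name, column names, and the ratio itself are my own illustrative assumptions, not anything ChatGPT actually generated for my project; the takeaway is that if the model maps the wrong columns, the code still runs and the answer is silently wrong.

    # Illustrative sketch: the sort of pandas code an assistant typically generates for ratio analysis.
    # The CSV file name and the column names are assumptions made up for this example.
    import pandas as pd

    # Load a hypothetical extract of 10-K balance sheet line items, one row per fiscal year.
    df = pd.read_csv("balance_sheet_10k.csv")

    # Current ratio = current assets / current liabilities.
    # If the wrong columns get mapped here, pandas still computes happily and the
    # result comes back looking confident; it's just wrong.
    df["current_ratio"] = df["total_current_assets"] / df["total_current_liabilities"]

    print(df[["fiscal_year", "current_ratio"]])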
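
On the MCP side, code makes the "library you connect a client to" idea clearer. Below is a minimal sketch of a Python MCP server, assuming the FastMCP helper from the official mcp Python SDK; the server name and the single tool are made up for illustration, not taken from the course material.

    # Minimal sketch of an MCP server in Python, assuming the official mcp SDK's FastMCP helper.
    # The server name ("demo-tools") and the tool below are illustrative only.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("demo-tools")

    @mcp.tool()
    def word_count(text: str) -> int:
        """Count the words in whatever text the client hands over."""
        return len(text.split())

    if __name__ == "__main__":
        # Runs over stdio by default, which is the transport most clients connect with.
        mcp.run()

A client launches the script, discovers the tool, and the model decides in natural language when to call it; that hand-off is the part that feels like an AI bot in waiting rather than a traditional server.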

Economic Outlook

I believe 2026 is going to be a tough year for most folks. The tariffs, job shifts to AI and overseas resources, inflation, and general economic uncertainty are all going to play out. I'm not sure what happens if Trump takes over the Fed and forces his rate drop; my economics professor in college would say that lowering rates when there's this much cash in the market doesn't make much sense. The government continues printing money, further inflaming the problem. At this point, I'm not entirely sure you can trust the economic data coming out: if a bad jobs report gets released, Trump will fire the people who published it. A government shutdown isn't going to help much either and will just add more uncertainty. Right now my suggestion to most would be to start saving money and plan for a rocky year ahead.

New Year

I had a birthday this month, which means introspection. I typically use that event as a chance to set new annual goals and make any adjustments to my life plan. Last year the focus was my career, which ultimately led me to join an MBA program. This year my focus will be on my health and preparing for the next stage of aging. I've taken a lot of inspiration from Arnold Schwarzenegger's "The Education of a Bodybuilder," in which Arnold talks a lot about proper nutrition and weightlifting technique. I started really focusing on my lifting and have seen a fairly significant increase in muscle mass and strength. I've also taken his advice and started working my core and legs, which admittedly are areas I tend to neglect; leg days are boring and core workouts strain my lower back. Since one of the book's themes is technique, I've taken to really focusing on the best way to work my core without injuring myself. He also recommends working in jogging and swimming. The first I hate and the second I love, so I'll most likely be doing more cardio. We joined NKU's Rec Center as a family, so I have access to an amazing array of cardio machines and swimming lap lanes.