LLM (mis)Adventures

By: skaup On: Sun 09 November 2025
In: Technical
Tags: #llms #linux #software

Ever had a requirement so simple it doesn't even occur to you that it could stump you? And then days later, when you're stuck in the trenches running the same test for what feels like the fiftieth time, because it simply does not work goddamn it, you pray for any god to rescue you from this nightmare.

No? Just me. Cool. Well, regardless, I was running up against one of those. The requirement was simple: spin up a server and get some packages running. Nothing I haven't done before. Didn't even think twice about it.

But when I actually did end up running the server, the package manager commands were ridiculously slow. This, despite the fact that I had a server with pretty much the same configurations working in the same environment. Started and restarted the server a million times. Killed it and recreated it a few times. No dice.

So I decided to go to my best friend. The friend of the underconfident, young, and hopeless. GPT. A lot of this article is also about maintaining a good internet footprint, so when the AI Gods take over, they'll know I have been very nice. Regardless, I turned to it, because this is the kind of stuff it's good at. Low contextual information, no legacy code it has to keep in mind. Just a dumb little server. Should be easy, right?

After giving it all the info, it asked me to run some commands. Test this endpoint, run this curl command, check this directory. I did all of it, and saw that only one thing seemed different between the two servers: a missing-certificate error on the non-working one that wasn't present on the working one.

I told this to my best friend and got the standard response. "That's right! That does seem to be the issue. Good catch! If your certificate is missing, your package manager commands will be slow, since it can't communicate with the endpoint."

Nice. Flattery will get you everywhere, my friend. I believed this. It seemed perfect. Sometimes, while mindlessly using LLMs, I almost feel like it is controlling me, rather than the other way around. That usually ends with pretty terrible, unusable code. But this felt like I was actually working with the tool at hand. Not as much in control as I'd like to be. But hey, it's getting the job done, right? Right?

So I started looking around for the root of this missing certificate. I didn't even stop to ask why a machine with almost the same config would have a missing certificate. Chalked it up to the version of the packages on the non-working machine being newer, since the working one had been spun up much earlier.

But alas, I couldn't really pin it down. Where could this mysterious certificate be, in the thousands of directories on this machine? How would I even begin looking? I asked the beloved GPT again, and it gave a few commands I could try to figure this out. But oh, nothing of substance came of this investigation. And I was at the stage with GPT where the questions and answers were circular. I didn't have any new information to give other than "not working". It couldn't help me out either.

I was on the verge of giving up. But then in the same environment, I found a recent server working perfectly well. Someone else must have made it. I looked around for the differences, comparing the config files.

Dear Reader, it was a RAM issue.

Hadn't assigned enough RAM to the dumb machine. Certificate-shmertificate. My 3000-word-long chat with GPT was for NOTHING. Hours of hazy, general bewilderment and frustration, for nothing. I felt so incredibly stupid. But now I must step back a bit. And take a bird's-eye view of the whole situation, with respect to our usage of LLMs.
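In hindsight, a thirty-second sanity check would have surfaced this before any certificate spelunking. Here's a rough sketch of the kind of check I'd run first next time, on a Linux box; the 2048 MB threshold is an arbitrary number I've picked for illustration, not a real requirement of any package manager:

```shell
# Read total RAM in MB from /proc/meminfo (Linux-specific).
mem_mb=$(awk '/MemTotal/ {print int($2/1024)}' /proc/meminfo)
echo "Total RAM: ${mem_mb} MB"

# Arbitrary illustrative threshold -- tune for your workload.
if [ "$mem_mb" -lt 2048 ]; then
  echo "WARNING: low RAM; package manager commands may crawl"
fi
```

Two lines of awk versus a 3000-word chat transcript. Lesson learned, maybe.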

A few very smart people have argued that LLMs are actually better for senior engineers. It is ONLY if you KNOW what you are looking for that they can help. Used for the aimless general investigations of a junior engineer, they lead to exactly this kind of waste. This has some very troubling implications. The main one being that even when we think we are in control, we really aren't. I thought I was using the LLM "better than others". I didn't ask it to free-fall read my mind, or simply passively take its suggestions without testing things out.

But still, while using LLMs, it was like the state of my brain became a bit... groggy. I was more willing to go along with a solution I didn't fully grasp, because it felt like the best (and not coincidentally, laziest) way out. It became easy to ignore small nagging doubts like: WHY is this happening? WHY is the supposed certificate missing after just one upgrade, especially when this issue is mentioned nowhere else on the net? It was easier to go along with its story, since mine felt even more incomplete. At least it seemed confident in its answer. When have I been able to say that for my hunches?

To get along, you must go along, of course. The new age is here. Steroids are everywhere. We must learn, as many before us have had to learn, that things are about to get weird. For "knowledge" work, this time around. I'd love to end this with a nice little "be vigilant" message, but I know that is impossible. The tool incentivises you, especially people at my level of experience, to tune out. And we have been trained our whole lives to tune out. It's not easy. Of course, a little presence of mind goes a long way. But the way to cultivate that long term? Reach out if you figure that one out. I will pay you.