<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom"><title>blog.skaup.co</title><link href="https://blog.skaup.co/" rel="alternate"/><link href="https://blog.skaup.co/feeds/all.atom.xml" rel="self"/><id>https://blog.skaup.co/</id><updated>2026-03-15T10:25:00-07:00</updated><entry><title>Frankenstein</title><link href="https://blog.skaup.co/frankenstein.html" rel="alternate"/><published>2026-03-15T10:25:00-07:00</published><updated>2026-03-15T10:25:00-07:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2026-03-15:/frankenstein.html</id><summary type="html">&lt;p&gt;This is the second part of a series inspired by the movie Frankenstein (2025), directed by Guillermo del Toro. The first part is &lt;a href="http://blog.skaup.co/mary-and-mary.html"&gt;here&lt;/a&gt;; I'd recommend reading it first to understand this one. What follows is a quick rundown of the original book's plot. Oh, and spoilers, of course, for the movie.&lt;/p&gt;
&lt;p&gt;The story begins …&lt;/p&gt;</summary><content type="html">&lt;p&gt;This is the second part of a series inspired by the movie Frankenstein (2025), directed by Guillermo del Toro. The first part is &lt;a href="http://blog.skaup.co/mary-and-mary.html"&gt;here&lt;/a&gt;; I'd recommend reading it first to understand this one. What follows is a quick rundown of the original book's plot. Oh, and spoilers, of course, for the movie.&lt;/p&gt;
&lt;p&gt;The story begins with a ship captain on an expedition to the Arctic. He sees a stranded man - worn and weary, but clearly incredibly intelligent. Once the ship captain rescues said man, a creature starts tailing them. None of their weapons work against this creature. The rescued man tells his tale. He used to be a curious young man with a scientific temperament. He was interested in nature, in the larger questions regarding the origin of life. Just before he left for his studies at university, his mother died. At university, the man puts his entire self into his work. He has a plan to create a living being from the remains of dead people. He toils at it and eventually succeeds. But in his quest to create, he was completely blindsided by the end result. When his creature comes to life, Victor Frankenstein, our narrator, is horrified. He FREAKS out and runs away, leaving the thing to take care of itself.&lt;/p&gt;
&lt;p&gt;Somehow, the creature, left cold and completely alone in the world, manages to survive. He is attacked constantly because he looks unimaginably scary and terrifying. So our creature hides in the woods. He finds a family that he likes, hides near them, and helps them out however he can. In return, he takes some food that they offer to their unknown helper - they don't know who he is. There is a blind man in that family, wise and old. The creature listens to his stories. He loves this family; he grows to care for them. One day, he presents himself to the blind man as the one who has been helping them. The blind man accepts him, and teaches him to read as well. The creature, basically a grown baby, hopes the others will accept him too. Once they know him, understand how much he cares for them and understand that he has an internal world too, maybe they won't try to kill him. But they find out and, of course, drive him away. &lt;/p&gt;
&lt;p&gt;The creature knows now that he cannot escape this torment. He is doomed to be alone. &lt;em&gt;If I cannot inspire love, I will cause fear!&lt;/em&gt; So he trails Frankenstein. In the process, he even accidentally kills Victor's youngest brother, William. But he begs for understanding and demands from his creator only one thing: &lt;em&gt;Make me a companion&lt;/em&gt;. He wants what any normal person has. Frankenstein says no initially but eventually agrees. He works on it and almost completes the project. But then he realises that maybe the two of them would have offspring. So he gives it up. The creature gets super mad. He goes on a rampage, but Frankenstein escapes somehow. A cat and mouse game begins. Victor lulls himself into a false sense of safety and makes plans to marry his childhood sweetheart - Elizabeth. But he also remembers the main threat from the creature - &lt;em&gt;"I will be with you on your wedding night"&lt;/em&gt;. It all comes to a head: on the night of their wedding, the creature strangles her. Then it's Victor's turn to be a revenge-driven maniac. He chases the creature and tries to kill him, but fails. He then dies on the boat with the ship captain. The creature finds the captain and tells him his side of the story. And now that his creator is dead, he has nothing else to live for, he says. The book ends with his death. &lt;/p&gt;
&lt;p&gt;So, pretty dark. The movie follows basically the same format, but it makes a few crucial changes. It begins with Victor's childhood and shows his relationship with his parents. His mother is great; his dad, not so much. He is kind of cruel and judgmental. He doesn't understand his son and forces things upon him. He is a doctor, and it's implied that he kills Victor's mom when their second son (William) is born. So Victor grows up kind of obsessed with defeating death. A second crucial change is with Elizabeth. She isn't just the standard shoo-in love interest. She likes science too, but because we live in a society, she cannot follow her dream. Victor falls for her, but she is engaged to William. So Victor, a rejected man, does his experiments. And then - I think this is the main change - he does not ditch his creature after he is created. He sticks around, trying to teach his 'son' things. &lt;/p&gt;
&lt;p&gt;But he falls into the same patterns as his father. He is cruel and judgmental and impatient. The creature is locked up in the basement, scared and lonely. He only says one word, repeatedly - Victor. But Elizabeth visits them, and she and the creature connect. Victor gets jealous. So he burns the house down with the creature inside. Amazing. Then the same cat and mouse game begins. In this version, the creature also goes to an old blind man for comfort; he again asks for a companion and is rejected. So here, on the wedding night of Elizabeth and William, the creature comes back. He tries to talk to Victor, they fight, and in the process, Elizabeth is shot by Victor. William also dies. Victor blames the creature for this and then tries to take revenge. But finally, after failing to kill him multiple times, he decides to accept his 'son'. The creature then leaves. In this version, he is kind of immortal, so it's implied he has to go on living - broken, but alive. &lt;/p&gt;
&lt;p&gt;Okay, so firstly I would like to say that Guillermo del Toro's Frankenstein is a gorgeous-looking movie. It is beautifully shot, and the costume department did a fantastic job. The actors are all great - Jacob Elordi is the creature, and he does an amazing job. Mia Goth plays both Elizabeth and Victor's mom - an inspired choice. And Oscar Isaac is Victor; he does a great job too, I think. Some stills -&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Mia Goth as Elizabeth - [I LOVE THIS COSTUME]&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/frankenstein/mia-goth-elizabeth.png" alt="Description" style="max-width: 50%; height: auto;"&gt; &lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Again, Mia Goth as Victor's mother - [No notes, 10/10]&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/frankenstein/frankenstein-film-still1.jpg" alt="Description" style="max-width: 50%; height: auto;"&gt; &lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;And Jacob Elordi as Frankenstein's Creature - [Gorgeous Design]&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/frankenstein/creature-2.jpg" alt="Description" style="max-width: 50%; height: auto;"&gt; &lt;/p&gt;
&lt;p&gt;And I can really tell that a lot of effort, creativity and passion went into this. I do think Guillermo del Toro (just going to call him GDT from now on) really cares about the source material. This is a labour of love. And it shows.&lt;/p&gt;
&lt;p&gt;But there is a hitch. The movie is too sympathetic to the monster. Jacob Elordi does a great job portraying him, but he also makes him look cool rather than terrifying. The story is more focused on the father-son issues than on the science fiction aspect of it. And this is by design. GDT really connected with that aspect of the story - the father abandoning the son and all. In the book, Victor's dad isn't even that mean, and William is just his youngest brother. It's not that important. But the movie plays this aspect up because it is what GDT wanted to show. And this is completely fine. People connect with whatever part they connect with - we can't really help it. So the movie makes it a story about a man whose ambitions are corrupt. But the book is about a man who is curious and doesn't understand the implications of his work. He isn't cruel in the beginning. He isn't trying to be. He freaks out and runs away upon making the creature for a reason. He is a coward. He is humanly flawed. &lt;/p&gt;
&lt;p&gt;The movie, by making him a bit too on-the-nose evil, loses this balance. The creature is given all of your sympathy. ELIZABETH is nice to him? What for? The movie seems to imply that this misunderstood woman connects with him, while the mean, ambitious man is cruel to him. It's this one, comically evil man's fault. In the book, there is a reason that the only person who treats the creature well is the blind man: everyone else is too shit-scared on seeing him to even attempt to understand him. The book beautifully balances this aspect. The creature is a baby. He is scared and lonely and hungry. He is smart, sensitive, and has an inner world. You do root for him. But you also know why people are scared of him. He never looks "cool" like he does in the movie. He is UGLY. He is everything you are afraid of. The book implicates YOU in the creature's mistreatment. You would shoot first, ask questions later if you saw this 'monster'. You'd be too terrified to do anything else. &lt;/p&gt;
&lt;p&gt;Still, I think the movie isn't all wrong. It respects the source material, it's just not interested in telling the same story. I do think aspects of it could have been better, but it's a fun watch. And it's not like Mary Shelley was not into writing stories about messed-up fathers of people with dead mothers. She herself had many, many miscarriages. Some of her kids died soon after they were born. &lt;a href="https://www.frontporchrepublic.com/2024/12/mary-shelleys-grief/"&gt;Brutal stuff&lt;/a&gt;. I think she must have understood a little something about looking at dead remains and wanting to create life from them. &lt;/p&gt;
&lt;p&gt;But in my opinion, Mary Shelley's story is important precisely because of the science fiction aspect of it. The book's subtitle is 'The Modern Prometheus'. This is a story about technological progress. &lt;em&gt;Perhaps a corpse would be re-animated... &lt;a href="https://en.wikipedia.org/wiki/Galvanism"&gt;Galvanism&lt;/a&gt; had given token of such things.&lt;/em&gt; &lt;/p&gt;
&lt;p&gt;It is important to understand this. She was in rooms with people who loved discussing this stuff - they were interested in scientific progress and literature. They wanted to know how this would impact their world. And the fears she has described are universal. Do we know what we are dealing with here? When does our natural curiosity for the world around us become perverted? When can we play God? And are these questions still relevant?&lt;/p&gt;
&lt;p&gt;The answers are left as an exercise for the reader's imagination.&lt;/p&gt;
&lt;hr&gt;
&lt;h3&gt;Refs&lt;/h3&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://www.sheilaomalley.com/?tag=frankenstein"&gt;Shiela O'Malleys' Excellent Blog, and Frankenstein related work&lt;/a&gt;
&lt;br&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://en.wikipedia.org/wiki/Frankenstein"&gt;Frankenstein Wiki Page&lt;/a&gt;
&lt;br&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://www.frontporchrepublic.com/2024/12/mary-shelleys-grief/"&gt;An excellent article on Mary Shelley&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;hr&gt;
&lt;h3&gt;Post Notes&lt;/h3&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;The movie includes a line from Percy Shelley's poem &lt;a href="https://www.poetryfoundation.org/poems/46565/ozymandias"&gt;Ozymandias&lt;/a&gt; - Thank you movie. &lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Mia Goth is excellent. I saw a GDT post talking about Jane Eyre. I hope he makes a movie where she plays both Jane Eyre and the 'crazy' Bertha Mason - it would work very well. Jacob Elordi or Oscar Isaac could be Rochester - either way is fine. But they have to do the scene where he dresses up as a gypsy woman to meet his crush.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The best line from the book isn't included in the movie - &lt;em&gt;"I will be with you on your wedding night"&lt;/em&gt;. But I do love the tagline of the movie - &lt;em&gt;Only monsters play god.&lt;/em&gt; &lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;</content><category term="Personal"/><category term="stories"/><category term="history"/></entry><entry><title>Mary and Mary</title><link href="https://blog.skaup.co/mary-and-mary.html" rel="alternate"/><published>2026-03-08T10:25:00-07:00</published><updated>2026-03-08T10:25:00-07:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2026-03-08:/mary-and-mary.html</id><summary type="html">&lt;p&gt;This was inspired by Frankenstein 2025. The wonderful &lt;a href="https://www.sheilaomalley.com"&gt;Shiela O'Malley&lt;/a&gt;, whose writing I regularly read, was one of the few contributors to the film. I had read the book, and naturally was excited to see the movie. But to explain the first ever Science Fiction Book, we need a bit …&lt;/p&gt;</summary><content type="html">&lt;p&gt;This was inspired by Frankenstein 2025. The wonderful &lt;a href="https://www.sheilaomalley.com"&gt;Shiela O'Malley&lt;/a&gt;, whose writing I regularly read, was one of the few contributors to the film. I had read the book, and naturally was excited to see the movie. But to explain the first ever Science Fiction Book, we need a bit of a History lesson.&lt;/p&gt;
&lt;p&gt;Mary Wollstonecraft was born in 1759, into an initially middle-class family which later lost most of its wealth. Later, despite her safest option being a career as a teacher, she moved to Paris to be a writer - in the middle of the French Revolution. She wrote the foundational text ‘A Vindication of the Rights of Woman’.&lt;/p&gt;
&lt;p&gt;Her work was based on her belief that women and men, despite various physical and sexual differences, were ultimately human. Women were taught to make themselves weaker to appeal to men. Wollstonecraft instead asked women to build an understanding of the world in themselves. She encouraged her readers to teach their young daughters to have a playful childhood not restricted by social pressures. She asked women to strive to attain the same values: truth, courage and selflessness, to the extent that they could. She asked men to get rid of the notion that women were fundamentally less than them since if that were the case, women would never have been able to shoulder any responsibility. Her work was radical for the 18th century. It is still quite radical, in some ways.&lt;/p&gt;
&lt;p&gt;Some quotes first, because there is only one word to describe her writing: Metal -&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;I don’t want women to have power over men; I want them to have power over themselves.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;The tyrants only want slaves, and the sensualists only want toys.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Rousseau declares that a woman should never for a moment feel herself to be independent, that she should be governed by fear to exercise her ‘natural’ cunning, and made a coquettish slave in order to make her a more alluring object of desire, a ‘sweeter’ companion to man whenever he chooses to relax himself. He carries his arguments (which he claims to infer from the indications of nature) still further, and indicates that truth and fortitude—the corner-stones of all human virtue—should be cultivated with certain restrictions, because with respect to the female character obedience is the great lesson which ought to be impressed on the woman with unrelenting rigour.&lt;/em&gt; What nonsense!&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;What nonsense indeed.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Whatever effect circumstances have on people’s abilities, everyone can become virtuous by the exercise of his or her own reason; for if just one being was created with vicious inclinations—i.e. was created positively bad—what could save us from atheism? or if we worshipped a god, wouldn’t we be worshipping a devil?&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;If we worshipped a God, wouldn't we be worshipping a devil?&lt;/p&gt;
&lt;p&gt;But as you can see from the first quote, she wasn't exactly easy on women either. The book is a tough read; you still recognise in it some of the behaviours we see today. It is in such strong favour of rationality, of development of character, of mental toughness. And she was all of those things. But she was a flawed human too. You can tell by reading the book how much these principles meant to her. But she herself fell short of them, and often. She fell in love with a certain Gilbert Imlay. He was never as into her as she was into him. She did have her first child with him, even though they were never married. She went around Europe pretending they were married so that she could be safe. It was brutal. When he rejected her, she made a few suicide attempts. It's so human; you know such people. They are principled, they believe deeply in those principles, but their real-life struggles mean they never quite reach those heights. You can tell they are tough on themselves for it.&lt;/p&gt;
&lt;p&gt;But still, it wasn't all bad for her. She always kept up her writing, of course. And she found another person - William Godwin. They got married, both revolutionaries in many ways. Mary Wollstonecraft died while giving birth to her second child, Mary Godwin. Pregnancy was the "Valley of Death" for her. &lt;/p&gt;
&lt;p&gt;But her daughter read her writing and was really inspired by it. When she started her romance with Percy Shelley, they would MEET AT HER MOTHER'S GRAVE. They then ran away together. She was 16, he was 22. Amazing. 1816 was the "Year without a Summer" in Europe. A volcano (Mount Tambora) had erupted in Indonesia the year before, releasing ash and gases into the atmosphere. It literally blocked the sunlight. And thus the monster was born. Mary Shelley was 19 when she wrote it. &lt;/p&gt;
&lt;p&gt;The book was a sensation. Her husband died a few years later; his boat sank. She spent her life writing, and maintaining his legacy as well. She had a few kids with him too; many died at birth or a little while after. It was a difficult life, but she lived till 53. And she wrote a book so good it is still making me think today. Perhaps that is the best gift these two women gave us. More about the book and the movie in the next part.&lt;/p&gt;
&lt;hr&gt;
&lt;h3&gt;Refs&lt;/h3&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://www.gutenberg.org/ebooks/3420"&gt;A Vindication of the Rights of Woman&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://en.wikipedia.org/wiki/Percy_Bysshe_Shelley"&gt;More for Fun - Percy Shelley's Insane Wiki Page&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;hr&gt;
&lt;h3&gt;More Quotes from Mary Wollstonecraft&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;The passions that have been celebrated for their durability have always been unfortunate.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Fondness is a poor substitute for friendship.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;By its very nature, love must be transitory. &lt;br&gt;
Searching for a secret that would make it constant is as wild as searching for the philosopher’s stone.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;</content><category term="Personal"/><category term="stories"/><category term="history"/></entry><entry><title>The 3 P's</title><link href="https://blog.skaup.co/the-3-ps.html" rel="alternate"/><published>2026-03-02T12:00:00-08:00</published><updated>2026-03-02T12:00:00-08:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2026-03-02:/the-3-ps.html</id><summary type="html">&lt;p&gt;This is about two Jane Austen books. First, Pride and Prejudice. It's about Elizabeth Bennet, the second eldest of a family of five daughters. Her older sister is being courted by a rich guy who just moved into the neighbourhood - Mr Bingley. And Mr Bingley has a douche friend named …&lt;/p&gt;</summary><content type="html">&lt;p&gt;This is about two Jane Austen books. First, Pride and Prejudice. It's about Elizabeth Bennet, the second eldest of a family of five daughters. Her older sister is being courted by a rich guy who just moved into the neighbourhood - Mr Bingley. And Mr Bingley has a douche friend named Mr Darcy. He calls her "barely" tolerable, turns his nose up while talking to her family because he is rich, and is, in general, very proud of his own standing.&lt;/p&gt;
&lt;p&gt;Pride and Prejudice is perhaps the most uncomfortable read I have had in a long time. The book is perfect. If the goal of fiction is to convincingly allow you to disappear into another world, to make the imaginary 'real' for the reader, to write in such a way that the author's person disappears and what is left is something both deeply personal and universal, then Austen succeeds like nobody's business. In terms of technical skill, it is a perfect book. The plot is tight, the strings all pull together perfectly. It's ridiculously funny. It BITES. The social satire of the upper middle class at the time is brutal. Take these -&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;For what do we live, but to make sport for our neighbors, and laugh at them in our turn?&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Or this one about women -&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;A girl likes to be crossed a little in love now and then.&lt;/em&gt;
&lt;em&gt;It is something to think of&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;And this one about the youths -&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Such squeamish youths as cannot bear to be connected with a little absurdity are not worth a regret.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;And this one, when her COUSIN, Mr Collins, after being rejected by her, just goes - oh yeah no, I will keep trying. I know how you ladies like to tease us men, and reject us. It is only proper and ladylike that you do this. Lizzie Bennet - hitting the nail right on the head.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Do not consider me now as an elegant female intending to plague you, but as a rational creature speaking the truth from her heart.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;So good. She understood these characters. None of them are stereotypes. They are fun and larger than life, sure, but you understand them all. You can place yourself in Elizabeth's mind. You can feel her pain, her spirit and her concerns. Just mind-blowingly good writing. It never feels like the author is coming in to insert her view in a way that bends the internal logic of the story. &lt;/p&gt;
&lt;p&gt;I think it is almost too good. The social satire is so ridiculously on point that you feel you could be reading about any modern office. It's the same level of gossip, misunderstanding, and political maneuvering. The girls, Lizzie's sisters, can't inherit their father's estate after he dies; they will then become poor. This is a constant threat hanging over their heads. So when Lizzie rejects these men - her cousin, or even Darcy the first time - she is being ridiculously brave. I don't think I'd have her bravery. But that is the point. That is the fairy tale. And then the social commentary - the horror of the marriage market being the only option, not just for women but even for poorer men trying to get ahead in life - is just there. And you listen in horror as they are left dejected each time. It's not even that women can't inherit property; Mr Darcy's aunt does inherit hers. &lt;/p&gt;
&lt;p&gt;But that option is just never considered. Marrying "well" (aka rich) is their only option. And that's not even a scheme; it is the only thing they MUST do. And their role in the story is still passive, since they are women. They can't propose; they must await a proposal. They can't reach out and write love letters - it's indecent. They must just wait around. Till letters come, till the men get their act together, till something happens and, god willing, it works out. It's such a helpless position, and you feel it. I felt crazy reading it. These women have no other skills either. God only knows what they were taught. Maybe they could be teachers. But even that would be "degrading" to them. Even the men, god bless, are just sitting around. Darcy and Bingley don't work either - at least they aren't lawyers or priests. They just have their land, and they've got to handle it. That's it. No one has a real job. No one. It is ridiculous what these people were up to. They just went from party to party, scheming, meddling and judging. &lt;/p&gt;
&lt;p&gt;And this social constriction, this feeling of being trapped by the &lt;em&gt;society&lt;/em&gt; around you, is ever-present. Austen NEVER allows you to escape it. It's what we typically turn to fiction for - to imagine a world free of the shackled one we are currently in. But she never allows for anything that simple. Virginia Woolf, in "A Room of One's Own", talks about how Austen wrote in the middle of her drawing room, surrounded by her family members. The constant threat of interruption. The constant feeling of 'someone is here to watch and judge you by the rules - and you must meet them'. It never goes away. It makes for very compelling storytelling, actually. Even if you can't escape it, perhaps you can bend it.&lt;/p&gt;
&lt;p&gt;And here she bends it, but not too much. The social customs are given their due, but what wins is not blind obedience - it is careful consideration. Lizzie Bennet ends up with the guy, but she does so by risking a lot. She risks remaining single by rejecting Mr Collins. She will not marry without love, she says. And so she comes to understand herself, and when she finally has what she wants, you do sense her maturity.&lt;/p&gt;
&lt;p&gt;But it's not enough. It's just not enough. It doesn't bend it enough. Enter Persuasion, the third P. Anne Elliot, a 27-year-old (god, what a spinster), comes in. She is quiet and reflective. Years ago, she gave up her first love, one Frederick Wentworth, because her family did not approve of this poor man. Who knows if this guy can make his money working for a living? What a concept, huh? &lt;/p&gt;
&lt;p&gt;And then he comes back into her life. Turns out, being in the British navy in the early 19th century is quite profitable. So while he has made money, her own father has mostly squandered his away. So Anne must make her way in this new world, where she is the less desirable partner. She is getting older; she has little to no 'prospects'. But she treads carefully. She doesn't run to him, and she doesn't expect him to 'take' her back. She knows he must be angry. She understands him. And she moves slowly. She sits in the background of scenes and watches. She takes small steps. And she is carefully able, despite her horrible family and a few romantic rivals, to stand up for herself. She doesn't 'win' the guy so much as she allows herself to take up space in the world. &lt;/p&gt;
&lt;p&gt;And it is so beautiful. It is less tightly written than P&amp;amp;P, in my opinion. But it is a stunning character study. Anne is not quiet because she is secretly cooler than everyone else. She is just a quiet soul. Your heart aches for her. When she takes small steps to stand up to her family, you feel the risk she is taking. You become proud of her. &lt;/p&gt;
&lt;p&gt;But again, Austen, in her usual wonderful way, never lets us off the hook. Towards the end of the book, when Anne and Wentworth reunite, she tells him that all those years ago, when she allowed her family to persuade her out of the engagement, she was right to do so - she wouldn't have done it otherwise. She knows saying this could hurt him, but what she means is that, given her lack of experience of the world, she did the 'right' thing by trusting her family. It is such classic Austen. She has officially bent the rules. Her love interest in this one actually has a job. He is rich, but he is not a nobleman. There is a CLASS difference here. It is inverted - since Wentworth now has money, he is kind of accepted into the upper echelon. But you know, he isn't making money from Daddy's farmland. He worked for it. And she never lets that slide. It is present - it makes a difference. You can never really be free. &lt;/p&gt;
&lt;p&gt;But maybe she gets something better than easy, unfettered freedom. Because in the end, she takes perhaps the crazier risk. She will not stay in some mansion. She will be on the sea, a tanned sailor, no less brave than her counterpart. She gets to go on an adventure on the big wide ocean. And so the book Austen completed very close to her death at 41 ends - with forgiveness and an adventure. The invisible measuring stick of social rules is officially slanted. Perhaps it always should be. &lt;/p&gt;
&lt;hr&gt;
&lt;h2&gt;Post Notes&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;A few weeks ago, on the metro, I saw a girl reading Persuasion. I saw this, and wanted to say something - I had started Pride and Prejudice just the night before. But she seemed busy and in her own world. Perhaps I would just be an annoying stranger, interrupting her day. But somehow, after standing around for what I would guess was an uncomfortable amount of time, I decided to wager it. When would I see her again? So I asked her how the novel was, and immediately, god bless, her face broke into a smile. We started discussing the book: where I was in it, whether I had seen the movie. "He is very rude to her in the beginning - he calls her ugly, and she doesn't like him either" - we both nodded in agreement. And then I had to leave, before I could say a proper goodbye. But we talked with all the enthusiasm of childhood friends, based on a book. So it goes. I ordered Persuasion soon after. &lt;/li&gt;
&lt;/ol&gt;</content><category term="Personal"/><category term="books"/><category term="review"/></entry><entry><title>Bluffs and Swaps</title><link href="https://blog.skaup.co/bluffs-and-swaps.html" rel="alternate"/><published>2026-02-24T12:25:00-08:00</published><updated>2026-02-24T12:25:00-08:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2026-02-24:/bluffs-and-swaps.html</id><summary type="html">&lt;p&gt;Liars Poker is a book my Michael Lewis about his time in the 1980s as a bond salesman in the now infamous Salomon brothers. In the book, he describes the game named in the title (&lt;em&gt;Liars Poker&lt;/em&gt;). I know it more commonly as bluff. You hand out the cards, the …&lt;/p&gt;</summary><content type="html">&lt;p&gt;Liars Poker is a book my Michael Lewis about his time in the 1980s as a bond salesman in the now infamous Salomon brothers. In the book, he describes the game named in the title (&lt;em&gt;Liars Poker&lt;/em&gt;). I know it more commonly as bluff. You hand out the cards, the first player picks a number, and puts down n cards of that numbers "3 Ace's" for example. Now, once you do that, the other person, based on their cards, has to guess if you're a big fat liar. &lt;/p&gt;
&lt;p&gt;This is a very basic game, and it illustrates something: this is a betting game, based a bit on your knowledge of the cards, but mostly on luck and on reading people. You base your guess on how much you trust the other player, with some awareness that they might be bullshitting you.&lt;/p&gt;
&lt;p&gt;Now - what are swaps? A year ago, I started work at a financial services company. I had to work with some "instruments" called derivatives. So I started, in my free time, reading a bit about them. Why is this important? Well, as it turns out, a lot of the big banks are exchanging money with each other through these instruments, and they are probably working to give you the returns on your accounts. And what do they work on exactly?&lt;/p&gt;
&lt;p&gt;So first - bonds. The very basic thing - a loan. That is all a bond is. It is a loan, with a bunch of extra things attached to it. But at its very core, it is a loan. When a government issues a bond, you are simply lending it money, which it pays back to you in small increments, with interest. &lt;/p&gt;
&lt;p&gt;THAT is it. Nothing more. Now, you take this simple loan, and you start trading it with other people - that is a bond. Basically, you can trade your share in the loan with other people. Why would you trade it? Perhaps you can no longer afford it. The return rate fluctuates for various reasons. Because people want to? Put a pin in it.&lt;/p&gt;
&lt;p&gt;Now, let us say you are an Indian company, and you have borrowed money from an Indian bank in rupees. There is another company, an American one, a large multinational, which wants to borrow in rupees. They think this will be profitable for various reasons. Put a pin in it. On the other hand, you want the attractive interest rates that a multinational corp gets. So you and this big American company exchange your interest rate payments. An I-scratch-your-back-you-scratch-mine type of situation. This is what a swap is. At its most basic level, it is SPECULATION about the interest rates offered to another party. And the main reason the other party is offered different interest rates is their creditworthiness.&lt;/p&gt;
&lt;p&gt;The Indian company can afford to pay the loan back at the interest rates offered by the US treasury. But because it is less notable in that market, less CREDITWORTHY, it is not offered the same terms. So our big American company offers to sidestep this whole bureaucratic thing and offer the Indian company the good rate (which it believes will be paid back). In return, it converts its dollars to rupees each time to pay the Indian company's loan.&lt;/p&gt;
&lt;p&gt;This is exactly what happened in the first swap deal in the 1980s, brokered of course by Salomon Brothers. It was between IBM and the World Bank. In this case, the World Bank wanted to borrow Swiss francs but did not have access to that market (the Swiss government had put a limit on World Bank borrowings). IBM, for some reason, had a lot of Swiss franc debt. It wanted access to the U.S. Treasury rates the World Bank had access to. So they essentially swapped their loan payments for an agreed-upon period. IBM paid the US dollar loans, and the World Bank paid the Swiss franc loan. This was the world's first currency swap.&lt;/p&gt;
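&lt;p&gt;The exchange described above can be sketched in a few lines of code. To be clear: the notional, the rates, and the exchange rate below are made-up numbers for illustration, not the actual IBM/World Bank terms.&lt;/p&gt;

```python
# A toy version of the currency swap described above. All figures are
# invented for illustration; they are NOT the historical IBM/World Bank terms.
NOTIONAL_USD = 1_000_000          # assumed size of the dollar loan
FX_CHF_PER_USD = 2.0              # assumed fixed CHF/USD exchange rate
NOTIONAL_CHF = NOTIONAL_USD * FX_CHF_PER_USD

RATE_USD = 0.08                   # assumed rate on the dollar borrowing
RATE_CHF = 0.06                   # assumed rate on the Swiss franc debt

def swap_payments(years):
    """Each year, IBM services the dollar loan and the World Bank
    services the Swiss franc loan: each side pays the other's coupon."""
    for year in range(1, years + 1):
        ibm_pays_usd = NOTIONAL_USD * RATE_USD
        world_bank_pays_chf = NOTIONAL_CHF * RATE_CHF
        yield year, ibm_pays_usd, world_bank_pays_chf

for year, usd, chf in swap_payments(3):
    print(f"year {year}: IBM pays {usd:,.0f} USD, World Bank pays {chf:,.0f} CHF")
```

&lt;p&gt;Each side ends up servicing a loan at the rate the other party could obtain - which is the whole point of the trade.&lt;/p&gt;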
&lt;p&gt;A question I always had was - WHY do interest rates fluctuate? What do bond price, yield, etc. mean? Why are people not offered the same rates anyway? Effectively, why can't you scratch your own back? Well, you just can't reach there. Even in the same country, two parties may have wildly different "creditworthiness" on the surface, but may, in reality, according to the playing parties, have different abilities. Interest rates change, or the ability to borrow changes. An interest rate is the "cost of borrowing", or the return on lending. If you (a bank) have a lot of people asking you for money, then you can offer the people funding these transactions a better return. But if those same borrowers stop returning your money, you can't give back as much. It is not a static loan; it changes because people's demands change. And any swap is just a way to speculate on what this could be. &lt;/p&gt;
&lt;p&gt;So, this is just one kind of swap. But there are many - interest rate, credit default, etc. Many of them have had a huge effect on our lives in a million invisible ways. And it is all based on the CREDITWORTHINESS DIFFERENTIAL. Who is creditworthy, and for what reason? Are those reasons valid in the eyes of market players? So effectively: are your cards good, and are you lying, or not?&lt;/p&gt;
&lt;hr&gt;
&lt;h2&gt;References&lt;/h2&gt;
&lt;p&gt;1.&lt;a href="https://documents1.worldbank.org/curated/en/702581468764955247/pdf/multi0page.pdf"&gt;World Bank Document Explaining Swaps&lt;/a&gt; &lt;br&gt;
2.&lt;a href="https://www.bauer.uh.edu/rsusmel/7386/Case%20-%20IBM%20WB%20Swap.pdf"&gt;Case Study Explaining the original Swap mechanics&lt;/a&gt; &lt;br&gt;
3.&lt;a href="https://www.amazon.in/LIARS-POKER-Michael-Lewis/dp/0340839961?s=bazaar"&gt;Liar's Poker&lt;/a&gt;&lt;/p&gt;
&lt;hr&gt;
&lt;h2&gt;Post Notes&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;I'd like to point the reader to the fact that Salomon Brothers was also the place where the infamous Mortgage Backed Security was born. So if you were at all affected by the Global Financial Crisis in 2008, at least you know where it started. And the people who started it were real pieces of work. Eating cheeseburgers at 8 AM, betting on coworkers' running races, cursing out MBAs - that was their lifestyle. A lot of the finance sector has been professionalised since 2008. Perhaps that is for the best. But at least the gluttony was visible then. The name of the book Liar's Poker comes from a time the heads at Salomon Brothers jokingly played the game for a million dollars. But the players knew they weren't joking at all. They knew the real deal. These people will bet on anything. &lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;This post was written after a prompt at Bangalore's &lt;a href="https://events.indieweb.org"&gt;Indie Web Club&lt;/a&gt;: "Write about a topic you're an expert on to introduce it to a beginner". I am not an expert - call me an enthusiastic learner. But I have read a bit about this, and wanted to collate it somewhere. So, a note of thanks to them.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;</content><category term="The-Middle"/><category term="finance"/><category term="stories"/></entry><entry><title>Pyaasa</title><link href="https://blog.skaup.co/pyaasa.html" rel="alternate"/><published>2026-02-13T08:32:00-08:00</published><updated>2026-02-13T08:32:00-08:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2026-02-13:/pyaasa.html</id><summary type="html">&lt;p&gt;An old one, once more. I wrote this to get into the film society of our university. I was in the script writing department, which I then ended up heading. Once a year, we did a 7 day film challenge, where we tried to write and shoot a short film …&lt;/p&gt;</summary><content type="html">&lt;p&gt;An old one, once more. I wrote this to get into the film society of our university. I was in the script writing department, which I then ended up heading. Once a year, we did a 7 day film challenge, where we tried to write and shoot a short film in 7 days, in groups of about 5 to 8. I wrote and edited the film in the first year. And in the next year, thanks to the head Hardik, I headed these small contest teams as well. I am not going to comment on the quality of those films. Less you know the better. These were some of the best days of college. I used to write the scripts with a partner usually, and it really taught me what a gift that is. &lt;/p&gt;
&lt;p&gt;To writing! Good, bad and in-between.&lt;/p&gt;
&lt;hr&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Yeh duniya agar mil bhi jaye to kya hai?&lt;/em&gt; &lt;/p&gt;
&lt;p&gt;&lt;em&gt;Translation&lt;/em&gt;: Even if I gain this world, of what use is it?         &lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Pyaasa (1957) is a movie directed and produced by Guru Dutt. It was a critical and commercial success. The movie is about an aspiring poet, Vijay. The title, literally "thirsty", is a reference to his desire for fame and success. He lives on the streets, because his family wishes for him to earn a stable living. He tries to get his poems published but is dismissed as a nobody even by small newspapers. His brothers sell his poems to the raddiwala for 10 paise. &lt;/p&gt;
&lt;p&gt;His mother loves him and wants to live with him. He can barely support himself and so he declines, hoping for the right time to come. He meets a prostitute, played by a young and vivacious Waheeda Rehman, named Gulabo. She sings him one of his own poems and tries to lure him. When he refuses her offer, she turns cold and harsh. Soon she realizes that he is the writer of her favourite verses and slowly falls for him.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/pyaasa/waheeda_rehman.jpeg" alt="Waheeda Rehman" style="max-width: 60%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;Vijay, on the other hand, is trying to come to terms with the fact that his first love, Meena, is married to his boss, Mr Ghosh, who hires him specifically because he is jealous of him. The cat is let out of the bag eventually and Vijay is fired from his job. His mother dies soon after and he spirals into alcoholism and depression. He feels utterly lost and tries to commit suicide, but is saved by a poor man to whom he had lent his only coat - a man who ultimately dies in front of a train. &lt;/p&gt;
&lt;p&gt;The world declares Vijay dead instead. Gulabo goes to Meena to publish Vijay’s poems. His books cause uproar and he becomes a martyr beloved by the public. To keep earning profit off him, Mr Ghosh and Vijay’s friend refuse to recognise him when he recovers from the train accident. He escapes a mental institution with the help of his friend, a champiwala played by the immensely talented Johnny Walker. Vijay, after being rejected by his paid off brothers, goes to his own funeral. He sings his own poetry but is trampled by a confused mob. His friend and brothers agree to recognise him in order to make money off of his success. When he understands how corrupt the world is, Vijay declares that he isn’t the poet people love, goes to Gulabo and embraces her. They walk into a smog, away from the world and towards peace. &lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/pyaasa/end.png" alt="End" style="max-width: 60%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;The movie is brilliant and holds up surprisingly well. This was Dutt’s prized personal project. He made it after a string of box office hits that allowed him the security to make something alternative. The shot composition is spectacular. The black and white palette is used to show us the troubled state of the characters. When Vijay wakes up in a hospital, his kurta is clean and white but his face is framed by shadows. There are surreal moments, which happen only when a character’s world has been shaken, when pieces of paper fly in an aggressive wind and cloud the screen. When Meena confronts Vijay about his lies, he is standing in a doorway, emanating heavenly light, signifying his self-actualization. &lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/pyaasa/guru_dutt.png" alt="Light" style="max-width: 60%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;The filmmakers didn’t have monitors. A take was a take was a take. Some of the faces get cut unevenly at times, but overall the camera is used effectively. Sahir Ludhianvi, an acclaimed poet and lyricist, wrote the iconic songs. The film seems loosely based on his life and his unrequited love for the poet Amrita Pritam, but the comparisons end there. The score is by S.D. Burman and it is fantastic. The music is intricately woven into the film’s atmosphere; rising, looping and shifting to convey Vijay’s hope, despair and grief. His poems (nazmein) are so hauntingly beautiful they will stick with you for a long time to come. He writes about the plight of the downtrodden and of exploited women. &lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/pyaasa/amrita_pritam.png" alt="Amrita Pritam" style="max-width: 80%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Amrita Pritam&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/pyaasa/sahir_ludhianvi.png" alt="Sahir Ludhianvi" style="max-width: 80%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Sahir Ludhianvi&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;In a lot of these movies, the misunderstood artistic genius is cruel to those around him, simply because they stand in his way. Vijay, on the other hand, is quiet and controlled. He feels the suffering around him deeply and correctly blames the system that allowed it instead of individual people. &lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Mujhe kisi insaan se shikayat nahi. Mujhe us samaj se shikayat hai jo insan se uski insaaniyat cheen leta hai.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;I have no complaint against any single person. My complaint is with the society that strips a person of their humanity.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The few poems about a lost love and the feeling of being rejected by society cut right through. Ludhianvi’s own poems were inspiring to the post-independence youth. His work dealt less with god and spirituality and more with the concerns of his time: unemployment, excessive materialism, and the ignorance and abuse directed towards the underprivileged, concerns that have only worsened in the years since. The classic status of the movie is well deserved. &lt;/p&gt;
&lt;p&gt;Writer Abrar Alvi apparently wanted Vijay to land on a compromise with civil society, but Guru Dutt insisted on the ending we got. It works. It certainly leaves a smile on my face. His eleventh-hour monologue says it best:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Jahan jeene se maut sasti, Jahan Pyaar hota hai wyaapar bankar, Jahan admi kuch nahin, wafa kuch nahi, dosti kuch nahi. Yeh Duniya agar mil bhi jaye to kya hai?&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Where dying is cheaper than living, where love is a business affair, where a person becomes nothing, loyalty is nothing, friendship means nothing. &lt;/p&gt;
&lt;p&gt;Even if I gain this world, of what use is it?&lt;/p&gt;
&lt;/blockquote&gt;
&lt;hr&gt;
&lt;h3&gt;Notes&lt;/h3&gt;
&lt;p&gt;1.&lt;a href="https://www.youtube.com/watch?v=WJK45r-j6TU"&gt;The movie is free on youtube&lt;/a&gt; &lt;br&gt;
2.&lt;a href="https://www.rekhta.org/authors/amrita-pritam/profile"&gt;Images sourced from Rekhta&lt;/a&gt;&lt;/p&gt;</content><category term="Personal"/><category term="movies"/><category term="stories"/><category term="writing"/></entry><entry><title>Back(prop) To The Future</title><link href="https://blog.skaup.co/backprop-to-the-future.html" rel="alternate"/><published>2026-01-19T11:48:00-08:00</published><updated>2026-01-19T11:48:00-08:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2026-01-19:/backprop-to-the-future.html</id><summary type="html">&lt;p&gt;How do you connect two things - one which you are kind of familiar (neural networks maths) with, and the other which fills you with dread (Physics)? &lt;/p&gt;
&lt;p&gt;A quick story, before we begin. I was deeply afraid of physics in 11th grade. I would freeze up when I saw certain problems …&lt;/p&gt;</summary><content type="html">&lt;p&gt;How do you connect two things - one which you are kind of familiar (neural networks maths) with, and the other which fills you with dread (Physics)? &lt;/p&gt;
&lt;p&gt;A quick story, before we begin. I was deeply afraid of physics in 11th grade. I would freeze up when I saw certain problems. These things were meant to be connected to the real world. It is the &lt;em&gt;physical&lt;/em&gt; world after all, right? While trying to make sense of all the straight line forces and the friction coefficients, my brain would reach its limit. This is not how I saw objects behave. The way our textbooks said light travelled (straight, bending between mediums) was not how I experienced the diffused morning sunlight. It pained me to see this happen, especially because I seemed to intuitively understand maths. At each step I saw that these two subjects were connected, but I simply could not grasp one in the same way as the other. So this connection to physics both intrigued and scared me. Here is the chance to get at some of this physics business finally, after all these years. Let's try.&lt;/p&gt;
&lt;p&gt;To recap: what is this brachistochrone business, firstly? It was a problem formulated by Johann Bernoulli. There is a ball, and it has to get from point A to point B. What is the fastest path for it to do so? Disregard friction, only consider gravity (for now).&lt;/p&gt;
&lt;p&gt;You might think it is the straight path. Easiest conclusion to come to. But the question is not about the &lt;em&gt;shortest&lt;/em&gt; path. It asks for the path which takes the least amount of time. This led to a new field of calculus called the calculus of variations.&lt;/p&gt;
&lt;p&gt;It defines these things called functionals. Imagine you have a set of some of the paths the ball could take. You associate each one with the time it takes for the ball to get from A to B. This mapping is an example of a functional: a map from functions $f$ to real numbers.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Formal definition:&lt;/em&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;A functional $S[f]$ is a map from functions $f$ to the real numbers. &lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Now a functional has something called stationary paths, just as a function has stationary points, where the slope is 0. To find something similar, let's look at the above example again. The stationary path in this case would be exactly what we are looking for. Given all the functions and the associated times taken, the path with the minimum time is the stationary one. &lt;/p&gt;
&lt;p&gt;&lt;em&gt;In a general mathematical context:&lt;/em&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Let $S[y]$ be a functional that maps functions $y$ that satisfy $y(a) = A$ and $y(b) = B$ to the real numbers. Any such function y(x) for which $$d(S[y + εg])/dε = 0$$ for all functions $g(x)$ that satisfy $g(a) = g(b) = 0$, is said to be a stationary path of $S$.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;What this means practically is that a stationary path can also be the path which takes the most time; in our case, we are looking for the minimizing one. (For the simplest example, the arc-length functional, the stationary condition works out to y' = constant, i.e. a straight line.)&lt;/p&gt;
&lt;p&gt;So how do we find this minimal path? Well, long story short, we end up with an equation called the Euler-Lagrange equation:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;For a functional $S[y] = \int_a^b L(x, y, y')\,dx$ with integrand $L$:
$$\frac{\partial L}{\partial y} - \frac{d}{dx} \left( \frac{\partial L}{\partial y'} \right) = 0$$&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;That weird sign is the partial derivative. Basically, if you have an expression with multiple variables and have to differentiate it, the partial derivative is the derivative taken with respect to one selected variable, treating the others as constants.&lt;/p&gt;
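&lt;p&gt;A quick numerical illustration of the idea, using a made-up function and a finite-difference approximation (this is a sketch of the concept, not the formal definition):&lt;/p&gt;

```python
# Partial derivatives, numerically: bump ONE variable by a tiny step h
# while holding the others fixed. The function f below is a made-up example.
def partial(f, args, i, h=1e-6):
    """Forward-difference approximation of the partial derivative
    of f with respect to its i-th argument."""
    bumped = list(args)
    bumped[i] += h
    return (f(*bumped) - f(*args)) / h

f = lambda x, y: x ** 2 * y       # f(x, y) = x^2 * y

print(partial(f, (3.0, 2.0), 0))  # df/dx = 2xy, about 12
print(partial(f, (3.0, 2.0), 1))  # df/dy = x^2, about 9
```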
&lt;p&gt;For our brachistochrone problem, as it turns out, when you solve this equation, you get a path called the cycloid. A cycloid is the curve traced by a point on the rim of a circular wheel as the wheel rolls along a straight line without slipping.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/backprop-2-physics/cycloid.png" alt="Description" style="max-width: 80%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;This is what ends up being the path which takes the minimal amount of time. Here's a video that shows it in action: &lt;/p&gt;
&lt;div&gt;
&lt;iframe width="560" height="315" src="https://www.youtube.com/embed/li-an5VUrIA?si=xWdCU-brMpIP_60m" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen&gt;&lt;/iframe&gt;
&lt;/div&gt;

&lt;p&gt;&lt;br&gt;&lt;/p&gt;
&lt;p&gt;Pretty solid intuition: the ball needs to accelerate towards point B without falling straight down to the ground, because if it falls straight down, it makes no progress towards B. The cycloid balances gaining speed early against covering horizontal distance. &lt;/p&gt;
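&lt;p&gt;We can check this numerically. Below is a minimal sketch of my own (assuming a rolling-circle radius of 1 and g = 9.81): sample a path as a list of points, use conservation of energy to get the speed at each depth, and add up the segment-by-segment travel times. The cycloid should come out faster than the straight chord between the same endpoints.&lt;/p&gt;

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def descent_time(points):
    """Approximate sliding time along a sampled path for a frictionless
    bead released from rest at the first point. y is measured downward,
    so the speed at depth y is sqrt(2*g*y) by conservation of energy."""
    t = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        ds = math.hypot(x1 - x0, y1 - y0)        # segment length
        v = math.sqrt(2 * g * 0.5 * (y0 + y1))   # speed at segment midpoint depth
        t += ds / v
    return t

r, n = 1.0, 20000
thetas = [math.pi * i / n for i in range(1, n + 1)]
# Cycloid from (0, 0) down to (pi*r, 2*r), traced by a rolling circle of radius r
cycloid = [(0.0, 0.0)] + [(r * (u - math.sin(u)), r * (1 - math.cos(u))) for u in thetas]
# Straight chord between the same two endpoints
line = [(0.0, 0.0)] + [(math.pi * r * i / n, 2 * r * i / n) for i in range(1, n + 1)]

print(descent_time(cycloid))  # about 1.00 s (the exact answer is pi * sqrt(r/g))
print(descent_time(line))     # about 1.19 s -- the "shortest" path loses
```

&lt;p&gt;The cycloid's time matches the closed-form value $\pi\sqrt{r/g}$ closely, and the straight line is measurably slower.&lt;/p&gt;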
&lt;p&gt;Now, there are alternative explanations here. This is also the path that follows the "principle of least action", i.e. it can be derived using the laws of conservation of energy. Bernoulli's own solution connected this problem to Fermat's principle of least time and the law of refraction. On finding his solution, Bernoulli said:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;In this way I have solved at one stroke two important problems – an optical and a mechanical one – and have achieved more than I demanded from others: I have shown that the two problems, taken from entirely separate fields of mathematics, have the same character.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;And so we get to the root. Functionals, as it turns out, can be used to solve all kinds of different problems. How light travels in different mediums. How a film of soap spreads. And, of course, you can find one in a neural network. As explained &lt;a href="https://blog.skaup.co/neural-networks-and-lisp-part-1.html"&gt;before&lt;/a&gt;, the training data is a set of points (i.e. x and y values), and each candidate function has a corresponding value of the loss function of your choice. For any given function, you associate it with this loss value. This is a kind of functional. If you minimize it (aka, find the stationary path for it), you have the "model", i.e. the approximation that best meets our criteria. &lt;/p&gt;
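&lt;p&gt;Here is a tiny sketch of this loss-as-a-functional idea (the data and the candidate functions are made up): the map below takes a whole function and returns a single real number, and training is just a search for the candidate that makes this number smallest.&lt;/p&gt;

```python
import math

# Made-up training data: samples of y = sin(x) on [0, 1]
data = [(i / 10, math.sin(i / 10)) for i in range(11)]

def mse(f):
    """A functional: it eats a whole function f and returns one real
    number, the mean squared error of f on the training data."""
    return sum((f(x) - y) ** 2 for x, y in data) / len(data)

candidates = {
    "zero":     lambda x: 0.0,
    "half":     lambda x: 0.5 * x,
    "identity": lambda x: x,        # good near 0, since sin(x) is about x there
}
for name, f in sorted(candidates.items(), key=lambda kv: mse(kv[1])):
    print(name, round(mse(f), 4))   # best candidate (lowest loss) printed first
```

&lt;p&gt;A real network does the same thing, except it searches a continuous family of functions parameterised by weights instead of a hand-written list.&lt;/p&gt;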
&lt;p&gt;A &lt;a href="https://lia.disi.unibo.it/Staff/MarcoLippi/publications/ICANN2013a.pdf"&gt;paper&lt;/a&gt; I found also talks about this. It mentions that the only difference in application is that neural networks have discrete weights, whereas the physics methods involve continuous functions. It makes the following comparison of how the Euler-Lagrange equations are used in both problems:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/backprop-2-physics/link-2-analytical.png" alt="Description" style="max-width: 80%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;There are also some things called &lt;a href="https://arxiv.org/abs/1806.07366"&gt;Neural ODEs&lt;/a&gt;, that change discrete network layers to continuous functions. &lt;/p&gt;
&lt;p&gt;LeCun himself recognised &lt;a href="https://new.math.uiuc.edu/MathMLseminar/seminarPapers/LeCunBackprop1988.pdf"&gt;this&lt;/a&gt;. "The concepts are new, if not the algorithm", he said. But I am still NOT satisfied. Okay, you give some theory to something well known. Cool? I have derived some things, sure, but nothing else has happened. How do you make this real? I still found myself unable to grasp at this damn thing. &lt;/p&gt;
&lt;p&gt;How do we make it interesting? Well, I have this skill called designing programs? You all have heard of this? Strange thing. But I can use it to simulate things, and try to model some of this physics business. So, how about we run some simulations to see whether, when we try to minimise the time taken from A to B using a neural network, it gives us the brachistochrone curve? I am not clear on exactly how we would do this, since in a normal approximation we would already know the path with minimum time (i.e. the cycloid). How do we reach it in an open-ended manner? I have no clue. Must try nonetheless.&lt;/p&gt;
&lt;hr&gt;
&lt;h2&gt;References&lt;/h2&gt;
&lt;p&gt;1.&lt;a href="https://webhomes.maths.ed.ac.uk/~v1ranick/papers/wigner.pdf"&gt;The unreasonable effectiveness of mathematics&lt;/a&gt; &lt;br&gt;
2.&lt;a href="https://www.google.com/url?sa=t&amp;amp;source=web&amp;amp;rct=j&amp;amp;opi=89978449&amp;amp;url=https://www.open.edu/openlearn/ocw/mod/resource/view.php%3Fid%3D72745&amp;amp;ved=2ahUKEwjujbLy9ZaSAxWaXmwGHRHCKJAQFnoECBsQAQ&amp;amp;usg=AOvVaw06SLc2CzhjxUFqv6E8WwgR"&gt;Introduction to the calculus of variations&lt;/a&gt; &lt;br&gt;
3.&lt;a href="https://mecheng.iisc.ac.in/suresh/me256/GalileoBP.pdf"&gt;Bernoulli solution&lt;/a&gt; &lt;br&gt;
4.&lt;a href="https://arxiv.org/abs/1806.07366"&gt;Neural ODEs&lt;/a&gt;&lt;/p&gt;</content><category term="The-Middle"/><category term="neural-networks"/><category term="backpropagation"/><category term="physics"/></entry><entry><title>Put some theory on it</title><link href="https://blog.skaup.co/put-some-theory-on-it.html" rel="alternate"/><published>2025-12-27T10:20:00-08:00</published><updated>2025-12-27T10:20:00-08:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2025-12-27:/put-some-theory-on-it.html</id><summary type="html">&lt;p&gt;&lt;em&gt;OR - How I learned to stop worrying and love all the Origins of Backpropagation.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;If you read about machine learning history, even a bit, you will see a very interesting fight come about. To cut to it, in the 70s and 80s, there was a clash (and to some extent …&lt;/p&gt;</summary><content type="html">&lt;p&gt;&lt;em&gt;OR - How I learned to stop worrying and love all the Origins of Backpropagation.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;If you read about machine learning history, even a bit, you will see a very interesting fight come about. To cut to it, in the 70s and 80s, there was a clash (and to some extent, that clash will ALWAYS be there) between the efficacy of rule based systems (your standard Prolog, the &lt;a href="https://web.njit.edu/~ronkowit/eliza.html"&gt;ELIZA chatbot&lt;/a&gt;) and more non-deterministic systems that claimed to probabilistically "learn" from "experience". Now these non-deterministic systems have many origins - for example, the work of Marvin Minsky, rooted in biology. At least, that is what your standard undergrad-level ML textbook goes into.&lt;/p&gt;
&lt;p&gt;But the PARTICULAR fight in the 70s and 80s was amongst the people who were literally making it possible to have more probabilistic systems. The famous paper &lt;a href="https://www.iro.umontreal.ca/~vincentp/ift3395/lectures/backprop_old.pdf"&gt;Learning representations&lt;/a&gt; popularised the approach of "backpropagation" - in essence, something that combines gradient descent with some fancy (or basic, depending on how you look at it) chain rule derivations. It suddenly became very plausible that we could have more stochastic systems. I won't go into it, but it basically boils down to being able to compute gradients much more easily, thus making the gradient part of gradient descent tractable - and, you know, all the C-3PO magic that follows.&lt;/p&gt;
&lt;p&gt;However, it is astonishing to me how much even the people involved with popularising said technique (word used deliberately) were not fans of it. Geoffrey Hinton, one of the co-authors of the paper, who spent years working on the technique, was like .. hmm this is cool, but it seems to converge for larger samples (or overfits, I guess, in simple words) - or they would themselves turn back around and say oh, this is not as elegant as such-and-such approach. He himself went back to working on something called Boltzmann machines - and then had to FAIL, repeatedly fail actually, to come back to this curious backpropagation business. A very good article that goes into the details of this is &lt;a href="https://yuxi.ml/essays/posts/backstory-of-backpropagation/"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;So they ditch this backpropagation business for a while in the 70s and come back to it in the 80s. It becomes a good approach, and I suppose (I am only guessing) Yann LeCun and co are accused of plagiarism. There was a trippy guy in the 70s (&lt;a href="https://www.werbos.com"&gt;Paul Werbos&lt;/a&gt;) who followed a very similar approach. He came up with it by thinking of Freudian theories. Bonkers. Yann LeCun himself published a &lt;a href="https://new.math.uiuc.edu/MathMLseminar/seminarPapers/LeCunBackprop1988.pdf"&gt;paper&lt;/a&gt; a year after the breakthrough one saying --- ahh, this is where you might have seen it before. And he talks about all the Operational Research guys who did similar work, talks about Lagrangian systems.&lt;/p&gt;
&lt;p&gt;And so it becomes clear that this isn't ONE thing. It's an old approach, applied to a new context. In fact, while discussing it with my mentor Asokan Pichai, he told me the approach goes as far back as NEWTON. I thought we were talking at least 20th century here; how far back does this go?&lt;/p&gt;
&lt;p&gt;Turns out, there is something called the Calculus of Variations, which I have been doing some cursory reading about - it is basically the process of finding an optimal path, kind of. The original problem was: what path should a rolling ball take from one point to another, so that it may get there as quickly as possible? It was a &lt;a href="https://en.wikipedia.org/wiki/Brachistochrone_curve"&gt;famous problem&lt;/a&gt; in mathematics. Bernoulli sent out a challenge to mathematicians to solve it. When Newton anonymously submitted a solution, he was found out. Bernoulli said that he had &lt;em&gt;"recognised the lion by its claws"&lt;/em&gt;. Incredible.&lt;/p&gt;
&lt;p&gt;You might think it's a straight line, but with other forces at play (plus it's a ball, it's gotta roll), the actual path is something called a cycloid, which is the curve traced by a point on the rim of a circle as it rolls along a straight line. But the calculations applied in this case can be applied in any case where you associate a function (in this case, the path of the ball from A to B) with a particular value (in this case, the time taken) - this is called a functional. Finding the optimal path (shortest or longest, as it may be) is called finding the stationary path of the functional. &lt;/p&gt;
&lt;p&gt;It can be applied to finding the path taken by a ray of light through a medium to get from point A to B, to the surface area of a soap film in water, to - if you can guess - the function associated with the minimum prediction error for a given set of values. So PHYSICS is now connected to this as well. But really, what I figured out while reading about this stuff is that you can take whatever theory and retrofit it onto anything. The origin doesn't really matter; it matters for scientific investigation and credit. In real life, what matters is that the approach WORKED in practice. What is interesting here, however, is that we have somehow covered subjects from Biology and Physics to Psychotherapy. To look at the past is interesting because we can see the many different ways we used to come at the same thing. And what other ways could there be? Perhaps a connection is in there, waiting to be re-discovered. More details in the next part.&lt;/p&gt;
&lt;hr&gt;
&lt;h2&gt;References&lt;/h2&gt;
&lt;p&gt;1.&lt;a href="https://web.njit.edu/~ronkowit/eliza.html"&gt;Eliza Chatbot - One of the first Chatbots&lt;/a&gt; &lt;br&gt;
2.&lt;a href="https://www.iro.umontreal.ca/~vincentp/ift3395/lectures/backprop_old.pdf"&gt;Learning Representations 1986 Paper&lt;/a&gt; &lt;br&gt;
3.&lt;a href="https://yuxi.ml/essays/posts/backstory-of-backpropagation/"&gt;The Backstory of Backpropagation&lt;/a&gt; &lt;br&gt;
4.&lt;a href="https://new.math.uiuc.edu/MathMLseminar/seminarPapers/LeCunBackprop1988.pdf"&gt;A theoretical framework for back-propagation&lt;/a&gt; &lt;br&gt;
5.&lt;a href="https://www.werbos.com"&gt;Paul Werbos's fun website&lt;/a&gt; &lt;br&gt;
6.&lt;a href="https://en.wikipedia.org/wiki/Brachistochrone_curve"&gt;Brachistochrone curve problem history&lt;/a&gt;&lt;/p&gt;</content><category term="The-Middle"/><category term="neural-networks"/><category term="backpropagation"/><category term="science"/></entry><entry><title>Delivering Baby Oil</title><link href="https://blog.skaup.co/delivering-baby-oil.html" rel="alternate"/><published>2025-12-11T21:00:00-08:00</published><updated>2025-12-11T21:00:00-08:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2025-12-11:/delivering-baby-oil.html</id><summary type="html">&lt;p&gt;A few years ago my family hosted a baby shower for one of my aunts. It was a lovely ceremony, and I, for once, actually figured out the names of some of my relatives.&lt;/p&gt;
&lt;p&gt;A few months later, I am sitting at home, relaxing, zoning out.  It’s a lazy …&lt;/p&gt;</summary><content type="html">&lt;p&gt;A few years ago my family hosted a baby shower for one of my aunts. It was a lovely ceremony, and I, for once, actually figured out the names of some of my relatives.&lt;/p&gt;
&lt;p&gt;A few months later, I am sitting at home, relaxing, zoning out.  It’s a lazy afternoon, and my mom gets a call. &lt;/p&gt;
&lt;p&gt;Since she’s in the washroom, I pick up the call. The name looks familiar. There’s an old woman’s voice on the other end. &lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nv"&gt;PERSON&lt;/span&gt;:&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;Hello&lt;/span&gt;?&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="ss"&gt;(&lt;/span&gt;&lt;span class="k"&gt;Pause&lt;/span&gt;&lt;span class="ss"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;Yes&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;we&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;have&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;delivered&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;baby&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;oil&lt;/span&gt;.
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Baby oil, you might ask? Isn’t that strange? But a cousin who lived with us worked in fashion. There would often be small clothing samples, or pieces of cloth in the delivered packages at home. My Brain, in its infinite glory, assumed this was something related to that. So I go:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;    ME: Okay. Nice day.
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;And I cut the call. &lt;/p&gt;
&lt;p&gt;A little while later, my mother comes out of the washroom. As she is drying her hair, she looks at her phone and asks me.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nv"&gt;MUMMY&lt;/span&gt;:&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;Saachi&lt;/span&gt;,&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;did&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;someone&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;call&lt;/span&gt;?

&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nv"&gt;ME&lt;/span&gt;:&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;Yes&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;Mummy&lt;/span&gt;.

&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nv"&gt;MUMMY&lt;/span&gt;:&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;X&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;Called&lt;/span&gt;,&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;what&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;was&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;it&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;for&lt;/span&gt;?

&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nv"&gt;ME&lt;/span&gt;:&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;Oh&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;nothing&lt;/span&gt;,&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;they&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;are&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;delivering&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;baby&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;oil&lt;/span&gt;.

&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nv"&gt;MUMMY&lt;/span&gt;:&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;HUH&lt;/span&gt;?&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;What&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;nonsense&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;are&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;you&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;saying&lt;/span&gt;?
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;At this point, I have also figured out something is wrong. So I quietly slip away, hoping to avoid this mess altogether. I can hear my mother in the room, talking to the person on the phone in a hushed voice. Whatever it is, it's not going to go well for yours truly.&lt;/p&gt;
&lt;p&gt;She then enters the room and looks me dead in the eye:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;    MUMMY: Saachi, They have delivered a baby GIRL.
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;I never even congratulated the woman who called us. Just cut the call on her. So it goes.&lt;/p&gt;</content><category term="Personal"/><category term="story"/></entry><entry><title>Functional Ribbon Revised</title><link href="https://blog.skaup.co/functional-ribbon-revised.html" rel="alternate"/><published>2025-12-04T11:22:00-08:00</published><updated>2025-12-04T11:22:00-08:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2025-12-04:/functional-ribbon-revised.html</id><summary type="html">&lt;p&gt;Once in a while, when I do DSA problems, I try to do them using functional programming. Haskell is really good for some of these problems; they become trivial with it. But input and output do have to be handled, and it becomes a pain to do so when you …&lt;/p&gt;</summary><content type="html">&lt;p&gt;Once in a while, when I do DSA problems, I try to do them using functional programming. Haskell is really good for some of these problems; they become trivial with it. But input and output do have to be handled, and it becomes a pain to do so when you can’t even get the input test case on your standard competitive programming platform. So I tried some Scala once, and I found its ecosystem, at least for such problems, to be good enough. &lt;/p&gt;
&lt;p&gt;The problem goes: You are given a matrix, and you must write out its “spiral” form i.e take elements as if you were wrapping a ribbon to the centre of the matrix and lie them out flat.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/functional-ribbon/problem-matrix.png" alt="Description" style="max-width: 40%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;So this matrix’s ribbon would be [1,2,3,6,9,8,7,4,5]&lt;/p&gt;
&lt;p&gt;This seemed to me to be a clear functional programming problem. I suppose an imperative approach would help here, but when I visualised the problem, with a little snake coiling through the matrix, taking it apart piece by piece, I &lt;em&gt;knew&lt;/em&gt; it had to be done functionally.&lt;/p&gt;
&lt;p&gt;I suppose there is another way to think about it. Combinatorics. Take the size of the matrix, make pairs of (0..number of columns), then (1.. number of rows), then reverse it again, by doing (col-1.. 0) and (row - 1 to 1). This is fine, but it is basically the same ribbon approach expressed in a different way.&lt;/p&gt;
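&lt;p&gt;To make the index-pair idea concrete, here is a small sketch in Python (the &lt;code&gt;spiral_coords&lt;/code&gt; name and the exact edge-walking are my own framing, not code from elsewhere in this post): build the spiral purely as a list of (row, col) index pairs out of ranges, then read the matrix off at those positions.&lt;/p&gt;

```python
def spiral_coords(rows, cols):
    # Index-pair version of the ribbon: pair row 0 with every column,
    # then each remaining row with the last column, walk back along the
    # bottom edge and up the left edge, then recurse on the inner frame.
    rows, cols = max(rows, 0), max(cols, 0)
    if rows == 0 or cols == 0:
        return []
    top = [(0, c) for c in range(cols)]
    right = [(r, cols - 1) for r in range(1, rows)]
    bottom = [] if rows == 1 else [(rows - 1, c) for c in range(cols - 2, -1, -1)]
    left = [] if cols == 1 else [(r, 0) for r in range(rows - 2, 0, -1)]
    # shift the inner layer's coordinates down-right by one
    inner = [(r + 1, c + 1) for (r, c) in spiral_coords(rows - 2, cols - 2)]
    return top + right + bottom + left + inner

matrix = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print([matrix[r][c] for (r, c) in spiral_coords(3, 3)])  # prints [1, 2, 3, 6, 9, 8, 7, 4, 5]
```

&lt;p&gt;Reading the matrix through these coordinates gives the same ribbon as before; it just trades the transpose-and-reverse dance for index bookkeeping.&lt;/p&gt;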
&lt;p&gt;The POINT is to think about the underlying structure of the problem. Let's say there is a snake coiling through the box. The snake picks up the first row it can see, turns to the right, and then repeats the same process. How do you express this programmatically? You can get the first row's elements easily, but what about the turning part?&lt;/p&gt;
&lt;p&gt;Well, you can flip the remaining matrix (i.e., what is left after the first row is removed) along its diagonal. The fancy eighth grade maths term for this is transpose. &lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/functional-ribbon/basic-transpose.png" alt="Description" style="max-width: 60%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Transpose of a matrix - first step&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;This allows us to have access to the columns of the original matrix as rows now, so we can perform the same operation of just picking up the first row in front of us. But, one more hitch. Picking up the first row when you flip the matrix is not enough. The row we want is actually the last one, so we once again flip it along the middle. Even fancier term for this operation is reverse.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/functional-ribbon/operation-sequence.png" alt="Description" style="max-width: 60%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Example of operation sequence - performed once&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;The sequence is to take the first row, then transpose the remaining matrix, reverse it, and repeat until you run out of elements. You can be extra clever about it and not reverse the matrix each time. Instead, you can take the last row only, and then rinse and repeat. But this means that the way you do the first operation is different from the way you do the remaining operations. And while that is fine, it's not really what I would ideally like to do. Again, snake approach - just consume the first row in front of you. &lt;/p&gt;
&lt;p&gt;Initially, I also thought this could be done with a scan or a fold operation. Basically, take the elements and accumulate them somehow. But that didn’t work either, because those approaches iterate through the initial list and consider each element one by one. Here, I had to consider MULTIPLE parts from different elements in the list. And I had to come back to some elements. Unless I transposed the list and passed it to the next element. In which case I was not doing a scan, because I would have to repeat that same process again.&lt;/p&gt;
&lt;p&gt;Here is the Python code for it:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;def ribbon_recursive(matrix):  
    if len(matrix) == 0:  
        return []  
    top_row = matrix[0]  
    if len(matrix[1:]) == 0:   
        return top_row  
    return top_row + ribbon_recursive(transpose(matrix[1: ])[:: -1]) 
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;This is the basic thing. Find the top, then for the remaining part, transpose and reverse it and continue.&lt;/p&gt;
&lt;p&gt;This is the haskell solution:  &lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;ribbon matrix = map (head) $ iterate (reverse . transpose . tail) matrix 
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;One line. Classic.&lt;/p&gt;
&lt;p&gt;It’s not perfect. It gives a pesky tail exception I am too bored to chase down. I fixed the problem in Scala by taking the elements only while I actually could.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;Iterator.iterate(matrix)(_.tail.transpose.reverse).takeWhile(_.size &amp;gt; 0).map(_.head).flatten
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;And then, after talking to some smarter functional programming people, I had to confront my laziness (HA) and fixed it to this -&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;ribbon matrix = map (head) $ takeWhile ((&amp;gt; 0) . length) $ iterate (reverse . transpose . tail) matrix       
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Alternatively, instead of using the tail method, we can simply take the last row of the transposed matrix each time, and then do our transformations on the top part of the matrix that remains. I thought this could be done without using the reverse operation (hence it would be more "efficient"). But alas, the efficiency Gods have betrayed me, since I still needed that operation to maintain the order of the elements picked up. And the first row would have to be dealt with separately. Here is a recursive definition of the same -&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/functional-ribbon/alternate-sequence.png" alt="Description" style="max-width: 60%; height: auto;"&gt;&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;ribbon&amp;#39; [] = []
ribbon&amp;#39; ((x:xs):[]) = x:xs
ribbon&amp;#39; matrix = last matrix ++ (ribbon&amp;#39; $ transpose $ reverse $ init matrix)

ribbon matrix = head matrix ++ (ribbon&amp;#39; $ transpose $ tail matrix)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Again, this approach sidesteps the "reverse" in the first call, but then the operation must still be done to ensure the "turn right" part. Personally, this looks less neat.&lt;/p&gt;
&lt;p&gt;Now, I want to be clear, you are not going to win any efficiency points with this. But my goal is not efficiency as an abstract but to approach it from a functional angle. And it is easier, once you start thinking functionally to apply your solution to all functional programming languages. I learnt how to think like this in Haskell first. It’s what I revert to when I have to “think” of the problem, and then it becomes a matter of finding equivalent functions in other languages. Now, there are some language specific things that cannot be translated. They are the features that fundamentally change from one language to another. But still, I find for these trivial problems, some portability is quite easy to achieve.&lt;/p&gt;</content><category term="Technical"/><category term="functional-languages"/><category term="haskell"/><category term="scala"/></entry><entry><title>The Graduate</title><link href="https://blog.skaup.co/the-graduate.html" rel="alternate"/><published>2025-11-23T10:53:00-08:00</published><updated>2025-11-23T10:53:00-08:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2025-11-23:/the-graduate.html</id><summary type="html">&lt;p&gt;An old one again. Had written this a few years ago, when I graduated from the WE program. It was a women engineer's program by (at the time) NSE TalentSprint and Google. It was shutdown last year, since DEI initiatives went out of style. But the program had a very …&lt;/p&gt;</summary><content type="html">&lt;p&gt;An old one again. Had written this a few years ago, when I graduated from the WE program. It was a women engineer's program by (at the time) NSE TalentSprint and Google. It was shutdown last year, since DEI initiatives went out of style. But the program had a very positive impact on my life, and the lives of many others.
And the fact that it even &lt;em&gt;was&lt;/em&gt; at some point a real force is a good thing. So, onto the good thing. &lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;A few months ago, I graduated from the WE Program. To celebrate this, we were all called to Hyderabad for the ceremony. Though excited at the prospect of travelling, I was still bone deep sad that the program was ending. I really don’t know where the two years went, between early morning sessions, group discussions and last minute presentations. I was particularly not prepared for this, since I was the only one from the cohort attending optional classes meant for the next cohort. I was doing the most I could to stretch the experience for as long as I could, even artificially. It was because I was deathly afraid of who I would be without the program. &lt;/p&gt;
&lt;p&gt;Asokan once jokingly called his early morning sessions a drug. But it’s true. They were actually that good. Towards the end, the structure became extremely free flowing. Our machine learning classes had pretty much run out of material, when Asokan had the idea to ask us to present on the topic. After that, it was pretty much a changed class. Since we weren’t restricted in what we could present on, the topics covered an extremely broad range. Matrix multiplication, quantum mechanics, first person RPG shooters, you name it. And it was a thrill. &lt;/p&gt;
&lt;p&gt;In the back of my head, to know I would now lose out on such experiences made me so sad, I did not even process it. Kept focusing on how excited I was to travel to Hyderabad and have some great Osmania biscuits again. I was basically deluding myself into thinking it wasn’t really over, since we had a handful of Machine Learning sessions remaining. But life has its ways. I reached Hyderabad, sleep deprived and excited. &lt;/p&gt;
&lt;p&gt;Our graduation ceremony started with Spoorthy hosting. She is our cohort’s program manager and has been guiding us for two years now. She was as charming and captivating as the first day I had spoken to her. All of our mentors made a few speeches, including the program managers from Google. Then the real kick came. Spoorthy mentioned that some of us would be called to speak on stage, at random. 
There are, I think, many parts in all of us, often conflicting. There is a part of me that loves the spotlight. I love making grand speeches that I imagine will dazzle everyone. The keyword here is imagine. In reality, I often fumble, misstep and am incapable of expressing my thoughts clearly. Still, I hoped to speak, but did not really expect it would happen. Besides, it was more than enough to listen to the others. Sincerely wanting to hear their stories, I decided to settle in for the afternoon, happy to just cheer them along.&lt;/p&gt;
&lt;p&gt;My friend Kanchan was called to speak. She got up from her seat and almost leaped to the front. She’s a great dancer too, which really shows in her body movement. She came, and spoke clearly, eloquently and passionately. A bright smile on her face the whole time, even as she described the troubles she faced. It was so good that I was fully convinced that it was pre-planned. As the thought settled in me, I thought oh, I haven’t been asked to speak. The disappointment seeped in for a few seconds, but then I realised that I was more than happy to listen. &lt;/p&gt;
&lt;p&gt;Big mistake. This was, as a matter of fact, completely wrong. After a few more of my friends spoke, Spoorthy mentioned the previous article I wrote and called me to the stage. I was so stunned and unprepared, I got up from my seat and started fiddling with my purse and jacket. Somehow I made it to the stage and just knew this wasn’t going to end well. Already the tears had been bubbling up, and now I was completely vulnerable. But I wanted to say something, so I decided to just push through. It took me a few moments to collect myself. A lot of awkward silence and loud mic breaths. Some words came out about how much the program meant to me, how much I was not prepared to leave. Spoke about what it meant to me that I had such a good support network for these two years. That before the program, I thought life followed one track only. But I have now learnt that you do not have to be limited to the one road visible to you. Different, better things are possible. There is always more road ahead. &lt;/p&gt;
&lt;p&gt;The tears took over at this point, and I ran to the washroom. Composed myself to the best of my abilities and stepped out again. When I came back,  my friends were all smiling at me. They patted me on the back and graciously (thankfully) offered me some tissue papers. That was needed, really needed, throughout the afternoon. The main pipe had been broken, and there was no patching it soon. Thankfully many more people spoke much more coherently after.&lt;/p&gt;
&lt;p&gt;It was wonderful to hear from every one of my cohort mates, almost all of whom have incredible stories to tell. One person wanted to be an astronaut, and has now done research work in the domain, even gone to Paris for a conference. One person can speak 12 languages. Almost all of them have internships at huge companies such as Google, Microsoft and other giants. Everyone had a different reason for loving being in the program, from the sense of community to the constant mentorship. That’s how you know it’s good. A few people came up after to say that they also started crying when I did. Instantly, I thought they were lying to console me. Either way, it was a nice thing to say.&lt;/p&gt;
&lt;p&gt;To reframe my speech a bit, I would like to say that it is actually now that I think that my path in life is predetermined. I was a very wide eyed kid that thought she could chart out a unique path for herself. That was before college. Now, I see everyone following a path, and I get in line much quicker than I would expect. The program did thankfully give me some space to try different things. The mentors compelled me to compose myself. Their presence demanded it from me. It’s as if disjoint parts of myself came together to form a single line. And it is precisely that good influence that I will miss. But the fact that I felt the influence at all, perhaps that is good enough.&lt;/p&gt;</content><category term="Personal"/><category term="programming"/><category term="community"/><category term="mentors"/></entry><entry><title>LLM (mis)Adventures</title><link href="https://blog.skaup.co/llm-misadventures.html" rel="alternate"/><published>2025-11-09T10:04:00-08:00</published><updated>2025-11-09T10:04:00-08:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2025-11-09:/llm-misadventures.html</id><summary type="html">&lt;p&gt;Ever have a requirement so simple, it does not even occur to you that it would stump you. And then days later, when you are stuck in the trenches running the same test for what feels like the fiftieth time, because it simply does not work goddamn it, you pray …&lt;/p&gt;</summary><content type="html">&lt;p&gt;Ever have a requirement so simple, it does not even occur to you that it would stump you. And then days later, when you are stuck in the trenches running the same test for what feels like the fiftieth time, because it simply does not work goddamn it, you pray for any god to rescue you from this nightmare.&lt;/p&gt;
&lt;p&gt;No? Just me. Cool. Well, regardless, I was running up against one of those. The requirement was simple. I had to spin up a server and get some packages running. Nothing I haven't done before. Didn't even think twice about it. &lt;/p&gt;
&lt;p&gt;But when I actually did end up running the server, the package manager commands were ridiculously slow. This, despite the fact that I had a server with pretty much the same configurations working in the same environment. Started and restarted the server a million times. Killed it and recreated it a few times. No dice. &lt;/p&gt;
&lt;p&gt;So I decided to go to my best friend. The friend of the underconfident, young and hopeless. GPT. A lot of this article is to maintain a good internet footprint also, so when the AI Gods take over, they must know I have been very nice. Regardless, I turned to it, because this is the kind of stuff it's good at. With low contextual information, since there is no legacy code that it has to keep in mind. Just a dumb little server. Should be easy, right?&lt;/p&gt;
&lt;p&gt;After I gave it all the info, it asked me to run some commands. Test this endpoint, run this curl command, check this directory. I did all of it, and saw that in my non-working server, only one thing seemed different from the working one. There was a certificate-missing error in the non-working one that wasn't present in the working one.&lt;/p&gt;
&lt;p&gt;I told this to my best friend and got the standard response. "That's right! That does seem to be the issue. Good catch! If your certificate is missing, your package manager commands will be slow, since it can't communicate with the endpoint."&lt;/p&gt;
&lt;p&gt;Nice. Flattery will get you everywhere, my friend. I believed this. It seemed perfect. Sometimes, while mindlessly using LLMs, I almost feel like it is controlling me, rather than the other way around. That usually ends with pretty terrible, unusable code. But this felt like I was actually working with the tool at hand. Not as much in control as I'd like to be. But hey, it's getting the job done, right? Right?&lt;/p&gt;
&lt;p&gt;So I started looking around for the root of this missing certificate. I didn't even stop to ask why a machine with almost the same config would have a missing certificate. Chalked it up to the version of the packages on the non-working machine being newer, since the working one had been spun up much earlier.&lt;/p&gt;
&lt;p&gt;But alas, I couldn't really pin it down. Where could this mysterious certificate be, in the thousands of directories on this machine? How would I even begin looking? I asked the beloved GPT again, and it gave a few commands I could try to figure this out. But oh, nothing of substance came of this investigation. And I was at the stage with GPT where the questions and answers were circular. I didn't have any new information to give other than "not working". It couldn't help me out either.&lt;/p&gt;
&lt;p&gt;I was on the verge of giving up. But then in the same environment, I found a recent server working perfectly well. Someone else must have made it. I looked around for the differences, comparing the config files.&lt;/p&gt;
&lt;p&gt;Dear Reader, it was a RAM issue.&lt;/p&gt;
&lt;p&gt;Hadn't assigned enough RAM to the dumb machine. Certificate-shmertificate. My 3000-word-long chat with GPT was for NOTHING. Hours of hazy, general bewilderment and frustration, for nothing. I felt so incredibly stupid. But now I must step back a bit. And have a bird's eye view of the whole situation, w.r.t. our usage of LLMs.&lt;/p&gt;
&lt;p&gt;A few very smart people have argued that LLMs are actually better for senior engineers. It is ONLY if you KNOW what you are looking for that they can help. Used for the general investigation of an aimless junior engineer, they will lead to exactly this kind of waste. This has some very troubling implications. The main one is that even when we think we are in control, we really aren't. I thought I was using the LLM "better than others". I didn't ask it to free-fall read my mind or simply passively take the suggestions without testing things out. &lt;/p&gt;
&lt;p&gt;But still, while using LLMs, it was like the state of my brain became a bit... groggy. I was more willing to go along with a solution I didn't fully grasp, because it felt like the best (and not coincidentally, laziest) way out. It became easy to ignore small nagging doubts like WHY is this happening? WHY is the supposed certificate missing after just one upgrade, especially since this issue is mentioned nowhere else on the net? It was easier to go along with its story, since mine felt even more incomplete. At least it seemed confident in its answer. When have I been able to say that for my hunches?&lt;/p&gt;
&lt;p&gt;To get along, you must go along, of course. The new age is here. Steroids are everywhere. We must learn, as many before us have had to learn, that things are about to get weird. For "knowledge" work, this time around. I'd love to end this with a nice little "be vigilant" message, but I know that is impossible. The tool incentivises you, especially people at my level of experience, to tune out. And we have been trained our whole lives to tune out. It's not easy. Of course, a little presence of mind goes a long way. But the way to cultivate that long term? Reach out if you figure that one out. I will pay you.&lt;/p&gt;</content><category term="Technical"/><category term="llms"/><category term="linux"/><category term="software"/></entry><entry><title>Jayanti I</title><link href="https://blog.skaup.co/jayanti-i.html" rel="alternate"/><published>2025-11-02T00:00:00-07:00</published><updated>2025-11-02T00:00:00-07:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2025-11-02:/jayanti-i.html</id><summary type="html">&lt;p&gt;My grandmother from my father's side (doda), Jayanti Irvathur died almost two years ago. She had sustained a slip in the bathroom a few years before. She could walk thereafter, but always with some help. And soon, her feet started giving out on her. For three years, she had been …&lt;/p&gt;</summary><content type="html">&lt;p&gt;My grandmother from my father's side (doda), Jayanti Irvathur died almost two years ago. She had sustained a slip in the bathroom a few years before. She could walk thereafter, but always with some help. And soon, her feet started giving out on her. For three years, she had been bedridden. Slowly, with the exactness of a knife, time took her from us. But she was a person before all that. A vibrant, strict, caring person. When I was younger, my parents sometimes left me at her place during summer vacations. She was always concerned about my cousins. 
They had to eat on time and come up when they were called, and playing for too long usually resulted in a frown. My sister, who has won softball championships, at the time sported a no less magnificent boy cut. Doda hated this. And she didn’t keep it to herself either, always the Indian grandmother. There were constant comments. But my sister obviously didn’t care. Neither did anyone else. Though when she finally did grow out her hair, the comments did not really stop. They were still about how terrible that old cut was. &lt;/p&gt;
&lt;p&gt;She made great maggi. It felt more like a wholesome chicken soup than stale maggi. I loved having it when I was sick. She used to make cakes too - beautiful bun cakes made in special aluminium pans. She was one of the first older women I knew who made such dishes. And since she was South Indian, she made great goddamn fish. There was always dahi prepped in her home, because her sweet child (my father) loved it. &lt;/p&gt;
&lt;p&gt;When she was younger, she studied in a convent school. She completed her education till the twelfth grade, a rarity for her time. And she was in her school’s volleyball team. I found the report card one day, Jayanti Irvathur, volleyball team. She raised her children to value education. And although she might have preferred if they all just stayed inside and studied, she was proud of other things too. She kept the first trophy her kids got. My dad, at a rabbit race when he was two years old. Either that, or that trophy is fake. Later on, all of them were always involved in some sports activity or the other too. &lt;/p&gt;
&lt;p&gt;She lived a disciplined life, her and my grandfather. They took regular walks, either to the market or to the nearby garden. They provided all that they could for their kids and then their grandchildren. But they had fun too. She had a bright, beautiful smile. I saw it in some pictures recently. Her long face made her seem a bit rigid, but the wrinkled smile brought you right back.&lt;/p&gt;
&lt;p&gt;When we went to celebrate her birthdays, since my dad has other brothers, there were always two or more cakes. Surprisingly, my grandparents never had health problems before their very late years. No diabetes, no blood pressure, no cholesterol. Nothing. Zilch. Nada. The consequence of a disciplined life. But they loved those cakes regardless.&lt;/p&gt;
&lt;p&gt;When she fell, I think my grandmother changed a lot. She had lived a relatively stable life till then. She would always complain of her terrible luck. How she never wished old age problems on others. How the doctor told her to exercise, and she did, but nothing ever came of it. And there were periods when she seemed better. But really, the years took a toll on her after my grandfather died. She never came to terms with it. After a while, she stopped mentioning him too.&lt;/p&gt;
&lt;p&gt;But she was a person. It’s difficult to remember. I have to try and remind myself she was a person. Bright smile, sari-loving, volleyball team member. She wrote in English as well as Kannada. She took my lessons too sometimes, when I was younger. She made great daal rice. She kept a poster of a kid in her cupboard, and then let me take it home years later. She kept a pouch with the crispest notes you have ever seen, wrapped in a handkerchief next to her bed. She often whispered to me to bring “the pouch” and discreetly gave me a fifty rupee note, trying to hide it from the servants. She liked going through old photos. There’s still a picture up of a drawing my sister made on her bed. Her kids are the only things she remembered, even in the last few weeks. How her middle one was the studious, strict one and yet the youngest was her favourite. And she was a real, flawed person here on this planet, before the sickness made her more or less unrecognisable. &lt;/p&gt;
&lt;p&gt;The movie Inside Out is about emotions and memories. In it, different memories are coloured differently, based on the emotions the character experienced. Happy memories are yellow orbs, sad ones are blue, and so on. For most of the movie, each memory is only one colour, since the main character is a kid. But in the end there is a scene where the main character realises that memories she thought were only happy had some sadness to them. The memory orb, typically a single colour, comes out as a kaleidoscopic mix of yellow and blue. It’s about how, when we grow up, we must realise that our memories are not one thing only. They are multicoloured. It’s a kids’ movie, but sometimes common lessons must be drilled into adults. It’s easy to feel our memories of a person are only one thing. Let your memories be multicoloured.&lt;/p&gt;
&lt;p&gt;We were both a part of the same Women Engineers program. I graduated two years ago, and she was just about to graduate. She mentioned a project that they had made - something they were about to present. Very casual, nothing much, y'know. Having seen her make many presentations in the program myself, I wasn't too nervous. She could handle this.&lt;/p&gt;
&lt;p&gt;The next day, I asked her, hey, how did it go? She texted me back with a GitHub link. We made this - see. I open up the &lt;a href="https://wearcade.vercel.app"&gt;link&lt;/a&gt; on my phone, and suddenly, I am hit with a funky 80s Atari themed screen. Is this what you have been hiding from me?&lt;/p&gt;
&lt;p&gt;What is this? I decide to open up my laptop and see this thing as it should be seen. And as I go through it, it seems they have made MULTIPLE games. One for each of our mentors in the program. They have leaderboards, mini-games, and side quests. The design quality was impeccable, unified across all the games by the retro 80s theme. Gorgeous backgrounds, attention paid to small details like the expressions of avatars.&lt;/p&gt;
&lt;p&gt;Personal details included in the questions asked. In-jokes through the years. Sayings and aphorisms we learnt in the program, little anecdotes. Sound effects for the various catch-phrases our mentors have. And these were so fun. When I couldn't move up a level, I begged Nidhi to give me one of the answers for a quiz. So it goes.&lt;/p&gt;
&lt;p&gt;It is a testament to the mentors that this quality of work comes about. 103 people worked on this. They are 20-year-olds with 50 other things on their plates. Internships to worry about, grades to manage, parents to deal with. And yet, despite all the troubles, they managed to do this? Groups of 5 are difficult enough to manage, how on earth did they pull this off?&lt;/p&gt;
&lt;p&gt;It's not exactly a product, but rather a tribute. The programming tasks were easy enough to manage. You can find templates, and groups of 4-5 people can whip up a game in a week or two. But they created a hierarchy. Kunisha - our mentor, the 'Quackmaster'. Two people, Nidhi and Anwesha, as the 'Supreme Duckies'. Beneath them, the Head Duckies. And beneath them, the Deputy Duckies, who worked as team leads for individual or multiple games. There were specialist Duckies to manage the individual parts - Technology Stack, Design Decisions and Story Elements.&lt;/p&gt;
&lt;p&gt;They would do progress reports each week. They provided medals of appreciation to people who contributed regularly. People talked mostly to their leads. And the individual groups were given enough freedom to make whatever they wanted, keeping in mind the theme and tech stack limits. They mostly made games they enjoyed. And that's it. &lt;/p&gt;
&lt;p&gt;This is extremely good organisational ability. Serious grown-ups in serious organisations have trouble co-ordinating between teams in this manner. To organise a group of even 10 people is impressive. Usually, for projects like these - with none of the usual incentives of either better grades, or impressive stipends, people tend to back out and projects fizzle out. But these people felt so strongly that they had to pay a tribute to their teachers, that they made it work. &lt;/p&gt;
&lt;p&gt;I have known some of them since they were wee second-year college kids. And it made me so proud to see this. I cannot imagine what it must have been to be a mentor and see this. In their position, I would have slept the most peaceful sleep that night. Knowing, conclusively, that they had done good in the world. It's simple, really. Give people the challenge and combine it with their inner conviction. If you want to build a ship, don’t drum up the men to gather wood, divide the work, and give orders. Instead, teach them to yearn for the vast and endless sea.&lt;/p&gt;</content><category term="Personal"/><category term="programming"/><category term="community"/><category term="mentors"/></entry><entry><title>A Streetcar Named Presenting</title><link href="https://blog.skaup.co/a-streetcar-named-presenting.html" rel="alternate"/><published>2025-10-19T10:25:00-07:00</published><updated>2025-10-19T10:25:00-07:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2025-10-19:/a-streetcar-named-presenting.html</id><summary type="html">&lt;p&gt;First, let us get the &lt;em&gt;cough-cough&lt;/em&gt; stuff out of the way. I wrote this article two years ago. It was written during my first presentation abroad. The US of A that too, the big guns, literally. So I was nervous, of course. It was the first time …&lt;/p&gt;</summary><content type="html">&lt;p&gt;First, let us get the &lt;em&gt;cough-cough&lt;/em&gt; stuff out of the way. I wrote this article two years ago. It was written during my first presentation abroad. The US of A that too, the big guns, literally. So I was nervous, of course. It was the first time I would be entirely on my own in a foreign country. It inspired some fun and angsty thoughts, which I wrote down. At the time, I was told this was too personal. I agree, this isn't exactly LinkedIn post material. 
But it is still the honest experience of a homesick person, trying to grasp something entirely new around her. So, this is what that came to.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;Recently, the Summer of Haskell program accepted my project proposal. Haskell is a purely functional programming language. Functions are the primary way to write programs in it. The project was to create mandala visualisations for Tidal Cycles, a piece of live-coding music software. Haskell tracks mathematics closely, bringing some nice compact laws with it. It is well suited to transforming music, graphics, and many other interconnected things.&lt;/p&gt;
&lt;p&gt;This year the International Conference on Functional Programming took place in Seattle. For contributing to the program, my invite arrived a month before the set date. The timeline seemed impossible. Visas in India are hard to get, usually taking months. But if you have nothing to lose, you can try every option. The visa arrived on the last day before I had to leave. When they found out, my parents were thrilled. Full of joy stirred with laughter that couldn't believe itself.&lt;/p&gt;
&lt;p&gt;There was just a tiny problem. Presentation is storytelling. This is why you act like an alcoholic writer through most of it, wondering what the hell was wrong with you. The first few attempts were ridiculous, full of details that no one cared about. It would have grown to a hundred slides if not for the help of my friends and mentors. But I kept thinking, who takes weeks for 15-odd slides?&lt;/p&gt;
&lt;p&gt;Technical presentations are usually full of bold claims, bored attendees and nonsensical lying. On similar lines, I also gave flustered speeches with confusing demos. After a point, I had to try something different. There was a lot of procrastination and head-banging against the wall. Finally, I thought “All you have to try and do is make the next best presentation you can make.” So that is what I did.&lt;/p&gt;
&lt;p&gt;I packed and repacked the bags and headed out. As I flew above the Pacific Northwest of the US, what struck me the most were the trees. The country needs the woods for its freshwater. Witch potion ingredients for names. Douglas-Fir, Western hemlock and red cedar. Stout little triangles from the sky bent like punished kids. The only reference I had was the tween romance series Twilight. In it, sleek vampires play baseball and drink animal blood only, because they are vegetarian. One could easily imagine slightly modernised, but ritualistic sacrifice in those woods. Perhaps I was next in line. &lt;/p&gt;
&lt;p&gt;Up until reaching the hotel, the only people I spoke to were my parents. Hello, how are you? Yes, I am settled, boarded, and had food. Yes, Yes, Yes. But as I spoke with the clerks and the guards, a terrible thing sank in. These people don’t understand what I am saying. Too much of my Indian accent slipped in, without ever asking for permission. &lt;/p&gt;
&lt;p&gt;This wasn’t in the plan. I attended a proper school, scored well in English papers and earned fourth place in speaking competitions. I was confident I’d be able to handle myself anywhere. This happened to other people. Classmates who never made the effort, teachers from various small towns, who I made fun of. But, as punishment, things always come full circle. I had to make a presentation, for God’s sake. And here I was, repeating every phrase to the waiter to get food. Another delusional person, thinking the accent won’t matter in a strange land.&lt;/p&gt;
&lt;p&gt;Going to the convention floor didn’t help either. I knew no one there. Still, I attended a few talks. And although there were many new things to learn, I could not ask anyone a question.  It was like your first day in a new workplace. You are nervous and eager to impress. But the bosses chain you to a chair and discuss company problems. You work in the same domain, so you pick up bits and pieces. But this is way above your pay grade. All you can do is look at them in confusion and awe.  &lt;/p&gt;
&lt;p&gt;In hindsight, it makes sense. People were talking to their professional colleagues at a research conference. The newbie, hoping to ask questions but too hungry and jet-lagged to try, won’t get very far. Or maybe that's an excuse. After a while, it got too much, so I left once the talks were over.&lt;/p&gt;
&lt;p&gt;That evening, I stepped out for a walk. There were many e-bike stands along the way. Ninety per cent of e-bikes were whooshing past pedestrians. And the signs along the sidewalk had one instruction. DO NOT RIDE ON THE SIDEWALK. &lt;/p&gt;
&lt;p&gt;Seattle hosts an inlet of the Pacific Ocean that drains into its lakes. From parts of the city, you can see the flat white top of Mount Rainier. Next to the Ferris wheel on the waterfront, backed by the land across the inlet, stands Pike Place Market. A hustling and bustling place. Tourists wait in line in front of the first Starbucks. Posters promise Edgar Allan Poe shows, played by a man who looks more moustache than man. And then an actual fish market.&lt;/p&gt;
&lt;p&gt;Hoping to hear a familiar sound, I walked towards the water. People took pictures of the Ferris wheel in a quiet spot. I joined them. In the distance, the ships rusted and drifted away. Large machines shuffled next to piers and towering cranes. Though invisible from that point, Mount Rainier loomed behind. I wondered if people from a hundred years ago felt the same calm. Things will go on, long after you are gone.&lt;/p&gt;
&lt;p&gt;Right before the evening could set, I reached the hotel. This place had everything. There was a dimly lit bar full of candles. Friend groups of vacationing retirees. Tech employees out for a dinner night and well-dressed bartenders. Taking all this in, alone, I went back to the room. Lights were out by 8 PM.&lt;/p&gt;
&lt;p&gt;The next day, people walked into the conference room, waiting for the coffee to settle in. Important researchers and people from companies like DeepMind and Jane Street presented. This time around, I was glad to be there. Even being in their presence was good enough. My sister and her partner, who came to support me from California, met me soon after. It was very kind of them. They said they’d join me later, and I left to attend some more talks.  &lt;/p&gt;
&lt;p&gt;Lunch had a confusing start. As I revolved around some tables, a kind man helped me out. “That’s what these things are for, talking to people you don’t know.” That made the afternoon easier. I spoke to people who worked in big companies and small ones. Then I left to prepare some more.&lt;/p&gt;
&lt;p&gt;Soon, my presentation started. I gave some context to explain the project. Then something unreal happened. There were some nodding, genuine faces. They seemed interested, with no hidden agenda. People in the crowd looked me in the eye. There was a shared space between us. During the demo, my audio failed. It was easy to brush off though. “It makes music with functions”, I said to cover up. That received an understanding laugh. &lt;/p&gt;
&lt;p&gt;There are a few rare life moments that you hold on to. These people were here because they loved programming with functional languages. The work is difficult because the applications are difficult to explain to others. But they wanted to see and make something beautiful. I offered the small piece I could. It worked. I completed this by conveying my love for the language. While stepping down, I heard real claps. &lt;/p&gt;
&lt;p&gt;Later on, while exploring Seattle with my cousins, this would make me less lonely. Revisit Pike Place Market. Try handmade cheese sticks and have some bad pasta. Talk to overly kind strangers. From a distance, on the hilltop near downtown, look at the volcano imposed over the city’s skyline. Take pictures from the Space Needle against the glass window, afraid for my life. And then after talking and laughing and trying new food, the trip would be over.&lt;/p&gt;
&lt;p&gt;But for now, reality was still waiting. People came and told me they liked my presentation. Showed me similar work and discussed the challenges. They spoke to me about Mumbai! Yet, it was still difficult to approach some people. There was nothing to say because I had done my job. I packed my stuff and headed out for a better Seattle adventure. And I left, a little less afraid than before.&lt;/p&gt;
&lt;hr&gt;
&lt;h2&gt;References&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://www.youtube.com/live/Rvs88m2mdig?si=5rgjP8HA5Is_5c4B&amp;amp;t=25694"&gt;The Presentation&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://tidalcycles.org/blog/blog_topic_mandalas/"&gt;Article Explaining Summer Of Haskell Work&lt;/a&gt; &lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;</content><category term="Personal"/><category term="presentations"/><category term="functional-programming"/><category term="travel"/></entry><entry><title>Neural Networks and Lisp - Part 2</title><link href="https://blog.skaup.co/neural-networks-and-lisp-part-2.html" rel="alternate"/><published>2025-10-11T11:31:00-07:00</published><updated>2025-10-11T11:31:00-07:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2025-10-11:/neural-networks-and-lisp-part-2.html</id><summary type="html">&lt;p&gt;P.S This article assumes some familiarity with functional programming, knowing lisp syntax helps. But I have tried to explain the best I can regardless. I hope if you’re from a general programming background, you will understand this.&lt;/p&gt;
&lt;p&gt;Now the functional part. First, a major part of this code …&lt;/p&gt;</summary><content type="html">&lt;p&gt;P.S This article assumes some familiarity with functional programming, knowing lisp syntax helps. But I have tried to explain the best I can regardless. I hope if you’re from a general programming background, you will understand this.&lt;/p&gt;
&lt;p&gt;Now the functional part. First, a major part of this code is a direct translation of the work done in the napkin math article. The premise is simple - if we want to build a “function approximator”, let’s try it on a very basic case: the average function.&lt;/p&gt;
&lt;p&gt;As an example - you have four squares, each with a colour between white and black. We can represent each square as a number: if the square is white, it is 0; if it is black, it is 1; and any shade of grey in between is a real number between those two. To estimate the colour of the overall grid, you take the average of the four numbers.&lt;/p&gt;
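&lt;p&gt;As a tiny self-contained sketch of that idea (the shade values and names here are made up for illustration, not taken from the article’s code):&lt;/p&gt;

```lisp
; Four squares as numbers: 0.0 is white, 1.0 is black,
; anything in between is a shade of grey. Sample values are made up.
(defparameter *shades* (list 0.0 1.0 0.5 0.3))

; The overall colour of the grid is the plain average of the four numbers.
(defun shade-average (numbers)
  (/ (reduce #'+ numbers) (length numbers)))

(shade-average *shades*) ; roughly 0.45
```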
&lt;p&gt;Again, the napkin math article explains it way better, I highly encourage you to check it out. All I have done is a lispy rewrite of it, so I will focus on that. However, I did run into one little hiccup. Will get to that.&lt;/p&gt;
&lt;p&gt;First, in order to approximate the average function - we can try one thing. Given the four square numbers, we can multiply each by one of four “weights” and then add those products up. For example, if the weights are 1 1 1 1, the products leave the numbers unchanged and we just get their sum. But, if somehow, from any random starting point, our little function approximator could get its weights close to 0.25, 0.25, 0.25, 0.25, then it would act like the average function.&lt;/p&gt;
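&lt;p&gt;The weighted-sum idea above can be sketched on its own (the function name here is hypothetical, not from the article’s code):&lt;/p&gt;

```lisp
; Multiply each input by its weight, then add the products up.
(defun weighted-sum (inputs weights)
  (reduce #'+ (mapcar #'* inputs weights)))

; With weights 1 1 1 1 the products leave the inputs unchanged,
; so the result is just their sum.
(weighted-sum (list 0.2 0.5 0.4 0.7) (list 1 1 1 1))             ; roughly 1.8
; With weights of 0.25 each, the weighted sum is exactly the average.
(weighted-sum (list 0.2 0.5 0.4 0.7) (list 0.25 0.25 0.25 0.25)) ; roughly 0.45
```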
&lt;p&gt;So, initialization of global variables first. Typically, I’d define global variables with ALL CAPS. I like how scary this makes global variables. But lisp uses this &lt;code&gt;*style*&lt;/code&gt; for them, so I will stick with it. When in Rome. &lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;(defparameter &lt;span class="gs"&gt;*input-layer*&lt;/span&gt; &amp;#39;(0.2 0.5 0.4 0.7))
(defparameter &lt;span class="gs"&gt;*hidden-weights*&lt;/span&gt; &amp;#39;(0.98 -0.08 0.27 0.01))


&lt;span class="c"&gt;; To Ignore, will become relevant later&lt;/span&gt;
(defparameter &lt;span class="gs"&gt;*threshold*&lt;/span&gt; 0.00005)
(defparameter &lt;span class="gs"&gt;*dx*&lt;/span&gt; 1e-4)
(defparameter &lt;span class="gs"&gt;*learning-rate*&lt;/span&gt; 0.1)
(defparameter &lt;span class="gs"&gt;*tolerance*&lt;/span&gt; 1e-6)
(defparameter &lt;span class="gs"&gt;*max-steps*&lt;/span&gt; 1000)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Now, we get to the main part. Closures. So we can define our neuron as follows&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;defun&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;make-neuron&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;weights&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;&amp;amp;optional&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;bias&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="s"&gt;&amp;quot;Creates a neural network node as a closure with weights and optional bias&amp;quot;&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;let&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nv"&gt;w&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;copy-list&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;weights&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="w"&gt;        &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;b&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;bias&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;lambda&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;inputs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;      &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;+&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;b&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;reduce&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nf"&gt;#&amp;#39;&lt;/span&gt;&lt;span class="nb"&gt;+&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;mapcar&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nf"&gt;#&amp;#39;&lt;/span&gt;&lt;span class="nb"&gt;*&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;inputs&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;w&lt;/span&gt;&lt;span class="p"&gt;)))))&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;The copy-list part copies the weights passed to the generator function, so each closure keeps its own private list (strictly speaking, copy-list makes a shallow copy, which is enough for a flat list of numbers). And what’s that over there in the last line? That takes the input numbers given and does exactly what is mentioned above -&amp;gt; a sum of the products of each input with its weight. Again, closures have some object-like properties but are functions. So what do we get with this?&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;(defparameter &lt;span class="gs"&gt;*average-weights*&lt;/span&gt; &amp;#39;(0.25 0.25 0.25 0.25)) ; For testing only
(defparameter &lt;span class="gs"&gt;*average-neuron*&lt;/span&gt; (make-neuron &lt;span class="gs"&gt;*average-weights*&lt;/span&gt;))

&lt;span class="c"&gt;; To call the same&lt;/span&gt;
(funcall &lt;span class="gs"&gt;*average-neuron*&lt;/span&gt; *input-layer*)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;That gets you the average. Pretty cool, right? No need to do any finagling with the methods of objects - for now. But this is for testing only; we want our hidden weights to converge to that value.&lt;/p&gt;
&lt;p&gt;To do that, i.e. train the model, I need to generate a bunch of x and y pairs. So I will do that. &lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;(let ((rectangle (list (float (random 1.0))
                           (float (random 1.0))
                           (float (random 1.0))
                           (float (random 1.0)))))
      (setq &lt;span class="gs"&gt;*rectangles*&lt;/span&gt; (append &lt;span class="gs"&gt;*rectangles*&lt;/span&gt; (list rectangle)))
      (setq &lt;span class="gs"&gt;*rectangle-averages*&lt;/span&gt; (append &lt;span class="gs"&gt;*rectangle-averages*&lt;/span&gt; (list (/ (apply #&amp;#39;+ rectangle)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;In this case, each x is a tuple of four numbers between 0 and 1, as mentioned before. And y is of course the corresponding average.&lt;/p&gt;
&lt;p&gt;Now, when I generate a y in my model, I need to check how close it is to the actual average of the four given numbers. And I will hand-wave over why this is the case, because the napkin math article does a better job of it. For our purposes, this is our holy grail loss function. &lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;defun&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;mean-squared-error&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;actual&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;expected&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="s"&gt;&amp;quot;Calculates the Mean Squared Error between two lists of numbers&amp;quot;&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;let*&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nv"&gt;squared-errors&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;mapcar&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;lambda&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;a&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;b&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;expt&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;a&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;b&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;actual&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;expected&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="w"&gt;         &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;sum-of-errors&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;apply&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nf"&gt;#&amp;#39;&lt;/span&gt;&lt;span class="nb"&gt;+&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;squared-errors&lt;/span&gt;&lt;span class="p"&gt;)))&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;/&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;sum-of-errors&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;length&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;actual&lt;/span&gt;&lt;span class="p"&gt;))))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Now, if I had to make a VERY basic model, using my flawed hidden weights for now, I would simply multiply the values of each rectangle with the hidden weights, and compare the results using the loss function.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;defun&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;model&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;rectangle&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;hidden-weights&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="s"&gt;&amp;quot;A model function that computes the dot product of a rectangle&amp;#39;s inputs an&lt;/span&gt;
&lt;span class="s"&gt;d a hidden layer.&amp;quot;&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;reduce&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nf"&gt;#&amp;#39;&lt;/span&gt;&lt;span class="nb"&gt;+&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;mapcar&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nf"&gt;#&amp;#39;&lt;/span&gt;&lt;span class="nb"&gt;*&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;rectangle&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;hidden-weights&lt;/span&gt;&lt;span class="p"&gt;)))&lt;/span&gt;

&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;defun&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;train&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;rectangles&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;hidden-weights&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="s"&gt;&amp;quot;Applies the model function to each rectangle in a list, returning a list of outputs.&amp;quot;&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;mapcar&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;lambda&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;rectangle&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;model&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;rectangle&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;hidden-weights&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;rectangles&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Then we do the “training”&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;setq&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;simple&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;train&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;outputs&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;train&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="nx"&gt;rectangles&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="nx"&gt;hidden&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;weights&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;setq&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;simple&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;train&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;mean&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;squared&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;simple&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;train&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;outputs&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="nx"&gt;rectangle&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;averages&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;So that does a pretty bad job, naturally, since the weights aren’t actually a tuple of four 0.25s. It gives some output between 0 and 1, but it’s not the average.&lt;/p&gt;
&lt;p&gt;I was in a DSA class once, and the instructor asked us for a solution. So I raised my Zoom hand and answered. While I was met with some encouragement, I also got a cool response:
“Saachi is a bit of a violent person. Her approach is to hit the problem with a hammer.” So, to get to the hammer approach. One way to get closer to (0.25, 0.25, 0.25, 0.25) is to simply generate a bunch of random weight tuples and compare each one’s output to our target. And once the loss goes below a threshold, call it a day. So, I tried that as well.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;defun&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;randomised_train&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;rectangles&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;hidden-weights&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;rectangle-averages&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;&amp;amp;optional&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;threshold&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="vg"&gt;*threshold*&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="s"&gt;&amp;quot;Applies the model function to each rectangle in a list, returning a list of outputs.&amp;quot;&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;let&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nv"&gt;outputs&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;train&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;rectangles&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;hidden-weights&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;
&lt;span class="w"&gt;       &lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;      &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;progn&lt;/span&gt;
&lt;span class="w"&gt;        &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;while&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;&amp;gt;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;mean-squared-error&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;outputs&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;rectangle-averages&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;threshold&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;         &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;setq&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;hidden-weights&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;random_layer&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="w"&gt;         &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;setq&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;outputs&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;train&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;rectangles&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;hidden-weights&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="w"&gt;        &lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;        &lt;/span&gt;&lt;span class="nv"&gt;hidden-weights&lt;/span&gt;
&lt;span class="w"&gt;     &lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;This works pretty well too, since the function in this case is fairly simple. 
But still, we can’t generalise this to any kind of function we want to approximate. This won’t work for recognising your local doctor’s handwriting, let’s say that. So, we go to our friend from last time, gradient descent.&lt;/p&gt;
&lt;p&gt;Now this is where I reasonably diverge from the napkin-math article. The article uses PyTorch for automatic differentiation to find the slope of the loss curve. While that is nice, I wanted to make it even easier for me to implement this. So I chose to use finite differences. The thing you reluctantly learnt in your eighth grade class. This part was quite tough for me to implement, but then I came across a wonderful book by Paul Orland called &lt;a href="https://www.manning.com/books/math-for-programmers"&gt;Math For Programmers&lt;/a&gt;. Thank you to O’Reilly / Manning. That book does a Python version of this. Then I had to translate it to lisp.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;defun&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;secant-slope&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;f&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;xmin&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;xmax&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="s"&gt;&amp;quot;Calculate slope using secant line approximation&amp;quot;&lt;/span&gt;
&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;/&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;funcall&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;f&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;xmax&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;funcall&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;f&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;xmin&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;
&lt;span class="w"&gt;   &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;xmax&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;xmin&lt;/span&gt;&lt;span class="p"&gt;)))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;This is &lt;code&gt;(y2-y1)/(x2-x1)&lt;/code&gt;. Again, eighth grade, slope of a line. For any given function. Just beautiful. Blew my mind when I saw it in the book. Just clicked immediately. If you ever doubt whether there is &lt;em&gt;the&lt;/em&gt; higher order, just remember at the very least, these higher orders exist.&lt;/p&gt;
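&lt;p&gt;To see it in action, here is a quick Python sketch of the same secant idea (the function and the example are mine, not from the Lisp codebase). A narrow secant around a point approximates the derivative there:&lt;/p&gt;

```python
def secant_slope(f, xmin, xmax):
    """Slope of the secant line through (xmin, f(xmin)) and (xmax, f(xmax))."""
    return (f(xmax) - f(xmin)) / (xmax - xmin)

# For f(x) = x^2, the true derivative at x = 2 is 4.
approx = secant_slope(lambda x: x * x, 1.999, 2.001)  # very close to 4.0
```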
&lt;p&gt;Once this is done, some plumbing is needed to actually use it for our tuples. This is done within the &lt;code&gt;approx-gradient&lt;/code&gt; function. But then, we can do the following&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;defun&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;update-parameters&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;v&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;grad&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;&amp;amp;optional&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;learning-rate&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="vg"&gt;*learning-rate*&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s"&gt;&amp;quot;Update parameters using gradient: v - learning_rate * grad&amp;quot;&lt;/span&gt;
&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;mapcar&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;lambda&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;vi&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;dvi&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;-&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;vi&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;*&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;learning-rate&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;dvi&lt;/span&gt;&lt;span class="p"&gt;)))&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;v&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;grad&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;defun&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;gradient-descent&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;f&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;vstart&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;&amp;amp;key&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;tolerance&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="vg"&gt;*tolerance*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;max-steps&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="vg"&gt;*max-steps*&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s"&gt;&amp;quot;Perform gradient descent to minimize function f&amp;quot;&lt;/span&gt;
&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;labels&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nv"&gt;gd-step&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;v&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;steps&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;            &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;let&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nv"&gt;grad&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;approx-gradient&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;f&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;v&lt;/span&gt;&lt;span class="p"&gt;)))&lt;/span&gt;
&lt;span class="w"&gt;              &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;if&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;or&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;&amp;lt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;vector-length&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;grad&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;tolerance&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="w"&gt;                      &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;&amp;gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;steps&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;max-steps&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="w"&gt;                                     &lt;/span&gt;&lt;span class="nv"&gt;v&lt;/span&gt;
&lt;span class="w"&gt;                    &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;gd-step&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;update-parameters&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;v&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;grad&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;1+&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;steps&lt;/span&gt;&lt;span class="p"&gt;))))))&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="c1"&gt;; Recursive call&lt;/span&gt;
&lt;span class="w"&gt;   &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;gd-step&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;vstart&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="c1"&gt;; calls it with initial starting point of 0 steps&lt;/span&gt;
&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Again, that’s it. That’s gradient descent. You might be curious about that vector-length function. It’s just a way to determine if you’re close to your slope being flat (i.e. at the minimum loss point). So, if you’ve either crossed the max-steps or reached a flat slope, congrats, you’re in the game.&lt;/p&gt;
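&lt;p&gt;For anyone who wants the loop spelled out, here is a minimal Python sketch of the same idea - a finite-difference gradient, the parameter update, and the “flat slope or out of steps” stopping rule. The names and the toy function are mine, not from the codebase:&lt;/p&gt;

```python
import math

def approx_gradient(f, v, h=1e-5):
    # Finite-difference partial derivative of f along each coordinate of v.
    grad = []
    for i in range(len(v)):
        bumped = list(v)
        bumped[i] += h
        grad.append((f(bumped) - f(v)) / h)
    return grad

def vector_length(v):
    # Euclidean norm of the gradient; near zero means the slope is nearly flat.
    return math.sqrt(sum(x * x for x in v))

def gradient_descent(f, vstart, learning_rate=0.1, tolerance=1e-6, max_steps=10000):
    v = list(vstart)
    for _ in range(max_steps):
        grad = approx_gradient(f, v)
        if tolerance >= vector_length(grad):  # flat enough: stop
            break
        v = [vi - learning_rate * dvi for vi, dvi in zip(v, grad)]
    return v

# Minimise (x - 3)^2 + (y + 1)^2, whose minimum sits at (3, -1).
minimum = gradient_descent(lambda p: (p[0] - 3) ** 2 + (p[1] + 1) ** 2, [0.0, 0.0])
```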
&lt;p&gt;Nice, some more plumbing is done in a &lt;code&gt;gradient-train&lt;/code&gt; function, pretty similar to the other ones, and then we can simply do - &lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;(gradient-train *rectangles* *hidden-weights* *rectangle-averages*)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;When this worked (0.25000513 0.2499971 0.24999955 0.24999803) (the most recent output) - I was on a plane. I did a little whoop of happiness. Made my damn day. You can find the entire code &lt;a href="https://gitlab.com/trix3/neuralnetworkslisp"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;So, right. That’s it for technical stuff. But there is part 2 of this story. I recently presented this to a very nice group of people at a &lt;a href="https://hasgeek.com/fpindia/bangalore-fp-october-2025-meetup/"&gt;functional programming meetup&lt;/a&gt;. But tragically, I don’t think I did a very good job. I was very nervous, since I hadn’t made a presentation outside of work in a while. I had been practicing, but still forgot to pause before speaking, asked a bunch of open-ended questions to a very small audience (a mistake I have made before), and did not even do a quick check of how many people were actually familiar with the machine learning terms I would be using. I gave too abstract of a demo to a technical audience. I will attach the slides, and a demo I made later. &lt;/p&gt;
&lt;p&gt;But still, I am glad I failed. The people there, despite my mistakes, were thoughtful enough to ask for clarifications, and make requests for what they would have liked. That has allowed me to write this article. I am genuinely grateful for their kindness. &lt;/p&gt;
&lt;hr&gt;
&lt;h2&gt;References&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;&lt;a href="https://gitlab.com/trix3/neuralnetworkslisp"&gt;Codebase&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://gitlab.com/trix3/presentation-lisp-neural-networks"&gt;Presentation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/orlandpm/Math-for-Programmers/tree/master/Chapter%2015"&gt;Math For Programmers&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;</content><category term="Technical"/><category term="neural-networks"/><category term="lisp"/><category term="functional-programming"/></entry><entry><title>Neural Networks and Lisp - Part 1</title><link href="https://blog.skaup.co/neural-networks-and-lisp-part-1.html" rel="alternate"/><published>2025-10-07T10:21:00-07:00</published><updated>2025-10-07T10:21:00-07:00</updated><author><name>skaup</name></author><id>tag:blog.skaup.co,2025-10-07:/neural-networks-and-lisp-part-1.html</id><summary type="html">&lt;p&gt;The first few “artificial intelligence” models that we knew of came in the late 50s to early 60s. They were essentially rule based engines. Symbolic computation was involved that allowed for mimicking intelligence. And lisp was the hot language of choice for this work. These systems were non deterministic to …&lt;/p&gt;</summary><content type="html">&lt;p&gt;The first few “artificial intelligence” models that we knew of came in the late 50s to early 60s. They were essentially rule based engines. Symbolic computation was involved that allowed for mimicking intelligence. And lisp was the hot language of choice for this work. These systems were non deterministic to a degree, but in all, we were stuffing rules such as “Brad likes apples” into machines and hoping for the best. It was a sham.&lt;/p&gt;
&lt;p&gt;Then neural networks came along, which made the concept of stuffing rule based engines redundant. And while that made many of the “symbolic computation” crowd evaporate slowly, I thought it would be a fun exercise to try and make neural networks with lisp. Something about the dead coming back to haunt you. &lt;/p&gt;
&lt;p&gt;But really, this is about functional programming. I must say, to the extent that I have worked on it, I found lisp powerful. But even I haven’t grasped its power fully. This is a bit half baked. But I am not here to kid around about functional programming. Almost all of the “neural network from scratch” tutorials - or really anything-from-scratch tutorials - I have seen end at the last level with a file with some class declarations. And my only question remains the same one I have had forever. Is this really needed?&lt;/p&gt;
&lt;p&gt;The goal was simple - make a neural network from scratch with lisp. See if there were any advantages (or disadvantages) of making such a network? What does it do differently than a standard neural network? Can we learn something about the properties of such networks by designing them in a different manner? I don’t know how many of these questions I have answered, but this is my attempt.&lt;/p&gt;
&lt;p&gt;Let’s go to a simpler question. What is a function? Let’s say you have a set of numbers on one side, (1 2 3), and another on the other side, (2 3 4). What could you say about their relationship? Right, each number on the right is one larger than its corresponding one on the left. That is a function. y = x + 1. At its very basic level, you can have two sets of numbers (real, irrational, all that jazz) and they would be adequate to “describe” a function. &lt;/p&gt;
&lt;p&gt;What is a neural network? Anyone? On asking this question, I could get various answers - “A machine simulation of your brain”  was one. That is partially true I guess. There are biological roots to this field that are always present, one of the more beautiful things about learning this. But really, a neural network is a function approximator.&lt;/p&gt;
&lt;p&gt;So while we have y = x + 1, which can give us the y for any x, what about complicated functions that we cannot represent cleanly? Well, in that case, assuming we have a machine that can be trained as such, we give it a bunch of x’s and a bunch of corresponding y’s. And once that machine is able to give us pretty accurate results for any given x and y pair, we can then in the future, by crossing our fingers, use it to produce the y result for any new x. That’s what happens in everything from handwriting recognition to large language models. It is a function approximator. Fancy word for “ehhh … this produces results that are very close to what a well defined mathematical function would”.&lt;/p&gt;
&lt;p&gt;The point is - a “neural network” is one way to do this. The base block of a neural network is a perceptron or neuron. Fancy word for something that is supposed to be similar to the neuron in that brain of yours. But it gets inputs. It applies some function to them, and gives an output. That function - let’s call it what we called it in 5th grade&lt;/p&gt;
&lt;p&gt;$$y = mx + c$$&lt;/p&gt;
&lt;p&gt;Now, usually we know m and c, which is why it is easy for us to find out y. But we don’t today, so how do we do it?&lt;/p&gt;
&lt;p&gt;Well, we can kind of reverse it. If we have two pairs of x and y points, we can create a system of linear equations and solve it. Then we get m and c. But what if the function is more complicated? Multidimensional. God only knows how dimensional? This approach doesn’t really cover those cases. So, we have to approximate these values. How do we do that?&lt;/p&gt;
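&lt;p&gt;For the two-point case, solving is mechanical. A tiny Python sketch with made-up points that lie on y = 2x + 1:&lt;/p&gt;

```python
def solve_line(p1, p2):
    """Given two (x, y) points, recover m and c in y = m*x + c."""
    (x1, y1), (x2, y2) = p1, p2
    m = (y2 - y1) / (x2 - x1)  # slope between the two points
    c = y1 - m * x1            # intercept by substituting back
    return m, c

m, c = solve_line((1, 3), (2, 5))  # both points lie on y = 2x + 1
```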
&lt;p&gt;You all remember our good old friend linear regression? Anybody fresh out of hell that is a college statistics class? In linear regression, we have a bunch of points. And when we want to find the “best fit line” - we try to take the line which is the closest to all the possible points. So we take all the points, and find the line for which the distance of the points from the line is minimum. That is the best fit line. And that function - the distance one. It’s called the loss function. The more that distance, the greater the chances that you aren’t on the best fit line.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/neural-networks-and-lisp-part-1/linear-regression.png" alt="Description" style="max-width: 80%; height: auto;"&gt;&lt;/p&gt;
&lt;p&gt;Remember again, the point I made, about a bunch of points, and finding something that describes them? Well, that’s what a best fit line is. So all of this is to say that a neural network is simply an overcomplicated linear regression model. Yes, well kind of. There are more things involved, but basically that is what it is. &lt;/p&gt;
&lt;p&gt;But how do we find that line, or that equation? That is the question. Well, one way is to try a bunch of m and c values till you get the distance below a threshold and call it a day. This is a nice approach, and it will save you your sleep at night. But then you will wake up with squiggly lines taunting you, calling you not good enough. So what can you do? Well, one key way to look at it is that loss function. Suppose, for every one of the possible lines we can think of, we calculate the distance of each point from that line. Our line is identified by $y = mx + c$, and we want to find m and c, so we take those two on the two horizontal axes. And then, we put the output of the loss function for each of those pairs on the vertical axis. Well, we have ourselves a little hill here, don’t we?&lt;/p&gt;
&lt;p&gt;Let's go down it, dare I say? But really, what do we want now? We want the point at which this loss function has a minimum value. And when we visualise it, simply rolling a ball down this plane from a bunch of points will give us that. And that is what gradient descent is. It is a way to find the minimum of this hill - so that we can identify an m and c that are appropriate. You find the derivative at a point on the slope, see if it’s going up or down, and keep going down. Do that with multiple starts, until you get to a point you’re pretty comfortable is the actual minimum and not just a random valley you got stuck in.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.skaup.co/images/neural-networks-and-lisp-part-1/gradient-descent-2d.png" alt="Description" style="max-width: 80%; height: auto;"&gt;&lt;/p&gt;
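&lt;p&gt;As a tiny worked example of rolling down the hill (my own Python sketch, not code from this series): take points generated by y = x + 1, treat the mean squared distance as a function of (m, c), and nudge m and c downhill using finite-difference slopes:&lt;/p&gt;

```python
xs = [0.0, 1.0, 2.0, 3.0]
ys = [x + 1 for x in xs]  # data generated by the true line y = x + 1

def loss(m, c):
    # Mean squared distance between the candidate line and the data points.
    return sum((m * x + c - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

m, c, lr, h = 0.0, 0.0, 0.05, 1e-6
for _ in range(5000):
    dm = (loss(m + h, c) - loss(m, c)) / h  # slope of the hill along m
    dc = (loss(m, c + h) - loss(m, c)) / h  # slope of the hill along c
    m -= lr * dm
    c -= lr * dc
# m and c end up close to the true values 1 and 1.
```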
&lt;p&gt;There is a bunch of other stuff involved. There are neurons, with these weights and biases. We have activation functions of neurons - essentially another function application after you find m and c, to further improve the output. You have hidden layers, essentially multiple neurons interacting with each other - which codify the m and c (our friends weights and biases, let’s call them that). You have input layers (x’s in this case). And you have the output layer, your ideal y.&lt;/p&gt;
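&lt;p&gt;Stripped down, a single neuron is just that pipeline: a weighted sum plus a bias, pushed through an activation. A hedged Python sketch (the sigmoid choice and the numbers are illustrative):&lt;/p&gt;

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum (our m*x part) plus bias (our c), squashed by a sigmoid.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-z))

out = neuron([1.0, 2.0], [0.5, -0.25], 0.0)  # z = 0, so the output is 0.5
```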
&lt;p&gt;So you train the neural network - by which you mean, for a very large number of x’s and y’s - you find the best m and c values. And then, after you have verified this, you unleash this into the world and watch it bleed. Now the functional part. What is the advantage of doing this functionally - well, there is a lot of differentiation involved in gradient descent, and higher order functions are your buddies in this. But my main curiosity was piqued when I read this in &lt;a href="https://www.paulgraham.com/onlisp.html"&gt;On Lisp&lt;/a&gt;. &lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;“A combination of a function and a set of variable bindings (at the time it was created) is called a closure.”&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;“Closures have three useful properties: they are active, they have local state, and we can make multiple instances of them. Where could we use multiple copies of active objects with local state? In applications involving networks, among others.”&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Aha - something. Right, networks - methinks I remember looking at some interesting networks.&lt;/p&gt;
&lt;p&gt;In the next part.&lt;/p&gt;
&lt;hr&gt;
&lt;h2&gt;References&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://sirupsen.com/napkin/neural-net"&gt;Neural Networks on a Napkin - Simon Eskildsen&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;</content><category term="Technical"/><category term="neural-networks"/><category term="lisp"/><category term="functional-programming"/></entry></feed>