HTTP 200 OK
Allow: GET, POST, HEAD, OPTIONS
Content-Type: application/json
Vary: Accept
[
{
"id": 18,
"title": "2024 Wrapped and on to 2025",
"slug": "2024-wrapped-and-on-to-2025",
"author": 1,
"updated_on": "2024-12-30T15:35:40.843621Z",
"content": "2024 was a year of transition. My team at work was split in two, with half going to tech sales, and the other half (including me) going into global consulting. This was a bit of a sudden change, and resulted in most of the beginning part of 2024 being dedicated to understanding our new roles. It was unclear what our mandate was at first, what the charter of our team was, our how best we could make an impact. \r\n\r\n<p> After some soul searching, we decided to split the team up into dedicated pillars or focus areas. After a bit of lobbying, I was selected to spearhead our emerging AI strategy, centered on RHEL and Openshift AI, due to my previous experience with AI and data engineering at Healthpilot. This was a pretty significant pivot to what I was doing earlier. I frequently felt lost as I was drinking directly from the fire hydrant of new information and a rapidly evolving field. I needed to keep reminding myself that this was new for basically everyone else in the field as well. Despite how confidant or self assured people sound in meetings, they were likely doing something completely different a few years prior. \r\n\r\n<p>My confidence eventually began to grow as I was able to get a few wins under my belt. I was one of the first people to pass the new AI based exam. I led a successful exploratory engagement with a customer centered around AI (this customer was still likely a ways away from making significant investments targeted in generative AI, but the overall engagement was still broadly a success). I became one of the leading contributors in <a href=\"https://github.com/redhat-composer-ai\">an internal project that serves as a platform for building AI Retrieval Augmented Generation (RAG) based tools for customers.</a> I'm going to be delivering two talks and a demo for an internal conference in February. \r\n\r\n<p> The important thing to remember is this is stuff is new and evolving for everyone else too. Everyone, including senior management, is in uncharted territory. They're basically making it up as they go. They may have referential experience to draw from in the past, like the cloud or mobile phone revolution, but will AI follow a similar track? No one can say for certain. I think the lesson of the \"Move Fast and Break Things\" mantra isn't to LITERALLY break things. Its that its far more effective to try things and learn from what doesn't work than it is to meticulous plan a perfect strategy. Be quick on your feet, and don't get married to ideas. Quickly ditch what isn't working, and be prepared to bet that farm on what looks promising.",
"created_on": "2024-12-30T15:33:26.016251Z",
"status": 1
},
{
"id": 17,
"title": "Making Django Unchained Kubernetes Production Ready",
"slug": "making-django-unchained-kubernetes-production-ready",
"author": 1,
"updated_on": "2023-04-08T17:39:37.932660Z",
"content": "<p>Hi all, as I stated in my last post, I've been working on getting Django Unchained to run on Kubernetes. I was successful in this goal, but it wasn't fully production ready. For instance, it had to use manage.py runserver command instead of running as a Web Server Gateway Interface (WSGI) via <a href=\"https://docs.gunicorn.org/en/stable/\"> Gunicorn.</a> This was, in addition to being horribly inefficient, was a security risk.</p> \r\n\r\n<p>I need to have django run via Gunicorn. In order to do this i needed to offload the static assets somewhere else. Since i was using the Azure Kubernetes Service, the most logical place for this would be azure blob storage!</p>\r\n\r\n<p>So how do I get the static assets into azure blob storage in an automated way? Well first i need to provision the blob storage. This is done via <a href=\"https://github.com/aokugel/aks-terraform/blob/main/main.tf#L58-L76\">my terraform repository.</a></p>\r\n\r\n<p>Next, I needed to <a href=\"https://github.com/aokugel/Django-Blog-Website/blob/blob-storage/blog/custom_azure.py\">add functions in django </a>to be able to upload the static files to azure when the manage.py collectstatic command is run. This is done using a python module called <a href=\"https://django-storages.readthedocs.io/en/latest/\">django-storages.</a></p>\r\n\r\n<p>I needed a way to pass the key of the storage account that gets created by terraform to the python classes. This is done by creating an <a href=\"https://github.com/aokugel/aks-terraform/blob/main/outputs.tf#L46-L49\">output, </a> which is then retrieved via shell script and then passed to kubectl to create a <a href=\"https://github.com/aokugel/aks-terraform/blob/main/operation_deathstar.sh#L14\">secret.</a> Now i don't have to enable debug mode to run django!</p>\r\n\r\n<p>In addition to this, I added functionality to configure a <a href=\"https://github.com/aokugel/django-unchained-kubernetes/blob/main/ingress.yaml#L11-L14\">self signed certificate. </a>With an accompanying secret that is created <a href=\"https://github.com/aokugel/aks-terraform/blob/main/operation_deathstar.sh#L49-L53\">here.</a> This approach does require importing the tls.crt into your browser, but when that's completed, It now has end to end encryption!</p>",
"created_on": "2023-04-07T17:57:32.264753Z",
"status": 1
},
{
"id": 16,
"title": "Containerizing Django Unchained on Kubernetes",
"slug": "containerizing-django-unchained-on-kubernetes",
"author": 1,
"updated_on": "2022-12-12T16:02:59.560720Z",
"content": "Time passes, things change, and evaluation is required. Recently I've been working on containerizing this website. This will result in a more cloud native application that is resilient and scalable. \r\n\r\n<p>First, I need to provision a cluster. I have decided to go with aks, as azure not chagrin for the control plane makes this the most cost effective option. <a href=\"https://github.com/aokugel/aks-terraform\">I provision the cluster using the terraform code here.</a>\r\n\r\n<p>Next comes the kubernetes manifests. \r\n <a href=\"https://github.com/aokugel/django-unchained-kubernetes \">These are represented by this repository.</a> Two deployments for the postgres db and the django application. Two respective services, a configmap, and ingress to route traffic to the services. <a href=\"https://learn.microsoft.com/en-us/azure/aks/azure-disk-csi\">I decided to use the managed-csi storageclass.</a>\r\n\r\n<p><a href=\"https://github.com/aokugel/aks-terraform/blob/main/operation_deathstar.sh\">Tying this all together is a bash script that kicks off everything. </a> I may move this to a github actions pipeline at some point, I’m not sure yet. \r\n\r\n<p>The end result is a completely containerized django unchained website running on aks in azure? I still need to figure out how to get https working on the ingress controller. That will be the next todo.",
"created_on": "2022-12-12T15:56:43.088084Z",
"status": 1
},
{
"id": 15,
"title": "Allow Me to Reintroduce Myself",
"slug": "allow-me-to-reintroduce-myself",
"author": 1,
"updated_on": "2022-10-13T15:26:09.631076Z",
"content": "So disaster struck in the django unchained world. Originally the “About” section of this website was just hardcoded html. This was not in the spirit of the sleek dynamic content that this website soughts to provide. As such, I decided to create a model for the contributors of this website. Originally this would be an orphan table, but that wouldn’t be in the spirit of a RDBMS. So I decided that the Author model would effectively be a subclass of the User model that comes built into django. I would then have a one to one relationship between the users and the authors. \r\n\r\n<p>One to one db relationships have a mandatory foreign key. Because I was creating the author table and it didn’t exist yet, postgres didn’t like this. I ended up compounding a bad decision by messing with postgres directly instead of via migrations…which borked the entire db. It was completely hosed.\r\n\r\n<p>To salvage this situation, I used my api endpoint to take a dump of all posts and comments, and saved them to json files. I then completely nuked everything in the public schema in the postgres database. I deleted my migration file, recreated the migrations from scratch, and migrated them. This fixed the issue. I then used the requests module in python to recreate all of my posts and comments. A side effect of this was that I couldn't manually create a creation date for my posts, which is why everything was created on october 11th. Still, this is a small price to pay for a working database.\r\n\r\n<p>Moral of the story, do not manually mess with postgres. Always use migrations. Always.",
"created_on": "2022-10-13T15:26:09.631097Z",
"status": 1
},
{
"id": 14,
"title": "Using Python to make API calls",
"slug": "using-python-make-api-calls",
"author": 1,
"updated_on": "2022-10-11T10:52:08.360782Z",
"content": "Hello Django nation. I recently had to audit a list of over 600 records. I was given a unique id, a module within our CRM platform, and I needed to make an api call to retrieve information. I decided the best way to this was by using python, with the help of the re and requests modules.\r\n\r\n\r\n<p>The first order of business was to deal with authentication. I registered my own application within the developer console within our CRM application. I was then able to use a refresh token to generate an oauth bearer token. I then created a python dictionary with key being ‘Authorization’ and the value containing my oauth token.\r\n\r\n\r\n<p>I then created a for loop which went through each line in my file, and using regex, grabbed both the record and module. I then passed this to a formatted string containing the url of the api endpoint I was calling. I think took the response, formatted it as a dictionary using the requests module’s .json() method. From there I was able to grab the necessary fields and write them to a txt file.\r\n\r\n\r\n<p>This is where python truly shines. Whipping up a simple script like this doesn’t take much time at all.",
"created_on": "2022-10-11T10:52:08.360803Z",
"status": 1
},
{
"id": 13,
"title": "Event Driven Architecture via AWS EventBridge",
"slug": "event-driven-architecture-aws-eventbridge",
"author": 1,
"updated_on": "2022-10-11T10:52:08.130067Z",
"content": "Hi all. So, part of my responsibility at my new position is to integrate our application with third party applications, specifically our CRM solutions. We’re doing this by using an event driven architecture powered by AWS EventBridge. Say a new user is created within our application, and we need that change to be reflected on our CRM platform. The application fires a UserCreate event, which event bridge subscribes to. Now, Ideally, we’d be able to have the CRM application subscribe to that event directly. Unfortunately, our CRM platform doesn't support this functionality. So we use an AWS step function to fire a lambda function to perform an API post request to the CRM. \r\n<p>My responsibility over the last few days has been to get those API Post requests to work correctly, which I believe I have. Hopefully my pull request gets approved either today or tomorrow, and then we’ll have these events fully integrated with our CRM!",
"created_on": "2022-10-11T10:52:08.130088Z",
"status": 1
},
{
"id": 12,
"title": "First Production Experience with Terraform!",
"slug": "first-production-experience-terraform",
"author": 1,
"updated_on": "2022-10-11T10:52:07.896980Z",
"content": "Hi all. So I got my first production experience with terraform. I needed to provision a VPC for our DR environment in another AWS region, and I did this purely with a terraform configuration file (I did create the three elastic IP address by hand, everything else was done with terraform). So VPC, NAT’s subnets, and tags: everything was done via terraform. \r\n\r\n<p>It’s extremely rewarding to be able to work with the latest and greatest technology. I’ve also been taking a crash course on Docker and Kubernetes, as our infrastructure is all on AWS EKS. Pretty night and day difference from supporting applications running on Citrix at the beginning on my career. I’m excited for that the future holds!",
"created_on": "2022-10-11T10:52:07.897002Z",
"status": 1
},
{
"id": 11,
"title": "Using Ansible to Automate Application Updates.",
"slug": "using-ansible-automate-application-updates",
"author": 1,
"updated_on": "2022-10-11T10:52:07.665643Z",
"content": "Hey y’all. I did a thing! So we had a rather time consuming process for one of our solutions. Basically every month we had to deploy a directory to every one of our clients sql servers. We would download the file from my company’s portal, upload it to the target sql server, extract it, and move it to the application install directory on the sql server. This process would take 3-5 hours, once a month. Now that is all streamlined using ansible.\r\n<p>Now all I have to do is sftp the file once to the orchestration server my organization is using for a ansible host. After that I run my ansible playbook which:</p>\r\n<ol>\r\n<li>Copies the file to the target server using the win_copy ansible module</li>\r\n<li>Extracts the file. The file is actually a cabinet file. I believe this compression method was chosen sometime in the late 90’s and has yet to be changed. I use win_shell to run the expand.exe utility in system32</li>\r\n<li>Drops the file in the correct install directory.</li>\r\n<li>Deletes the file from the staging location.</li>\r\n</ol>\r\n<p>Perhaps not the most complex contraption in history, but it feels awesome to be able to contribute to the organization using my ansible skills. Excited for what the future holds!</p>",
"created_on": "2022-10-11T10:52:07.665666Z",
"status": 1
},
{
"id": 10,
"title": "Promoted!",
"slug": "promoted",
"author": 1,
"updated_on": "2022-10-11T10:52:07.430569Z",
"content": "Hello again Django Nation. On Tuesday I had the pleasure of getting promoted! My new role will entail spearheading my organization’s efforts to migrate our solutions from on premise to public cloud. I’m extremely excited to be able to get my hands on enterprise level aws, and I look forward to being able to contribute in a major way to my new team!\r\n\r\n<p>My Certification efforts will likely take a pause, as the migration effort this summer is going to be crazy. The powers that be are looking to migrate an average of one client per day. In the fall I’m probably going to look into getting the AWS Certified SysOps Associate, as there’s a fair bit of overlap between that and the Solutions Architect Associate. Also taking an AWS certification after a major project will give me experience from which to reference from. So instead of being a lot of theoretical abstraction, I’ll be able to apply it directly to what I’ve done. Should be exciting!",
"created_on": "2022-10-11T10:52:07.430590Z",
"status": 1
},
{
"id": 9,
"title": "Passed my AWS – Solutions Architect Associate Exam!",
"slug": "passed-my-aws-solutions-architect-associate-exam",
"author": 1,
"updated_on": "2022-10-11T10:52:07.193007Z",
"content": "Hello my loyal readers! It is with great pleasure that I announce passing my Solutions Architect Associate Exam! It wasn’t too bad. I started studying in mid-January, and passed in mid-march, so it took about 2.5 months (I took a trip to Miami in the middle). Not bad.\r\n\r\n<p>I think the exam served two functions: It provided me a good framework for expanding my knowledge of Amazon Web Services and acted as proof of that understanding. My knowledge of concepts like the Application Load Balancer vs Network Load Balancer, and when to use RDS versus DynamoDB has been greatly augmented. I also have firsthand knowledge of how an API Gateway call works, and how to use the AWS CLI to quickly add or remove S3 Objects. I think it was a solid comprehensive exam that took my comprehension of AWS to the next level.\r\n\r\n<p>What’s next? That’s the million-dollar question! I could continue to get AWS certifications. I could pivot to containers, and work on learning docker and Kubernetes. I could continue to bolster my knowledge of Configuration/infrastructure as code and work on ansible or terraform. This will also depend on the needs of the company that I work for. I’m hearing talk of one of our applications being migrated from on prem to AWS, and that would be great experience if I could be a part of it.\r\n\r\n<p>Ok so more on the aforementioned Miami trip. We rented an Airbnb that was in the same building as the Arya hotel in coconut grove. Super nice, room was on the 20th floor with a view of the ocean. We met a bunch of people at the elevated pool who basically were just Covid digital nomads. They all just took an extended stay after the pandemic hit. One lady from Chicago took her two kids and DROVE from Illinois to Miami to have a 26-night stay. The kids would do schoolwork on their laptops, and she would just day drink at the pool (she had been furloughed and was on the super Covid unemployment). Maybe not a candidate for mother of the year, but one has to admire the spontaneity.",
"created_on": "2022-10-11T10:52:07.193029Z",
"status": 1
},
{
"id": 8,
"title": "Passed My Redhat Certification!",
"slug": "passed-my-redhat-certification",
"author": 1,
"updated_on": "2022-10-11T10:52:06.957709Z",
"content": "Sup y'all, it's been a minute. So I was originally planning on taking the Red Hat Certified System Administrator exam back in April. That would have been after roughly four months of studying (I wasn’t that experienced with Linux at the time). After the Deus Ex Machina of coronavirus entered the equation, I figured this was a sign from a higher power instructing me to change my plans. Kansas City ended up going into lockdown for two months, and IT certification locations were not viewed as essential businesses.\r\n<p>After taking a quick detour and creating this website (as well as some other stuff, checkout my github account in the top right of this page), I was having a one-on-one with my boss. He said something to the effect of “you studied for this exam for months and you didn’t even try to take it? What are you, A wuss? Pourquoi est-ce que j'emploie ce lâche? ” Touche, Jordan, touche. So I ended up re-learning everything I forgot and scheduled to take it last Wednesday. \r\nSo, it turns out, hyping this thing up in my head for the better part of a year did not do wonders for my blood pressure. I was on DEFCON level 1 the entire day purely due to anticipation. The run on went on during my lunch break was definitely interesting (shoutout to yellowcard for making the perfect PB song). \r\n<p>Of course the testing center I went to had the drinking fountains roped off with caution tape. I had to drink water from the bathroom faucet (yum). Anyway I ended up passing by a reasonably comfortable margin. I’m thinking about going to either the AWS Solution Architect Associate, or the Red Hat Certified Engineer next. I’ll mull that one over the holidays.",
"created_on": "2022-10-11T10:52:06.957729Z",
"status": 1
},
{
"id": 7,
"title": "What’s next for Django Unchained.",
"slug": "whats-next-for-django-unchained",
"author": 1,
"updated_on": "2022-10-11T10:52:06.717306Z",
"content": "What’s next for Django Unchained.\r\nSo I’m debating what features to add to this website. I’ve been playing around with Plotly Dash over the past week or so. It’s a super cool data visualization module for python. It can quickly build dynamic graphs and charts. Apparently, there’s a specific package for python that integrates dash with Django. The idea would be I would have another link at the top navigation bar that would take you, to some kind of interactive dashboard. I could rig up some COVID dashboard that consumes APIs from the CDC or something. That would be a cool way to demonstrate my ability to use other people’s APIs. We’ll see what I Can do.\r\n<p>On the personal front, we spent Labor Day weekend at lake of the Ozarks. This was an unmitigated disaster. We narrowly avoided crashing our pontoon boat (which was at least 20 years old, went maybe 15 miles per hour, and had a roughly 10 pound anchor which didn’t actually anchor anything (more on said anchor later)) into a Pearl White Yacht that looks straight out of The Wolf of Wall Street. In the heat of the confusion, we drove our boat some distance without raising the anchor. In the heat of additional brought on by the first wave of confusion, we then thought that the anchor had been ripped off in our frantic attempt to not crash into said $300k yacht. Utterly defeated that we had lost our anchor, we then drove the boat several miles WITH THE ANCHOR DRAGGING BEHIND US, STILL COMLETELY ATTACHED. We ended up taking on a lot of water from the waves (which could have also been exacerbated the anchor situation). When we finally got to the dock, only then did we realize the anchor was still attached. This cause some…unrest within the team, as knowing the anchor was attached would have obviously altered our decision-making process. Smoothing over this tension required tactful diplomacy on my part. \r\n<p>The best part of the trip was the dude at the hotel bar named Mason. As nigh as I can tell, he was intentionally trying to be fired, because he routinely would charge us less then a third of what we actually ordered. At one point, he offered us a deal. If we left within 15 minutes, he would give us all our drinks for free (this was presumably so he could leave at 11:30 to go party with whomever he was associating with on that evening). Dude was completely insane. I’m assuming his parents were making him get a job, and he was trying to get fired so he would have to have a job anymore.",
"created_on": "2022-10-11T10:52:06.717326Z",
"status": 1
},
{
"id": 6,
"title": "Integrating Ansible in the Django Unchained Code Pipeline",
"slug": "integrating-ansible-django-unchained-code-pipeline",
"author": 1,
"updated_on": "2022-10-11T10:52:06.483280Z",
"content": "So up until now, anytime I would want to deploy changes to my production environment, I would have to:\r\n<ol>\r\n <li>Push the changes to my remote github repository</li>\r\n <li>Pull the changes from django unchained directory on my web server</li>\r\n <li>Run manage.py makemigrations</li>\r\n <li>Run manage.py migrate</li>\r\n <li>Run manage.py collectstatic</li>\r\n <li>Cycle Gunicorn</li>\r\n <li>Cycle Nginx</li>\r\n</ol> \r\n\r\nNow I have streamlined this a bit using my new handy dandy <a href=\"https://github.com/aokugel/ansible_playbooks/blob/master/push_code.yml\">Ansible Playbook.</a>\r\n\r\nThis allows me to immediately to run this playbook as soon as I push my changes to my django unchained github repository. Now I no longer have ssh to my web server at all! Pretty neat!",
"created_on": "2022-10-11T10:52:06.483299Z",
"status": 1
},
{
"id": 5,
"title": "Introducing: Django Unchained Rest API",
"slug": "introducing-django-unchained-rest-api",
"author": 1,
"updated_on": "2022-10-11T10:52:06.253788Z",
"content": "So big plays are being made in Django Unchained world. Based almost entirely on the recommendation of a random reddit user who spoke virtually no English, I have added REST API functionality into the world of Django Unchained. So now you can make get requests for all of the posts on this website. I’m not entirely sure why you, dear reader, would ever need to do this, but the option is there if you ever find yourself lying awake at night wishing that anthonykugel.com had REST API’s. <a href=\"https://www.reddit.com/r/ITCareerQuestions/comments/fp28c6/say_i_were_to_start_learning_linux_or_python_on/flj1jau/\">Here's a link to the reddit post I was talking about for posterity.</a>\r\n\r\n<p>On the music front, the new Glass Animals album is a hot mess. And that’s putting it mildly. “Tangerine” sounds like a track from the beach level in Mario Kart 64 if you added a base drop. I’m pretty sure Space Ghost Coast to Coast samples the title screen music from GTA IV, and then immediately after that Dave Bayley mentions something about kids playing too much GTA. Tokyo Drifting has the album’s only feature, and it’s…Denzel Curry. Subsequently, Denzel Curry’s verse is by far the best part of the entire album. Also, the same track has horns that sounded like they were directly ripped from an Alison Wonderland track. The entire work is a cluster. It sounds more like a stream of conscience from a schizophrenic than a cohesive album. But there are some pretty bright spots. IDK\r\n\r\n<p>We just finished Season 3 of Ozark. And, man, that show makes me appreciate how good breaking bad was, because it is very much NOT breaking bad. Basically none of the characters are redeemable. It’s filled with cheap plot twists. Cringey. Maybe season four will be better.",
"created_on": "2022-10-11T10:52:06.253810Z",
"status": 1
},
{
"id": 4,
"title": "Ok Computer 23 Years Later",
"slug": "ok-computer-23-years-later",
"author": 1,
"updated_on": "2022-10-11T10:52:06.021577Z",
"content": "So the original nascent idea behind this blog was I would have this recursive thing going on where I would be writing articles about making the website as I was actually making it. Well if I’m writing about all of the inputs that were needed to create this output, I would be remiss if I didn’t include Radiohead. I don’t think I could have put in the work required to make this thing if I wasn’t lulled into a sort of hypnotic trance for hours at a time. \r\n\r\n<p>I think I was first introduced to OK Computer when I was a sophomore in high school. It didn’t immediately take at first. The disconcerting dissonance of Climbing Up the Walls made me uncomfortable, Karma Police lulled me to sleep. I didn’t receive the same immediate gratification as I did with The Bends (which was, to me, an angrier, angstier version of What’s the Story Morning Glory). I needed to be a smarter audience. The older I got, and the more music I listened too, the more and more I appreciated OK Computer.\r\n\r\n<p>And holy **** was this album ahead of its time! This came out in 1997! This would be ahead of it’s time if it came out five years ago! The way the guitar riff transitions into almost angelic drum fills on Airbag, the ethereal, haunting vocal’s on Karma Police, it’s just mind boggling they were able to piece this together the same year that I Believe I Can Fly and Wannabe came out. How did they have access to the same technology as R Kelly and the Spice Girls had?\r\n\r\n<p>If Radiohead every comes back to Kansas city I’m shelling out 500+ for pit tickets.",
"created_on": "2022-10-11T10:52:06.021598Z",
"status": 1
},
{
"id": 3,
"title": "The Story Behind This Site",
"slug": "story-behind-this-site",
"author": 1,
"updated_on": "2022-10-11T10:52:05.777462Z",
"content": "So I come from a pretty generalized IT background. I picked the MIS major after realizing economics was a glorified finance degree if you didn’t go to grad school at the university of Chicago. With my dream of becoming chairman of the federal reserve crushed (sorry RP) I decided to go into MIS because I had built my computer in high school and felt I was reasonably technically competent.\r\n\r\n<p>MIS was not nearly as technically challenging as something like Computer Science or Software Engineering. This obviously had certain benefits in University life, as it freed me up to do extra curriculars (shout out to long island ice tea Thursday at Café Baudelaire!) It was a bit of a Faustian Wager when it came to the working world, however. I had lingering doubts in my mind whether I should have applied myself more in school and picked a more challenging major.\r\n\r\n\r\n<p>Enter Coronavirus. If there was ever an ideal time to learn something related to computer science, it was when Kansas city was in compete lockdown, all bars and restaurants were closed, and everyone was terrified to even set foot outside of their homes. During this turbulent period, I made a point of learning python. I started with Automate The Boring Stuff with Python. This sort of gave me a glimpse of what was possible. I then learned some data structures and algorithms through Udacity courses, played did a bunch of Leetcode problems (fun and challenging, though not necessarily indicative of what an average developers day in the life will look like. I doubt most devs are implementing a heap sort from scratch or making a sudoku solving backtracking algorithm on a regular basis), and eventually stumbled across the Django web framework. Several of periods of trail and error later, I eventually assembled this site.",
"created_on": "2022-10-11T10:52:05.777488Z",
"status": 1
},
{
"id": 2,
"title": "My Django Blog",
"slug": "my-django-blog",
"author": 1,
"updated_on": "2022-10-11T10:52:05.547088Z",
"content": "Hello and welcome to my django blog! My django blog is written in django, running on nginx and postgresql, and hosted on aws ec2 and rds! Please comment what you think!\r\n\r\nOverall I think this has been an extremely beneficial learning experience. It almost feels like I'm making a hamburger, expect I have to grow/collect/gather all of the ingredients from scratch. Like, first I have to get buns. What are buns made of? Well, grain presumably (I'm not sure I even want to know what McDonalds buns are made of). So I need to grow wheat to mash it into flower to then mix with water and yeast (do I have to grow yeast too?) and then bake so I can make the buns.\r\n\r\nAnyway, this project wasn't quite that intricate but it was a similar idea. Complete vertical integration from the django backend to the nginx web server application hosting the site. I feel like I've learned a ton!",
"created_on": "2022-10-11T10:52:05.547109Z",
"status": 1
}
]