---
title: "#5 - 'The Power of [Separation] Compels You!'"
published: true
date: 2025-09-20 21:51:29 UTC
tags: WebDevelopment,refactoring,JavaScript,apiintegration
canonical_url: https://campfire.dlseitz.dev/5-the-power-of-separation-compels-you
header:
  image: /assets/5-separation.webp
attribution: 'Image generated with Sora. | © 2025 Derek L. Seitz'
contentImagePath: /assets/#5/
---

Hey there, and welcome back to **_Campfire Logs: The Art of Trial & Error_**. In my last log, [_#4 - Refactoring a False Sense of Simplicity_](https://hashnode.com/post/cmfg52ttw000002jufwdvdxkj), I introduced you to the [front end](https://www.computerscience.org/bootcamps/resources/frontend-vs-backend/#:~:text=Front%2Dend%20development%20focuses%20on%20the%20user%2Dfacing%20side%20of%20a%20website.%20Front%2Dend%20developers%20ensure%20that%20visitors%20can%20easily%20interact%20with%20and%20navigate%20sites%20by%20using%20programming%20languages%2C%20design%20skills%2C%20and%20other%20tools.%20They%20produce%20the%20drop%2Ddown%20menus%2C%20layouts%2C%20and%20designs%20for%20websites.) demo I recently [refactored](https://daedtech.com/refactoring-development-technique-not-project/#:~:text=Code%20refactoring%20is%20the%20process%20of%20restructuring%20existing%20computer%20code%20%E2%80%93%20changing%20the%20factoring%20%E2%80%93%20without%20changing%20its%20external%20behavior.) to be more modular and accessibility-friendly. Today, we are going to talk more about that same refactor, but this time we will focus on some of the enhancements I made to existing features and the new features I added for improved interactivity. We are also going to discuss how separating the demo's data, presentation, and logic layers improved the maintainability of the codebase by [decoupling](https://blog.covibe.us/the-pitfalls-of-excessive-decoupling-in-software-development-striking-the-right-balance/#:~:text=Decoupling%2C%20in%20software%20development%2C%20refers%20to%20the%20practice%20of%20breaking%20down%20a%20software%20system%20into%20smaller%2C%20independent%20components%20or%20modules.) its components.

For anyone short on time, or who just wants to get to the point, there's a TL;DR section with links to the live demo and its repo at the end. For everyone else, grab some coffee or marshmallows (or hot dogs) and a stick, and let's get to it!

## Credit Where Credit is Due

I've mentioned in some of my previous logs how important developing with integrity is to me. In short, what that means for me is developing applications and websites in an honest, transparent, and accessible manner. This includes ensuring proper credit and attributions are given when the creative works of others are included in what I build.

I also said in the last log that the demo I refactored was originally a course project. What I didn't get into was that the refactor involved sourcing all new images to avoid copyright violations with the materials provided in the class. In other words, because the refactored project wasn't part of the course itself, instead being an enhanced demonstration in my portfolio, things were edging a little too far away from being considered [fair use](https://fairuse.stanford.edu/overview/fair-use/what-is-fair-use/) in the eyes of [copyright law](https://www.copyright.gov/what-is-copyright/). So, I nipped it in the bud to avoid potential headaches (and wallet-aches) down the road. This meant I had a whole different problem to worry about, though: figuring out how to give credit where credit was due.
### The Credits & Attributions Page

Because the presentation of the demo needed to emulate an eCommerce front end, the images I used throughout couldn't be cluttered with attribution links; it would have killed the whole vibe. Instead, the solution I chose was to create a dedicated page, linking it in the copyright information at the bottom of the footer. The page consists of credits, each linking the image to its creator, the platform that hosts it, and the license under which I am allowed to use it. I knew, however, that I needed a better way of organizing that data than simply statically coding it into the [Nunjucks](https://mozilla.github.io/nunjucks/) template for the page, especially if I wanted to connect it elsewhere in the demo down the road.

![A GIF demonstrating the interactivity of the Credits & Attributions page by hovering over each credit.](https://cdn.hashnode.com/res/hashnode/image/upload/v1758394586739/d872d333-149d-4ac4-b9b0-1644e451fbcd.gif)

Because I use [11ty](https://11ty.dev) as my [static site generator](https://www.cloudflare.com/learning/performance/static-site-generator/) (SSG) with Nunjucks for templating, using a data file that holds an array of all the credit objects seemed like the way to go. At build time, when the SSG creates the files to be rendered by the browser, a for-loop in the Nunjucks template for the page could simply iterate over the array, injecting the data into the HTML build artifact and allowing 11ty to do all the work.

![A code snippet of the data file that holds an array of credit objects, including properties like fileName, creator, host, and license for each image.](https://cdn.hashnode.com/res/hashnode/image/upload/v1758390674105/1bc47fcf-9126-4c16-a055-d594465081c5.png)

Now, I know what you're thinking: each credit is still statically coded into the HTML. Well, yes and no. Yes, the HTML served to the browser appears as a static list of credits, but preventing that was never the point here. The point was to create a modular, more easily maintained system that allows me to make changes in one place and have them propagate throughout the project wherever that data is rendered. By separating the [data layer](https://hitgovernor.medium.com/what-is-a-data-layer-28ace099d4af) (credits and attributions) from the [presentation layer](https://www.techtarget.com/searchnetworking/definition/presentation-layer) (HTML and CSS) and the [logic layer](https://www.sciencedirect.com/topics/computer-science/logic-layer) (Nunjucks conditionals and JavaScript), I'm letting a tool I'm already using anyway handle more of what it's designed to do.

## The Ripple Effect

The Credits & Attributions page was only the start. Other pages also had various sets of data that I needed to separate from everything else. At this point, the homepage had a Featured Items slideshow that cycled through various images, their descriptions, and their prices, all hardcoded into the HTML. The Gallery page also used those same images and information in its carousel of categorized products, but there the data was hardcoded into the JS script that controlled the display of the carousel. Not very efficient. So, I decided to do the same thing as before, creating a data file to hold all the properties of each product in an array of standardized product objects.
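To make that pattern a little more concrete, here is a minimal sketch of what a product data file like this can look like in 11ty. The file path, the sample values, and the category property are assumptions for illustration only; the screenshot below shows the actual shape I settled on.

```javascript
// _data/products.js (hypothetical path -- 11ty exposes files in _data/ as global template data)
// Each product lives in exactly one place; templates and scripts both read from here.
module.exports = [
  {
    item: "Sunset Coneflower",                        // illustrative values only
    image: "/assets/products/sunset-coneflower.webp",
    description: "A drought-tolerant perennial with warm orange blooms.",
    price: 12.99,
    category: "perennials",                           // assumed property for grouping in the Gallery carousel
  },
  // ...one object per product
];
```

With the data living in one file, the Nunjucks template for the Featured Items slideshow can loop over the array at build time, and the Gallery script can consume the same objects at runtime, so updating a product only ever touches this one file.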
This allowed me to use Nunjucks templating again for the Featured Items slideshow, keeping the homepage quick to load, while using JavaScript to dynamically populate and sort the product cards in the Gallery page's carousel for enhanced interactivity (e.g., infinite scroll in the carousel, expanded descriptions on focus, etc.).

![A code snippet showing a JavaScript object for a product, with properties for the item, image, description, and price.](https://cdn.hashnode.com/res/hashnode/image/upload/v1758163741870/5a57d832-d9a2-453c-90f5-9fcc9f53f4fd.png)

![A GIF of the homepage's featured items slideshow, showing rendered products with an image, name, and price.](https://cdn.hashnode.com/res/hashnode/image/upload/v1758393181370/157ff4aa-e781-415b-a983-3bc1087421a8.gif)

![A GIF of the Gallery page's product carousel, showing an interactive product card with an expanded description and infinite scroll.](https://cdn.hashnode.com/res/hashnode/image/upload/v1758393340665/a1fb6d98-19e9-4a93-9f02-3eed6a9df6de.gif)

You also might be saying, "Why not include the credit and attribution data with the product data and just use one data file?" That's a great question. I could have for the purposes of this demo, but if there were a backend to this project and a [relational database](https://www.sciencedirect.com/topics/computer-science/logic-layer) like [PostgreSQL](https://postgresql.org) attached to it, I would still keep both sets of data in separate tables in the database. By using a [foreign key](https://hightouch.com/sql-dictionary/sql-foreign-key) between related records in the two separate tables, I could avoid [God Objects](https://dilankam.medium.com/the-god-object-anti-pattern-in-software-architecture-b2b7782d6997): objects that become incredibly hard to manage because they have too many responsibilities or hold too much information, causing problems down the road. The same thing applies to the data structures I created for this demo.

## Connecting with the [Fictional] Community

Because of the dynamic interactivity I had developed for the pages I've discussed so far, I was left scratching my head looking at the Community Events page of the demo. It was a stark contrast to the rest of the site now. Frankly put, it was ugly and boring. Also, the mock events I had created for the original demo were statically coded into the HTML, just like the other pages had been. I simply couldn't leave it like this.

![A screenshot of the old Community Events page, showing a static, bland-looking Google Calendar iframe.](https://cdn.hashnode.com/res/hashnode/image/upload/v1758379147598/64433d15-a546-4dcf-aefc-5e9df54284cd.png)

Now, this is where, if you were to say [scope creep](https://asana.com/resources/what-is-scope-creep) had definitely found some footing, I might be inclined to agree with you, at least to a small degree. But as I looked at the tools available to me while trying to come up with a way to add some pizazz to the otherwise bland Google Calendar iframe and static events on the page, lightbulbs in my head just started flashing. Think Paris Hilton at the 2005 Teen Choice Awards (yeah, I said it). Since I use [Zoho](https://www.zoho.com/) as the email provider for my custom domain, I figured, "How fun would it be to use [Zoho Calendar](https://www.zoho.com/calendar/) and the [Zoho Calendar API](https://www.zoho.com/calendar/help/api/introduction.html) for this page?" This would provide a single source of truth for the events displayed on the page.
All I had to do was figure out how the [API (application programming interface)](https://www.ibm.com/think/topics/api) worked: that is, what was needed in the [HTTP request](https://developer.mozilla.org/en-US/docs/Web/HTTP/Reference/Methods) to the [API endpoint](https://blog.postman.com/what-is-an-api-endpoint/) and what data would be returned in the [response payload](https://adchitects.co/blog/payloads-from-an-api-guide).

### The Grunt Work of Community Engagement

Let me go ahead and say, this whole thing seemed a lot easier in my head than it was in reality. The process wasn't hard, but it was less intuitive than I expected, a perception that likely stemmed from my specific use case and my limited experience with third-party APIs. This was primarily due to two things: Zoho's documentation not being quite as clear as I thought it could have been, and the need for separate scripts for retrieving the event data at build time and rendering the events carousel dynamically at runtime. No big deal, though; I've tackled hairier situations.

The biggest concerns for the API script were checking off the following:

1. Determine the start and end dates of the current month at build time (more on that in a bit)
2. Check the stored [OAuth 2.0 API token](https://www.zoho.com/calendar/help/api/oauth2-user-guide.html) needed to retrieve data from the Zoho Calendar API, using the [refresh token](https://auth0.com/blog/refresh-tokens-what-are-they-and-when-to-use-them/) to request a new one if it had already expired
3. Fetch the event information for the dummy calendar in Zoho using the calendar ID, the calculated dates, and the OAuth 2.0 API token
4. Normalize the response payload, extracting the data I needed to render the event cards
5. Make a second API call to retrieve the event descriptions for the returned events
6. Store the normalized data in an export module that could be converted to JSON when 11ty creates the build artifacts

For me, the most frustrating part of all of this was normalizing the dates and times returned by the API response. It didn't really occur to me at first that all-day and time-specific events would return datetime properties in different formats. Honestly, it was something I didn't even catch until after I wrote the [client-side JavaScript](https://www.cloudflare.com/learning/serverless/glossary/client-side-vs-server-side/) to generate the event cards. It took me longer than I'd like to admit to get to the bottom of why only the all-day event cards wouldn't populate with a date. Fortunately, though, the API response for each event included an isAllDay Boolean value, which made writing conditional statements for how to parse each event's datetime values very straightforward.

![A code snippet showing a JavaScript function that uses conditional logic to normalize and reformat datetime properties from an API response.](https://cdn.hashnode.com/res/hashnode/image/upload/v1758377745897/09958d1f-9f2a-4b99-8249-79ea3a78d145.png)

Really, the rest of the events page was smooth sailing. I had already written the logic for the products carousel on the Gallery page, so it was easy to write an adapted version for the events data. Also, since I output the normalized event data to an [export module](https://www.freecodecamp.org/news/module-exports-how-to-export-in-node-js-and-javascript/), I used Nunjucks and 11ty to convert the data into a [JSON](https://www.json.org/) file during the build process.
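To make the build-time side of this a bit more concrete, here is a rough sketch of what steps 1, 4, and 6 from the list above can look like. This is not my actual implementation: the function and field names and the date formats in the comments are assumptions, and the only detail taken from the real payload is the isAllDay Boolean discussed above.

```javascript
// Hypothetical build-time helper -- field names and formats are assumptions,
// but the isAllDay branching mirrors the approach described above.

// Step 1: first and last dates of the current month, calculated at build time
const now = new Date();
const monthStart = new Date(now.getFullYear(), now.getMonth(), 1);
const monthEnd = new Date(now.getFullYear(), now.getMonth() + 1, 0); // day 0 of next month = last day of this one

// Step 4: normalize each event into one consistent shape for the event cards
function normalizeEvent(rawEvent) {
  const { title, isAllDay, start, description } = rawEvent; // assumed field names

  // All-day events arrive date-only, timed events as a full datetime string
  // (the formats shown here are placeholders, not Zoho's exact ones).
  const startDate = isAllDay
    ? new Date(`${start}T00:00:00`) // e.g. "2025-10-01"
    : new Date(start);              // e.g. "2025-10-01T18:30:00-05:00"

  return {
    title,
    isAllDay,
    date: startDate.toLocaleDateString("en-US", { month: "long", day: "numeric" }),
    time: isAllDay
      ? "All day"
      : startDate.toLocaleTimeString("en-US", { hour: "numeric", minute: "2-digit" }),
    description,
  };
}

// Step 6: export the normalized data so 11ty can turn it into JSON during the build
module.exports = { monthStart, monthEnd, normalizeEvent };
```

Doing the normalization at build time means the browser never has to know anything about the shape of Zoho's response; it only ever sees the cleaned-up JSON.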
This allowed the events carousel script to make a simple, local API call, again keeping the data separate from my logic.

![A GIF of the new, dynamic Community Events page with an interactive carousel showing various event cards.](https://cdn.hashnode.com/res/hashnode/image/upload/v1758392960642/f0e25da7-3315-4d98-8a21-a08770b8250c.gif)

The last trick I had up my sleeve is what I thought to be the most clever (but maybe it wasn't; you be the judge). I mentioned that the first thing the script making the API call to Zoho needed to do was determine the current month, specifically the first and last dates of the month, to specify which events should be returned. Since this script is run by 11ty at build time (not client-side in the browser), I set up a simple [cron job](https://cron-job.org/en/) on my web server to rebuild the demo at 12:01 am on the first of every month. And because I've set up recurring seasonal events throughout the year in the Zoho dummy calendar, the events displayed in the demo will always fit the month in which the demo is viewed, without me needing to manually update anything at all. How fun is that?

## What I Learned from All of This

Sure, refactoring a difficult-to-maintain codebase into something more manageable and organized turned into a few new features and a lot of work I didn't anticipate at first. To me, though, it was well worth the effort I spent on it. I was incredibly proud of the original demo when I submitted it as my course project a year ago. After all, it was my first website, and I quite literally drove myself insane trying to get it right. Even if it's still not perfect, I'm incredibly proud of what I managed to accomplish in refactoring it.

There's something inspirational in being able to look back and see just how far you've grown in a year's time. You realize that, little by little, each and every bump in the road along the way adds up to considerable improvement in skill if you just stick with it. You really start to see the forest for the trees, as they say.

## TL;DR

I refactored a [monolithic](https://vfunction.com/blog/what-is-monolithic-application/) front-end demo into a modular, maintainable system using 11ty, Nunjucks, and JS. I separated data (credits, products, events) from presentation and logic, built a dedicated Credits & Attributions page, and made the product and event pages dynamic and interactive. The volume of work was the result of a ripple effect from the changes I made, but each change aligned with the refactor's goals of modularity, maintainability, ethical attribution, and improved demonstration of my growth as a developer. Overall, the project was challenging, rewarding, and a clear reflection of growth over the past year.

[Click here](https://bloomvalleydemo.dlseitz.dev/) to check out the live demo.

[Click here](https://gitea.dlseitz.dev/dereklseitz/BloomValleyNursery) to check out the source code for this project.

## Before You Go

As always, thank you so much for taking the time to read through some of my struggles and wins in full-stack development. I encourage you all to leave a comment telling me about your own experiences; maybe you've had similar trouble with third-party APIs, or maybe you have some tips on how I could have approached things differently. I look forward to reading what you have to say!

In the next log (#6), I'm going to share with you the progress I've been making on my blogging platform project.
I've gotten started on building the dashboard using [React.js](https://react.dev/) and the [KendoReact component library](https://www.telerik.com/kendo-react-ui), so check back soon for #6 to drop!