
How Artificial Intelligence helped 6x productivity on a key task
From 2020 to 2024, I worked for an organisation that runs South Africa’s largest online high school. In this article, I detail how a side-project that I built helped reduce the time taken to update an online high school lesson from 30 minutes to 5 minutes.
Read on to learn how I designed and built a full-stack software application, drawing on software engineering, data analysis, cloud computing and product management skills to deliver business value.
Tools used:
- AWS – S3, Elastic Beanstalk, Secrets Manager
- Snowflake – data warehouse, including Snowflake’s Python connector
- Django – backend
- React.js – frontend
- ChatGPT’s API platform
- Gitlab & Gitlab CI/CD pipeline manager
The background
An online high school needs to regularly update its lesson content, utilising insights from learner performance data and user feedback, as well as responding to rapidly evolving industry best practice and regulatory requirements. However, high-quality content creation and improvement is highly manual, time-intensive and therefore expensive.
The average high school lesson needs to comprise 40–45 minutes’ worth of work for a learner – a substantial amount of content, with text being the primary element. While acting as lead Analyst on the re-architecting of the school’s learning model, I identified high lesson word count as a major area for improvement.
However, three major challenges were immediately apparent:
- The scale – more than 9 000 lessons needed updates (of varying degrees). The required Capex was simply not available. A streamlined, Opex-aligned approach would be required.
- Each lesson is substantial – 40–45 minutes’ worth of work, primarily text. Reducing word count while retaining all required concepts and keywords is time-intensive, requiring both domain expertise and strong writing ability.
- The Learning Management System (LMS) is cumbersome for Content Developers, fragmenting lesson text across multiple clicks and page loads. No lesson could be viewed in its entirety in a single window, and reaching a lesson’s content required more than 7 clicks.
Requirements
- Rapid access to the full lesson text on a single page: A Content Developer needs to be able to view the entire lesson’s text in a single window, with as few clicks as possible to access each lesson.
- Easy access to key lesson metrics like word count (both for the entire lesson and for selectable sections of the lesson).
- Effective integration of a Large Language Model into the text updating process. Users should be able to utilise the services of an appropriate LLM in the same window that is being used to view the lesson text.
- Reliable protection of Company IP. If Content Developers copy lesson text into non-Company controlled LLM services like ChatGPT, there is substantial risk of Company IP being inadvertently, systematically incorporated into the training data of third-party service providers. Any solution must maintain control of how Company IP is shared with an LLM.
- Appropriate security controls, including password-protected login to the application.
- Cost effective utilisation of services, avoiding expensive per-user license fees.
- Cloud hosted, with centralised view of usage and costs for the IT Admin team.
The Content Updating Tool
Having recognised the need for large-scale improvement of lesson word counts and the inherent limitations of the existing lesson update process, I obtained permission to initiate a side-project to address this.
MVP version 1 – Sandboxed front-end only
The first prototype, designed in Miro, was a basic MVP with the following structure:
- Built using React.js, deployed on a static AWS S3 bucket on a dedicated R&D account within the Company’s AWS environment.
- ChatGPT’s API platform as the LLM service.
- Application’s code pushed to an R&D project within the Company’s Gitlab repo, with Gitlab’s CI/CD pipeline manager and AWS Secrets Manager handling deployment.
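For illustration, here is a minimal sketch of the kind of deploy step the CI pipeline might run to sync the React build to the static S3 bucket. The bucket name and build directory are hypothetical placeholders, not the Company’s actual configuration:

```python
# sync_build_to_s3.py – minimal sketch of a static-site deploy step.
# Bucket name and build directory are hypothetical placeholders.
import mimetypes
import pathlib

import boto3

BUCKET = "content-tool-frontend"   # hypothetical R&D bucket
BUILD_DIR = pathlib.Path("build")  # default React production build output

s3 = boto3.client("s3")

for path in BUILD_DIR.rglob("*"):
    if not path.is_file():
        continue
    key = path.relative_to(BUILD_DIR).as_posix()
    content_type, _ = mimetypes.guess_type(path.name)
    # Set Content-Type so the browser renders HTML/JS/CSS correctly.
    s3.upload_file(
        str(path),
        BUCKET,
        key,
        ExtraArgs={"ContentType": content_type or "application/octet-stream"},
    )
    print(f"uploaded {key}")
```

In practice, the Gitlab CI job would run the React build first and authenticate with injected, short-lived credentials rather than anything hard-coded.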
With a few sandboxed lesson text examples hard-coded into the front-end of MVP version 1, I focused on obtaining user feedback. Based on this feedback, I went back to Miro and designed version 2.
MVP version 2 – functional, full-stack application
Using Miro, I re-designed the user interface of the Content Updating Tool and planned out the technical architecture of the first full version of the application. While I had a fairly good idea of how I wanted to lay out the front-end, I needed to decide how to architect the rest of the prototype application.
Backend: In order to enable password-protected user login, as well as to better secure the application, a robust backend would be needed.
After researching a number of options, I chose to use Django for the backend, primarily because:
- The other (much larger) AI-focused project of the Company had also recently chosen to use Django for its backend;
- I like Django’s “batteries-included” approach — I didn’t want to reinvent the wheel; and
- Django is written in Python, my preferred scripting language.
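As a small illustration of that batteries-included approach, below is a minimal sketch of how the password-protected access requirement might look using Django’s built-in auth. Route names and the lesson view are hypothetical, not the production code:

```python
# urls.py / views.py – sketch of password-protected access using Django's
# built-in auth. Route names and the lesson view are hypothetical.
from django.contrib.auth import views as auth_views
from django.contrib.auth.decorators import login_required
from django.http import JsonResponse
from django.urls import path


def fetch_lesson_text(lesson_id):
    """Placeholder for the Snowflake-backed lookup sketched in the next section."""
    return ""


@login_required  # unauthenticated requests are redirected to the login page
def lesson_text(request, lesson_id):
    return JsonResponse({"lesson_id": lesson_id, "text": fetch_lesson_text(lesson_id)})


urlpatterns = [
    path("login/", auth_views.LoginView.as_view(), name="login"),
    path("api/lessons/<int:lesson_id>/", lesson_text, name="lesson-text"),
]
```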
Data warehouse: In order to serve the entire lesson’s text in a single page, this data would need to be assembled and fetched from the Company’s Snowflake data warehouse. After examining how this data was stored (which is primarily determined by the data structure of the LMS itself), I constructed an appropriate set of SQL queries which I built into the applicable backend workflows.
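For illustration, a minimal sketch of that fetch-and-assemble step using Snowflake’s Python connector. The table, column and account names are assumptions; the real schema mirrored the LMS’s own data structure:

```python
# snowflake_lessons.py – sketch of assembling a full lesson's text from the
# warehouse via Snowflake's Python connector. Table and column names are
# hypothetical; the real schema mirrors the LMS's data structure.
import snowflake.connector


def fetch_lesson_text(lesson_id: int) -> str:
    conn = snowflake.connector.connect(
        user="SVC_CONTENT_TOOL",   # hypothetical service account
        password="...",            # in practice, pulled from Secrets Manager
        account="company_account",
        warehouse="ANALYTICS_WH",
        database="LMS",
        schema="CONTENT",
    )
    try:
        cur = conn.cursor()
        # The LMS stores each lesson as ordered text fragments, so the query
        # stitches them back together in display order.
        cur.execute(
            """
            SELECT fragment_text
            FROM lesson_fragments
            WHERE lesson_id = %s
            ORDER BY fragment_order
            """,
            (lesson_id,),
        )
        return "\n\n".join(row[0] for row in cur.fetchall())
    finally:
        conn.close()
```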
Other technical details:
- As with version 1, the full-stack prototype was deployed to AWS.
- The front-end was also built using React, deployed to an S3 bucket.
- The Django backend was deployed to an Elastic Beanstalk instance.
- All application code was remotely hosted on the Company’s Gitlab repo.
- I used Gitlab’s CI/CD pipeline manager and AWS Secrets Manager to handle automated deployment and API authentication respectively.
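As an example of the Secrets Manager piece, here is a minimal sketch of how the backend might fetch the LLM API key at runtime. The secret name and region are hypothetical:

```python
# secrets.py – sketch of fetching the LLM API key at runtime from AWS
# Secrets Manager. Secret name and region are hypothetical placeholders.
import json

import boto3


def get_openai_api_key() -> str:
    client = boto3.client("secretsmanager", region_name="af-south-1")
    response = client.get_secret_value(SecretId="content-tool/openai")
    # Secrets are commonly stored as JSON key/value pairs.
    return json.loads(response["SecretString"])["api_key"]
```

Keeping the key server-side, rather than in the React bundle, also supports the IP-protection requirement: the browser never talks to the LLM directly.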
Testing the finished product
Once the prototype was complete, a small number of business users were invited to test it. After receiving positive feedback from these users on the business value that this tool would add, the CTO approved larger scale trials.
Using the application
After navigating to the relevant URL, the user is prompted to log in using their assigned username and password.

The user is then able to select the applicable course, module and lesson in 3 clicks, with near-instant loading times.
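Behind those three clicks, one plausible backend design is a single cascading lookup endpoint that returns the children of whatever the user just selected. A sketch, with hypothetical names:

```python
# views.py (sketch) – one endpoint serving each step of the 3-click
# course -> module -> lesson selection. Names are hypothetical.
from django.contrib.auth.decorators import login_required
from django.http import JsonResponse


def query_warehouse(level, parent_id):
    """Placeholder for a parameterised Snowflake query like the one shown earlier."""
    return []


@login_required
def list_children(request):
    # level is "course", "module" or "lesson"; parent_id scopes the query
    # to the item selected in the previous click.
    level = request.GET.get("level", "course")
    parent_id = request.GET.get("parent_id")
    return JsonResponse({"items": query_warehouse(level, parent_id)})
```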

After this, the user is able to view the entire lesson’s text in a single page. The user can also view the word count of the entire lesson, or highlighted sections, to help quantify required reductions.
Once a section has been identified for improvement, the user simply highlights the section and clicks one of the pre-programmed buttons, which securely sends the highlighted lesson text, along with a pre-created prompt, to a Large Language Model (GPT-4o). In the example below, the user has requested a suggested word count reduction of the highlighted text.
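Under the hood, one way to implement this is to combine the pre-created prompt with the highlighted text server-side and make the LLM call from the backend, so lesson content only leaves Company-controlled infrastructure through this single, vetted call. A minimal sketch using the OpenAI Python SDK; the prompt wording and parameters are illustrative assumptions, not the production versions:

```python
# llm_suggestions.py – sketch of the server-side call that pairs a
# pre-created prompt with the user's highlighted text. Prompt wording
# and model parameters are illustrative assumptions.
from openai import OpenAI

REDUCE_WORDCOUNT_PROMPT = (
    "Rewrite the following lesson text to reduce its word count while "
    "retaining every concept and keyword. Return only the rewritten text."
)


def suggest_reduction(highlighted_text: str, api_key: str) -> str:
    client = OpenAI(api_key=api_key)  # key fetched via Secrets Manager
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": REDUCE_WORDCOUNT_PROMPT},
            {"role": "user", "content": highlighted_text},
        ],
        temperature=0.3,  # favour faithful rewrites over creative ones
    )
    return response.choices[0].message.content
```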

The response received from the LLM is displayed as a collapsible/expandable suggestion in the comment section. If the user is satisfied with the suggestion, they can copy it to the clipboard. If they would like to re-prompt the LLM, they can simply click the appropriate button again. Once done, the user can dismiss the suggestion.

Results
After a large-scale trial in late 2024, Content Developers reported that the prototype application reduced the time required to update a lesson from 30 minutes to 5 minutes – a 6x improvement!
Following this successful trial, the code base for this application was transferred to the Company’s Integrations team to be fully incorporated into the Company’s Tech Development and Maintenance Roadmap going forward.
Conclusion
Building the Content Updating Tool was a valuable experience that required me to learn quickly, think laterally and work efficiently to add value to the greater team and, indirectly, to the primary product users – thousands of South African learners.
The project deepened my knowledge of AWS and several AWS services (like Elastic Beanstalk, static S3 buckets, and Secrets Manager), Gitlab’s CI/CD functionality, and Snowflake’s Python connector. I am grateful for the opportunity to have been able to build a solution which adds value.
This experience reiterated to me that my primary passion lies in understanding systems and solving problems using appropriate technical solutions, particularly in the realm of Data (I most enjoyed working with Snowflake).