We have been busy making code updates behind the scenes over the past few years. Most business-critical Time3.0 libraries (some of them operating within our publishing product family, like TimeBase and TimeGate) have now been upgraded to the new open-source .NET Core, which replaced the old .NET Framework. (.NET Core will be known simply as .NET as the old framework is phased out.)
Though programming languages get updated to newer versions all the time, this was a major upgrade. The new .NET libraries bring a lot of efficiency benefits that we can pass on to customers who use our applications and technology. Updating the core code makes our software faster and more secure, and allows us to build custom modules with even more speed and efficiency.
Though we constantly update the software we create, these kinds of major overhauls don’t happen all the time. Let’s dive into the project and its benefits.
Meet the new .NET Core
The new open-source .NET Core was first announced back in 2014. Since then, all new Timehouse projects have used the new .NET Core, but we still had some legacy libraries on the old system. (It is not unusual for coding platforms to be phased out over an extended period: many products and processes rely on the old code for a considerable amount of time.) Bringing all our legacy libraries up to date gave us a single, consistent codebase.
What are some of the changes the new .NET brought along?
- The new .NET is completely platform agnostic: it is no longer tied to Windows. This is a major development and improvement
- Old, redundant functions were removed, making the code base itself leaner and more usable
- Built-in JSON support was added, a testament to the format's importance in the digital age
- Scalability and performance were improved
- The change was part of a wider industry shift towards a more modern, microservices model
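To illustrate the JSON point above: modern services exchange typed data as JSON text. The sketch below is illustrative TypeScript (a common front-end counterpart to a .NET back end), not our actual library code, and the `Article` shape is a made-up example.

```typescript
// Hypothetical record type; the fields are illustrative only.
interface Article {
  id: number;
  title: string;
  published: boolean;
}

// Serialize a typed object to JSON text, as a service would when
// responding to an API call.
function toJson(article: Article): string {
  return JSON.stringify(article);
}

// Parse JSON text back into a typed object, as a consuming service would.
function fromJson(text: string): Article {
  return JSON.parse(text) as Article;
}
```

In .NET itself, the analogous built-in functionality lives in the `System.Text.Json` namespace.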
Why .NET is better for our customers
With the new .NET, the platform is more streamlined and unnecessary functions have been removed. This allows for application development with less technical debt, creating products with smaller footprints.
The front end and back end are now only loosely coupled, which supports a microservices architecture (more on that later). Previously, front-end and back-end functions might share the same code; now the two can be built separately, so a .NET back end can be dynamically combined with a variety of modern front-end frameworks. And as .NET is now platform agnostic, you can pair it with whatever language suits the job.
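A minimal sketch of that loose coupling, from the front-end side in TypeScript. Everything here is hypothetical: the point is that the UI depends only on a small contract, so the implementation behind it could be .NET or anything else that speaks the same protocol.

```typescript
// The contract the front end codes against. In production this would be
// backed by HTTP/JSON calls to the back end; the back end's language is
// invisible to the front end.
interface GreetingBackend {
  greet(name: string): string;
}

// An in-memory stand-in for the real back end, useful in tests.
class FakeBackend implements GreetingBackend {
  greet(name: string): string {
    return `Hello, ${name}!`;
  }
}

// Front-end logic depends only on the contract, not the implementation.
function renderGreeting(backend: GreetingBackend, user: string): string {
  return backend.greet(user).toUpperCase();
}
```

Swapping the back-end implementation requires no front-end changes, which is exactly what makes mixing a .NET back end with any modern front-end framework practical.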
For businesses, this is great: different business functions and services can be separated, e.g. having individualised forms in an ecommerce setting, using different technologies to achieve different functions.
Redoing architecture was essential to supporting business processes
In some of the libraries we upgraded, there were thousands of old functions and features, and figuring out what we needed to bring over, what could be left behind, and what needed to be redesigned was a big undertaking.
In a project like this, you need to be close to the business and its operations to know what to keep and what to cut. Our understanding of how our clients use our technology helped us plan out the library upgrades, ensuring we approached the switch efficiently. We were very aware that as well as a technical upgrade, we were working on core business processes. This kind of project is an excellent opportunity to streamline and strengthen workflows.
This was really the core element of the upgrade. It was not just a case of copying code; we also had to redo parts of it to ensure it matched current business objectives.
Not everything stands the test of time
Even without major code base changes like this, pretty much everything in tech ends up needing to be redone one day.
Good software management cannot overcome the fact that over a considerable period of time, things in business tend to change quite a lot. People change, departments get shut down, processes are always evolving.
Imagine how many features, requests, and ways of doing things become obsolete after eight years of maintaining a piece of software.
Part of this upgrade was a forensic analysis of all the features that had been added over the years and understanding which ones were obsolete, and which ones would need to be imported over.
Over the last few years, updating these old libraries has also taught us how to code better in the future.
Microservices and the DevOps model: why you benefit
The new .NET allows for container-led development and a microservices architecture.
What that basically means is that rather than having one huge code base for everything, separate, smaller, independent parts (microservices) make up the whole. This makes software easier to maintain and manage. It also prevents the issues that arise from mixing business logic with UX considerations, which tends to make things messy.
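As a rough sketch (in TypeScript, with invented names), the microservices idea looks like this: each concern lives in its own small unit with a narrow interface, and presentation stays at the edge. In a real system, `priceOf` would be a call to a separately deployed service rather than a local function.

```typescript
// Pricing service: knows only about prices. The catalogue is illustrative.
function priceOf(sku: string): number {
  const prices: Record<string, number> = { book: 20, pen: 2 };
  return prices[sku] ?? 0;
}

// Cart service: knows only about totalling, and talks to pricing through
// its public interface rather than sharing its code base.
function cartTotal(skus: string[]): number {
  return skus.reduce((sum, sku) => sum + priceOf(sku), 0);
}

// UI concerns stay out of both services: formatting happens at the edge.
function formatTotal(total: number): string {
  return `$${total.toFixed(2)}`;
}
```

Because each part is independent, one service can be rewritten, scaled, or redeployed without touching the others.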
This is also why the DevOps method of software delivery we use at Timehouse goes so well with microservices: each part can be developed and released independently, which allows for programming sprints and agility.
The importance of upgrading your tech
Overall, this is a good lesson in how important it is to upgrade your tech over the years.
Here’s why:
- Some tech will become obsolete
- Even if the tech itself is fine, some ways of doing things may have become obsolete
- You will have less technical debt
- Your tech will have a smaller footprint
- You will have an easier time finding people and suppliers who can work on your systems