There comes a point in our lives where our children become our conscience for the future, and it seems the new generation is more purpose-driven than any generation before it. I could argue that this is through necessity rather than the product of a deliberate campaign to breed a different type of society. My daughter has been telling me for a couple of weeks now that if her great-grandchildren do not have a planet to live on, then I must acknowledge my role in cutting short the lives of my future descendants. It is sometimes hard to tell whether she constantly reminds me of this because she really wants me to buy a Tesla, or whether it is a genuine acknowledgement of the significant damage we have inflicted on our planet and the urgency with which we should be considering remediation. Either way, she is right that we should be very conscious of the damage we do on an everyday basis. We can't change the damage done in the past, but we can reduce the damage still to come.
We should not overlook the impact we can have by changing habits in our personal lives. We all know what these are: recycling, turning off electrical devices when not in use, using energy-efficient appliances, and so on. However, our impact can and should extend into our professional lives, and I think it is important for us to understand our responsibility as members of the technology industry. There are many “tools” out there which can contribute towards our companies' sustainability goals, but they are often not well understood. Even if we can't see immediate opportunities to contribute to decarbonisation, we should at least understand what these tools are. This gives us the power to spot opportunities as they arise.
According to the UN, the technology industry accounts for 2–3% of global emissions. This does not seem like a lot, but as online and digital technology continues to grow, this number could rise rapidly. There are two main sources of carbon emissions within the technology industry: the greatest contribution comes from the manufacturing of hardware devices, followed by the electricity consumed in delivering the various technology services. To fully understand this, we need to dig a bit deeper into the demand side of the equation: which products or services have the largest emissions impact?
I was surprised when I first discovered that topping the leaderboard was not data centres and server infrastructure, as I had expected, but end-user devices. There are a couple of reasons for this. The first is the sheer volume of end-user devices in circulation: desktops, laptops, mobile phones and tablets; many of us have three or more. Interestingly, over 70% of the emissions related to end-user devices come from the manufacturing, logistics and eventual disposal processes, so by the time a device has been purchased, most of its impact has already been made. The best strategy here is therefore to optimise the device estate: extend the useful life of devices and choose devices with a lower emissions footprint. This is an easy and cost-effective way to influence your emissions trajectory.
As I had originally thought, the next largest contributor is in fact the in-house data centre. There are two main sources of emissions: firstly, the power consumed by the servers and their cooling; and secondly, the manufacturing impact of the servers themselves. This confirms what we already know: running a traditional (non-virtualised) in-house data centre is not the optimal strategy for the future.
Both AWS and Microsoft have released case studies evidencing the significant emissions savings that can be achieved by shifting from an on-premises data centre to a cloud service provider. The reported savings range from 52% all the way to 92% across these two providers. There are a couple of main drivers of this:
Server utilisation: In traditional data centres with no hypervisor solution in place, average server utilisation is typically low, because the infrastructure is sized to handle daily or monthly peaks. Being able to provision compute and storage dynamically removes this idle time by balancing demand across a much larger estate. The second dimension of server utilisation is multi-tenancy: because multiple applications, companies or users can be hosted on the same server, one large server can provide the same compute as ten smaller ones. Both factors reduce the hardware required to service the same demand (a rough worked example follows these three drivers).
Efficiency: Cloud service providers are in the business of running data centres and selling the resulting services. Companies that are not in that business, and instead regard the data centre as a necessary enabler, are unlikely to achieve the same level of optimisation and efficiency, not purely because they lack the scale but also because it sits on a cost line and not a profit line.
Renewable energy: Both AWS and Azure have targets to be 100% powered by renewable energy by 2025, which removes the carbon emissions associated with powering the servers and the cooling systems.
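To put some rough numbers on the utilisation point, here is a back-of-the-envelope sketch in Python. Every figure in it is invented purely for illustration; the point is simply how peak-sizing and multi-tenancy interact to reduce the amount of hardware needed.

```python
import math

# All numbers below are hypothetical, for illustration only.
peak_demand_vcpus = 400      # capacity an on-premises estate must be sized for
average_demand_vcpus = 80    # what is actually in use most of the time
vcpus_per_server = 40

# Traditional estate: dedicated hardware, provisioned for the peak.
on_prem_servers = math.ceil(peak_demand_vcpus / vcpus_per_server)

# Shared, multi-tenant estate: tenants' peaks rarely coincide, so the
# provider can size the fleet much closer to aggregate average demand.
assumed_fleet_utilisation = 0.65   # assumed target, not a published figure
shared_servers = math.ceil(
    average_demand_vcpus / (vcpus_per_server * assumed_fleet_utilisation)
)

print(f"On-premises servers needed: {on_prem_servers}")   # 10
print(f"Shared-fleet servers needed: {shared_servers}")   # 4
print(f"Hardware reduction: {1 - shared_servers / on_prem_servers:.0%}")  # 60%
```

Real consolidation ratios depend entirely on your workload profiles, but the shape of the arithmetic is the same.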
The final contributor I want to talk about is software. Software creates the demand for compute and hardware, so it makes sense that practices which reduce that demand will also have a positive impact on emissions:
Firstly, let's look at architecture and how the choices we make can impact our overall energy efficiency. There are many patterns which can make a system more efficient. As an example, introducing serverless components into the architecture means the code can scale seamlessly and, more importantly, will only be running when the functionality is required. AWS has Lambda and Azure has Functions to address these requirements. Serverless is not a pattern which can be used everywhere, but being able to identify the right places to leverage it will make a difference.
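To give a flavour of what this looks like in practice, below is a minimal AWS Lambda handler written in Python. The event shape (an API Gateway proxy event) and the report_id field are assumptions made purely for illustration; the point is that this code only consumes compute while an event is actually being processed, and nothing sits idle in between.

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler: runs only while handling an event."""
    # Assumes an API Gateway proxy integration; the payload is illustrative.
    body = json.loads(event.get("body") or "{}")
    report_id = body.get("report_id", "unknown")   # hypothetical field

    # ... the actual work would happen here ...

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"report_id": report_id, "status": "processed"}),
    }

if __name__ == "__main__":
    # Local smoke test with a fake event.
    fake_event = {"body": json.dumps({"report_id": "r-123"})}
    print(lambda_handler(fake_event, None))
```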
There was an interesting study by a group of Portuguese academics in 2017 measuring energy efficiency across a variety of programming languages. The same algorithms were coded in more than 20 languages, and measurements were taken of the joules of energy consumed by the CPU and memory while running the code. Unsurprisingly, compiled languages such as C, C++ and Rust outperformed virtual-machine languages such as Java and C#, with interpreted languages such as Perl, Python and PHP being the least efficient.
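If you want a rough feel for this kind of measurement on your own machine, the sketch below reads the Intel RAPL energy counter that Linux exposes through the powercap interface, before and after a piece of work. This is not the study's methodology, just a quick approximation: the sysfs path is an assumption that varies between machines, reading it may require elevated privileges, and counter wrap-around is ignored.

```python
import time
from pathlib import Path

# Assumes an Intel CPU on Linux exposing the package-level RAPL domain at
# this path; adjust (and run with sufficient privileges) for your machine.
RAPL_ENERGY = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

def read_energy_joules() -> float:
    """Read the cumulative package energy counter, converted to joules."""
    return int(RAPL_ENERGY.read_text()) / 1_000_000

def busy_work(n: int = 2_000_000) -> int:
    """Stand-in workload: sum of squares, purely to burn some CPU."""
    return sum(i * i for i in range(n))

start_energy, start_time = read_energy_joules(), time.time()
busy_work()
used_joules = read_energy_joules() - start_energy
elapsed = time.time() - start_time
print(f"~{used_joules:.2f} J consumed over {elapsed:.2f} s")
```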
Movement of data: Moving data is hugely expensive in emissions terms, and the further it must travel, the more expensive it is. Our architectures should be designed to minimise the movement of data; where data does need to be transferred between systems, the data sets should be optimised before they are transported, and a compression strategy should be in place to reduce the bytes on the pipe. Extending this point, if the aim is to reduce data movement and thereby the overall footprint of the technology stack, then we should certainly be identifying every opportunity to take advantage of edge computing. Edge computing keeps the source of the data and the processing of that data close together (at the edge), removing the energy impact of transferring the data.
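Here is a tiny Python sketch of the “optimise, then compress” idea, with invented record and field names: strip out the fields the receiving system never uses, then gzip what is left before it goes on the wire.

```python
import gzip
import json

# Fake records, purely for illustration; the bulky debug_blob field stands in
# for data the receiving system never actually needs.
records = [{"id": i, "name": f"item-{i}", "debug_blob": "x" * 200}
           for i in range(1_000)]

# 1. Optimise the data set: keep only the fields the consumer uses.
slim = [{"id": r["id"], "name": r["name"]} for r in records]

# 2. Compress what is left before transferring it.
raw = json.dumps(slim).encode("utf-8")
compressed = gzip.compress(raw)

print(f"Original payload: {len(json.dumps(records).encode('utf-8')):,} bytes")
print(f"Trimmed payload:  {len(raw):,} bytes")
print(f"Gzipped payload:  {len(compressed):,} bytes")
```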
Secondly, let's talk about algorithms. Not a lot needs to be written about this, except to say that algorithms, particularly those run at scale, should be optimised as far as possible. A small change run a million times a day can have a significant impact.
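As a concrete example of the kind of small change I mean, the sketch below compares membership tests against a Python list with membership tests against a set. The sizes are arbitrary, but at scale the difference in CPU time, and therefore energy, is dramatic.

```python
import time

allowed_ids = list(range(5_000))        # O(n) membership checks
allowed_ids_set = set(allowed_ids)      # O(1) membership checks
lookups = [i * 7 % 10_000 for i in range(20_000)]

t0 = time.perf_counter()
hits_list = sum(1 for i in lookups if i in allowed_ids)
t1 = time.perf_counter()
hits_set = sum(1 for i in lookups if i in allowed_ids_set)
t2 = time.perf_counter()

assert hits_list == hits_set            # same answer, very different cost
print(f"List lookups: {t1 - t0:.3f} s")
print(f"Set lookups:  {t2 - t1:.3f} s")
```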
Finally, let's talk about database efficiency. Storing and processing data drives up CPU, memory and network consumption, and sticking to some simple practices here will make your stack more efficient. We all know that database maintenance is important for performance; repeatedly running a query against a column that has not been indexed is highly wasteful. Similarly, storing data that is not needed has a compounding impact: not only do you consume hardware through the storage and backup of this data, but you also use compute and memory every time you process a data set that contains the irrelevant records. A repeatable data destruction policy and process is very important.
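Below is a small sqlite3 sketch of both habits: index the column your frequent query filters on, and run a repeatable retention job that removes data past its useful life. The table, columns and 365-day retention period are all invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER,
        created_at TEXT,        -- ISO dates, e.g. '2023-01-31'
        payload TEXT
    )
""")

# Index the column the hot query filters on, so repeated runs no longer
# scan the whole table.
conn.execute("CREATE INDEX IF NOT EXISTS idx_events_customer "
             "ON events (customer_id)")

# A repeatable retention job: destroy rows past their useful life instead of
# storing, backing up and re-processing them indefinitely.
conn.execute("DELETE FROM events WHERE created_at < date('now', '-365 days')")
conn.commit()
```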
Whether or not sustainability is a topic that resonates with you, it is a definite part of our future. Governments, regulatory bodies and consumers are demanding change from participants in the global marketplace. We can wait for someone to insist we change, or we can get ahead of the curve and make the changes now. From a personal perspective, I see my investment as a gift to future generations. People often comment that a single person or family cannot make a difference, but I know that there are millions of people weighing up the same dilemma. If even a percentage of these people invest themselves in sustainability, it will make a big difference.
To conclude, I wanted to share one final interesting piece of research, from the University of Massachusetts Amherst, which states that if we took an extra full day off each week, we would reduce our carbon footprint by 30%. #justsaying 🙂
Penny Futter
Becker, Gerrit, et al. “The Green IT Revolution: A Blueprint for CIOs to Combat Climate Change.” McKinsey & Company, 15 Sept. 2022, www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-green-it-revolution-a-blueprint-for-cios-to-combat-climate-change.