Author ORCID Identifier

0000-0002-3300-1152

Document Type

Conference Paper

Disciplines

Computer Sciences, Information Science

Publication Details

https://link.springer.com/chapter/10.1007/978-3-031-73110-5_22

https://doi.org/10.1007/978-3-031-73110-5_22

Abstract

In the world of artificial intelligence (AI), large language models (LLMs) are leading the way, transforming how people understand and use language. These models have significantly impacted various domains, from natural language processing (NLP) to content generation, sparking a wave of innovation and exploration. However, this rapid progress brings to light the environmental implications of LLMs, particularly their significant energy consumption and carbon emissions during training and operation. Addressing this requires a shift towards more energy-efficient practices in training and deploying LLMs, balancing AI innovation with environmental responsibility. This paper emphasizes the need to improve the energy efficiency of LLMs so that their benefits align with environmental sustainability. The discussion covers the significant power consumption associated with training LLMs. We present a generic energy-efficient training framework for LLMs that employs federated learning (FL) and integrates renewable energy (RE), aiming to mitigate the environmental impact of LLMs. Our objective is to encourage the adoption of sustainable AI practices that preserve the capabilities of LLMs while reducing their environmental impact, thus guiding the AI community towards the responsible advancement of technology.

DOI

https://doi.org/10.1007/978-3-031-73110-5_22

Funder

SFI

Creative Commons License

Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
