Performance analysis of low dimensional word embeddings to support green computing

Authors

DOI:

https://doi.org/10.32968/psaie.2022.2.3.

Keywords:

green computing, word2vec, transition-based dependency parsing

Abstract

It has become increasingly important to pay attention to how much energy we use to operate Artificial Intelligence (AI) and Machine Learning (ML) systems. To implement environmentally responsible solutions, we need to reconsider the storage resources and computational power we consume. Training a natural language model is a time- and energy-demanding process. In recent years language models have grown extremely large, and the trend continues. Building these models consumes an enormous amount of computational power and therefore demands huge amounts of energy. In our research we trained and evaluated low-dimensional word2vec embedding models and analyzed their performance in building transition-based dependency parsers, showing that low-dimensional models are still competitive and may be sufficient in many use cases.
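As a minimal sketch of the kind of setup the abstract describes, the snippet below trains a low-dimensional word2vec model with the gensim library; the toy corpus, the vector_size of 25, and the other hyperparameters are illustrative assumptions only, not the authors' actual configuration.

```python
# Hypothetical sketch: training a low-dimensional word2vec model with gensim.
# The corpus and hyperparameters are placeholders; the paper does not state
# the exact training setup used by the authors.
from gensim.models import Word2Vec

# Toy tokenized corpus standing in for a real training corpus.
sentences = [
    ["the", "parser", "reads", "the", "sentence"],
    ["low", "dimensional", "embeddings", "reduce", "training", "cost"],
    ["dependency", "parsing", "builds", "a", "tree", "over", "tokens"],
]

# A small vector_size (e.g. 25 instead of the common 300) shrinks the
# embedding matrix and reduces training time and energy use.
model = Word2Vec(
    sentences,
    vector_size=25,   # low-dimensional embeddings
    window=5,
    min_count=1,
    sg=1,             # skip-gram variant
    workers=4,
    epochs=10,
)

# The resulting vectors can then serve as input features for a downstream
# model such as a transition-based dependency parser.
vector = model.wv["parser"]
print(vector.shape)  # (25,)
```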

Published

2022-07-12