Optimizing Data Engineering with MLOps for Scalable AI Workflows

Authors

  • Thejaswi Adimulam Independent Researcher, USA
  • Souratn Jain Independent Researcher, USA
  • Manoj Bhoyar Independent Researcher, USA
  • Guru Prasad Selvarajan Independent Researcher, USA

Keywords:

MLOps, Data Engineering, AI Scalability, Machine Learning Operations, Continuous Integration, Automated Deployment, AI Workflow Optimization, Data Pipeline Efficiency

Abstract

Integrating and scaling AI within organizations in today's dynamic business environment requires strong data engineering alongside sound MLOps practices. This work presents a novel MLOps framework that enhances big data handling for AI applications by improving data engineering. The framework addresses these concerns through process automation and procedures that support continuous integration of data and continuous deployment of models. In a pilot implementation at one of the largest technology companies, data pipeline efficiency improved by 40% and model deployment time improved by 25%. Moreover, the framework created an environment in which data engineering and machine learning teams could collaborate closely, making AI development more productive. These outcomes indicate that the MLOps approach can more effectively facilitate the scale-up of diverse AI workflows. The work also highlights the importance of sustaining the ongoing development of AI techniques and organizational efficiency through integrated data engineering and MLOps systems.
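The abstract refers to automated continuous integration of data and deployment of models. As a rough illustration only (not taken from the paper), the minimal Python sketch below shows one way such a CI-style gate might look: an incoming data batch is validated before a deployment step is triggered. All function names, thresholds, and the `deploy_model` stub are hypothetical.

```python
# Illustrative sketch only: a minimal CI-style gate that validates a data batch
# before a model redeployment is triggered. Names and thresholds are hypothetical
# and are not the authors' framework.
from dataclasses import dataclass
from typing import List


@dataclass
class ValidationResult:
    passed: bool
    issues: List[str]


def validate_batch(rows: List[dict], required_fields: List[str],
                   max_null_ratio: float = 0.05) -> ValidationResult:
    """Check that required fields are present and null ratios stay under a threshold."""
    if not rows:
        return ValidationResult(False, ["empty batch"])
    issues = []
    for field in required_fields:
        nulls = sum(1 for row in rows if row.get(field) is None)
        ratio = nulls / len(rows)
        if ratio > max_null_ratio:
            issues.append(f"{field}: null ratio {ratio:.2%} exceeds {max_null_ratio:.2%}")
    return ValidationResult(not issues, issues)


def deploy_model(version: str) -> None:
    # Placeholder for a real deployment call, e.g. pushing to a model registry.
    print(f"Deploying model version {version}")


if __name__ == "__main__":
    batch = [{"user_id": 1, "value": 3.2}, {"user_id": 2, "value": None}]
    result = validate_batch(batch, ["user_id", "value"], max_null_ratio=0.5)
    if result.passed:
        deploy_model("2024.09.01")
    else:
        print("Deployment blocked:", result.issues)
```

In a full pipeline, a gate like this would typically run automatically on each new data batch, with deployment proceeding only when validation passes.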

Published

01-09-2024

How to Cite

Thejaswi Adimulam, Souratn Jain, Manoj Bhoyar, & Guru Prasad Selvarajan. (2024). Optimizing Data Engineering with MLOps for Scalable AI Workflows. Well Testing Journal, 33, 29–50. Retrieved from https://welltestingjournal.com/index.php/WT/article/view/33.SI.29

Section

Original Research Articles
