The Future of GenAI Development: Why 80% of Applications Will Be Built on Existing Infrastructure by 2028
By 2028, 80% of GenAI apps will run on existing data platforms, cutting complexity and delivery time by 50%.


The enterprise Generative AI landscape is undergoing a fundamental transformation.
According to Gartner's latest research, organizations worldwide are set to abandon the costly practice of building GenAI applications from scratch.
Instead, by 2028, 80% of generative AI business applications will be developed on existing data management platforms, reducing both complexity and delivery time by 50%.
This paradigm shift represents more than just a technical preference—it signals a strategic evolution in how enterprises approach AI implementation. The implications extend far beyond cost savings, touching on everything from AI Governance to competitive advantage in an increasingly AI-driven marketplace.

The Current Challenge
Today's GenAI development landscape resembles a complex puzzle with scattered pieces. Building GenAI business applications involves integrating large language models (LLMs) with an organization's internal data and adopting rapidly evolving technologies like vector search, metadata management, prompt design and embedding.
The problem isn't technological capability; it's coordination. "Without a unified management approach, adopting these scattered technologies leads to longer delivery times and potential sunk costs for organizations," explains Prasad Pore, Senior Director Analyst at Gartner.
This fragmentation creates several critical pain points:
- Integration Complexity: Organizations struggle to connect Large Language Models with proprietary data systems
- Technology Sprawl: Multiple vendors and platforms create management overhead
- Skill Gaps: Teams need expertise across diverse, rapidly evolving technologies
- Governance Challenges: Ensuring security and compliance across fragmented systems
The Architectural Foundation of Tomorrow's GenAI
Central to this transformation is RAG AI (Retrieval-Augmented Generation), which Gartner identifies as a cornerstone for deploying GenAI applications, providing implementation flexibility, enhanced explainability and composability with LLMs.
RAG AI addresses a fundamental limitation of current Large Language Models. Most LLMs are trained on publicly available data and are not highly effective on their own at solving specific business challenges. However, when these LLMs are combined with business-owned datasets using the RAG architectural pattern, their accuracy is significantly enhanced.
The technical advantages of RAG AI extend beyond accuracy improvements:
Enhanced Context Awareness: By integrating real-time business data, RAG AI systems can provide responses grounded in current organizational knowledge rather than static training data.
Improved Traceability: Data catalogs can help capture semantic information, enriching knowledge bases and ensuring the right context and traceability for data used in RAG solutions.
Reduced Hallucinations: By anchoring responses to verified business data, RAG AI significantly reduces the risk of Large Language Models generating inaccurate or fabricated information.
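The RAG pattern described above can be sketched in a few lines: retrieve the business-data chunks most similar to the query, then build a prompt grounded in them. This is a minimal, self-contained illustration; the bag-of-words "embedding" stands in for a real embedding model, and all function names and sample data are illustrative, not any specific platform's API.

```python
# Minimal RAG sketch: retrieve relevant business-data chunks, then ground
# the prompt in them. The bag-of-words "embedding" is a stand-in for a
# real embedding model; names and data are illustrative.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: lowercase bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Rank knowledge-base chunks by similarity to the query, keep top k."""
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Anchor the LLM to retrieved business data to reduce hallucinations."""
    ctx = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{ctx}\n\nQuestion: {query}"

chunks = [
    "Q3 revenue grew 12% driven by the APAC region.",
    "The refund policy allows returns within 30 days.",
    "Vector search indexes embeddings for fast retrieval.",
]
prompt = build_prompt("What is the refund policy?",
                      retrieve("refund policy returns", chunks))
print(prompt)
```

In a production system the retrieval step would query a vector database and the prompt would be sent to an LLM, but the grounding logic follows the same shape.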
Three Strategic Imperatives for Enterprise Success
Gartner's research reveals three critical recommendations for organizations preparing for this shift:
1. Platform Evolution Strategy
Organizations must evaluate whether current data management platforms can be transformed into a RAG-as-a-service platform, replacing stand-alone document/data stores as the knowledge source for business GenAI applications.
This isn't merely a technical upgrade—it represents a fundamental reimagining of data architecture. Companies need to assess their existing infrastructure's capability to support vector databases, semantic search, and real-time data integration required for effective RAG AI implementation.
2. Technology Integration Prioritization
The second imperative focuses on evaluating and integrating RAG technologies such as vector search, graph capabilities, and chunking from existing data management solutions or their ecosystem partners when building GenAI applications.
This approach offers resilience advantages. These options are more resilient to technological disruptions and compatible with organizational data, providing a stable foundation as Generative AI technologies continue evolving rapidly.
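Chunking is the least glamorous of these technologies but shapes everything downstream: documents are split into overlapping pieces before being embedded and indexed for vector search. The sketch below shows a simple character-window approach; the window size and overlap are assumptions, and production systems often chunk by tokens or semantic boundaries instead.

```python
# Illustrative chunking sketch: split a document into overlapping,
# fixed-size chunks prior to embedding and indexing. Sizes are assumptions;
# real systems often chunk by tokens or semantic boundaries.
def chunk_text(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    """Slide a window of `size` characters, stepping by `size - overlap`."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

doc = ("RAG systems split documents into chunks, embed each chunk, "
       "and index the embeddings for vector search.")
chunks = chunk_text(doc)
print(len(chunks), "chunks")
```

The overlap means the tail of one chunk repeats at the head of the next, so a sentence cut at a boundary still appears intact in at least one chunk.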
3. Advanced Metadata Utilization
The third recommendation emphasizes leveraging not only technical metadata, but also operational metadata generated at runtime in data management platforms. This comprehensive metadata strategy helps protect GenAI applications from malicious use, privacy issues and intellectual property leaks.
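One way to picture this metadata strategy: each chunk carries technical metadata assigned at ingest time (source, sensitivity) plus operational metadata updated at runtime (access counts), and retrieval consults both. The field names and clearance levels below are illustrative assumptions, not any platform's actual schema.

```python
# Sketch of metadata-guarded retrieval: technical metadata (source,
# sensitivity) is set at ingest time; operational metadata (access_count)
# is updated at runtime. Field names and levels are illustrative.
from dataclasses import dataclass

@dataclass
class Chunk:
    text: str
    source: str            # technical metadata: where the data came from
    sensitivity: str       # e.g. "public", "internal", "restricted"
    access_count: int = 0  # operational metadata, updated at runtime

def retrieve_allowed(chunks: list[Chunk], user_clearance: str) -> list[Chunk]:
    """Filter out chunks above the caller's clearance and record each access."""
    order = {"public": 0, "internal": 1, "restricted": 2}
    allowed = [c for c in chunks
               if order[c.sensitivity] <= order[user_clearance]]
    for c in allowed:
        c.access_count += 1  # runtime operational metadata
    return allowed

kb = [
    Chunk("Public pricing page content.", "website", "public"),
    Chunk("Unreleased product roadmap.", "wiki", "restricted"),
]
visible = retrieve_allowed(kb, user_clearance="internal")
```

Filtering before retrieval, rather than after generation, is what keeps restricted data out of the prompt entirely and thus out of any model output.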

The Critical Success Factor
As organizations consolidate their GenAI development on existing platforms, AI Governance becomes paramount. The integration of business-critical data with Large Language Models introduces new risk vectors that traditional IT governance frameworks weren't designed to address.
Effective AI Governance in this context requires:
Data Lineage Tracking: Understanding how business data flows through RAG AI systems to ensure accuracy and compliance.
Access Control: Implementing granular permissions to ensure GenAI applications only access appropriate data sources.
Audit Capabilities: Maintaining comprehensive logs of model decisions and data usage for regulatory compliance and performance optimization.
Quality Assurance: Establishing continuous monitoring to detect model drift, bias, or performance degradation over time.
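The audit requirement above can be sketched concretely: record who asked what, which data sources fed the answer, and when, before any response is returned. The log structure, function names, and the sample source identifier below are all hypothetical, hedged assumptions rather than a specific compliance framework's format.

```python
# Hedged sketch of an audit trail on the RAG query path: log user, query,
# data sources, and timestamp so compliance reviews can trace every
# response. Log format and names are illustrative assumptions.
import time

audit_log: list[dict] = []

def answer_with_audit(user: str, query: str, sources: list[str]) -> str:
    """Record the query, its data sources, and a timestamp, then answer."""
    audit_log.append({
        "user": user,
        "query": query,
        "sources": sources,
        "ts": time.time(),
    })
    # A real system would call the LLM here; we return a stub response.
    return f"Answer grounded in {len(sources)} source(s)."

resp = answer_with_audit("alice", "What was Q3 revenue?",
                         ["finance_db/q3_report"])
```

Appending the log entry before generating the answer ensures even failed or rejected requests leave a trace, which most regulatory regimes expect.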
The Competitive Advantage of Early Adoption
Organizations that successfully navigate this transition will gain significant competitive advantages.
The predicted 50% reduction in complexity and delivery time translates to faster time-to-market for AI-powered features, reduced development costs, and the ability to iterate more rapidly on GenAI solutions. In markets where AI capabilities increasingly differentiate leaders from laggards, this speed advantage could prove decisive.
Implementation Roadmap: From Strategy to Execution
Successfully implementing this platform-centric approach requires a structured methodology:
Assessment Phase: Evaluate current data management capabilities and identify gaps preventing effective RAG AI implementation.
Architecture Design: Plan the integration of vector databases, semantic search, and metadata management systems within existing infrastructure.
Pilot Development: Begin with low-risk use cases to validate the platform approach and build organizational confidence.
Scaling Strategy: Develop processes for rapidly deploying new GenAI applications across the validated platform architecture.
Governance Integration: Embed AI Governance processes throughout the development lifecycle to ensure sustainable, compliant growth.
The Path Forward
The shift toward building GenAI applications on existing data platforms represents more than a technological trend—it's a strategic imperative for organizations serious about scaling AI capabilities.
By embracing RAG AI architectures, implementing comprehensive AI Governance, and leveraging existing infrastructure investments, enterprises can position themselves to capture the full value of Generative AI while avoiding the pitfalls of fragmented, costly development approaches.
The question isn't whether this transformation will occur, but how quickly organizations can adapt their strategies to leverage this fundamental shift in GenAI development. Those who act decisively on Gartner's insights will find themselves well-positioned to lead in the AI-driven economy of 2028 and beyond.
Ready to Build Your Next GenAI Application on Your Existing Infrastructure?
As Gartner predicts the future of GenAI development shifting toward existing data platforms, forward-thinking organizations need proven partners to navigate this transformation. Makebot specializes in building enterprise-grade LLM and RAG AI solutions that integrate seamlessly with your current infrastructure.
Don't wait until 2028 to gain competitive advantage. Start building your RAG AI-powered applications today with Makebot's all-in-one solution for multi-LLM integration, fine-tuning, and enterprise data integration.
Ready to lead the GenAI transformation?
📧 Contact Makebot: b2b@makebot.ai
🌐 Learn More: makebot.ai