The use of Machine Learning (ML) and Large Language Models (LLMs), both branches of Artificial Intelligence (AI), to enable, simplify, and accelerate application development and modernization is gaining popularity, but it also raises many concerns.
Application modernization takes many forms and is facilitated in many ways, with the latest trend being code assistants and code generation via artificial intelligence. While almost everyone has heard the term “artificial intelligence,” whether in the movies or in various media outlets, there are several well-warranted debates around its meaning and its usefulness when applied to our everyday lives.
For businesses, the most important concerns may be questions surrounding the validity, integrity, and fidelity of the output when applied to the task of business application modernization. AI encompasses a broad range of technologies and methodologies aimed at creating systems that can perform tasks that typically require human intelligence. These tasks include problem-solving, decision-making, language understanding, and contextual perception. Below is a quick breakdown of AI and ML subsets and how they can be applied to application modernization.
Generative AI
Generative AI is a subset of artificial intelligence focused on creating or generating new content, such as images, text, music, or even entire scenarios, that is not explicitly programmed into the system. Generative AI often relies on neural-network techniques, most notably Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs), to generate content that mimics patterns found in training data.
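To make the adversarial idea behind GANs a little more concrete, the sketch below shows a minimal generator/discriminator pair and one training step. It assumes the PyTorch library is available, and every layer size, hyperparameter, and name here is a hypothetical placeholder rather than a recommended configuration.

```python
# Minimal GAN sketch (assumes PyTorch; sizes and hyperparameters are illustrative only).
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64  # hypothetical noise and sample dimensions

# Generator: maps random noise to synthetic samples intended to mimic the training data.
generator = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, data_dim), nn.Tanh(),
)

# Discriminator: scores a sample as "real" (from the data) or "fake" (from the generator).
discriminator = nn.Sequential(
    nn.Linear(data_dim, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

bce = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_batch: torch.Tensor) -> None:
    """One adversarial update: the discriminator learns to spot fakes, the generator learns to fool it."""
    batch_size = real_batch.size(0)
    real_labels = torch.ones(batch_size, 1)
    fake_labels = torch.zeros(batch_size, 1)

    # 1) Train the discriminator on real data vs. detached generator output.
    fake_batch = generator(torch.randn(batch_size, latent_dim)).detach()
    d_loss = bce(discriminator(real_batch), real_labels) + \
             bce(discriminator(fake_batch), fake_labels)
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to make the discriminator label its output as real.
    g_loss = bce(discriminator(generator(torch.randn(batch_size, latent_dim))), real_labels)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

Called repeatedly over batches of real data (scaled to the generator's Tanh output range of -1 to 1), steps like this are what let a generative model reproduce the statistical patterns of its training data, which is also why the provenance and coverage of that data matter so much when the generated artifact is code.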
AI has many facets. It is closely related to Machine Learning, yet the two are separate and distinct concepts: while many AI systems incorporate ML techniques, it is entirely possible to have AI without ML.
Rule-Based Systems
Before the rise of ML, AI systems relied heavily on rule-based approaches. In these systems, human experts would encode a set of rules and logical deductions that the AI system would follow to make decisions or perform tasks. These rules are typically deterministic and do not involve learning from data. Expert systems, rule-based expert systems, and knowledge-based systems are examples of AI systems that do not necessarily involve ML.
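As a schematic illustration, the sketch below encodes a handful of hypothetical, hand-written rules (here, triaging legacy programs against invented thresholds) and applies them deterministically. The domain, thresholds, and field names are placeholders for illustration, not rules from any real system.

```python
# Minimal rule-based system sketch: human-authored rules, no learning from data.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]  # when the rule fires
    action: str                        # conclusion the rule asserts

# Hypothetical expert-authored rules for triaging legacy programs.
RULES = [
    Rule("high-complexity", lambda p: p["cyclomatic_complexity"] > 50, "flag for refactoring"),
    Rule("dead-code",       lambda p: p["call_count"] == 0,            "candidate for retirement"),
    Rule("shared-copybook", lambda p: p["copybook_refs"] > 10,         "review shared data dependencies"),
]

def evaluate(program: dict) -> list[str]:
    """Apply every rule whose condition holds; the outcome is fully deterministic."""
    return [rule.action for rule in RULES if rule.condition(program)]

print(evaluate({"cyclomatic_complexity": 72, "call_count": 0, "copybook_refs": 3}))
# -> ['flag for refactoring', 'candidate for retirement']
```

Because the rules are explicit, the system's behavior is inspectable and repeatable, which is exactly the property that made rule-based approaches attractive long before statistical learning took over.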
Symbolic AI
Symbolic AI, also known as classical AI, focuses on the manipulation of symbols and logic to perform intelligent tasks. It involves techniques such as logic programming, expert systems, and automated reasoning. Symbolic AI is based on formal logic and rules rather than statistical patterns learned from data.
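One classic symbolic technique is forward chaining: starting from known facts, the system repeatedly applies if-then rules written in symbols until no new conclusions can be derived. The facts and rule names below are hypothetical placeholders, purely to show the mechanism.

```python
# Minimal forward-chaining sketch: purely symbolic inference, no statistics or training data.
facts = {"reads-vsam-file", "has-cics-screens"}  # hypothetical starting facts

# Rules as (premises, conclusion): "if all premises hold, conclude X".
rules = [
    ({"reads-vsam-file"}, "needs-data-migration"),
    ({"has-cics-screens"}, "needs-ui-rearchitecture"),
    ({"needs-data-migration", "needs-ui-rearchitecture"}, "full-modernization-candidate"),
]

def forward_chain(facts, rules):
    """Repeatedly apply rules until no new conclusions can be derived (a fixed point)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain(facts, rules))
# includes the chained conclusion 'full-modernization-candidate'
```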
Search and Optimization Algorithms
AI systems can also be built using search and optimization algorithms. These algorithms explore large solution spaces to find optimal solutions to problems. Techniques such as depth-first search, breadth-first search, A* search, and genetic algorithms fall into this category.
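As a small example of the pattern these techniques share, the sketch below uses breadth-first search to find the shortest path through a hypothetical call graph of program modules; A*, depth-first search, and genetic algorithms explore the same kind of solution space with different strategies. The module names are invented for illustration.

```python
# Minimal breadth-first search sketch over a hypothetical module call graph.
from collections import deque

graph = {  # adjacency list: module -> modules it calls
    "ORDERS":    ["BILLING", "INVENTORY"],
    "BILLING":   ["LEDGER"],
    "INVENTORY": ["LEDGER", "SHIPPING"],
    "LEDGER":    [],
    "SHIPPING":  [],
}

def shortest_path(start, goal):
    """Explore the graph level by level; the first path that reaches the goal is the shortest."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no route between the two modules

print(shortest_path("ORDERS", "SHIPPING"))  # -> ['ORDERS', 'INVENTORY', 'SHIPPING']
```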
Knowledge Representation and Reasoning
AI systems can use formal methods for knowledge representation and reasoning. They can represent knowledge in various forms, such as ontologies and semantic networks, and use logical reasoning mechanisms to infer new knowledge from existing knowledge bases.

While ML has become increasingly prominent in modern AI systems due to its ability to learn from data and make predictions without explicit programming, it's important to recognize that AI encompasses a broader range of techniques and methodologies beyond ML. Depending on the requirements and constraints of a particular problem, AI systems may employ a combination of rule-based systems, symbolic AI, search algorithms, knowledge bases, and other techniques without relying on ML and Large Language Models (LLMs).
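To make this concrete, the sketch below stores a few hypothetical facts as subject-predicate-object triples, in the spirit of an ontology or semantic network, and applies one logical rule (transitivity of "is-a") to infer new knowledge from the existing knowledge base, with no ML involved. The triples themselves are invented for illustration.

```python
# Minimal knowledge-representation sketch: hypothetical triples plus one inference rule.
triples = {
    ("COBOL-program", "is-a", "batch-asset"),
    ("batch-asset",   "is-a", "legacy-asset"),
    ("legacy-asset",  "is-a", "application-component"),
}

def infer_is_a(kb):
    """Apply transitivity of 'is-a' until a fixed point: if A is-a B and B is-a C, then A is-a C."""
    derived = set(kb)
    changed = True
    while changed:
        changed = False
        new = {
            (a, "is-a", c)
            for (a, p1, b1) in derived if p1 == "is-a"
            for (b2, p2, c) in derived if p2 == "is-a" and b1 == b2
        }
        if not new <= derived:
            derived |= new
            changed = True
    return derived

for triple in sorted(infer_is_a(triples)):
    print(triple)
# inferred triples include ('COBOL-program', 'is-a', 'legacy-asset')
# and ('COBOL-program', 'is-a', 'application-component')
```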
How We Can Help
Over more than three decades, AveriSource has parsed over two billion lines of complex enterprise application code for more than 250 clients. From that experience, we have built, and continue to enhance, our own knowledge base (or model), rules, and algorithms for understanding legacy mainframe and midrange applications and data systems, and for accelerating analysis, planning, and modernization efforts.
The AveriSource Platform™ products (Scan, Inventory, Discover, Analyze, and Transform) each rely on and build upon the information captured by the prior products to accomplish their designed goals. For instance, when generating data microservices and test screens for the re-architected data framework, Transform draws on the business-rule identification performed by Analyze: because it knows which variables are used in business rules, it can ensure the relevant fields carry the required entry and sizing constraints. Likewise, when generating business logic microservices, Transform carries forward the variable translations entered in the Discover Data Dictionary, so the generated code is easier to understand and maintain.
The biggest concerns with the current generation of Generative AI code-conversion products are the lack of visibility into the model the code generator was trained on, where that training data originated, how complete and thorough the use cases are, and whether the generated code reliably represents the functionality of the legacy input. This uncertainty forces added focus on reviewing the generated code and on additional testing.
To overcome these concerns, AveriSource is currently part of a team of vendors working with the Open Mainframe Project (OMP) to build a Machine Learning model based solely on COBOL (initially), to ensure predictable and reliable code generation for various application transformation requirements.
AveriSource listens to our partners, clients, and the market, and continues to address the concerns they raise: reducing manual pre-project preparation activities; improving code quality, the speed of code ingestion, and reporting speed and configurability; accelerating application re-architecture and code generation; and greatly improving user interaction, training, and support. We are constantly and proactively working to improve processes, anticipate usage bottlenecks, expand coverage of legacy languages and their constructs, and simplify implementation efforts using current, impactful technologies.
As we, the industry, improve AI with more reliable models and an expanding set of proven use cases, we, AveriSource, will continue to lead the charge in applying it to application understanding, development, transformation, maintenance, and future design. Ever expanding the aperture and blurring the lines between “programming” languages, we, as a collective IT community, may soon see business analysts and technical experts performing “any-to-any” language conversion, as well as “natural language programming,” with automated deployments driven by AI-created executable objects. The future of AI-powered modernization is bright, and as an industry leader, we look forward to playing a pivotal role in its success.
Here’s to the future.