Detailed Project History
Below is a selection of the projects and works by César Zea from 1993 to the present (32 years developing and designing software solutions and leading teams).
Some works and projects have not been included for confidentiality reasons.
From 2021 to present
Celestial Systems
(4 years)
Principal AI & Data Architect
& Software Developer
Tasked with building a high-performance Data & AI division from the ground up to align with the company's strategic goal of becoming an industry leader in artificial intelligence. This initiative involved designing the entire organizational blueprint, securing executive buy-in, and leading the formation of multiple high-functioning teams from scratch.
- Strategic Roadmap & Vision: Formulated the multi-year strategic roadmap for the Data & AI division, presenting a clear vision to leadership that connected our technical capabilities directly to business growth and market differentiation.
- Organizational Blueprint Design: Architected the complete organizational structure for the division, defining the roles, responsibilities, and composition of new, specialized teams including Dataiku, MS Fabric, Python AI, Front-End AI, and Snowflake.
- Resource Optimization & Hiring Strategy: Authored the formal AI Resource Optimization and Hiring Plan, identifying critical resource misalignments. This plan secured executive approval for hiring new professionals and restructured the division to free senior AI specialists from operational duties, enabling them to focus on high-value innovation.
- Talent Acquisition & Team Building: Drove the talent acquisition process from end to end. I provided direct support to HR by defining job descriptions and technical requirements, and personally led the interview and selection process to build our expert teams from the ground up.
- Culture & Collaboration Framework: Defined and implemented the division's work culture, establishing best practices for communication, a robust framework for shared documentation, and a collaborative environment centered on continuous learning and excellence.
As the technical and innovation leader for the newly formed Data & AI division, my primary responsibility was to empower our teams with the knowledge, tools, and vision necessary to achieve technical excellence. This involved creating a culture of innovation, mentoring talent, and leading hands-on R&D efforts.
- Knowledge Sharing Framework: Established and personally led recurring "Knowledge Sharing Sessions" to disseminate expertise on advanced AI/ML topics, cloud platforms, and engineering best practices, ensuring rapid and continuous upskilling across the entire division.
- Proof of Concept (POC) Leadership: Spearheaded the ideation, architecture, and development of both internal and client-facing POCs. These initiatives served to validate new technologies, mitigate project risk, and demonstrate cutting-edge capabilities in Generative AI, real-time analytics, and other emerging fields.
- Team Training & Onboarding: Designed and executed the complete technical onboarding and training programs for all new hires. This program was instrumental in accelerating their ramp-up time and ensuring their seamless integration into high-performing, productive teams.
- Mentorship & Architectural Guidance: Provided continuous technical mentorship to Team Leads and senior engineers, guiding critical architectural decisions, resolving complex technical challenges, and promoting elite best practices in software engineering and data science.
- Professional Development & Certifications: Championed the professional growth of the team by defining a clear path for development and encouraging the pursuit of key industry certifications in Azure, Dataiku, Snowflake, and other core technologies, solidifying our division's expertise.
Acting as the bridge between the technical division and the commercial success of the company, I held ultimate responsibility for the entire project lifecycle. This role encompassed everything from initial client engagement and solution design to final delivery and ensuring a positive return on investment for both the client and Celestial Systems.
- Full Delivery & P&L Ownership: Assumed end-to-end ownership for the entire project lifecycle and the division's profitability. I was accountable for transforming the newly-formed teams into successful production units that delivered high-quality, ROI-positive outcomes for our clients.
- SOW & Commercial Offer Authority: Served as the final technical and strategic authority for all commercial proposals. I personally architected the solutions, defined the scope and deliverables, and wrote, reviewed, and approved all Statements of Work (SOWs) and offers generated by the Data & AI division.
- Pre-Sales Leadership & Client Engagement: Provided expert technical leadership to the sales team by leading high-stakes client workshops, architecting solutions for complex business problems, and effectively communicating our value proposition to both technical and executive-level audiences.
- Project & Task Management Framework: Defined and implemented the project and task management framework for the division, ensuring rigorous tracking, clear communication, and successful execution of all engineering action plans and client commitments.
- Building a Profitable Division: My leadership was directly focused on not just building technically proficient teams, but on cultivating them into commercially successful, profitable business units that became a cornerstone of the company's growth strategy.
As the lead developer on this enterprise AI project, I architected and implemented a sophisticated Banking Document Intelligence system that transforms how financial institutions interact with their vast document repositories. The AIChat application leverages cutting-edge RAG (Retrieval-Augmented Generation) technology to provide intelligent, context-aware responses based on bank-specific document collections, enabling financial professionals to extract insights from complex financial data through natural language conversations.
- Full-Stack Architecture: Designed and implemented a comprehensive full-stack architecture combining a modern Ext JS 7.8 frontend with a Material Design theme for both desktop and mobile, ensuring a responsive user experience.
- Advanced AI Backend: Architected a robust Python Flask backend, integrating Azure OpenAI GPT-4 models through the LangChain framework for sophisticated natural language processing and conversation management with persistent memory.
- Core RAG System: Developed a sophisticated Retrieval-Augmented Generation (RAG) system that dynamically retrieves relevant documents from bank-specific Azure Search indexes and generates contextually accurate, streaming responses.
- Multi-Tenant & Secure Design: Implemented a multi-tenant architecture supporting multiple banking institutions, with dynamic index selection based on bank IDs, secure authentication, and isolated data processing for each institution.
- Document Processing Pipeline: Built a comprehensive document processing pipeline with Azure Blob Storage for secure file management, SQL Server for metadata storage, and automated indexing of financial documents into Azure Search.
- Advanced Prompt Engineering: Created an advanced prompt engineering system with intelligent character limit management, document truncation algorithms, and conversation history integration to optimize LLM performance while maintaining context.
- Enterprise-Grade Security: Developed enterprise-grade security features including CORS configuration, request validation, comprehensive logging, and Azure Identity integration for secure access to cloud resources.
- Real-Time Streaming: Implemented real-time streaming responses using Flask generators, providing immediate user feedback and an enhanced user experience for complex financial queries.
- Intuitive User Interface: Designed an intuitive UI with bank selection grids, file management systems, user authentication, and a comprehensive dashboard displaying document processing status and system analytics.
- High Availability & Reliability: Established robust error handling, logging frameworks, and monitoring systems to ensure high availability and reliability for mission-critical financial operations.
As a strategic leader within the Data & AI division, I conducted a comprehensive analysis of our team structure and resource allocation, culminating in a formal proposal for a strategic hiring and optimization plan. The initiative was driven by a critical misalignment between our stated goal of AI leadership and the on-the-ground reality of our top AI specialists being consumed by operational, non-AI tasks.
- Resource & Skill Gap Analysis: Conducted a thorough audit of the Data & AI teams (including Dataiku, MS Fabric, and Snowflake), identifying a critical skill gap and the underutilization of 7 senior AI specialists who were dedicated to non-strategic project work.
- Strategic Restructuring Proposal: Architected a resource optimization plan to free up our most valuable AI experts. The core of the proposal was to backfill their current roles with 7 new Python engineers, thereby liberating our senior talent to focus on core AI development, mentorship, and advanced R&D.
- Future-State Organizational Design: Designed the target organizational structure for the entire AI division, defining the composition of new, specialized teams including a 'Python AI' team (Data & LLM Engineers), a 'Front-End AI' team (React/JS), and reinforcing the existing Snowflake team.
- Comprehensive Hiring Plan: Developed and formalized a global hiring request for 18 professionals, detailing the specific roles, required skillsets (from Python developers to specialized LLM Engineers and Team Leads), and strategic justification for each position.
- Risk Assessment & Mitigation: Identified and articulated the strategic risks of inaction, including the inability to staff upcoming high-priority projects like 'Farm Lending' and the danger of 'dying of success'—having opportunities without the capacity to execute. The hiring plan was positioned as a direct risk mitigation strategy.
- Compliance & Scalability Framework: Identified the growing burden of compliance on technical leadership and incorporated the hiring of a dedicated Compliance Specialist to de-risk operations, offload managers, and ensure sustainable, orderly growth for the entire division.
As the lead architect for the Insights Generator project, I developed an advanced AI-powered platform that transforms raw, unstructured data into actionable business insights. The solution addresses a critical challenge faced by modern businesses: extracting meaningful patterns and knowledge from diverse data sources without requiring specialized technical expertise. By leveraging the capabilities of OpenAI's GPT models, the platform enables business decision-makers to gain a comprehensive understanding of customer sentiment, product feedback, and team communications.
- Dual-Module Architecture: Designed and implemented a sophisticated two-pronged solution with specialized modules for analyzing product reviews and group chats, enabling organizations to process and derive insights from multiple data formats and sources.
- Scalable Processing Pipeline: Architected a scalable processing pipeline that handles large volumes of text through innovative batching techniques, allowing the system to process document sets that exceed the token limits of individual AI model prompts.
- Advanced Prompt Engineering: Developed advanced prompt engineering strategies including staging and grounding techniques to minimize hallucinations and ensure the accuracy of AI-generated insights, maintaining the integrity of the analysis.
- Semantic Clustering System: Created a semantic clustering system that identifies dominant discussion topics and groups similar content, enabling the extraction of nuanced insights while maintaining references to the original text for verification.
- Comprehensive Insights Framework: Implemented a comprehensive insights extraction framework that generates high-level summaries, identifies key topics, provides statistical analysis, and delivers actionable recommendations based on the processed data.
- Natural Language Interface: Built a natural language query interface that allows non-technical users to interact with the system using everyday language, democratizing access to advanced data analysis capabilities across the organization.
- Cloud Integration: Integrated the solution with Azure cloud services for scalable deployment, enabling organizations to process large datasets efficiently while maintaining security and compliance requirements.
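The batching technique for documents that exceed a single prompt's token limit can be sketched as a greedy grouping under a token budget. This is a simplified illustration of the idea, not the production pipeline; the 4-characters-per-token estimate is a common rough heuristic, not the exact tokenizer:

```python
# Greedy batching sketch: group documents so each batch stays under a
# per-prompt token budget. Heuristic token estimate, not the real tokenizer.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # rough ~4 chars/token heuristic

def batch_documents(docs: list[str], max_tokens: int) -> list[list[str]]:
    batches, current, used = [], [], 0
    for doc in docs:
        cost = estimate_tokens(doc)
        if current and used + cost > max_tokens:
            # Current batch would overflow: close it and start a new one.
            batches.append(current)
            current, used = [], 0
        current.append(doc)
        used += cost
    if current:
        batches.append(current)
    return batches
```

Each batch is then summarized independently and the partial summaries are merged in a final pass — a map-reduce style flow that keeps every individual prompt within the model's context window.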
A comprehensive AI-powered solution designed to revolutionize how banking institutions engage with their customers through personalized product recommendations. This Next Best Offer (NBO) platform leverages advanced machine learning algorithms and predictive analytics to identify the most suitable banking products for each customer, calculate the expected revenue impact, and generate personalized communications to maximize conversion rates.
- Machine Learning Pipeline: Developed a sophisticated machine learning pipeline that analyzes customer data across multiple dimensions—including demographics, account history, transaction patterns, and product holdings—to predict the probability of subscription to various banking products.
- Advanced Customer Segmentation: Implemented a multi-faceted customer segmentation approach combining traditional RFM (Recency, Frequency, Monetary) analysis with demographic and behavioral patterns to create highly targeted customer profiles.
- Predictive Revenue Modeling: Engineered a predictive modeling system that calculates both direct and indirect expected revenue impact for each product recommendation, enabling financial institutions to prioritize marketing efforts based on potential ROI.
- Interactive Campaign Dashboard: Created an interactive marketing campaign dashboard that identifies top prospects for cross-selling specific products, allowing marketing teams to optimize campaign targeting and resource allocation.
- AI-Generated Communications: Integrated Large Language Model (LLM) capabilities to generate personalized, contextually relevant sales communications tailored to each customer's profile, including customized email content with product-specific sales arguments.
- Flexible Data Ingestion: Built a flexible data ingestion system supporting multiple input methods (direct file upload, database connections, or integration with existing customer segmentation systems) to accommodate various banking data infrastructures.
- Intuitive User Interface: Designed an intuitive web-based interface for business users to configure prediction parameters, select target products, and visualize results through interactive dashboards tailored for different stakeholders.
- Extensible "Advisor" Plugin: Incorporated an optional "Advisor" plugin that enhances the platform with additional customer insights and visualization capabilities, providing deeper analytical context for decision-makers.
VoiceRAG is an innovative solution that demonstrates a powerful pattern for implementing voice-based assistants with Retrieval-Augmented Generation (RAG) capabilities. This application showcases how modern AI technologies can be combined to create natural, context-aware voice interactions enhanced with domain-specific knowledge.
- Real-Time Voice Interface: Implemented a real-time voice interface using the GPT-4o Realtime API for Audio, enabling natural and fluid conversations without the traditional turn-based limitations of most voice assistants.
- Sophisticated Backend Architecture: Designed and built a sophisticated Real-Time Middle Tier (RTMT) that bridges the frontend voice interface with Azure OpenAI services, handling bidirectional WebSocket communication and managing the flow of audio data and text responses.
- Retrieval-Augmented Generation (RAG): Integrated RAG capabilities using Azure AI Search, allowing the assistant to retrieve and reference relevant information from a knowledge base when responding to user queries, significantly enhancing the accuracy and usefulness of its answers.
- Custom Financial Tools: Developed custom tools for financial use cases, including the ability to retrieve product balances, list available products, and execute financial transactions like transfers between accounts.
- Responsive Frontend: Created a responsive React frontend with TypeScript that handles audio recording and playback, displays retrieved documents, and provides visual feedback on the conversation state, all styled with TailwindCSS for a modern user experience.
- Secure & Personalized Architecture: Implemented a flexible architecture that supports authentication flows, allowing different users to access personalized information and perform authorized actions based on their identity.
- Robust Error Handling: Engineered the system to handle various edge cases, including error states, conversation initialization, and graceful degradation when services are unavailable, ensuring a robust user experience.
- Extensible & Scalable Design: Designed the application with extensibility in mind, allowing for the easy addition of new tools and capabilities as requirements evolve, demonstrated by the planned features for financial transactions and enhanced authentication.
This project serves as both a functional prototype for voice-based financial services and a reference implementation for the VoiceRAG pattern, showcasing how Azure AI services can be leveraged to create sophisticated, context-aware voice assistants for various domains.
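The extensible tool design mentioned above can be sketched as a simple registry that the middle tier dispatches the model's tool calls against. The tool names and in-memory account store below are illustrative stand-ins, assuming a tool-call payload shaped like `{"name": ..., "arguments": {...}}`:

```python
# Sketch of an extensible tool registry: tools register by name, and the
# real-time middle tier dispatches model tool-calls to them.
# Tool names and the in-memory account store are illustrative only.
TOOLS = {}

def tool(name):
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

accounts = {"checking": 500.0, "savings": 1200.0}

@tool("get_balance")
def get_balance(account: str) -> float:
    return accounts[account]

@tool("transfer")
def transfer(source: str, target: str, amount: float) -> str:
    accounts[source] -= amount
    accounts[target] += amount
    return f"moved {amount:.2f} from {source} to {target}"

def dispatch(call: dict):
    # `call` mimics the shape of a model tool-call payload.
    return TOOLS[call["name"]](**call["arguments"])
```

Adding a new capability is then just another decorated function — the dispatch path never changes, which is what makes the pattern easy to extend as requirements evolve.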
Led a complex, end-to-end project to architect and deliver an advanced solution for Shiftboard's new analytics platform on Microsoft Fabric. The engagement evolved from initial architectural design and a native-Fabric implementation to incorporating a sophisticated, source-side Change Tracking (CT) system, ultimately delivering a robust, hybrid, and fully automated data pipeline that unified data from up to 10 separate client databases.
- Architectural Strategy & Design: Conducted the initial analysis of multiple ETL architectures, comparing a Fabric-native model against a source-based SQL Server Change Tracking (CT) model. Authored the foundational technical proposal and SOW that defined the project's scope, deliverables, and acceptance criteria.
- Scope Change Management & Implementation: Successfully managed and implemented a significant mid-project scope change to integrate SQL Server Change Tracking (CT). This involved designing and deploying custom T-SQL stored procedures and log tables on the source SQL Servers to capture granular change data (Inserts, Updates, Deletes) with high precision.
- Hybrid ETL Pipeline Development: Engineered a highly dynamic, metadata-driven ETL framework in Fabric. The PySpark notebooks were designed to support a hybrid model, seamlessly switching between timestamp-based incremental updates for standard tables and consuming explicit change logs from the mirrored CT tables for high-frequency data.
- Dynamic Metadata Control System: Built the core of the solution around a set of control tables in Fabric. This framework allows for the dynamic configuration of new databases, tables, composite primary keys, and sync types, making the entire pipeline flexible and scalable without requiring code changes.
- Data Modeling & Integrity: Implemented a consolidated star schema in the Lakehouse, transforming and merging data from multiple sources. A robust surrogate key generation system was engineered to resolve primary key collisions, ensuring relational integrity.
- Historical Dimension Handling: Built the transformation logic to accurately manage historical dimension data ('DimGroups'), enabling correct point-in-time analysis by joining transactional facts with the version of the group definitions that was active at the time of the event.
- Successful Delivery & Validation: Delivered the complete, end-to-end POC, including an incrementally refreshed semantic model for Power BI. The solution was successfully validated against all acceptance criteria, confirming data accuracy and the reliability of the 15-minute automated refresh cycle.
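The hybrid, metadata-driven dispatch at the heart of this pipeline can be sketched as follows: each control-table row declares its sync type, and the pipeline builds the matching extraction query. Table and column names here are hypothetical, not Shiftboard's actual schema:

```python
# Sketch of metadata-driven dispatch: a control-table row selects between
# timestamp-based incremental load and consuming the mirrored CT change log.
# Schema names are illustrative placeholders.
def build_extract_query(meta: dict, last_watermark: str) -> str:
    if meta["sync_type"] == "timestamp":
        # Standard incremental load: rows modified since the last run.
        return (f"SELECT * FROM {meta['table']} "
                f"WHERE {meta['modified_col']} > '{last_watermark}'")
    elif meta["sync_type"] == "change_tracking":
        # Consume explicit insert/update/delete rows from the CT log,
        # newer than the last processed change version.
        return (f"SELECT * FROM {meta['ct_log_table']} "
                f"WHERE change_version > {meta['last_version']}")
    raise ValueError(f"unknown sync_type: {meta['sync_type']}")
```

Because the sync type, table names, and key columns all live in control tables, onboarding a new client database is a configuration change rather than a code change — the property that made the pipeline scale to ten source databases.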
Architected a solution to re-engineer Equilend's message queue processing system, addressing a critical scalability bottleneck. The existing architecture inefficiently required every server to listen to every queue, causing resource strain to multiply as the system grew. The new design introduces a dynamic, intelligent, and resilient load-balancing system.
- Designed a dual-monitoring system to provide real-time visibility into the health and status of the entire MQ ecosystem.
- Created a Queue Monitoring System utilizing native IBM MQ capabilities (QSTATUS, MONQ) to track queue depth, listener count (OPENCOUNT), and latency, publishing status events to a central supervision queue.
- Designed a Server Monitoring System to allow each server in the cluster to be aware of the status of its peers, enabling intelligent failover and load distribution.
- Developed a Decision-Making Algorithm that consumes monitoring data and uses a configurable rules engine to dynamically scale listener threads up or down on each server.
- The new architecture ensures high availability by automatically reassigning listeners from a downed server to healthy nodes in the cluster.
- The system can automatically allocate additional listeners to queues that are overloaded or experiencing high latency, ensuring performance targets are met.
- The solution transforms a rigid, unscalable system into a dynamic, resilient, and resource-efficient platform prepared for future growth.
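The scaling decision at the core of this design can be sketched as a small rules evaluation over the monitoring data: given a queue's depth, latency, and listener count, decide how many listener threads to add or remove. The thresholds and field names below are illustrative, not Equilend's actual rules configuration:

```python
# Simplified sketch of the listener-scaling decision. Real inputs would come
# from IBM MQ status data (queue depth, OPENCOUNT, latency); thresholds here
# stand in for the configurable rules engine.
def scaling_decision(metrics: dict, rules: dict) -> int:
    """Return a positive delta to add listeners, negative to remove, 0 to hold."""
    overloaded = (metrics["depth"] > rules["max_depth"]
                  or metrics["latency_ms"] > rules["max_latency_ms"])
    if overloaded:
        # Scale up by one step, never exceeding the per-server cap.
        return min(rules["step"], rules["max_listeners"] - metrics["listeners"])
    if metrics["depth"] == 0 and metrics["listeners"] > rules["min_listeners"]:
        # Drain idle capacity gradually, one listener at a time.
        return -1
    return 0
```

Running this decision per queue, per cycle, with metrics published to the central supervision queue, is what lets listeners migrate toward hot queues — and away from idle ones — without any server needing to listen to everything.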
Architected and developed a new, modern, and lightweight web console for IDERA's flagship SQL Diagnostic Manager product. The primary objective was to create a high-performance, fully responsive monitoring tool for DBAs and developers, capable of scaling to extremely large enterprise environments and providing a seamless experience on both desktop and mobile devices.
- Full-Stack Development: Engineered the application with a Java/Tomcat-based web front end that consumes data from a newly developed .NET REST API layer, ensuring a clean separation between the UI and backend services.
- Responsive UI/UX: Designed and implemented a fully responsive user interface using Bootstrap and a modern JavaScript framework (React/Angular), ensuring an optimal user experience across all devices, from ultra-wide monitors to tablets and mobile phones.
- Dashboard & Visualization: Created a feature-rich server dashboard providing an at-a-glance overview of the entire SQL Server environment. This included a customizable charting module where users can visualize key performance metrics, rearrange charts via drag-and-drop, and expand them for detailed analysis.
- Data Views: Built multiple data views for server monitoring, including a "Card View" for high-level status and a detailed "List View" with advanced data grid features like sorting, filtering, and column management.
- Drill-Down Functionality: Implemented detailed views for server sessions and alerts, allowing DBAs to drill down into specific performance issues, view blocking information, and manage alerts (including snooze functionality) directly from the web interface.
- Theming & Customization: Engineered the application to support both Light and Dark themes, allowing users to toggle between modes for improved visual comfort and accessibility.
- Security & Integration: Integrated the web console with the core SQL Diagnostic Manager security model, enforcing existing instance-level user permissions and supporting both Windows and SQL Server authentication methods.
As the Enterprise Architect on a strategic consulting engagement, I co-authored a comprehensive pre-read document for a key workshop with Equilend. The objective was to define a transformative vision for a new "Data Marketplace" designed to unlock the immense value of their vast securities lending data assets and establish new, high-margin revenue streams.
- Conducted an in-depth analysis of Equilend's current data landscape, identifying key challenges such as delayed (once-a-day) data updates, limited accessibility, and significant untapped monetization opportunities across their NGT and DataLend platforms.
- Architected the strategic vision for the Equilend Data Marketplace, a platform designed to provide clients with real-time insights, predictive intelligence, and direct access to valuable datasets.
- Defined multiple data monetization strategies for the platform, including tiered subscription models, direct data licensing agreements, and premium value-added services like advanced analytics and portfolio optimization tools.
- Outlined the technical roadmap for Phase 1, focusing on transforming raw transactional data into ML-ready, time-series datasets and designing high-value features to serve as the foundation for machine learning.
- Crafted the long-term vision for a collaborative "AI Marketplace," proposing advanced capabilities such as a repository for community-driven predictive models (e.g., anomaly and fraud detection), a secure sandbox for ML model testing, and the integration of Large Language Models (LLMs) with RAG for sophisticated business analytics.
- Authored the detailed section on predictive model explainability, establishing a framework for incorporating features like LIME/SHAP, feature importance, and ROC/AUC analysis to ensure transparency, build user trust, and manage risk.
Conducted a specialized research project to analyze the compatibility of the PreEmptive DashO obfuscator with the new features and bytecode changes introduced in Java 18. The primary objective was to identify necessary modifications to the DashO engine and provide a clear path forward for the official Java 18 support release.
- Developed a comprehensive sample Java application to test and validate DashO's protection capabilities against new Java 18 features.
- The test application implemented key Java Enhancement Proposals (JEPs), including JEP 400 (UTF-8 by Default), JEP 413 (Code Snippets in Java API Documentation), JEP 416 (Reimplement Core Reflection with Method Handles), and JEP 420 (Pattern Matching for switch).
- Performed a thorough gap analysis by running the existing DashO obfuscator against the Java 18 application, meticulously documenting errors and identifying language constructs that were not correctly processed or protected.
- Conducted a deep-dive analysis into the DashO source code to pinpoint the exact locations requiring modification to handle the new language and bytecode structures.
- Authored and delivered a detailed technical report outlining all findings, providing specific, actionable recommendations and proposing solutions for the DashO engineering team to implement full Java 18 support.
- Ensured the sample application was built using modern build tools (Ant/Gradle) and adhered to the project's established coding and style conventions.
Enhanced the Perspectium DataSync platform by implementing key user-facing features and bug fixes to improve usability and provide greater insight into data replication processes. Key responsibilities and achievements included:
- Improved the user onboarding experience by dynamically hiding unnecessary installation steps in the ServiceNow Core UI, reducing user confusion.
- Enhanced application usability by redesigning the properties page, organizing settings into logical subcategories for easier navigation and management.
- Strengthened data integrity by ensuring the DataSync Agent generates fully compliant XML records, guaranteeing successful replication into ServiceNow instances.
- Provided critical system visibility by adding a storage utilization indicator to the Snapshot application, allowing users to monitor consumed and available space.
- Delivered greater operational insight by adding a "Last Restored" timestamp to each snapshot, enabling users to easily track data recovery activities.
- Automated user workflows by implementing an optional email notification system to alert users upon the completion of lengthy data restore processes.
The Azure Cost Control System (ACCS) is a comprehensive cloud cost management platform designed to help organizations monitor, manage, and optimize their Azure cloud spending. This solution provides real-time visibility into resource consumption, automated budget monitoring, and proactive notifications when costs exceed predefined thresholds, enabling organizations to maintain financial control over their cloud infrastructure.
- Full-Stack Development: Developed a full-stack application with a Python Flask backend that integrates directly with Azure Cost Management APIs to retrieve accurate cost and forecast data for all Azure resources across multiple subscriptions.
- Secure Credential Management: Implemented secure credential management using Azure Key Vault, ensuring that sensitive information such as database credentials and API keys are never hardcoded in the application, adhering to security best practices.
- Intuitive Frontend Dashboard: Created a responsive React frontend with an intuitive dashboard that visualizes budget status, current costs, and forecasted spending through interactive bar charts, enabling stakeholders to quickly identify cost trends and potential budget overruns.
- Automated Notification System: Designed a robust notification system that automatically alerts designated individuals when resources or budgets exceed defined thresholds, with detailed information about the affected resources and recommended actions.
- Comprehensive Auditing & Logging: Built a comprehensive logging system that tracks all system activities and budget status changes, providing a complete audit trail for compliance and troubleshooting purposes.
- Data Modeling: Engineered a PostgreSQL database schema to efficiently store and retrieve historical cost data, budget configurations, and notification records, enabling trend analysis and performance reporting.
- RESTful API Development: Developed a RESTful API that exposes endpoints for retrieving budget data, system logs, and notification history, facilitating integration with other enterprise systems and custom reporting tools.
- API Rate-Limiting Handling: Implemented intelligent rate-limiting handling for Azure API requests, ensuring the application remains responsive and reliable even during periods of high activity or when approaching API quotas.
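The rate-limiting handling described above boils down to retrying throttled calls, honouring the server's `Retry-After` hint when present and falling back to exponential backoff otherwise. A hedged sketch of that idea, where `fetch` stands in for the actual Azure Cost Management API call:

```python
# Sketch of rate-limit handling: retry HTTP 429 responses, preferring the
# server's Retry-After header over exponential backoff. `fetch` is a
# placeholder for the real Azure API call returning (status, headers, body).
import time

def call_with_backoff(fetch, max_retries: int = 5, base_delay: float = 1.0,
                      sleep=time.sleep):
    for attempt in range(max_retries):
        status, headers, body = fetch()
        if status != 429:
            return body
        # Honour Retry-After when the service provides it; otherwise
        # back off exponentially (1s, 2s, 4s, ...).
        delay = float(headers.get("Retry-After", base_delay * (2 ** attempt)))
        sleep(delay)
    raise RuntimeError("rate limit: retries exhausted")
```

Injecting `sleep` as a parameter keeps the retry logic trivially testable, and the same wrapper can front every cost-query endpoint so quota pressure never surfaces as user-visible errors.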
Conducted a comprehensive investigation and documentation of the current state of DevOps across three distinct BitTitan products: MigrationWiz, Perspectium, and Voleer. The primary objective was to provide a complete picture of existing processes to define the future DevOps strategy for the company. This consultancy project involved interviewing key staff and delivering a final detailed report.
- Created detailed architecture diagrams for production, non-production, and development environments for each product line.
- Documented the entire software promotion lifecycle, from development to production, including rollback strategies, pre-production checklists, and customer notification processes.
- Mapped the complete development workflow, detailing source code branching strategies and access control to key assets.
- Performed an audit of all tools and products used in the DevOps workflow, identifying and highlighting any components that were outdated, had known issues, or used insecure communication channels.
- Documented all operational runbooks for normal tasks and procedures for handling typical failures, including roles and responsibilities.
- Analyzed and documented the production monitoring processes, including a list of monitored assets, the tools used, and the support escalation paths.
- Defined and documented the processes for onboarding new customers and team members, as well as decommissioning procedures.
Provided expert consultancy as a Solutions Architect to assist in developing a strategic roadmap for upgrading Hitachi Energy's existing ExtJS applications. The goal of this engagement was to increase stability, maintainability, performance, and the overall long-term quality of the code.
- Conducted an in-depth code review and discovery process of the client's applications, inspecting server interactions and discussing key functionality with their engineering teams.
- Analyzed and documented findings on application architecture, design approaches, API calls, custom components, theming, and performance.
- Provided actionable recommendations and best practices to improve code quality and performance, addressing issues like DOM complexity, data fetching, and module splitting.
- Delivered code samples and proofs-of-concept to demonstrate proposed fixes, especially for custom charting and data visualizations.
- Authored a final, detailed report that provided a strategic roadmap for upgrading the applications to the latest version of ExtJS.
- Created and presented a detailed estimate for the full application upgrade, based on the findings of the architecture review.
Developed a new Graphical User Interface (GUI) for the Kiuwan Local Analyzer (KLA) to empower security configurators. This tool replaces the manual, error-prone process of editing complex XML metadata files. The new UI allows users to easily and safely define custom security rules by specifying data-flow sources, sinks, and neutralizers for Kiuwan's static analysis engine. Key achievements included:
- Designed and built a comprehensive UI to manage security metadata, integrated directly into the Kiuwan Local Analyzer's advanced configuration panel.
- Developed a file management and search interface allowing users to find, filter, and view both default (read-only) and custom metadata files across different scopes (system-wide, application-specific, etc.).
- Created a powerful, non-modal editor that allows multiple files to be opened and edited simultaneously, featuring a tree-view for structure and an intuitive interface for modifying attributes.
- Implemented a robust validation system, driven by an XSD schema, to guide users with tooltips, examples, and checks for mandatory fields, ensuring the generated XML is always correct.
- Engineered a conflict detection system that visually alerts the user when a custom rule overrides a default or another custom definition, helping to prevent unintended behavior in security analyses.
- Ensured the entire UI was developed with Java and Swing, adhering to strict coding standards and integrating with Kiuwan's existing build system based on Ant and Maven.
Led a major, multi-release engineering effort to modernize the core technology stack of Aqua Data Studio and significantly expand its enterprise connectivity features. This involved deep architectural changes, extensive cross-platform support, and the implementation of highly requested user features.
- Executed the platform-wide migration from OpenJDK 8 to 11, addressing complex dependencies across the JVM, IntelliJ editor framework, custom components, and dozens of JDBC drivers.
- Engineered and implemented comprehensive IPv6 support across the entire application, modifying connection logic for over 30 database types, SSH tunneling, AquaScript (SCP, FTP, HTTP), Version Control (Git, SVN), Fluidshell, and proxy functionalities.
- Developed and launched a full-application Dark Theme by re-architecting Java Swing components, providing a modern user experience across all dialogs, editors, and tools including the ER Modeler and Query Builder.
- Updated the graphics rendering engine by migrating to Mesa v21.2, which involved building C/C++ code for various architectures to support newer graphics cards in the Visual Analytics tool.
- Expanded database and OS support by adding full compatibility for Oracle 21c, PostgreSQL 14, Windows 11, and macOS Monterey.
- Managed the update of more than 10 core JDBC drivers to their latest versions, including for Oracle, PostgreSQL, DB2, Snowflake, and Microsoft SQL Server.
- Resolved a backlog of critical, customer-reported defects, focusing on improving DDL extraction, schema comparison, and index handling for SingleStore and DB2 databases.
Engaged in a multi-year, multi-release effort to enhance and modernize the Sencha Ext JS framework and its comprehensive suite of development tools. Responsibilities included developing new features, resolving customer-reported defects, and ensuring compatibility with the latest platforms and IDEs across Ext JS versions 7.5, 7.6, 7.7, and the 7.8 tooling update.
- New Feature Development (Ext JS 7.6): Architected and implemented key framework features, including strict Content-Security-Policy (CSP) support, a modern grid list filter, stateful column management, and drag-and-drop capabilities from grids into the HTML Editor.
- Tooling Modernization (Sencha Tooling 7.8): Spearheaded the modernization of Sencha's tooling, upgrading Sencha Cmd to support Java LTS versions up to Java 21 and enhancing the Closure compiler with greater ECMA support.
- Sencha Architect as a VS Code Extension (POC): Led a Proof of Concept (POC) project to re-imagine Sencha Architect as a lightweight, feature-rich VS Code extension. Designed a low-code editor with real-time, two-way code synchronization, drag-and-drop functionality, and a seamless in-editor experience to reduce the learning curve and keep developers within a single environment.
- IDE Plugin and Tooling Compatibility: Consistently maintained and updated all IDE plugins (VS Code, JetBrains, Eclipse) and standalone tools (Architect, Themer) to ensure full compatibility with new Ext JS SDK releases, the latest host IDE versions, and modern hardware like Apple M1/M2 processors on macOS Monterey and Ventura.
- Component Enhancements & Bug Fixes (Ext JS 7.5 & 7.7): Delivered numerous quality-of-life improvements and bug fixes across the Ext JS component library, focusing on data-intensive components like Cartesian charts, grids (filterbar, buffered rendering, scrolling, accessibility), and tree panels.
- Mobile and Cross-Browser Support: Improved mobile support by resolving touch event issues in the Calendar component and addressing UI behavior on iOS and Android devices, while ensuring stability across all supported desktop browsers.
- Dependency and Ecosystem Updates: Managed updates of critical third-party dependencies within the framework, including upgrading the integrated Froala Editor and Font Awesome packages to their latest versions.
From 1999 to 2021
Jaune Sistemas
(22 years)
Founder, CTO & Chief Architect
& Software developer
Period from 2009 to 2021
- Development of various extensions and modifications to their BI system and data warehouse to support various internal business transformation projects.
- Implementation of SQL Server source code control in Git.
- Salesforce picklist data synchronization from Java using the Salesforce APIs.
- Process and performance optimization.
- BI system documentation.
Developed an AWS DMS management and supervision tool that automates all of these tasks end to end, integrating with AWS DMS to automate:
- Selection of the data tables to migrate and their requirements.
- AWS DMS task automation:
- Task creation.
- Starting and stopping tasks.
- Initial full data transfer between SQL Server databases in the fastest way available, a significant improvement.
- Synchronization validation.
- Change-data tracking.
- Full-speed SQL Server to SQL Server data transfer using controlled parallel tasks.
- Ability to use multiple disks to speed up data cache storage.
- Throughput limited only by network and disk speeds; everything else is fully optimized.
- When all tables in an AWS DMS task complete their initial full transfer, the task starts automatically with the correct synchronization start time.
- Fully automatic, with minimal human interaction and no chance of error.
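The task creation and start steps above could be sketched with boto3 as follows. This is an illustrative sketch, not the tool itself: the function names and the table list are assumptions, and only the boto3 DMS calls (`create_replication_task`, `start_replication_task`) come from the AWS API:

```python
import json

def build_table_mappings(tables):
    """Build the DMS table-mapping JSON for the (schema, table) pairs to migrate."""
    rules = [
        {
            "rule-type": "selection",
            "rule-id": str(i + 1),
            "rule-name": f"include-{schema}-{table}",
            "object-locator": {"schema-name": schema, "table-name": table},
            "rule-action": "include",
        }
        for i, (schema, table) in enumerate(tables)
    ]
    return json.dumps({"rules": rules})

def create_and_start_task(dms, task_id, source_arn, target_arn, instance_arn, tables):
    """Create a full-load-and-CDC replication task, then start it.

    `dms` is a boto3 DMS client (a stub works for testing); the ARN returned
    by create_replication_task is what start_replication_task expects.
    """
    resp = dms.create_replication_task(
        ReplicationTaskIdentifier=task_id,
        SourceEndpointArn=source_arn,
        TargetEndpointArn=target_arn,
        ReplicationInstanceArn=instance_arn,
        MigrationType="full-load-and-cdc",
        TableMappings=build_table_mappings(tables),
    )
    task_arn = resp["ReplicationTask"]["ReplicationTaskArn"]
    dms.start_replication_task(
        ReplicationTaskArn=task_arn,
        StartReplicationTaskType="start-replication",
    )
    return task_arn
```

In practice a DMS task must reach the `ready` state before it can be started, so a real orchestrator also polls task status between the two calls.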
- 2 TB SQL Server Enterprise database, with table partitioning, etc.
- More than 3,000 remote PLCs under near-real-time telecontrol.
- More than 126,000 light sources under telecontrol.
- More than 400,000,000 events and measurements each year.
- Near-real-time supervision and telecontrol.
- Data warehouse plus BI processing, reporting, and analytics.
- ETL process supervision.
- Timing analysis.
- Error control and reporting.
- Database isolation when problems exist and the data is inconsistent.
- Control status home page linked with Pentaho.
The objective of the project, carried out by me, was the safe migration, consolidation, and performance improvement, performed live in production with zero downtime, from SQL Server 2000 to the latest SQL Server version, ensuring empirically, with an engineering perspective, tested and documented, the following points:
- 100% verification of the migrated T-SQL code, with enough unit tests to achieve 100% code coverage, plus a record of the tests performed and verification of their results.
- 100% verification of the migrated views and data.
- Performance increase under the same stress and load existing in production.
- Hot migration performed on live production systems.
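The data-verification point can be illustrated with an order-independent checksum comparison between source and target tables. A minimal sketch, assuming rows have already been fetched from both servers (the real project used T-SQL unit tests; this fragment only shows the comparison idea):

```python
import hashlib

def table_checksum(rows):
    """Order-independent checksum: sum of per-row SHA-256 digests.

    Summing (rather than XOR-ing) keeps duplicate rows from cancelling out.
    """
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode("utf-8")).digest()
        acc = (acc + int.from_bytes(digest, "big")) % (1 << 256)
    return acc

def tables_match(source_rows, target_rows):
    """True when both tables hold exactly the same rows, in any order."""
    return (len(source_rows) == len(target_rows)
            and table_checksum(source_rows) == table_checksum(target_rows))
```

For 2 TB databases this comparison would of course be pushed down to the servers (e.g. per-partition checksums) rather than fetching every row, but the verification logic is the same.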
Project results:
- 100% compliance with the project requirements.
- Zero incidents during the hot migration of the production systems.
- Project completed on budget and on schedule, with no incidents and congratulations from the client.
I have done similar projects for other companies, such as TelePizza SA and its production databases.
At the time, this was the heaviest web load in Spain, with several million sales per hour shortly before each football match.
A multilayer infrastructure of load balancers, front-end servers, and 12 synchronized SQL Server database servers, with much of the business layer implemented in hundreds of T-SQL stored procedures.
The objective and approach mirrored the migration described above: safe migration, consolidation, and performance improvement from SQL Server 2000 to the latest SQL Server version, performed live in production with zero downtime, with 100% verification of the migrated T-SQL code, views, and data (full unit-test coverage with documented results) and performance gains under the existing production load.
Project results:
- 100% compliance with the project requirements.
- Zero incidents during the hot migration of the production systems.
- Server consolidation: from an initial 12 servers down to only 2.
- Project completed on budget and on schedule, with no incidents and congratulations from the client.
I have done similar projects for other companies, such as the pharmaceutical company ALK-Abelló and its production databases.
Platform for the management and sale of Iberdrola shares to its more than 100,000 employees, with sales concentrated in two short annual periods in which 90% of sales occurred within the first two hours.
- Management and issuance of sales records and digitally signed certificates with legal validity.
- Flexible compensation management platform for managers, with simulation of state tax calculations based on their compensation models.
- Order management and monitoring of fulfillment of the requested orders.
From 1999 to 2021
Jaune Sistemas
(22 years)
Founder, CTO & Chief Architect
& Software developer
Period from 1999 to 2009
- Excmo. Ayuntamiento de Barcelona.
- Excmo. Ayuntamiento de Madrid.
- Grupo ACS-Dragados.
- Georg Fischer.
- Charmilles Technologies.
- Esymar Laboratorios.
- Española de Desarrollo Financiero.
- ASI Consultores.
- Bussiness Transformation Consulting Group.
- Redes, Tecnologías y Sistemas.
- AB Group.
- Grupo Afinsa
- Auctentia Subastas.
- Grupo Antíquitas 2003.
- Paisajes Españoles.
- Altra Lda.
- Española de Desarrollo Financiero.
- Charmilles Technologies.
- Editorial América Ibérica.
- Finarte Casa D’Aste.
- Polmer
- GMAI – Auctentia Central de Compras.
- Geofísica Consultores.
- Ferrovial
- Universidad Complutense.
- ADIF
- Albalá Ingenieros
- Datatronics Mobility
- Sociedad Ibérica de Construcciones Eléctricas, SICE
- Puerto de Valencia
- Puerto de Sagunto
- MercaMadrid
- User software, plus definition, implementation, and management of the database, for the real-time management and surveillance system based on automated license-plate reading at all access points of the La Moraleja residential estate (an average of thirty million records every six months).
- Several systems for managing administrative case files and issuing licenses.
- Definition of a database with automatic totals and averages of twenty magnitudes sampled continuously every ten minutes from forty remote measuring stations, plus a graphical and statistical analysis system.
- Turnkey project implementing a database with several million records, automatic generation of accumulated tables, and a chart definition and analysis system.
- Database definition and data migration of a toll and access-control system from Linux systems to Microsoft platforms for Merca-Madrid (managing tens of millions of records).
- Data analysis for an access-control system, currently in operation with several million records in the database.
- Weather data reports extracted from high-capacity SQL Server databases.
- Control of variable-message signs.
- Link between the intranet and a PalmOS- and SMS-based system.
- Administrative data management system.
- Intranet-based graphical analysis of data collected every ten minutes from more than one hundred devices (SQL Server on high-capacity storage).
- Data maintenance and analysis system for roadworks control.
- Access management and analysis of access-control data for staff and vehicles.
- IT systems management.
- Consultancy on the corporate antivirus solution, including its acquisition and installation.
- Setup of Madrid-Barcelona communications.
- Installation of remote-control management for access from Barcelona.
- Web B2B system linked to the corporate ERP, using data from EDI transmissions and proprietary databases to calculate prices, discounts, and delivery times across article databases with more than 30,000 references.
- System for analyzing data extracted from the corporate CRM for activity and quotation analysis.
- Installation of a Linux server with a PDF printer, and development of an application to compose PDFs from parts of the corporate product documentation.
- Reconfiguration of a Linux server for simultaneous access from several different physical networks.
- Management and installation of external remote access.
- EDI data capture system to build the logistics intranet: delivery-note management, warehouse transactions, etc.
- Consultancy and supervision for the network restructuring.
- Inclusion of Direct Order information from Switzerland in the B2B system.
- Adaptation of the EDI data capture system to the corporate ERP's new numeric format.
- Adaptation of the corporate CRM to Georg Fischer's needs.
- Data import into the corporate CRM, and automation of the ERP-to-CRM data import.
- Consultancy and resolution of a mass fax-sending problem.
- Process consultancy, definition of transmission protocols, and implementation and launch of a link between the subcontracted logistics company's tracking system and the corporate logistics intranet for stock and order tracking.
- Various incident and activity reports for analyzing corporate CRM data.
- Consultancy to review the state of the antivirus solution.
- Automated sending of payment reminders to customers from the corporate cash-flow system, delivered directly via Zetafax.
- Study on optimizing the use of the corporate CRM.
- Supervision of the work of different IT vendors for three consecutive years, ongoing.
- Adaptations of the B2B system for the GF Common Network corporate network.
- Creation of the corporate cash-flow system.
- Link between Stock Optimizer and SCALA on SQL Server 2000.
- Procurement reports and their integration with the cash-flow system.
- Automatic sending of order statuses by fax.
- Feasibility study on the use of Palm or Cassiopeia devices by the sales force.
- Treasury management system.
- Consultancy, definition, and implementation of the Georg Fischer Task Manager testing system.
- Warehouse control system.
- Reporting system for the BUSAR corporation.
- System for automatically publishing ERP information in the Web B2B system.
- Creation of the ISO 9000 quality procedures for IT systems, and preparation for the audits.
- Implementation of a standby server as a rapid-response measure against possible disasters.
- ERP invoicing statistics reports.
- Consultancy, implementation, and review of the IT security systems.
- Automatic sending of sales reports to mobile phones via SMS.
- Annual reviews of the IT procedures for the quality system.
- Licenses, consultancy, and deployment of the Unified Information Treatment System (JAUNECRM) to integrate all the information available in the company: processes, workflows, documents, etc. Deployed in Madrid, Barcelona, and for remote users.
- Modifications to the orphaned legacy system (a system left without its original developers) for remote data updates.
- Modifications, connectivity work, and installation of the orphaned legacy system at Altra Portugal's offices and manufacturing plant.
- Consultancy on improvements and modifications to the orphaned legacy system.
- Cost calculations and their integration with the orphaned legacy system.
- Production process control with barcodes.
- Tracking of the Transitions promotion.
- Euro changeover.
- Modifications to the orphaned legacy system to adapt it to the sale of manufactured lenses.
- Automation of stock orders to the supplier.
- Mail server on Linux.
- Logistics and service channel for preferred customers.
- Improvements to the sales-order entry system to reduce handling time and thus increase user productivity.
- Adaptation of the orphaned legacy system for optimal use and control of shipping bags.
- Automation of order shipments.
- Installation of FreeTDS for access to SQL Server data from Linux.
- Expansion of the IT systems in remote offices.
- Modifications to the orphaned legacy system to apply manufacturing discounts to stock lenses.
- Modifications to the orphaned legacy system for receiving and generating delivery notes for custom-manufactured lenses.
- Modifications to the orphaned legacy system to export invoices to their ERP.
- Modifications to the orphaned legacy system for temporary-table management and delivery-note editing.
- Communications optimization.
- Modifications to the orphaned legacy system to apply discounts.
- Modifications to the orphaned legacy system for the management of opticians.
- Statistical sales analysis.
- Modifications to the orphaned legacy system to assign discounts by family and customer.
- Stamp stock cataloguing system using digital images linked to the management system (1,000,000 stamps).
- Analysis, design, implementation, and launch of the auction portal http://www.afinsa-auctions.com.
- Study for the maintenance of Auctentia's auction website.
- Integration of the Maximiles system into www.QuintaEsencia.com.
- Tracking of the origin of the website's buyers.
- Management of the server backup systems.
- Disaster recovery after infection by the Code Red virus.
- Creation of background processes for processing and cleaning up images captured in a digital studio.
- Image layout and document creation system on Mac for catalogue production.
- Server disaster recovery.
- Analysis, design, implementation, integration, and launch of the 'Compramos' (We Buy) and 'Subastas en Sala' (Room Auctions) sections on http://www.afinsa-auctions.com.
- Analysis, design, implementation, integration, and launch of the 'Mancolista' (want list) section on http://www.afinsa-auctions.com.
- Modifications to the image management systems for printing and using barcodes.
- Unified Information Treatment System (JAUNECRM): sales management and CRM.
- Configuration and installation of the corporate firewall on Linux.
- Security and encryption system (SSH2 and public/private key systems).
- Integration of the PayBox payment system into http://www.quintaesencia.com (now defunct).
- Configuration, installation, and maintenance of the corporate firewall on Linux.
- Various technical assistance engagements.
- Maintenance of www.antiqvaria.com since September 2002.
- IT systems management.
- IT systems management and vendor selection in Barcelona.
- Partial sale of the portal http://www.cotosdepesca.com, created and developed by Jaune Sistemas, to form a company.
- Consultancy on the status, improvements, and action plan for the esotericism website www.akasico.com.
- Consultancy on the status, improvements, and action plan for the nature and environment website www.natuweb.com.
- Maintenance of http://www.cotosdepesca.com.
- Final sale of http://www.cotosdepesca.com.
- IT systems management.
- Finarte: euro changeover for the website www.finarte.es.
- Gelt.
- Technical assistance engagements.
- Catalogue layout automation system.
- Automatic image processing system.
- Promotional website for a symposium.
- Road maintenance data visualization system.
- Study of the source code of the Geofactory systems, including cartographic transformation, modification, and Flash publishing, without any help from the original developers, to support the creation of the new production department (plus Flash ActionScript).
- Technical documentation of the platform.
- Various projects for internal use and for third-party clients.
- Implementation of the management databases.
- Maintenance of the corporate website.
- Design of the Quintaesencia franchise website.
- Integration of the barcode system into the website.
- Implementation and control of the Maximiles loyalty system.
- Installation and adaptation of JauneCRM.
- Corporate sales website.
- IT systems management.
- Installation and adaptation of JauneCRM.
- Corporate sales website.
- IT systems management.
- JauneCRM: licenses and customization as the group's corporate work environment.
- JauneCRM: licenses and customization for analyzing sales activity for La Ley products.
Portal for the sale and auction of stamps and philatelic material.
From 1998 to 1999
Poliedro Ingenieros
From 1994 to 1998
INTECSA Internacional
DRAGADOS Group (ACS)
Project Manager, Architect
& Software developer
Instituto Municipal de Informática del Ayuntamiento de Barcelona (IMI), FCC, S.A., and CESPA, S.A.
- Street Cleaning Control System. A solution integrating remote terminals, etc., to control both the subcontractors and the cleaning processes.
- Special Services Information System.
- Urban Solid Waste Collection Control System. Control of collection services and assignment of weights recorded at the weighbridge.
- Cleaning Services Query System for the Instituto Municipal de Informática del Ayuntamiento de Barcelona (IMI). A service query system deployed on all citizen-service terminals of the Barcelona City Council.
- Digitization of 1,500 cleaning routes for FCC, S.A. and CESPA, S.A., and data entry for GERSURB.
- Digitization of 900 collection routes for FCC, S.A.
- Bid manager, most notably for:
- GERSURB link to the CESPA-GEOCOM remote GPS incident system, for CESPA and the Barcelona City Council.
- GERSURB link to CESPA-GEOCOM GPS, for CESPA and the Barcelona City Council.
- Gardening database link to MicroStation, for CESPA and the Barcelona City Council.
- Expansion of the Urban Services Management System.
- Automated Weight Control.
- Urban Furniture Control.
- Publication of Urban Services Management System information on the Web.
- Portable Integrated Management of DIPE Exchanges for TELEFÓNICA SISTEMAS.
- International ideas competition for improving the new cleaning and collection contracts of the City of Barcelona.
- Planning, remodeling, and expansion of the organization of the analysis and development groups.
- Head of the analysis and development groups for the JESSICA Total Quality management program.
- Advisory support to the Project Manager of the TINGLADO port GIS development for PUERTOS DEL ESTADO.
- Advisory and support work for the internal production units.
- Advisory support to the Project Manager of the DUNA Coastal Management System development.
- Direction of the creation of utilities and internal-use systems, such as INTECSA's plotting modules, etc.
- Design and implementation of INTECSA's website: http://www.intecsa.es.
From 1993 to 1994
INTECSA Internacional
DRAGADOS Group (ACS)
Software Analyst & Software Developer
From 1992 to 1993
Software Ibérica 92
Analyst & Software Developer
From 1991 to 1992
Productos Tecnológicos Protecno, SA
Software Developer
- Multi-warehouse management with several break levels and user-defined multi-selection report creation, in a geographically divided environment, with automatic margin adjustments, priority selection, and group creation.
- Reporting system for tracking sales against a budget plan, with the corresponding analysis tables.
- Sales representative tracking system.
- Budgetary and economic reporting system with break levels down to the individual salesperson and multi-selection report creation.