Key Responsibilities 
- Design and Develop Full-Stack Features:
  - Build and enhance core modules across the frontend (React/TypeScript) and backend (Node.js), aligned with the product roadmap.
- Integrate LLM & RAG Workflows:
  - Implement prompt pipelines, embedding models, and hybrid retrieval (Qdrant / Oracle 23AI vector search) for contextual AI responses (a retrieval sketch follows this list).
- Backend Services & APIs:
  - Develop scalable REST/GraphQL APIs, implement proxy layers for database and LLM access, and ensure robust request orchestration (a proxy sketch also follows this list).
- Database & Data Services:
  - Work with MongoDB, Oracle, and Qdrant using various database integration patterns.
- Frontend Engineering:
  - Develop modular, high-performance UI using React + Vite + Tailwind CSS, integrated with role-based access control.
  - Implement dynamic dashboards, charts, mind maps, and prompt management UIs.
- AI Agent Integration:
  - Collaborate on MCP (Model Context Protocol) plugin integration for connecting AI agents to enterprise systems such as CRM, Billing, and Product Catalog.
- DevOps & Deployment:
  - Support containerization (Docker/Kubernetes), CI/CD (Azure DevOps), environment management, and on-prem/cloud deployments.
- Performance Optimization & Security:
  - Optimize database queries, LLM calls, and real-time communication layers while ensuring secure handling of enterprise data.
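For illustration, a minimal hybrid-retrieval sketch in TypeScript. The collection name, payload fields, and rerank heuristic are placeholders rather than project specifics, and the embedding model is supplied by the caller:

```typescript
// Hybrid retrieval sketch: dense vector search in Qdrant, then a naive
// keyword-overlap rerank over the returned payloads. A cross-encoder or an
// Oracle 23AI hybrid query could replace the rerank step.
import { QdrantClient } from "@qdrant/js-client-rest";

const qdrant = new QdrantClient({ url: process.env.QDRANT_URL ?? "http://localhost:6333" });

export async function retrieveContext(
  query: string,
  // Embedding model is supplied by the caller (OpenAI, Azure OpenAI, etc.).
  embedQuery: (text: string) => Promise<number[]>,
  topK = 5,
) {
  const vector = await embedQuery(query);

  // Dense retrieval: over-fetch so the rerank has something to reorder.
  const hits = await qdrant.search("knowledge_base", {
    vector,
    limit: topK * 4,
    with_payload: true,
  });

  // Lightweight lexical rerank: boost hits whose stored text shares query terms.
  const terms = query.toLowerCase().split(/\s+/);
  const reranked = hits
    .map((hit) => {
      const text = String(hit.payload?.text ?? "").toLowerCase();
      const overlap = terms.filter((t) => t && text.includes(t)).length;
      return { ...hit, rerankScore: hit.score + 0.1 * overlap };
    })
    .sort((a, b) => b.rerankScore - a.rerankScore);

  return reranked.slice(0, topK);
}
```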
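And a minimal proxy-layer sketch, assuming an Express backend, JWT bearer tokens, and OpenAI as the provider; the route path, secret handling, and model name are illustrative only:

```typescript
// LLM proxy sketch: an Express route that verifies a JWT, then forwards the
// request to the provider server-side so API keys never reach the browser.
import express from "express";
import jwt from "jsonwebtoken";
import OpenAI from "openai";

const app = express();
app.use(express.json());

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Reject requests without a valid bearer token (secret handling is illustrative).
function requireAuth(req: express.Request, res: express.Response, next: express.NextFunction) {
  const token = req.headers.authorization?.replace("Bearer ", "");
  try {
    jwt.verify(token ?? "", process.env.JWT_SECRET ?? "");
    next();
  } catch {
    res.status(401).json({ error: "Unauthorized" });
  }
}

// Proxy endpoint: the frontend calls /api/llm/chat instead of the provider directly.
app.post("/api/llm/chat", requireAuth, async (req, res) => {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",        // placeholder model name
    messages: req.body.messages, // [{ role, content }, ...]
  });
  res.json(completion.choices[0].message);
});

app.listen(3000);
```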
  
Required Technical Skills 
Frontend Stack: 
- React 18+, Vite, TypeScript, Tailwind CSS 
- State management: Redux / Zustand (see the store sketch after this list)
- Charting and visualization: Recharts, D3, Mermaid
- Responsive and dark-theme UI implementation
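For illustration, a small Zustand store of the kind a dashboard might use; the state shape and names are hypothetical:

```typescript
// Illustrative Zustand store for UI state (the real state shape is project-specific).
import { create } from "zustand";

interface DashboardState {
  darkMode: boolean;
  selectedPromptId: string | null;
  toggleDarkMode: () => void;
  selectPrompt: (id: string | null) => void;
}

export const useDashboardStore = create<DashboardState>((set) => ({
  darkMode: true,
  selectedPromptId: null,
  toggleDarkMode: () => set((s) => ({ darkMode: !s.darkMode })),
  selectPrompt: (id) => set({ selectedPromptId: id }),
}));

// In a component: const darkMode = useDashboardStore((s) => s.darkMode);
```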
  
Backend Stack: 
- Node.js 18+, Express / Fastify
- REST & GraphQL API design
- Proxy architecture and middleware design
- Authentication (JWT/OAuth2, Azure AD)
- File handling, PDF/Doc generation, and AI pipeline integration
  
Databases: 
- MongoDB / Mongoose (see the schema sketch after this list)
- Qdrant / Oracle 23AI (Vector Search, Hybrid + Rerank retrieval)
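As an example of the MongoDB/Mongoose side, a minimal model sketch; the collection and fields are placeholders, not the real schema:

```typescript
// Illustrative Mongoose model for a prompt-management collection.
import mongoose from "mongoose";

const promptTemplateSchema = new mongoose.Schema(
  {
    name: { type: String, required: true },
    template: { type: String, required: true },
    tags: [String],
  },
  { timestamps: true },
);

export const PromptTemplate = mongoose.model("PromptTemplate", promptTemplateSchema);

// Usage: await PromptTemplate.find({ tags: "billing" }).lean();
```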
  
AI & Integration Layer: 
- OpenAI / Azure OpenAI APIs 
- Gemini and LLaMA model adapters
- Embedding and RAG architecture (an indexing sketch follows this list)
- Model Context Protocol (MCP)
- Knowledge indexing and retrieval optimization
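For illustration, a sketch of the indexing half of a RAG pipeline, assuming OpenAI embeddings feeding a Qdrant collection; the model name, collection name, and payload fields are assumptions:

```typescript
// Knowledge indexing sketch: embed document chunks and upsert them into Qdrant
// so retrieval can later return the original text as LLM context.
import OpenAI from "openai";
import { QdrantClient } from "@qdrant/js-client-rest";
import { randomUUID } from "node:crypto";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const qdrant = new QdrantClient({ url: process.env.QDRANT_URL ?? "http://localhost:6333" });

export async function indexChunks(chunks: string[]) {
  // One embeddings call for the whole batch of chunks.
  const { data } = await openai.embeddings.create({
    model: "text-embedding-3-small", // placeholder embedding model
    input: chunks,
  });

  // Store each chunk's vector alongside its source text.
  await qdrant.upsert("knowledge_base", {
    wait: true,
    points: data.map((d) => ({
      id: randomUUID(),
      vector: d.embedding,
      payload: { text: chunks[d.index] },
    })),
  });
}
```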
  
DevOps & Tools: 
- Docker, Kubernetes, Nginx ingress routing 
- CI/CD using Azure DevOps / GitHub Actions
- Git branching, versioning, and backup strategies
- Environment management (.env, secrets, configs)
  
Preferred Experience 
- 3–5 years of hands-on full-stack development experience 
- Experience in AI-powered enterprise systems or LLM-based applications
- Familiarity with telecom BSS/OSS domains (CRM, Billing, Product Catalog, Order Management) is an added advantage
- Prior exposure to MCP, RAG, or vector database systems is preferred
- Comfortable working in a modular microservice architecture
- Strong debugging, documentation, and collaboration skills
  
Soft Skills 
- Excellent problem-solving and analytical mindset 
- Strong communication and documentation habits
- Ability to work independently in a fast-paced, evolving environment
- Collaborative mindset to work with architects, AI engineers, and UI designers