Role overview
- Build real-world agentic AI systems, not experimental demos
- Work on enterprise-scale GenAI deployments with measurable business impact
- Collaborate with top-tier engineers and AI practitioners
- Influence architecture and technical direction in a fast-growing, well-funded AI environment
- Competitive compensation aligned with senior GenAI expertise
For more details, reach out at **resumes@navitassols.com**.
About Navitas Partners, LLC
Navitas Partners, LLC is a WBENC-certified firm and one of the fastest-growing technical/IT staffing firms in the US, providing services to numerous clients. We offer the most competitive pay for every position. We understand this is a partnership: you will not be blindsided, and your salary will be discussed upfront.
What you'll work on
- Design, develop, and deploy agentic AI systems leveraging large language models and modern GenAI frameworks
- Integrate GenAI capabilities into full-stack applications and internal enterprise workflows
- Collaborate on prompt engineering, model fine-tuning, and systematic evaluation of generative outputs
- Build reusable services and components for multi-agent orchestration and intelligent task automation
- Optimize AI inference pipelines for scalability, latency, reliability, and cost efficiency
- Contribute to architectural discussions and help shape the technical roadmap for the AI pod
- Ensure best practices for production-grade GenAI deployment, monitoring, and lifecycle management
Core Skills & Experience
Must-Have Qualifications
- 8+ years of software engineering experience, including 2–3+ years working with AI/ML or Generative AI systems in production
- Strong hands-on experience with Python (required) for AI/ML model integration and orchestration
- Practical experience with LLM frameworks such as LangChain and LlamaIndex (required)
- Hands-on exposure to agentic AI frameworks including LangGraph and/or Google ADK (required)
- Solid understanding of Git, CI/CD pipelines, DevOps, and production deployment practices
- Experience working with Google Cloud Platform (GCP), including tools such as Vertex AI, Cloud Run, and GKE
Good-to-Have Skills
- Experience building AI-powered APIs, embeddings, and vector search integrations
- Exposure to fine-tuning open-source LLMs (e.g., LLaMA, Mistral) or working with OpenAI APIs
- Experience with multimodal AI systems (text, image, or voice)
- Familiarity with low-code / no-code platforms (e.g., AppSheet) for workflow automation