Role Overview
We’re Hiring a Data Scientist!
Are you passionate about AI and excited to work on cutting-edge applications in NLP and large language models (LLMs)? Do you have experience with AI agents and Retrieval-Augmented Generation (RAG)? If so, we’d love to have you on our team!
What You’ll Do:
- Develop and Deploy Models: Build and optimize machine learning models, with a strong emphasis on NLP and LLM applications.
- Implement RAG-based Solutions: Design and implement Retrieval-Augmented Generation workflows to solve real-world challenges (see the sketch after this list).
- Experiment with AI Agents: Prototype, evaluate, and optimize AI agents for diverse use cases.
- Stay Updated: Continuously explore the latest trends and advancements in AI and ML to bring innovative ideas to the team.
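To give a flavor of the work, here is a minimal, illustrative sketch of a RAG loop: retrieve relevant passages, augment the prompt, and generate an answer. It is a toy example, not our production stack; the keyword-overlap retriever, the stubbed `generate` call, and the sample documents are placeholders, and in practice you would swap in an embedding model, a vector store, and a real LLM (for example via LangChain or Hugging Face).

```python
# Toy RAG sketch: retrieve -> augment prompt -> generate.
# Retrieval here is naive keyword overlap purely for illustration.

from typing import List


def retrieve(query: str, documents: List[str], top_k: int = 2) -> List[str]:
    """Rank documents by keyword overlap with the query and return the top_k."""
    query_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]


def build_prompt(query: str, context: List[str]) -> str:
    """Augment the user question with retrieved context before generation."""
    context_block = "\n".join(f"- {passage}" for passage in context)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context_block}\n\n"
        f"Question: {query}\nAnswer:"
    )


def generate(prompt: str) -> str:
    """Placeholder for an LLM call (e.g., a Hugging Face pipeline or a hosted API)."""
    return f"[LLM response to a prompt of {len(prompt)} characters]"


if __name__ == "__main__":
    docs = [
        "Our returns policy allows refunds within 30 days of purchase.",
        "Support is available by email Monday through Friday.",
        "Gift cards are non-refundable and never expire.",
    ]
    question = "How long do customers have to request a refund?"
    answer = generate(build_prompt(question, retrieve(question, docs)))
    print(answer)
```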
What We’re Looking For:
Must-Have Skills:
- Programming Expertise: Proficiency in Python and experience with NLP/LLM frameworks such as LangChain and Hugging Face.
- AI and NLP Knowledge: Hands-on experience with LLMs, transformers, or advanced NLP techniques.
- Problem-Solving Ability: Strong analytical skills and a creative mindset to tackle complex challenges.
- Curiosity and Drive: A passion for AI innovation and continuous learning.
- Effective Communication: Ability to clearly present insights, findings, and results to both technical and non-technical audiences.
Preferred Qualifications:
- Experience with RAG Workflows: Familiarity with integrating external data sources into LLMs.
- Cloud and Deployment Expertise: Knowledge of cloud platforms (AWS, GCP, or Azure) and experience building deployment pipelines.
- Open-Source Contributions or Research: Demonstrated involvement in ML/NLP open-source projects or published research.
Join us in shaping the future of AI! If you’re ready to make an impact and work on some of the most exciting advancements in AI technology, apply now!