Nairobi, Kenya | Corporate Staffing Services | Full time
About Us
We are the leading telecommunication company in East Africa. Our purpose is to transform lives by connecting people to people, people to opportunities and people to information. We keep over 42 million customers connected and play a critical role in society, supporting over one million jobs both directly and indirectly.
We are listed on the Nairobi Securities Exchange (NSE) and recorded annual revenue of close to KES 298 billion (about $2.5 billion) as at March 2022.
We were founded in 1997 as a fully owned subsidiary of Telkom Kenya, before Vodafone Group PLC acquired a 40 percent stake in May 2000.
Responsibilities:
Collaborate with data scientists and software engineers to design and implement machine learning workflows.
Take offline models built by data scientists and turn them into production machine learning systems.
Develop and deploy scalable custom tools and services that can handle machine learning training and inference.
Apply software engineering best practices, such as CI/CD, versioning and containerization, to machine learning.
Develop machine learning algorithms and libraries for problem-solving and AI operations.
Research and provide input on design approach, performance and base functionality improvements for various software applications.
Stay up to date with the latest developments in machine learning and cloud computing technologies.
Qualifications
BS or MS in computer science or equivalent practical experience.
2-3 years of coding experience in a professional, non-university setting.
Proven experience in object-oriented development.
Experience in deploying and managing Machine Learning models at scale.
Experience with MLOps platforms such as Kubeflow, MLflow, SageMaker, etc.
Familiarity with DevOps practices and tools such as Kubernetes, Docker, Jenkins, Git.
Proficient understanding of distributed computing principles.
Experience with NoSQL databases, such as HBase, Cassandra, MongoDB.
Demonstrated proficiency with data structures, algorithms, distributed computing, and ETL systems.
Good knowledge of and experience with big data frameworks such as Apache Hive, Spark.
Strong understanding of machine learning concepts and frameworks, including TensorFlow, PyTorch, scikit-learn, Kedro, etc.