Location: State of São Paulo, Brazil
Date Posted: November 1, 2019
About the Team
Data Engineering plays a significant part in all of our strategic efforts and decisions at Wildlife. Our mission is to provide our company with complete, secure, reliable, high-quality, and highly available data. To accomplish this mission, we are looking for engineers to help us develop cutting-edge data science infrastructure. We love working with large datasets, low-latency data systems, and complex business logic.
About the Role
The Machine Learning Engineering team provides generalized machine learning support for the data science teams, building and productionizing models in partnership with them. Our current tech stack for data processing includes Hadoop (MapReduce), Hive, Presto, and Spark; the challenge is dealing with more than 1 PB of historical data as input to our models.
We also have services for advertising, matchmaking for battles between players, and live-ops (e.g., tournament events) that will consume ML models to improve user experience and engagement. In this role, you will design and develop the supporting machine learning infrastructure for the data science teams.
This role will not involve applied machine learning (such as developing new models); rather, it will support other teams that do.
More about you
- Strong engineering background and a solid understanding of machine learning, both theoretical and practical;
- You enjoy working with complex business logic and dealing with large datasets;
- Smart and creative: you have the ability and persistence to solve problems, big and small. Curious by nature, you're constantly looking for ways to improve things;
- Demonstrate critical thinking and problem-solving capabilities both independently and collaboratively;
- You're flexible, fearless, and excited to help build something;
- You're hands-on, in the right ways; willing and able to do whatever is needed, no matter the task.
What you’ll do
- Build expertise around ML tools and approaches such as SageMaker, MLeap, and MLflow;
- Implement workflows for a smooth and powerful model-development experience for data scientists;
- Act as an ML hub for engineering teams, providing them with guidance, tools, and components when they need to integrate machine learning models into their tools, systems, and workflows;
- Work closely with data science teams to assess the computational load and scalability of their workflows and ML algorithms (e.g., when data scientists bring a new modeling technique, advise on how to write an efficient implementation, ensuring parallelization approaches are well supported by our infrastructure and understood by data science as well).
What you'll need
- MSc or Ph.D. in Computer Science or a related field with expertise in Machine Learning;
- At least 5 years of experience as a Machine Learning Engineer;
- At least 2 years of experience deploying models into production at large scale;
- Strong experience with Python and Scala;
- Experience with ML-related technologies (e.g., SageMaker, MLflow, MLeap, or H2O);
- Highly skilled with SQL and building ETL/ELT workflows;
- Relevant experience with Hadoop ecosystem (Hive, HDFS, Yarn);
- Experience with orchestration frameworks such as Airflow or Luigi;
- Experience contributing to MLlib is desirable, but not required.
We welcome people from all backgrounds who seek the opportunity to help build the best gaming company, where everyone thrives.
Open to assisting the right candidate with the following visa(s) / work permit(s):
1) Brazil - Work Permit