Location: Hyderabad, Telangana, India
Pogo has been the leader in online casual games since 1998. Featuring a growing library of 60+ titles spanning popular genres like Solitaire, Mahjong, Match 3, and more, Pogo exists to be the best destination for online casual games. We aim to produce high-quality HTML5-powered games with metagames and social mechanics, all playable across desktop, tablet, and mobile. Our fans and subscribers come to Pogo for fresh content, daily challenges, great events, new games, and a live service that delivers!
We are looking for a Data Engineer based in our EA Hyderabad office. You will report to the Analytics Manager, working remotely as part of the product team. In this key role, you will work with game team analysts and Product Managers to design and build data platforms and enrichment pipelines that enable data democratization across the game studio. You will help transform how we use data at Pogo to drive the success of our live service.
Main Responsibilities (What you will do)
- Data Pipeline Development: Design, build, and maintain scalable, efficient data pipelines to support data analytics initiatives.
- Data Integration: Ensure seamless data integration from multiple sources, including databases, APIs, and cloud-based data lakes.
- Data Quality: Implement data quality checks, validation processes, and monitoring to ensure data accuracy and reliability.
- Data Architecture: Contribute to the design and optimization of data warehouse and data lake architecture for better performance and scalability.
- ETL Automation: Develop, schedule, and manage ETL workflows, automating data extraction, transformation, and loading processes.
- Performance Optimization: Identify and resolve performance bottlenecks in data pipelines and database queries to improve overall system efficiency.
- Team Collaboration: Collaborate with analysts and other stakeholders to understand their data requirements and support them in building data-driven solutions.
- Documentation: Create and maintain comprehensive documentation for data pipelines, processes, and data dictionaries.
Qualifications (What you know how to do)
- 2+ years of relevant industry experience in a data engineering role.
- High proficiency in writing SQL queries and knowledge of cloud-based databases such as Snowflake, Redshift, BigQuery, or other big data solutions.
- Experience deploying pipelines using Airflow.
- Experience with data modeling, ETL processes, big data, and data warehousing.
- Experience with business intelligence tools such as Tableau or similar.
- Strong technical understanding of data modeling, design, and architecture principles across master data, transaction data, and derived/analytic data.
- Ability to work cross-functionally with both technical and business partners.
- Experience developing in Python or a similar language within modern ETL frameworks such as Airflow.