Short Description
Western Union is seeking a Senior Associate, Big Data, who will be responsible for designing and building the infrastructure for data extraction from a variety of sources, preparation, and analytics, and for providing solution implementation including system set-up, development, system and integration testing, and user training.
Job Description
- Proven experience working with a team of engineers and service providers in a geographically distributed structure.
- Use varied big data architectural styles, patterns, technologies, and frameworks to design and build actionable data analytics systems at Western Union.
- Exposure to AWS cloud-based big data systems for data ingestion, processing, analytics, and machine learning is expected.
- Troubleshoot issues and coordinate resolution with operations, functional, and technical teams that may be geographically distributed.
- Define and adopt the best global practices that help in building a scalable, resilient, highly available big data platform.
- Support server buildout on multiple sites for failover/disaster recovery in a highly available ecosystem.
- Deliver common framework architecture and high-level design.
- Develop and implement new software, maintain and improve existing software.
- Conduct detailed design, code, and test reviews.
- Ensure that software functionality is implemented with a focus on high performance.
- Work with business partners to understand their needs and proactively lead them to solutions.
- Must be able to multi-task, handling numerous competing priorities.
- Provide coaching to Junior Associates.
- B.E. / B.Tech. / MCA in Computer Science, Engineering or related field is required.
- 8-10 years of software engineering and architecture experience in a fast-paced corporate environment.
- Comprehensive experience in Java/Scala.
- Provide solutions for real-time/batch data ingestion into the Hadoop Data Lake through streaming systems.
- Perform batch data loads into Hadoop through HDFS/Sqoop scripts, via Hive or Impala.
- Experience with Spark Datasets and Spark SQL, and with providing data to machine learning libraries for executing various machine learning algorithms.
- Expertise in writing and tuning queries and jobs. Experience with both relational and NoSQL databases.
- Experience with AWS services such as Kinesis, Redshift, EMR, and RDS.
- Perform analysis of vast data stores to uncover insights. Experience implementing BI tools such as Zoomdata, Arcadia, Power BI, etc.
- Experience with fast analytics tools such as Kudu is desired.
- Strong skills in stream processing systems such as Storm, Spark Streaming, and Kafka Streams. Exposure to Kafka is expected.
- Experience with the Cloudera Hadoop ecosystem is desired.
- Strong Linux/Unix skills required.
- Understanding of various development methodologies, including traditional Waterfall and iterative development methods (Unified Process and Agile).