We're Fiserv, a global leader in Fintech and payments, enabling innovative financial services experiences that are in step with the way people live and work today. Our aspiration is to move money and information in a way that moves the world. Our Purpose is to deliver superior value for our clients through leading technology, targeted innovation and excellence in everything we do.
Our core values are to earn client trust every day, create with purpose, inspire and achieve excellence, do the right thing and deliver on the promise of one Fiserv. It's these values that create a foundation for us to make the right decisions and deliver on our commitments to our clients and each other.
Do you enjoy working with Scrum teams and strive to be the best? We are looking for enthusiastic people to join our International Payment Platform team.
We work in an agile way to develop multiple payment products, such as Merchant/Terminal onboarding, Internet Payment gateways, Payment terminal software, Digital portals, Business analytics, and Mobile devices as a Reporting/Payment solution.
We are currently expanding our technology centre and are seeking people who want to work on core data-driven products deployed on a global scale. We want to talk with people who want to:
• Create better experiences for users
• Prioritize ease of use and functionality
• Build and deploy new services
• Be passionate, make an impact and be first
• Join a team on a mission
WHAT WILL YOU DO
• Create Spark/Scala jobs for data transformation and aggregation (an illustrative sketch follows this list)
• Develop unit tests for Spark transformations and aggregations
• Develop production-grade real-time or batch data integrations between systems
• Build ETL data processing with Spark Streaming, writing processed results into Kafka
• Design and build data pipelines of medium to high complexity
• Execute practices such as continuous integration and test-driven development to enable the rapid delivery of working code
• Deploy production-grade data pipelines, data infrastructure and data artifacts as code
• Develop estimates for data driven solutions
• Establish standards of good practice such as coding standards and data governance
• Peer review code developed by others
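To give a flavour of the first two responsibilities, here is a minimal, illustrative Spark/Scala sketch. The object name, column names and storage paths are hypothetical, and the transformation is kept as a pure function on a DataFrame so it can be covered by the kind of unit tests mentioned above.

    // Illustrative sketch only: object name, columns and paths are hypothetical.
    import org.apache.spark.sql.{DataFrame, SparkSession}
    import org.apache.spark.sql.functions.{col, count, sum}

    object DailyMerchantTotals {

      // Pure transformation, separated from I/O so it can be unit tested
      // against a small in-memory DataFrame.
      def aggregateByMerchant(transactions: DataFrame): DataFrame =
        transactions
          .filter(col("status") === "SETTLED")
          .groupBy(col("merchant_id"), col("settlement_date"))
          .agg(
            sum(col("amount")).as("total_amount"),
            count(col("transaction_id")).as("transaction_count")
          )

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("daily-merchant-totals").getOrCreate()

        // Hypothetical input and output locations.
        val transactions = spark.read.parquet("s3://example-bucket/transactions/")
        val totals = aggregateByMerchant(transactions)
        totals.write.mode("overwrite").parquet("s3://example-bucket/merchant-totals/")

        spark.stop()
      }
    }

A matching unit test would build a small DataFrame in memory, call aggregateByMerchant and assert on the collected rows, which is how test-driven development of Spark transformations typically proceeds.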