What do we do?
GFT Technologies is driving the digital transformation of the world’s leading financial institutions. Other sectors, such as industry and insurance, also leverage GFT’s strong consulting and implementation skills across all aspects of pioneering technologies, such as cloud engineering, artificial intelligence, the Internet of Things for Industry 4.0, and blockchain.
With its in-depth technological expertise, strong partnerships and scalable IT solutions, GFT increases productivity in software development. This provides clients with faster access to new IT applications and innovative business models, while also reducing risk.
We’ve been a pioneer of near-shore delivery since 2001 and now offer an international team spanning 16 countries, with a global workforce of over 9,000 people. GFT is recognised by industry analysts, such as Everest Group, as a leader amongst global mid-sized service integrators, and is ranked among the top 20 global service integrators in many exponential technologies, such as Open Banking, Blockchain, Digital Banking, and Apps Services.
Role Summary
As a Senior Data Engineer at GFT, you will be responsible for designing, managing, and enhancing the data systems and workflows that drive key business decisions. The role is weighted roughly 75% towards data engineering, building and optimizing data pipelines and architectures, and 25% towards supporting data science initiatives, collaborating with data science teams on machine learning workflows and advanced analytics. You will leverage technologies such as Python, Apache Airflow, Kubernetes, and AWS to deliver high-quality data solutions.
Key Activities
- Architect, develop, and maintain scalable data infrastructure, including data lakes, pipelines, and metadata repositories, ensuring the timely and accurate delivery of data to stakeholders
- Work closely with data scientists to build and support data models, integrate data sources, and support machine learning workflows and experimentation environments
- Develop and optimize large-scale batch and real-time data processing systems to enhance operational efficiency and meet business objectives
- Leverage Python, Apache Airflow, and AWS services to automate data workflows and processes, ensuring efficient scheduling and monitoring
- Utilize AWS services such as S3, Glue, EC2, and Lambda to manage data storage and compute resources, ensuring high performance, scalability, and cost-efficiency
- Implement robust testing and validation procedures to ensure the reliability, accuracy, and security of data processing workflows
- Stay informed of industry best practices and emerging technologies in both data engineering and data science to propose optimizations and innovative solutions
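To give a flavour of the day-to-day work, here is a purely illustrative sketch (not part of the role description) of the kind of validation step mentioned above, written in plain Python; the record schema and checks are hypothetical:

```python
def validate_records(records, required_fields=("id", "timestamp", "amount")):
    """Split incoming records into valid rows and rejected rows.

    A record is rejected if any required field is missing or None,
    or if 'amount' is not a non-negative number. In a real pipeline
    the rejected rows would typically be routed to a quarantine
    location for inspection rather than silently dropped.
    """
    valid, rejected = [], []
    for rec in records:
        missing = [f for f in required_fields if rec.get(f) is None]
        if missing:
            rejected.append((rec, f"missing fields: {missing}"))
            continue
        if not isinstance(rec["amount"], (int, float)) or rec["amount"] < 0:
            rejected.append((rec, "amount must be a non-negative number"))
            continue
        valid.append(rec)
    return valid, rejected

# Hypothetical sample data: one clean row, one missing a timestamp,
# one with a negative amount.
records = [
    {"id": 1, "timestamp": "2024-01-01T00:00:00", "amount": 10.5},
    {"id": 2, "timestamp": None, "amount": 3.0},
    {"id": 3, "timestamp": "2024-01-02T00:00:00", "amount": -1},
]
valid, rejected = validate_records(records)
```

In production, checks like these would usually run inside an orchestrated task (e.g. an Airflow operator) rather than as a standalone script.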
Required Skills
- Core Expertise: Proficiency in Python for data processing and scripting (pandas, PySpark), workflow automation (Apache Airflow), and experience with AWS services (Glue, S3, EC2, Lambda)
- Containerization & Orchestration: Experience working with Kubernetes and Docker for managing containerized environments in the cloud
- Data Engineering Tools: Hands-on experience with columnar and big data databases (Athena, Redshift, Vertica, Hive/Hadoop), along with version control systems like Git
- Cloud Services: Strong familiarity with AWS services for cloud-based data processing and management
- CI/CD Pipeline: Experience with CI/CD tools such as Jenkins, CircleCI, or AWS CodePipeline for continuous integration and deployment
- Data Engineering Focus (75%): Expertise in building and managing robust data architectures and pipelines for large-scale data operations
- Data Science Support (25%): Ability to support data science workflows, including collaboration on data preparation, feature engineering, and enabling experimentation environments
Nice-to-have requirements
- Advanced Data Science Tools: Experience with Amazon SageMaker or Databricks for enabling machine learning environments
- Big Data & Analytics: Familiarity with both RDBMS (MySQL, PostgreSQL) and NoSQL (DynamoDB, Redis) databases
- BI Tools: Experience with enterprise BI tools like Tableau, Looker, or Power BI
- Messaging & Event Streaming: Familiarity with distributed messaging systems like Kafka or RabbitMQ for event streaming
- Monitoring & Logging: Experience with monitoring and log management tools such as the ELK stack or Datadog
- Data Privacy and Security: Knowledge of best practices for ensuring data privacy and security, particularly in large data infrastructures
What can we offer you?
- Competitive salary
- 13th-month salary guarantee
- Performance bonus
- Professional English course for employees
- Premium health insurance
- Extensive annual leave
Due to the high volume of applications we receive, we are unable to respond to every candidate individually. If you have not received a response from GFT regarding your application within 10 workdays, please consider that we have decided to proceed with other candidates.
About Us
We show commitment to our investors and stand for solid, long-term growth performance. Founded in Germany in 1987 and present in the Americas since 2008, GFT has expanded globally to more than 10,000 experts across more than 15 markets, ensuring proximity to clients. With new opportunities from Asia to Brazil, the international growth story continues. We are committed to growing tech talent worldwide, because our team’s strong consulting and development skills across legacy and pioneering technologies, like GreenCoding, underpin our success. We maintain a family atmosphere in an inclusive work environment.
Why Choose GFT?
- Competitive Compensation
- Benefits package including comprehensive medical, dental, and vision coverage, among others
- Company Culture based on our Core Values
- Professional Development Training with Individual Development Plans to map out your career growth
- Opportunity to work in a global environment with diverse teams built with colleagues from around the world
- Opportunity to work with technology leaders in the financial services industry
- Opportunity to work for big name clients in capital markets, banking and other industries