The Data Engineering team is looking for people who are passionate about working in agile delivery environments and resolving the engineering challenges of building robust and scalable data systems aligned to enterprise data strategy.

As Lead Data Engineer, you will be responsible for developing, constructing, and testing large-scale data analytics systems on the AWS cloud that will help address the disparate analytics needs of a growing organization.

Responsibilities:
- Conceptualize, design, and implement analytics products that enhance Investments analytics capabilities.
- Design solutions aligned with long-term architecture and technology strategy using Amazon Web Services (AWS) for Cloud development.
- Participate in the development life cycle from start to completion - requirements analysis, development, testing, and deployment.
- Work in a fast-paced environment collaborating with developers, data engineers, architects, researchers, and data scientists.
- Ensure architecture will support the requirements of the Investments business.
- Develop tools that prepare, transform, combine, and manage structured and unstructured data for use by Investments business users.
- Define and shape Investments' future technology and research process.

Must have:
- University degree in Engineering or Computer Science preferred.
- 8+ years of overall IT experience.
- Hands-on experience building data exploratory interfaces using notebooks (JupyterHub/Lab, Livy, Spark, Hudi, Hive).
- Hands-on expertise building data pipelines and applications, leveraging Kubernetes and Airflow.
- Knowledge of and experience driving Infrastructure as Code (IaC) concepts within an organization, leveraging Terraform, Ansible, and GitHub Actions.
- Deep proficiency in Python, with experience using Spark, Pandas, or PySpark.
- Experience with AWS EMR, S3, and Glue.
- Experience building ETL pipelines, managing multiple datasets, and providing necessary support.

Nice to have:
- Experience with a front-end framework (e.g., Angular, React) is a plus.
- Interest in the financial industry.
- Familiarity with cloud technology best practices that enable the distribution and analysis of big data in the cloud (formatting, partitioning, etc.).
- Ability to work in an entrepreneurial environment and be a self-starter.
- Demonstrated ability to work with both abstract and concrete concepts and to reconcile them for the appropriate audience and context.
- Ability to quickly understand organizational dynamics and management priorities, and to work effectively in a fast-paced, results-driven company.
- Strong facilitation, negotiation, interpersonal, communication, and collaboration skills.
- Experience with cloud-based data and analytics platforms, warehouses (Redshift/Spectrum, Databricks, Snowflake), BI tools, and OLAP systems (ClickHouse, Druid), including a mix of relational, non-relational, streaming, and event-based architectures.
- Familiarity with related notebook and visualization tooling (AWS SageMaker, Sparkmagic, Enterprise Gateway, matplotlib, Plotly/Dash, etc.).

Is a Remote Job?
Hybrid (Remote with required office time)
Employment Type
Full time

Luxoft is the design, data, and development arm of DXC Technology, providing bespoke, end-to-end technology solutions for mission-critical systems, products, and services.

We help create data-fueled...
