Job Description
As a Data Engineer you will create and maintain data pipelines and the associated analytics tools, insights and business performance metrics; you'll be responsible for building the infrastructure for optimal extraction, transformation and loading of data, with a focus on security.
You'll work with a modern cloud-based technology stack encompassing Scala, Python, Apache Spark, BigQuery, DBT, Apache Airflow, GCP and AWS, improving access to data and data insights as well as optimising huge datasets.
The company can currently offer a remote interview and onboarding process as well as the ability to work from home throughout the term of the contract (although occasional visits to the London office could be required).
Requirements:
*In-depth knowledge of GCP, including hands-on experience of securing GCP projects and applications and implementing security best practices covering networking, VPC security controls, security logging/auditing and IAM
*Strong Scala development skills, ideally with Python too
*Strong Apache Spark experience
*Experience with data warehousing technologies, including BigQuery, DBT and Apache Airflow
*Experience handling large datasets and data lakes
*Excellent communication / collaboration skills
Apply now or call to find out more about this Data Engineer (Google Cloud Platform Scala Spark Big Data) contract opportunity.
Job Role: Data Engineer (GCP); Location: Remote WfH; Rate: £600 to £650 p/day (Outside IR35); Term: 6 months; Start: Immediate / ASAP
