Data Engineer

Data Engineer Consultant | Hybrid | Netherlands | €77K–€88K + €3K Bonus

Job Title: Data Engineer Consultant

Location: Netherlands (Hybrid - 2 days office, 3 days home)
Industry: Data Engineering
Compensation: €77,472 - €87,840 per year (€3,200 - €4,000 monthly)
Monthly Bonus: €3,000
Working Hours: Minimum 36 hours per week
Vacation Days: 25
Mobility Budget: €450 monthly
Visa Sponsorship: Not Available
Languages Required: Fluent Dutch and English
Relocation Assistance: Not Available

Job Description

As a Data Engineer Consultant, your primary responsibility is to prepare data for analytical or operational use. You will build data pipelines that bring together information from different source systems, then integrate, consolidate, and clean the data before structuring it for use in analytical applications.

While working on challenging assignments with our clients, we also focus on your professional growth. We believe in helping you discover and unlock your potential through coaching, training, and sharing knowledge. This enables you to continue developing as a professional and helps us serve our clients even better.

Ideal Candidate

The ideal candidate has deep knowledge of data engineering and of both conceptual and dimensional data modeling. You should have experience with cloud platforms such as Microsoft Azure or AWS and be comfortable working with Scrum, Agile, and DevOps methodologies. You should also be proficient in technologies such as Databricks, Spark Structured Streaming, and PySpark, be capable of translating user requirements into appropriate solutions, and be skilled in analyzing source data and designing effective data models.

Key Responsibilities

  • Data Engineering: Build and maintain data pipelines, integrate data from various source systems, and structure it for analytical purposes.

  • Data Modeling: Apply conceptual and dimensional data modeling techniques to ensure data can be leveraged effectively.

  • Technology Application: Use Databricks, Spark, and PySpark to build robust data solutions (a brief sketch follows this list).

  • Collaboration: Work within Scrum and Agile teams to develop data solutions that meet business needs.
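
For illustration only, here is a minimal PySpark sketch of the integrate, consolidate, clean, and structure pattern described above. The source paths and the raw_orders, customers, and sales_fact names are hypothetical and not taken from any client assignment.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_pipeline").getOrCreate()

# Integrate: read from two hypothetical source systems.
orders = spark.read.parquet("/mnt/source/raw_orders")
customers = spark.read.parquet("/mnt/source/customers")

# Consolidate and clean: join, deduplicate, and normalise types.
fact = (
    orders.join(customers, on="customer_id", how="left")
          .dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .withColumn("order_date", F.to_date("order_ts"))
          .filter(F.col("order_amount").isNotNull())
)

# Structure for analytics: write a partitioned table for downstream BI use.
fact.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/sales_fact")
```

The same transformations carry over to Spark Structured Streaming by swapping spark.read/write for spark.readStream/writeStream against a streaming source such as Kafka or Databricks Auto Loader.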

Skills & Qualifications

Must-Have Skills

  • Data Engineering

  • Data Modeling

  • Scrum, Agile, DevOps methodologies

  • Python

  • MySQL

  • Microsoft Azure

  • Bachelor’s degree (HBO or equivalent)

  • Fluency in Dutch

Preferable Skills

  • Databricks

  • Microsoft Power BI

  • Azure Data Factory

  • Data Vault

  • Data Governance

  • Bachelor’s degree in Data Science (BSc) or Computer Science (BSc)

  • Data Engineering on Microsoft Azure (DP-203) certification

  • Proficiency in English

Soft Skills

  • Strong communication skills

  • Adaptability

  • Teamwork and collaboration

  • Problem-solving abilities

  • Self-driven and motivated

Experience

  • More than 5 years of experience working in complex data environments at top 500 companies.

Compensation & Benefits

  • Annual Salary: €77,472 - €87,840

  • Monthly Salary: €3,200 - €4,000

  • Monthly Bonus: €3,000

  • Mobility Budget: €450

  • Extra Benefits: Pension package, phone, expense reimbursement, lease budget, and laptop.

Working Conditions

  • Hybrid Work: 2 days in the office, 3 days remote

  • Vacation: 25 days off per year

  • Visa Sponsorship: Not available

  • Relocation Assistance: Not available

  • Working Hours: Minimum of 36 hours per week

 

Senior Data Engineer - London - Full Time Perm Hybrid - Base Salary - GBP £70,000 to £90,000

Senior Data Engineer

London

Full-Time, Permanent, Hybrid

Base Salary: £70,000 to £90,000

 

 

Job Description

Skimlinks, a Connexity and Taboola company, drives e-commerce success for 50% of the Internet’s largest online retailers. We deliver $2B in annual sales by connecting retailers to shoppers on the most desirable retail content channels. As a pioneer in online advertising and campaign technology, Connexity is constantly iterating on products, solving problems for retailers, and building interest in new solutions.

We have recently been acquired by Taboola to create the first open-web source for publishers, connecting editorial content to product recommendations so that readers can easily buy products related to the stories they are reading.

Skimlinks, a Taboola company, is a global e-commerce monetization platform, with offices in LA, London, Germany, and NYC. We work with over 60,000 premium publishers and 50,000 retailers around the world helping content producers get paid commissions for the products and brands they write about.

 

About the role

We are looking for a Senior Data Engineer to join our team in London. We are creating a fundamentally new approach to digital marketing, combining big data with large-scale machine learning. Our data sets are on a truly massive scale - we collect data on over a billion users per month and analyse the content of hundreds of millions of documents a day.

As a member of our Data Platform team your responsibilities will include:

  • Design, build, test and maintain high-volume Python data pipelines.

  • Analyse complex datasets in SQL.

  • Communicate effectively with Product Managers and Commercial teams to translate complex business requirements into scalable solutions.

  • Apply software development best practices.

  • Work independently in an agile environment.

  • Share your knowledge across the business and mentor colleagues in areas of deep technical expertise.

 

Requirements:

Here at Skimlinks we value dedication, enthusiasm, and a love of innovation. We are disrupting the online monetization industry, and welcome candidates who want to be a part of this ambitious journey. But it is not just hard work; we definitely appreciate a bit of quirkiness and fun along the way.

  • A Bachelor's or Master's degree in computer science or a related field.

  • Solid programming skills in both Python and SQL.

  • Proven work experience on Google Cloud Platform or other clouds, developing scalable batch (Apache Airflow) and streaming (Dataflow) data pipelines (a brief sketch follows this list).

  • Passion for processing large datasets at scale (BigQuery, Apache Druid, Elasticsearch)

  • Familiarity with Terraform, DBT & Looker is a plus.

  • Initiative in driving performance optimisation and cost reduction.

  • A commercial mindset and a passion for creating outstanding products.
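
As a rough illustration of the kind of batch pipeline mentioned above, the sketch below shows a two-task Apache Airflow DAG. It assumes Airflow 2.4 or later; the DAG id, task names, and placeholder logic are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(**context):
    # Placeholder: pull one day of raw events from the source system.
    print(f"extracting events for {context['ds']}")


def load_to_warehouse(**context):
    # Placeholder: load the transformed events into the analytics warehouse.
    print(f"loading events for {context['ds']}")


with DAG(
    dag_id="daily_events_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load
```

In practice the load step would typically hand off to a BigQuery operator or a Dataflow job rather than a plain Python callable; the shape of the DAG stays the same.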

Voted “Best Places to Work,” our culture is driven by self-starters, team players, and visionaries. Headquartered in Los Angeles, California, the company operates sites and business services in the US, UK, and EU. We offer top benefits including Annual Leave Entitlement, paid holidays, competitive comp, team events and more!

  • Healthcare insurance & cash plans

  • Pension

  • Parental Leave Policies

  • Learning & Development Program (educational tool)

  • Flexible work schedules

  • Wellness Resources

  • Equity

We are committed to providing a culture at Connexity that supports the diversity, equity and inclusion of our most valuable asset, our people. We encourage individuality, and are driven to represent a workplace that celebrates our differences, and provides opportunities equally across gender, race, religion, sexual orientation, and all other demographics. Our actions across Education, Recruitment, Retention, and Volunteering reflect our core company values and remind us that we’re all in this together to drive positive change in our industry.

SKILLS AND CERTIFICATIONS [note: bold skills and certifications are required]

  • Airflow

  • Python

  • SQL

  • GCP

  • BigQuery

  • Data pipelines


Data Engineer - USA, Connecticut - $90,000 to $120,000

Data Engineer

USA, Connecticut

$90,000 to $120,000

 

Position Summary:

Reporting to the Associate Director, Information Technology, the Data Manager/Data Analytics is responsible for overseeing the development and use of company data systems and for ensuring that all information flowing to and from the company is handled in a timely and secure manner. They will also effectively identify, analyze, and translate business needs into technology and process solutions. This position can be based out of Westlake Village, CA; Marlborough, MA; or Danbury, CT.

 

Principal Responsibilities:

  • Works with other team members and business stakeholders to drive the development of business analytics requirements.

  • Leverages knowledge of business processes and the data domain.

  • Brings deep expertise in data visualization tools and techniques to translate business analytics needs into data visualization and semantic data access requirements.

  • Works with various business units to facilitate the technical design of complex data sourcing, transformation, and aggregation logic, ensuring business analytics requirements are met.

  • Leverages enterprise-standard tools and platforms to visualize analytics insights, typically working with and/or leading a small team.

  • Regularly monitors and evaluates information and data systems that could affect analytical results.

  • Translates business needs into technical specifications.

  • Designs, builds, and deploys BI solutions (e.g., reporting tools).

  • Manages integration tools and the data warehouse.

  • Manages and conducts data validation and troubleshooting (a brief sketch follows this list).

  • Creates visualizations and reports according to business requirements.

  • Monitors and enhances databases and related systems to optimize performance.

  • Proactively addresses scalability and performance issues.

  • Ensures data quality and integrity while supporting large data sets.

  • Debugs and resolves issues affecting database reliability, integrity, and efficiency.

  • As a member of the IT organization at MannKind Corp., the incumbent is also expected to be customer-focused, a problem solver, a communicator, professional, willing to learn, organized, and a team player.

  • Duties and responsibilities are not limited to the work listed above and may include other assignments as necessary.
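
For illustration, here is a minimal pandas sketch of the kind of routine data validation referenced above. The customer table and its columns are hypothetical.

```python
import pandas as pd

# Hypothetical extract of a customer table with deliberate quality issues.
df = pd.DataFrame(
    {
        "customer_id": [1, 2, 2, 4],
        "email": ["a@example.com", None, "b@example.com", "c@example"],
        "signup_date": ["2024-01-02", "2024-01-05", "2024-01-05", "2025-13-01"],
    }
)

# Basic integrity checks: duplicate keys, missing values, unparseable dates.
checks = {
    "duplicate_ids": int(df["customer_id"].duplicated().sum()),
    "missing_emails": int(df["email"].isna().sum()),
    "bad_dates": int(pd.to_datetime(df["signup_date"], errors="coerce").isna().sum()),
}

print(checks)  # {'duplicate_ids': 1, 'missing_emails': 1, 'bad_dates': 1}
```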

 

Education and Experience Qualifications:

  • BS/BA degree with a minimum of 3-5 years of related experience in data management or analysis.

  • 3+ years of experience with Relational Database Management Systems (RDBMS)

  • 3+ years of Business Intelligence / Analytics related work experience in challenging environments

  • Strong understanding of modern data modeling techniques

  • Strong understanding of cloud service providers (AWS, Google, Microsoft) and how to architect solutions around them

  • Ability to decipher and organize large amounts of data.

  • An analytical mindset with superb communication and problem-solving skills.

  • Ability to explain complex problems clearly and in nontechnical terms.

  • In-depth SQL programming knowledge: partitioning, indexing, performance tuning, stored procedures, and views (a brief sketch follows this list)

  • Hands-on experience in developing dashboards and data visualizations using BI tools (e.g., Power BI, Tableau)
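
Purely as a self-contained illustration of the indexing and view concepts named above, the sketch below uses Python's built-in sqlite3 module with a hypothetical sales table. Partitioning and stored procedures depend on the specific RDBMS (e.g., SQL Server or PostgreSQL) and are not shown here.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical sales table with a few sample rows.
cur.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL, sold_on TEXT)")
cur.executemany(
    "INSERT INTO sales (region, amount, sold_on) VALUES (?, ?, ?)",
    [("East", 120.0, "2024-01-05"), ("West", 80.0, "2024-01-06"), ("East", 200.0, "2024-02-01")],
)

# Index to speed up region filters, and a view exposing an aggregated, report-ready shape.
cur.execute("CREATE INDEX idx_sales_region ON sales (region)")
cur.execute("CREATE VIEW region_totals AS SELECT region, SUM(amount) AS total FROM sales GROUP BY region")

# Inspect the query plan to confirm the index is used for the filter.
for row in cur.execute("EXPLAIN QUERY PLAN SELECT * FROM sales WHERE region = 'East'"):
    print(row)

print(cur.execute("SELECT * FROM region_totals ORDER BY region").fetchall())
conn.close()
```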