
50556MKP - Senior Data Engineer (Bayer sl)


  • Location: Barcelona (Spain)
  • Contract type: Permanent
  • Working hours: Full-time
  • Sector: Pharmaceutical and biopharmaceutical
  • Vacancies: 1
  • Discipline: R&D
  • Work arrangement: Hybrid

Barcelona Activa

BARCELONA ACTIVA manages job offers from companies that need to fill staff vacancies. The published positions are not for Barcelona Activa itself. ONLY APPLICATIONS RECEIVED THROUGH THIS CHANNEL WILL BE CONSIDERED. The Worker Search Service is aimed at companies in the area that need to add new professionals to their teams, and offers support in identifying vacancies, recruiting staff, and pre-selecting candidates who match the requested profiles.

 

Offer description

This offer is part of the Biomedicine and Health Talent Marketplace organized by Barcelona Activa in collaboration with CATALONIA.HEALTH during the Connection Day. BAYER SL is looking for a Senior Data Engineer.

Only people who meet the profile requested by the company will be notified.

The pre-selected people will receive an email from Barcelona Activa with the confirmation of attendance and organizational details.

Once pre-selected, you will be asked to confirm your availability for the Job Marketplace: Tuesday, April 29th, from 15h to 18h (location to be determined).

 

FUNCTIONS AND TASKS:

BAYER facilitates a varied and meaningful career in a community of bright and diverse minds, making a real difference.

We are seeking a talented and highly motivated individual with a strong technical engineering background and relevant experience to join Bayer's Vegetable R&D Engineering and Automation Hub as a Senior Data Engineer. The Engineering and Automation Hub's mission is to create data-driven, end-to-end digital workflows that improve operational efficiency and support predictive breeding for the R&D organization.
The ideal candidate will have strong expertise in working with Big Data technologies like Google BigQuery and experience implementing ETL processes to manage data pipelines efficiently. You will play a crucial role in developing and maintaining key projects in the Vegetable R&D portfolio. This involves collaborating with a diverse group of global cross-functional scientists, engineers, developers, plant breeders, and IT teams.

YOUR TASKS AND RESPONSIBILITIES
Develop and troubleshoot SQL queries on Google BigQuery, and design scalable ETL pipelines using technologies like Metaflow and Python.
Oversee database management by optimizing SQL queries, implementing effective indexing strategies, and conducting regular performance tuning to enhance overall efficiency and responsiveness.
Establish materialized views for performance optimization, and create data quality checks and validation procedures to ensure data integrity.
Monitor, troubleshoot, and document data pipeline issues, implementing error handling and recovery mechanisms as needed.
Develop comprehensive data archiving and retention policies that balance storage efficiency, compliance requirements, and cost-effectiveness, utilizing tiered storage solutions and automated lifecycle management techniques.
Write clean, maintainable code following team, Bayer, and industry best practices. This includes writing comprehensive unit tests, participating in code reviews, and engaging in Agile Scrum development practices.
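To illustrate the kind of work described above, here is a minimal ETL sketch in plain Python with a simple data quality gate. It is purely illustrative: the record layout and field names are hypothetical, and the posting's actual stack (BigQuery, Metaflow) is not reproduced here.

```python
def extract(rows):
    """Stand-in for a source read: yield raw records as-is."""
    yield from rows

def transform(records):
    """Normalize field types and drop rows failing a data quality check."""
    for r in records:
        if r.get("yield_kg") is None:  # quality gate: reject incomplete rows
            continue
        yield {"plot_id": str(r["plot_id"]), "yield_kg": float(r["yield_kg"])}

def load(records, sink):
    """Append cleaned records to an in-memory sink (stand-in for a table)."""
    n = 0
    for r in records:
        sink.append(r)
        n += 1
    return n

raw = [{"plot_id": 1, "yield_kg": "12.5"},
       {"plot_id": 2, "yield_kg": None}]
table = []
loaded = load(transform(extract(raw)), table)
print(loaded)  # only the row passing the quality check is loaded
```

In a production pipeline each stage would typically be a Metaflow step or a BigQuery job, with the quality checks logged and failed rows routed to a quarantine table rather than silently dropped.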

 

YOUR APPLICATION

Bayer is committed to treating all applicants fairly and avoiding discrimination. If you are interested, Bayer is the company for you.

 

  • Type of contract: Permanent
  • Number of hours per week: 40h
  • Schedule: 9h to 18h. Hybrid
  • Gross annual remuneration: within the applicable Offices collective agreement; according to interview.

 

Requirements

WHO YOU ARE (Education/Experience)
Bachelor's degree in computer science, software engineering, or a related discipline, plus a minimum of 5 years of experience.
Proven experience building large-scale data pipelines for production applications.
History of working in Agile Scrum teams
SKILLS (Technical & Soft)
Strong expertise working with Google BigQuery, with advanced skills in SQL optimization, database performance tuning, data modeling, and using materialized views for enhanced efficiency.
Proven experience building ETL pipelines using AWS data services (e.g., RDS, Lambda, Step Functions) and orchestration frameworks like Metaflow or Airflow.
Familiarity with data streaming patterns and technologies like Apache Kafka.
Proficient in writing unit tests using frameworks (e.g., Pytest).
Experience with version control systems (e.g., GitHub) and working with CI/CD pipelines (GitHub Actions).
Excellent verbal and written communication skills, with the ability to work independently and collaboratively.
Strong attention to detail and commitment to code quality.
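As an example of the unit-testing skill requested above, here is a Pytest-style test for a small, hypothetical pipeline helper. Pytest discovers functions named `test_*` and runs their plain `assert` statements; the helper itself is an assumption, not part of the posting.

```python
def dedupe_keep_latest(rows):
    """Keep the last value seen for each key (hypothetical pipeline helper)."""
    latest = {}
    for key, value in rows:
        latest[key] = value
    return latest

def test_dedupe_keep_latest():
    # Later records for the same key should overwrite earlier ones.
    rows = [("a", 1), ("b", 2), ("a", 3)]
    assert dedupe_keep_latest(rows) == {"a": 3, "b": 2}

test_dedupe_keep_latest()  # pytest would collect and run this automatically
```

In a real codebase such tests would live under a `tests/` directory and run in CI (e.g., via GitHub Actions, as the posting mentions).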

