Senior Data Platform Engineer - Unit: BTO
FlexSpot®
Lighting the way ahead together.
We are committed 24/7, 365 days a year to ensuring a safe and reliable electricity supply, now and in the future. This is the world of TenneT. As a leading European grid operator, we design, build, maintain, and operate the high-voltage network in the Netherlands and large parts of Germany, playing a key role in the energy transition. With 23,500 kilometers of high-voltage connections on land and at sea and nearly 5,000 employees, we ensure the daily power supply for more than 42 million end users.
TenneT has numerous innovative projects underway now and planned for the coming years, which means we are constantly looking for dedicated professionals. TenneT is expanding rapidly to achieve its ambitions and to take a leading role in driving the energy transition. Finding new talent is crucial to this effort.
We are looking for a motivated:
Senior Data Platform Engineer
Unit:
TenneT’s Business Technology Organization (BTO) is responsible for developing, implementing, and managing IT technologies across the organization. In a dynamic IT environment, BTO plays a crucial role in quickly adapting to changing customer needs and fostering a data-driven culture. The department focuses on standardizing and digitizing processes, promoting innovation, and improving collaboration between different units. Through automation and data analysis, BTO contributes to efficiently and effectively achieving TenneT’s strategic objectives.
Your role:
As a Senior Data Platform Engineer at TenneT, you will have the opportunity to strengthen and inspire the Data Engineering Team! This team plays a crucial role in managing and optimizing data solutions within the organization. Your responsibility lies in the operational implementation of data platforms and pipelines, ensuring efficient, optimized, and reliable data solutions.
You help identify and resolve operational and data quality issues. You are also responsible for designing, implementing, monitoring, and optimizing data platforms that meet business needs, and you ensure the installation and configuration of additional services that support decision-making by the business and by data engineers. You have a hands-on mentality with a strong focus on platform engineering.
Key Responsibilities:
• Design, develop, and implement data solutions hands-on using Azure services.
• Apply your Azure Kubernetes Service (AKS) knowledge.
• Build and maintain CI/CD pipelines for efficient and automated deployment and testing of data engineering processes.
• Use scripting and development skills in Python, Java, and other relevant languages to enhance data processing and manipulation.
• Demonstrate expertise in Azure services, ensuring efficient data storage, ingestion, transformation, and analytics.
• Develop and integrate APIs to enable smooth data communication and interaction with external systems.
• Implement automated workflows and integrations.
• Work with Infrastructure as Code (IaC) to provision and manage cloud infrastructure on Azure.
• Design, build, test, deploy, and maintain applications with proper performance, fault handling, logging, and monitoring.
• Ensure the quality of deliverables by designing and implementing unit and integration tests.
• Troubleshoot, analyze, and resolve issues found in testing as well as those reported by our customers.
• Suggest improvements to our technical solutions and ways of working, and implement them in alignment with your team.
• Participate in team knowledge sharing, design reviews, and technical reviews.
Your Profile:
• You have strong knowledge of and experience with Azure services, including AKS, Azure Data Lake Storage, Azure Data Factory, and Azure Databricks, gained across a minimum of five implementation projects.
• You are proficient in continuous integration and continuous deployment (CI/CD) methodologies and tools.
• You have solid scripting and development skills, with expertise in Python, Java, and SQL.
• You have experience managing cloud infrastructure.
• You are familiar with automated workflows and integrations.
• You understand API development and integration.
• You have excellent problem-solving abilities and take a proactive approach to troubleshooting and optimizing data engineering processes.
• You have strong communication skills and work well in a team environment.
• You have knowledge of big data technologies and frameworks like Hadoop, Spark, and Kafka.
Personal Attributes:
• Proactive: You identify opportunities and seize them immediately.
• Team player: You seek collaboration and inspire colleagues.
• Innovative: You think outside the box and experiment with new technologies.
• Solution-oriented: You independently solve complex problems.
• Eager: You have a strong desire and the enthusiasm to understand the platform and optimize it; you are highly motivated, excited, and keen to take action and learn new things.