Company Overview
My client builds large-scale Big Data platforms for industrial and telemetry use cases. They focus on handling very high data volumes and turning complex, messy data into reliable, scalable solutions. The team is small, highly skilled, and fully remote, with a strong hands-on engineering culture.
Role Overview
My client is looking for a Data Engineer to work on telemetry data platforms using PySpark and Databricks. This role is ideal for someone who enjoys working with large datasets, solving complex data problems, and building robust pipelines in a modern cloud environment.
Key Responsibilities
• Analyze and process large-scale telemetry data using PySpark and Databricks
• Build and optimize data pipelines designed to handle very high data volumes
• Work closely with a small team of experienced data engineers
• Take ownership of tasks and work independently in a remote-first setup
• Share knowledge and collaborate in a pragmatic, engineering-driven culture
Required Experience
• Strong hands-on experience with PySpark and Databricks
• Good understanding of telemetry or event-based data
• Ability to translate complex data challenges into practical solutions
• Motivation to work autonomously in a dynamic environment
Why This Role
• Technically challenging Big Data projects you rarely encounter elsewhere
• Fully remote role with complete flexibility
• High level of responsibility and ownership
• Strong team culture despite being remote
• Opportunity to make a visible impact in a small expert team
Role Details
• Full-time, permanent position
• Fully remote (home office)
• Immediate or short-notice start preferred
• Company training, events, and equipment provided
• German and English required for daily communication
• Candidates are encouraged to share relevant Spark or Big Data project experience
