Data Engineer

Remote work

Safetyheads

We are a Software House with experience in cybersecurity.

We specialize in creating mobile and web applications as well as delivering IoT solutions. We work mainly with the industrial sector, IT companies, and e-commerce.

We also have our own unique product, 5Days: a low-code platform designed as the fastest and most cost-effective tool for building enterprise-class IT solutions.

We are looking for an experienced Data Engineer who will join our friendly team and work on an interesting project in the heating industry.

The Data Team builds and deploys novel analytical algorithms based on data from our IoT products. The person in this position will work across the full stack: from data pipelines and processing to user-facing applications.

 

Your responsibilities:

  • Build and run an Azure-based data platform.
  • Handle multiple terabytes of data from different domains, with a focus on IoT data, using stream and batch processing.
  • Craft and maintain world-class, high-traffic analytical applications and services with a team of experienced data engineers and scientists.
  • Develop data models and process raw data at scale to make it available for analysis.
  • Build scalable data pipelines using state-of-the-art technologies in the Microsoft Azure Cloud (see the sketch after this list).
  • Implement, run and deploy algorithms in cooperation with data scientists, using the Hadoop ecosystem (e.g. Spark, Hive, HDFS) and the Microsoft Azure stack.
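
To give a flavour of the pipeline work listed above, here is a minimal, illustrative pySpark sketch of a batch aggregation over raw IoT readings. It is not part of the project itself; the paths, column names, and the daily-average metric are hypothetical assumptions made only for the example.

    # Minimal, illustrative pySpark batch job (hypothetical paths and columns):
    # read raw IoT readings, aggregate per device and day, write Parquet for analysis.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("iot-batch-aggregation").getOrCreate()

    # Hypothetical raw landing zone, e.g. a mounted Azure Blob Storage / ADLS path.
    raw = spark.read.json("/mnt/raw/iot/readings/")

    daily_avg = (
        raw
        .withColumn("date", F.to_date("timestamp"))           # derive a calendar date
        .groupBy("device_id", "date")                          # one row per device per day
        .agg(F.avg("temperature").alias("avg_temperature"))    # example metric only
    )

    # Hypothetical curated zone consumed by analysts and data scientists.
    daily_avg.write.mode("overwrite").partitionBy("date").parquet("/mnt/curated/iot/daily_avg/")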

 

What we expect:

  • University degree in Computer Science or a related field of study (Mathematics, Physics).
  • At least 5 years of experience in architecting and building Data Lake and Enterprise Analytics Solutions.
  • Ability to optimize big data pipelines and design data lake architectures and data sets.
  • Experience with the design and architecture of relational SQL and NoSQL databases.
  • Advanced hands-on SQL, Python, pySpark and HDFS knowledge and experience working with relational databases for data querying and retrieval.
  • Experience with the design and architecture of Azure big data frameworks (Azure Databricks, Azure Data Factory, Azure ML, SQL Data Warehouse, HDInsight, Stream Analytics, Azure Data Explorer) or their open-source counterparts (Kafka, Spark, blob storage).
  • Knowledge of DevOps processes (including CI/CD) and infrastructure-as-code fundamentals.
  • Knowledge of Data Quality issues and methods of dealing with them.
  • Fluency in English is a must; German will be considered a plus.
  • Nice to have: experience in designing, building and integrating REST APIs.

Why is it worth joining us?

  • Health care
  • Individual training budget
  • Fully remote work
  • Flexible working hours
  • Unique working atmosphere
  • Family-friendly community
  • Knowledge sharing with experienced developers
  • In-house training programs (Angular, iOS, Android, CyberSecurity Leadership)

Recruitment process

HR call

After we receive your CV, you will have a short conversation with HR, during which we will give you an initial overview of SafetyHeads, the project, and the role you are interested in, and gladly hear about your expectations.

Online meeting

We invite you to a 1.5-hour meeting with HR and a technical person, during which we will get to know each other better, talk about your experience and the possibilities we have for you.

Meeting with the customer

The client is also very happy to meet you. If necessary, there will be one or two interviews with them.

Feedback

We will get back to you with the results of the recruitment process regardless of the outcome. Hopefully, it will be good news.

