At a glance
- Tasks: Join our Data Engineering team to automate and enhance our Data Platform.
- Employer: Dynatrace is a leading software company focused on perfecting digital experiences.
- Employee benefits: Enjoy flexible working models, attractive compensation, and a unique career development program.
- Why this job: Make a tangible impact on product innovation while collaborating with a diverse, international team.
- Desired qualifications: 3+ years in DevOps or similar roles, with strong cloud and automation skills required.
- Other information: Relocation support is available for candidates moving to a new country.
The expected salary is between €56,000 and €84,000 per year.
Company Description
Dynatrace exists to make software work perfectly. Our platform combines broad and deep observability and continuous runtime application security with advanced AIOps to provide answers and intelligent automation from data. This enables innovators to modernize and automate cloud operations, deliver software faster and more securely, and ensure flawless digital experiences.
Job Description
Data = information = knowledge = power. Do you want to hold the keys to that power? Are you motivated by solving challenging problems, where creativity is as crucial as your ability to write code, deliver solutions, and bring valuable data sets together to answer business questions?
If this sounds like an environment where you will thrive, come and join our Data Engineering team. Interested? Because we are!
About the role
At Dynatrace, we are all about automation, self-healing, and a NoOps approach. We preach automation wherever possible, and we live by what we preach.
The Data Engineering team we are hiring for provides the data that drives Dynatrace's world-class Application Intelligence platform.
As a DevOps Engineer in Data Engineering, you will help us automate our Data Platform, both by providing the necessary tooling and by designing processes.
You will build tools to automate installation at scale, accelerating time-to-value and improving the reliability of the Data Platform. That includes scripts, but it may also mean integrating with existing mechanisms via APIs or providing ways to reconfigure an already deployed product. You will have a hand in shaping our ETL pipeline and will make sure that deployments of new pipeline builds are automatic, predictable, and transparent. All of this works towards eliminating data downtime and building trust in the data that your fellow Dynatracers, at all levels of seniority, use to build the product our customers love.
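To make that concrete, here is a loose, hypothetical sketch of the kind of "automatic, predictable" rollout the role describes; it is not Dynatrace code. The health endpoint, the Terraform variable pipeline_version, and the version source are all invented for illustration:

```python
#!/usr/bin/env python3
"""Hypothetical sketch of an idempotent rollout: deploy a new pipeline
build only when the running version differs. The endpoint, Terraform
variable, and version source are invented, not Dynatrace specifics."""

import json
import subprocess
import urllib.request

# Placeholder health endpoint that reports the deployed build version.
HEALTH_ENDPOINT = "https://data-platform.example.com/health"


def deployed_version() -> str:
    """Ask the running service which build it is currently serving."""
    with urllib.request.urlopen(HEALTH_ENDPOINT, timeout=10) as resp:
        return json.load(resp)["version"]


def deploy(target_version: str) -> None:
    """Delegate the rollout to Terraform so the deployed state stays declarative."""
    subprocess.run(
        ["terraform", "apply", "-auto-approve",
         f"-var=pipeline_version={target_version}"],
        check=True,
    )


if __name__ == "__main__":
    target = "1.42.0"  # in practice this would come from CI, e.g. a Jenkins parameter
    if deployed_version() != target:
        deploy(target)
        print(f"Deployed {target}")
    else:
        print("Already up to date; nothing to do.")
```

Gating the apply behind a version check keeps a script like this safe to run on every CI trigger; in a real setup the comparison and approval logic would live in Jenkins and Terraform state rather than in a standalone script.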
This is an exciting opportunity to make a direct, tangible impact on our product and work on our crucial Digital Business Platform.
As a member of the Data Engineering team, you will be at the center of Dynatrace product innovation.
In an organization as agile as Dynatrace, you are always encouraged to explore new areas that interest you, move into new positions, and build a career with us.
We guarantee plenty of challenges and scope to grow.
Qualifications
Main responsibilities
- Creating deployment integrations for cloud platforms, primarily AWS and Azure.
- Deployment automation in Jenkins and Terraform.
- Designing and automating processes for ETL data pipeline(s); a minimal sketch of what this can involve follows this list.
- Proactively ensuring the continuous, smooth execution of data-related processes.
- Collaboration in international cross-lab teams (mostly in the same time zone, across Europe) on the delivery of current objectives.
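As a purely illustrative sketch (run_with_retries and notify_oncall are invented names, not Dynatrace tooling), "proactively ensuring smooth execution" usually starts with a small guardrail like this around each pipeline step:

```python
"""Hypothetical guardrail around a single ETL step: retry transient
failures with backoff, and alert only when retries are exhausted."""

import logging
import time

log = logging.getLogger("etl-guard")


def notify_oncall(message: str) -> None:
    # Placeholder: a real implementation would page via PagerDuty, Slack, etc.
    log.error("ALERT: %s", message)


def run_with_retries(step, name: str, attempts: int = 3, backoff_s: float = 30.0):
    """Run one ETL step, retrying before anyone gets paged."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            log.exception("step %s failed (attempt %d/%d)", name, attempt, attempts)
            if attempt == attempts:
                notify_oncall(f"ETL step {name} failed after {attempts} attempts")
                raise
            time.sleep(backoff_s * attempt)  # linearly increasing backoff


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    run_with_retries(lambda: print("extract ok"), name="extract")  # demo step
```

Retrying transient failures before alerting keeps pager noise down while still surfacing real data downtime quickly.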
Priority skills & experience
- 3+ years of professional experience with process automation, preferably as a DevOps engineer, SRE, or sysadmin.
- 3+ years of experience with cloud solutions, preferably AWS, covering configuration, deployment management, and automation.
- Experience with deployment automation and CI/CD pipelines, preferably using Jenkins.
- Good English communication skills.
Desired skills & experience
- Experience with cloud databases, preferably Snowflake.
- Experience administering database services (PostgreSQL, AWS RDS, Aurora, Snowflake) and practical knowledge of SQL.
- Practical knowledge of IaC tools such as CloudFormation and Terraform.
- Mindset focused on monitoring and observability.
Nice-to-haves
- Experience in CI/CD support for MS Power BI development.
- Experience automating data pipelines (ETL/ELT) and supporting Data Engineering and Data Science teams.
- Experience with multiple cloud platforms (AWS, GCP, Azure).
- Good command of scripting languages: Python, shell script, PowerShell.
- Practical knowledge of IaC and configuration-management tools such as Ansible, Chef, Puppet, PowerShell DSC, SaltStack, CloudFormation, or Terraform.
- Java literacy and experience with other programming languages.
- Familiarity with Docker and Kubernetes.
Additional Information
What's in it for you?
- A one-product software company creating real value for the largest enterprises and millions of end customers globally, striving for a world where software works perfectly.
- Working with the latest technologies at the forefront of innovation in tech at scale, but also in other areas like marketing, design, and research.
- Working models that offer you the flexibility you need.
- A team that thinks outside the box, welcomes unconventional ideas, and pushes boundaries.
- An environment that fosters innovation, enables creative collaboration, and allows you to grow.
- A globally unique and tailor-made career development program recognizing your potential, promoting your strengths, and supporting you in achieving your career goals.
- A truly international mindset that is being shaped by the diverse personalities, expertise, and backgrounds of our global team.
- A relocation team that is eager to help you start your journey in a new country and is always there to support you.
- Attractive compensation packages and stock purchase options with numerous benefits and advantages.
Dynatracers come from different countries and cultures all over the world, speaking various languages. English is the one that connects us (55+ nationalities). If you need to relocate for a position you are applying for, we offer you a relocation allowance and support with your visa, work permit, accommodation, language courses, as well as a dedicated buddy program.
Compensation and rewards
- Due to legal reasons, we are obliged to disclose the minimum salary for this position, which is € 56,000 gross per year based on full-time employment. We offer a higher salary in line with qualifications and experience.
Senior DevOps Engineer for Data Engineering team (m/f/x) Employer: Dynatrace
Contact person:
Dynatrace HR Team
StudySmarter application tips 🤫
How to get the job: Senior DevOps Engineer for Data Engineering team (m/f/x)
✨Tip Number 1
Familiarize yourself with the specific tools and technologies mentioned in the job description, such as Jenkins, Terraform, and AWS. Having hands-on experience or projects that showcase your skills with these tools can set you apart from other candidates.
✨Tip Number 2
Highlight any experience you have with automation and CI/CD pipelines. Be prepared to discuss specific challenges you've faced in automating processes and how you overcame them during the interview.
✨Tip Number 3
Since collaboration is key in this role, think of examples where you've successfully worked in cross-functional teams. Being able to communicate your teamwork experiences will demonstrate your fit for the international and collaborative culture at Dynatrace.
✨Tip Number 4
Stay updated on the latest trends in data engineering and cloud solutions. Showing your passion for continuous learning and innovation can resonate well with the company's values and mission.
These skills make you a top candidate for the position: Senior DevOps Engineer for Data Engineering team (m/f/x)
Tips for your application 🫡
Tailor Your CV: Make sure to customize your CV to highlight your experience with process automation, cloud solutions, and CI/CD pipelines. Use keywords from the job description to ensure your application stands out.
Craft a Compelling Cover Letter: In your cover letter, express your passion for automation and data engineering. Mention specific projects or experiences that demonstrate your skills in AWS, Jenkins, and Terraform, and how they align with Dynatrace's goals.
Showcase Relevant Skills: Clearly list your technical skills related to the job, such as scripting languages (Python, Shell), IaC tools (Terraform, CloudFormation), and experience with ETL processes. This will help the hiring team see your fit for the role.
Highlight Collaboration Experience: Since the role involves working in international cross-lab teams, emphasize any previous experience you have in collaborative environments. Discuss how you effectively communicated and worked with diverse teams to achieve common goals.
How to prepare for an interview at Dynatrace
✨Showcase Your Automation Skills
Since the role emphasizes automation, be prepared to discuss your experience with tools like Jenkins and Terraform. Share specific examples of how you've automated processes in previous roles, particularly in cloud environments.
✨Demonstrate Cloud Expertise
Highlight your experience with cloud platforms, especially AWS and Azure. Be ready to explain how you've managed deployments and configurations in these environments, as well as any challenges you've faced and overcome.
✨Discuss ETL Pipeline Experience
The position involves shaping ETL pipelines, so come prepared to talk about your experience with data pipelines. Discuss any relevant projects where you designed or automated ETL processes, focusing on the impact of your work.
✨Emphasize Collaboration Skills
Collaboration in cross-lab teams is key for this role. Share examples of how you've worked effectively in diverse teams, particularly in international settings. Highlight your communication skills and how they contributed to successful project outcomes.