4 Data Scientist Jobs in Salzburg
RESPONSIBILITIES
- 5+ years of experience in Data Science. Previous experience working on Data Science topics in Operations / Supply Chain is a plus.
- Programming: proficient in at least one of R, Python, or SQL, with the ability and willingness to learn the other two.
- Data Literacy: ability to prepare high-quality datasets so that the essence of the data and its implications for the problem at hand can be grasped quickly.
- Technical Literacy: an applied understanding of modern computing that allows the candidate to go beyond the strict definition of Data Science (git, API calls, web crawling, …).
- Statistical Reasoning: an applied understanding of statistics and probability and the ability to use these tools to reduce uncertainty in a business context (regression, visualisation; see the sketch after this list).
- Project Management: able to navigate evolving requirements, prioritise effectively, and scope work in a way that balances impact, effort, and stakeholder needs.
- Presentation Skills: present coherent data stories at the appropriate level of abstraction given the audience.
- Stakeholder Management: skilled at building mutually beneficial connections with functional stakeholders.
- Pragmatic Critical Thinking: intuitively consider relevant costs/benefits in all decisions and act accordingly.
- Outcome Driven: highly motivated to add value and to demonstrate that impact to the organisation.
- Scientific Reasoning / Scoping: ability to define and formulate new questions, in addition to answering given ones.
- Grit: proven capability to see things through to the end even if initial feedback is discouraging.
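To illustrate the statistical-reasoning item above, here is a minimal Python sketch that fits a linear regression to invented weekly demand data and plots the trend. The data, variable names, and numbers are made up for illustration and are not part of the posting.

    # Hypothetical sketch: regression + visualisation on invented demand data.
    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(seed=42)
    weeks = np.arange(1, 53).reshape(-1, 1)                  # 52 weeks of invented history
    demand = 200 + 3.5 * weeks.ravel() + rng.normal(0, 25, size=52)

    model = LinearRegression().fit(weeks, demand)            # fit a simple linear trend
    trend = model.predict(weeks)

    plt.scatter(weeks, demand, label="observed demand")
    plt.plot(weeks, trend, color="red", label="fitted trend")
    plt.xlabel("week")
    plt.ylabel("units")
    plt.legend()
    plt.show()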
|
RESPONSIBILITIES
- 4+ years of experience in Data Science. Previous experience working on Data Science topics in Operations / Supply Chain is a plus.
- Master's or Doctorate in a relevant field, for instance Mathematics, Operations Research, or a science (including data, computer, behavioural, social, econ, ...).
- Programming: know at least one of R, Python, or SQL, with the ability and willingness to learn the other two
- Data Literacy: ability to prepare high-quality datasets so that the essence of the data and its implications for the problem at hand can be grasped quickly
- Statistical Reasoning: theoretical and applied understanding of statistics, probability, and ML algorithms, and the ability to use these tools to reduce uncertainty in a business context (regression, visualization)
- Technical Literacy: an applied understanding of modern computing that allows the candidate to go beyond the strict definition of Data Science (git, API calls, web crawling, …; see the sketch after this list)
- Presentation Skills: present coherent data stories at the appropriate level of abstraction given the audience
- Stakeholder Management: skilled at building mutually beneficial connections with functional stakeholders
- Pragmatic Critical Thinking: intuitively consider relevant costs/benefits in all decisions and act accordingly
- Outcome Driven: highly motivated to add value and to demonstrate that impact to the organization
- Scientific Reasoning / Scoping: ability to define and formulate new questions, in addition to answering given ones
- Grit: proven capability to see things through to the end even if initial feedback is discouraging
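As a rough illustration of the technical-literacy item above (API calls), here is a hedged Python sketch that fetches JSON from a placeholder endpoint and loads it into a DataFrame. The URL, parameters, and column contents are invented; any real integration would depend on the actual service.

    # Hypothetical sketch: call a (made-up) JSON API and inspect the result.
    import requests
    import pandas as pd

    URL = "https://api.example.com/v1/orders"        # placeholder URL, not a real service

    response = requests.get(URL, params={"status": "open"}, timeout=10)
    response.raise_for_status()                      # fail loudly on HTTP errors

    orders = pd.DataFrame(response.json())           # assumes the endpoint returns a JSON array
    print(orders.head())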
|
Full-time
Salzburg, Wien
12.12.2025
We want to move the world
- Planning and coordinating Business Intelligence and Data Engineering projects, including stakeholder management
- Gathering, analysing, and translating business requirements into technical solutions (requirements engineering)
- Managing and prioritising the backlog in Azure DevOps
- Developing and operating efficient data models and Power BI reports
- Continuously optimising ETL processes to improve data quality (see the sketch after this list)
- Proactively identifying and implementing solutions to optimise internal processes and decision-making
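As a rough sketch of the ETL data-quality work described above, the following hedged Python/pandas example loads a raw extract, removes duplicates, and reports missing values. The file and column names are invented placeholders, not part of the posting.

    # Hypothetical ETL quality step with made-up file and column names.
    import pandas as pd

    raw = pd.read_csv("raw_orders.csv")                      # placeholder input extract
    cleaned = raw.drop_duplicates(subset=["order_id"])       # assumed key column

    missing_share = cleaned.isna().mean()                    # share of missing values per column
    print("Missing-value share per column:")
    print(missing_share)

    cleaned.to_csv("cleaned_orders.csv", index=False)        # cleaned output for downstream models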
|
RESPONSIBILITIES
- Minimum of 2-3 years of professional experience as a data professional (e.g. data engineer, data scientist) or in a business/IT consulting position
- Strong and proven knowledge of SQL and cloud database solutions, ideally Snowflake, combined with proficiency in standard data architecture design and modelling.
- Expertise in using Python and Git/GitHub is expected; familiarity with PySpark and Snowpark will be a strong plus.
- Experience in creating interactive web apps (e.g., using Streamlit or RShiny) and knowledge of business intelligence tools like Tableau and Power BI are beneficial (see the sketch after this list).
- Strong communication and presentation skills.
- Team player and collaborative ("copyleft" vs. "copyright").
- Proactive, self-motivated, and able to work on different projects in parallel.
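To illustrate the interactive web-app item above, here is a minimal, hypothetical Streamlit sketch (run with: streamlit run app.py). The data is randomly generated and all names are invented; it only shows the general shape of such an app.

    # Hypothetical minimal Streamlit app with invented data.
    import numpy as np
    import pandas as pd
    import streamlit as st

    st.title("Demo dashboard")

    n_points = st.slider("Number of points", min_value=10, max_value=200, value=50)
    df = pd.DataFrame({"value": np.random.default_rng(0).normal(size=n_points)})

    st.line_chart(df)                                # simple interactive chart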
|