OnlyDataJobs.com

Equinix, Inc.
  • Dallas, TX
Intern: Data Science Analyst
About Equinix
At Equinix, we make the internet work faster, better, and more reliably. We hire talented people who thrive on solving hard problems and give them opportunities to hone new skills, try new approaches, and grow in new directions. Our culture is at the heart of our success, and it's our authentic, humble, gritty people who create The Magic of Equinix. We share a real passion for winning and put the customer at the center of everything we do.
We are looking for bright and enthusiastic college students who love to learn and want to make an impact on the world. Join the Equinix team and shape the future of cloud computing and enterprise connectivity at one of the Fastest Growing Technology Companies in America (Forbes).
Opportunities
The Equinix Internship Program offers wide-ranging opportunities in Information Technology, Engineering, Human Resources, Finance and more. Spend your time gaining practical work experience and learning from some of the sharpest minds in the industry. Work in a culture that thrives on innovation and delivering results, while building solid relationships with industry leaders, and fellow students from around the country.
Projects
  • Advanced Predictive Analytics Platform
    • Build a predictive AI model to proactively identify potential Siebel application performance degradation and availability issues, based on data collected from the different layers of the Siebel Enterprise stack (network, VM, database).
    • Interns will gain hands-on experience with Java and Python.
    • Interns will be involved in the Data Modelling phase, which includes the Data Translation and Data Correlation steps.
Responsibilities/Tasks
  • Assist in identifying the crucial Siebel CRM data points and assigning weights to the feature sets
  • Identify the mapping attributes, then correlate the different data sets based on the identified mapping attributes
  • Work on the POC for the predictive model based on the identified algorithm and experiment with the test data and feature sets
  • Validate the outputs generated by the POC Model and come up with the accuracy rating of the predictive model
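As an illustrative sketch of the POC loop described above (all feature names, weights, and thresholds are invented for illustration, not part of the actual project): score telemetry with hand-assigned feature weights, flag likely degradation, and measure accuracy against labelled outcomes.

```python
# Hypothetical POC sketch: weighted-feature scoring of Siebel telemetry.
# Feature names, weights, and the threshold below are invented examples.

WEIGHTS = {"cpu_util": 0.5, "db_latency_ms": 0.3, "vm_swap_rate": 0.2}

def degradation_score(sample):
    """Weighted sum of normalised metrics; higher means more likely to degrade."""
    return sum(WEIGHTS[k] * sample[k] for k in WEIGHTS)

def predict(sample, threshold=0.6):
    """Flag a sample as likely degradation when its score crosses the threshold."""
    return degradation_score(sample) >= threshold

def accuracy(samples, labels, threshold=0.6):
    """Fraction of samples where the POC model agrees with the observed outcome."""
    hits = sum(predict(s, threshold) == y for s, y in zip(samples, labels))
    return hits / len(samples)

test_data = [
    {"cpu_util": 0.9, "db_latency_ms": 0.8, "vm_swap_rate": 0.7},  # degraded
    {"cpu_util": 0.2, "db_latency_ms": 0.1, "vm_swap_rate": 0.1},  # healthy
    {"cpu_util": 0.7, "db_latency_ms": 0.9, "vm_swap_rate": 0.4},  # degraded
]
labels = [True, False, True]
print(accuracy(test_data, labels))  # 1.0 on this toy set
```

In a real POC the weights and threshold would be learned from historical data rather than hand-assigned, and accuracy would be measured on a held-out test set.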
Qualifications
  • Rising senior undergraduate student
  • Intuitive understanding of machine learning algorithms (supervised and unsupervised modeling techniques)
  • Experience with machine learning tools and libraries
  • Intuition about algorithms, system performance, and throughput
  • Hands-on experience with Linux and with mining structured, semi-structured, and unstructured data
  • Java or Python scripting background
  • Architecture and system/pipeline layout experience
  • Deep learning
  • Attention to detail, data accuracy, and quality of output
  • Strong interpersonal, written, and verbal communication skills
  • Ability to effectively function in a fast-paced environment
  • Good to have:
    • Experience with Apache Hadoop, Spark, SOLR/Lucene, Cassandra and related technologies
    • Working knowledge of SQL
    • Multimodal learning applications
Kayzen
  • Berlin, Germany

Role : Senior Data Analyst

Analytics & Insights Team


You will be one of the founding members of our analytics & insights team. This gives you the opportunity to work with an extremely high degree of freedom and to shape the team's focus and roadmap. You will have ample support from our data engineering team to access the ~10 TB of data we store every single day across our various data centres. Your job is to build analytics and product features on top of that data and derive valuable insights for our business teams and customers.


Responsibilities:



  • Analyse several Terabytes of structured data generated per day and identify patterns to derive business recommendations

  • Formulate questions which help us improve business KPIs and lead your data analysis to find profound answers to those questions

  • Lead data driven business investigations aiming to improve the performance of our systems through better understanding of the massive amounts of data we have, e.g. to reduce fraud, improve conversion rates, allow for more precise targeting of users and more

  • Work closely with our client success team to understand our clients' needs and develop ideas on how to solve them with smart data analytics

  • Provide inputs and recommendations for our data science models and work closely with data scientists to enhance precision of our prediction models

  • Work closely with engineering and product and contribute to all phases of the Product Development Lifecycle
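To make the first responsibility concrete, a minimal sketch of the kind of KPI investigation described above (field names and the 2% benchmark are invented; in practice this would run as SQL over the production store): aggregate raw ad events into per-campaign conversion rates and flag underperformers.

```python
# Illustrative KPI analysis: per-campaign conversion rates from raw events.
# Campaign names, counts, and the 2% review threshold are invented examples.
from collections import defaultdict

events = [
    {"campaign": "A", "impressions": 10_000, "conversions": 300},
    {"campaign": "A", "impressions": 5_000, "conversions": 100},
    {"campaign": "B", "impressions": 20_000, "conversions": 150},
]

totals = defaultdict(lambda: {"impressions": 0, "conversions": 0})
for e in events:
    totals[e["campaign"]]["impressions"] += e["impressions"]
    totals[e["campaign"]]["conversions"] += e["conversions"]

# Conversion rate per campaign; campaigns below 2% are candidates for review.
rates = {c: t["conversions"] / t["impressions"] for c, t in totals.items()}
flagged = [c for c, r in rates.items() if r < 0.02]
print(flagged)  # ['B']
```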


Requirements:



  • Expertise in data modelling, access, processing and data visualization

  • Strong analytical skills and experience with Business Intelligence tools and languages (SQL, Python, R, Tableau or other data visualization tools)

  • Comfortable with Big data technologies (Hadoop, MapReduce, Hive, Pig etc)

  • A willingness to dive deep, experiment rapidly and get things done

  • High degree of pragmatism and business-outcome-driven data analysis

  • Attention to detail, data accuracy, and quality of output

  • Being passionate about creating great products and experiences for our customers

  • High levels of creativity and quick problem-solving capabilities, a self-starter with goal-driven focus

  • Previous Ad-tech industry experience preferred


What do we offer?



  • An opportunity to work with a highly motivated team of adtech veterans who are aiming for nothing less than changing our industry with better software

  • This role is an ideal entry point if you are considering building a career as a data scientist in the future, as you’ll operate at the intersection of data analysis and data science

  • You get valuable insights into mobile marketing, entrepreneurship and have a high impact on shaping the expansion and success of Kayzen across the globe

  • You get an opportunity to be responsible for and manage your own projects

  • You work directly with the founders where you can have a massive impact on the organization

  • You experience an excellent learning culture

  • You are part of a fun, international team of outstanding talents

  • You enjoy a competitive remuneration package and much more


Interested? Write to us at talent@kayzen.io

Apple
  • London, UK

Summary


Have you ever imagined what you could do here? At Apple, new ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. The Apple Media Products Data Engineering team is looking for great engineers. We are looking for the very best people to build and continually improve the features and services driving the iTunes Store, App Store and Apple Music. Our team is responsible for many of the key systems powering the personalisation features of the AMP ecosystem, including ratings & reviews, purchase history, sync services and many more. Here you have a phenomenal opportunity to help build and evolve global-scale, leading-edge dynamic data systems, with positions currently available as we grow our amazing London team.


Key Qualifications



  • Significant experience in crafting, implementing and supporting highly scalable systems and services in Java

  • Bachelor's degree or equivalent in Computer Science or a related discipline

  • Experience in two or more Big Data areas is helpful — see Additional Requirements below for examples


Description


Our work covers the full stack: internet-facing web services; internal services using various flavours of RPC; design and implementation of data pipelines/life-cycles (Kafka); Hadoop infrastructure, strategy and implementation; distributed key-value storage (Voldemort, Cassandra, Redis, etc.); and putting all this together to operate live customer-facing features with millisecond latencies across multiple data centres, with petabyte datasets and > 2 billion users. We promote innovation and new technology to further improve our creative output. If you're an all-round, performance-savvy Java server engineer with an interest in, and experience of, large-scale data technologies and systems at unprecedented scale, we'd love to hear from you.


Education & Experience


Bachelor's/Master's degree in Computer Science or a related discipline


Additional Requirements



  • Experience building and/or using distributed systems, distributed caching, distributed key-value or column stores (e.g. Cassandra, Voldemort, Redis)

  • A deep understanding of eventual consistency concepts

  • Experience with and understanding of Hadoop-ecosystem technologies such as MapReduce, Spark, YARN/MR2, etc

  • Experience in building and running best in class large scale data pipelines, using Kafka, with data ingest to/from multiple sources feeding batch compute components via HDFS and near-realtime components via online key-value storage

  • Experience and interest in data modelling and data architecture as optimised for large data patterns (warehousing concepts; efficient storage and query on HDFS; support for relevant realtime query patterns in key-value stores; columnar schema design; etc.)

  • Deep understanding of real-time advanced analytics fundamentals and associated stream-processing tools and techniques is a plus

  • Passion for customer-satisfaction ethic and focus on customer privacy

  • Experience with Scala would be an advantage
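The pipeline pattern in the requirements above (Kafka ingest fanning out to batch compute via HDFS and near-realtime serving via key-value storage) can be sketched as a toy simulation. Kafka, HDFS, and the key-value store are stand-ins implemented in memory purely for illustration; the event shape is invented.

```python
# Toy fan-out: one event stream feeds both a batch sink (append-only log,
# standing in for files landing on HDFS) and a near-realtime key-value view
# (standing in for an online key-value store serving the latest state).

events = [
    {"user": "u1", "item": "song-9", "action": "purchase"},
    {"user": "u2", "item": "app-4", "action": "review"},
    {"user": "u1", "item": "song-7", "action": "purchase"},
]

batch_log = []   # full history, retained for batch compute
kv_store = {}    # last-write-wins serving view, keyed by user

for event in events:
    batch_log.append(event)           # every event kept for batch jobs
    kv_store[event["user"]] = event   # online view holds latest activity only

print(len(batch_log))          # 3 -- every event retained for batch
print(kv_store["u1"]["item"])  # 'song-7' -- latest activity served online
```

The same duality (complete history for batch, latest state for low-latency reads) is what the real Kafka/HDFS/key-value combination provides at scale.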


This position offers competitive salary and benefits.

Hays
  • Toronto, ON, Canada


Major Bank looking for a Big Data Engineer to work out of their Downtown Toronto office for 6 months + extension

Big Data Engineer

Client: HSBC
Role: Data Engineer
Duration: 6 months, plus likely extension
Rate: Open *depending on experience
Location: Toronto, ON

Our client, a globally recognized bank, is looking to hire a Data Engineer for a minimum of 6 months, based in Toronto, to join their team.

Your new company
A leading bank with multiple offices across Canada and throughout the world is looking for a Big Data Engineer for a 6-month contract in their Toronto office. They have an excellent reputation within their sector and are known as a market leader.

Your new role
You will be working as a Big Data Engineer as part of the core big data technology and design team. You will be entrusted to develop solutions and design ideas that enable the software to meet the acceptance and success criteria. You will work with architects and business analysts to build data components in the Big Data environment.

What you'll need to succeed
* 8+ years professional software development experience and at least 4+ years within Big data environment
* 4+ years of programming experience in Java, Scala, and Spark.
* Proficient in SQL and relational database design.
* Agile and DevOps experience - at least 2+ years
* Project planning.
* Must have excellent communication skills + have strong team-working skills
* Experienced in Java, Scala and/or Python, Unix/Linux environment on-premises and in the cloud
* Experienced in construction of robust batch and real-time data processing solutions on Hadoop
* Java development and design using Java 1.7/1.8.
* Experience with most of the following technologies (Apache Hadoop, Scala, Apache Spark, Spark streaming, YARN, Kafka, Hive, HBase, Presto, Python, ETL frameworks, MapReduce, SQL, RESTful services).
* Sound knowledge on working Unix/Linux Platform
* Hands-on experience building data pipelines using Hadoop components Sqoop, Hive, Pig, Spark, Spark SQL.
* Must have experience with developing HiveQL and UDFs for analysing semi-structured/structured datasets
* Experience with time-series/analytics databases such as Elasticsearch or a NoSQL database.
* Experience with industry standard version control tools (Git, GitHub), automated deployment tools (Ansible & Jenkins) and requirement management in JIRA
* Exposure to Agile Project methodology but also with exposure to other methodologies (such as Kanban)
* Understanding of data modelling techniques using relational and non-relational techniques
* Coordination between global teams
* Experience debugging code issues and communicating the highlighted differences to the development team/architects
* Nice to have: ELK experience. Knowledge of cloud computing technology such as Google Cloud Platform (GCP)
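As a small illustration of the semi-structured analysis a Hive UDF might perform (done here in plain Python; the record layout is invented): parse JSON event rows where fields may be missing, extract a nested value, and aggregate.

```python
# UDF-style extraction over semi-structured rows: tolerate missing fields,
# pull a nested value, aggregate. Record layout and values are invented.
import json

rows = [
    '{"account": "a1", "txn": {"amount": 120.0, "ccy": "CAD"}}',
    '{"account": "a2", "txn": {"amount": 80.5}}',
    '{"account": "a1"}',  # partial record: no txn block at all
]

def extract_amount(row):
    """UDF-style extractor: returns the txn amount, or None when absent."""
    record = json.loads(row)
    return record.get("txn", {}).get("amount")

amounts = [a for a in map(extract_amount, rows) if a is not None]
print(sum(amounts))  # 200.5
```

In Hive the equivalent would be a `get_json_object`/UDF call inside a HiveQL aggregate, but the null-tolerant extraction logic is the same.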
What you'll get in return


The client is offering a 6 month engagement, with a high likelihood of extension and a very competitive rate for the contract.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.
PrimeRevenue
  • Atlanta, GA

ARE YOU READY TO WORK AT PRIMEREVENUE?

Do you want to work for a high-growth FinTech company helping other companies innovate, grow and create jobs? Do you enjoy working within an entrepreneurial environment that is mission-driven, results-driven and community oriented? We're looking for a Director of Data Architecture to continue the impressive development of our data enterprise, exposing our customers to a wealth of insights and predictive analytics. The Director will be responsible for the design, development, and execution of data product initiatives. This individual will be part of a multi-disciplinary team including data architects, BI developers, technical architects, data scientists, engineering, and operations teams for data products.

WHAT YOU GET TO DO:

    • Design, create, deploy and manage our organization's data architecture
    • Develop and own our Data Product Roadmap
    • Map the systems and interfaces used to manage data, set standards for data management, analyze current state and conceive desired future state, and conceive projects needed to close the gap between current state and future goals
    • Provide a standard common business vocabulary, express strategic data requirements, outline high level integrated designs to meet these requirements, and align with enterprise strategy and related business architecture
    • Set data architecture principles, create models of data that enable the implementation of the intended business architecture
    • Create diagrams showing key data entities, and create an inventory of the data needed to implement the architecture vision
    • Drive all phases of data modelling, from conceptualization to database optimization, including SQL development and any database administration
    • Design ETL architecture
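The ETL responsibilities above can be sketched minimally (the schema and source format are invented for illustration): extract raw rows, conform them to a target model, and load them into a relational store.

```python
# Minimal ETL sketch: extract -> transform (type and conform) -> load.
# Table name, columns, and sample rows are invented examples.
import sqlite3

raw = [("ACME Corp", "2024-01-15", "1200.50"), ("Beta LLC", "2024-01-16", "300")]

def transform(row):
    """Conform a raw row to the target model: typed amount, upper-cased name."""
    name, day, amount = row
    return (name.upper(), day, float(amount))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (customer TEXT, day TEXT, amount REAL)")
conn.executemany("INSERT INTO invoices VALUES (?, ?, ?)", map(transform, raw))

total = conn.execute("SELECT SUM(amount) FROM invoices").fetchone()[0]
print(total)  # 1500.5
```

A production architecture adds staging tables, rejection handling, and incremental loads, but the extract/transform/load separation is the same.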



WHAT ARE WE LOOKING FOR?

  • Bachelor's degree in Computer Science or related discipline
  • Minimum 10 years working in data products organization
  • Knowledge of relational and dimensional data modelling
  • Knowledge of RDBMS solution (DB2, Oracle)
  • Excellent SQL skills
  • Experienced in Agile methodologies
  • Deep understanding of Data Management principles
  • Strong oral and written communication skills
  • Strong leadership skills
  • BI tools (Tableau, MicroStrategy)
  • Experience architecting enterprise data lakes in AWS
  • Hands-on experience with AWS and Hadoop-ecosystem frameworks/tools such as Kinesis, Glue, Redshift, Spark, Hive, Pig, etc.
  • Experience with distributions in Amazon EMR
  • Previous work with relational and NoSQL databases such as PostgreSQL and MongoDB


WHO ARE YOU?

SMART, HUNGRY, & HUMBLE PERSONALITY IS A MUST!


WORKING AT PRIMEREVENUE BENEFITS:

    • Professional growth within our company
    • Monthly fun TEAM events
    • Generous benefits package
    • Community Service-Oriented Culture
SafetyCulture
  • Surry Hills, Australia
  • Salary: A$120k - 140k

The Role



  • Be an integral member of the team responsible for designing, implementing and maintaining a distributed, big-data-capable system with high-quality components (Kafka, EMR + Spark, Akka, etc.).

  • Embrace the challenge of dealing with big data on a daily basis (Kafka, RDS, Redshift, S3, Athena, Hadoop/HBase), perform data ETL, and build tools for proper data ingestion from multiple data sources.

  • Collaborate closely with data infrastructure engineers and data analysts across different teams to find bottlenecks and solve problems

  • Design, implement and maintain the heterogeneous data processing platform to automate the execution and management of data-related jobs and pipelines

  • Implement automated data workflows in collaboration with data analysts, and continue to maintain and improve the system in line with growth

  • Collaborate with Software Engineers on application events, ensuring the right data can be extracted

  • Contribute to resources management for computation and capacity planning

  • Diving deep into code and constantly innovating
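The "automate the execution and management of data-related jobs and pipelines" responsibility above boils down to running jobs in dependency order. A toy sketch (job names and dependencies are invented; a real platform would add scheduling, retries, and monitoring):

```python
# Toy pipeline orchestration: run data jobs in dependency order so each
# step sees its upstream outputs. Jobs and edges are invented examples.
from graphlib import TopologicalSorter

# Each job lists the jobs it depends on (upstream must run first).
pipeline = {
    "ingest": set(),
    "clean": {"ingest"},
    "aggregate": {"clean"},
    "export": {"aggregate"},
}

order = list(TopologicalSorter(pipeline).static_order())
print(order)  # ['ingest', 'clean', 'aggregate', 'export']
```

This dependency-ordered execution is the core idea behind workflow engines such as Airflow, here reduced to the standard library's topological sort.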


Requirements



  • Experience with AWS data technologies (EC2, EMR, S3, Redshift, ECS, Data Pipeline, etc) and infrastructure.

  • Working knowledge in big data frameworks such as Apache Spark, Kafka, Zookeeper, Hadoop, Flink, Storm, etc

  • Rich experience with Linux and database systems

  • Experience with relational and NoSQL databases, query optimization, and data modelling

  • Familiar with one or more of the following: Scala/Java, SQL, Python, Shell, Golang, R, etc

  • Experience with container technologies (Docker, k8s), Agile development, DevOps and CI tools.

  • Excellent problem-solving skills

  • Excellent verbal and written communication skills 

Burger King Corporation
  • Miami, FL

Position Overview:

This person will be key in the structuring of our new Guest Insights and Intelligence area within BK North America. Burger King's marketing analytics has evolved substantially over the past several years, to a point where we have a very detailed understanding of our sales on a product or ticket level. Nevertheless, our understanding of who is buying our products and offers is still very limited. With an increasingly competitive market, our objective is to create a new area whose core focus is understanding our guests, which ultimately will drive our strategy across several different initiatives, including calendar, pricing, innovation, and advertising/communication. With more data available than ever coming from our mobile app, kiosks, POS, credit cards, and external data sources, we're looking for a data scientist with strong business judgment who will be able to help us structure and develop this area, effectively transforming how we look at marketing analytics at BK and eventually RBI.


Responsibilities & Qualifications:


    • 3-5 years of professional experience; master's degree a plus
    • Expertise in data modelling, visualization, and databases
    • Proficiency with statistical packages (e.g. SAS and R), database languages (e.g. SQL Server, MySQL, Oracle Express), and media measurement tools (e.g. DoubleClick, Omniture, Google Analytics)
    • Database design and implementation
    • Machine learning
    • Time series and forecasting
    • Data mining
    • Linear and logistic regression
    • Decision trees
    • Segmentation analysis
    • Clustering techniques
    • Marketing mix models
    • Data visualization techniques
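As a toy illustration of the segmentation/clustering techniques listed above (spend values and the two-segment choice are invented): a tiny one-dimensional k-means that splits guests into two spend segments.

```python
# Toy guest segmentation: 1-D two-means clustering on per-guest ticket spend.
# All values are invented; real segmentation would use many features.

def kmeans_1d(values, iters=10):
    """Two-cluster 1-D k-means: assign points to the nearer centroid,
    recompute centroids as cluster means, repeat."""
    lo, hi = min(values), max(values)  # initial centroids at the extremes
    for _ in range(iters):
        a = [v for v in values if abs(v - lo) <= abs(v - hi)]
        b = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo, hi = sum(a) / len(a), sum(b) / len(b)
    return lo, hi

ticket_totals = [4.5, 5.0, 6.2, 18.0, 21.5, 19.9]  # per-guest ticket spend
low, high = kmeans_1d(ticket_totals)
print(round(low, 2), round(high, 2))  # 5.23 19.8
```

The two centroids separate "value" and "premium" guest segments; the same idea generalises to multi-dimensional guest features via standard k-means.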


  • Market research and competitive analysis aimed at driving growth
  • Comprehensive data analysis and manipulation to help identify trends
  • Strong interpersonal and communication skills

Restaurant Brands International US Services LLC (RBI) is an equal opportunity employer and gives consideration for employment to qualified applicants without regard to race, color, religion, sex, national origin, disability, or protected veteran status.

ING
  • Amsterdam, Netherlands
For the DataGen Finance squad we are looking for a Senior Data Engineer  


… to build with us the Strategic Data Exchange between Lending core systems and surrounding systems. The DataGen squad is part of WB Tribe Lending. The primary focus of the squad is on the processes concerning data delivery to Finance, to the Wholesale Bank Data Lake and data delivery on Regulations.



Data is becoming more important every day. Your contribution to the Strategic Data Integration work will be critical to realise our envisioned Lending data platform, with high-quality and timely data availability, moving from batch to real-time. This will enable excellent consumption possibilities to meet our ever-increasing client and regulatory demands on data.



We need your help in designing and building this new exchange and building bridges towards other squads in order to realise end-to-end delivery across Lending- and other Tribes. We are a group of individuals who value Agile, self-organization and craftsmanship. We are driven professionals who enjoy shaping the future of this place.




Needed skills & experience


We are looking for someone with an easy-to-work-with, mature and no-nonsense mentality. Someone who is an open and honest communicator, who values working as part of a team, who is willing and able to coach or train other developers and who is aware of developments and trends in the industry and corporate ecosystem.



On the more technical side you must have 9+ years of relevant experience in data engineering and especially must have experience in the following fields:



  • Agile / Scrum.

  • Track record in building larger corporate systems.

  • Oracle Data Integrator 12c.

  • Oracle RDBMS 11g or higher.

  • Oracle SQL 11g or higher.

  • Data modelling.

  • Linux (bash) scripting capabilities.

  • Java backend development.



Next to these must haves, we would like you to have knowledge of the following:



  • Kafka, preferably the Confluent platform.

  • Kafka Sql (KSQL) and the Kafka Streaming API.

  • CI / CD tooling: Maven, Jenkins, Sonar, Git, Artifactory, Ansible.

  • Database Change Data Capture.

  • Visualisation with Grafana, Elastic, Kibana.

  • Experience in a complex, corporate environment.

  • Experience in Lending, Financial systems.

  • Issue trackers like JIRA, ServiceNow.

  • Collaboration tooling like Confluence.
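The Database Change Data Capture item above can be illustrated with a minimal snapshot-diff sketch (table contents are invented; real CDC reads the database log rather than comparing snapshots): diff two keyed snapshots and emit the insert/update/delete events a CDC feed onto Kafka would carry.

```python
# Minimal CDC-style diff: compare two snapshots of a table keyed by primary
# key and emit (operation, key) change events. Contents are invented.

def capture_changes(before, after):
    """Compare keyed snapshots and return (operation, key) change events."""
    events = []
    for key in after:
        if key not in before:
            events.append(("insert", key))
        elif before[key] != after[key]:
            events.append(("update", key))
    events += [("delete", key) for key in before if key not in after]
    return events

before = {1: {"status": "open"}, 2: {"status": "closed"}}
after = {1: {"status": "settled"}, 3: {"status": "open"}}
print(capture_changes(before, after))
# [('update', 1), ('insert', 3), ('delete', 2)]
```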



What we offer to you



  • Work on something that has great significance to the bank.

  • Being part of the squad shaping the future way of development.

  • An enthusiastic team in an informal, dynamic environment.

Expedia, Inc.
  • Chicago, IL

Are you fascinated by data and building robust data pipelines which process massive amounts of data at scale and speed to provide crucial insights to the end customer?  Are you passionate about making sure customers have the information they need to get the most out of every product they use? Are you ready to help people go places?  This is exactly what we, the Lodging Data Tech (LDT) group in Expedia, do. Our mission is “transforming Expedia’s lodging data assets into Data Products that deliver intelligence and real-time insights for our customers”. We work on creating data assets and products to support a variety of applications which are used by 1000+ market managers, analysts, and external hotel partners.


Our work spans across a wide range of data-sets like lodging booking, clickstream, and web scrape data, across a diverse technology stack ranging from Teradata and MS SQL-server to Hadoop, Spark, Qubole and AWS. We are looking for passionate, creative and innately curious data engineers to join a new team in Chicago to build a unified data service which would power the data needs of all partner facing applications in the lodging line of business.


As a Software Dev Engineer II you are involved in all aspects of software development, including participating in technical designs, implementation, functional analysis, and release for mid-to-large sized projects.


What you’ll do with us:
You will develop, design, debug, and modify components of software applications and tools.
Understand business requirements; perform source to target data mapping, design and implement ETL workflows and data pipelines on the Cloud using Big Data frameworks and/or RDBMS/ETL tools
Support and solve data and/or system issues as needed
Prototype creative solutions quickly by developing minimum viable products and work with seniors and peers in crafting and implementing the technical vision of the team
Communicate and work effectively with geographically distributed multi-functional teams
Participate in code reviews to assess overall code quality and flexibility
Resolve problems and roadblocks as they occur with peers and unblock junior members of the group. Follow through on details and drive issues to closure
Define, develop and maintain artifacts like technical design or partner documentation
Drive for continuous improvement in software and development process within an agile development team
Participate in user story creation in collaboration with the team
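The source-to-target data mapping step mentioned above can be sketched declaratively (field names and conversions are invented examples): a mapping table drives renaming and type conversion of source fields into the target schema.

```python
# Declarative source-to-target mapping: target_column -> (source_column,
# conversion function). All field names here are invented examples.

MAPPING = {
    "hotel_id": ("property_id", int),
    "nightly_rate": ("rate_usd", float),
    "city": ("city_name", str.strip),
}

def map_row(source_row):
    """Apply the mapping: rename each source field and convert its type."""
    return {tgt: fn(source_row[src]) for tgt, (src, fn) in MAPPING.items()}

source = {"property_id": "812", "rate_usd": "149.99", "city_name": " Chicago "}
print(map_row(source))
# {'hotel_id': 812, 'nightly_rate': 149.99, 'city': 'Chicago'}
```

Keeping the mapping as data (rather than hard-coded transforms) is what lets the same ETL workflow be reused across sources.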


Who you are:
Bachelor's or Master's degree in computer science or a related major, and/or equivalent work experience
Experience using code versioning tools, e.g. Git
Experience in Agile/Scrum software development practices
Effective verbal and written communication skills with the ability to present complex technical information clearly and concisely
3-7+ years’ experience in Software Engineering specifically in databases, Big Data or Data-warehouse projects.
Proficient knowledge in SQL, database development (T-SQL/PL-SQL) and some experience with data modelling.
Experience working on Big Data framework like Hadoop or Spark.
Experience with any one MPP database system like Teradata, Redshift, DB2, Azure Datawarehouse or Greenplum.
Proficient in at least one programming language like Python/Java/Scala/ on a Unix/Linux environment.
Knowledge or experience working with AWS and AWS services like Redshift, EMR, AWS Lambda a plus.
Prior experience working with NoSQL stores (Hbase, ElasticSearch, Cassandra, Mongodb) a plus.
Familiarity with the e-commerce or travel industry.


Why join us:


Expedia Group recognizes our success is dependent on the success of our people. We are the world's travel platform, made up of the most knowledgeable, passionate, and creative people in our business. Our brands recognize the power of travel to break down barriers and make people's lives better – that responsibility inspires us to be the place where exceptional people want to do their best work, and to provide them the tools to do so.
Whether you're applying to work in engineering or customer support, marketing or lodging supply, at Expedia Group we act as one team, working towards a common goal; to bring the world within reach. We relentlessly strive for better, but not at the cost of the customer. We act with humility and optimism, respecting ideas big and small. We value diversity and voices of all volumes. We are a global organization but keep our feet on the ground so we can act fast and stay simple. Our teams also have the chance to give back on a local level and make a difference through our corporate social responsibility program, Expedia Cares.


Our family of travel brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Egencia®, trivago®, HomeAway®, Orbitz®, Travelocity®, Wotif®, lastminute.com.au®, ebookers®, CheapTickets®, Hotwire®, Classic Vacations®, Expedia® Media Solutions, CarRentals.com™, Expedia Local Expert®, Expedia® CruiseShipCenters®, SilverRail Technologies, Inc., ALICE and Traveldoo®.


Expedia is committed to creating an inclusive work environment with a diverse workforce.   All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.  This employer participates in E-Verify. The employer will provide the Social Security Administration (SSA) and, if necessary, the Department of Homeland Security (DHS) with information from each new employee's I-9 to confirm work authorization.

Hilltop Holdings
  • Dallas, TX

In the role of Sr. Data Integration Architect you will be the lead data engineer responsible for the development and support of an advanced data integration strategy, architecture and framework at the enterprise level. You will establish the management processes to create and govern the integration layer (ETL/ELT/PO/Data Hubs) between mission-critical analytical systems and a variety of applications, including managed services, on-premise and cloud-based applications. You should bring particular domain expertise as an architect, with experience in designing, prototyping, creating, and modifying data integration interfaces using a variety of tools. You will also have strong knowledge of enterprise analytics, big data platforms, data exchange models, CRM, MDM, and cloud integration. As the Integration Architect you must have a strong hands-on development track record of building integrations utilizing a variety of integration products, tools, protocols, technologies and patterns. You will contribute to the development of roadmaps and migration plans for the integration of business systems to improve performance and reduce business costs.

Design and delivery experience with integration technologies such as Talend, Data Factory/SSIS, Data Integrator, Informatica/Cloud, SnapLogic, Boomi or Mulesoft is required. Database experience with MS SQL, Oracle or Hadoop is also required.


 Responsibilities:


  • Evaluate and implement an enterprise Data Integration strategy for and between Hilltop Holdings and its subsidiary LOBs
  • Define and direct the implementation of standard processes and procedures for managing data integration and data exchange models between systems on premise and cloud systems including security, traceability, audit, performance and risk
  • Guide the software selection and implementation of an enterprise data integration platform/tool(s)
  • Work closely with the data governance team in the maintenance of a metadata, data glossary and data catalog process
  • Lead the data integration aspect of mission critical projects for the enterprise, including hands on delivery
  • Assist developers, analysts, and designers in conceptualizing and validating solutions that meet business requirements
  • Serve as the authoritative thought-leader for data integration across the enterprise
  • Define effective hybrid models for integration between legacy and cloud applications (including SalesForce, Oracle Cloud Financials and ServiceNow) for all integration methods (ETL/ELT/PO, data hub)


Qualifications:


  • Bachelor's degree in Business or an IT-related field required
  • 8+ years of tools-based data integration, analytics, data governance and master data experience
  • 4 years of architecture and data integration design and leadership experience (large complex projects)
  • 2+ years of hands-on experience in implementing data lakes with cloud technologies such as S3, ADW, Hadoop or Spark
  • Exposure to all phases of software development lifecycle and requirements gathering within a variety of analytical and integration technologies and applications
  • Prior experience in defining and building an EDW and modeling/implementing Master Data subject areas are required
  • Data modelling experience; specifically designing logical models/data dictionaries from business requirements
  • Experienced technology leader with extensive Data Quality, Data Management, Data Security, Data Governance skills
  • Must have strong focus on internal and external customer requirements
  • Excellent collaboration, communication, and negotiation skills to effectively serve as a leader and evangelist across the enterprise
  • Demonstrate advanced understanding of complex business processes across the enterprise
  • Must have excellent written, verbal, and interpersonal communication skills
  • Must be comfortable making formal presentations to senior IT management and executive level clients
  • Ability to coach and mentor others to provide leadership to analyze and resolve multiple complex problems at the enterprise level
  • Ability to direct, coach and mentor the technical and managerial development of other engineers
  • Ability to partner with vendors, external contacts and industry standards organizations to deliver technical solutions
  • Must have appropriate skills with the standard workstation software used to perform daily duties, e.g., Word, Excel, and Visio
The William Carter Company
  • Atlanta, GA

Job Description

Enable Advanced Analytics Capability (40%)

    • Lead transformation of Carter's BI team from its current focus on operational reporting to future capabilities around advanced analytics and business self-service with ad-hoc analysis
    • Define strategies for BI/Analytics to achieve transformative state for advanced analytics including data architecture, data platform, front-end tools, visualization, and analytics areas
    • Implement the defined strategies to enable the business to solve complex issues through the use of data
    • Ensure all areas of the business are in scope for this new Analytics capability

Project and Initiative Delivery (40%)

    • Manage the overall portfolio of BI/Analytics projects and drive new analytics capabilities around Retail, Marketing, Supply Chain, HR, Finance, and other functional areas of the business in order to deliver new insights enabling better decision making
    • Manage all design and development of BI/Analytics solutions
    • Serve as lead BI/Analytics architect; align closely with cross-functional business partners and provide senior technical knowledge and design & development leadership for the BI/Analytics team
    • Deliver all BI/Analytics solutions on time and on budget

Service Delivery Improvements (10%)

    • Support incident resolution as needed with a focus on root cause analysis
    • Lead teams to implement permanent and repeatable solutions for incidents and problems

Administrative, Legal, SOX Compliance (10%)

    • Develop task plan for design and development activities for assigned and future projects
    • Adhere to all control and compliance regulations

Required Experience

    • Must have a strong understanding of core BI toolsets to include ETL, front-end and data modelling
    • Must have a proven track record of partnering with business in delivering quality business solutions
    • 10+ years of experience architecting and delivering BI solutions strongly preferred
    • 5+ years of experience in the design, development, and testing of common Business Intelligence architectures required (in a retail organization preferred)
    • 6+ years hands on experience with standard ETL and front-end BI toolsets preferred
    • 3+ years analytical programming experience including predictive analytics and forecasting preferred
    • Must be proficient at SQL
    • Experience with Programming languages like R, Python, JavaScript, other required
    • Experience with Data visualization tools like Microstrategy, Tableau, PowerBI, other required
    • Experience working in VLDB (Very Large Database) environments is a plus
    • 1+ years' experience working with cloud data store environments such as AWS, Google or Azure required
    • Bachelor's and/or Master's degree in Computer Science, Information Management, Information Technology or Engineering, or equivalent combination of education and experience is required
    • Experience with Netezza is a plus
    • Experience with big data architectures is a plus
    • Broad knowledge in Marketing (marketing analytics a point of emphasis), Retail, Supply Chain, and HR/Finance required
    • Must have proven ability to successfully influence others and drive change in a large, complex, global organization
    • Must have ability to establish credibility and build strong working relationships with key business stakeholders at all levels
    • Must have excellent communication skills (verbal, written, presentation); proven experience articulating a concept and associated business value that can be derived from the concept
    • Must have proven ability to lead leaders and technical professionals
    • Proven BI/Analytics experience with Retail and Wholesale models strongly preferred
    •  Must be able to drive strategic direction around architecture and future state

SocialCops
  • New Delhi, India
  • Salary: ₹180k - 300k

Our Alternative Data Team builds solutions to some of the world's most critical problems. From satellite data to government reports, from structured internal data to unstructured external data, and from online PDFs to paper surveys, the data sources we use are broad and varied. What makes us different, however, is that we don't just sell data. Instead, we sell insights. We integrate 200+ global data sources across different sectors - agriculture, demography, infrastructure and consumer affluence - for targeted, granular insights.

As a Data Science Intern at SocialCops, you will get to deal with diverse data, ranging from satellite data to the sales data of big companies. You will be responsible for data modelling, cleaning, structuring, and handling datasets under the mentorship of our data scientists and economists. You will play a key role in converting a variety of messy datasets into clean, structured datasets by creating quality metadata files and running scalable R/Python scripts to model the data and perform data validations. You will also carry out data analysis, create data visualizations, and build data models to make sense of the data that powers critical decisions.
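
A minimal sketch of the kind of cleaning-and-validation script described above, in Python with pandas (the columns, values, and rules are hypothetical illustrations, not SocialCops' actual pipeline):

```python
import pandas as pd

# Hypothetical raw survey extract: inconsistent casing, missing values,
# and out-of-range entries, as is typical of messy field data.
raw = pd.DataFrame({
    "district": ["New Delhi", "new delhi ", "Mumbai", None],
    "households": ["1200", "950", "n/a", "300"],
    "literacy_rate": [0.86, 1.42, 0.79, 0.65],  # 1.42 is invalid (> 1)
})

def clean(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Normalize free-text district names.
    out["district"] = out["district"].str.strip().str.title()
    # Coerce numeric fields; unparseable entries become NaN.
    out["households"] = pd.to_numeric(out["households"], errors="coerce")
    return out

def validate(df: pd.DataFrame) -> pd.DataFrame:
    # Keep only rows that pass simple completeness and range checks.
    checks = (
        df["district"].notna()
        & df["households"].notna()
        & df["literacy_rate"].between(0, 1)
    )
    return df[checks]

clean_df = validate(clean(raw))
print(len(clean_df))  # 1 row survives validation
```

A real pipeline would read from files or a database and log the rows it rejects, but the clean-then-validate shape stays the same.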



REQUIREMENTS



  • Love R or Python and know how to manipulate data

  • Love when your code surfaces mind-boggling insights from almost unusable data

  • Use statistical methods and models to analyze trends in diverse datasets

  • Know when to use bar charts instead of line charts

  • Hate writing the same logic of code twice and love to write scalable, reusable code to process data

  • Create data tools and processes to ease the lives of people working on data processing

  • Not afraid to roll up your sleeves, dig into the code, and implement your ideas


Cookies:



  • You have an updated Kaggle profile and are looking for more challenging problems

  • You have dealt with big data problems


Note: Please do not apply if you haven't worked on a data problem in R or Python.
Note: If you are looking for advanced machine learning problems and have a good knowledge of ML techniques, check out the Machine Learning Intern position.

Expedia, Inc.
  • Chicago, IL

Are you fascinated by data and building robust data pipelines which process massive amounts of data at scale and speed to provide crucial insights to the end customer?

Are you passionate about making sure customers have the information they need to get the most out of every product they use?

Are you ready to help people go places?


This is exactly what we, the Lodging Data Tech (LDT) group in Expedia, do. Our mission is “transforming Expedia’s lodging data assets into Data Products that deliver intelligence and real-time insights for our customers”. We work on creating data assets and products to support a variety of applications which are used by 1000+ market managers, analysts, and external hotel partners.

Our work spans a wide range of datasets, like lodging booking, clickstream, and web scrape data, across a diverse technology stack ranging from Teradata and MS SQL Server to Hadoop, Spark, Qubole and AWS. We are looking for passionate, creative and innately curious data engineers to join a new team in Chicago to build a unified data service which would power the data needs of all partner-facing applications in the lodging line of business.

As a Senior Data Engineer you are involved in all aspects of software development, including technical designs, implementation, functional analysis, and release for mid-to-large sized projects.

What you’ll do with us:



  • Lead the end-to-end product life cycle for mid to large size projects: Design, development, testing, deployment, and providing operational excellence and support

  • You find and advocate for industry standards in development methodologies, techniques, and technologies

  • You contribute to advancing the team's design methodology and quality programming practices and mentor junior team members to adapt standard methods.

  • Innovate and implement new ideas to solve complex software problems and prototype creative solutions to enable product MVP's

  • Independently understand scheduling, cost constraints, and impact to other teams; and make resource and architectural trade-offs based on those factors

  • Anticipate and prevent problems and roadblocks, before they occur, and present technical issues and their impact to leadership

  • Lead, coordinate, and collaborate on multiple concurrent and complex cross-organizational initiatives

  • Effectively build and maintain a network of key contacts across company, and use these contacts to achieve results

  • Communicate and work effectively with geographically distributed multi-functional teams

  • Drive for continuous improvement in software and development process within an agile development team



Who you are:



  • Bachelor's or Master's Degree in Computer Science, Information Systems, Engineering, or equivalent experience.

  • Demonstrated proficiency in most areas of the professional function, and in-depth specialization in some

  • Effective verbal and written communication skills with the ability to present complex technical information clearly and concisely

  • Experience in Agile/Scrum software development practices

  • 7-10+ years of experience in the field of Software Engineering

  • 3-7+ years’ experience in Software Engineering specifically in databases, Big Data or Data-warehouse projects.

  • Proficient knowledge in SQL, database development (T-SQL/PL-SQL) and some experience with data modelling.

  • Experience working on Big Data framework like Hadoop or Spark.

  • Experience with any one MPP database system like Teradata, Redshift, DB2, Azure Datawarehouse or Greenplum.

  • Proficient in at least one programming language like Python, Java, or Scala in a Unix/Linux environment.

  • Knowledge or experience with AWS and AWS services like Redshift, EMR, AWS Lambda a plus.

  • Prior experience working with NoSQL stores (Hbase, ElasticSearch, Cassandra, Mongodb) a plus.

  • Familiarity with the e-commerce or travel industry.



Why join us:
Expedia Group recognizes our success is dependent on the success of our people. We are the world's travel platform, made up of the most knowledgeable, passionate, and creative people in our business. Our brands recognize the power of travel to break down barriers and make people's lives better – that responsibility inspires us to be the place where exceptional people want to do their best work, and to provide them the tools to do so.

Whether you're applying to work in engineering or customer support, marketing or lodging supply, at Expedia Group we act as one team, working towards a common goal; to bring the world within reach. We relentlessly strive for better, but not at the cost of the customer. We act with humility and optimism, respecting ideas big and small. We value diversity and voices of all volumes. We are a global organization but keep our feet on the ground so we can act fast and stay simple. Our teams also have the chance to give back on a local level and make a difference through our corporate social responsibility program, Expedia Cares.

Our family of travel brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Egencia®, trivago®, HomeAway®, Orbitz®, Travelocity®, Wotif®, lastminute.com.au®, ebookers®, CheapTickets®, Hotwire®, Classic Vacations®, Expedia® Media Solutions, CarRentals.com™, Expedia Local Expert®, Expedia® CruiseShipCenters®, SilverRail Technologies, Inc., ALICE and Traveldoo®.



Expedia is committed to creating an inclusive work environment with a diverse workforce.   All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.  This employer participates in E-Verify. The employer will provide the Social Security Administration (SSA) and, if necessary, the Department of Homeland Security (DHS) with information from each new employee's I-9 to confirm work authorization.

Expedia, Inc.
  • Chicago, IL

Are you fascinated by data and building robust data pipelines which process massive amounts of data at scale and speed to provide crucial insights to the end customer?

Are you passionate about making sure customers have the information they need to get the most out of every product they use?

Are you ready to help people go places?

This is exactly what we, the Lodging Data Tech (LDT) group in Expedia, do. Our mission is “transforming Expedia’s lodging data assets into Data Products that deliver intelligence and real-time insights for our customers”. We work on creating data assets and products to support a variety of applications which are used by 1000+ market managers, analysts, and external hotel partners.

Our work spans a wide range of datasets, like lodging booking, clickstream, and web scrape data, across a diverse technology stack ranging from Teradata and MS SQL Server to Hadoop, Spark, Qubole and AWS. We are looking for passionate, creative and innately curious data engineers to join a new team in Chicago to build a unified data service which would power the data needs of all partner-facing applications in the lodging line of business.

As a Software Dev Engineer II you are involved in all aspects of software development, including participating in technical designs, implementation, functional analysis, and release for mid-to-large sized projects.

What you’ll do:



  • You will develop, design, debug, and modify components of software applications and tools.

  • Understand business requirements; perform source to target data mapping, design and implement ETL workflows and data pipelines on the Cloud using Big Data frameworks and/or RDBMS/ETL tools

  • Support and solve data and/or system issues as needed

  • Prototype creative solutions quickly by developing minimum viable products and work with seniors and peers in crafting and implementing the technical vision of the team

  • Communicate and work effectively with geographically distributed multi-functional teams

  • Participate in code reviews to assess overall code quality and flexibility

  • Resolve problems and roadblocks as they occur with peers and unblock junior members of the group. Follow through on details and drive issues to closure

  • Define, develop and maintain artifacts like technical design or partner documentation

  • Drive for continuous improvement in software and development process within an agile development team

  • Participate in user story creation in collaboration with the team
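
The source-to-target mapping and ETL work described above can be sketched in miniature (pure Python for illustration; the field names and transforms are hypothetical, and a production pipeline would use the Big Data frameworks or RDBMS/ETL tools the posting names):

```python
from datetime import date

# Hypothetical source rows, e.g. extracted from a bookings table.
source_rows = [
    {"htl_id": "H1", "bk_dt": "2019-01-05", "amt_usd": "120.50"},
    {"htl_id": "H2", "bk_dt": "2019-01-06", "amt_usd": "89.00"},
]

# Source-to-target mapping: target column -> (source column, transform).
MAPPING = {
    "hotel_id": ("htl_id", str),
    "booking_date": ("bk_dt", date.fromisoformat),
    "amount_usd": ("amt_usd", float),
}

def transform(row: dict) -> dict:
    """Apply the mapping to one source row, producing a target row."""
    return {tgt: fn(row[src]) for tgt, (src, fn) in MAPPING.items()}

target_rows = [transform(r) for r in source_rows]
print(target_rows[0]["amount_usd"])  # 120.5
```

Keeping the mapping as data rather than code makes it easy to review with business stakeholders and to regenerate when source schemas change.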



Who you are:



  • Bachelor's or Master's degree in computer science or a related major, and/or equivalent work experience

  • Experience using code versioning tools, e.g. Git

  • Experience in Agile/Scrum software development practices

  • Effective verbal and written communication skills with the ability to present complex technical information clearly and concisely

  • 3-7+ years’ experience in Software Engineering specifically in databases, Big Data or Data-warehouse projects.

  • Proficient knowledge in SQL, database development (T-SQL/PL-SQL) and some experience with data modelling.

  • Experience working on Big Data framework like Hadoop or Spark.

  • Experience with any one MPP database system like Teradata, Redshift, DB2, Azure Datawarehouse or Greenplum.

  • Proficient in at least one programming language like Python, Java, or Scala in a Unix/Linux environment.

  • Knowledge or experience working with AWS and AWS services like Redshift, EMR, AWS Lambda a plus.

  • Prior experience working with NoSQL stores (Hbase, ElasticSearch, Cassandra, Mongodb) a plus.

  • Familiarity with the e-commerce or travel industry.







Expedia is committed to creating an inclusive work environment with a diverse workforce.   All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.  This employer participates in E-Verify. The employer will provide the Social Security Administration (SSA) and, if necessary, the Department of Homeland Security (DHS) with information from each new employee's I-9 to confirm work authorization.

MERCEDES AMG PETRONAS
  • Brackley, UK

Job Description


An exciting position is available for a Senior Software Architect in the Engineering Software group within the Performance Department. The group is responsible for designing, developing and maintaining the systems and toolsets used to understand and develop the performance of the car. These include the Driver in the Loop real time simulator, our HPC infrastructure and the data capture and analysis systems for on and off track.




Candidate Profile


We are looking for a talented individual with the motivation and determination to succeed in this challenging and rewarding environment. You will be expected to define, develop and deploy your vision of the next generation of software infrastructure for the Performance Department.

The ideal candidate will have exceptional technical skills and proven hands-on experience in the full Software Development Life Cycle (SDLC) for the most demanding of environments, from requirements capture through to the delivery and maintenance of high-quality software systems. You will be expected to work in a highly collaborative manner, lead high-value projects and mentor the junior members of the group. You will be working within a broader multi-disciplinary engineering team, so the ability to explain complex system design to non-technical team members is key.

Key skills:

• Broad and extensive knowledge of modern software development practice and its technologies
• Knowledge of architectural styles and design patterns
• Detailed knowledge of C#, with experience in C/C++, MATLAB/Simulink desirable
• Data modelling and design experience in both relational (MSSQL) and NoSQL (ELK/Mongo) databases
• Experience with modern software development lifecycle (SDLC) processes
• Experience with service oriented architecture (SOA) and distributed systems





Benefits


In return for your work and commitment, we offer a competitive package including bonus, life assurance, private medical cover, car lease scheme, 25 days holidays, subsidised restaurant and on-site gym facilities.



Closing date: 10 February 2019


Apply here: http://careers.mercedesamgf1.com/vacancies/details/?id=33673

Burger King Corporation
  • Miami, FL

Position Overview:

This person will be key in the structuring of our new Guest Insights and Intelligence area within BK North America. Burger King's marketing analytics has evolved substantially over the past several years, to the point where we have a very detailed understanding of our sales at a product or ticket level. Nevertheless, our understanding of who is buying our products and offers is still very limited. In an increasingly competitive market, our objective is to create a new area whose core focus is understanding our guests, which will ultimately drive our strategy across several different initiatives, including calendar, pricing, innovation, and advertising/communication. With more data available than ever, coming from our mobile app, kiosks, POS, credit cards, and external data sources, we're looking for a data scientist with strong business judgment who can help us structure and develop this area, effectively transforming how we look at marketing analytics at BK and eventually RBI.


Responsibilities & Qualifications:


    • 3-5 years of professional experience; master's degree a plus
    • Expertise in data modelling, visualization, and databases
    • Proficiency with statistical packages (e.g. SAS and R), database languages (e.g. SQL Server, MySQL, Oracle Express), and media measurement tools (e.g. DoubleClick, Omniture, Google Analytics)
    • Database design and implementation
    • Machine learning
    • Time series and forecasting
    • Data mining
    • Linear and logistic regression
    • Decision trees
    • Segmentation analysis
    • Clustering techniques
    • Marketing mix models
    • Data visualization techniques


  • Market research and competitive analysis aimed at driving growth
  • Comprehensive data analysis and manipulation to help identify trends
  • Strong interpersonal and communication skills
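
The segmentation and clustering techniques listed above can be illustrated with a tiny guest-segmentation sketch (the features and data are entirely hypothetical, and k-means is just one common choice of algorithm):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical guest features: [visits per month, average ticket in USD].
rng = np.random.default_rng(42)
frequent_small = rng.normal([12, 7], [2, 1], size=(50, 2))    # frequent, small tickets
occasional_large = rng.normal([2, 22], [1, 3], size=(50, 2))  # occasional, large tickets
guests = np.vstack([frequent_small, occasional_large])

# Partition guests into two behavioral segments.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(guests)
labels = model.labels_

# With well-separated synthetic groups, each lands in its own segment.
print(len(set(labels)))  # 2
```

In practice the segment count would be chosen with diagnostics such as silhouette scores, and the segments would then be profiled against the marketing initiatives the role describes.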

Restaurant Brands International US Services LLC (RBI) is an equal opportunity employer and gives consideration for employment to qualified applicants without regard to race, color, religion, sex, national origin, disability, or protected veteran status.

Beeswax
  • New York, NY

As our newest team member you will:



  • Design, build and own our core infrastructure for serving ads and managing bids

  • Collaborate with engineers, product managers, designers and data scientists to evolve these services to provide new experiences for our customers as well as meet our ever-growing scale

  • Do fun tasks like reviewing code, writing design docs and closing JIRA tickets
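
For flavor, bid management in real-time bidding commonly uses a second-price auction; here is a minimal sketch of that mechanism (illustrative only, not Beeswax's implementation):

```python
def run_second_price_auction(bids):
    """Return (winner, clearing_price): the highest bidder wins but pays
    the second-highest bid, per the classic second-price auction rule."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, clearing_price

# Hypothetical bids (USD CPM) from three demand-side platforms.
winner, price = run_second_price_auction(
    {"dsp_a": 2.50, "dsp_b": 4.10, "dsp_c": 3.75}
)
print(winner, price)  # dsp_b 3.75
```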


Ideal candidates will have:



  • 5+ years of working on backend systems at strong software companies

  • Significant experience with Java, C++, Python and databases

  • Experience with software architecture, performance, scaling, data modelling, and API (or apiary) design

  • An understanding of what it means to build systems at scale - if you love articles on highscalability.com you’ll fit right in

  • A strong grasp of software architecture (either through a degree or from learning it themselves)

  • An ability to think about and tackle problems analytically

  • Comfort in a Linux ecosystem


Successful engineers at Beeswax have:



  • An ethic of service and a belief in putting the customer first

  • A powerful sense of pragmatism to figure out what needs to be done right versus right now

  • A curiosity about technology and a desire to use it to solve problems in all sorts of domains

  • An openness to feedback and more than just the spelling skills to know that there’s no I in Team

  • An appreciation of repeatability, resilience, observability, and operational simplicity


We believe in using the right tools for the job so our products are built in a variety of languages including C++, Java, Javascript and Python. While engineers tend to specialize in their specific domains everyone on our team needs to have a solid understanding of computing fundamentals and modern approaches to building high scale and high-quality software.

Intercontinental Exchange
  • Atlanta, GA
Job Purpose
The Data Analytics team is seeking a dynamic, self-motivated Sr Developer with a history of strong stakeholder management skills, who is able to work independently on data analysis, data mining, report development and customer requirement gathering.
Responsibilities
    • Applies data analysis and data modelling techniques, based upon a detailed understanding of the corporate information requirements, to establish, modify or maintain data structures and their associated components
    • Participates in the development and maintenance of corporate data standards.
    • Support stakeholders and business users to define data & analytic requirements.
    • Work with the business to identify additional internal and external data sources to bring into the data environment and mesh with existing data
    • Story board, create and publish standard reports, data visualizations, analysis and presentations.
    • Develop and implement workflows using Alteryx and/or R.
    • Develop and implement various operational and sales Tableau dashboards.
    • Ability to process and analyze large volumes of data, from various outputs using your findings to support the business in understanding information.
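
As a minimal illustration of the "process and analyze large volumes of data" responsibility, here is a pandas sketch of the kind of aggregation that could feed a Tableau dashboard (the table and column names are invented for illustration):

```python
import pandas as pd

# Hypothetical trade records standing in for a much larger extract;
# a real workflow would source these from a database or Alteryx flow.
trades = pd.DataFrame({
    "desk": ["rates", "rates", "fx", "fx", "fx"],
    "notional": [1_000_000, 250_000, 500_000, 750_000, 125_000],
})

# Aggregate per desk into a dashboard-ready summary table.
summary = (
    trades.groupby("desk")["notional"]
    .agg(total="sum", trades="count")
    .reset_index()
)
print(summary.loc[summary["desk"] == "fx", "total"].item())  # 1375000
```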
Knowledge And Experience
    • Bachelor's degree in Statistics/Quantitative/Engineering/Math/Science/Economics/Finance or a related quantitative discipline required.
      Master's in Engineering/Physics/Statistics/Economics/Math/Science preferred.
    • 3+ years of experience supporting the development of analytics solutions leveraging tools like Tableau Desktop and Tableau Online and over 2+ years experience working with SQL, developing complex SQL queries, and leveraging SQL in Tableau.
    • Working knowledge of common BI tools such as OBIEE, BO
    • 1+ years' experience with Alteryx and R coding
    • Deep understanding of Data Governance
    • Data Modeling
    • Advanced written and oral communication skills with the ability to summarize findings and present in a clear concise manner to peers, manager and others.
    • Ability to actualize requirements
Preferred Skills
    • Good stakeholder management
    • Good team fit / Enthusiasm
    • Data Evangelist - needs to show a love of all things data
    • Data Science techniques (if possible)
Additional Information
    • Job Type Standard
    • Schedule Full-time
HelloFresh
  • Berlin, Germany

About the job


As a Big Data Engineer at HelloFresh, you will collaborate to build one of the most advanced big data platforms in Europe. You will develop distributed services that process data in near-time and real-time, with a focus on scalability, data quality and the integration of machine learning models.



  • Plan, implement and maintain distributed, service-oriented and message-oriented data platform components

  • Design, build, and maintain data ingestion, ETLs and infrastructure for batch and near real time processing.

  • Build and maintain Business Intelligence and reporting solutions.
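
The near-real-time processing mentioned above can be illustrated with a tumbling-window aggregation over a toy event stream (pure Python here; a production system would run the same idea on the Spark/Kafka stack the posting describes):

```python
from collections import defaultdict

# Hypothetical order events: (epoch seconds, order value in EUR).
events = [(0, 10.0), (12, 5.0), (61, 7.5), (65, 2.5), (130, 20.0)]

def tumbling_window_sums(stream, window_s=60):
    """Sum event values per fixed-size (tumbling) time window."""
    sums = defaultdict(float)
    for ts, value in stream:
        sums[ts // window_s] += value  # assign each event to its window
    return dict(sums)

print(tumbling_window_sums(events))  # {0: 15.0, 1: 10.0, 2: 20.0}
```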


You can get a taste of what we've been working on by checking out our engineering blog.


Who we’re looking for



  • Degree in Computer Science, Software Engineering, or equivalent

  • Experience in Software Engineering

  • Fluency in Python and OOP

  • Experience with Apache Spark, Hive, Impala, and Kafka

  • Experience with Hadoop, plus distributions like Cloudera, MapR or Hortonworks

  • Experience with RDBMS such as PostgreSQL or MySQL

  • Experience with NoSQL data stores such as MongoDB, Redis or Cassandra

  • Agile team-working experience 

  • Test-Driven Development

  • Preferred: Experienced with data modelling, design patterns, building highly scalable and secured solutions

  • Preferred: Experience with Amazon AWS and DevOps and Automation

  • Preferred: Familiarity with performance metric tools

  • Preferred: Knowledge of visualization tools (Tableau, Qlikview)


What we offer



  • The opportunity to get into one of the most intellectually demanding roles at one of the largest technology companies in Europe

  • Cutting edge technology, allowing you to work with state-of-the-art tools and software solutions

  • Competitive compensation and plenty of room for personal growth

  • Great international exposure and team atmosphere

  • Work in a modern, spacious office in the heart of Berlin with excellent transport links and employee perks

ProSiebenSat.1 Media SE
  • Munich, Germany
  • Salary: €83k - 100k

Job opportunity Business Intelligence Architect (m/f/d) at ProSiebenSat.1 Media SE

At Data Solutions we are the go-to guys for providing all our colleagues with state-of-the-art data access, business-ready solutions and advice in all things data. As a team of highly skilled individuals, we collaboratively work on the central data platform of ProSiebenSat.1 to help build the entertainment of the future. In that regard we occupy ourselves with writing quality code, managing projects and stakeholders, automating our development processes, making users happy and, last but not least, having lots of fun together.

What you can expect in this role



  • In a dynamic Scrum team, you are accountable for the technical as well as the business architecture of BI-solutions, using technologies like Spark, Exasol or Tableau

  • You define requirements on the central data integration platform of ProSiebenSat.1 and consult its development team during implementation

  • In addition, you work closely with domain experts to create new data solutions. To do so, you define and continuously improve required processes and methodologies

  • Automating the development process regarding continuous integration, continuous deployment and automated testing is also within your area of responsibility


Your essential experience and education



  • You have successfully completed your studies in computer science or with an IT focus

  • You have ample experience in developing enterprise systems for processing large amounts of data, for example with Spark and/or SQL

  • You are an expert in data modelling and data management. Hands on experience in Data Vault would be a great plus

  • Your own highest quality standards and your solution oriented mindset are best applied in an agile development environment (at ProSiebenSat.1 this normally means Scrum)

  • Personal accountability and team spirit are no contradiction to you. With your competence and personality, you know how to inspire a team and lead it to outstanding solutions

  • Excellent knowledge of German rounds off your profile


What's in it for you?



  • Take advantage of our wide range of training and development opportunities offered by the ProSiebenSat.1 Academy

  • Benefit from a flexible work schedule with home office and 30 vacation days per year

  • Work on modern fully equipped workplaces and use our wide range of sports (including yoga, football or workouts), post office, laundry and ironing service as well as many cafés and Wifi on campus

  • In addition to the campus's own daycare or a childcare allowance, there is a family service with free consultation

  • Enjoy an open corporate culture without dress code in the dynamic environment of the media industry

  • This position can be filled as full time and part time job



If we got your attention, then send off your application to us today. It only takes a couple of minutes! Get to know our tech teams at tech.prosiebensat1.com.
 
ProSiebenSat.1 Tech Solutions GmbH is a fully owned subsidiary of ProSiebenSat.1 Media SE. It incorporates all subjects regarding Enterprise IT and is responsible for operations and continued development of P7S1-internal applications as well as of IT infrastructure. ProSiebenSat.1 Tech Solutions optimizes the technical processes throughout the group, permanently refines the system architecture and thus provides the P7S1-employees a technologically state-of-the-art working environment.
 
ProSiebenSat.1 Group is one of Europe's leading media companies. Our stations SAT.1, ProSieben, kabel eins, sixx, SAT.1 Gold, ProSieben MAXX and kabel eins Doku captivate millions of viewers every day. But extraordinary TV entertainment is just one part of our concept for success. Today, our growth is based on four segments: German-speaking TV business, our digital units with a large entertainment and e-commerce portfolio, and our international program production and sales house. We have consistently diversified our company in recent years and created additional sources of revenue. This also reflects our vision: We are establishing ProSiebenSat.1 as a leading broadcasting, digital entertainment, and commerce powerhouse.