OnlyDataJobs.com

Software Security Engineer at Stott and May (Austin, TX)

  • Austin, TX


The Company 

Dubbed an "open-source unicorn" by Forbes, they've received over $200m in total funding, with Sequoia Capital, Benchmark, and Index Ventures recently investing a combined $125 million in Series D financing.

They offer a streaming platform based on Apache Kafka that enables companies to easily access data as real-time streams. Its key strength is its ability to make high-volume data available as a real-time stream for consumption by systems with very different requirements.
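The core idea behind such a platform (a shared append-only log that producers write to and many decoupled consumers read at their own pace, each tracking its own offset) can be illustrated with a toy in-memory sketch. All names here are illustrative, not the vendor's actual API:

```python
# Toy illustration of a Kafka-style append-only log with independent
# consumer offsets. Illustrative only; not a real client library.

class TopicLog:
    """An append-only sequence of records; consumers track their own offsets."""

    def __init__(self):
        self._records = []

    def produce(self, record):
        """Append a record to the end of the log."""
        self._records.append(record)

    def consume(self, offset, max_records=10):
        """Return up to max_records starting at offset, plus the next offset."""
        batch = self._records[offset:offset + max_records]
        return batch, offset + len(batch)


log = TopicLog()
for event in ("login", "click", "purchase"):
    log.produce(event)

# Two consumers read the same stream independently, at their own pace.
batch_a, next_a = log.consume(0)                  # fast consumer reads everything
batch_b, next_b = log.consume(0, max_records=2)   # slow consumer reads two

print(batch_a)  # ['login', 'click', 'purchase']
print(batch_b)  # ['login', 'click']
```

Because the log is immutable and offsets belong to consumers rather than the broker, systems with very different read rates can share the same stream without interfering with each other.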


The Role

They're on the lookout for an experienced platform engineer, based out of the Austin, TX office, to work on high-impact security aspects of their platform (built in Java, with some Scala and Go mixed in), as many of the world's largest banks, insurance companies, and telecoms use their product. The specific role title is Distributed Systems Security Engineer.


Effectively, they need a strong full-stack engineer who has worked with distributed systems, has security skills, and knows how data fits together.


On Offer

Opportunity to get equity in a company working on a potentially generational project (akin to how VMware, Docker, Kubernetes, and Hadoop have been adopted), with high-impact / high-visibility work. Alongside said equity, they offer competitive cash compensation, full benefits, unlimited PTO, and various other perks.


Interested?

Apply or email Luis.Cruz@StottAndMay.com

  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As part of the Big Data ecosystem, the Comcast dx team defines and executes on data strategy to realize the promise of "data to the people." The Solution Manager plays a critical role in this effort by linking our customers' needs to the data ecosystem, both within the dx team and across the larger Comcast organization.

As a Solution Manager you:

-Lead client engagement, data discovery/analysis and business/process modeling efforts for the dx team.

-Operate across a number of technical domains with a focus in a primary area such as Product or Network Quality.

-Are a naturally curious problem solver with a passion for efficiency and execution excellence.

-Have performed in an analytical or technical role previously and have a strong understanding of the analytical workflow from requirements/data discovery through analysis to operationalization.

-Understand that to be successful in this role requires presence and confidence, and you have the ability to drive a team forward in the face of ambiguity and competing priorities.

-Understand the fundamental role that data plays in the competitive landscape and demonstrate a passion for data excellence.

-Embrace collaboration as a central tenet of being successful and understand the critical need to build trusting bonds both with our key stakeholders and delivery teams.

-Partner effectively with Solution Engineers, Architects, Product Owners and Tech Leads to define and scope work into delivery roadmaps.

-Ensure traceability and alignment of execution to critical priorities. Drive translation of business requirements into solution intent. This includes requirements identification and clarification, project goal and objective definition, scoping, estimation, and risk assessment.

-Anticipate needs, operate with a sense of urgency, and have the ability to adapt to change quickly.

-Fill resource gaps with hands-on work as needed.

-Should be able to write this job description better than we did :)

Qualifications:

-Bachelor's Degree (Advanced Degree Preferred) in engineering, mathematics, computer science, statistics, physics, economics, operations research or related field; graduate study extremely helpful.

-Minimum of 7 years Tech Lead / Product Management / Project Management / Consulting experience, preferably in Data Warehousing, Big Data and Analytics.

-Experience with customers: managing consultant utilization, milestone success, setting and managing expectations, controlling outcomes and resolving customer issues.

-Understanding of data warehousing technologies and evolution, including relational databases (SQL, Oracle) and big data technology (Cloud, AWS, Hadoop), business intelligence and analytical tools (Tableau, Python, R), architectural strategies and previous hands on development and engineering (SDLC and Agile).

-Strong communication, presentation and meeting facilitation skills. Ability to positively represent yourself and the team.

-Must be a team-player and be able to work closely with technical teams and business users.

-Capable of building strong relationships with leaders across the enterprise, both as sources of data and as consumers.

-The ability to travel (~25%) and to attend in-person client meetings.

-Expert in MS Office.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • West Chester, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Position Summary

The dx Team is responsible for Data Engineering at Comcast. One of its major goals is to harmonize the data ingestion and consumption layers across Comcast. Creating enterprise data sources that serve as a single version of truth is a goal of the dx Team.

With moderate guidance, the Big Data Software Developer will develop (code/program), test, and debug ETL (Extract/Transform/Load) of data to answer technically challenging business requirements (complex transformations, high data volume). All work needs to be documented as part of release management.
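The Extract/Transform/Load pattern at the heart of this role can be sketched in miniature. This is a toy in-memory example; real pipelines here run on Hadoop-scale tooling, and the field names and business rule below are invented for illustration:

```python
# Minimal illustration of the Extract/Transform/Load pattern.
# Source and target are in-memory stand-ins for real systems.

def extract(source_rows):
    """Pull raw rows from the source system."""
    return list(source_rows)

def transform(rows):
    """Apply a business rule: drop rows without an id, normalize the name field."""
    return [
        {"id": r["id"], "name": r["name"].strip().title()}
        for r in rows
        if r.get("id") is not None
    ]

def load(rows, target):
    """Write transformed rows into the target store, keyed by id."""
    for r in rows:
        target[r["id"]] = r
    return target

raw = [
    {"id": 1, "name": "  ada lovelace "},
    {"id": None, "name": "bad row"},     # dropped by the transform step
    {"id": 2, "name": "grace hopper"},
]
warehouse = load(transform(extract(raw)), {})
print(warehouse[1]["name"])  # Ada Lovelace
```

Keeping the three stages as separate functions is what makes each step independently testable and documentable, which matters when all work must flow through release management.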

Employees at all levels are expected to:

-Understand our Operating Principles; make them the guidelines for how you do your job

-Own the customer experience-think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services

-Know your stuff-be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences

-Win as a team-make big things happen by working together and being open to new ideas

-Be an active part of the Net Promoter System-a way of working that brings more employee and customer feedback into the company-by joining huddles, making call backs and helping us elevate opportunities to do better for our customers

-Drive results and growth

-Respect and promote inclusion and diversity

-Do what's right for each other, our customers, investors and our communities

Core Responsibilities

-Play a key role as a senior-level engineer by implementing solid, robust, extensible designs that support key business flows.

-Analyzes and determines data integration needs.

-Evaluates and plans software designs, test results and technical manuals using Big Data (Hadoop) ecosystem

-Build and maintain optimized ETL solutions to process/load source system data into Hadoop using Sqoop or microservices leveraging Hadoop tools.

-Reviews literature and current practices relevant to the solution of assigned projects in Data Warehousing/Data Lake and Reporting areas

-Programs new software using Spark, Scala, Kafka, Sqoop, SQL

-Supports existing and new applications and customization of current applications

-Edits and reviews technical requirements documentation

-Displays knowledge of software engineering methodologies, concepts, skills and their application in the area of specified engineering specialty (like Data warehousing)

-Displays knowledge of, and ability to apply, process software design and redesign skills

-Displays in-depth knowledge of, and ability to apply, project management skills

-Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed as required

-Consistent exercise of independent judgment and discretion in matters of significance

-Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary

-Other duties and responsibilities as assigned

Education Level

Bachelor's Degree or Equivalent

Years of Experience

2-5 years of experience as a data integration developer, SQL developer, ETL developer, or Java/C# developer, or related experience, is required.

Field of Study

Computer Science, Engineering

Compliance

Comcast is an EEO/AA/Drug Free Workplace

Disclaimer

The above information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications

Additional Information

Daily Responsibilities:

Technologies used day to day

Linux, Hadoop, Spark, UC4, SQL, Linux Shell Scripting, BI reporting tools

Business Units: what group(s) does the role support?

dx and dx business partner initiatives

Paired programming vs. individual tasks: what does that look like?

40% paired programming / 60% individual tasks

Business Purpose

Describe the core impact of this role and the team (project details):

Ingest data from various data sources to create a harmonized layer of data

The dx Team's responsibilities include Data Engineering for Comcast; one of its major goals is to harmonize the data ingestion and consumption layer across Comcast.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Description

Software engineering leadership and data science skills, combined with the demands of a highly-visible enterprise metadata repository, make this an exciting challenge for the right candidate.

Are you passionate about digital media, entertainment, and software services? Do you like big challenges and working within a highly-motivated team environment?

As a Senior Manager in the Metadata Engineering group of the Data Experience (DX) team at Comcast, you will drive the development, deployment, and support of large-scale metadata platforms using real-time distributed computing architectures. You will also employ your skills to promote positive changes in our work culture and practices that will improve our productivity, ingenuity, agility, and software development maturity. The DX data team is a fast-moving team of world-class experts who are innovating in end-to-end data delivery and analytics. We are a team that thrives on big challenges, results, quality, and agility.

Who will you be working with?

DX Metadata Engineering is a diverse collection of professionals who work with a variety of teams: other software engineering teams whose metadata repositories integrate with the Centralized Metadata Repository, Portal engineers who develop a UI to support data discovery, software engineers on other DX platforms that ingest, transform, and retrieve data whose metadata is stored in the Centralized Metadata Store, data stewards/data architects who collect and disseminate metadata information, and users who rely on metadata for data discovery.

What are some interesting problems you'll be working on?

You will manage the design and development of a metadata and business glossary collection and enrichment system that allows real-time updates of the enterprise and satellite metadata repositories using best-of-breed and industry-leading technologies. These repositories contain metadata and lineage for a widely diverse and ever-growing complement of datasets (e.g., Hortonworks, AWS S3, streaming data (Kafka/Kinesis), streaming data transformation, ML pipelines, Teradata, and RDBMSs). You will lead the design and development of cross-domain, cross-platform lineage tooling using advanced statistical methods and machine intelligence algorithms. You will manage development of tools to discover data across disparate metadata repositories, develop tools for data governance, and implement processes to rationalize data across the repositories.
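Cross-platform lineage of this kind is commonly modeled as a directed graph from each dataset to the datasets it was derived from; answering "where did this report's data come from?" is then a graph traversal. A minimal sketch, with dataset names invented for illustration:

```python
# Minimal lineage model: each dataset maps to the datasets it was derived
# from. upstream() walks the graph to find every ancestor of a dataset.
# Dataset names are invented for illustration.

LINEAGE = {
    "report.daily_churn": ["warehouse.subscribers", "stream.cancellations"],
    "warehouse.subscribers": ["raw.billing_export"],
    "stream.cancellations": ["raw.cancel_events"],
}

def upstream(dataset, lineage):
    """Return the set of all ancestors (direct and transitive) of `dataset`."""
    ancestors = set()
    stack = list(lineage.get(dataset, []))
    while stack:
        parent = stack.pop()
        if parent not in ancestors:
            ancestors.add(parent)
            stack.extend(lineage.get(parent, []))
    return ancestors

print(sorted(upstream("report.daily_churn", LINEAGE)))
```

Real systems such as Apache Atlas store essentially this structure, enriched with types, owners, and timestamps, so that lineage queries can span repositories and platforms.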

Where can you make an impact?

The dx Team is building the enterprise metadata repository needed to drive the next generation of data platforms and data processing capabilities. Building data products, identifying trouble spots, and optimizing the overall user experience is a challenge that can only be met with a robust metadata repository capable of providing insights into the data and its lineage.

Success in this role is best enabled by a broad mix of skills and interests ranging from traditional distributed systems software engineering prowess to the multidisciplinary field of data science.

Responsibilities:

-Responsible for managing all metadata assets, applications, and supporting processes.

-Closely work with Architects, Product Owners and Solution Engineers to understand product requirements, understand architectural recommendations, and work with solution engineers to develop a viable solution.

-Guide the Metadata Engineering team in identifying product and technical requirements.

-Ensure products and projects are delivered per the roadmap within the agreed budget and timeline.

-Serve as primary point of contact and liaison between Metadata Engineering and other teams.

-Closely monitor the metadata ecosystem to ensure that each metadata asset is performing per its SLA and continuously delivering business value.

-Responsible for preparing team budgets, roadmaps, and operational objectives and ensuring operational plans are aligned with business objectives.

-Responsible for selection and recruitment of resources and work to ensure a high-quality stream of candidates in our talent pipeline

-Experience in hiring and managing teams.

-Experienced in managing projects with competing priorities

-Ensure that direct reports keep current with technological developments within the industry.

-Monitor and evaluate competitive applications and products.

-Drive a culture of continuous improvement and innovation

-Pro-actively work to mitigate risks to performance and delivery of our teams

-Promote solutions for integrating metadata and data quality processes into agile methodologies

-Promote blameless post-mortems and ensure that all post-mortem activities are acted upon

Here are some of the specific technologies we use:

-Metadata repositories: Apache Atlas and Informatica MDM

-Spark (AWS EMR, Databricks)

-Kafka, AWS Kinesis

-AWS Glue, AWS Lambda

-Cassandra, RDBMS, Teradata, AWS DynamoDB

-Elasticsearch, Solr, Logstash, Kibana

-Java, Scala, Go, Python, R

-Git, Maven, Gradle, Jenkins

-Puppet, Docker, Terraform, Ansible, AWS CloudFormation

-Linux

-Kubernetes

-Manta

-Hadoop (HDFS, YARN, ZooKeeper, Hive), Presto

-Jira

Skills & Requirements:

-7+ years of people leadership experience in a software development environment

-2-4 years of experience in metadata-related projects

-Bachelor's or Master's in Computer Science, Statistics, or a related discipline

-Demonstrated experience in data management; Certified Data Management Professional (CDMP) certification is nice to have

-Experience in software development of large-scale distributed systems including a proven track record of delivering backend systems that participate in a complex ecosystem.

-Experience in metadata-related open-source frameworks preferred

-Experience in using and contributing to Open Source software preferred.

-Proficient in Unix/Linux environments preferred

-Partner withdata analysis, data quality, and reporting teams in establishing the best standards and principles around metadata management

-Excellent communicator, able to analyze and articulate complex issues and technologies understandably and engagingly

-Great design and problem-solving skills

-Adaptable, proactive, and takes ownership of programs

-Keen attention to detail and high level of commitment

-Thrives in a fast-paced agile environment. Requirements change quickly, and our team needs to constantly adapt to moving targets

-A team player with excellent networking, collaborating and influencing skills

About Comcast DX (Data Experience):

dx (Data Experience) is a results-driven, data platform research and engineering team responsible for the delivery of multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. We have an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on extremely large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines as well as research, engineer, and apply data science and machine intelligence disciplines.

Our mission is to enable many diverse users with the tools and information to gather, organize, make sense of Comcast data, and make it universally accessible to empower, enable, and transform Comcast into an insight-driven organization.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • Hamburg, Deutschland

System Administration | Hamburg | Full-Time




You know your way around Linux servers, have already dived into the Hadoop ecosystem, and are looking for an opportunity to specialize in these technologies? Our Big Data environment with state-of-the-art technologies will be the perfect place to support your development. You will be part of our System Administration department and work very closely with our data engineers and scientists to design and implement architecture changes for our data infrastructure. Once you are familiar with our infrastructure, you will take over responsibility for operating our Analytics systems, designing and implementing architecture changes, and providing hands-on support for our data engineers and data scientists.

Your mission:



  • Work together with our data engineers and data scientists

  • Discuss and implement new technologies

  • Maintain the systems, update, monitor, and debug them

  • Automate and improve the environment



Our technologies:



  • The Hadoop ecosystem (HDFS, Hive, Impala, Spark)

  • Stream processing (Kafka, Flink)

  • Custom data applications with Java, PHP and Go

  • Jenkins for job scheduling and build processes

  • Debian and Puppet for configuration management

  • Supported by Nagios, Graphite, Grafana and Serveradmin



Your profile:



  • Several years of professional experience administering *nix systems

  • Real world experience with tools from the Hadoop ecosystem

  • Passion for Data Science / Analytics / Big Data and related technologies

  • Proficient in at least one scripting or programming language

  • Open and friendly communication style and very good English skills



Why join us?


  • Be part of a great team in an international environment in a healthy and stable growing company

  • Choose your preferred device (Linux, Mac or even Windows) for your comfortable workplace

  • We will actively support your further development and give you all needed resources to evaluate new technologies, participate in open source communities or improve your soft skills

  • Competitive compensation and an atmosphere to empower creative thinking and strong results

  • Exceptional benefits ranging from flawless relocation support to company gym, smartphone or tablet of your own choice for personal use, roof terrace with BBQ and much more



Excited to start your journey with InnoGames and join our dynamic team as a Linux System Administrator / System Engineer (Hadoop)? We look forward to receiving your application as well as your salary expectations and earliest possible start date through our online application form. Isabella Dettlaff would be happy to answer any questions you may have.

InnoGames, based in Hamburg, is one of the leading developers and publishers of online games with more than 200 million registered players around the world. Currently, more than 400 people from 30 nations are working in the Hamburg-based headquarters. We have been characterized by dynamic growth ever since the company was founded in 2007. In order to further expand our success and to realize new projects, we are constantly looking for young talents, experienced professionals, and creative thinkers.



Isabella


Talent Acquisition Manager




Phone +494078893350

  • Água Branca, Brazil

We are in the process of migrating our monolithic application landscape to a microservice architecture.
We have also transitioned from a firefighting operations mode to an innovation and DevOps culture.
In this process a new R&D area emerged, which consists of a small team of research and software engineers.


What we are doing



  • Building algorithms that help us with anticipated shipping and purchasing forecasts and protect us against system failures.

  • Using image recognition to give our users the highest possible convenience and the coolest features.

  • Using state-of-the-art game engines to build virtual reality into our customer experience.

  • Optimizing product search and building data consistency monitoring.

  • Helping build a large-scale architecture together with the entire IT team.


The output will be nothing less than transforming the way e-commerce works and providing sustainable solutions.


What we are looking for


3+ years of professional or research experience



  • Comprehensive knowledge in statistics, probability and machine learning.

  • Passion for solving standard and non-standard mathematical problems.

  • Ability to explain data science insights to people with and without quantitative background.

  • Fluency in R, Python, or Julia.

  • Hands on/advanced experience with SQL.

  • Experience in data visualization.

  • Experience using the Big Data tool stack (e.g. Hadoop, Hive, Spark, Cassandra, HBase, or other non-relational DBs) is a plus.

  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Description

Platform engineering and cloud excellence, combined with the demands of a high-volume, highly-visible analytics platform, make this an exciting challenge for the right candidate.

Are you passionate about digital media, entertainment, and software services? Do you like big challenges and working within a highly-motivated team environment?

As a Platform Engineer on the dx Data Experience team, you will research, develop, support, and deploy solutions using real-time distributed computing architectures. Our mission is to enable many diverse users with the tools and information to gather, organize, and make sense of Comcast data, and make it universally accessible to empower, enable, and transform Comcast into an insight-driven organization. The dx big data organization is a fast-moving team of world-class experts who are innovating in end-to-end data delivery. We are a team that thrives on big challenges, results, quality, and agility.

Who does the Platform Engineer work with?

Platform Engineering is a diverse collection of professionals who work with a variety of teams ranging from other software engineering teams whose software integrates with analytics services, service delivery engineers who provide support for our product, testers, operational stakeholders with all manner of information needs, and executives who rely on data for data-based decision making.

What are some interesting problems you'll be working on?

Develop solutions capable of processing millions of events per second and multiple billions of events per day, providing both a real-time and historical view into the operation of Comcast's wide array of systems. Design collection and enrichment system components for quality, timeliness, scale, and reliability. Work on high-performance real-time data stores and a massive historical data store using best-of-breed and industry-leading technology. Build platforms that allow others to design, develop, and apply advanced statistical methods and machine intelligence algorithms, fostering self-service capabilities and ease of use across the entire Technology, Product, Xperience (TPX) organization landscape and beyond!
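The collection-and-enrichment step described above, where each incoming event is joined with reference data before landing in a store, can be sketched as follows. The device IDs and fields are invented for illustration; in production this lookup would hit a cache or database rather than a dict:

```python
# Sketch of a per-event enrichment step: each raw event is joined with
# reference data before being emitted downstream. All names illustrative.

REFERENCE = {                # stand-in for a lookup store (cache, database, ...)
    "stb-1": {"region": "northeast", "model": "X1"},
    "stb-2": {"region": "midwest", "model": "X1"},
}

def enrich(event, reference):
    """Attach device metadata to an event; unknown devices are flagged."""
    meta = reference.get(event["device_id"],
                         {"region": "unknown", "model": "unknown"})
    return {**event, **meta}

events = [
    {"device_id": "stb-1", "action": "tune"},
    {"device_id": "stb-9", "action": "tune"},  # no reference entry
]
enriched = [enrich(e, REFERENCE) for e in events]
print(enriched[0]["region"])  # northeast
print(enriched[1]["region"])  # unknown
```

At millions of events per second the same shape holds; the engineering work is in making the lookup fast, the defaults safe, and the whole step reliable under load.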

Where can you make an impact?

The dx Team is building the core components needed to drive the next generation of data platforms and data processing capability. Building data products, identifying trouble spots, and optimizing the overall user experience is a challenge that can only be met with a robust data architecture capable of providing insights that would otherwise be drowned in an ocean of data.

Success in this role is best enabled by a broad mix of skills and interests ranging from traditional distributed systems software engineering prowess to the multidisciplinary field of data science.

Responsibilities:

-Lead development for new platforms

-Build capabilities that analyze massive amounts of data both in real-time and batch processing

-Prototype ideas for new tools, products and services

-Employ rigorous continuous delivery practices managed under an agile software development approach

-Ensure a quality transition to production and solid production operation of the platforms

-Raise the bar for the Engineering team by advocating leading edge practices such as CI/CD, containerization and test-driven development (TDD)

-Enhance our DevOps practicesto deploy and operate our systems

-Automate and streamline our operations and processes

-Build and maintain tools for deployment, monitoring and operations

-Troubleshoot and resolve issues in our development, test and production environments

Here are some of the specific technologies we use:

-Spark (AWS EMR), AWS Lambda

-Spark Streaming and Batch

-Avro, Parquet

-Apache Kafka, Kinesis Streams

-MemSQL, Cassandra, HBase, MongoDB, RDBMS

-Caching Frameworks (ElastiCache)

-Elasticsearch, Beats, Logstash, Kibana

-Java, Scala, Go, Python, R, Node.js

-Git, Maven, Gradle, Jenkins

-Rancher, Puppet, Docker, Ansible, Kubernetes

-Linux

-Hadoop (HDFS, YARN, ZooKeeper, Hive)

-Presto

Skills & Requirements:

-7+ years of programming experience

-Bachelor's or Master's in Computer Science, Statistics, or a related discipline

-Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem.

-Experience in data-related technologies and open source frameworks preferred

-Proficient in Unix/Linux environments

-Knowledge of network engineering and security

-Test-driven development/test automation, continuous integration, and deployment automation

-Enjoy working with data analysis, data quality and reporting

-Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly

-Great design and problem-solving skills

-Adaptable, proactive and willing to take ownership

-Keen attention to detail and high level of commitment

-Thrives in a fast-paced agile environment. Requirements change quickly and our team needs to constantly adapt to moving targets

About Comcast dx (Data Experience):

dx (Data Experience) is a results-driven, data platform research and engineering team responsible for the delivery of multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. We have an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on extremely large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines as well as research, engineer, and apply data science and machine intelligence disciplines.

Our mission is to enable many diverse users with the tools and information to gather, organize, make sense of Comcast data, and make it universally accessible to empower, enable, and transform Comcast into an insight-driven organization.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • West Chester, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Responsibilities

Position Summary

This is a senior position within the SE Data Service organization, reporting to the Director of Data Experience. The DX group is responsible for Data Services at Comcast; one of its major goals is to harmonize the data ingestion and consumption layer across Comcast.

Creating the enterprise data sources (EDW, ODS, and transactional data sources) as a single version of truth for data analysis across Comcast's business.

The position is especially responsible for working across multiple solution architects, business partners, programs, and projects to harmonize data platforms, processes, integrations, and data assets within Comcast.

Collaborate and partner with technical and business teams to present and define strategies, requirements, road-maps, budgets, and solutions. Prepare and present gap analyses for technology and business solutions up to the VP level.

Manage and support all project-related architecture, design, development, and deployment of data-oriented integrations across platforms and projects within a matrix organization.

The Data Integration and Solution Architect must have a strong understanding of, and hands-on working knowledge of, the following software components: Linux, Shell Scripting, Informatica, Teradata, SQL, BTEQ, Hadoop Hive, Pig, Flume, Sqoop, Spark, Storm, Kafka, Accumulo, HBase, Java and UC4

-Provides direction for diverse and complex initiatives and is accountable for a variety of tasks to architect and deliver data warehousing solutions that exceed customer expectations in content, usability, accuracy, reliability and performance while assuming a leading role within agile teams (both on-shore and off-shore)

-Interprets business strategy and develops organizational objectives to align with this strategy. Typically manages multiple teams of professionals.

-Develop end-to-end ETL data integration and solution architecture

-Work across technical and business teams to harmonize data assets

-Experience with SQL & BTEQ scripting; strong data management, data analysis, and performance tuning skills required.

-Experience with Hadoop Hive, Pig, Flume, Sqoop, Storm, Spark, Kafka, Accumulo, and HBase

-Experience with AWS services (EMR, Kinesis, S3, Redshift, EC2)

-Candidates may have worked in one or more roles, e.g. data architect, data modeler, data integration developer/architect, or ETL developer/architect.

-Experienced solution architect able to design, develop, and implement ODS, EDW, data integration layers, etc.

-Experience with Presto query engine

-Create methods and procedures for operations and administration activities.

-Manage and coordinate the DevOps process.

-Conduct Operations readiness and environment compatibility review of any changes prior to deployment.

-Experience managing releases and process details

-Knowledge in data warehousing methodologies and best practices required.

-Strong verbal and written communication skills required.

-Effective interpersonal relations skills, ability to effectively collaborate with others and work as part of a team required.

-Skills in navigating a large organization in order to accomplish results required.

-Ability to initiate and follow through on complex projects of both short and long term duration required.

-Excellent organizational and time management skills required.

-Excellent analytical and problem solving skills required.

-Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, and requests supervisor input while keeping the supervisor informed.

-Participate on interdepartmental teams to support organizational goals

-Perform other related duties and tasks as assigned

-Punctual, regular, and consistent attendance

Required Skills/Experience:

Advanced degree in a technical discipline and/or management required. 10+ years of experience with data integration, data warehouse, and ETL architecture.

-Experience with Teradata and Hadoop Ecosystems.

-Experience working across multiple solution architects, business partners, programs, and projects to harmonize data platforms, processes, integrations, and data assets within Comcast.

-5+ years of experience working as a data integration architect, data solution architect (EDW, ODS), ETL architect, or in a similar role required.

-Five to seven years' experience leading data integration and developing ETL architecture using Linux, Informatica, Teradata, SQL, BTEQ, Hadoop Hive, Pig, Spark, Flume, Sqoop, and UC4

-Experience with AWS services (EMR, Kinesis, S3, Redshift, EC2)

-Experience working as a solution architect supporting a Platform-as-a-Service organization.

-Experience with Presto query engine

-Requires understanding of the complete SDLC and experience with continuous integration, test-driven/behavior-driven development, and Agile/Scrum development methodologies

-Experience collaborating and partnering with technical and business teams to present and define strategies, requirements, road-maps, budgets, and solutions.

-Experience presenting strategies, road-maps, and gap analyses for technology and business solutions up to the VP level

-Manage and coordinate the DevOps process.

-Experience designing logical and physical architectures

-Experience managing teams of senior technologists

-Experience working in an Agile development methodology in a data warehouse environment

-Ability to work effectively across organizational boundaries

-Excellent oral, written, analytical, problem solving, and presentation skills

-Manage and coordinate 1-10 matrix resources.

-Experience with managed services; on-shore and off-shore development experience is a must.

Desired Skills/ Experience

-Telecommunications experience; knowledge of telecommunication/cable billing and customer care systems, e.g. DST, CSG, and Amdocs.

-Knowledge of NoSQL platforms;

-Hadoop, Teradata, TOGAF, AWS Certified

Comcast is an EEO/AA/ Drug Free Workplace

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • Hamburg, Deutschland

System Administration | Hamburg | Full-Time




Are you a data analyst/scientist who wants to develop a deep understanding of the infrastructure behind Big Data systems? We are offering a cross-functional position where you can use your existing data processing skills and become the driving force behind the operation and further development of our data infrastructure.

You will be part of our System Administration department and work very closely with our data engineers and scientists to design and implement architecture changes for our Big Data systems. Our experienced system administrators will introduce you to our infrastructure. Once you are familiar with our data ecosystem, Puppet setup, Serveradmin and our operational structure, you will take over responsibility for our Big Data infrastructure step by step.

With our data architecture we process more than 1,000,000,000 game interaction events every day. Our Hadoop cluster runs on 1,500 CPUs, 5 TB of memory, and 1 PB of disk space.
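For a sense of scale, that daily volume works out to a sustained average ingest rate in the low five figures per second (a back-of-the-envelope check, not a figure from the posting):

```python
# Average event rate implied by 1,000,000,000 events per day.
events_per_day = 1_000_000_000
seconds_per_day = 24 * 60 * 60  # 86,400 seconds

avg_events_per_second = events_per_day / seconds_per_day
print(round(avg_events_per_second))  # 11574
```

Peak rates during busy gaming hours would of course sit well above this average.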



Your profile:



  • Several years of professional experience as a Data Analyst, Scientist, or Engineer

  • You understand distributed systems (e.g. CAP, Kappa, Lambda) and the interactions happening in the background (e.g. rebalancing)

  • You know stream processing and data integration/processing (ETL), and can analyze and optimize SQL queries

  • Professional experience in Java and a Scripting language, ideally Python

  • Open and friendly communication style and very good English skills

  • You know your way around a Linux command line

  • Knowledge of server hardware and ideally a configuration management tool (like Puppet, Chef or Ansible) would be a huge plus



Your mission:



  • Work together with our Data Engineers and Data Scientists

  • Discuss and implement new technologies

  • Maintain the systems, update, monitor, and debug them

  • Automate and improve the environment



Our technologies:



  • The Hadoop ecosystem (HDFS, Hive, Impala, Spark)

  • Stream processing (Kafka, Flink)

  • Custom data applications with Java, PHP and Go

  • Jenkins for job scheduling and build processes

  • Debian and Puppet for configuration management

  • Supported by Nagios, Graphite, Grafana and Serveradmin
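As a toy illustration of the kind of aggregation a stream-processing job over these game events performs (a pure-Python sketch; the event field names and game identifiers are hypothetical, and the real pipeline would use Kafka and Flink rather than in-memory lists):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count game interaction events per (window start, game) bucket.

    Each event is a dict with a 'ts' field (Unix seconds) and a 'game'
    field; both names are illustrative, not the actual schema.
    """
    counts = defaultdict(int)
    for event in events:
        # Align the timestamp to the start of its tumbling window.
        window_start = event["ts"] - (event["ts"] % window_seconds)
        counts[(window_start, event["game"])] += 1
    return dict(counts)

events = [
    {"ts": 10, "game": "forge"},
    {"ts": 30, "game": "forge"},
    {"ts": 70, "game": "elvenar"},
]
print(tumbling_window_counts(events))
# {(0, 'forge'): 2, (60, 'elvenar'): 1}
```

A Flink job expresses the same idea declaratively with keyed tumbling windows, so the framework handles out-of-order events, state, and scaling across the cluster.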



Why join us?



  • Be part of a great team in an international environment in a healthy and stable growing company

  • Choose your preferred device (Linux, Mac or even Windows) for your comfortable workplace

  • We will actively support your further development and give you all needed resources to evaluate new technologies, participate in open source communities or improve your soft skills

  • Competitive compensation and an atmosphere to empower creative thinking and strong results

  • Exceptional benefits ranging from flawless relocation support to company gym, smartphone or tablet of your own choice for personal use, roof terrace with BBQ and much more



Excited to start your journey with InnoGames and join our dynamic team as a Data Engineer Base Technologies (for DevOps)? We look forward to receiving your application as well as your salary expectations and earliest possible start date through our online application form. Isabella Dettlaff would be happy to answer any questions you may have.

InnoGames, based in Hamburg, is one of the leading developers and publishers of online games with more than 200 million registered players around the world. Currently, more than 400 people from 30 nations are working in the Hamburg-based headquarters. We have been characterized by dynamic growth ever since the company was founded in 2007. In order to further expand our success and to realize new projects, we are constantly looking for young talents, experienced professionals, and creative thinkers.





Isabella


Talent Acquisition Manager




Phone +494078893350

  • London Borough of Richmond upon Thames, UK
Gumtree, part of eBay Classifieds Group, is the UK’s leading classifieds site with over 14.5M unique visitors every month and over 9.5M app downloads. Founded in London in 2000, on Gumtree you can buy and sell everything from cars to home items and find jobs, local services, community events and even somewhere to live.



We’re looking for a Head of Engineering to join the Product Development team, reporting to the CTO. You will play a key role within engineering leadership, responsible for hiring, mentoring and managing 15-20 people, while maintaining strong development standards.



Based in beautiful Richmond, London, just by the riverside, you will join a team of over 40 engineers constantly innovating and delivering value to millions of people every single day. We work in an Agile environment in cross-functional squads, building features using continuous integration and constant testing with our users.



In this role you will be responsible for:

People management of 15-20 back-end and front-end engineers, Scrum Masters and QA engineers.

Lead coaching, mentoring, career development and hiring across the team.

Work closely with product management to meet company objectives and goals.

Partner with leaders from across the business, including other eBay Classifieds Group brands worldwide.

Continually innovate and optimise our technology, people and performance.

Promote clean, testable and maintainable code that is modular and scalable.



We’re looking for:

Experience managing large teams of engineers, testers and managers.

Knowledge and experience of software engineering and standard methodologies.

Familiar with organising work to follow Agile methodologies.

Excellent interpersonal skills with both technical and non-technical people.

Real passion for developing your people to ensure they succeed.



Although this is not a hands-on role, you should have strong development experience, ideally in some of the tools and technologies that we currently use:

Java, Spring, Scala, Akka, Elasticsearch, PostgreSQL, MongoDB, Redis, Hadoop.

React (with Flux), JavaScript ES6, SASS, HTML5.

Jenkins, JUnit, ScalaTest, TestNG, Mockito, Cucumber.



Benefits

Flexible working patterns and occasional work-from-home supported.

Full medical, dental and vision healthcare cover.

Pension scheme.

Life and disability insurance.

Childcare vouchers, parental leave policy and Cyclescheme available.

Networking, learning and global travel opportunities across eBay Classifieds Group.

Regular Tech Talks, Hackathons and workshops.

Phenomenal working environment with height-adjustable desks and Aeron chairs.

Free breakfast, fruit, snacks, soft drinks, coffee and tea.

Free on-site massages, yoga, pilates and fitness bootcamps.



Interview process

Introductory phone call with CTO.

Face-to-face interviews including meetings with a senior engineer, an engineering manager and the CTO.
  • London Borough of Ealing, UK

Cognizant BigData Practice:


The Big Data service line is part of the Cognizant Digital Business/Artificial Intelligence and Analytics business unit. It is responsible for delivering digital data platforms and smart data lakes on premise/cloud using different Hadoop distributions, NoSQL technologies, and data virtualization products available in the market, and has built a proprietary data platform to deliver end-to-end data services, automation, and framework-driven data processing modules. We are platinum/gold partners for the majority of platform vendors, which gives us seamless access to all product upgrades, discussions, events, etc.


The members of the team work in a variety of industries and use a diverse set of tools to process complex data from IoT, social media, and large operational systems such as SAP, Salesforce, and mainframes. They access a range of data stored in disparate systems, integrate it, and provide it in the necessary formats to perform data mining that answers specific business questions as well as identifies unknown trends and relationships in the data.


Few Snippets:



  • We are one of the fastest-growing and highest revenue-generating service lines within Cognizant

  • 170+ Big Data engagements across the globe with over 2,000 Big Data consultants with expertise in various industry products & distributions

  • A repository of 170+ use cases with solutions for different Digital & IoT scenarios

  • Rated a Leader in Everest’s PEAK Matrix “Big Data assessment” for the RCGTH, HC, Banking and Insurance domains

  • Rated a “Leader” in the Gartner Magic Quadrant for Data & Business Analytics Services, Worldwide


As a senior technologist, you will be part of the strategic drive, leading multi-discipline teams through the full delivery lifecycle of complex data products and pipelines with a clear understanding of large-scale data processing and data science solutions. You will deliver use cases for one of our customers and develop and implement practical solutions to give this customer a competitive edge within the enterprise. You will have breadth of experience in both engineering and architecture across technology disciplines and the unique challenges of each, including software development, automated test and quality assurance, data, and integration.


Your role


Deliver high-quality, innovative, data-driven solutions to our client’s advanced analytical needs. This will include, working within a team on the client’s site: understanding the client’s pain points, designing an analytical approach, implementing a solution, ensuring it is of high quality, and leading and mentoring multi-discipline technology teams.


Key Responsibilities:



  • Work at a client site as a member of an experienced onsite consulting/delivery team, developing and providing architectures for a highly resilient, fault-tolerant, horizontally scalable data platform.

  • Build multiple layers within the platform covering data ingestion, integration, registration, metadata management, error handling & auditing, and data provisioning.

  • Working with Architecture to refine ideas on the Analytics Architecture and help with standards and guidelines.

  • Building a Big Data engineering community and helping customers/clients build a centre of excellence for engineering.

  • Working closely with the Lead Software Engineers in your area of responsibility to champion the core tenets of engineering at the client

  • Ensuring development teams deliver the highest quality data products through the adoption and continuous improvement of patterns and practices, tools and frameworks and processes. 

  • Applying years of experience and expertise to help teams resolve and overcome technical challenges of any size or complexity and assist where necessary to accelerate problem resolution.


Job Requirements - Essential Skills:



  • Proven, strong data processing skillset with experience in Hadoop tools and techniques. For example (not exhaustive):

  • Spark processing, Streaming and performance tuning

  • Kafka Real time messaging

  • HBase modelling and development

  • HDFS file formats and partitioning, e.g. Parquet, Avro

  • Impala/Hive

  • Unix Shell Scripting

  • Proficiency in Scala

  • Working proficiency in developmental toolsets like Eclipse, IntelliJ

  • Exposure/competence with Agile Development approach

  • Git & Continuous integration tools such as Jenkins, TeamCity

  • Multi-threaded and OOP programming

  • Jenkins/Maven

  • FindBugs, Sonar, JUnit; performance and memory management

  • Strong experience in delivering Data Solutions using Big-Data technologies and cloud platforms.

  • Strong work experience in Azure Cloud platform

  • Expert in at least one of Java, Scala, Python with knowledge of the others

  • Experience in delivering big data solutions using a leading Hadoop distribution like Hortonworks, Cloudera or MapR

  • Knowledge of RDBMS, ETL and Data warehouse technologies 

  • Testing frameworks like JUnit and ScalaTest; mock testing

Nice to Have Skills:

  • Knowledge and experience in delivering real time solutions on sinks like Cassandra, MongoDB or equivalent NoSQL databases

  • Container technologies like Docker

  • Knowledge of API and Microservice architectures 

  • Knowledge of advanced analytics and insights techniques (e.g. predictive analytics, machine learning, segmentation)

  • Knowledge of deep learning frameworks like TensorFlow, Keras

  • Knowledge of machine learning libraries like MLlib, sklearn

  • Strong domain experience in retail/e-commerce,  Communications & Technology verticals. 


Qualifications:  University degree with a specialization in Computer Science or Mathematics