OnlyDataJobs.com

Data Modeler at Apps IT America (Houston, TX)

  • Houston, TX

Data Modeler

EXPERIENCE & SKILL SET:
Data modeling using ERWin tool (Work Group version)
Enterprise Data Warehouse modeling skill
Business Analysis Skill
Oracle Database Skill


Description 


My Client is seeking an experienced Data Modeler to assist in building and supporting RigSystems & Aftermarket's Data Warehouse. This resource will be responsible for separating different types of data into structures that can be easily processed by various systems. This resource will also focus on a variety of issues, such as enhancing data migration from one system to another and eliminating data redundancy. Duties and responsibilities include:

Understand and translate business needs into data models supporting long-term solutions.
Work with the Application Development team to implement data strategies, build data flows and develop conceptual data models.
Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
Optimize and update logical and physical data models to support new and existing projects.
Maintain conceptual, logical and physical data models along with corresponding metadata.
Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
Recommend opportunities for reuse of data models in new environments.
Perform reverse engineering of physical data models from databases and SQL scripts (a brief sketch follows this list).
Evaluate data models and physical databases for variances and discrepancies.
Validate business data objects for accuracy and completeness.
Analyze data-related system integration challenges and propose appropriate solutions.
Develop data models according to company standards.
Guide System Analysts, Engineers, Programmers and others on project limitations and capabilities, performance requirements, and interfaces.
Review modifications to existing software to improve efficiency and performance.
Examine new application design and recommend corrections if required.
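
As a hedged illustration of the reverse-engineering duty above: catalog queries can recover a physical model directly from a live database. The sketch below uses Python's built-in sqlite3 module and a hypothetical warehouse.db file so it is self-contained; against Oracle you would query the data dictionary (e.g. ALL_TAB_COLUMNS) or point ERWin at the schema instead.

    # Recover tables, columns, and relationships from a live database.
    import sqlite3

    conn = sqlite3.connect("warehouse.db")  # hypothetical database file
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]

    for table in tables:
        print(f"TABLE {table}")
        # (cid, name, type, notnull, default, pk) per column
        for cid, name, ctype, notnull, default, pk in conn.execute(
                f"PRAGMA table_info({table})"):
            print(f"  {name} {ctype}"
                  f"{' NOT NULL' if notnull else ''}"
                  f"{' PRIMARY KEY' if pk else ''}")
        # Foreign keys supply the relationships needed to redraw the model
        for fk in conn.execute(f"PRAGMA foreign_key_list({table})"):
            print(f"  FK {fk[3]} -> {fk[2]}({fk[4]})")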

Required Skills

3+ years' experience as a Data Modeler/Data Architect
Proficient in the use of data modeling tools; ERWin proficiency is a must.
Experience in metadata management and data integration engines such as BizTalk or Informatica
Experience in supporting as well as implementing Oracle and SQL data infrastructures
Knowledge of the entire process behind software development including design and deployment (SOA knowledge and experience is a bonus)
Expert analytical and problem-solving skills
Knowledge of the design, development and maintenance of various data models and their components
Understand BI tools and technologies as well as the optimization of underlying databases

  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

As part of the Big Data ecosystem, the Comcast dx team defines and executes on data strategy to realize the promise of "data to the people." The Solution Manager plays a critical role in this effort by linking our customers' needs to the data ecosystem, both within the dx team and across the larger Comcast organization.

As a Solution Manager you:

-Lead client engagement, data discovery/analysis and business/process modeling efforts for the dx team.

-Operate across a number of technical domains with a focus in a primary area such as Product or Network Quality.

-Are a naturally curious problem solver with a passion for efficiency and execution excellence.

-Have performed in an analytical or technical role previously and have a strong understanding of the analytical workflow from requirements/data discovery through analysis to operationalization.

-Understand that to be successful in this role requires presence and confidence, and you have the ability to drive a team forward in the face of ambiguity and competing priorities.

-Understand the fundamental role that data plays in the competitive landscape and demonstrate a passion for data excellence.

-Embrace collaboration as a central tenet to being successful and understand the critical need to build trusting bonds both with our key stakeholders and delivery teams.

-Partner effectively with Solution Engineers, Architects, Product Owners and Tech Leads to define and scope work into delivery roadmaps.

-Ensure traceability and alignment of execution to critical priorities. Drive translation of business requirements into solution intent. This includes requirements identification and clarification, project goals and objectives definition, scoping, estimation, and risk assessment.

-Anticipate needs, operate with a sense of urgency, and have the ability to adapt to change quickly.

-Fill resource gaps with hands-on work as needed.

-Should have the ability to write this job description better than us :)

Qualifications:

-Bachelor's Degree (Advanced Degree Preferred) in engineering, mathematics, computer science, statistics, physics, economics, operations research or related field; graduate study extremely helpful.

-Minimum of 7 years of Tech Lead / Product Management / Project Management / Consulting experience, preferably in Data Warehousing, Big Data and Analytics.

-Experience with customers: managing consultant utilization, milestone success, setting and managing expectations, controlling outcomes and resolving customer issues.

-Understanding of data warehousing technologies and evolution, including relational databases (SQL, Oracle) and big data technology (Cloud, AWS, Hadoop), business intelligence and analytical tools (Tableau, Python, R), architectural strategies and previous hands-on development and engineering (SDLC and Agile).

-Strong communication, presentation and meeting facilitation skills. Ability to positively represent yourself and the team.

-Must be a team player and be able to work closely with technical teams and business users.

-Capable of building strong relationships with leaders across the enterprise, for both data sources and data consumers.

-The ability to travel (~25%) and to attend in-person client meetings.

-Expert in MS Office.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • West Chester, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Position Summary

The dx Team is responsible for Data Engineering at Comcast. One of its major goals is to harmonize the data ingestion and consumption layers across Comcast. Creating enterprise data sources as a single version of truth is a goal of the dx Team.

With moderate guidance, the Big Data Software Developer will develop (code/program), test, and debug ETL (Extract/Transform/Load) of data to answer technically challenging business requirements (complex transformations, high data volume). All work needs to be documented as part of release management.
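
As a rough sketch of the kind of ETL work described (assuming PySpark; the paths and table names below are hypothetical):

    # Extract raw events from HDFS, transform, and load into Hive.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder.appName("etl-sketch")
             .enableHiveSupport().getOrCreate())

    raw = spark.read.parquet("/data/raw/events")          # extract

    clean = (raw                                          # transform
             .filter(F.col("event_id").isNotNull())
             .withColumn("event_date", F.to_date("event_ts")))

    (clean.write                                          # load
          .mode("overwrite")
          .partitionBy("event_date")
          .saveAsTable("dx.events_clean"))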

Employees at all levels are expected to:

-Understand our Operating Principles; make them the guidelines for how you do your job

-Own the customer experience-think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services

-Know your stuff-be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences

-Win as a team-make big things happen by working together and being open to new ideas

-Be an active part of the Net Promoter System-a way of working that brings more employee and customer feedback into the company-by joining huddles, making call backs and helping us elevate opportunities to do better for our customers

-Drive results and growth

-Respect and promote inclusion and diversity

-Do what's right for each other, our customers, investors and our communities

Core Responsibilities

-Play a key role as a senior-level engineer by implementing solid, robust, extensible designs that support key business flows.

-Analyzes and determines data integration needs.

-Evaluates and plans software designs, test results and technical manuals using the Big Data (Hadoop) ecosystem

-Build and maintain optimized ETL solutions to process/load source-system data into Hadoop using Sqoop or microservices, leveraging Hadoop tools.

-Reviews literature and current practices relevant to assigned projects in the Data Warehousing/Data Lake and Reporting areas

-Programs new software using Spark, Scala, Kafka, Sqoop, SQL

-Supports existing and new applications and customization of current applications

-Edits and reviews technical requirements documentation

-Displays knowledge of software engineering methodologies, concepts, skills and their application in the area of specified engineering specialty (like Data warehousing)

-Displays knowledge of, and ability to apply, process software design and redesign skills

-Displays in-depth knowledge of, and ability to apply, project management skills

-Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input and keeps supervisor informed, as required

-Consistent exercise of independent judgment and discretion in matters of significance

-Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) as necessary

-Other duties and responsibilities as assigned

Education Level

Bachelor's Degree or Equivalent

Years of Experience

2-5 years of experience working as a data integration developer, SQL developer, ETL developer, or Java/C# developer, or related experience, is required.

Field of Study

Computer Science, Engineering

Compliance

Comcast is an EEO/AA/Drug Free Workplace

Disclaimer

The above information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.

Additional Information

Daily Responsibilities:

Technologies used day to day

Linux, Hadoop, Spark, UC4, SQL, Linux Shell Scripting, BI reporting tools

Business Units -what group(s) does the role support

dx and dx business partner initiatives

Paired programming vs. individual tasks - what does that look like?

40% paired programming / 60% individual tasks

Business Purpose

Describe the core impact of this role and the team - project details

Ingest data from various data sources to create a harmonized layer of data

The dx Team is responsible for Data Engineering at Comcast; one of its major goals is to harmonize the data ingestion and consumption layers across Comcast.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • Philadelphia, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Description

Software engineering leadership and data science skills, combined with the demands of a highly visible enterprise metadata repository, make this an exciting challenge for the right candidate.

Are you passionate about digital media, entertainment, and software services? Do you like big challenges and working within a highly motivated team environment?

As a Senior Manager in the Metadata Engineering group of the Data Experience (DX) team at Comcast, you will drive the development, deployment, and support of large-scale metadata platforms using real-time distributed computing architectures. You will also employ your skills to promote positive changes in our work culture and practices that will improve our productivity, ingenuity, agility, and software development maturity. The DX data team is a fast-moving team of world-class experts who are innovating in end-to-end data delivery and analytics. We are a team that thrives on big challenges, results, quality, and agility.

Who will you be working with?

DX Metadata Engineering is a diverse collection of professionals who work with a variety of teams: other software engineering teams whose metadata repositories integrate with the Centralized Metadata Repository, portal engineers who develop a UI to support data discovery, software engineers on other DX platforms that ingest, transform, and retrieve data whose metadata is stored in the Centralized Metadata Store, data stewards/data architects who collect and disseminate metadata information, and users who rely on metadata for data discovery.

What are some interesting problems you'll be working on?

You will manage the design and development of metadata and business glossary collection, and enrich the system that allows real-time updates of the enterprise and satellite metadata repositories using best-of-breed, industry-leading technologies. These repositories contain metadata and lineage for a widely diverse and ever-growing complement of datasets (e.g., Hortonworks, AWS S3, streaming data (Kafka/Kinesis), streaming data transformation, ML pipelines, Teradata, and RDBMSs). You will lead the design and development of cross-domain, cross-platform lineage tooling using advanced statistical methods and machine intelligence algorithms. You will manage development of tools to discover data across disparate metadata repositories, develop tools for data governance, and implement processes to rationalize data across the repositories.
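
To make the lineage idea concrete, here is a toy sketch (plain Python; all dataset names are invented, and a real system would store lineage in a repository such as Apache Atlas rather than an in-memory dict):

    # Datasets are nodes; edges point from a dataset to its upstream sources.
    upstream = {
        "teradata.dw.orders_agg": ["hive.raw.orders"],
        "hive.raw.orders": ["kafka.topic.orders"],
        "s3://lake/reports/daily": ["teradata.dw.orders_agg"],
    }

    def lineage(dataset, depth=0):
        """Recursively print every upstream ancestor of a dataset."""
        for src in upstream.get(dataset, []):
            print("  " * depth + f"{dataset} <- {src}")
            lineage(src, depth + 1)

    lineage("s3://lake/reports/daily")
    # s3://lake/reports/daily <- teradata.dw.orders_agg
    #   teradata.dw.orders_agg <- hive.raw.orders
    #     hive.raw.orders <- kafka.topic.orders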

Where can you make an impact?

The dx Team is building the enterprise metadata repository needed to drive the next generation of data platforms and data processing capabilities. Building data products, identifying trouble spots, and optimizing the overall user experience is a challenge that can only be met with a robust metadata repository capable of providing insights into the data and its lineage.

Success in this role is best enabled by a broad mix of skills and interests ranging from traditional distributed systems software engineering prowess to the multidisciplinary field of data science.

Responsibilities:

-Responsible for managing all metadata assets, applications, and supporting processes.

-Work closely with Architects, Product Owners and Solution Engineers to understand product requirements and architectural recommendations, and work with solution engineers to develop a viable solution.

-Guide the Metadata Engineering team in identifying product and technical requirements.

-Ensure products and projects are delivered per the roadmap, within the agreed budget and time.

-Serve as primary point of contact and liaison between Metadata Engineering and other teams.

-Closely monitor the metadata ecosystem to ensure that each metadata asset is performing per SLA and continuously delivering business value.

-Responsible for preparing team budgets, roadmaps, and operational objectives and ensuring operational plans are aligned with business objectives.

-Responsible for selection and recruitment of resources and work to ensure a high-quality stream of candidates in our talent pipeline

-Experience in hiring and managing teams.

-Experienced in managing projects with competing priorities

-Ensure that direct reports keep current with technological developments within the industry.

-Monitor and evaluate competitive applications and products.

-Drive a culture of continuous improvement and innovation

-Pro-actively work to mitigate risks to performance and delivery of our teams

-Promote solutions for integrating metadata and data quality processes into agile methodologies

-Promote blameless post-mortems and ensure that all post-mortem activities are acted upon

Here are some of the specific technologies we use:

-Metadata repositories: Apache Atlas and Informatica MDM

-Spark (AWS EMR, Databricks)

-Kafka, AWS Kinesis

-AWS Glue, AWS Lambda

-Cassandra, RDBMS, Teradata, AWS DynamoDB

-Elasticsearch, Solr, Logstash, Kibana

-Java, Scala, Go, Python, R

-Git, Maven, Gradle, Jenkins

-Puppet, Docker, Terraform, Ansible, AWS CloudFormation

-Linux

-Kubernetes

-Manta

-Hadoop (HDFS, YARN, ZooKeeper, Hive), Presto

-Jira

Skills & Requirements:

-7+ years of people leadership experience in a software development environment

-2-4 years of experience in metadata-related projects

-Bachelor's or Master's in Computer Science, Statistics or a related discipline

-Demonstrated experience in data management; Certified Data Management Professional (CDMP) is nice to have

-Experience in software development of large-scale distributed systems including a proven track record of delivering backend systems that participate in a complex ecosystem.

-Experience in metadata-related open source frameworks preferred

-Experience in using and contributing to Open Source software preferred.

-Proficient in Unix/Linux environments preferred

-Partner with data analysis, data quality, and reporting teams in establishing the best standards and principles around metadata management

-Excellent communicator, able to analyze and articulate complex issues and technologies understandably and engagingly

-Great design and problem-solving skills

-Adaptable and proactive, with strong program ownership

-Keen attention to detail and high level of commitment

-Thrives in a fast-paced agile environment. Requirements change quickly, and our team needs to constantly adapt to moving targets

-A team player with excellent networking, collaborating and influencing skills

About Comcast DX (Data Experience):

dx (Data Experience) is a results-driven, data platform research and engineering team responsible for the delivery of multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. We have an overarching objective to gather, organize, and make sense of Comcast data with the intention to reveal business and operational insight, discover actionable intelligence, enable experimentation, empower users, and delight our stakeholders. Members of the dx team define and leverage industry best practices, work on extremely large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines as well as research, engineer, and apply data science and machine intelligence disciplines.

Our mission is to enable many diverse users with the tools and information to gather, organize, make sense of Comcast data, and make it universally accessible to empower, enable, and transform Comcast into an insight-driven organization.

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • Água Branca, Brazil

We are in the process of migrating our monolithic application landscape to a microservice architecture.
We have also transitioned from a firefighting operations mode to an innovation and DevOps culture.
In this process, a new R&D area emerged, consisting of a small team of research and software engineers.


What we are doing



  • Building algorithms that help us with anticipated shipping and purchasing forecasts and protect us against system failures.

  • Using image recognition to give our users the highest possible convenience and the coolest features.

  • Using state-of-the-art game engines to build virtual reality into our customer experience.

  • Optimizing product search and building data consistency monitoring.

  • Helping to build a large-scale architecture together with the entire IT team.


The output will be nothing less than transforming the way e-commerce works and providing sustainable solutions.


What we are looking for


3+ years' professional or research experience



  • Comprehensive knowledge in statistics, probability and machine learning.

  • Passion for solving standard and non-standard mathematical problems.

  • Ability to explain data science insights to people with and without a quantitative background.

  • Fluency in R, Python, or Julia.

  • Hands on/advanced experience with SQL.

  • Experience in data visualization.

  • Experience using the Big Data tool stack (e.g. Hadoop, Hive, Spark, Cassandra, HBase, or other non-relational DBs) is a plus.

  • West Chester, PA

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Responsibilities

Position Summary

This is a senior position within the SE Data Service organization, reporting to the Director of Data Experience. The DX group is responsible for Data Services at Comcast; one of its major goals is to harmonize the data ingestion and consumption layers across Comcast.

It creates enterprise data sources (EDW, ODS, transactional data sources) as a single version of truth for data analysis across the Comcast business.

The position is especially responsible for working across multiple solution architects, business partners, programs, and projects to harmonize data platforms, processes, integrations, and data assets within Comcast.

Collaborate and partner with technical and business teams to present and define strategies, requirements, roadmaps, budgets, and solutions. Prepare and present gap analyses for technology and business solutions, up to VP level.

Manage and support all project-related architecture, design, development, and deployment of data-oriented integration across platforms and projects in a matrix organization.

The Data Integration and Solution Architect must have a strong understanding of, and hands-on working knowledge of, the following software components: Linux, Shell Scripting, Informatica, Teradata, SQL, BTEQ, Hadoop Hive, Pig, Flume, Sqoop, Spark, Storm, Kafka, Accumulo, HBase, Java and UC4

-Provides direction for diverse and complex initiatives and is accountable for a variety of tasks to architect and deliver data warehousing solutions that exceed customer expectations in content, usability, accuracy, reliability and performance while assuming a leading role within agile teams (both on-shore and off-shore)

-Interprets business strategy and develops organizational objectives to align with this strategy. Typically manages multiple teams of professionals.

-Development of end-to-end ETL data integration and solution architecture (a brief sketch follows this list)

-Work across technical and business teams to harmonize data assets

-Experience with SQL & BTEQ scripting; strong data management, data analysis, and performance tuning required.

-Experience with Hadoop Hive, Pig, Flume, Sqoop, Storm, Spark, Kafka, Accumulo, and HBase

-Experience with AWS services (EMR, Kinesis, S3, redshift, EC2)

-Candidates may have worked in one or more roles, e.g., data architect, data modeler, data integration developer/architect, ETL developer/architect.

-Experienced solution architect, able to design, develop and implement ODS, EDW, data integration layers, etc.

-Experience with Presto query engine

-Create methods and procedures for operations and administration activities.

-Managing and coordinating DevOps processes.

-Conduct operations readiness and environment compatibility reviews of any changes prior to deployment.

-Experience managing releases and process details

-Knowledge in data warehousing methodologies and best practices required.

-Strong verbal and written communication skills required.

-Effective interpersonal relations skills, ability to effectively collaborate with others and work as part of a team required.

-Skills in navigating a large organization in order to accomplish results required.

-Ability to initiate and follow through on complex projects of both short and long term duration required.

-Excellent organizational and time management skills required.

-Excellent analytical and problem solving skills required.

-Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input and keeps supervisor informed, as required.

-Participate on interdepartmental teams to support organizational goals

-Perform other related duties and tasks as assigned

-Punctual, regular, and consistent attendance
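
As referenced in the responsibilities above, a minimal sketch of the end-to-end ingestion pattern (assuming PySpark with a Teradata JDBC driver on the classpath; the connection details and table names are hypothetical):

    # Pull a relational table over JDBC and land it on the Hadoop side.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder.appName("jdbc-ingest-sketch")
             .enableHiveSupport().getOrCreate())

    orders = (spark.read.format("jdbc")
              .option("url", "jdbc:teradata://tdhost/DATABASE=dw")
              .option("dbtable", "dw.orders")
              .option("user", "etl_user")
              .option("password", "***")
              .load())

    orders.write.mode("append").saveAsTable("lake.orders_raw")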

Required Skills/Experience:

Advanced Degree in a technical discipline and/or management required. 10+ years of experience with data integration, Data Warehouse, and ETL architecture.

-Experience with Teradata and Hadoop Ecosystems.

-Experience working across multiple solution architects, business partners, programs, and projects to harmonize data platforms, processes, integrations, and data assets within Comcast.

-5+ years of experience working as a data integration architect, data solution architect (EDW, ODS), ETL architect, or in a similar role required.

-Five to seven years' experience leading data integration and development of ETL architecture using Linux, Informatica, Teradata, SQL, BTEQ, Hadoop Hive, Pig, Spark, Flume, Sqoop, and UC4

-Experience with AWS services (EMR, Kinesis, S3, redshift, EC2)

-Experience working as solution architect supporting Platform as Service Organization.

-Experience with Presto query engine

-Requires understanding of the complete SDLC and experience with continuous integration, test-driven/behavior-driven development, and agile/scrum development methodologies

-Experience collaborating and partnering with technical and business teams to present and define strategies, requirements, roadmaps, budgets, and solutions.

-Experience presenting strategies, roadmaps, and gap analyses for technology and business solutions, up to VP level

-Managing and coordinating DevOps processes.

-Experience designing logical and physical architectures

-Experience managing teams of senior technologists

-Experience working in an Agile development methodology in a data warehouse environment

-Ability to work effectively across organizational boundaries

-Excellent oral, written, analytical, problem solving, and presentation skills

-Manage and coordinate 1-10 matrix resources.

-Experience with managed services; on-shore and off-shore development experience is a must

Desired Skills/Experience

-Telecommunications experience: knowledge of telecommunication/cable billing and customer care systems, e.g., DST, CSG, Amdocs, etc.

-Knowledge of NoSQL platforms

-Hadoop, Teradata, TOGAF, or AWS certifications

Comcast is an EEO/AA/Drug Free Workplace

Comcast is an EOE/Veterans/Disabled/LGBT employer

  • Hamburg, Germany

System Administration | Hamburg | Full-Time




Are you a data analyst/scientist who wants to develop a deep understanding of the infrastructure behind Big Data systems? We are offering a cross-functional position where you can use your existing data processing skills and become the driving force behind the operation and further development of our data infrastructure.

You will be part of our System Administration department and work very closely with our data engineers and scientists to design and implement architecture changes for our Big Data systems. Our experienced system administrators will introduce you to our infrastructure. Once you are familiar with our data ecosystem, Puppet setup, Serveradmin, and our operational structure, you will take over responsibility for our Big Data infrastructure step by step.

With our data architecture we process more than 1,000,000,000 game interaction events every day. Our Hadoop cluster runs on 1,500 CPUs, 5 TB of memory, and 1 PB of disk space.
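
Back-of-envelope arithmetic in Python (our own estimate, not an official figure) puts that volume in perspective:

    # 1,000,000,000 events/day as a sustained average rate
    events_per_day = 1_000_000_000
    seconds_per_day = 24 * 60 * 60            # 86,400

    avg_rate = events_per_day / seconds_per_day
    print(f"{avg_rate:,.0f} events/s on average")     # ~11,574 events/s
    print(f"{avg_rate / 1500:.1f} events/s per CPU")  # ~7.7 across 1,500 CPUs
    # Peak rates are typically well above this daily average.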



Your profile:



  • Several years' professional experience as a Data Analyst, Scientist or Engineer

  • You understand distributed systems (e.g. CAP, Kappa, Lambda) and the interactions happening in the background (e.g. rebalancing)

  • You know stream processing and data integration and processing (ETL), and can analyze and optimize SQL queries

  • Professional experience in Java and a scripting language, ideally Python

  • Open and friendly communication style and very good English skills

  • You know your way around a Linux command line

  • Knowledge of server hardware and ideally a configuration management tool (like Puppet, Chef or Ansible) would be a huge plus



Your mission:



  • Work together with our Data Engineers and Data Scientists

  • Discuss and implement new technologies

  • Maintain the systems, update, monitor, and debug them

  • Automate and improve the environment



Our technologies:



  • The Hadoop ecosystem (HDFS, Hive, Impala, Spark)

  • Stream processing (Kafka, Flink)

  • Custom data applications with Java, PHP and Go

  • Jenkins for job scheduling and build processes

  • Debian and Puppet for configuration management

  • Supported by Nagios, Graphite, Grafana and Serveradmin



Why join us?



  • Be part of a great team in an international environment in a healthy and stable growing company

  • Choose your preferred device (Linux, Mac or even Windows) for your comfortable workplace

  • We will actively support your further development and give you all needed resources to evaluate new technologies, participate in open source communities or improve your soft skills

  • Competitive compensation and an atmosphere to empower creative thinking and strong results

  • Exceptional benefits ranging from flawless relocation support to company gym, smartphone or tablet of your own choice for personal use, roof terrace with BBQ and much more



Excited to start your journey with InnoGames and join our dynamic team as a Data Engineer Base Technologies (for DevOps)? We look forward to receiving your application as well as your salary expectations and earliest possible start date through our online application form. Isabella Dettlaff would be happy to answer any questions you may have.

InnoGames, based in Hamburg, is one of the leading developers and publishers of online games with more than 200 million registered players around the world. Currently, more than 400 people from 30 nations are working in the Hamburg-based headquarters. We have been characterized by dynamic growth ever since the company was founded in 2007. In order to further expand our success and to realize new projects, we are constantly looking for young talents, experienced professionals, and creative thinkers.





Isabella


Talent Acquisition Manager




Phone +494078893350

  • London Borough of Ealing, UK

Cognizant Big Data Practice:


The Big Data service line is part of the Cognizant Digital Business/Artificial Intelligence and Analytics business unit. It is responsible for delivering digital data platforms and smart data lakes on premise or in the cloud using different Hadoop distributions/NoSQL technologies and the data virtualization products available in the market, and it has built a proprietary Data Platform to deliver end-to-end data services, automation, and framework-driven data processing modules. We are platinum/gold partners for the majority of platform vendors, which gives us seamless access to all product upgrades, discussions, events, etc.


The members of the team work in a variety of industries and use a diverse set of tools to process complex data types: IoT and social data, and data from large operational systems such as SAP, Salesforce, and mainframes. We access a range of data stored in disparate systems, integrate it, and provide it in the formats needed to perform data mining, answering specific business questions as well as identifying unknown trends and relationships in the data.


A Few Snippets:



  • We are one of the fastest-growing and highest-revenue-generating service lines within Cognizant

  • 170+ Big Data engagements across the globe, with over 2,000 Big Data consultants with expertise in various industry products & distributions

  • A repository of 170+ use cases with solutions for different Digital & IoT scenarios

  • Rated as a Leader in Everest Group’s PEAK Matrix “Big Data assessment” for the RCGTH, HC, Banking and Insurance domains

  • Rated a “Leader” in the Gartner Magic Quadrant for Data & Business Analytics Services, Worldwide


As a senior technologist, you will be part of the strategic drive, leading multi-discipline teams through the full delivery lifecycle of complex data products and pipelines with a clear understanding of large-scale data processing and data science solutions. You will deliver use cases for one of our customers and develop practical solutions and implement them to give this customer a competitive edge within the enterprise. You will have breadth of experience in both engineering and architecture across technology disciplines and the unique challenges of each, including software development, automated test and quality assurance, data, and integration.


Your role


Delivering a high-quality and innovative data-driven solution to our client’s advanced analytical needs. This will include, working within a team on the client’s site: understanding the client’s pain points, designing an analytical approach, implementing a solution, ensuring it is of high quality, and leading and mentoring multi-discipline technology teams.


Key Responsibilities:



  • Work at a client site as a member of an experienced onsite consulting/delivery team developing & providing architectures for a highly resilient, fault-tolerant, horizontally scalable data platform.

  • Build multiple layers within the platform around data ingestion, integration, registration, metadata management, error handling & auditing, and data provisioning.

  • Working with Architecture to refine ideas on the Analytics Architecture and help with standards and guidelines.

  • Building a Big Data Engineering community and helping customers/clients to build a centre of excellence for engineering.

  • Working closely with the Lead Software Engineers in your area of responsibility to champion the core tenets of engineering at the client

  • Ensuring development teams deliver the highest quality data products through the adoption and continuous improvement of patterns and practices, tools, frameworks and processes.

  • Applying years of experience and expertise to help teams resolve and overcome technical challenges of any size or complexity and assist where necessary to accelerate problem resolution.


Job Requirements - Essential Skills:



  • Proven, strong data processing skillset with experience in Hadoop tools and techniques; for example (not exhaustive; a sketch follows this list):

  • Spark processing, Streaming and performance tuning

  • Kafka real-time messaging

  • HBase modelling and development

  • HDFS file formats and partitioning, e.g., Parquet, Avro

  • Impala/Hive

  • Unix Shell Scripting

  • Proficiency in Scala

  • Working proficiency in developmental toolsets like Eclipse, IntelliJ

  • Exposure/competence with Agile Development approach

  • Git & Continuous integration tools such as Jenkins, TeamCity

  • Multi-threaded and OOP programming

  • Jenkins/Maven

  • FindBugs, Sonar, JUnit; performance and memory management

  • Strong experience in delivering Data Solutions using Big-Data technologies and cloud platforms.

  • Strong work experience in Azure Cloud platform

  • Expert in at least one of Java, Scala, Python with knowledge of the others

  • Experience in delivering big data solutions using a leading Hadoop distribution like Hortonworks, Cloudera or MapR

  • Knowledge of RDBMS, ETL and Data warehouse technologies 

  • Testing frameworks like JUnit, ScalaTest; mock testing
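
A hedged sketch tying several of the skills above together (assuming PySpark with the spark-sql-kafka package available; the brokers, topic, and paths are hypothetical):

    # Consume a Kafka topic with Spark Structured Streaming and land it
    # on HDFS as Parquet.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker1:9092")
              .option("subscribe", "events")
              .load()
              .selectExpr("CAST(value AS STRING) AS payload"))

    query = (events.writeStream
             .format("parquet")
             .option("path", "/data/landing/events")
             .option("checkpointLocation", "/data/checkpoints/events")
             .start())

    query.awaitTermination()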

Nice to Have Skills:

  • Knowledge and experience in delivering real time solutions on sinks like Cassandra, MongoDB or equivalent NoSQL databases

  • Container technologies like Docker

  • Knowledge of API and Microservice architectures 

  • Knowledge of advanced analytics and insights techniques (e.g. predictive analytics, machine learning, segmentation)

  • Knowledge of deep learning frameworks like TensorFlow, Keras

  • Knowledge of machine learning libraries like MLLIB, sklearn

  • Strong domain experience in the retail/e-commerce and Communications & Technology verticals.


Qualifications: University Degree with a specialization in Computer Science or Mathematics

  • Slough, UK

Job Summary 


AI&A 


Cognizant’s AIA (AI and Analytics) practice is 30,000+ consultants strong and is one of the largest analytics practices in the industry.


Cognizant’s AI solutions are impacting how people learn, eat, shop, travel, work and how businesses provide and receive services. It is accelerating decision making, improving business processes, enhancing user engagement, reducing costs and driving remarkable growth and profitability. Our human-centered, agile approach enables us to deliver superior customer experiences, more intelligent products and smarter business operations in the shortest amount of time.


Your role


Delivering a high-quality and innovative data-driven solution to our client’s advanced analytical needs. This will include, working within a team on the client’s site: understanding the client’s pain points, designing an analytical approach, implementing a solution, ensuring it is of high quality, and leading and mentoring technologists. The immediate opportunity is to work on Network Analytics for a large telecoms provider, where you will be responsible for analysing network and call data sets to discover the topology and call patterns. Based on this, the intent is to derive deep customer insights that can be actioned for better service, customer experience, and upsell/cross-sell opportunities. It will also require identification of quality issues and remediation to improve resilience.


Key Responsibilities:



  • Work at a client site as a member of an experienced onsite solutioning/engineering team developing & providing solutions for an advanced analytics platform

  • Analyse requirements, perform data collection, cleansing, transformation, and loading to populate facts and dimensions for data warehouse

  • Working with Architecture to refine ideas on the Analytics Architecture and help with standards and guidelines.

  • Build multiple layers within the platform around data ingestion, integration, registration, metadata management, error handling & auditing, and data provisioning.

  • Select the appropriate AI/ML algorithms to analyse the data sets to classify, derive insights, predict, and create real-time actionable insights (a sketch follows this list)

  • Ensure development teams deliver the highest quality data products through the adoption and continuous improvement of patterns and practices, tools, frameworks and processes.

  • Apply experience and expertise to help teams resolve and overcome technical challenges and assist where necessary to accelerate problem resolution.
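
As referenced above, a minimal sketch of selecting and applying an ML algorithm with Spark ML (the input table, feature columns, and label are all hypothetical):

    # Train and apply a logistic-regression classifier in a Spark ML pipeline.
    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("spark-ml-sketch").getOrCreate()

    calls = spark.table("network.call_features")     # assumed feature table
    train, test = calls.randomSplit([0.8, 0.2], seed=42)

    assembler = VectorAssembler(
        inputCols=["drop_rate", "latency_ms", "retries"],  # assumed features
        outputCol="features")
    lr = LogisticRegression(featuresCol="features", labelCol="churned")

    model = Pipeline(stages=[assembler, lr]).fit(train)
    model.transform(test).select("churned", "prediction").show(5)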


Job Requirements 



  • Hands-on programming role; candidates should be able to submit a Spark programming assignment as part of pre-screening

  • Deep expertise in Java, Scala, Python, R, Spark, MATLAB, etc.

  • Experience in design and coding enterprise class applications and seeing them through the entire SDLC 

  • In depth understanding of large-scale data, such as data warehouses and their best practice and principles of managing them

  • Capacity for designing solutions around Spark, with advanced skills in high performance and parallelism

  • Experience of integration projects on large-scale environments

  • Experience of Spark ML technical design, development and support

  • Excellent JSON, XML skills and understanding of different Graph DB, RDBMS and Columnar DB 

  • Experience in developing applications for the cloud

  • Excellent verbal and written communication skills

  • Proven experience in working in complex Agile environment


Nice to Have Skills.



  • Knowledge of Telecom industry 

  • Experience in Network Analytics


Qualifications: 



  • University Degree with a specialization in Computer Science, Mathematics or Physics 

  • London, UK

Facebook's mission is to give people the power to build community and bring the world closer together. Through our family of apps and services, we're building a different kind of company that connects billions of people around the world, gives them ways to share what matters most to them, and helps bring people closer together. Whether we're creating new products or helping a small business expand its reach, people at Facebook are builders at heart. Our global teams are constantly iterating, solving problems, and working together to empower people around the world to build community and connect in meaningful ways. Together, we can help people build stronger communities - we're just getting started.

We're looking for Data Scientists to work on Community Integrity. Community Integrity protects our 2B users across the globe from violations such as spam, hate speech, terrorist propaganda and harassment. We do so through advanced detection tools that scan billions of photos, videos, and posts to identify signs of violations. Our Data Scientists partner with ML Engineers on developing features for our ML algorithms, testing, and evaluating their performance. Data Scientists also partner with Market Research to develop an ecosystem view of violations on our platform, where we should focus, and how we can address these violations through a range of responses.

You will enjoy working with one of the strongest data sets in the world, cutting edge technology, and the ability to see your insights turned into real products on a regular basis. You will make a real and meaningful impact on the world. The perfect candidate will have a background in a quantitative or technical field, will have experience working with strong data sets, and will have some experience in data-driven decision making. We are looking for a candidate who is comfortable tackling big problems and cutting through the fog to develop a clear set of product investment priorities. This position is located in our London office.

Competitive salary, including the following benefits: medical, dental, vision, pension, life assurance, childcare, gym, transport, and laundry benefits

Responsibilities

  • Apply your expertise in quantitative analysis, data mining, and the presentation of data to see beyond the numbers and understand how our users interact with both our consumer and business products
  • Partner with Product and Engineering teams to solve problems and identify trends and opportunities
  • Inform, influence, support, and execute our product decisions and product launches
  • The Data Scientist Analytics role works across the following four areas:

    Product Operations
    • Forecasting and setting product team goals
    • Designing and evaluating experiments, including complex network experiments to evaluate the impact of new detection algorithms on the FB ecosystem (see the sketch after this list)
    • Monitoring key product metrics and understanding root causes of changes in metrics
    • Building and analysing dashboards and reports
    • Building key data sets to empower operational and exploratory analysis
    • Evaluating and defining metrics

    Exploratory Analysis
    • Proposing what to build in the next roadmap
    • Understanding ecosystems, user behaviours, and long-term trends
    • Identifying new levers to help move key metrics
    • Building models of user behaviours for analysis or to power production systems

    Product Leadership
    • Influencing product teams through presentation of data-based recommendations
    • Communicating state of business, experiment results, etc. to product teams
    • Spreading best practices to analytics and product teams

    Data Infrastructure
    • Working in Hadoop and Hive primarily, sometimes MySQL, Oracle, and Vertica
    • Automating analyses and authoring pipelines via a SQL- and Python-based ETL framework
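
A minimal sketch of the experiment-evaluation work mentioned above, assuming Python with scipy; the group counts below are invented for illustration:

    # Compare violation prevalence between control and test groups of an
    # experiment with a chi-square test. Counts are hypothetical.
    from scipy.stats import chi2_contingency

    #          [violating, non-violating] impressions
    control = [4_200, 995_800]   # current detection algorithm
    test    = [3_650, 996_350]   # new detection algorithm

    chi2, p_value, dof, expected = chi2_contingency([control, test])
    print(f"chi2 = {chi2:.1f}, p = {p_value:.2e}")
    # A small p-value suggests the new algorithm shifted prevalence;
    # effect-size and network-interference checks would still follow.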

Minimum Qualifications

  • 5+ years of experience doing quantitative analysis.
  • BA/BS in Computer Science, Math, Physics, Engineering, Statistics or another technical field; advanced degree preferred.
  • Experience in SQL or other programming languages.
  • Ability to communicate the results of analyses.
  • Understanding of statistics (e.g., hypothesis testing, regressions).
  • Experience manipulating data sets through statistical software (e.g., R, SAS) or other methods.

Preferred Qualifications

  • Experience with distributed computing (Hive/Hadoop)
  • Development experience in any scripting language (PHP, Python, Perl, etc.)
  • Domain experience in the adversarial space, including risk/fraud management
  • Hamburg, Germany

Node.js Developer (f/m)


Hamburg, Germany


We are looking for a Node.js Developer to join our backend engineering team. The ideal candidate is a hands-on technology enthusiast with significant experience in developing scalable data platforms. You should be a good team player with critical thinking, a keen eye for detail and strong problem-solving skills.


Essential



  • Demonstrated experience in Node.js and Typescript

  • Experience with SQL Databases

  • Experience with test-driven development and automated testing frameworks

  • Familiar with Scrum/Agile development methodologies

  • Experience with building APIs and services using REST


Great if you have



  • Experience with NoSQL technologies like MongoDB, ElasticSearch and Redis

  • Experience with infrastructure automation technologies like Docker

  • Experience using Streams and Message Queues like NATS and Kafka

  • Experience with monitoring technologies like Grafana, Prometheus and Logstash

  • Experience with C# or Scala is a plus


Our Offer



  • An international and dynamic company culture in the city center of Hamburg, the Northern capital of Germany

  • Responsibility from Day One in a motivated team

  • Flexible working hours, grants for public transport, free drinks & fruits, cruise & shopping discounts, company sports, team & company events on a regular basis