OnlyDataJobs.com

SafetyCulture
  • Surry Hills, Australia
  • Salary: A$120k - 140k

The Role



  • Be an integral member of the team responsible for designing, implementing and maintaining a distributed, big-data-capable system built from high-quality components (Kafka, EMR + Spark, Akka, etc.).

  • Embrace the challenge of dealing with big data on a daily basis (Kafka, RDS, Redshift, S3, Athena, Hadoop/HBase), perform data ETL, and build tools for proper data ingestion from multiple data sources.

  • Collaborate closely with data infrastructure engineers and data analysts across different teams to find bottlenecks and solve problems

  • Design, implement and maintain the heterogeneous data processing platform to automate the execution and management of data-related jobs and pipelines

  • Implement automated data workflows in collaboration with data analysts, and maintain and improve these systems in line with growth

  • Collaborate with Software Engineers on application events, ensuring the right data can be extracted

  • Contribute to resource management for computation and capacity planning

  • Dive deep into code and constantly innovate
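
The ingestion and ETL duties above can be sketched in miniature. This is an illustrative, hypothetical example only: the sources, field names, and target schema are invented for the sketch, not taken from any real SafetyCulture system.

```python
from datetime import datetime, timezone

def normalize(record, source):
    """Map a source-specific raw record onto one common event schema."""
    if source == "kafka":
        # Hypothetical shape of a message consumed from a Kafka topic.
        return {
            "event_id": record["id"],
            "ts": datetime.fromtimestamp(record["ts_epoch"], tz=timezone.utc).isoformat(),
            "payload": record["body"],
        }
    if source == "s3_export":
        # Hypothetical shape of a row from an S3 batch export.
        return {
            "event_id": record["EventId"],
            "ts": record["Timestamp"],  # assumed to be ISO-8601 already
            "payload": record["Data"],
        }
    raise ValueError(f"unknown source: {source}")

rows = [
    normalize({"id": "a1", "ts_epoch": 1550000000, "body": {"k": 1}}, "kafka"),
    normalize({"EventId": "b2", "Timestamp": "2019-02-12T00:00:00+00:00",
               "Data": {"k": 2}}, "s3_export"),
]
```

In a production pipeline the same normalization step would sit behind a Kafka consumer or a Spark job rather than plain function calls.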


Requirements



  • Experience with AWS data technologies (EC2, EMR, S3, Redshift, ECS, Data Pipeline, etc) and infrastructure.

  • Working knowledge of big data frameworks such as Apache Spark, Kafka, ZooKeeper, Hadoop, Flink, Storm, etc.

  • Rich experience with Linux and database systems

  • Experience with relational and NoSQL databases, query optimization, and data modelling

  • Familiarity with one or more of the following: Scala/Java, SQL, Python, Shell, Golang, R, etc.

  • Experience with container technologies (Docker, k8s), Agile development, DevOps and CI tools.

  • Excellent problem-solving skills

  • Excellent verbal and written communication skills 

ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the world's largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The Sr. Analytics Analyst will be part of the Production, Drilling, and Projects Analytics Services Team within the Analytics Innovation Center of Excellence that enables data analytics across the ConocoPhillips global enterprise. This role works with business units and global functions to help strategically design, implement, and support data analytics solutions. This is a full-time position that provides tremendous career growth potential within ConocoPhillips.
Responsibilities May Include
  • Complete end-to-end delivery of data analytics solutions to the end user
  • Interacting closely with both business and developers while gathering requirements, designing, testing, implementing and supporting solutions
  • Gather business and technical specifications to support analytic, report and database development
  • Collect, analyze and translate user requirements into effective solutions
  • Build report and analytic prototypes based on initial business requirements
  • Provide status on the issues and progress of key business projects
  • Providing regular reporting on the performance of data analytics solutions
  • Delivering regular updates and maintenance on data analytics solutions
  • Championing the data analytics solutions and technologies at ConocoPhillips
  • Integrate data for data models used by the customers
  • Deliver Data Visualizations used for data driven decision making
  • Provide strategic technology direction while supporting the needs of the business
Basic/Required
  • Legally authorized to work in the United States
  • 5+ years of related IT experience
  • 5+ years of Structured Query Language (SQL) experience (ANSI SQL, T-SQL, PL/SQL)
  • 3+ years of hands-on experience delivering solutions with analytics tools (e.g. Spotfire, SSRS, Power BI, Tableau, Business Objects)
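
The SQL requirement above can be illustrated with a toy, self-contained query. The table and columns are invented for the example; the query itself is plain ANSI SQL, run here through SQLite.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE production (well TEXT, day TEXT, boe REAL)")
conn.executemany(
    "INSERT INTO production VALUES (?, ?, ?)",
    [("W-1", "2019-02-01", 120.0),
     ("W-1", "2019-02-02", 110.0),
     ("W-2", "2019-02-01", 95.0)],
)
# An ANSI SQL aggregate: total barrels of oil equivalent per well.
rows = conn.execute(
    "SELECT well, SUM(boe) AS total_boe "
    "FROM production GROUP BY well ORDER BY total_boe DESC"
).fetchall()
```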
Preferred
  • Bachelor's Degree in Information Technology or Computer Science
  • 5+ years of Oil and Gas Industry experience
  • 5+ years hands-on experience delivering solutions with Informatica PowerCenter
  • 5+ years architecting data warehouses and/or data lakes
  • 5+ years with Extract Transform and Load (ETL) tools and best practices
  • 3+ years hands-on experience delivering solutions with Teradata
  • 1+ years developing analytics models with R or Python
  • 1+ years developing visualizations using R or Python
  • Experience with Oracle (11g, 12c) and SQL Server (2008 R2, 2010, 2016) and Teradata 15.x
  • Experience with Hadoop technologies (Hortonworks, Cloudera, SQOOP, Flume, etc.)
  • Experience with AWS technologies (S3, SageMaker, Athena, EMR, Redshift, Glue, etc.)
  • Thorough understanding of BI/DW concepts, proficient in SQL, and data modeling
  • Familiarity with ETL tools (Informatica, etc.) and ETL processes
  • Solutions oriented individual; learn quickly, understand complex problems, and apply useful solutions
  • Ability to work in a fast-paced environment independently with the customer
  • Ability to work as a team player
  • Ability to work with business and technology users to define and gather reporting and analytics requirements
  • Strong analytical, troubleshooting, and problem-solving skills; experience in analyzing and understanding business/technology system architectures, databases, and client applications to recognize, isolate, and resolve problems
  • Demonstrates the desire and ability to learn and utilize new technologies in data analytics solutions
  • Strong communication and presentation skills
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of February 20, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Feb 13, 2019, 4:56:49 PM
Comcast
  • Englewood, CO

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Job Summary:

Software engineering skills combined with the demands of a high volume, highly-visible analytics platform make this an exciting challenge for the right candidate.

Are you passionate about digital media, entertainment, and software services? Do you like big challenges and working within a highly motivated team environment?

As a software engineer in the Data Experience (DX) team, you will research, develop, support, and deploy solutions in real-time distributed computing architectures. The DX big data team is a fast-moving team of world-class experts who are innovating to provide user-driven, self-service tools for making sense of, and making decisions with, high volumes of data. We are a team that thrives on big challenges, results, quality, and agility.

Who does the data engineer work with?

The Big Data software engineering group is a diverse collection of professionals who work with a variety of teams: other software engineering teams whose software integrates with analytics services, service delivery engineers who provide support for our product, testers, operational stakeholders with all manner of information needs, and executives who rely on big data for data-backed decision making.

What are some interesting problems you'll be working on?

Develop systems capable of processing millions of events per second and multi-billions of events per day, providing both a real-time and historical view into the operation of our wide-array of systems. Design collection and enrichment system components for quality, timeliness, scale and reliability. Work on high-performance real-time data stores and a massive historical data store using best-of-breed and industry-leading technology.
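
At its core, the collection-and-enrichment component described above streams each event past a reference lookup. A minimal sketch, with invented device and field names (a production system would do this in a distributed stream processor, not a single process):

```python
# Small reference table the stream is joined against; contents are invented.
DEVICE_INFO = {"stb-1": {"market": "Denver"}, "stb-2": {"market": "Atlanta"}}

def enrich(events, reference):
    """Yield each event merged with its reference-table attributes."""
    for event in events:
        info = reference.get(event["device"], {})
        yield {**event, **info}  # enriched record; unknown devices pass through

raw = [{"device": "stb-1", "kind": "tune"},
       {"device": "stb-2", "kind": "pause"},
       {"device": "stb-9", "kind": "tune"}]  # no reference entry for stb-9
enriched = list(enrich(raw, DEVICE_INFO))
```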

Where can you make an impact?

Comcast DX is building the core components needed to drive the next generation of data platforms and data processing capability. Running this infrastructure, identifying trouble spots, and optimizing the overall user experience is a challenge that can only be met with a robust big data architecture capable of providing insights that would otherwise be drowned in an ocean of data.

Success in this role is best enabled by a broad mix of skills and interests ranging from traditional distributed systems software engineering prowess to the multidisciplinary field of data science.

Responsibilities:

  • Develop solutions to big data problems utilizing common tools found in the ecosystem.
  • Develop solutions to real-time and offline event collecting from various systems.
  • Develop, maintain, and perform analysis within a real-time architecture supporting large amounts of data from various sources.
  • Analyze massive amounts of data and help drive prototype ideas for new tools and products.
  • Design, build and support APIs and services that are exposed to other internal teams
  • Employ rigorous continuous delivery practices managed under an agile software development approach
  • Ensure a quality transition to production and solid production operation of the software

Skills & Requirements:

  • 5+ years programming experience
  • Bachelor's or Master's degree in Computer Science, Statistics or a related discipline
  • Experience in software development of large-scale distributed systems including proven track record of delivering backend systems that participate in a complex ecosystem.
  • Experience working on big data platforms in the cloud or on traditional Hadoop platforms
  • AWS Core
    • Kinesis
    • IAM
    • S3/Glacier
    • Glue
    • DynamoDB
    • SQS
    • Step Functions
    • Lambda
    • API Gateway
    • Cognito
    • EMR
    • RDS/Aurora
    • CloudFormation
    • CloudWatch
  • Languages
    • Python
    • Scala/Java
  • Spark
    • Batch, Streaming, ML
    • Performance tuning at scale
  • Hadoop
    • Hive
    • HiveQL
    • YARN
    • Pig
    • Sqoop
    • Ranger
  • Real-time Streaming
    • Kafka
    • Kinesis
  • Data File Formats
    • Avro, Parquet, JSON, ORC, CSV, XML
  • NoSQL / SQL
  • Microservice development
  • RESTful API development
  • CI/CD pipelines
    • Jenkins / GoCD
    • AWS
      • CodeCommit
      • CodeBuild
      • CodeDeploy
      • CodePipeline
  • Containers
    • Docker / Kubernetes
    • AWS
      • Lambda
      • Fargate
      • EKS
  • Analytics
    • Presto / Athena
    • QuickSight
    • Tableau
  • Test-driven development/test automation, continuous integration, and deployment automation
  • Enjoy working with data: data analysis, data quality, reporting, and visualization
  • Good communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly.
  • Great design and problem solving skills, with a strong bias for architecting at scale.
  • Adaptable, proactive and willing to take ownership.
  • Keen attention to detail and high level of commitment.
  • Good understanding of any of: advanced mathematics, statistics, and probability.
  • Experience and comfort working in agile/iterative development and delivery environments, where requirements change quickly and the team must constantly adapt to moving targets.
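
Of the interchange formats listed above, two can be demonstrated with the Python standard library alone; Avro, Parquet, and ORC need third-party libraries and are omitted. Note how CSV flattens everything to strings while JSON preserves types (the records are invented for the example):

```python
import csv
import io
import json

records = [{"user": "u1", "events": 3}, {"user": "u2", "events": 7}]

# CSV: flat rows under a header; every value comes back as a string.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["user", "events"])
writer.writeheader()
writer.writerows(records)
from_csv = list(csv.DictReader(io.StringIO(buf.getvalue())))

# JSON: round-trips types and nesting intact.
from_json = json.loads(json.dumps(records))
```

This type-fidelity difference is one reason schema-carrying formats like Avro and Parquet are preferred for pipeline interchange.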

About Comcast DX (Data Experience):

Data Experience (DX) is a results-driven data platform research and engineering team responsible for delivering the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. The mission of DX is to gather, organize, and make sense of Comcast data, and to make it universally accessible in order to empower, enable, and transform Comcast into an insight-driven organization. Members of the DX team define and leverage industry best practices, work on extremely large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, and research, engineer, and apply data science and machine intelligence disciplines.

Comcast is an EOE/Veterans/Disabled/LGBT employer

Applied Resource Group
  • Atlanta, GA

Applied Resource Group is seeking a talented and experienced Data Engineer for our client, an emerging leader in the transit solutions space. As an experienced Data Engineer on the Data Services team, you will lead the design, development and maintenance of comprehensible data pipelines and distributed systems for data extraction, analysis, transformation, modelling and visualization. They're looking for independent thinkers who are passionate about technology and building solutions that continually improve the customer experience. Excellent communication skills and the ability to work collaboratively with teams are critical.
 

Job Duties/Responsibilities:

    • Building a unified data services platform from scratch, leveraging the most suitable Big Data tools following technical requirements and needs
    • Exploring and working with cutting edge data processing technologies
    • Work with distributed, scalable cloud-based technologies
    • Collaborating with a talented team of Software Engineers working on product development
    • Designing and delivering BI solutions to meet a wide range of reporting needs across the organization
    • Providing and maintaining up to date documentation to enable a clear outline of solutions
    • Managing task lists and communicating updates to stakeholders and team members following Agile Scrum methodology
    • Working as a key member of the core team to support the timely and efficient delivery of critical data solutions

 
Experience Needed:
 

    • Experience with AWS technologies is desired, especially those used for data analytics, including some of: EMR, Glue, Data Pipeline, Lambda, Redshift, Athena, Kinesis, ElastiCache, Aurora
    • Minimum of 5 years' experience developing and building data solutions
    • Experience as an ETL/Data warehouse developer with knowledge in design, development and delivery of end-to-end data integration processes
    • Deep understanding of data storage technologies for structured and unstructured data
    • Background in programming and knowledge of programming languages such as Java, Scala, Node.js, Python.
    • Familiarity with cloud services (AWS, Azure, Google Cloud)
    • Experience using Linux as a primary development environment
    • Knowledge of big data systems (Hadoop, Pig, Hive, Shark/Spark, etc.) a big plus.
    • Knowledge of BI platforms such as Tableau, Jaspersoft etc.
    • Strong communication and analytical skills
    • Capable of working independently under the direction of the Head of Data Services
    • Excellent communication, analytical and problem-solving skills
    • Ability to initially take direction and then work on own initiative
    • Experience working in AGILE

 
Nice-to-have experience and skills:

    • Master's in Computer Science, Computer Engineering or equivalent
    • Building data pipelines to perform real-time data processing using Spark Streaming and Kafka, or similar technologies.
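
At its core, the kind of real-time processing mentioned above computes aggregates per time window. A minimal tumbling-window sketch over an in-memory stream (a real job would consume a Kafka topic via Spark Streaming; the timestamps and keys here are invented):

```python
from collections import Counter

WINDOW = 60  # tumbling window size in seconds

def window_counts(events):
    """Count events per (window_start, key), for (epoch_seconds, key) pairs."""
    counts = Counter()
    for ts, key in events:
        window_start = ts // WINDOW * WINDOW  # floor to window boundary
        counts[(window_start, key)] += 1
    return counts

stream = [(0, "bus-route-7"), (30, "bus-route-7"),
          (75, "bus-route-7"), (80, "bus-route-9")]
counts = window_counts(stream)
```

A streaming engine adds the hard parts on top of this core: out-of-order arrival, state checkpointing, and emitting windows as they close.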
Perficient, Inc.
  • Dallas, TX

At Perficient, you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too.

We're proud to be publicly recognized as a Top Workplace year after year. This is due, in no small part, to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.

About Our Data Governance Practice:


We provide exceptional data integration services in the ETL, Data Catalog, Data Quality, Data Warehouse, Master Data Management (MDM), Metadata Management & Governance space.

Perficient currently has a career opportunity for a Python Developer who resides in the vicinity of Jersey City, NJ or Dallas, TX.

Job Overview:

As a Python developer, you will participate in all aspects of the software development lifecycle, which includes estimating, technical design, implementation, documentation, testing, deployment, and support of applications developed for our clients. As a member of a team, you will take direction from solution architects and leads on development activities.


Required skills:

  • 6+ years of experience in architecting, building and maintaining software platforms and large-scale data infrastructures in a commercial or open source environment
  • Excellent knowledge of Python
  • Good knowledge of and hands on experience working with quant/data Python libraries (pandas/numpy etc)
  • Good knowledge of and hands on experience designing APIs in Python (using Django/Flask etc)
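
A minimal illustration of the pandas requirement above, assuming pandas is installed; the frame and its column names are invented for the example.

```python
import pandas as pd

# Toy dataset: daily P&L per trading desk (values are made up).
df = pd.DataFrame({
    "desk": ["rates", "rates", "fx"],
    "pnl": [1.5, -0.5, 2.0],
})

# A split-apply-combine aggregation: total P&L per desk.
summary = df.groupby("desk")["pnl"].sum()
```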

Nice to have skills (in the order of priority):

  • Comfort with and hands-on experience of AWS cloud services (S3, EC2, EMR, Lambda, Athena, QuickSight, etc.) and EMR tools (Hive, Zeppelin, etc.)
  • Experience building and optimizing big data pipelines, architectures and data sets.
  • Hands on experience in Hadoop MapReduce or other big data technologies and pipelines (Hadoop, Spark/pyspark, MapReduce, etc.)
  • Bash Scripting
  • Understanding of Machine Learning and Data Science processes and techniques
  • Experience in Java / Scala


Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities, and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues with great benefits are just part of what makes Perficient a great place to work.

ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the world's largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The Sr. Analytics Analyst will be part of the Production, Drilling, and Projects Analytics Services Team within the Analytics Innovation Center of Excellence that enables data analytics across the ConocoPhillips global enterprise. This role works with business units and global functions to help strategically design, implement, and support data analytics solutions. This is a full-time position that provides tremendous career growth potential within ConocoPhillips.
Responsibilities May Include
  • Complete end-to-end delivery of data analytics solutions to the end user
  • Interacting closely with both business and developers while gathering requirements, designing, testing, implementing and supporting solutions
  • Gather business and technical specifications to support analytic, report and database development
  • Collect, analyze and translate user requirements into effective solutions
  • Build report and analytic prototypes based on initial business requirements
  • Provide status on the issues and progress of key business projects
  • Providing regular reporting on the performance of data analytics solutions
  • Delivering regular updates and maintenance on data analytics solutions
  • Championing the data analytics solutions and technologies at ConocoPhillips
  • Integrate data for data models used by the customers
  • Deliver Data Visualizations used for data driven decision making
  • Provide strategic technology direction while supporting the needs of the business
Basic/Required
  • Legally authorized to work in the United States
  • 5+ years of Structured Query Language (SQL) experience (ANSI SQL, T-SQL, PL/SQL)
  • 5+ years of hands-on experience delivering solutions with analytics tools (e.g. Spotfire, SSRS, Power BI, Tableau, Business Objects)
  • 5+ years of Oil and Gas Industry experience
  • 7+ years of related IT experience
Preferred
  • Bachelor's Degree in Information Technology or Computer Science
  • 5+ years hands-on experience delivering solutions with Informatica PowerCenter
  • 5+ years architecting data warehouses and/or data lakes
  • 5+ years with Extract Transform and Load (ETL) tools and best practices
  • 3+ years hands-on experience delivering solutions with Teradata
  • 1+ years developing analytics models with R or Python
  • 1+ years developing visualizations using R or Python
  • Experience with Oracle (11g, 12c) and SQL Server (2008 R2, 2010, 2016) and Teradata 15.x
  • Experience with Hadoop technologies (Hortonworks, Cloudera, SQOOP, Flume, etc.)
  • Experience with AWS technologies (S3, SageMaker, Athena, EMR, Redshift, Glue, etc.)
  • Thorough understanding of BI/DW concepts, proficient in SQL, and data modeling
  • Familiarity with ETL tools (Informatica, etc.) and ETL processes
  • Solutions oriented individual; learn quickly, understand complex problems, and apply useful solutions
  • Ability to work in a fast-paced environment independently with the customer
  • Ability to work as a team player
  • Ability to work with business and technology users to define and gather reporting and analytics requirements
  • Strong analytical, troubleshooting, and problem-solving skills; experience in analyzing and understanding business/technology system architectures, databases, and client applications to recognize, isolate, and resolve problems
  • Demonstrates the desire and ability to learn and utilize new technologies in data analytics solutions
  • Strong communication and presentation skills
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of February 18, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Feb 5, 2019, 3:45:16 PM
The Wellcome Trust Sanger Institute
  • Cambridge, UK




Salary range: £36,000-£44,000 per annum depending on experience plus excellent benefits. Fixed Term Contract for 3 Years.

Open Targets has recently launched Open Targets Genetics (https://genetics.opentargets.org), a portal that aggregates large scale GWAS data with functional genomics data to identify potential drug targets at disease-associated loci.

A Statistical Geneticist role, funded by Open Targets, is available at the Wellcome Sanger Institute in a new team under the leadership of Dr. Maya Ghoussaini. This is an exciting opportunity for you to participate in the enhancement of the existing Open Targets Genetics Portal through the development of new functionality and features.

You will actively engage in the integration of new eQTL datasets and tissue-specific chromatin interaction datasets.

You will have the opportunity to work across a range of analysis such as:
  • Aggregate large-scale GWAS data from multiple consortia and across a wide range of diseases and traits.
  • Perform association analysis on UK Biobank data with a particular focus on therapeutic areas important for Open Targets
  • Work together with other members of the Open Targets team on statistical genetics analyses of large-scale sequence data
  • Work with existing members of the team to integrate genetic and cell-specific genomic data to identify and validate causal links between targets and diseases and improve the Genetics Portal.
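
A small example of the kind of calculation underlying the association analyses above: the allelic odds ratio for a single variant, computed from a 2x2 table of case/control allele counts. The counts are made up for illustration, not real data.

```python
import math

def odds_ratio(case_alt, case_ref, ctrl_alt, ctrl_ref):
    """Allelic odds ratio from case/control alt/ref allele counts."""
    return (case_alt / case_ref) / (ctrl_alt / ctrl_ref)

# Hypothetical counts: 300/700 alt/ref alleles in cases, 200/800 in controls.
orr = odds_ratio(case_alt=300, case_ref=700, ctrl_alt=200, ctrl_ref=800)
log_or = math.log(orr)  # log odds ratio, the usual GWAS effect-size scale
```

Real GWAS pipelines estimate this via logistic regression with covariates, but the odds ratio is the quantity being reported either way.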


We welcome candidates with a background in statistical genetics or a relevant discipline, with advanced programming skills suitable for statistical genetic analyses of complex diseases. Experience in functional genomics data analysis is highly desirable. You will have the opportunity to interact with active computational and experimental research teams using cutting-edge genomic techniques.

Essential Skills

  • PhD in Statistical Genetics, Computational Biology or a closely related discipline.
  • Advanced level programming skills suitable for statistical genetic analyses, such as R, Python, MATLAB.
  • Firm grounding in statistical methods of complex disease genetics such as genome wide association studies, fine-mapping, high-throughput expression data, whole exome/genome sequencing, PheWAS, Mendelian Randomisation.
  • Previous experience in working with large-scale genetic datasets.
  • Ability to work to tight timelines.
  • Demonstrable good project management and organisational skills.
  • Fluent in written and spoken English.
  • Ability to communicate ideas and results effectively.
  • Ability to work independently and organise own workload.


Ideal Skills

  • Experience in functional genomics data analysis (RNAseq, ChIPseq, etc);
  • Experience with generating reproducible bioinformatics pipelines;
  • A strong track record in preparing publications and other written materials;
  • Interest in target validation and translational research.


Other information



Open Targets is a pioneering public-private initiative between GlaxoSmithKline (GSK), Biogen, Takeda, Celgene, Sanofi, EMBL-EBI (European Bioinformatics Institute) and the WSI (Wellcome Sanger Institute), located on the Wellcome Genome Campus in Hinxton, near Cambridge, UK.

Open Targets aims to generate evidence on the biological validity of therapeutic targets and provide an initial assessment of the likely effectiveness of pharmacological intervention on these targets, using genome-scale experiments and analysis. Open Targets aims to provide an R&D framework that applies to all aspects of human disease, to improve the success rate for discovering new medicines and share its data openly in the interests of accelerating drug discovery.

Genome Research Limited is an Equal Opportunity employer. As part of our dedication to gender equality and promoting women's careers in science, we hold an Athena SWAN Bronze Award. We will consider all qualified applicants without discrimination on grounds of disability, sexual orientation, pregnancy or maternity leave status, race or national or ethnic origin, age, religion or belief, gender identity or re-assignment, marital or civil partnership status, protected veteran status (if applicable) or any other characteristic protected by law.

Please include a covering letter and CV with your application

Closing date: 28th February; however, applications will be reviewed on an ongoing basis and therefore the post may be filled before the deadline.
ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the worlds largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The Sr. Analytics Analyst will be part of the Production, Drilling, and Projects Analytics Services Team within the Analytics Innovation Center of Excellence that enables data analytics across the ConocoPhillips global enterprise. This role works with business units and global functions to help strategically design, implement, and support data analytics solutions. This is a full-time position that provides tremendous career growth potential within ConocoPhillips.
Responsibilities May Include
  • Completing end-to-end delivery of data analytics solutions to the end user
  • Interacting closely with both business and developers while gathering requirements, designing, testing, implementing and supporting solutions
  • Gathering business and technical specifications to support analytic, report and database development
  • Collecting, analyzing and translating user requirements into effective solutions
  • Building report and analytic prototypes based on initial business requirements
  • Providing status on the issues and progress of key business projects
  • Providing regular reporting on the performance of data analytics solutions
  • Delivering regular updates and maintenance on data analytics solutions
  • Championing data analytics solutions and technologies at ConocoPhillips
  • Integrating data for data models used by customers
  • Delivering data visualizations used for data-driven decision making
  • Providing strategic technology direction while supporting the needs of the business
Basic/Required
  • Legally authorized to work in the United States
  • 5+ years of Structured Query Language experience (ANSI SQL, T-SQL, PL/SQL)
  • 5+ years of hands-on experience delivering solutions with analytics tools (e.g. Spotfire, SSRS, Power BI, Tableau, Business Objects)
  • 5+ years of Oil and Gas Industry experience
  • 7+ years of related IT experience
Preferred
  • Bachelor's Degree in Information Technology or Computer Science
  • 5+ years hands-on experience delivering solutions with Informatica PowerCenter
  • 5+ years architecting data warehouses and/or data lakes
  • 5+ years with Extract Transform and Load (ETL) tools and best practices
  • 3+ years hands-on experience delivering solutions with Teradata
  • 1+ years developing analytics models with R or Python
  • 1+ years developing visualizations using R or Python
  • Experience with Oracle (11g, 12c) and SQL Server (2008 R2, 2010, 2016) and Teradata 15.x
  • Experience with Hadoop technologies (Hortonworks, Cloudera, SQOOP, Flume, etc.)
  • Experience with AWS technologies (S3, SageMaker, Athena, EMR, Redshift, Glue, etc.)
  • Thorough understanding of BI/DW concepts, proficient in SQL, and data modeling
  • Familiarity with ETL tools (Informatica, etc.) and ETL processes
  • Solutions-oriented individual who learns quickly, understands complex problems, and applies useful solutions
  • Ability to work in a fast-paced environment independently with the customer
  • Ability to work as a team player
  • Ability to work with business and technology users to define and gather reporting and analytics requirements
  • Strong analytical, troubleshooting, and problem-solving skills; experience in analyzing and understanding business/technology system architectures, databases, and client applications to recognize, isolate, and resolve problems
  • Demonstrates the desire and ability to learn and utilize new technologies in data analytics solutions
  • Strong communication and presentation skills
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of February 18, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Feb 5, 2019, 3:45:16 PM
Mobimeo GmbH
  • Berlin, Germany

Our Mobimeo Engineering Team:


We are working together in agile autonomous cross-functional teams that take full responsibility for their own part of the Mobimeo ecosystem.


We minimize reinventing the wheel and maximize our use of open-source (Play!, Akka, Spring, Elastic, Kibana, Zipkin), commercial (StatusCake, Runscope, Datadog, PagerDuty) and AWS offerings as appropriate (API Gateway, S3, Athena, Lambda, EMR, RDS, SES, EC2, CloudWatch).


We believe in continuous delivery through high automation and DevOps culture (Gitlab, Helm, Kubernetes, Terraform, Docker).


At the core of our products are sophisticated ML algorithms which give us unique insights to end-user mobility behavior. Our team is extremely data driven, making extensive use of A/B testing and ML to launch and ramp new experiments.


We use a variety of languages, but most of our reactive services are written in Scala or Kotlin.


We encourage a culture of learning, trust, constructive feedback and career growth.


YOUR RESPONSIBILITIES



  • Build digital mobility applications from the ground up

  • Hire and lead an engineering team that will be responsible for developing one of our products

  • Manage priorities and conflicts; develop and execute a mid- and long-term technical vision

  • Work alongside product managers, UI/UX designers and data scientists

  • Develop high performance, scalable systems

  • Work across different technologies and platforms


YOUR PROFILE



  • Effective people manager who likes to remain close to code

  • Extensive experience building applications in Java, Scala or Kotlin

  • Deep understanding of software engineering best practices

  • Interest in deploying machine learning based systems

  • DevOps mindset - we automate everything


WHY MOBIMEO



  • Early stage - Build a product from the ground up

  • Help create seamless travel experiences for tens of millions of users

  • Stable - Backed by a leading German transportation company

  • Develop a sophisticated product leveraging bleeding edge technologies

  • Join a diverse and highly experienced engineering team

  • Yearly personnel development budget

  • Weekly team lunches as well as free drinks and snacks

  • A highly collaborative, engineering driven environment

  • Nice office space in Berlin Kreuzberg

Eliassen Group
  • Headquarters: West McLean, VA
URL: https://www.eliassen.com/

Eliassen Group is partnering with a client who was the first company to develop and offer mass customization and personalization of credit card products, and they have been innovating relentlessly ever since. Today, they are a nationally recognized brand, a top-ten bank, and a scientific laboratory on a journey to become a leading high-tech company and digital innovator touching millions of customer accounts.

Our client is currently seeking a Senior Big Data / Machine Learning Engineer to join their dedicated team. We will accept corp-to-corp or W2 contractors. For our W2 consultants, we offer a great benefits package that includes medical, dental, and vision benefits, 401k with company matching, and life insurance.

Responsibilities of the Senior Big Data / Machine Learning Engineer:
  • Work cross-functionally in an Agile team to develop innovative software solutions
  • Design and build cloud-based applications - AWS, Azure, GCP
  • Leverage DevOps practices and tools - CI/CD, TDD, Jenkins, Maven, Git, Docker
  • Perform comprehensive unit testing and engage in peer reviews to ensure high-caliber code

Requirements of the Senior Big Data / Machine Learning Engineer:
  • Bachelor's degree in Computer Science or equivalent technical discipline
  • Around 5 years of programming (Java/Python, C++, Scala, or Golang) and data streaming or data warehousing experience (Spark, Flink, Kafka; Presto, AWS Athena)
  • Solid experience developing data pipelines with Apache Spark; Linux-based OS (RHEL); Shell/Python/Perl scripting
  • Experience in cloud environments, AWS preferred
  • Experience with columnar data stores and MPP

To apply: https://www.eliassen.com/
FlixBus
  • München, Germany

Your tasks - Paint the world green



  • You are our truffle hunter, who quickly finds a way through complex datasets and recognizes opportunities to grow and to optimize

  • You filter growth relevant information from our BI/tracking systems and convert complexity into simplicity

  • You analyze our data systems and algorithms for customer relationship management and customer behavior modeling; you solve our complex business challenges with creative insights

  • You use statistical models to answer real-life questions such as: “What are the effects of price changes, and how should marketing spend (e.g. bidding) be adjusted?” and “What is the optimal marketing budget allocation to maximize revenue and profits?”

  • You work closely with Yield Management, Business Intelligence and Backend Development teams on solutions for data-driven decision making

  • You use machine learning algorithms to forecast current/future business and marketing performance
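
The price-change question above is often approached with a constant-elasticity (log-log) demand model. A minimal sketch, in pure Python with purely illustrative numbers (none of these figures come from FlixBus):

```python
import math

def price_elasticity(p0, q0, p1, q1):
    """Arc elasticity under a constant-elasticity (log-log) demand model:
    elasticity = d(ln q) / d(ln p)."""
    return (math.log(q1) - math.log(q0)) / (math.log(p1) - math.log(p0))

def demand_after_change(q0, p0, p1, elasticity):
    """Predicted demand after a price change, given the elasticity."""
    return q0 * (p1 / p0) ** elasticity

# Hypothetical observation: a 10% price increase that cut bookings by ~15%.
e = price_elasticity(10.0, 1000, 11.0, 850)
print(round(e, 2))  # elasticity is negative for normal goods
print(round(demand_after_change(1000, 10.0, 11.0, e)))  # recovers 850
```

In practice this kind of estimate would come from a regression over many price/demand observations rather than a single pair of points, but the log-log relationship is the same.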


Your Profile - Ready to hop on board



  • Profound experience in a relevant role, minimum 2 years

  • Bachelor's/Master's degree in International Business, Information Systems, Media Studies, Statistics, Data Science or similar

  • Strong analytical skills and mindset as well as strong communication and presentation skills

  • Hands-on experience with various machine learning algorithms and data visualization tools (knowledge of Microsoft Power BI, QlikView or similar is a plus)

  • Fluency in Python is a must (knowledge of R would be a plus)

  • Data analytics libraries (Pandas, NumPy, Matplotlib...)

  • Experience with API usage

  • Experience with version control (Git)

  • Experience with SQL & relational DBs (NoSQL DBs and big data tools (Spark, Hadoop, etc.) would be a plus)

  • Knowledge of the AWS stack (Redshift, S3, Athena, Glue...) is a plus

  • Knowledge of yield management/revenue management as well as know-how in Google Analytics, Webtrekk and Google Tag Manager is a plus

  • Experience with an Agile development framework is a plus

  • Fluent in English; every other European language is a plus

eHire
  • Atlanta, GA
Job Description
Data Engineer
Atlanta, GA
Seeking a Data Engineer II who is ready to take the next step in their career. If you are ready to learn, collaborate, build solutions, and operate a born-in-the-cloud, serverless architecture to deliver the data foundation for a multi-billion-dollar revenue company, then we want you!
As a Data Engineer II supporting our client's platforms, and working within our Scaled Agile Framework, you will be responsible for the delivery of strategic, cloud-based analytics data solutions. A successful candidate is a solid Python developer who has a passion for learning new skills and a desire to leverage cloud solutions for enterprise-scale data processing and analytics.
Technology Stack: AWS. Services leveraged are S3, Glue (Python), DMS, EMR (Python), Lambda (Python), SNS, Step Functions, RDS (Oracle), Athena, Redshift, Redshift Spectrum. Supporting components of our stack are Mulesoft (integrations, APIs), Splunk (monitoring, alerting) and Terraform (IaC).
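As a rough, purely illustrative sketch (pure Python, no AWS dependencies; all field names are hypothetical, not the client's actual schema), a Glue or Lambda transform step in a stack like this typically normalizes field names, casts types, and drops records that fail basic validation:

```python
from datetime import date

def transform(raw_rows):
    """Hypothetical transform step: normalize fields, cast types,
    and reject rows that fail basic validation."""
    out = []
    for row in raw_rows:
        if not row.get("order_id"):
            continue  # reject records with no key
        out.append({
            "order_id": str(row["order_id"]),
            "amount_usd": round(float(row.get("amount", 0)), 2),
            "order_date": date.fromisoformat(row["date"]),
        })
    return out

raw = [
    {"order_id": 1, "amount": "19.991", "date": "2019-02-01"},
    {"order_id": None, "amount": "5.00", "date": "2019-02-01"},  # dropped
]
clean = transform(raw)
print(len(clean))  # 1
```

In the real pipeline the same shape would run inside a Glue job or Lambda handler, reading from S3 and writing back to S3/Redshift rather than to an in-memory list.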
Responsibilities
    • Working primarily within AWS, deliver analytics solutions, including collecting data from providers, building transformations and integrations, persisting within repositories, and distributing to consuming systems.
    • Transition Minimum Viable Product (MVP) solutions into operationally hardened systems, including introducing reusable objects and patterns to drive automation, maintainability and supportability.
    • Actively contribute to platform maturity: in partnership with technical leads, develop and improve coding standards, processing frameworks, quality automation, and CI/CD.
    • As a member of the Client Analytics Scrum Team, a) provide input into story sizing, backlog refinement, and release planning, b) contribute to solution designs, data analysis, coding, testing, and c) perform ongoing support, and maintenance of deployed products.
    • Liaise with counterpart technical and business teams (e.g., Cloud Ops, DBAs, SFDC, Enterprise Data Services, Data Governance, Financial Reporting, and Business Unit technology and analytics teams) to address interdependencies with multi-team initiatives.

Qualifications
    • A minimum of 4 years of experience delivering software solutions
    • A minimum of 2 years of experience delivering analytics, reporting or business intelligence solutions
    • Proficient in Python
    • Proficient in SQL
    • Strong, hands-on technical skills and self-directed problem solving
    • Preferred: Experience developing in big data technologies (cloud based analytics, Hadoop, NoSQL)
    • Preferred: Experience maturing production systems (introducing QA automation, fault tolerance, self-recovery)
    • Preferred: Experience developing within AWS, especially EMR, Glue, Lambda, SNS, SQS
    • Preferred: Experience developing in Spark (Spark Streaming, Dataframes, Datasets)
    • Desired: Experience in working as a member of Agile teams (Scrum)

Education
Bachelor's degree in Information Systems, Computer Science, or Engineering.
EHire
Be Great | Give Back | Be True | Follow Purpose | Have Fun
ehire.com/jobs
Our Company is committed to the principles of equal employment. We are committed to complying with all federal, state, and local laws providing equal employment opportunities, and all other employment laws and regulations. It is our intent to maintain a work environment which is free of harassment, discrimination, or retaliation because of sex, gender, race, religion, color, national origin, physical or mental disability, genetic information, marital status, age, sexual orientation, gender identity, military service, veteran status, or any other status protected by federal, state, or local laws. The Company is dedicated to the fulfillment of this policy in regard to all aspects of employment, including but not limited to recruiting, hiring, placement, transfer, training, promotion, rates of pay, and other compensation, termination, and all other terms, conditions, and privileges of employment.
Delivery Hero SE
  • Berlin, Germany

Are you passionate about food, data and intelligent applications? 


Delivery Hero is building the next generation global online food-delivery platform, with data at the center of delivering amazing food experiences. 


We’re a truly global team, working across 45 countries to ensure our customers are able to find, order and receive their favourite food in the fastest way possible. Since we started our journey in 2011, Delivery Hero has become the world’s largest food-delivery network, and we’re focused on a culture of growth, in both size and opportunities. 


If you’re an enthusiastic, creative problem solver, hungry for a new adventure, an exciting job and an international workplace is waiting for you in the heart of Berlin!


Your mission:



  • Oversee the mapping of data sources, data movement, interfaces, and analytics, to ensure data quality, data and feature agility and compliance.

  • Develop and maintain logical and physical data models and assist with the definition of process models.

  • Identify the key facts and dimensions necessary to support the business and requirements, performing the activities necessary to support the standardization of entities and attributes.

  • Develop entity and attribute descriptions and definitions for the models and facilitate the resolution of inconsistencies and conflicts in data models. 
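
To make the fact/dimension vocabulary above concrete, here is a toy star-schema sketch in Python (entity and attribute names are hypothetical, not Delivery Hero's actual model): one fact table keyed to two dimensions, queried by joining on the dimension keys.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DimRestaurant:        # dimension: descriptive attributes of a restaurant
    restaurant_id: int
    name: str
    city: str

@dataclass(frozen=True)
class DimCustomer:          # dimension: descriptive attributes of a customer
    customer_id: int
    country: str

@dataclass(frozen=True)
class FactOrder:            # fact: one measurable event, keyed to dimensions
    order_id: int
    restaurant_id: int      # FK to DimRestaurant
    customer_id: int        # FK to DimCustomer
    amount_eur: float

orders = [FactOrder(1, 10, 100, 21.5), FactOrder(2, 10, 101, 9.0)]

# A typical dimensional query: aggregate the fact by a dimension key.
revenue_by_restaurant = {}
for o in orders:
    revenue_by_restaurant[o.restaurant_id] = (
        revenue_by_restaurant.get(o.restaurant_id, 0) + o.amount_eur
    )
print(revenue_by_restaurant)  # {10: 30.5}
```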


Required qualifications:



  • MS in Information Technology, Computer Science, Software Engineering, Mathematics or a related field.

  • Extensive expertise (10+ years) in leading, designing, developing, testing, maintaining, implementing, monitoring, supporting, and documenting data architecture and data modeling (normalized, dimensional, logical, and physical) solutions for Big Data systems, Enterprise Data Warehouses or Enterprise Data Marts.


Your heroic skills:



  • Advanced expertise (8+ years) of data design and modeling on relational systems (Oracle, MsSQL, MySQL, ...).

  • Practical knowledge (4+ years) of data design and modeling on NoSQL environments (Cassandra, MongoDB, Redis, HBase and related).

  • Practical knowledge (4+ years) of BigData technologies in a cloud environment (Hadoop, Spark, BigQuery, RedShift, Athena and related).

  • Practical knowledge (4+ years) of formal object, schema, API and ERD modeling languages (e.g. UML, XSD, JsonSchema and similar) and tools (ER Studio, ERwin).

  • Exposure (2+ years) to master data management, data compliance, change management and data quality management in a BigData environment.

  • Experience (2+ years) curating and managing entities and attributes information with highly domain-specific semantics.

  • Familiarity (2+ years) with agile practices and processes (Scrum).


Aptitudes for success:



  • Thrive in complex, fast-moving environments.

  • Strong oral, written and interpersonal communication skills.

  • Able to express complex business and technical cases in a clear and didactic way.

  • Proficient at adapting interactions to participants' backgrounds and proficiency levels.

  • Comfortable facilitating negotiations and shared understanding between stakeholders.


We offer you:



  • English is our working language, but our colleagues at Delivery Hero come from every corner of the globe providing an incredibly diverse, international working atmosphere with cross-cultural teams.

  • A modern, recently refurbished office in the heart of Berlin.

  • We offer flexible working hours to fit around your personal or family life, whether you have to drop your kids off at the Kita (daycare) or just want to come in to work a little later.

  • Great career opportunities following our development career plan.

  • Being part of a global family under the Delivery Hero umbrella, we can offer you the safety of a large company including a pension scheme and stability.

  • Moving can be stressful so to help you settle in we provide a relocation package including visa help, temporary accommodation and a budget to applicants from abroad.

  • Our building has several kitchens where you’ll find fresh fruit, cereals, juice/ drinks, tea, coffee, etc.

  • On nearly every floor you’ll find a kicker or table tennis (or a lounge, and even a nap room) for when you need to take a break.

  • Being a food-ordering company, in between playing and working hard, we provide a generous number of monthly vouchers to use for ordering food on our platforms when you get the munchies.

  • Sprichst du kein Deutsch? No worries, we provide German classes for those expats who want to expedite their integration.

  • We know that experience at the office with interesting and diverse problems, colleagues and technologies isn’t the only way to learn, so our employees get an education budget to attend conferences or trainings locally or around Europe to satisfy their pursuit of knowledge.

  • Every Friday we have a team integration, when we just stop working earlier, grab a beer and relax with colleagues.

  • Sometimes we just enjoy our times together with team events and company events.


At Delivery Hero, we value diversity as a key element of our success. We are an equal opportunity employer, and welcome all regardless of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.


Have we caught your attention? Then please send us your application including cover letter, CV, salary expectations and earliest starting date. We’re looking forward to your application!

Drip
  • Minneapolis, MN

Join Drip as we positively impact tens of thousands of passionate users and marketing professionals around the globe.


Drip is the world's first ECRM: an ecommerce CRM that enables retailers to build personal and profitable relationships with their customers at scale. We believe the world needs specialty retail, and Drip is the platform that enables competition with the impersonal ecommerce giants. We are well-funded, growing super fast, and building a beautiful product. We have offices in Minneapolis and Greater Salt Lake.
www.drip.com/about


About the role

As a Senior Data Engineer, you will be part of a brand new team on a mission to turn data into actionable insights. Your job is twofold:  

  1. Leverage data to help our customers be more efficient and effective in their communications than ever before.
  2. Leverage data to influence our product roadmap to ensure we're building a world-class product our customers love.

You will be working arm in arm with a winning team of data scientists and engineers that move quickly, get things done, and love what they do. Your contributions will include:

  • Developing efficient and maintainable data pipelines in AWS
  • Architecting scalable and maintainable data-intensive applications complete with automated testing
  • Optimizing data flows for simplicity and efficiency of computing resources
  • Writing and maintaining high-throughput APIs


About you


You live and breathe big-data.
You have deep knowledge of the following concepts with hands-on experience with many popular technologies:

  • Event stream processing (Kinesis, Kafka, RabbitMQ, SQS)
  • ETL (AWS Glue, Airflow)
  • Massive parallel processing (Apache Spark, EMR, Hadoop)
  • Data warehousing (RedShift, RedShift Spectrum, Athena, columnar storage)
  • Databases (Postgres, MongoDB, DynamoDB)
  • Software development (TDD, API development, Python)
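
A minimal pure-Python sketch of the first concept above (no Kinesis/Kafka client involved; the event shape is made up for illustration): tumbling-window aggregation, the kind of computation an event-stream processor performs.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count events per (window, event_type) over fixed,
    non-overlapping (tumbling) time windows."""
    counts = defaultdict(int)
    for ts, event_type in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        counts[(window_start, event_type)] += 1
    return dict(counts)

# Hypothetical (timestamp, event_type) pairs.
events = [(5, "open"), (30, "click"), (61, "open"), (70, "open")]
print(tumbling_window_counts(events))
# {(0, 'open'): 1, (0, 'click'): 1, (60, 'open'): 2}
```

At production scale the same logic runs continuously inside a stream processor rather than over an in-memory list, but the windowing arithmetic is unchanged.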

You care about the customer.
You are proud of your attention to detail and exceedingly aware of how what you're building will be used. We're looking for engineers who are passionate about solving problems for real humans.

You strive for simplicity.
You know there are no points for clever. Whether you're using a product or designing data pipelines consuming petabytes of data, you often think of ways to make it simpler. You're the one in the room to point out that if we did it this way, we could get 90% of the way there in 50% of the time.

You move fast.
Perfect is the enemy of done. You are often embarrassed by the first version of something you've built but proud of how fast you released it. Speed is your superpower and you iterate quickly and often.

You get things done.
“Not my job” is not in your vocabulary. You're the one who steps up to the plate and you are not afraid to get your hands dirty. You set your own goals and meet them.

Finding talented engineers like you is an ongoing effort, so even if you are not available now, we'd still love to hear from you!


Who we are:

We consider working for a successful early-stage tech company to be a lifestyle choice rather than a job choice. You will work hard and face exciting challenges, but our positions come with amazing advantages and fulfillment for those who earn them. If you bring your best self to the table, here's what we'll bring in return:

  • Competitive pay, benefits, and equity
  • A fully-stocked snack bar and drink fridge to keep you fueled
  • Relocation reimbursement, as needed. Please note that residence in or relocation to Minnesota / Utah is our one non-negotiable
  • The chance to learn from some of the best people in the business, including our fiercely compassionate leadership team
  • Challenging and meaningful problems to solve - you will invariably make a difference and impact
  • A vibrant and devoted team, who still finds time for fun
  • Finally, no politics and no jerks


At Drip, we strive to create an inclusive workplace that upholds the dignity of all people. We value, respect, and celebrate everyone's individuality and honor people's unique strengths from all different walks of life. We believe that embracing diversity of thought and perspective encourages collaboration that leads to product (and people!) innovation, diverse products and a successful business.

X-Mode Social
  • Reston, VA

X-Mode Social, Inc. is looking for a full-time back-end developer to work on X-Mode's data platform and join our rapidly growing team. For this position, you can either work remotely OR in our Reston, VA Headquarters.


WHAT YOU'LL DO:




    • Use big data technologies, processing frameworks, and platforms to solve complex problems related to location

    • Build, improve, and maintain data pipelines that ingest billions of data points on a daily basis

    • Efficiently query data and provide data sets to help the Sales and Client Success teams with data evaluation requests

    • Ensure high data quality through analysis, testing, and usage of machine learning algorithms
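
As an illustration of the data-quality point above (pure Python; the record fields are hypothetical, not X-Mode's actual schema), a location pipeline typically applies record-level validity checks before ingestion:

```python
def is_valid_point(record):
    """Basic sanity checks for a location record: coordinates must be
    in range and the timestamp must be positive."""
    try:
        lat = float(record["lat"])
        lon = float(record["lon"])
        ts = int(record["ts"])
    except (KeyError, TypeError, ValueError):
        return False  # missing or unparseable fields
    return -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0 and ts > 0

points = [
    {"lat": 38.95, "lon": -77.35, "ts": 1549300000},  # Reston, VA
    {"lat": 123.0, "lon": 0.0, "ts": 1549300000},     # latitude out of range
    {"lat": "n/a", "lon": 0.0, "ts": 1549300000},     # unparseable
]
print(sum(is_valid_point(p) for p in points))  # 2 of 3 fail -> prints 1
```

At billions of points per day the same checks would run inside the batch or streaming job itself, with rejected records routed to a quarantine store for analysis rather than silently dropped.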



WHO YOU ARE:




    • 1+ years of Spark and Scala experience

    • Experience working with very large databases and batch processing datasets with hundreds of millions of records

    • Experience with Hadoop ecosystem, e.g. Spark, Hive, or Presto/Athena

    • Real-time streaming with Kinesis, Kafka or similar technologies

    • 4+ years working with SQL and relational databases

    • 4+ years Linux experience

    • 2 years working with cloud services, ideally in AWS

    • Self-motivated learner who is willing to self-teach

    • Self-starter who can maintain a team-centered outlook

    • BONUS: Experience with Python, machine learning, and Elasticsearch or Apache Solr

    • BONUS: GIS/Geospatial tools/analysis and any past experience with geolocation data



WHAT WE OFFER:




    • Competitive Salary

    • Medical, Dental and Vision

    • 15 Days of PTO (Paid Time Off)

    • Lunch provided 2x a week 

    • Snacks, snacks, snacks!

    • Casual dress code

    • Free Parking on-site


The Wellcome Trust Sanger Institute
  • Cambridge, UK




Salary range: £36,000-£44,000 per annum depending on experience plus excellent benefits. Fixed Term Contract for 3 Years.

Open Targets has recently launched Open Targets Genetics (https://genetics.opentargets.org), a portal that aggregates large scale GWAS data with functional genomics data to identify potential drug targets at disease-associated loci.

A Statistical Geneticist role, funded by Open Targets, is available at the Wellcome Sanger Institute in a new team under the leadership of Dr. Maya Ghoussaini. This is an exciting opportunity for you to participate in the enhancement of the existing Open Targets Genetics Portal through the development of new functionality and features.

You will actively engage in the integration of new eQTL datasets and tissue-specific chromatin interaction datasets.

You will have the opportunity to work across a range of analysis such as:
  • Aggregate large-scale GWAS data from multiple consortia and across a wide range of diseases and traits.
  • Perform association analysis on UK Biobank data with a particular focus on therapeutic areas important for Open Targets
  • Work together with other members of the Open Targets team on statistical genetics analysis for large scale sequence analysis
  • Work with existing members of the team to integrate genetic and cell-specific genomic data to identify and validate causal links between targets and diseases and improve the Genetics Portal.


We welcome candidates with a background in statistical genetics or a relevant discipline, with advanced programming skills suitable for statistical genetic analyses of complex diseases. Experience in functional genomics data analysis is highly desirable. You will have the opportunity to interact with active computational and experimental research teams using cutting edge genomic techniques.

Essential Skills

  • PhD in Statistical Genetics, Computational Biology or a closely related discipline.
  • Advanced level programming skills suitable for statistical genetic analyses, such as R, Python, MATLAB.
  • Firm grounding in statistical methods of complex disease genetics such as genome wide association studies, fine-mapping, high-throughput expression data, whole exome/genome sequencing, PheWAS, Mendelian Randomisation.
  • Previous experience in working with large-scale genetic datasets.
  • Ability to work to tight timelines.
  • Demonstrable project management and organisational skills.
  • Fluent in written and spoken English.
  • Ability to communicate ideas and results effectively.
  • Ability to work independently and organise own workload.


Ideal Skills

  • Experience in functional genomics data analysis (RNA-seq, ChIP-seq, etc.);
  • Experience with generating reproducible bioinformatics pipelines;
  • A strong track record in preparing publications and other written materials;
  • Interest in target validation and translational research.


Other information



Open Targets is a pioneering public-private initiative between GlaxoSmithKline (GSK), Biogen, Takeda, Celgene, Sanofi, EMBL-EBI (European Bioinformatics Institute) and the WSI (Wellcome Sanger Institute), located on the Wellcome Genome Campus in Hinxton, near Cambridge, UK.

Open Targets aims to generate evidence on the biological validity of therapeutic targets and provide an initial assessment of the likely effectiveness of pharmacological intervention on these targets, using genome-scale experiments and analysis. Open Targets aims to provide an R&D framework that applies to all aspects of human disease, to improve the success rate for discovering new medicines and share its data openly in the interests of accelerating drug discovery.

Genome Research Limited is an Equal Opportunity employer. As part of our dedication to gender equality and promoting women's careers in science, we hold an Athena SWAN Bronze Award. We will consider all qualified applicants without discrimination on grounds of disability, sexual orientation, pregnancy or maternity leave status, race or national or ethnic origin, age, religion or belief, gender identity or re-assignment, marital or civil partnership status, protected veteran status (if applicable) or any other characteristic protected by law.

Please include a covering letter and CV with your application

Closing date: 28th February, however applications will be reviewed on an ongoing basis and therefore the post may be filled before the deadline.
Asurion
  • Sterling, VA

At Asurion, we don’t just redefine—we reinvent. We began by establishing a culture that rewards results and isn’t confined by hierarchy. As a result we have achieved phenomenal growth.  Today, this entrepreneurial spirit is as strong as ever. It’s in our DNA. We foster a culture where our team members are encouraged daily to make a difference—for our clients, customers, and themselves. Our dynamic and rewarding environment ensures that each of our 17,000+ team members has the opportunity to reach their full potential, while at the same time fulfilling the needs of more than 280 million consumers.


We value open source technologies, solve challenging and unique problems, and innovate quickly. We embrace continuous delivery and Lean Startup principles.  We encourage creativity from our engineers every step of the way, working with various teams including product, user experience, call center operations, mobile and systems. Our teams are small enough to make fast decisions, yet our audience is large enough that our work makes a tremendous impact.


Do you know how to write robust and reliable systems? Can you ensure performance, quality and security while delivering an awesome user experience? Do you enjoy discussing innovative ideas with your peers, coming up with great product solutions and passing on your knowledge to others frequently?  We're looking for developers who are passionate about developing great software, have a love for solving hard problems, and enjoy learning about new technology.  If this sounds like you, get in touch!


In This Role, You Will



  • Work with the Cloud Computing subject matter experts (SMEs) to design and manage the cloud technology environments

  • Assist in evaluating, designing, configuring and implementing software components within assigned technology areas or projects

  • Perform research and development efforts pertaining to development of new features and capabilities

  • Lead efforts to convert existing code from TIBCO and Oracle to Node, Spark, Glue and Aurora.


Other responsibilities include serving as a technical point of escalation for the cloud technology areas, coordinating work with engineering and operational staff to optimize existing technologies in our environment, providing feedback to improve workgroup effectiveness and ensuring that the tools and processes are in place to manage performance, capacity, availability, and quality.


Our Ideal Candidate:



  • Bachelor’s degree in Computer Science or a related field and 2+ years of software development experience, or an equivalent combination of education and experience may be considered



  • 1+ year(s) experience with Amazon Web Services including EC2, VPC, S3, RDS, IAM, ELB, DynamoDB, and CloudWatch



  • Demonstrated ability to build high performance multi-platform applications and robust APIs in an agile environment



  • 2+ years' experience with a mixture of the following technologies:

  • AWS (EC2, Lambda, Glue, ECS, EMR, CloudFormation, S3, SQS, SNS) or similar cloud platforms

  • Apache Spark, with functional knowledge of either Python or Scala

  • Experience with or working knowledge of NodeJS

  • Working knowledge of AWS Big Data Services (Redshift, Athena, Hive etc.)

  • Container-based (ECS) or microservice development

  • PostgreSQL, MySQL or another RDBMS



  • Experience with Automated testing including unit, integration, API, contract, and end-to-end testing

    • Knowledge of source control and versioning tools (Git)

    • Ability to build and innovate tools or automation to replace operating manual processes, deployment, and operational tasks

    • Understanding of AWS IaaS and PaaS offerings.

    • Understanding of accessibility and security compliance

    • Bring critical thinking and new ideas to the team to contribute to the continued evolution and improvement of our systems

    • Team player who embraces change and can deliver work in incremental biweekly sprints.



HOOQ
  • Singapore
  • Salary: SGD 48k - 96k

Homegrown stories and Hollywood hits. At HOOQ we’re telling millions of stories to billions of people across Singapore, the Philippines, Thailand, Indonesia, and India. Just like our content, our team comes from around the world; we’re ambitious, driven and unique, and we embrace difference. There are many paths people take to join us; what links us together is our love of stories.


HOOQ is backed by some of the biggest players in entertainment: we’re a joint venture between Singtel, Sony and Warner Brothers. We build for the customer first to deliver original, local and international content on their phone, tablet, computer and television wherever they are.


It’s an exciting time to be at HOOQ! We are currently seeking experienced and energetic Data Engineers at different levels for our Singapore office, with experience in dealing with large volumes, variety and velocity of data. The data team is tasked with absorbing billions of rows of data from dozens of sources, organizing them, analyzing them, and visualizing them to help inform both short- and long-term decision-making.


Primary responsibilities



  • Design, implement and support an analytical data infrastructure providing ad-hoc access to large datasets and computing power.

  • Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies.

  • Creation and support of real-time data pipelines built on AWS technologies including EMR, Glue, Kinesis, Redshift/Spectrum and Athena

  • Supporting existing ETL/ELT infrastructure built on Pentaho, Python, EMR

  • Continual research into the latest big data and Elasticsearch technologies to provide new capabilities and increase efficiency

  • Working closely with team members to drive real-time model implementations for monitoring and alerting of systems.

  • Collaborate with other tech teams to implement advanced analytics algorithms that exploit our rich datasets for statistical analysis, prediction, clustering and machine learning

  • Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers


Your CV should show:



  • 4+ years of industry experience in software development, data engineering, business intelligence, data science, or related field with a track record of manipulating, processing, and extracting value from large datasets

  • Demonstrated strength in data modeling, ETL development, and data warehousing

  • Experience in programming in Python

  • Experience using big data technologies (Hadoop, Hive, HBase, Spark etc.)

  • Experience using business intelligence reporting tools (Tableau, Cognos etc.)

  • Knowledge of data management fundamentals and data storage principles

  • Knowledge of distributed systems as it pertains to data storage and computing

  • Experience working with AWS big data technologies (Redshift, S3, EMR)

  • Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations

ConocoPhillips
  • Houston, TX
Our Company
ConocoPhillips is the world's largest independent E&P company based on production and proved reserves. Headquartered in Houston, Texas, ConocoPhillips had operations and activities in 17 countries, $71 billion of total assets, and approximately 11,100 employees as of Sept. 30, 2018. Production excluding Libya averaged 1,221 MBOED for the nine months ended Sept. 30, 2018, and proved reserves were 5.0 billion BOE as of Dec. 31, 2017.
Employees across the globe focus on fulfilling our core SPIRIT Values of safety, people, integrity, responsibility, innovation and teamwork. And we apply the characteristics that define leadership excellence in how we engage each other, collaborate with our teams, and drive the business.
Description
The Sr. Analytics Analyst will be part of the Production, Drilling, and Projects Analytics Services Team within the Analytics Innovation Center of Excellence that enables data analytics across the ConocoPhillips global enterprise. This role works with business units and global functions to help strategically design, implement, and support data analytics solutions. This is a full-time position that provides tremendous career growth potential within ConocoPhillips.
Responsibilities May Include
  • Complete end to end delivery of data analytics solutions to the end user
  • Interacting closely with both business and developers while gathering requirements, designing, testing, implementing and supporting solutions
  • Gather business and technical specifications to support analytic, report and database development
  • Collect, analyze and translate user requirements into effective solutions
  • Build report and analytic prototypes based on initial business requirements
  • Provide status on the issues and progress of key business projects
  • Providing regular reporting on the performance of data analytics solutions
  • Delivering regular updates and maintenance on data analytics solutions
  • Championing the data analytics solutions and technologies at ConocoPhillips
  • Integrate data for data models used by the customers
  • Deliver Data Visualizations used for data driven decision making
  • Provide strategic technology direction while supporting the needs of the business
Basic/Required
  • Legally authorized to work in the United States
  • 5+ years of Structured Query Language experience (ANSI SQL, T-SQL, PL/SQL)
  • 5+ years hands-on experience delivering solutions with analytics tools (e.g. Spotfire, SSRS, Power BI, Tableau, Business Objects)
  • 5+ years of Oil and Gas Industry experience
  • 7+ years of related IT experience
Preferred
  • Bachelor's Degree in Information Technology or Computer Science
  • 5+ years hands-on experience delivering solutions with Informatica PowerCenter
  • 5+ years architecting data warehouses and/or data lakes
  • 5+ years with Extract Transform and Load (ETL) tools and best practices
  • 3+ years hands-on experience delivering solutions with Teradata
  • 1+ years developing analytics models with R or Python
  • 1+ years developing visualizations using R or Python
  • Experience with Oracle (11g, 12c) and SQL Server (2008 R2, 2010, 2016) and Teradata 15.x
  • Experience with Hadoop technologies (Hortonworks, Cloudera, SQOOP, Flume, etc.)
  • Experience with AWS technologies (S3, SageMaker, Athena, EMR, Redshift, Glue, etc.)
  • Thorough understanding of BI/DW concepts, proficient in SQL, and data modeling
  • Familiarity with ETL tools (Informatica, etc.) and ETL processes
  • Solutions oriented individual; learn quickly, understand complex problems, and apply useful solutions
  • Ability to work in a fast-paced environment independently with the customer
  • Ability to work as a team player
  • Ability to work with business and technology users to define and gather reporting and analytics requirements
  • Strong analytical, troubleshooting, and problem-solving skills; experience in analyzing and understanding business/technology system architectures, databases, and client applications to recognize, isolate, and resolve problems
  • Demonstrates the desire and ability to learn and utilize new technologies in data analytics solutions
  • Strong communication and presentation skills
  • Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
  • Delivers results through realistic planning to accomplish goals
  • Generates effective solutions based on available information and makes timely decisions that are safe and ethical
To be considered for this position you must complete the entire application process, which includes answering all prescreening questions and providing your eSignature on or before the requisition closing date of January 16, 2019.
Candidates for this U.S. position must be a U.S. citizen or national, or an alien admitted as permanent resident, refugee, asylee or temporary resident under 8 U.S.C. 1160(a) or 1255(a) (1). Individuals with temporary visas such as A, B, C, D, E, F, G, H, I, J, L, M, NATO, O, P, Q, R or TN or who need sponsorship for work authorization in the United States now or in the future, are not eligible for hire.
ConocoPhillips is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, veteran status, gender identity or expression, genetic information or any other legally protected status.
Job Function
Information Management-Information Technology
Job Level
Individual Contributor/Staff Level
Primary Location
NORTH AMERICA-USA-TEXAS-HOUSTON
Organization
ANALYTICS INNOVATION
Line of Business
Corporate Staffs
Job Posting
Jan 2, 2019, 12:01:00 AM